How can I pre-check / scrutinize all my shell scripts?

I have a few shell scripts.

I need to run them on various flavors of Unix and Linux.

The problem is that some commands in the scripts fail on one operating system or another.

Below is a list of a few commands that did not work on all the operating systems I tested, either because they are located at different paths that are not set in the profile of the user I log in with, or because they do not exist at all on the system where I run the script.

find
openssl
nawk
grep

I wish to test my shell scripts on any operating system and figure out which commands, or which arguments to those commands, are failing.

Basically, this can be done by grepping the output of a script run for strings like "command not found" or "invalid arguments".
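As a rough sketch of that grep-based approach (myscript.sh and run.log are just placeholder names), it would be something like:

./myscript.sh > run.log 2>&1
grep -Ei "command not found|not found|invalid option|illegal option" run.log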

That is my requirement. Can you tell me whether it is actually possible to find all the commands that would fail with "command not found" or "invalid arguments" without executing the script?

Something like a dummy run that walks through the script and lets me collect the output, without actually executing the commands on the OS.

Sort of a pseudo run rather than an actual run of the scripts.

Let me know if, and how, this is possible.

Will be a great help. Thank you.

If you can't even find find, you've got a horrible login profile, or are running from cron.

I'm not dealing with crontab.

The find command works and I can find find, but the arguments passed to find may not work on some systems. So I have to give the path to a suitable find among the several find commands installed on that system.

By the way, as I asked in the OP, is there a way to pseudo-run my script rather than actually run it? Or do you have any trick up your sleeve?

Your explanation of find makes that especially difficult: find just existing isn't good enough, you actually need to know that it's a specific version. That's not the sort of thing you can discover without running it to see if it works.

I don't know what to suggest besides thorough error-checking. You could have variables like

AWK="mawk" and run them like $AWK -F: '{ code }' ... to make it easier to adapt your script to different systems.

You could also rewrite certain things more portably to avoid needing system-specific commands, e.g. using plain awk instead of GNU awk, etc.
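A rough sketch of the command-variable idea above, assuming a POSIX shell (the probed names nawk and gawk are only examples):

# pick a usable awk once, then call it through the variable everywhere
if command -v nawk >/dev/null 2>&1 ; then
        AWK=nawk
elif command -v gawk >/dev/null 2>&1 ; then
        AWK=gawk
else
        AWK=awk
fi

$AWK -F: '{ print $1 }' /etc/passwd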

You can check paths by hand, I suppose.

OLDIFS="$IFS" ; IFS=":"

for CMD in openssl find nawk grep
do
        F=""
        for P in $PATH # Depends on IFS splitting, do not quote
        do
                [ -e "$P/$X" ] && F=1
                [ -z "$F" ] || break
        done

        if [ -z "$F" ]
        then
                echo "Required command $X not found in path" >&2
                exit 1
        fi
done

IFS="$OLDIFS"

Again, this does not check versions of commands, just their existence.

A "dry run" as you put it would only check syntax errors. Only running them will detect whether they're actually doing what you expect them to do.


You can create symlinks in a directory in the path pointing to the right command version.
You need to restrict your script to the minimal subset of commands/options common to ALL systems it runs on.
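A minimal sketch of the symlink idea (all paths here are hypothetical and would differ per system):

# put a private bin directory first in PATH and point the names
# your scripts use at the versions that work on this box
mkdir -p "$HOME/portbin"
ln -s /opt/freeware/bin/find "$HOME/portbin/find"
ln -s /usr/xpg4/bin/grep "$HOME/portbin/grep"
PATH="$HOME/portbin:$PATH" ; export PATH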

Most systems offer a way to set a basic (and basically working, if the system is set up sensibly) environment: in AIX (and other UNIXes) this is /etc/environment, in Linux this is /etc/profile, etc.

I have built on that and created a standard environment file for all of my scripts, which gives me a constant environment by being sourced in all my scripts. It is called f_env and looks like this (excerpt):

unset ENV                                   # clear the environment
#---------------------------------------------------- set basic environment

typeset -x OS=$(uname -a | cut -d' ' -f1)   # find out the OS
                                            # read in standard environment
case "$OS" in
     AIX)
          . /etc/environment
          ;;

     Linux)
          . /etc/profile
          ;;

     *)
          . /etc/environment
          ;;
esac
                                            # set default TERM variable
case "$OS" in
     AIX)
          TERMDEF="$(termdef)"
          ;;

     Linux)
          TERMDEF="$TERM"
          ;;

      *)
          TERMDEF="$TERM"
          ;;

esac
typeset -x TERM=${TERMDEF:-'wyse60'}        # set default TERM variable
typeset -x LANG=C                           # default language environment
typeset -x EDITOR=vi                        # what else ? ;-))
typeset -x VISUAL=$EDITOR

typeset -x PATH="/usr/bin"                 # set the path
           PATH="$PATH:/bin"
           PATH="$PATH:/etc"
           PATH="$PATH:/usr/sbin"
           PATH="$PATH:/usr/ucb"
           PATH="$PATH:/sbin"
           PATH="$PATH:/usr/bin/X11"
           PATH="$PATH:/usr/local/bin"      # tools, home for scripts
           PATH="$PATH:/usr/local/sbin"     # -"-

if [ -z "$DEVELOP" ] ; then
     typeset -x FPATH="/usr/local/lib/ksh"  # set fnc path for fnc lib
     FPATH="$FPATH:/usr/local/bin"
     FPATH="$FPATH:/usr/local/sbin"
else
     typeset -x FPATH=~/lib        # for lib development
fi

[....]

Most of my scripts start this way:

#! /bin/ksh
# --------------------------------------------- main()
if [ "$DEVELOP" != "" ] ; then
     . /usr/local/lib/ksh/f_env
else
     . ~/lib/f_env
fi
[....]

Notice that the variable DEVELOP is examined both in the script and in the environment file. The reason is that I have this standard environment, together with a lot of shell functions, packaged into a "library" that is always in the same place on every system (/usr/local/lib/ksh). Still, I want to be able to test newly written (or changed) library functions without messing with the public part. For this I have a variable DEVELOP set in my profile on my development system. When this variable is set, FPATH is set to ~/lib and all the external shell functions are loaded from there instead.

This way I can test with my private copy of the library until I am ready to release the next version.

I hope this helps.

bakunin


Let me state something that I failed to state clearly.

I am not looking for a fix, i.e. the corrected commands and their locations. No!

I am looking only to report the commands that fail due to "command not found" or "illegal arguments".

I just need to report all failing commands in the script without executing the script on the server.

That's my requirement.

You can do that with proper error checking. It's part of programming; the script can't do it for you.

die() {
        echo "$@" >&2
        exit 1
}

find ... || die "find did not work"

# etc

OK, now I am a bit baffled. In my experience the best error handling is the kind where you don't have to handle anything at all because you avoid the error altogether. Only the next best is to report the error properly.

This is quite easy: any responsible script programming includes checking the return codes of all the commands (at least the vital ones). This return code will be non-zero for:

a) failing commands, regardless of why they fail, although different error conditions will usually produce different return codes:
a1) unknown argument/option
a2) insufficient permissions
a3) whatever else the error might be

b) unknown commands

In case of b) the shell itself will return error 127, as you can easily test with a known and an unknown command:

# ls
[... some output...]
# echo $?
0
# /foo/bar/baz
/bin/ksh: /foo/bar/baz:  not found
# echo $?
127

You should be able to build an error handling function around that.
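For example, a small wrapper built around those return codes could look like this (run_checked is just an illustrative name; 127 means "not found", 126 means "found but not executable", any other non-zero code is an ordinary failure such as a bad option):

run_checked () {
        "$@"
        rc=$?
        case $rc in
                0) ;;
                127) echo "ERROR: command not found: $1" >&2 ;;
                126) echo "ERROR: command not executable: $1" >&2 ;;
                *) echo "ERROR: $1 failed with return code $rc (bad option?)" >&2 ;;
        esac
        return $rc
}

# usage: prefix the commands you want to check
run_checked nawk -F: '{ print $1 }' /etc/passwd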

I hope this helps.

bakunin

So, all of you are asking me to write a separate script to report the failing commands.

Which means I will have to go through 1000+ lines of Unix shell script to figure out all the commands I use in there, like:

ifconfig
ggrep
nawk
openssl
gzip

etc...

However, I wanted to know whether it is somehow possible to have my scripts automatically checked for all the commands they use, without executing the script and without me having to list every command in the check.

So, if any new script is introduced, I should be able to validate all the commands in that script without executing it and without manually going through it line by line looking for the unique commands it uses.

If that is not possible, then I will check for each command as some have suggested in this thread.

You might try one of the script compilers. Searching 'bash compiler' gets lots of hits.

No, but you might think a bit and find a more "computerised" way of achieving this goal. You can, for instance, write a small parser which filters out all the shell keywords and other language constructs so that only commands are left over; these you can then feed into a while loop to process as described below.

You can probably (but with less runtime security) forgo the parser and reduce it to a matter of sequential greps to arrive at the same list. Note, though, that this will be a more error-prone (although easier to achieve) alternative, because parsing adds context to the processed code which filtering doesn't (to add some theoretical background: this is the difference between Chomsky type-3 and type-2 grammars).
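A very rough sketch of that grep/filter alternative (myscript.sh is a placeholder; because this is filtering rather than real parsing, it will produce false positives and will miss commands that are not the first word on a line):

# take the first word of every line, drop assignments and obvious shell
# keywords, then check whether what is left exists in PATH
awk '{ print $1 }' myscript.sh |
grep -v '=' |
grep -E '^[A-Za-z_][A-Za-z0-9_.-]*$' |
sort -u |
while read -r cmd
do
        case "$cmd" in
                if|then|else|elif|fi|for|while|do|done|case|esac|function) continue ;;
        esac
        command -v "$cmd" >/dev/null 2>&1 || echo "possible missing command: $cmd"
done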

What exactly do you mean by "automatically"? If I remember correctly (Don Cragun may help out here), /usr/bin/dowhateverineed was not part of the POSIX standard last time I checked, so: no, you won't get that automatically. You will have to do work to achieve results, like the rest of us.

Here is an idea: implement what I proposed before, put it into a script, pass that script the name of another script you want to check, let it go through the process of extracting the commands and then, for every command found, test whether that command exists on the target system. This already takes care of the "command not found" problem.

Here is another idea, which might sound a bit outlandish: STICK TO THE POSIX STANDARD! There are a lot of POSIX-certified platforms, and even non-certified platforms will for the most part adhere to the standard. So, if you use only what is in the standard, you will most of the time be on the safe side even without complicated tests. If you want to use every obscure platform-specific speciality, you are doomed to fail eventually anyway, tests or no tests. So, how about changing your working habits instead of wanting to change the world?

Frankly, I work in environments where I have to deal with up to a dozen different platforms (different OSes, different versions, etc.) and my scripts usually work even in these environments without problems (that is: problems coming from this diversity - my scripts are not flawless at all, but I don't have your problems). And 1000 lines of shell code is a typical size for one of my scripts, of which I have (and use) many dozens. My pity for you having to manage 1000 lines of code is somewhat limited.

I hope this helps.

bakunin

No: the suggestion is to write your original script properly. If commands are failing left and right but it doesn't quit, you have neglected to check a lot of return values.
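One blunt but common way to surface such unchecked failures early, at least while testing, is to make the shell abort on the first non-zero return code:

set -e          # exit immediately when a command fails
set -u          # treat use of an unset variable as an error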

I don't know... When I write a script that runs on different OSes which may not have the same versions of software, path syntax, etc., I test the OS and set the environment for my programs accordingly, with the needed params... and it always works...

Hi.

There have been discussions about static checkers for shell scripts. However, because of the flexibility of shell, there are times when the script really needs to be executed. For example:

ls=~/bin/my-ls
$ls /

Unless one processes the assignment, there is no way to tell if the command $ls will work, or even if it exists.

However, as an experiment and proof-of-concept, I wrote a perl script that, with many kludges, can process a number of the Bourne-shell-family statements to isolate commands that are not available. The perl is about 150 lines long.

An example shell script for input might be:

#!/usr/bin/env bash

# @(#) Example-2        Demonstrate sample statements for diagnosis.

LC_ALL=C ; LANG=C ; export LC_ALL LANG
pe() { for _i;do printf "%s" "$_i";done; printf "\n"; }
pl() { pe;pe "-----" ;pe "$*"; }
em() { pe "$*" >&2 ; }
db() { ( printf " db, ";for _i;do printf "%s" "$_i";done;printf "\n" ) >&2 ; }
db() { : ; }
C=$HOME/bin/context && [ -f $C ] && $C

FILE=${1-data1}

p=$( basename $0 ) t1="$Revision: 1.14 $" v=${t1//[!0-9.]/}
[[ $# -gt 0 ]] && [[ "$1" =~ -version ]] &&  { echo "$p (local) $v" ; exit 0 ; }

[[ $# -le 0 ]] || echo1 Hi

cat $FILE | cat1 | cat2 ; cat3

v1="a|b\
|c|\
d"

./scramble1

~/scramble2

$HOME/scramble3

x=`cmd1`

y=$( cmd2 )

if [ stuff ] ; then other stuff; fi

if [ stuff0 ]
then
  other stuff
fi

if [[ modern stuff ]] ; then other stuff; fi

while [ repetitive exp ] ; do stuff ; done

while [[ more exps ]] ; do stuff ; done

if grep1 ; then stuff1 ; done

while gawk2 ; do stuff2 ; done

case $item in:
        a) a=1 ;;
        (b) case1 ;;
esac

db " End of script."

exit 0

and running the perl code across it yields:

$ ./p1 example-2
/home/drl/scramble3: not found
./scramble1: not found
basename is /usr/bin/basename
case1: not found
cat is /bin/cat
cat1: not found
cat2: not found
cat3: not found
cmd1: not found
cmd2: not found
db is a function defined in this script.
echo1: not found
exit is a special shell builtin
export is a special shell builtin
other: not found
pe is a function defined in this script.
printf is a shell builtin
stuff1: not found
stuff2: not found
/home/drl/scramble2: not found

For the adventurous, there is a far-more-complete, complex, shell parser found at: Shell::Parser - search.cpan.org

This is not a complete scanner of scripts. For example, note that single-line if/while constructs are not processed, but some shifting around of the perl would allow that. Constructs like until, select, etc. are not processed; case within case is not recognized.

The basic idea is to use 2 passes. In pass 1 the code is broken apart and written to a scratch file to allow easier recognition in pass 2; in pass 2 the scratch file is processed, resulting in the warnings and notes.

I have a long script that adapts itself to different platforms. This perl code correctly identified the OSX command sw_vers as not being found on Linux, as well as the SuSE command zypper as not being found on Debian.

Good luck ... cheers, drl