The Shell Scripting Tutorial


Hints and Tips

Check out www.shellscript.sh/examples/ for some more up-to-date tips and hints

The content below is, to be honest, rather outdated. The /examples part of this website has more, and more usable, tips and examples.


Unix is full of text-manipulating utilities, some of the more powerful of which we will now discuss in this section of the tutorial. The significance of this is that virtually everything under Unix is text. Virtually anything you can think of is controlled either by a text file or by a command-line interface (CLI). The only thing you can't automate using a shell script is a GUI-only utility or feature. And under Unix, there aren't too many of them!

You may have heard it said that, with *nix, "everything is a file" - it's true.

We have a few subsections here; what follows is general advice, hints and tips.

CGI Scripting

Exit Codes and flow control

Simple Expect replacement

Using trap to know when you've been interrupted - such as a CTRL-C, etc.

Workaround for the 'echo -n' vs 'echo \c' dichotomy

Documented Example of a real-life script I wrote - it helps users configure a SpeedTouch modem, and is available at http://speedtouchconf.sourceforge.net/.

We have already shown above a use of the simple but effective cut command. Here we shall discuss a few examples of some of the more common external programs to be used.

grep is an extremely useful utility for the shell script programmer.
An example of grep would be:


#!/bin/sh
steves=`grep -i steve /etc/passwd | cut -d: -f1`
echo "All users with the word \"steve\" in their passwd"
echo "Entries are: $steves"

This script looks fine if there's only one match. However, if there are two lines in /etc/passwd with the word "steve" in them, then the interactive shell will display:

$> grep -i steve /etc/passwd 
steve:x:5062:509:Steve Parker:/home/steve:/bin/bash
fred:x:5068:512:Fred Stevens:/home/fred:/bin/bash
$> grep -i steve /etc/passwd |cut -d: -f1
steve
fred

But the script will display:

Entries are: steve fred

By putting the result into a variable we have changed the NEWLINEs into spaces; the sh manpage tells us that the first character in $IFS will be used for this purpose. IFS is <space><tab><newline> by default. Maybe, though, we wanted to keep the NEWLINEs - it could look better if we turned the spaces back into NEWLINEs. This is a job for tr:


#!/bin/sh
steves=`grep -i steve /etc/passwd | cut -d: -f1`
echo "All users with the word \"steve\" in their passwd"
echo "Entries are: "
echo "$steves" | tr ' ' '\012'

Note that tr translated the spaces into octal character 012 (NEWLINE).
Another common use of tr is its range facility: it can convert text to upper or lower case, for example:



#!/bin/sh
steves=`grep -i steve /etc/passwd | cut -d: -f1`
echo "All users with the word \"steve\" in their passwd"
echo "Entries are: "
echo "$steves" | tr ' ' '\012' | tr '[a-z]' '[A-Z]'

Here we have added a translation of [a-z] to [A-Z]. Note that there are exactly the same number of characters in the range a-z as in A-Z. This translates any character falling into the ASCII range a-z into the corresponding character in A-Z - in other words, converting lowercase letters into uppercase. tr is actually cleverer than this: tr [:lower:] [:upper:] would do the job just as well, and possibly more readably. It's also not as portable, though; not every tr supports character classes.
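As a quick check, both forms should behave identically on a tr that supports character classes - a minimal sketch:

```shell
#!/bin/sh
# Both pipelines translate lowercase to uppercase; the second
# depends on this tr supporting the [:lower:]/[:upper:] classes.
echo "steve parker" | tr '[a-z]' '[A-Z]'
echo "steve parker" | tr '[:lower:]' '[:upper:]'
```

Both print STEVE PARKER; if the second fails on your system, fall back to the explicit range form.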

Cheating

Those who can't ... cheat

There is nothing wrong with cheating! Some things the shell just isn't very good at. Two useful tools are sed and awk. Whilst these are two hugely powerful utilities, which can be used as mini-programming languages in their own right, they are often used in shell scripts for very simple, specific reasons.

Whilst this means that the system has to load a largeish executable (52k for sed, 110k for awk), the reason a good workman doesn't blame his tools is that a good workman uses the right tool in the first place.
So let me introduce these two, with very simple uses.

Cheating with awk

Consider wc, which counts the number of lines, words, and characters in a text file. Its output is:

$ wc hex2env.c
	102	189	2306	hex2env.c

If we want to get the number of lines into a variable, simply using:

NO_LINES=`wc -l file`

would read in the whole output line, filename and all.
Because the output is space-padded, we can't reliably extract the number 102 from the string. Instead, we use the fact that awk works similarly to scanf in C - it strips unwanted whitespace and puts the resulting fields into the variables $1, $2, $3 and so on. So we use this construct:

NO_LINES=`wc -l file | awk '{ print $1 }'`

The variable NO_LINES is now 102.
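A minimal self-contained sketch of the same idea - the temporary filename here is purely illustrative, and the file is built on the spot so the expected count is known:

```shell
#!/bin/sh
# Build a throwaway three-line file, so we know the answer in advance.
printf 'one\ntwo\nthree\n' > /tmp/lines.$$
# awk strips the padding and the filename, leaving just the count.
NO_LINES=`wc -l /tmp/lines.$$ | awk '{ print $1 }'`
echo "Lines: $NO_LINES"
rm -f /tmp/lines.$$
```

This prints "Lines: 3".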

Cheating with sed

Another handy utility is sed - the stream editor. Like Perl, sed is very good at dealing with regular expressions; the shell isn't. So we can quickly use the s/from/to/g construct by invoking sed. For example:

sed s/eth0/eth1/g file1 >  file2

changes every instance of eth0 in file1 to eth1, writing the result to file2.
If we were only changing a single character, tr would be the tool to use, being smaller and therefore faster to load.
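For instance, the single-character version of that edit could be done with tr - bearing in mind that tr translates every matching character in its input, not just the one in "eth0":

```shell
#!/bin/sh
# tr maps character-for-character: every '0' becomes '1'.
echo "eth0" | tr '0' '1'
```

This prints eth1.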
Something else tr can't do is remove a whole word or phrase from its input - tr -d deletes individual characters, but only sed can delete a multi-character string:

echo ${SOMETHING} | sed s/"bad word"//g

This removes the phrase "bad word" from the variable ${SOMETHING}. It may be tempting to say "But grep can do that!" - but grep only deals with whole lines. Consider the file:


This line is okay.
This line contains a bad word. Treat with care.
This line is fine, too.

Grep would remove the whole second line, leaving only a two-line file; sed would change the file to read:

This line is okay.
This line contains a . Treat with care.
This line is fine, too.
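To see the difference concretely, this sketch recreates the three-line file (the temporary filename is illustrative) and runs both tools over it:

```shell
#!/bin/sh
# Recreate the example file, then compare grep -v and sed on it.
cat > /tmp/badword.$$ <<EOF
This line is okay.
This line contains a bad word. Treat with care.
This line is fine, too.
EOF
echo "--- grep -v drops the whole line:"
grep -v "bad word" /tmp/badword.$$
echo "--- sed removes only the phrase:"
sed 's/bad word//g' /tmp/badword.$$
rm -f /tmp/badword.$$
```

grep -v prints only two lines, while sed prints all three with the phrase excised.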

Telnet hint

This is a useful technique that I picked up from Sun's Explorer utility. Although telnet is no longer used on servers, it is still used by some network devices, such as terminal concentrators and the like. By creating a script such as this, from your own script or from the command line, you can run:

$ ./telnet1.sh | telnet

I have had a few people ask me about this, and have tended to point them towards the expect suite of code, which is pretty complex and bulky; this code should be pretty portable amongst systems (so long as they've got egrep). If it doesn't work on your system, try using GNU grep with the -q switch, or a proprietary grep redirected to /dev/null. Still a lot easier than installing expect, though.
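The "quiet match" idiom the scripts below rely on can be sketched like this - egrep -e with a redirect is what the original uses; the -q form mentioned above is the GNU alternative:

```shell
#!/bin/sh
# We only care about the exit status, not the matched text,
# so the output is thrown away.
if echo "login: " | egrep -e "login:" > /dev/null
then
    echo "prompt seen"
fi
# GNU grep equivalent, no redirect needed:
#   grep -q "login:" file
```

This prints "prompt seen"; with non-matching input the if branch is simply skipped.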


telnet1.sh
#!/bin/sh
host=127.0.0.1
port=23
login=steve
passwd=hellothere
cmd="ls /tmp"

echo open ${host} ${port}
sleep 1
echo ${login}
sleep 1
echo ${passwd}
sleep 1
echo ${cmd}
sleep 1
echo exit

However, Sun adds some clever error-checking code (note that you could set and export the variables from your current shell or calling script, to avoid storing passwords in readable files):

$ ./telnet2.sh | telnet > file1

telnet2.sh

#!/bin/sh
# telnet2.sh | telnet > FILE1 
host=127.0.0.1
port=23
login=steve
passwd=hellothere
cmd="ls /tmp"
timeout=3
file=file1
prompt="$"

echo open ${host} ${port}
sleep 1
tout=${timeout}
while [ "${tout}" -ge 0 ]
do
    if tail -1 "${file}" 2>/dev/null | egrep -e "login:" > /dev/null
    then
        echo "${login}"
        sleep 1
        tout=-5
        continue
    else
        sleep 1
        tout=`expr ${tout} - 1`
    fi
done

if [ "${tout}" -ne "-5" ]; then
  exit 1
fi

tout=${timeout}
while [ "${tout}" -ge 0 ]
do
    if tail -1 "${file}" 2>/dev/null | egrep -e "Password:" > /dev/null
    then
        echo "${passwd}"
        sleep 1
        tout=-5
        continue
    else
      if tail -1 "${file}" 2>/dev/null | egrep -e "${prompt}" > /dev/null
      then
        tout=-5
      else
        sleep 1
        tout=`expr ${tout} - 1`
      fi
    fi
done

if [ "${tout}" -ne "-5" ]; then
  exit 1
fi

> ${file}

echo ${cmd}
sleep 1
echo exit

Note that with this version, the output is grabbed to file1, and that this file is actually used by the script to check on its progress. I have added "> ${file}" so that the output received into the file is just the output of the command, not the logging-in process too.


