Arrow Key Method:
The simplest way to go through your command history is to use the up
and down arrow keys on your keyboard. At the command prompt, simply
press the up key to go to the last command you gave the Bash shell. If
you press up again, it will go to the second most-recent command, and
so on. If you press the down key (after having already used the up
key), you scroll forward in your history to more recent commands.
When the desired command is displayed, press Enter
to execute it.
Auto-complete Method:
Another way to use the Bash history is to press the Control (Ctrl) and r keys at the same time and then begin to type part of a command. Bash searches backwards through your history and displays the most recent command containing the text you have typed so far; once the text you have entered uniquely identifies the command you want, Bash fills in the rest of the command for you. For example, suppose your history list contains these three commands: cd directory/directory2, chmod 755 -R public_html, and cd directory3/directory. At the command prompt press Ctrl + r and then type ch; the Bash shell automatically completes the command to chmod 755 -R public_html. If you wanted to reissue the command cd directory3/directory, you must type enough of the command for the shell to identify it uniquely; thus you would type cd directory3 to fill in the rest of the command (pressing Ctrl + r again steps back to earlier matches). This method is most useful when you want to reissue a command from your history list that is not a recent command, so using the arrow key method would take some time. When the desired command is displayed, press Enter to execute it.
Quick Substitution Method:
Suppose the last command you entered was
vi directory1/directory2/directory3/directory4/directory5/filename.txt
but what you want is
vi Directory1/directory2/directory3/directory4/directory5/filename.txt
Instead of retyping the whole command to change "d" to "D", simply type
^directory1^Directory1
Note that Bash replaces only the first instance of the string that appears in "^xxx^yyy", so that string must pick out the text you actually want changed. In the example above, even the short form ^dir^Dir would work, because the first of the five instances of the string dir in the command is the one to be corrected; if you wanted to change a later instance (say directory3 to Directory3), you would have to give a longer string, such as ^directory3^Directory3, to identify it uniquely. Press Enter and Bash displays the corrected command and executes it.
Keyboard Shortcuts and Aliases
Knowing and using keyboard shortcuts when using a command line interface makes your work easier and faster. Use aliases for frequently used commands that are either long or have options (flags) that you always add to them. Aliases are user-defined and are not available by default in Bash, but can easily be added at any time.
Keys | Action, Use |
Ctrl + c | Kill the current process or command. |
Ctrl + w | Cut (kill) the word (the string of characters back to the previous whitespace) immediately preceding the cursor. |
Ctrl + u | Erase everything from the beginning of the command you are currently typing up to the cursor. |
Tab | Auto-complete file and directory names. For example, you have a directory named "sample_directory1" and you want to move to that directory. Type cd s and then press the Tab key, which completes the command as cd sample_directory1. Note that if you have a directory named "sample_directory1" and another named "sample_folder1", you must type enough characters for the shell to know which directory beginning with "s" you wish to open. In this case, you must type cd sample_d + Tab. |
Ctrl + a | Move to the start of the line. |
Ctrl + e | Move to the end of the line. |
Alt + f | Move forward a word. |
Alt + b | Move backward a word. |
Ctrl + l | Clear the screen, reprinting the current line at the top. |
Ctrl + k | Cut (kill) the text from the current cursor position to the end of the line. |
Alt + d | Cut (kill) from the cursor to the end of the current word, or, if between words, to the end of the next word. |
Alt + DEL | Cut (kill) from the cursor to the start of the current word, or, if between words, to the start of the previous word. |
Ctrl + y | Paste (yank) the most recently cut (killed) text back into the buffer at the cursor. |
Alt + y | Rotate the kill-ring and paste (yank) the new top. You can only do this if the prior command is Ctrl + y or Alt + y. |
Aliases
Aliases are a way of customizing your Bash shell environment. An alias
takes an existing command or set of commands and makes them execute
with a new command word that you create. For example, you often use
this command to sort your files showing you which files and directories
are the largest:
du -a | sort -k 1n,1
(This command is useful for keeping track of used disk space.) So,
create an alias for this command. Using the editor of your choice (see
“Opening Files” below) open the .bashrc file, which is the
configuration file for your Bash shell environment. Add the following
line to shorten the above command to sortfiles:
alias sortfiles='du -a | sort -k 1n,1'
Save the changes and exit the editor. The next time you start your Bash
shell, your new alias will be in effect.
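To use the new alias in the current session without starting a new shell, the .bashrc file can be re-read first; a brief sketch (the tail -5 is just an illustration and shows the five largest entries, since the sort is ascending):
source ~/.bashrc
sortfiles | tail -5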
Shell Scripts
A shell script is simply a text file containing a sequence of commands, which the shell executes from top to bottom. The short script below prints out the current directory name and its contents:
#!/bin/sh
# purpose: print out current directory name and contents
pwd
ls
Variables can be created inside a script simply by assigning a value to a name; the value is referred to later by prefixing the name with $:
#!/bin/sh
# name is a variable
name="fred"
echo "The name is $name"
The special variables $1-$9 correspond to the arguments passed to the script when it is invoked. For example, if we rewrite the script above as shown below, call the script name, and then invoke the command name Dave Smith, the message "Your name is Dave Smith" will be printed out:
#!/bin/sh
echo "Your name is $1 $2"
Comparisons
#!/bin/sh
# join command - joins two files together to create a third
# Three parameters must be passed: two to join, the third to create
# If $3 doesn't exist, then the user can't have given all three
if [ "$3" ]
then
# this cat command will write out $1 and $2; the > operator redirects
# the output into the file $3 (otherwise it would appear on the screen)
cat $1 $2 > $3
else
echo "Need three parameters: two input and one output. Sorry."
fi
#!/bin/sh
# An alternative version of the join command
# This time we check that $# is exactly three. $# is a special
# variable which indicates how many parameters were given to
# the script by the user.
if [ $# -eq 3 ]
then
cat $1 $2 > $3
else
echo "Need exactly three parameters, sorry."
fi
#!/bin/sh
# checks whether a named file exists in a special directory (stored in
# the dir variable). If it does, prints out the top of the file using
# the head command.
# N.B. establish your own dir directory if you copy this!
dir=/home/cs0ahu/safe
if [ -f $dir/$1 ]
then
head $dir/$1
fi
The primitives available for comparison of numeric values are -eq (equal), -ne (not equal), -gt (greater than), -ge (greater than or equal), -lt (less than) and -le (less than or equal).
For example:
#!/bin/sh
if test $# -le 5
then
echo Less than or equal to five parameters.
else
echo More than 5 parameters.
fi
exit 0
To test file types, a number of primitives are used: -f checks that the file exists and is a regular file, -d that it is a directory, -r that it is readable, -w that it is writable, -x that it is executable, and -s that the file exists and is not empty.
Confusingly, in Shell scripts no less than three different types
of quotes are used, all of which have special meanings. We have
already met two of these, and will now consider all three in detail.
Two types of quotes are basically designed to allow you to construct
messages and strings. The simplest type is the single quote;
anything between the two quote marks is treated as a simple string.
The shell will not attempt to execute or otherwise interpret any
words within the string.
The script below simply prints out the message: "Your name is fred."
#!/bin/sh
echo 'Your name is fred'
What happens if, rather than always using the name "fred," we want to make the name controlled by a variable? We might then try writing a script like this:
#!/bin/sh
name=fred
echo 'Your name is $name'
However, this will not do what we want! It will actually output the message "Your name is $name", because anything between the quote marks is treated as literal text - and that includes $name. For this reason, shells also understand double quotes. The text between double quote marks is also treated as literal text, except that any variables in it are interpreted. If we change the above script to use double quotes, then it will do what we want:
#!/bin/sh
name=fred
echo "Your name is $name"
The above script writes out the message: "Your name is fred." Double quotes are so useful that we normally use them rather than single quotes, which are only really needed on the rare occasions when you actually want to print out a message with variable names in it.
The third type of quotes are called back-quotes. Back-quotes cause the Shell to treat whatever is between the quotes as a command, which is executed, then to substitute the output of the command in its place. This is the main way to get the results of commands into your script for further manipulation. Use of back-quotes is best described by an example:
#!/bin/sh
today=date
echo "Today is $today"
The date command prints out today's date. The above script attempts to use it to print out today's date. However, it does not work! The message printed out is "Today is date". The reason for this is that the assignment today=date simply puts the string "date" into the variable today. What we actually want to do is to execute the date command, and place the output of that command into the today variable. We do this using back-quotes:
#!/bin/sh
today=`date`
echo "Today is $today"
Back-quotes have innumerable uses. Here is another example. This uses the grep command to check whether a file includes the word "and."
#!/bin/sh
# Check for the word "and" in a file
result=`grep and $1`
if [ "$result" ]
then
echo "The file $1 includes the word and"
fi
The grep command will output any lines in the file which do include the word "and." We assign the results of the grep command to the variable result, by using the back-quotes; so if the file does include any lines with the word "and" in them, result will end up with some text in it, but if the file doesn't include any lines with the word "and" in them, result will end up empty. The if-statement then checks whether result has actually got any text in it.
The test command can be used to match filenames and to perform string and numerical comparisons. For instance, if [ -z "$TEST" ]; ... will be true if the variable TEST is null (empty). The file test primitives were described above.
#!/bin/sh
if [ `whoami` != 'oracle' ]; then
echo Aborted - user `whoami` is incorrect, must be user oracle
exit 1
elif [ -z "$1" ] || [ -z "$2" ] || [ -z "$3" ] || [ -z "$4" ] || [ -z "$5" ] || [ -z "$6" ]; then
echo "$USAGE"
exit 1
elif [ -z "$PATH" ] || [ -z "$ORACLE_BASE" ] || [ -z "$ORACLE_HOME" ] || [ -z "$TNS_ADMIN" ] || [ -z "$ORACLE_SID" ] || [ -z "$ORACLE_DBF" ] || [ -z "$ORACLE_SBIN" ] || [ -z "$ORACLE_UTILS" ] || [ -z "$ORACLE_BACKUP" ] || [ -z "$ORACLE_RESTORE" ]; then
echo Variable not defined
exit 1
else
...
fi
Whereas conditional statements allow programs to make choices about what to do, looping commands support repetition. Many scripts are written precisely because some repetitious processing of many files is required, so looping commands are extremely important.
The simplest looping command is the while command. An example is given below:
#!/bin/sh
# Start at month 1
month=1
while [ $month -le 12 ]
do
# Print out the month number
echo "Month no. $month"
# Add one to the month number
month=`expr $month + 1`
done
echo "Finished"
The above script repeats the while-loop twelve times; with the month number stepping through from 1 to 12. The body of the loop is enclosed between the do and done commands. Every time the while command is executed, it checks whether the condition in the square brackets is true. If it is, then the body of the while-loop is executed, and the computer "loops back" to the while statement again. If it isn't, then the body of the loop is skipped.
If a while-loop is ever to end, something must occur to make the condition become untrue. The above example is a typical example of how a loop can end. Here, the month variable is initially set to one. Each time through the loop it is incremented (i.e. has one added to it); once it reaches 12, the condition fails and the loop ends. This is the standard technique for repeating something a set number of times.
Occasionally, it can actually be useful to loop unconditionally, but to break out of the loop when something happens. You can do this using a while command with a piece of text as the condition (a non-empty string always tests true), and a break command to break out of the loop. The computer will go round and round the loop continuously, until such time as it gets to the break statement; it will then go to the end of the loop. The break statement is issued from within an if-statement, so that it only happens when you want the loop to end. The example below loops continuously until the user guesses the right word. If you get inadvertently stuck in such a loop, you can always press Ctrl-C to break out.
This example also demonstrates how a shell script can get input from the user using the read command. The script loops continuously around the while-loop, asking the user for the password and placing their answer in the answer variable. If the answer variable is the same as the password variable, then the break command breaks out of the loop.
#!/bin/sh
password="open"
answer=""
# Loop around forever (until the break statement is used)
while [ "forever" ]
do
# Ask the user for the password
echo "Guess the password to quit the program> \c"
# Read in what they type, and put in it $answer
read answer
# If the answer is the password, break out of the while loop
if [ "$answer" = "$password" ]
then
break
fi
done
# If they get to here, they must've guessed the password,
# because otherwise it would just keep looping
echo "Good guess!"
Another form of looping command, which is useful in other circumstances, is the for command. The for command sets a variable to each of the values in a list, and executes the body of the command once for each value. A simple example is given below:
#!/bin/sh
for name in fred joe harry
do
echo "Hello $name"
done
The script above prints out the messages "Hello fred," "Hello joe," and "Hello harry." The command consists of the keyword for, followed by the name of a variable (in this case, $name, but you don't use the dollar in the for-statement itself), followed by the keyword in, followed by a list of values. The variable is set to each value in turn, and the code between the do and done keywords is executed once for each value.
The for-loop is most successful when combined with the ability to use wildcards to match file names in the current directory. The for-loop below uses the * wildcard to match all files and sub-directories in the current directory. Thus, the loop below is executed once for each file or directory, with $file set to each one's name. This script checks whether each one is a directory, using the -d option, and only writes out the name if it is. The effect is to list all the sub-directories, but not the files, in the current directory.
#!/bin/sh
for file in *
do
if [ -d "$file" ]
then
echo "$file"
fi
done
Options are passed into a script with a preceding - (minus) sign. Parameters are passed in as space-separated strings; strings containing spaces must be enclosed in double quotes. Options can be handled using a case statement or the getopts command. In addition to passing options and parameters into scripts there are a number of specialised variables with special functions. In general, parameters are supplied as variable substitutions to a script and options change the behaviour of a script.
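A sketch of option handling with getopts is shown below; the -v and -f options and the variable names are only illustrative:
#!/bin/sh
# -v turns on verbose output; -f expects a file name argument
verbose=0
file=""
while getopts vf: opt
do
    case $opt in
        v) verbose=1 ;;
        f) file=$OPTARG ;;
        *) echo "Usage: $0 [-v] [-f file] args..."; exit 1 ;;
    esac
done
# discard the options that have been processed, leaving the remaining parameters
shift `expr $OPTIND - 1`
echo "verbose=$verbose file=$file remaining parameters: $*"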
Two methods of printing to the screen (STDOUT) are the echo command and the printf command, which takes a format string and arguments. Strings can use quoting as already explained. Special characters such as \n (newline), \t (tab) and \c (no newline) can be included by using escaping. Simple formatting for the printf command is exactly as it occurs in C. The example below shows simple use of the echo command.
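This is a minimal sketch; exactly how echo treats escape characters varies between shells, so printf is shown as the portable alternative:
#!/bin/sh
# \t inserts a tab and \c suppresses the trailing newline (shell dependent)
echo "Name:\tfred"
echo "Working...\c"
echo " done"
# printf "format" arguments behaves the same way in every shell
printf "Name:\t%s\n" "fred"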
Output can be redirected from STDOUT to a file, or into STDIN from a file. A single > will overwrite the file (command > file), two will append (command >> file), and < reads a file into STDIN (command < file). Output can also be redirected from STDOUT into the STDIN of another command using a pipe (|). For instance, df -k | grep swap | grep -v grep will show available swap space capacity.
User input can be handled using the read command as shown in the example below where a file is read line by line from redirection into the while loop.
while read STRING
do
    ...
done < file
Whenever a command is executed, three file handles are opened for that command execution. These file handles are STDIN, STDOUT and STDERR, corresponding to file descriptors 0, 1 and 2 respectively. These file handles can be accessed by use of their file descriptors, and all of them can be redirected to other files (or, in the case of STDIN, from other files).
Output redirected to the /dev/null device is discarded; redirecting to it is the usual way to throw away unwanted STDOUT and STDERR output.
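For instance, a sketch (command and output.log are placeholder names):
# send STDOUT to a log file and STDERR (descriptor 2) to the same place
command > output.log 2>&1
# discard both STDOUT and STDERR
command > /dev/null 2>&1
# keep STDOUT on the screen but discard the error messages
command 2> /dev/null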
Prompting for User Input
For scripts that will be run interactively we can prompt the user to
give us input. The read command
can be used to set a variable with a value read from user input:
#!/bin/sh
echo "Enter your name"
read name
echo "Hi $name, I hope you have a good day"
If you do not want the input to be displayed (for passwords), then use the -s option for the read command.
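Note that -s is a bash extension rather than a plain Bourne shell option; a brief sketch:
#!/bin/bash
echo "Enter your password"
read -s password
# -s also suppresses the newline produced by pressing Enter, so print one
echo ""
echo "Thank you, the password was not echoed to the screen"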
Functions cannot be used in the C shell. A function has the format name () { command; ... }. Shell functions can be used to replace binaries or shell built-ins of the same name.
cd () { chdir ${1:-$HOME}; PS1="`pwd`$ "; export PS1; }
list () { ls -la; }
The example below checks that each directory listed in the PATH variable exists. Note how the loop variable is unset after completion of the loop, and note that the variable is named in lowercase; an underscore as the first character is sometimes used.
# check each directory listed in $PATH (the list is split on the ":" separators)
for dir in `echo $PATH | tr ':' ' '`
do
    if [ -d "$dir" ]
    then
        echo "$dir ok"
    fi
done
unset dir
Functions can be placed into libraries. These libraries can be included in script files by sourcing (executing with the . command) the library file within those script files. Note that function library files should contain only function definitions.
#!/bin/sh
#
# This is the function library (utilities.sh)
#
error () { echo "Error : " $@ >&2; }
warning () { echo "Warning : " $@ >&2; }
# email expects three parameters: a subject, the recipients and a message file
email () {
    subject=$1
    recipients=$2
    message=$3
    if [ -z "$message" ]; then
        mailx -s "$subject" $recipients < /dev/null
    else
        mailx -s "$subject" $recipients < "$message"
    fi
}
#!/bin/sh
#
# This is the script calling functions within the function library
#
. ./utilities.sh    # include (source) the function library
...
Text filtering can be executed with general Unix utilities, regular expressions, awk and sed.
The utilities head, tail, grep, sort, uniq and tr are all basic text filtering utilities.
tr 'A-Z' 'a-z' < <filename> | tr -s ' ' | grep <word> | grep -v grep | sort | uniq -c
sed is a stream editor; awk is a pattern matcher, or a simple programming language. These are the common uses of the two utilities. Both sed and awk are executed as a command 'script' applied to files or to STDIN. Both sed and awk can be used to match regular expressions or patterns against the contents of their input, and Perl pattern matching tends to function in a similar fashion to that of sed and awk. The general meta-characters used for pattern matching are ^ (start of line), $ (end of line), . (any single character), * (zero or more of the preceding character), [ ] (any one of a set of characters) and \ (escape the following meta-character).
Some pattern matching examples are shown below.
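A few illustrative examples using grep (filename.txt is just a placeholder):
# lines that begin with "The"
grep '^The' filename.txt
# lines that end with a full stop
grep '\.$' filename.txt
# lines containing "color" or "colour" (zero or more occurrences of "u")
grep 'colou*r' filename.txt
# count the blank lines in the file
grep -c '^$' filename.txt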
Patterns can be applied to files using sed, where a particular action can be performed on the file content based on the matching results of those patterns, in the form s/pattern/change/g, where the g causes a global change to the input. The p flag in place of g prints the changed lines (useful with sed -n), and the d command (as in /pattern/d) deletes matching lines, in both cases without changing the original input. sed can also be used to perform multiple updates in one pass, as in sed -e 'command' -e 'command' ... -e 'command' files. sed can also be used to parse input from STDIN and display partial strings of that input, in much the same way as grep and awk would perform the same functionality. Personally I prefer grep and awk, or even Perl.
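A couple of sketches of these sed forms (the file names are illustrative):
# change every occurrence of "fred" to "joe", writing the result to a new file
sed 's/fred/joe/g' file.in > file.out
# print only the lines that match a pattern (like grep)
sed -n '/error/p' logfile
# apply several edits in one pass: delete blank lines, then make a substitution
sed -e '/^$/d' -e 's/fred/joe/g' file.in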
Pattern matching in awk works the same way as in sed. awk simply has more functionality as a simple programming language; sed is an editor. awk is used for parsing the lines in a text file and taking actions on those lines. awk has a very C-like syntax, allows if, while and for statements for flow control, and also allows variable declarations (variable=value) plus passing of shell variables into awk scripts (awk 'script' var=val var=val ... files). See an example in Unix for Oracle (oracle.unixThings.html) under Disk Space and File Management. There is not really much point in going through the syntax of awk in this document since awk syntax is very simple. Typically awk in its simplest form is used to parse files or STDIN and pull specific columns from the output, as shown below.
# df -k | awk '{print $1 " " $5}'
Filesystem capacity
/proc 0%
/dev/dsk/c0t0d0s0 84%
fd 0%
swap 1%
The eval command can be used to process a command line twice. For instance, with the variable REDIRECT set to > file.out, the command echo cat file.in $REDIRECT would not perform the redirection but would simply send the text cat file.in > file.out to STDOUT, ie. the screen. In order to have the redirection carried out, process the line with the eval command, as in eval echo cat file.in $REDIRECT.
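A short sketch of the difference, with file.in and file.out as in the text above:
REDIRECT='> file.out'
# the > is treated as ordinary text and the whole line is printed to the screen
echo cat file.in $REDIRECT
# eval scans the line a second time, so the redirection takes effect and
# the text "cat file.in" is written into file.out
eval echo cat file.in $REDIRECT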
The : command simply does nothing.
The type command gives the full path name of a Unix command, ie. type command1 command2 ... commandn.
The sleep n command pauses processing for n seconds.
The find command can be used to list files recursively through directories where those filenames match specified criteria. For instance, find all core-dump files under the current directory using find ./ -name "core" -print. The -print action prints each matching pathname to the screen; error messages produced by non-accessible directories (due to restrictive permissions) go to STDERR and can be discarded by redirecting them to /dev/null. The format of the find command is find start-directory options actions.
The -type f|d|b|c|l|p option allows specification of file types to find, ie. f, d, b, c, l or p (file, directory, block device, character device, link or named pipe). For instance, find / -type d -print finds directories only. The -size [+|-]n option finds only files of more than, fewer than or exactly a specified number of blocks. For instance, find / -size +1000 -print finds all files greater than 1000 blocks in size. The find / [-mtime | -atime | -ctime] [+|-]n -print form allows finding of files by last modified (-mtime), last accessed (-atime) or last changed (-ctime) time; n determines more than, exactly or fewer than that number of days from the current date.
The -exec option allows execution of a Unix command on any file found by the find command. For example, find ./ -name "core" -exec rm -f {} \; will delete all core files recursively under the current directory. Be very careful using the -exec option with the find command, especially when executing something like an rm -f command; the results can be very upsetting.
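A cautious pattern is to list the matches first and only then delete them, or to use the -ok action, which asks for confirmation before each command is run; a sketch:
# see what would be removed
find ./ -name "core" -type f -print
# remove with a per-file confirmation prompt
find ./ -name "core" -type f -ok rm -f {} \;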
The xargs command is used to provide a list of words from STDIN as arguments to another command, ie. ps -ef | grep ora_ | grep -v grep | xargs kill -9.
The expr command allows simple integer arithmetic, ie. expr 5 \* 12 echoes a result of 60. Available operators are +, -, \* (escaped), / and % (modulus). expr can be used in shell scripts to increment variables, eg. VAR=`expr $VAR + 1`.
The bc command will perform floating-point arithmetic and is not limited to integers as the expr command is.
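For instance, a quick sketch:
echo "scale=2; 10 / 3" | bc      # prints 3.33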
The rsh (remote shell) command allows execution of a command on a remote machine, ie. running a command on another machine from the machine one is currently working on. ssh (secure shell) is a similar command but, as its name suggests, more secure, because traffic between the source and target machines is encrypted.
Other Sources
UNIX Bourne Shell Scripting: http://users.sdsc.edu/~steube/Bshell/