Bash Line Commands & Shell Scripting

Overview

It is useful to be able to use Bash shell commands to gather information about a test target and then filter that information down to just the useful data. For example, you could use wget to download the contents of a web page and then sort through and filter the results for links, domains, subdomains, and IP addresses that can be tested later for vulnerable open services.

#wget "http://www.some-website.com"

You can also use Bash shell commands to filter and sort through any type of resulting text output. In the example below I run an Nmap scan on a LAN and then filter and sort through the results.
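
As a preview of the kind of pipeline Exercise 1 builds up one step at a time, the Nmap output can be piped straight through grep and cut so that only the responding IP addresses remain (a sketch; substitute your own network range):

#nmap 192.168.1.1-254 | grep "report" | cut -d " " -f 5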

Exercise 1 – Scan a network for IP addresses and open ports. Sort and filter your data with Bash line commands.

Try running these steps, which walk through several useful commands:

  1. Work in a test network environment, or one that you have permission to scan with Nmap, and open a root terminal.
  2. Run an Nmap scan on an entire network, looking for responding hosts and open ports:
    #nmap 192.168.1.1-254  (substitute your network's address scheme)
  3. Output/save the results to a text file:
    #nmap 192.168.1.1-254 > scan.txt
  4. Cat the results to your terminal:
    #cat scan.txt
  5. If there are active responding computers in the network, you may see lines like this sample: "Nmap scan report for 192.168.1.100". Now you can use Bash to filter the results by piping them to a grep command:
    #cat scan.txt | grep "report"
  6. Using the cut command and specifying a delimiter of a space (-d " "), you can segment the text string into separate pieces based on the spaces between the words and then grab the field you need (-f 5).
    #cat scan.txt | grep "report" | cut -d " " -f 5
    This way the result is just the responding IP addresses from the Nmap scan. Example:
    Nmap scan report for 192.168.1.100
    Nmap scan report for 192.168.1.101
    Nmap scan report for 192.168.1.102
    becomes just:
    192.168.1.100
    192.168.1.101
    192.168.1.102 
  7. If you needed to sort the data because of redundant addresses, you could use the sort command to alphabetize the list and remove duplicate entries (-u):
    #cat scan.txt | grep "report" | cut -d " " -f 5 | sort -u
  8. The initial Nmap command that was issued against the network would also have shown any open ports and services that could be contacted. If we wanted to filter the results to show the IP addresses followed by the open port numbers, we could build our Bash command to search for any lines containing the string "report" or any lines containing the string "open", using an escaped pipe (\|) as a logical OR operator, like this:
    #cat scan.txt | grep "report\|open"
  9. Now you need to pipe to the cut command to clean the results. What would your final command look like so your output would look clean? Hint: add more pipes and more cuts to the command to strip away the unnecessary words. One possible answer is sketched after this list.
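
One possible answer for step 9, using only the grep and cut commands covered above (a sketch rather than the only solution; the word "Nmap" is left on the address lines, and because Nmap pads its port table with extra spaces the field numbers may need adjusting on your system):

    #cat scan.txt | grep "report\|open" | cut -d " " -f 1,5

From there you could keep adding pipes and cuts, or a sort, until the output contains only the pieces you care about.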

 

Exercise 2 – Download a webpage, sort and filter the results for subdomains and servers 

Try running these steps, which walk through several useful commands:

  1. Download the Yahoo homepage using wget:
    #wget "http://yahoo.com"
  2. Examine the results using cat:
    #cat index.html
  3. Now we need to try to filter our results down to subdomains of yahoo.com. To do this we can search for the lines containing the text yahoo.com:
    #cat index.html | grep "yahoo.com"
  4. Your result will still be too large to be of much use. To continue filtering, we can assume that most of the domains have "http://" in front of them, so we can try cutting the lines on a delimiter of a colon and taking the second field:
    #cat index.html | grep "yahoo.com" | cut -d ":" -f 2
  5. Now most of the lines will start with //, so we can cut based on a "/" and grab the third field:
    #cat index.html | grep "yahoo.com" | cut -d ":" -f 2 | cut -d "/" -f 3
  6. Remove duplicate lines with sort:
    #cat index.html | grep "yahoo.com" | cut -d ":" -f 2 | cut -d "/" -f 3 | sort -u
  7. You may need to remove trailing characters from a line that contains a domain name, as in this example:
    info.yahoo.com" this is a sample of trailing text
    Here the unwanted text begins with a quotation mark, so you can cut on the quotation mark itself. Because the delimiter is given inside double quotes, you need a backslash to escape it so the shell treats it as a literal character rather than the end of the string. See below:
    #cat index.html | grep "yahoo.com" | cut -d ":" -f 2 | cut -d "/" -f 3 | sort -u | cut -d "\"" -f 1
  8. Save your results to a text file:
    #cat index.html | grep "yahoo.com" | cut -d ":" -f 2 | cut -d "/" -f 3 | sort -u | cut -d "\"" -f 1 > domains.txt
  9. Output your text file to the screen and continue to sort, cut, and grep until you have only unique subdomains:
      #cat domains.txt | sort -u | grep "yahoo.com"
  10. Output the cleaned results to a new file and then replace the original (redirecting straight back onto domains.txt would empty the file before cat could read it):
      #cat domains.txt | sort -u | grep "yahoo.com" > domains2.txt
      #mv domains2.txt domains.txt

You can see from the list above that there are actually many different subdomains, services, and servers.
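
If you are curious how many unique subdomains ended up in the final file, wc can count the lines for you (a quick check, not an official part of the exercise):

    #wc -l domains.txt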

Exercise 3 – Write a Bash shell script to automate host and ping commands 

Write a shell script which will loop through a list of domain names, issuing a host command on each name, in order to return the IP address that maps to each name.

  1. Using a text editor like nano, gedit, or vim, open a text file and name it ips.sh
  2. You will write a shell script file that will loop through the list of subdomains you created above and saved as domains.txt. Type the following lines of text into your ips.sh file:
    #!/bin/bash
    for name in $(cat domains.txt);do
    host $name;
    done
    Save the file and exit. This script will loop through the domains and issue a host command on each domain name.
  3. Give the script file executable permissions:
    #chmod 755 ips.sh
  4. Run the script:
    #./ips.sh
  5. Output the results to a text file:
    #./ips.sh > server-ips.txt
  6. Clean up the results. Rewrite the shell script adding pipes, grep, cut, and sort commands so that it produces only IP addresses. One possible version is sketched after this list.
  7. You should now have a file with only the IP addresses that resolve to each subdomain.
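
One possible cleaned-up version of ips.sh for step 6 (a sketch; it assumes the host output contains lines of the form "name has address x.x.x.x", so grep keeps only those lines and cut grabs the fourth field, which is the IP address):

    #!/bin/bash
    for name in $(cat domains.txt);do
    host $name | grep "has address" | cut -d " " -f 4
    done

Running #./ips.sh | sort -u > server-ips.txt would then save a de-duplicated list of addresses, as in step 5.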

Write a shell script which will loop through a list or sequence of IP addresses, issuing a single ping command on each IP address, in order to determine which IPs will respond.

  1. Using a text editor like nano, gedit, or vim, open a text file and name it ping.sh
  2. You will write a shell script file that will loop through a sequence or list of IP addresses like the one you just created. Type the following lines of text into your ping.sh file:
    #!/bin/bash
    for ip in $(seq 150 250);do
    ping -c 1 192.168.1.$ip
    done

    The $(seq 150 250) command generates every number from 150 through 250, so the loop pings each address in that part of the range.
    Alternatively, you could loop through the list of IP addresses in the text file you created earlier:
    #!/bin/bash
    for ip in $(cat server-ips.txt);do
    ping -c 1 $ip
    done
  3. Save the file, give it executable permissions, and run it:
    #chmod 755 ping.sh
    #./ping.sh
  4. Edit your script file by adding grep, cut, and sort commands to clean up the results so that only replying IP addresses are listed. One possible version is sketched below.
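
One possible cleaned-up version of ping.sh for step 4 (a sketch; it assumes each reply contains a line of the form "64 bytes from 192.168.1.x: ...", so grep keeps only the reply lines, the first cut grabs the fourth field, and the second cut strips the trailing colon):

    #!/bin/bash
    for ip in $(seq 150 250);do
    ping -c 1 192.168.1.$ip | grep "bytes from" | cut -d " " -f 4 | cut -d ":" -f 1
    done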

Additional Study

There are more powerful tools for filtering and replacing text, such as awk and sed, as well as regular expressions and higher-level scripting languages like Perl, Python, Ruby, and PHP. It is worth your while to learn how to use them.
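
For example, the IP-address extraction from Exercise 1 can be written as a single awk or sed command (sketches only, shown to give a taste of these tools; awk splits each matching line on whitespace and prints the fifth field, while sed strips the leading text and prints only the lines it changed):

#awk '/report/ {print $5}' scan.txt
#sed -n 's/^Nmap scan report for //p' scan.txt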

Author: Dan

Dan teaches computer networking and security classes at Central Oregon Community College.
