Hi,
I am trying to write a shell script which reads folder names from a text file, then goes to each folder, picks up an XML file, and plugs it into my sipp command so that I can run the sipp script.
For example:
I have a text file called thelist.txt in which I have listed all the folder names; each folder holds XML files for a different purpose.
I want to write a shell script which reads the text file, goes to each folder in turn, picks up the XML file, and writes it into a sipp command.
This should happen 1000 times if there are 1000 names in the text file, so that the script runs for all 1000 XML files.
Here is the code, which so far just reads the names from testlist.txt; I have given the absolute path of the directory it should read the .xml from.
Now I have cleared my first hurdle. I want to execute the scripts one by one, not in parallel; currently it executes all the scripts in parallel.
Please suggest.
My Code:
for i in `cat testlist.txt`
do
#	if [ -f msml/$i/$i.xml ]
#	then
	sipp $2 -i $1 -m 1 -l 1 -s msml -sf msml/$i/$i.xml --trace_logs -timeout $3
#	fi
done
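As an aside, the commented-out tests above fail because `[` is an ordinary command: it needs a space after `[` and before `]`. A minimal runnable shape of that guard (the fixture directory and file are illustrative, not from the real setup):

```shell
#!/bin/sh
# Illustrative fixture matching the layout from the question.
mkdir -p msml/msml1003
: > msml/msml1003/msml1003.xml

i=msml1003
# Correct:   if [ -f "msml/$i/$i.xml" ]
# Incorrect: if [-f msml/$i/$i.xml]     (shell looks for a command named "[-f")
if [ -f "msml/$i/$i.xml" ]
then
    echo "found msml/$i/$i.xml"
fi
```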
Third of all, unless your computer has 1000 cores, there is no point in running 1000 processes simultaneously; they will just slow each other down. You ought to wait after starting a certain number. Not all shells can wait for a particular process; the following script requires bash or a newer ksh.
#!/bin/bash
# Save these for later, since "set --" overwrites the positional parameters.
A=$1
B=$2
C=$3
D=$4
E=$5
set --
while read i
do
    sipp "$B" -i "$A" -m 1 -l 1 -s msml -sf "msml/$i/$i.xml" --trace_logs -timeout "$C" &
    # Use $1 $2 ... as a list of PIDs. The latest PID goes on the end.
    set -- "$@" $!
    # If we have more than 3 processes going, wait for the oldest one.
    if [ "$#" -gt 3 ]
    then
        wait $1
        shift
    fi
done < inputfile
wait
Last of all, if these processes are disk-intensive, running them in parallel may not make them faster anyway.
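As a side note, bash 4.3 and later also provide `wait -n`, which returns as soon as any one background job exits; that avoids tracking PIDs in `$1 $2 ...` by hand. A small sketch of the same idea, with `sleep` standing in for the sipp command and the same limit of 3 concurrent jobs:

```shell
#!/bin/bash
# Keep at most $max background jobs running at once.
max=3
for i in 1 2 3 4 5 6
do
    sleep 1 &    # stand-in for the real sipp invocation
    # While $max or more jobs are still running, wait for any one to finish.
    # "jobs -rp" prints one PID per running job; "wait -n" needs bash >= 4.3.
    while [ "$(jobs -rp | wc -l)" -ge "$max" ]
    do
        wait -n
    done
done
wait    # wait for the stragglers
echo "all jobs finished"
```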
Thanks a lot for suggesting these good practices. I will make sure I follow them in my script.
What is a good way to execute one script at a time?
The folder names go in the text file "testlist.txt"; the script should read one folder name, go into that folder, read the .xml file, execute the sipp command, write the result to a result file, and then move on to the next script.
When I ran the new code, it went into a loop and I was not even able to kill the processes.
Here is the flow I am trying to achieve:
I have an input file testlist.txt which contains folder names: msml1003, msml1004, msml1005, etc.
The folders msml1003, msml1004, etc. reside in a folder msml.
Inside the folders msml1003, msml1004, etc. are msml1003.xml, msml1004.xml, and so on.
My code should read the folder name from "testlist.txt", go to msml/msml1003/, and feed msml1003.xml to the sipp command given below. Once that is done, it should write pass or fail to the results.csv file according to the response from the server.
After completing one test case it should go to the next, and so on; once all are done it should exit automatically.
I have the following code. Please help me to complete the full cycle as described above.
#Checking the file existence
if [ -f testlist.txt ]
then
    echo "File testlist.txt Found"
else
    echo "File testlist.txt does not exist"
    exit 1
fi
if [ -d msml ]
then
    echo "msml directory exists"
else
    echo "Directory msml does not exist"
    exit 1
fi
#Deleting the old result.csv and starting a fresh one with a header row
rm -f result.csv
echo TestID,Remark,Result > result.csv
#Deleting the old log files
while read i
do
    rm -f "msml/$i/"*logs.log
done < testlist.txt
#echo >&2
echo "done deleting old logs. Press any key to continue"
read -sn 1
#Executing the test suite
while read i
do
    echo "Test $i is executing"
    if [ -f "msml/$i/$i.xml" ]
    then
        # No "&" here: each sipp run completes in the foreground
        # before the loop moves on to the next test.
        sipp $2 -i $1 -m 1 -l 1 -s msml -sf "msml/$i/$i.xml" --trace_logs -timeout $3
    else
        echo "msml/$i/$i.xml does not exist"
    fi
done < testlist.txt
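Putting the pieces together, one way the full cycle might look is sketched below. `run_one` is a hypothetical wrapper where the real sipp call goes (here it simply succeeds, so the sketch runs anywhere), and PASS/FAIL is decided from the exit status; sipp documents exit status 0 as "all calls successful", so treating non-zero as a failure is the assumption to verify against your sipp version:

```shell
#!/bin/sh
# Illustrative fixture matching the layout described above.
mkdir -p msml/msml1003 msml/msml1004
: > msml/msml1003/msml1003.xml
: > msml/msml1004/msml1004.xml
printf 'msml1003\nmsml1004\n' > testlist.txt

# Hypothetical stand-in; replace the body with the real call, e.g.
#   sipp "$2" -i "$1" -m 1 -l 1 -s msml -sf "msml/$i/$i.xml" --trace_logs -timeout "$3"
run_one() {
    true    # pretend the call passed
}

# Start result.csv fresh, header included, whether or not it existed before.
rm -f result.csv
echo "TestID,Remark,Result" > result.csv

while read -r i
do
    [ -n "$i" ] || continue                 # skip blank lines
    if [ -f "msml/$i/$i.xml" ]
    then
        rm -f "msml/$i/"*logs.log           # clear old logs for this test
        if run_one "$i"                     # sequential: no trailing "&"
        then
            echo "$i,completed,PASS" >> result.csv
        else
            echo "$i,sipp returned non-zero,FAIL" >> result.csv
        fi
    else
        echo "$i,xml not found,FAIL" >> result.csv
    fi
done < testlist.txt
```

With the calls running one at a time like this, the loop exits on its own after the last name in testlist.txt, which matches the flow described above.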