searching a log file and appending to a .txt file

I'm new to shell scripting and am writing a script to help me log the free memory and hard disk space on a server. As of now, the script just runs 'df -h', writes the output to a log file, then runs 'top' and appends its output to the same file.

What I want to do is have the script also search the log file for the specific values I'm tracking and append them to another file for me.

As of now this is all I really have...

df -h > log
top >> log

Any help is appreciated. Thanks!

p.s. I'm using Solaris 10

Do a man for "grep".

It would be something like this:

Basic:
grep "Value1" log > second.log
grep "Value2" log >> second.log

Moderate:
Put the values into a file named values.txt

cat /dev/null > second.log
for i in `cat values.txt`
do
    grep "$i" log >> second.log
done
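
For example, if values.txt holds one single-word pattern per line, say (these are made up):

Memory
swap

then second.log ends up with every line of log containing "Memory", followed by every line containing "swap".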

Let me know if it works for you.

Yes, this much I've been able to figure out, though I don't quite understand the need for the cat calls in your moderate example.

After googling for a while, I figured I could pull out just the string using grep -o pattern filename, but that doesn't work for me. Next I tried to pull out just the substring I need using the code below, but that isn't working either...

freeMem=`grep ",.*free mem," testlog.txt`
echo ${freeMem:22:2} ## get an error that says bad substitution.
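
(I suspect the substring syntax itself is part of the problem: ${var:offset:length} is a bash/ksh93 feature, so if the script runs under /bin/sh or ksh88 on Solaris 10 it isn't supported, which would explain the "bad substitution" error. Likewise, -o is a GNU grep option that the stock Solaris grep doesn't have. A rough sed alternative, assuming the top line looks something like "... 1024M free mem, ...":)

freeMem=`sed -n 's/.*, *\([^,][^,]*\) free mem,.*/\1/p' testlog.txt`
echo $freeMem   # with the assumed format, prints something like 1024M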

Which "cat"?

The first one clears out second.log
The second one feeds the contents of values.txt to the for loop; each word from the file becomes $i in turn.
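
One thing to watch: the backtick form splits on every space, so a value like "free mem" would turn into two separate patterns. If you need multi-word patterns, a while/read loop takes the file a line at a time (same file names as above, just a sketch):

cat /dev/null > second.log
while read pattern
do
    grep "$pattern" log >> second.log
done < values.txt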