How to read file line by line and compare subset of 1st line with 2nd?

Hi all,

I have a log file, say Test.log, that gets updated continuously and contains data in pipe-separated format. A sample log file would look like:

<date1>|<data1>|<url1>|<result1>
<date2>|<data2>|<url2>|<result2>
<date3>|<data3>|<url3>|<result3>
<date4>|<data4>|<url4>|<result4>

What I want to do is compare the first line with the second, the second with the third, and so on, to see if url1 is the same as url2, url2 the same as url3, etc.

I am trying to do this by copying the contents of Test.log into a file File1.txt (I am going to do this periodically, maybe a few times a day).
Then read File1.txt line by line and get url1 into a variable, say Var1, by using

cut -d '|' -f3 <line1 of File1.txt>

and somehow be able to compare it with a variable Var2, which I am hoping to get by using

cut -d '|' -f3 <line2 of File1.txt>


Problems I am facing here are:
1) cut treats each space-separated word as a separate field when I try to use it while reading File1.txt line by line in a loop like:

 
 for i in `cat File1.txt`
     do
         cut -d '|' -f3 $i
     done
 

OR

 
 while read i
     do
         cut -d '|' -f3 $i
     done < File1.txt
 

2) I have no idea how to compare url1 with url2, url2 with url3, and so on.

Can anyone please tell me whether what I am trying is even possible and, if it is, how I can do it?

Thanks in advance.

Why don't you read the file, using "|" as the IFS character?

while IFS="|" read A B URL C
do
    echo "$URL"
done < file
<url1>
<url2>
<url3>
<url4>

In the loop, save URL to OLDURL, and compare e.g.

[ "$URL" != "$OLDURL" ] && echo different || echo equal
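Putting the read loop and the comparison together, here is a minimal, self-contained sketch of that approach. The sample dates, data and URLs are made-up values, and /tmp/Test.log is just a stand-in for the real log file; the first record is skipped because there is nothing before it to compare against:

```shell
# Create a small sample log in the pipe-separated format from the
# question (hypothetical values).
printf '%s\n' \
    'date1|data1|http://a|ok' \
    'date2|data2|http://a|ok' \
    'date3|data3|http://b|ok' > /tmp/Test.log

# Compare each record's url (3rd field) with the previous record's url.
OLDURL=
while IFS='|' read -r DATE DATA URL RESULT
do
    if [ -n "$OLDURL" ]; then
        if [ "$URL" = "$OLDURL" ]; then
            echo "equal:     $URL"
        else
            echo "different: $OLDURL -> $URL"
        fi
    fi
    OLDURL=$URL
done < /tmp/Test.log
# prints:
# equal:     http://a
# different: http://a -> http://b
```

Note the quoting around $URL and $OLDURL in the test: without it, a field containing spaces or glob characters could break the comparison.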

If you want to compare the 3rd field of adjacent records:

awk -F'|' 'prev != $3 { print } { prev = $3 }' file

This prints every record whose 3rd field differs from the previous record's; the first record always prints, since prev starts out empty.
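As a quick sanity check, the one-liner (with the field separator set via -F'|') can be tried on a small sample file; the values below are made up, and /tmp/Test.log is just an assumed scratch location:

```shell
# Sample log in the question's pipe-separated format (hypothetical values).
printf '%s\n' \
    'date1|data1|http://a|ok' \
    'date2|data2|http://a|ok' \
    'date3|data3|http://b|ok' > /tmp/Test.log

# Print each record whose 3rd field (url) differs from the previous one's.
awk -F'|' 'prev != $3 { print } { prev = $3 }' /tmp/Test.log
# prints:
# date1|data1|http://a|ok
# date3|data3|http://b|ok
```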

But what result (output) is desired?

Thanks for the replies RudiC and derekludwig.
I am trying both of your suggestions.

@derekludwig, the result field from the log file is just a field which is not important.
I just need to compare the url field.

I am trying both the suggestions and will post the outcome here soon.


RudiC, I followed your suggestion and it worked perfectly for me.
Thank you both, RudiC and derekludwig.