average transaction time

Hi all,

I have large daily log files that hold the times of requests and responses for different system requests. What I want to do is work out the average transaction time for the day (one log = one day).

The problem I'm having is figuring out how to skip rows. I've sorted the output by uniq trans id and worked out the time taken, but skipping to the next uniq trans id (two lines down) is proving difficult. Any suggestions?

Below is an example of the extracted lines.

Date/time, trans, link name, uniq trans id, trans status

20070422095251 REQUEST - LINK_2:<437288668/1>:
20070422095251 RESPONSE - LINK_2:<437288668/1>:0:
20070422095251 REQUEST - LINK_2:<437288672/1>:
20070422095251 RESPONSE - LINK_2:<437288672/1>:0:
20070422095251 REQUEST - LINK_2:<437288699/1>:
20070422095252 RESPONSE - LINK_2:<437288699/1>:0:
20070422095252 REQUEST - LINK_2:<437288698/1>:
20070422095252 RESPONSE - LINK_2:<437288698/1>:0:
20070422095252 REQUEST - LINK_2:<437288730/1>:
20070422095253 RESPONSE - LINK_2:<437288730/1>:0:
20070422095253 REQUEST - LINK_2:<437288750/1>:
20070422095253 RESPONSE - LINK_2:<437288750/1>:0:

Could you tell us what output you want from the listing given?

Hi,

All I plan to display is the number of transactions per second or the seconds per transaction (whichever I decide).

From the log I cut out the time and trans id fields, which produces a file sorted by transaction id number.
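(For illustration only, that extraction step could look something like the line below; the exact fields and the file name day.log are assumptions based on the sample above, not the actual command used.)

awk '{ match($0, /<[^>]*>/); print substr($0, RSTART, RLENGTH), $1, $2 }' day.log | sort > by_id.txt
# each line of by_id.txt: <uniq trans id> <timestamp> <REQUEST|RESPONSE>,
# so the request and response for a given id end up on adjacent lines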

I then want to go through and work out the time taken for each transaction. The problem is I don't know how to traverse the file and work out the time for each transaction. Should I use a while read loop? I'm not sure.
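One possible approach, instead of a while read loop, is to let awk pair each RESPONSE with its REQUEST by keying on the uniq trans id, so the lines do not even need to be adjacent or pre-sorted. A rough sketch, assuming gawk (for its mktime function), the field layout shown in the sample above, and a hypothetical file name day.log:

gawk '
function epoch(ts) {    # YYYYMMDDHHMMSS -> seconds since the epoch
    return mktime(substr(ts,1,4) " " substr(ts,5,2) " " substr(ts,7,2) " " \
                  substr(ts,9,2) " " substr(ts,11,2) " " substr(ts,13,2))
}
{
    match($0, /<[^>]*>/)                  # uniq trans id, e.g. <437288668/1>
    id = substr($0, RSTART, RLENGTH)
}
$2 == "REQUEST"  { req[id] = epoch($1); next }
$2 == "RESPONSE" && (id in req) {
    total += epoch($1) - req[id]          # seconds for this transaction
    count++
    delete req[id]
}
END {
    if (count)
        printf "%d transactions, average %.2f seconds each\n", count, total / count
}' day.log

A while read loop in the shell would work too, but it tends to be much slower on a large daily log, and you would still need something like date or awk to turn the YYYYMMDDHHMMSS stamps into seconds before subtracting.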