Sorting data out of a log file

I have a logfile that looks like this:

 [2013-08-16 03:24:39,192][INFO ][AbstractServiceInvoker          [SID:187175558][] Received request of type [ReportNotification]

I need to write out every line that contains [INFO ] to a file named after the SID, like 187175558_xyz.txt.

Thanks for helping!

---------- Post updated at 08:11 AM ---------- Previous update was at 08:10 AM ----------

I got this one-liner:

awk -F'[]:]' '/INFO/ {print > ($6 "_xyz.txt")}' file

How can I do this in a script file?
Thanks

put

awk -F'[]:]' '/INFO/ {print > ($6 "_xyz.txt")}' $1 

in a file, make it executable, and test... it needs the filename as an argument ($1).
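A minimal sketch of such a script (the name sid_split.sh is an assumption, and it falls back to a built-in demo line so it can be tried without a real log):

```shell
#!/bin/sh
# sid_split.sh (hypothetical name): write every INFO line of a log
# to a per-SID file, e.g. 187175558_xyz.txt.
# Usage: sid_split.sh logfile   (uses a small demo log if none is given)

log=${1:-demo.log}

# Demo input so the script can be tried without a real log file.
if [ ! -f "$log" ]; then
    printf '%s\n' \
      '[2013-08-16 03:24:39,192][INFO ][AbstractServiceInvoker          [SID:187175558][] Received request of type [ReportNotification]' \
      > "$log"
fi

# -F'[]:]' splits fields on "]" and ":"; for the sample line above,
# $6 is the SID (187175558), so each INFO line lands in <SID>_xyz.txt.
awk -F'[]:]' '/INFO/ {print > ($6 "_xyz.txt")}' "$log"
```

With the real log, you would run it as `sid_split.sh newcs-in.log`.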

The filename is newcs-in.log

What I meant was using the script, you type

 scriptname filename

of course the output of your one-liner would be STDOUT...

It would be great if the output went to a file: every line that contains INFO must go into a file named after the SID data, like this:

[2013-08-16 03:24:39,192][INFO ][AbstractServiceInvoker          [SID:187175558][] Received request of type [ReportNotification]

To a file named: 187175558_sid.txt

Every other line that contains INFO should go to a new file.

Should work as is. Why don't you try and post the result?

Tried it. The result is 2 files, each with a lot of lines containing the INFO message. That's not a problem, but I need every line that contains an INFO message to go into a new file named after that line's SID number.
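For what it's worth, the one-liner already writes one file per distinct SID, so two output files usually means the log only has two distinct SIDs. One thing worth guarding against with many distinct SIDs is awk's open-file limit; a sketch of a variant that appends and closes each file after every write (again with a built-in demo log as an assumption, file name demo_sids.log):

```shell
#!/bin/sh
# Variant of the one-liner that appends to each <SID>_xyz.txt and closes
# it after every write, so a log with thousands of distinct SIDs cannot
# exhaust awk's limit on simultaneously open files.
# Note: ">>" appends, so re-running accumulates lines in existing files.

log=${1:-demo_sids.log}

# Demo input with two distinct SIDs.
if [ ! -f "$log" ]; then
    printf '%s\n' \
      '[2013-08-16 03:24:39,192][INFO ][AbstractServiceInvoker          [SID:187175558][] Received request of type [ReportNotification]' \
      '[2013-08-16 03:25:01,001][INFO ][AbstractServiceInvoker          [SID:187175559][] Received request of type [ReportNotification]' \
      > "$log"
fi

awk -F'[]:]' '/INFO/ {f = $6 "_xyz.txt"; print >> f; close(f)}' "$log"
```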

Post (or better: attach) your input file, and the resulting output files (ls output).