processing with awk

I have many lines like the following in a file (there are also other kinds of lines):

  Host: 72.52.104.74 (tserv1.fmt2.he.net) Ports: 22/open/tcp//tcpwrapped///, 53/open/tcp//domain//PowerDNS 3.3/, 179/open/tcp//tcpwrapped///      Ignored State: closed (997)     Seq Index: 207  IP ID Seq: All zeros

and in these lines, the fields that start with "Host:", "Ports:", etc. are separated by tabs ('\t'). The number of fields may differ from line to line: one line may contain only the "Host:" and "Ignored State:" fields, while another has all the fields shown above.

Now, I want to process these lines so that the different fields are written into different files; for example, fields starting with "Host:" go into host.log, and so on.

------------------------------------------
The content of the file is like the following, and I want to process the 3rd and 7th lines:

# Nmap 5.51 scan initiated Wed Aug 22 18:42:27 2012 as: nmap -sS -P0 -O -sV -oN log -oX log.xml -oG log.grep -append-output 74.82.42.42
Host: 74.82.42.42 (ordns.he.net)        Status: Up
Host: 74.82.42.42 (ordns.he.net)        Ports: 22/open/tcp//tcpwrapped///, 53/open/tcp//domain//PowerDNS 3.3/, 179/open/tcp//tcpwrapped///      Ignored State: closed (997)       Seq Index: 196  IP ID Seq: All zeros
# Nmap done at Wed Aug 22 18:42:40 2012 -- 1 IP address (1 host up) scanned in 13.12 seconds
# Nmap 5.51 scan initiated Wed Aug 22 18:42:40 2012 as: nmap -sS -P0 -O -sV -oN log -oX log.xml -oG log.grep -append-output 72.52.104.74
Host: 72.52.104.74 (tserv1.fmt2.he.net) Status: Up
Host: 72.52.104.74 (tserv1.fmt2.he.net) Ports: 22/open/tcp//tcpwrapped///, 53/open/tcp//domain//PowerDNS 3.3/, 179/open/tcp//tcpwrapped///      Ignored State: closed (997)     Seq Index: 207  IP ID Seq: All zeros
# Nmap done at Wed Aug 22 18:43:00 2012 -- 1 IP address (1 host up) scanned in 19.48 seconds
.................
.................
 
grep "Host.*Ports" input.txt > host.log
awk '(/^Host/&&/Ports:/){print}' filename
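For reference, the two lines above only select the matching lines; they don't yet split the fields into separate files. A minimal awk sketch of the splitting step (assuming tab-separated "Keyword: value" fields and an input file hypothetically named input.grep; not the poster's exact code) could look like this:

```shell
# Create a one-line sample so the sketch is runnable on its own
printf 'Host: 74.82.42.42 (ordns.he.net)\tPorts: 22/open/tcp//tcpwrapped///\tSeq Index: 196\tIP ID Seq: All zeros\n' > input.grep

awk -F'\t' '/^Host:.*Ports:/ {
    for (i = 1; i <= NF; i++) {
        pos = index($i, ": ")          # split each field at the first ": "
        if (pos == 0) continue
        key = substr($i, 1, pos - 1)   # e.g. "Seq Index"
        val = substr($i, pos + 2)      # e.g. "196"
        gsub(/ /, "_", key)            # "Seq Index" -> "Seq_Index" for the filename
        print val >> (key ".log")      # append the value to <keyword>.log
    }
}' input.grep
```

This appends, so the per-keyword log files grow across all matching lines of the input.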

Thanks, but maybe my explanation in the original post was not clear. Actually, the file is like this:

# Nmap 5.51 scan initiated Wed Aug 22 18:42:27 2012 as: nmap -sS -P0 -O  -sV -oN log -oX log.xml -oG log.grep -append-output 74.82.42.42
Host: 74.82.42.42 (ordns.he.net)        Status: Up
Host: 74.82.42.42 (ordns.he.net)        Ports:  22/open/tcp//tcpwrapped///, 53/open/tcp//domain//PowerDNS 3.3/,  179/open/tcp//tcpwrapped///      Ignored State: closed (997)       Seq  Index: 196
# Nmap done at Wed Aug 22 18:42:40 2012 -- 1 IP address (1 host up) scanned in 13.12 seconds
# Nmap 5.51 scan initiated Wed Aug 22 18:42:40 2012 as: nmap -sS -P0 -O  -sV -oN log -oX log.xml -oG log.grep -append-output 72.52.104.74
Host: 72.52.104.74 (tserv1.fmt2.he.net) Status: Up
Host: 72.52.104.74 (tserv1.fmt2.he.net) Ports:  22/open/tcp//tcpwrapped///, 53/open/tcp//domain//PowerDNS 3.3/,  179/open/tcp//tcpwrapped///      Ignored State: closed (997)     Seq  Index: 207  IP ID Seq: All zeros
# Nmap done at Wed Aug 22 18:43:00 2012 -- 1 IP address (1 host up) scanned in 19.48 seconds
.................
.................

So there are many lines like the 3rd and the 7th lines. There are many fields in these lines, separated by '\t', and I want to extract the fields and put them into different log files.

For example, in Ports.log, I will have:

22/open/tcp//tcpwrapped///
53/open/tcp//domain//PowerDNS 3.3/
179/open/tcp//tcpwrapped/// 
 22/open/tcp//tcpwrapped///
53/open/tcp//domain//PowerDNS 3.3/
179/open/tcp//tcpwrapped///
....
....

Ports.log is special because its sub-fields are separated by ','.

In Seq_Index.log, I will have:

196
207
...
...

In ip_id.log, I will have:

All zeros
...
...

So my target is to get these log files.

I'm still not clear about your requirement.

Only these three files are to be generated.

Do you notice these two lines, the 3rd and the 7th?

Host: 74.82.42.42 (ordns.he.net)        Ports:  22/open/tcp//tcpwrapped///, 53/open/tcp//domain//PowerDNS 3.3/,  179/open/tcp//tcpwrapped///      Ignored State: closed (997)       Seq  Index: 196

Host: 72.52.104.74 (tserv1.fmt2.he.net) Ports:  22/open/tcp//tcpwrapped///, 53/open/tcp//domain//PowerDNS 3.3/,  179/open/tcp//tcpwrapped///      Ignored State: closed (997)     Seq  Index: 207  IP ID Seq: All zeros

I want to create several log files corresponding to 4 keywords: Ports, Seq Index, OS, IP ID.
But different lines may have different keywords: in these two lines there is no "OS", and in the first line there is no "IP ID".

In each log file, I want to have the character string that comes just after the keyword. For example, in Seq_Index.log I want to have 196 (from the 3rd line), 207 (from the 7th line), and so on.

Let me give you an analogy: there is a demographic list, listing people's names, sex, age, and location.
I want to get statistics like the proportion of males and the distribution of ages, so I want to get several files like sex.log, age.log, and location.log.

Let me know if this comes anywhere near your requirement:

perl -lne 'if(/Host.*Ports.*/){@a=split(/\t/);for($i=0;$i<=$#a;$i++){$a[$i]=~/\t*([\w\s]+):\s*(.*)/;open($1,">>$1.log");print $1 $2;close($1); }}' input_file

Oh, I'm not familiar with Perl; if there is a shell solution, that's better.
Let me try it first, thanks.


Thanks, it works fine except for Ports.log:

22/open/tcp//tcpwrapped///, 53/open/tcp//domain//PowerDNS 3.3/, 179/open/tcp//tcpwrapped///
22/open/tcp//tcpwrapped///, 53/open/tcp//domain//PowerDNS 3.3/, 179/open/tcp//tcpwrapped///
22/open/tcp//tcpwrapped///, 53/open/tcp//domain//PowerDNS 3.3/, 179/open/tcp//tcpwrapped///, 1723/open/tcp//pptp//linux (Firmware: 1)/
...
...

because there are sub-fields separated by ','.

My target is like:

22/open/tcp//tcpwrapped///
53/open/tcp//domain//PowerDNS 3.3/
179/open/tcp//tcpwrapped///
...
...
perl -lne 'if(/Host.*Ports.*/){@a=split(/\t/);for($i=0;$i<=$#a;$i++){$a[$i]=~/\t*([\w\s]+):\s*(.*)/;open($1,">>$1.log");print $1 $2;close($1); }}' input_file && sed -i 's/,/\n/g' Ports.log 
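The `sed -i 's/,/\n/g' Ports.log` at the end rewrites each comma as a newline in place, which is what produces one port entry per line. A self-contained sketch of just that post-processing step (using a hypothetical one-line sample and GNU sed; the `, *` pattern additionally drops the spaces after each comma, so the later lines don't start with a blank):

```shell
# Hypothetical one-line sample mirroring the Ports.log content shown above
printf '22/open/tcp//tcpwrapped///, 53/open/tcp//domain//PowerDNS 3.3/, 179/open/tcp//tcpwrapped///\n' > Ports.log

# GNU sed: replace each comma (and any spaces after it) with a newline, editing in place
sed -i 's/, */\n/g' Ports.log
```

Note that `-i` without a backup suffix and `\n` in the replacement are GNU sed behaviors; BSD sed would need a different invocation.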

Check this. I tried to solve it entirely within Perl, but I got stuck somewhere and ran out of time too.

It works, thanks!