Checking the response for URLs - Perl

Hi everybody.

I'm trying to extract a bunch of URLs from an HTTP response, but when I do that I get nothing.

Here is my code:

use LWP::UserAgent;

$url = $ARGV[0];
$ua  = LWP::UserAgent->new;
$req = HTTP::Request->new(GET => $url);
$res = $ua->request($req);

if ($res->is_success) {
    @a = $res->decoded_content;
    @u = grep /COM/, @a;
    print @u;
}
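I wonder if the problem is that decoded_content comes back as one big string, so grep only ever sees a single "line". Maybe I need to split it first, something like this (just a guess, untested):

use LWP::UserAgent;

$url = $ARGV[0];
$ua  = LWP::UserAgent->new;
$res = $ua->request(HTTP::Request->new(GET => $url));

if ($res->is_success) {
    # split the page into lines before grepping -- not sure this is the right idea
    @a = split /\n/, $res->decoded_content;
    @u = grep { /com/i } @a;    # keep lines that mention "com"
    print "$_\n" for @u;
}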

And I also tried this:

use LWP::UserAgent;

$url = $ARGV[0];
$ua  = LWP::UserAgent->new;
$req = HTTP::Request->new(GET => $url);
$res = $ua->request($req);

if ($res->decoded_content =~ /COM/) {
    print $res->decoded_content =~ /COM/;
}
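Another thing I'm wondering: does the match need to be in list context, with a capture group and the /g flag, so it returns the matched text instead of just true/false? Something along these lines (guessing from perlre, untested):

use LWP::UserAgent;

$url = $ARGV[0];
$ua  = LWP::UserAgent->new;
$res = $ua->request(HTTP::Request->new(GET => $url));

# capture everything that looks like a link, globally, in list context
@u = $res->decoded_content =~ m{(https?://\S+)}ig;
print "$_\n" for @u;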

Usage: perl $0 <url>. I ran both against the UNIX and Linux Forums home page and neither of them gave any URLs. I also read about HTTP::Parser, and I couldn't understand it.
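While searching I also stumbled on HTML::LinkExtor. From my reading of its docs, pulling the links out of a page would look roughly like this (not sure I have the callback right, so treat it as a sketch, and it assumes HTML::LinkExtor is installed):

use LWP::UserAgent;
use HTML::LinkExtor;

$url = $ARGV[0];
$ua  = LWP::UserAgent->new;
$res = $ua->request(HTTP::Request->new(GET => $url));

# collect every link-type attribute (href, src, ...) the parser finds
@links = ();
$p = HTML::LinkExtor->new(sub {
    my ($tag, %attr) = @_;
    push @links, values %attr;
});
$p->parse($res->decoded_content);

print "$_\n" for @links;

Is that the sort of thing I should be doing, or is there a simpler way?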