download files

Hi,
I'm having trouble getting a script to download some files via HTTP.

The Perl script lives on a Linux server ("/var/www/cgi-bin/down.pl"), where I can run it through Internet Explorer.

I'm able to display the contents of a directory ("/home/toto"), but I'm not able to open or save the files from this directory to my local machine (Windows XP, for example).

My goal is to click on a file and save it to a location of my choice on my local machine.

Can someone help me fix this script? Thanks in advance.

#!/usr/bin/perl
use CGI;
$| = 1;
my $query = new CGI;

# Send the HTTP header before any HTML output.
print $query->header('text/html');

print qq~
<table align="center" width="90%">
<tr><td><font color="black" face="Arial" size="4"><b>Files in current directory</b></font></td></tr>
<tr>
<td valign="top">
<font color="black" face="Arial" size="2">
~;

$dir = "/home/toto";
opendir(DIR, $dir) or die "Cannot open $dir: $!";
@current = readdir(DIR);
closedir(DIR);

foreach (@current) {
    # Skip '.' and '..' and subdirectories; keep plain files only.
    unless ($_ eq '.' || $_ eq '..' || -d qq~$dir/$_~) {
        push(@currentfiles, $_);
    }
}
@currentfiles = sort { uc($a) cmp uc($b) } @currentfiles;
for ($aa = 0; $aa <= $#currentfiles; $aa++) {
    print qq~
<a href="$dir/$currentfiles[$aa]" target="_blank">$currentfiles[$aa]</a><br>
~;
}

print qq~
</font></td></tr>
</table>
~;

The links won't work unless your webserver is configured to allow access to those files, and your webserver can probably produce its own listing of them anyway, no script required.
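
If what you actually want is a "Save as..." dialog when a file name is clicked, have the script send the file itself with a Content-Disposition: attachment header rather than linking straight to /home/toto (which is outside your document root, so Apache won't serve it directly). Here is a rough, untested sketch; I'm assuming the handler stays at /cgi-bin/down.pl and receives the name in a file=NAME parameter (both are just example names):

[code]
#!/usr/bin/perl
# down.pl?file=NAME -- sketch of a download handler for files in /home/toto
use strict;
use warnings;
use CGI;
use File::Basename;

my $dir   = "/home/toto";
my $query = CGI->new;

# Keep only the base name so "../" tricks cannot escape $dir.
my $file = $query->param('file');
$file = defined $file ? basename($file) : '';

if ($file ne '' && -f "$dir/$file") {
    # Ask the browser to save the file instead of trying to display it.
    print $query->header(
        -type       => 'application/octet-stream',
        -attachment => $file,
    );
    open(my $fh, '<', "$dir/$file") or die "Cannot open $dir/$file: $!";
    binmode $fh;
    binmode STDOUT;
    print while <$fh>;    # stream the file to the browser
    close $fh;
}
else {
    print $query->header(-status => '404 Not Found', -type => 'text/plain');
    print "No such file.\n";
}
[/code]

Then make your listing link to the script instead of the filesystem path, something like <a href="/cgi-bin/down.pl?file=$currentfiles[$aa]">$currentfiles[$aa]</a> (URL-escape the name, e.g. with CGI::escape, if it can contain spaces). If all you need is a plain listing with downloadable files, an Apache Alias plus Options Indexes on that directory would do the same job with no script at all.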

Also, please put your code in [ code ] code tags [ /code ]; it makes it much more readable.