Shell or Perl loop to screenshot a list of URLs

I am trying to capture screenshots of a huge list of URLs. I can capture individual pages manually; for example, I run the following command to get a screenshot of foo.com:

$ python /path/to/screencapture.sh http://www.foo.com

I want to modify the script so that instead of manually entering the URL, I create a file with a unique URL on every line, and the script loops through until it captures a screenshot of every URL.

For example, the file (call it foofile) would look something like this:

http://www.google.com
http://www.yahoo.com
http://www.espn.com

With the tool you already have, you can write a wrapper script around it:

#!/bin/bash
# Read the URL list one line at a time and screenshot each page.
# IFS= and -r keep read from trimming whitespace or eating backslashes.
while IFS= read -r URL
do
    python /path/to/screencapture.sh "$URL"
done < file_containing_urls.dat
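
To run it, save the loop as a script (capture_all.sh is just an illustrative name), make it executable, and keep your URL list in file_containing_urls.dat, or change that filename in the script to match yours (e.g. foofile):

chmod +x capture_all.sh
./capture_all.sh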

Perfect, thanks. FYI, if anyone needs the screen capture script: webkit2png
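
Since webkit2png takes a URL directly as its command-line argument, the loop can also call it without a separate wrapper script. A minimal sketch, assuming webkit2png is on your PATH (output naming and other flags vary by version, so check webkit2png --help):

while IFS= read -r URL
do
    webkit2png "$URL"
done < file_containing_urls.dat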