Shell Script to Capture a Screenshot

Hi All,

Suppose I want to take a screenshot of a website, say Google, and save that image. How should I do it?

I tried wget for this, but to no avail. It just creates a file with an image extension, but on opening it the viewer says it is corrupted, although I can open and edit the same file as HTML.

wget "http://www.google.com" -O /tmp/abc.png

Can someone please help me on this?

T&R
Ankur

Naturally, renaming a text file to .png does not make it a PNG image.
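A quick way to see this is to inspect the first bytes of the file: a real PNG begins with a fixed magic signature, while a saved web page begins with HTML markup. A minimal local demonstration (the file content below is made up; no network involved):

```shell
# Simulate what wget saved: HTML text under a .png name (hypothetical content).
printf '<html><head><title>Google</title></head></html>\n' > /tmp/abc.png

# The first bytes give the real format away: '<htm', not the PNG signature.
head -c 4 /tmp/abc.png; echo
```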

What is your system?

Hi

Yup got your point.

System Details:

Linux 2.6.18-371.9.1.el5 #1 SMP Tue May 13 06:52:49 EDT 2014 x86_64 x86_64 x86_64 GNU/Linux

Thanks
Ankur

That's your kernel -- the 5 megabyte file which gets loaded when you turn on your computer. Everything else is decided by whatever your distribution is, which we still don't know.

Your window manager is particularly important here.

I am running Linux on a remote server and have connected to it via PuTTY.

My machine is Windows 64-bit, Build 7601 (6.1).

Thanks

Linux is the 5-megabyte file which loads when the server turns on. If you don't know what distribution it is, try uname -a and cat /etc/release

It's good to know you're not running a desktop on the Linux server. My first suggestion would have been to take a desktop screenshot, and that obviously isn't going to work...

Hi

Attached is a screenshot of uname -a.

Thanks

IMHO, using wget on a website is not the same as getting a screenshot. The best you can do, as far as I know, is to get all of the images, HTML, etc. using wget -m. It is roughly analogous to using Internet Explorer to save the website, so you end up with a directory of all of the files downloaded with the site. You can also try curl, but it probably works the same way.
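For reference, the mirror variant mentioned above would look like this; the flags are standard wget options, but the command is only echoed here since actually running it needs network access:

```shell
# -m: mirror recursively; -p: also fetch page requisites (images, CSS);
# -k: convert links in the saved HTML so the local copy renders properly.
cmd='wget -m -p -k http://www.google.com'
echo "$cmd"
```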

So is there any other way to capture the screenshot via the terminal?

You could have just copy-pasted instead of making a screenshot. I think you censored the bit I needed to know anyway. You also forgot cat /etc/release

You're in a terminal. There's no screen to shot.

haha, actually it was my server name. :P The screenshot was captured with the Snipping Tool in Windows. I have logged in to the remote server via PuTTY.

When you download a web page you get an *.html file, not *.png. You would have to display the *.html in a browser to see it as an image, and then take a screenshot of that image...

...unless you know of or can write an application that converts *.html to *.png?
Like this one maybe...
How to convert an HTML web page to PNG image on Linux - Xmodulo

Usage: CutyCapt --url=http://www.example.org/ --out=localfile.png

Also, here is the output of the other command:

bash-3.2$ cat /etc/redhat-release
Red Hat Enterprise Linux Server release 5.10 (Tikanga)


hi ankur328,

there are a lot of capture programs (the best known are CutyCapt, webkit2png, khtml2png, etc.)
but I guess you want to save it from the terminal, in a non-graphical environment.
[I'm not sure it can be done in the tty (in pts -> PuTTY SSH session) and without an X server.]

at this point I can describe my way, which uses X11 forwarding and a Firefox extension.
In short, it forwards the X11 requests (which will come from PuTTY) to your X11 server (I used the Xming server) and saves the page from Firefox.

1-) in the PuTTY settings (under SSH --> X11), check "Enable X11 forwarding"
2-) download and run Xming on your Windows machine.
3-) download the Firefox extension (commandlineprint2).
4-) copy the extension to your Red Hat server, load it into Firefox, and finally save the URL as an image.

# firefox -install-global-extension "cmdlnprint_0_5_2/cmdlnprint_0_5_2.xpi"
# firefox -print http://www.google.com -printmode PNG -printfile /tmp/abc.png

I know it's not what you want, but it can be tried :)

and my advice for an alternative method: it must be CutyCapt.
cutycapt
There, a method called "Using CutyCapt without X server" is described, and it seems very applicable.
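The "without X server" method generally amounts to wrapping CutyCapt in a virtual framebuffer via xvfb-run. A hedged sketch, assuming cutycapt and xvfb-run are installed (the script name and paths are examples, and the script is only syntax-checked here since those tools may be absent):

```shell
# Hypothetical wrapper script for headless capture.
cat > /tmp/shot.sh <<'EOF'
#!/bin/bash
# Render a URL to a PNG on a server with no running X display.
# xvfb-run starts a throwaway virtual X display for CutyCapt to draw into.
url="${1:?usage: shot.sh URL OUTFILE}"
out="${2:?usage: shot.sh URL OUTFILE}"
xvfb-run --server-args="-screen 0 1024x768x24" \
    cutycapt --url="$url" --out="$out"
EOF
chmod +x /tmp/shot.sh

# cutycapt/xvfb-run may not be installed, so only verify the syntax.
bash -n /tmp/shot.sh && echo "syntax OK"
```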

regards
ygemici

Just generally speaking of screenshots and shell scripts:

#!/bin/bash
# Abort unless an output filename was given.
[[ -z "$1" ]] && echo "screenshot: Requires a filename!" >&2 && exit 1
OF="$1"
# ImageMagick's import captures the root window, i.e. the whole screen.
import -window root "$OF"

But that's probably not what you wanted.
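The argument guard in that script can still be exercised without an X display by stubbing out ImageMagick's import with a plain echo (stub only; import itself needs a running X server):

```shell
# Same guard as the script above, with 'import' replaced by a stub
# (hypothetical /tmp path; the error message goes to stderr here).
cat > /tmp/screenshot.sh <<'EOF'
#!/bin/bash
[[ -z "$1" ]] && echo "screenshot: Requires a filename!" >&2 && exit 1
OF="$1"
echo "would run: import -window root $OF"
EOF
chmod +x /tmp/screenshot.sh

/tmp/screenshot.sh 2>&1 || true   # no argument: prints the error, exits 1
/tmp/screenshot.sh shot.png       # prints: would run: import -window root shot.png
```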