get homepage and save it

Hello everyone,

I am searching for a way to open a webpage from the shell and then save its source to a file. Simply using wget doesn't work, as there is java involved.
So my idea was to open the URL in firefox (or any other browser) where it can be opened, and then save the source to a file.

Unfortunately I did not find any way to make firefox automatically write out the source of the opened page (...without further clicking...). Is there a way to do this from the shell? Maybe with another browser?

Any ideas from all the smart folks around here?

Thanks, cheers

shimaric

I don't understand the problem with wget.
Do you want to execute the java, not just copy it?
Do you need to also get links included in the page? (Try the -r switch.)
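If a plain page grab is actually what you need, a minimal sketch (the URL and output filename are placeholders, not from your setup):

```shell
#!/bin/sh
# Save a single page's source to a file (URL and filename are examples).
wget -O homepage.html "http://www.example.com/"

# Recursively fetch the page plus linked pages, one level deep:
wget -r -l 1 "http://www.example.com/"
```

The -O switch writes the page to a file of your choosing instead of a name derived from the URL.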

If the file you feed wget with -i has a funny extension, you can add --force-html so wget treats it as HTML anyway and follows the links inside it.
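For example (urls.txt is a made-up filename, assuming it is a local file full of links):

```shell
# urls.txt is a local file containing links; --force-html tells wget
# to parse it as HTML even though its name doesn't end in .html.
wget --force-html -i urls.txt
```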

Finally, if you are trying to retrieve source as it looks before any server-side java executes, I think you are out of luck.

Also note that the curl command does the same thing wget does, but with a different syntax. If there is some wget peculiarity you don't like, curl may work better for you.
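A curl sketch of the same single-page fetch (again, URL and filename are placeholders):

```shell
# -s silences the progress meter; -o names the output file.
curl -s -o homepage.html "http://www.example.com/"

# Or print the source straight to stdout:
curl -s "http://www.example.com/"
```

Note that unlike wget, curl prints to stdout by default, which makes it easy to pipe into other tools.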

:b:

That solved everything! :slight_smile: Thanks for your instant help!

cheers shimaric