I would like to download a local copy of a web page and get all of the css, images, javascript, etc.
In previous discussions (e.g. here and here, both of which are more than two years old), two suggestions are generally put forward: wget -p and httrack. However, both suggestions fail. I would very much appreciate help with using either of these tools to accomplish the task; alternatives are also lovely.
Option 1: wget -p
wget -p successfully downloads all of the web page's prerequisites (css, images, js). However, when I load the local copy in a web browser, the page is unable to load the prerequisites because the paths to those prerequisites haven't been modified from the version on the web.
For example:
- In the page's html,
<link rel="stylesheet" href="/stylesheets/foo.css" />
will need to be corrected to point to the new relative path of foo.css
- In the css file,
background-image: url(/images/bar.png)
will similarly need to be adjusted.
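As a quick manual workaround, the absolute paths can be rewritten in place with sed. This is only a sketch: the patterns assume the exact quoting shown here and that the assets were saved into ./stylesheets and ./images next to index.html (the sample file below just demonstrates the substitution).

```shell
# Create a sample page with an absolute stylesheet path (stand-in for wget -p output).
printf '<link rel="stylesheet" href="/stylesheets/foo.css" />\n' > index.html

# Rewrite the absolute path to a relative one; -i.bak keeps a backup copy.
sed -i.bak 's|href="/stylesheets/|href="stylesheets/|g' index.html

# index.html now references stylesheets/foo.css relative to its own location.
```

The same substitution pattern works for url(/images/...) references inside the downloaded css files.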
Is there a way to modify wget -p so that the paths are correct?
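One flag worth noting here: wget's -k (--convert-links) option rewrites the links in the downloaded documents so that they point at the local copies. A minimal invocation (example.com is a placeholder URL) would be:

```shell
# -p fetches the page's requisites; -k converts links in the saved files
# so they reference the local copies instead of the original server paths.
wget -p -k http://example.com/page.html
```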
Option 2: httrack
httrack seems like a great tool for mirroring entire websites, but it's unclear to me how to use it to create a local copy of a single page. There is a great deal of discussion in the httrack forums about this topic (e.g. here) but no one seems to have a bullet-proof solution.
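A sketch of a single-page httrack invocation, under the assumption that limiting the recursion depth while still fetching "near" files does what we want (example.com is a placeholder; flags are from httrack's documented options, but the exact depth behavior may need tuning):

```shell
# -O   sets the local output directory
# -r1  limits recursion to the starting page itself
# -n   (--near) also fetches non-HTML files (css, images, js) the page
#      references, even when they live on other hosts
httrack "http://example.com/page.html" -O ./localcopy -r1 -n
```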
Option 3: another tool?
Some people have suggested paid tools, but I just can't believe there isn't a free solution out there.
wget -E -H -k -K -p http://example.com
- only this worked for me. Credit: superuser.com/a/136335/94039 – Delatorre

wget --random-wait -r -p -e robots=off -U mozilla http://www.example.com – Uzzi
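For reference, here is what each flag in the first one-liner does, per wget's own documentation (example.com is the placeholder from the comment above):

```shell
# -p  --page-requisites : download the css, images, js needed to render the page
# -k  --convert-links   : rewrite links in the saved files to point at local copies
# -K  --backup-converted: keep a .orig backup of each file before converting it
# -E  --adjust-extension: save pages served without an .html suffix as .html
# -H  --span-hosts      : also fetch requisites hosted on other domains (e.g. CDNs)
wget -E -H -k -K -p http://example.com
```

The -k/-E combination is what fixes the broken-path problem described in Option 1: links are rewritten after download to match the local file layout.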