What is a good command line tool to create screenshots of websites on Linux? I need to automatically generate screenshots of websites without human interaction. The only tool that I found was khtml2png, but I wonder if there are others that aren't based on khtml (i.e. have good JavaScript support, ...).
A little more detail might be useful...
Start a firefox (or other browser) in an X session, either on your console or using a vncserver. You can use the --height and --width options to set the size of the window to full screen. Another firefox command can be used to set the URL being displayed in the first firefox window. Now you can grab the screen image with one of several commands, such as the "import" command from the ImageMagick package, or using gimp, or fbgrab, or xv.
#!/bin/sh
# start a VNC server on a specific DISPLAY
vncserver :11 -geometry 1024x768
# start firefox in this vnc session (in the background, or the script would stop here)
firefox --display=:11 &
# give the browser a moment to start up
sleep 10
# read URLs from a data file in a loop
count=1
while read url
do
    # send the URL to the firefox instance already running on :11
    firefox --display=:11 "$url"
    # take a picture after waiting a bit for the load to finish
    sleep 5
    DISPLAY=:11 import -window root image$count.jpg
    count=`expr $count + 1`
done < url_list.txt
# clean up when done
vncserver -kill :11
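For reference, a minimal way to drive the script (the file name screenshot.sh is just a placeholder, and url_list.txt holds one URL per line):

# save the script above as screenshot.sh, then:
printf 'http://example.com/\nhttp://example.org/\n' > url_list.txt
sh screenshot.sh
ls image*.jpg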
-display :11 needed to be --display=:11. But you gave me a great starting point! Thanks for that! – Risk

Try the nice small tool CutyCapt, which depends only on Qt and QtWebkit. ;)
apt-get install cutycapt
– Gazelle
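For what it's worth, a typical CutyCapt call looks roughly like the following. It needs an X display, so running it under xvfb-run is common; the URL, output name and screen geometry here are placeholders, and you should check cutycapt --help for the exact options your build supports.

xvfb-run --server-args="-screen 0 1024x768x24" \
    cutycapt --url=http://example.com/ --out=example.png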
Have a look at PhantomJS, which seems to be a free, scriptable WebKit engine that runs on Linux, OS X and Windows. I've not used it yet, since we currently use Browshot (a commercial solution), but when all our credits run out we will seriously have a look at it (since it's free and can run on our servers).
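As a rough sketch of what a PhantomJS capture can look like (the script name, URL, viewport size and output file are all placeholders; the webpage-module calls are from PhantomJS's documented API, but verify them against your version):

# write a tiny PhantomJS script that loads a page and renders it to a PNG
cat > shot.js <<'EOF'
var page = require('webpage').create();
page.viewportSize = { width: 1024, height: 768 };
page.open('http://example.com/', function (status) {
    if (status === 'success') {
        page.render('example.png');
    }
    phantom.exit();
});
EOF
phantomjs shot.js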
scrot is a command line tool for taking screenshots. See the man page and this tutorial.
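A few typical scrot invocations, for example (these use standard scrot flags, but double-check with man scrot on your system):

scrot screenshot.png          # grab the whole screen
scrot -d 5 screenshot.png     # wait 5 seconds first, e.g. for a page to finish loading
scrot -u window.png           # grab only the currently focused window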
You might also want to look at scripting the browser. There are Firefox add-ons that take screenshots, such as Screengrab (which can capture the entire page if you want, not just the visible bit), and you could then script the browser with Greasemonkey to take the screenshots.
See Webkit2png.
I think this is what I used in the past.
Edit: I discovered I haven't actually used the above, but I found this page with reviews of many different programs and techniques.
I know it's not a command line tool, but you could easily script up something to use http://browsershots.org/. It's not that useful for applications not hosted on external IPs, but a great tool nonetheless.
I don't know of anything custom-built; I'm sure something could be done with the Gecko engine to render to a PNG file instead of the screen ...
Or, you could fire up firefox in full screen mode in a dedicated VNC server instance and use a screenshot grabber to take the screenshot. Fullscreen = minimal chrome, VNC server instance = no visible UI + you can choose your resolution.
Use xinit with Xvnc as the X server to do this - you'll need to read all the manpages.
Downsides are that the screenshot is always the same size and doesn't resize according to the web page ...
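A rough sketch of that approach (the paths, display number, geometry and wait time are assumptions to adapt; fullscreen mode would still have to be toggled in the browser itself):

# run firefox as the sole client of a dedicated Xvnc server on display :11
xinit /usr/bin/firefox http://example.com/ -- /usr/bin/Xvnc :11 -geometry 1024x768 &
sleep 15                                    # let the server start and the page load
DISPLAY=:11 import -window root shot.png    # grab the whole virtual screen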
There is the import command, but you'll need X, and a little bash script that opens the browser window, takes the screenshot, and closes the browser.
You can find more information here, or just by typing import --help in a shell ;)
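For example, a bare-bones version of such a script might look like this (the URL, output file and wait time are placeholders):

#!/bin/sh
# open the browser, wait for the page to load, grab the whole screen, then close the browser
firefox "http://example.com/" &
pid=$!
sleep 10
import -window root example.png
kill $pid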
http://khtml2png.sourceforge.net/
The deb file worked on my Ubuntu after installing libkonq4 ... but you may have to cover other dependencies.
I think JavaScript support may be better now!
Stephan
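For reference, khtml2png is invoked with a URL and an output file; window-size and delay options exist but their exact spelling differs between releases, so treat the line below as an assumption and check khtml2png --help:

khtml2png http://example.com/ example.png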
It's not a command line tool, but at least for batch operation on a larger set of URLs you can use Firefox with its add-on FireShot (licensed version?).
- Open tabs for all URLs in your set (e.g. "Open tabs for all bookmarks in this folder...").
- Then in FireShot launch "Capture all tabs".
- In the edit window, call "Select all shots -> Save all shots".
Having set the screenshot properties (size, file format, etc.) beforehand, you end up with a nice set of screenshot files.
Steffen