Take webpage screenshot from command line in Ubuntu Linux

There are many instances when you want to take a screenshot of a webpage from within a PHP script or from the command line. On Ubuntu there are several ways to do it, and most of them produce a WebKit-, Gecko- or KHTML-rendered screenshot image.

Some of the methods require an X session to open a window and take the screenshot, so on servers they can be run inside a VNC session, for example. To set up a VNC server on Ubuntu, read this article.

1. wkhtmltopdf

wkhtmltopdf is a command line utility that converts HTML to PDF using the WebKit rendering engine.

Advantages :
1. Can automatically determine the height of the page to take full page screenshots unlike most other utilities.

Disadvantages :
1. Fails frequently, for unknown reasons, with the error "Painter not active".
2. Cannot render Cufón fonts or Flash animations. It sometimes even fails on jQuery animations that take a long time to load.
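The same package ships a companion tool, wkhtmltoimage, which renders a page straight to an image instead of a PDF. A minimal sketch, assuming both tools are installed from the Ubuntu repositories (the URLs and output file names are just examples):

```shell
# Render a page to PDF with wkhtmltopdf,
# or straight to a PNG screenshot with wkhtmltoimage.
# Assumes: sudo apt-get install wkhtmltopdf
wkhtmltopdf http://www.google.com/ google.pdf
wkhtmltoimage http://www.google.com/ google.png
```

On a headless server these may need a virtual X display (see the Xvfb workaround below for CutyCapt, which applies here as well).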

2. CutyCapt

CutyCapt is a utility that takes a screenshot of a URL using the WebKit rendering engine and saves it to an image file.

Url
http://cutycapt.sourceforge.net/

Install

sudo apt-get install subversion libqt4-webkit libqt4-dev g++ cutycapt

Usage
To use CutyCapt, simply run the command from the terminal, providing the URL and the name of the output file.

$ cutycapt --url=http://www.google.com/ --out=google.png

This creates a google.png file in the current working directory containing a screenshot of www.google.com.

Advantages

1. Simple to use: a single command with a URL and an output file name.

Disadvantages

1. Fails to render CSS3 web fonts.

2. Cannot automatically determine the page height to take a full page screenshot.
Workaround: if the page dimensions are known, they can be passed as the screen size; combined with a virtual monitor, this can produce full page screenshots.

3. Opens an annoying browser window on the desktop every time.
Workaround: use Xvfb or VNC.
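The two workarounds above can be combined: run CutyCapt under a throwaway Xvfb display with a fixed virtual screen size, so no window appears on the desktop and the output has known dimensions. A sketch, assuming the cutycapt and xvfb packages are installed; the geometry used here is just an example:

```shell
# Run CutyCapt inside a temporary virtual framebuffer.
# xvfb-run starts an Xvfb server, runs the command, then tears it down.
# Assumes: sudo apt-get install cutycapt xvfb
xvfb-run --server-args="-screen 0, 1280x1024x24" \
    cutycapt --url=http://www.google.com/ \
             --out=google.png \
             --min-width=1280 --min-height=1024
```

This is also the usual way to call CutyCapt from a cron job or a web backend, since no real X session is needed.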

The above tools and techniques can also be called from a web server, for example from a PHP script.

If you just need a simple and quick solution, I recommend CutyCapt. But if you need a stable, high performance system that can handle 1M requests per day, CutyCapt will be a nightmare. I ran into a lot of problems when I tried to use CutyCapt on 1M different websites, but after changing its source code it works fine. I also recommend resizing the image inside CutyCapt itself; it saves around 1 second per request. BTW, page2images.com (multiple device screenshots) is built on top of CutyCapt. If you are interested, please take a look.

Hey, nice tutorial. I tried wkhtmltoimage and just wanted to share something. The command gives some weird errors/warnings like these:

$ ./wkhtmltoimage-amd64 http://google.com google.png
Loading page (1/2)
Rendering (2/2)
QPixmap: Cannot create a QPixmap when no GUI is being used ] 25%
QPixmap: Cannot create a QPixmap when no GUI is being used
Done

Although the screenshot *does get* generated. So, no worries I guess :D
