Shell script to compress all JPG images for whole website directories with jpegoptim

One of the main factors in website optimization is reducing page size by compressing images. Large, high-quality images can noticeably slow a site down, especially for the many users browsing on mobile devices over 3G connections.
Website speed is also one of the main factors in Google's search ranking.

Google research shows that a page that takes 6 seconds or more to load can see its bounce rate increase by 100% or more.

One of the optimization techniques is to compress images to reduce overall page size:

According to Google:

Compressing images and text can be a game changer—30% of pages could save more than 250KB that way. Our analysis shows that the automotive, technology, and business and industrial market sectors have the most room for improvement.

jpegoptim is an image optimization and compression tool available for Linux, FreeBSD, and other Unix-like systems. Its lossless mode works by optimizing the JPEG's Huffman tables; Huffman encoding assigns the shortest bit codes to the most frequent values in the input stream, so the same image data takes fewer bits.
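Before running it across a whole directory tree, it helps to see what a single invocation looks like. A minimal sketch is below; the filename photo.jpg is a hypothetical placeholder, and the -m85 quality cap is just an example value (omit -m to keep the optimization fully lossless):

```shell
#!/bin/bash
# Minimal usage sketch for jpegoptim on one file (photo.jpg is hypothetical).
# -m85 caps quality at 85, enabling a lossy re-encode when it saves space;
# --strip-all removes metadata such as EXIF tags and comments.
if command -v jpegoptim >/dev/null 2>&1; then
  status="available"
  jpegoptim -m85 --strip-all photo.jpg || true  # placeholder file may not exist
else
  status="missing"
  echo "jpegoptim not found -- install it first (e.g. apt install jpegoptim)"
fi
```

By default jpegoptim overwrites the file in place and prints a one-line report showing the before/after sizes and the percentage saved.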

I created this script to find all JPG images under the current directory (which can be a website's root directory), then feed the found files through a loop so jpegoptim compresses each one. Finally, it writes each image's optimization output to a "jpegoptimLog" log file instead of the terminal.
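A sketch of such a script is below. It follows the description above, except that it uses a while-read loop over null-delimited find output rather than a plain for loop, so filenames containing spaces are handled safely; the log filename jpegoptimLog comes from the text:

```shell
#!/bin/bash
# Sketch: compress every JPG under the current directory with jpegoptim,
# logging each file's result to jpegoptimLog instead of the terminal.

LOG="jpegoptimLog"
: > "$LOG"  # start with a fresh, empty log file

# Find all .jpg/.jpeg files (case-insensitive) under the current directory,
# e.g. a website root. -print0 plus read -d '' keeps filenames with spaces
# or newlines intact.
find . -type f \( -iname '*.jpg' -o -iname '*.jpeg' \) -print0 |
while IFS= read -r -d '' img; do
  # Compress each image in place; append jpegoptim's per-file report
  # (and any errors) to the log file.
  jpegoptim --strip-all "$img" >> "$LOG" 2>&1
done
```

After the run, inspecting jpegoptimLog shows one line per image with its original size, optimized size, and the percentage saved, which makes it easy to total up the savings for the whole site.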