How to optimize and compress images for your webpage?
Having a fast site is important: your readers want your pages to load quickly, and speed matters to Google as well. If your site contains a lot of images, optimizing and compressing them is a must for fast load times, and you also get the benefit of consuming less bandwidth.
Websites mainly use two image formats, PNG and JPEG (jpg), and we are going to see how to optimize both of them.
Optimizing jpeg or jpg files for fast loading
We will use two programs for this, jpegoptim and jpegtran, so first let's install them (I am using Ubuntu):
sudo apt-get install jpegoptim libjpeg-progs
With jpegoptim we will apply lossy compression, capping quality at 80%, which still looks great on webpages:
find . -name "*.jpg" | xargs jpegoptim -m 80 -t
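To see how much you are saving, you can total the size of your JPEGs before and after the run. A sketch using GNU coreutils' `du` (the `--files0-from` option is assumed to be available on your system):

```shell
# Sum the size of every .jpg under the current directory.
# Run once before and once after jpegoptim to compare totals.
find . -name '*.jpg' -print0 | du -ch --files0-from=- | tail -n 1
```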
You can also use this command to compress them without losing quality:
find . -name "*.jpg" | xargs jpegoptim -t
With jpegtran we will convert those files into progressive images. A progressive JPEG loads in successive passes, refining the whole image as more data arrives, which makes it appear to load faster than a baseline image that renders top to bottom.
for img in *.jpg; do jpegtran -copy none -progressive -optimize -outfile "$img" "$img"; done
This line does not work with subfolders; I was not able to make the loop descend into them.
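One way to cover subfolders is to let `find` drive the loop instead of a shell glob. A sketch: each image is written to a temporary file first and only moved over the original once jpegtran succeeds, so an interrupted run cannot truncate a photo.

```shell
# Convert every .jpg in this directory tree to a progressive JPEG.
find . -name '*.jpg' -exec sh -c '
  for img do
    jpegtran -copy none -progressive -optimize -outfile "$img.tmp" "$img" &&
      mv "$img.tmp" "$img"
  done
' sh {} +
```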
Optimizing png files for fast loading
Now it is the turn of the PNG files. We will use optipng, so first install it:
sudo apt-get install optipng
This command is going to take some time, especially if you have lots of images.
find . -name "*.png" | xargs optipng -o5 -quiet -keep -preserve -log optipng.log
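Because of the `-keep` flag, optipng leaves a backup copy of every file it rewrites (assumed here to carry a `.bak` suffix; check one of your files first to confirm the naming on your version). Once you are happy with the optimized PNGs, you can remove the backups:

```shell
# Delete optipng's backup files after verifying the optimized PNGs.
# Assumes the backups end in .bak; confirm before running.
find . -name '*.bak' -delete
```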