WordPress takes care of “optimizing” images so you don’t have to
Optimizing images for WordPress. There’s been a lot of digital ink spilled on the subject. There are tons of urban myths swirling around. There’s stuff that’s true, stuff that was true five years ago, stuff that was never true, and stuff that’s way overcomplicated or just plain wrong.
But the real lowdown, circa early 2020, is stupid simple.
You don’t optimize images for your WordPress site. WordPress does it.
All you have to do is upload a good quality image, at the largest size your site will need, saved at a JPEG compression of 82 or higher.
And, by the way, make sure you have ImageMagick enabled as your imaging library.
That’s it.
Now here’s the long-read version
That’s too easy. Some explanation is probably a good idea. Grab a cup of coffee. We’ll go through this parameter by parameter. But if you’re a tl;dr type of person, you’ve already got the straight dope.
First, let’s take a look at what WordPress does with images:
Size matters
When you upload an image, WordPress takes your image and makes a series of resized renditions of it.
tl;dr version: All you have to do is upload a good quality image, at the largest size your site will need, saved at a JPEG compression of 82 or higher.
(We’re talking about continuous tone JPEG images here. PNGs are big and should be avoided unless you need the transparency they offer. They’re a topic for another day.)
By default, WordPress will make renditions of your image at widths of 150 px, 300 px, 768 px, 1024 px, 1536 px, and 2048 px. That’s six copies of your original picture.
When the time comes, WordPress will use the way cool srcset image tag attribute to offer the assortment of image sizes to a browser. The browser (if all goes well) picks the smallest one that will work for the container size at hand.
The idea here is to have available image files that are close to the sizes that will be needed by browsers on varying sized devices. If all the browser needs is a little thumbnail for a phone, we don’t want to bog down page load times sending an enormous picture that’s sized for a giant 4K desktop monitor. On the other hand, an itty bitty little image is going to look like, well, not very good on that snazzy 30” Retina setup.
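To make the srcset idea concrete, here’s a rough sketch of the kind of markup WordPress generates for an image in a post. The filenames, dimensions, and sizes attribute below are illustrative only, not what WordPress will emit for your particular image or theme:

```html
<!-- Illustrative only: real output varies with your image, theme, and settings. -->
<img src="/wp-content/uploads/2020/01/photo-1024x683.jpg"
     srcset="/wp-content/uploads/2020/01/photo-300x200.jpg 300w,
             /wp-content/uploads/2020/01/photo-768x512.jpg 768w,
             /wp-content/uploads/2020/01/photo-1024x683.jpg 1024w"
     sizes="(max-width: 1024px) 100vw, 1024px"
     alt="A sample photo">
```

The 300w, 768w, and 1024w descriptors tell the browser the pixel width of each available rendition, and the sizes attribute tells it how large the image will be laid out, so it can pick the smallest file that will still look good.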
Themes, too
Themes and gallery plugins can add some sizes of their own to the mix. For purposes of this post, I used the 2020 theme, which adds image sizes of 1200 and 1980 pixels wide, bringing the total up to eight.
(Some of the renditions are constrained by “bounding box” dimensions, meaning that the image can be no deeper than it is wide. Some aren’t. And sizes can be forced to a certain aspect ratio, like 150×150 for thumbnails. For our purposes today, we don’t need to worry about this.)
Math-inclined readers will see that the jump in size between the various renditions is not even: a 300px wide image is four times the size of a 150px wide image, while a 768px wide image is more than six times as big as the 300px wide picture. Some of the jumps are only about double. (The difference in file size, or area for that matter, is the difference in linear size squared. So, twice the pixel dimension equals four times the size.)
We can never actually serve a too-small image, only too-big. So there will be some waste. The biggest factor, image size-wise, in keeping load times down is having a file on hand that’s close to the size a browser will ask for. Hitting the mark exactly is wonderful. Being one pixel short and triggering sending that four-or-five-times-as-big file pretty much sucks.
Shorten load times
We get the shortest load times by having file sizes close to – just a bit bigger than – what the browser asks for. That means that the more closely spaced our rendition sizes are, the quicker our pages will load.
But we pay for extra renditions in server storage space. Which we pay for in money. Free lunch once again eludes us.
We can tweak how many renditions, and at what sizes, WordPress will make of our images with some code in our functions.php file. Google “add image size” and “remove image size” for code snippets. A few of the image sizes can be controlled from the settings panel in the WordPress backend. (And yeah, you can tweak which ones appear there, too.)
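As a minimal sketch of what those snippets look like in functions.php (the size name “td-wide-1280” is made up for illustration; add_image_size() and remove_image_size() are the actual WordPress functions):

```php
// Register an extra 1280px-wide rendition, unconstrained height, no cropping.
// The name "td-wide-1280" is arbitrary; pick anything that won't collide.
add_image_size( 'td-wide-1280', 1280, 0, false );

// Drop a default rendition you never use, e.g. the 1536px size added in WP 5.3.
remove_image_size( '1536x1536' );
```

New sizes only apply to images uploaded after the code is in place; you’d need a thumbnail-regeneration pass for the existing library.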
So, if we’re obsessed with page load time and we don’t much care about disk space on our server, we can twiddle with our set of rendition sizes until the domestic livestock returns from the fields. But 99% of us have better things to do with our time.
Moving on
What does “at the largest size your site will need” mean? And what happens to the original file we upload?
Now things get a little tricky because these answers are dependent on your theme and how you configure your site.
We’ll start with the obvious. Well, not so obvious. We’ll start at the beginning.
“The largest size you’ll need” is likely just that. What’s the biggest picture you will need to serve to the biggest, highest resolution monitor you intend to support? For your theme, remember. What might that size be?
On this blog, the column width is capped at 960 pixels. I run mostly screenshots and I don’t cater to high-density monitors. So, for me, a picture in a post only needs to be 960 pixels wide. I upload images at 2048 pixels wide (assuming that the source is that big or bigger). That allows for full-width images. And WordPress does make me a nice 960px rendition.
Like most, or at least many, WordPress users, I link my images to the media file. When a visitor clicks on an image he or she sees the “original” image that I uploaded. 2048 pixels wide is fine for that, in my opinion, for this blog.
Since WordPress 5.3, I have a 2048 rendition by default. I could upload any (larger) size file I feel like and I’d get a 2048px copy. But, I still have to think about visitors clicking through to that media file when they want to see a picture bigger. So I still have to upload at 2048 pixels if I want that “media file” experience to be a 2048 pixel-wide image instead of something else.
Not quite that simple
But wait! It’s trickier than that. It would seem logical that the biggest image you serve to visitors on a page (if you’re not doing the click-a-picture-to-see-the-original-media-file thing that I do) should be the size you upload. Because if a browser wants that size, WordPress can serve the original file. (Which isn’t touched, by the way. It is exactly the file you upload.) Right?
Well…. your theme may have something to say about that. Some themes will serve the “original” file. Some won’t.
So, with some themes, your uploaded file serves two masters. It needs to be good enough to provide a good starting point to make those renditions. And it needs to be efficient enough to actually serve to visitors.
On the other hand, if your site doesn’t ever serve your original uploaded file, it ends up just sitting on the server wasting space, which costs some money but doesn’t affect page load time.
(My case is in between. I do serve the original file. But not often – only to people who click to see an image enlarged.)
What if we upload a file that matches one of WordPress’ rendition sizes? Well, I tried that with my WordPress 5.3.2 and 2020 test setup and a 2048px wide image (my regular upload size). In this case, WordPress and 2020 did serve the original. It skipped making the 2048 rendition. Very clever! I don’t know whether the code that did that lives in core or the theme, but it’s cool regardless.
But wait, there’s more – big images
With version 5.3, WordPress gained a new feature that automatically handles big images. If you upload an image bigger than a predetermined size, WordPress will now resize that image down to that size and add “-scaled” to its filename. It will then do with “-scaled” whatever it would do with the “original” image. In this case, the original “original” image just sits on the server taking up space.
This feature is specifically meant to allow people to upload images straight to their sites from their mobile phones. (Never mind that you can easily scale a picture on your phone if you’re so inclined.)
The default cut-off for “big images” is 2560px wide. That’s (would be) a pretty logical choice since 2560px is twice – or the Retina version of – a 1280px wide image. But WordPress by default doesn’t make a 1280px rendition! Go figure.
If you don’t want to limit your biggest image files to 2560px wide (which is pretty darn likely if you cater to high-resolution displays) you can change the ‘big_image_size_threshold’ by adding these two lines of code to your functions.php file.
function td_big_image_size_threshold( $threshold, $imagesize, $file, $attachment_id ) {
	return 4096;
}
add_filter( 'big_image_size_threshold', 'td_big_image_size_threshold', 10, 4 );
(functions.php lives in your child theme folder in your WordPress installation. The code here will change the threshold size to 4096px. Simply substitute the size you want for “4096” in the snippet.)
Or turn it off
If you want to simply turn off the big image size threshold entirely, add this line of code to functions.php:
add_filter( 'big_image_size_threshold', '__return_false' );
So, “upload the largest image size you’ll need”. Short version seven words. With all the gory details – 702 words.
Putting the squeeze on
Now let’s take a look at “saved at a JPEG compression of 82 or higher”.
There is an awful lot of rubbish out there on this subject. “Magic” JPEG encoders, dodgy plugins, misinformation about image compression, and some lack of understanding about how WordPress works – the whole gang is here. Some of the nonsense might have actually made sense some years ago. But today….. Sigh.
Let’s once again start at the top.
Why save at a compression quality value of 82?
Because WordPress, by default, has its imaging library save your rendition files at a setting of 82.
Considering that the first job of the uploaded file is to provide a good quality basis for the renditions that will be made from it, it’s pretty obvious that we have to start off at least at the level of quality we are going to want in the final files.
This is lossy compression. One thing we can count on is that the result will never be better than what we started with.
Now a disclaimer
“82” in one program doesn’t necessarily equal “82” in some other program. Heck, in Photoshop, the scale runs from 1 through 12 in one part of the program and 1 through 100 in another. What’s more, the quality scale isn’t necessarily linear. Or necessarily anything.
That said, in my humble and limited experience, “82” tracks pretty well between ImageMagick, GD, and Photo Mechanic, and decently with XnView. If you put a really fine point on it, your mileage will most likely vary. So there.
Back to the plot
The high-level concept here is pretty straightforward. If we damage the quality of the source image with more compression than we should and try to fix it later, we will go nowhere but down in flames. Resist the urge to compress your source image more. It won’t help anything.
Remember that the images we will serve our visitors are not, except in fairly rare cases, the ones we upload, but rather the ones that WordPress makes from the images we upload.
Photographers and designers are taught to avoid re-compressing a JPEG file if at all possible. Re-compressing a JPEG file is exactly what we and WordPress are doing here. We’re navigating slippery slope territory. We must tread carefully.
(Overly) scrunching up our original images, adding compression artifacts in the process, will not make the renditions made by WordPress much – or any – smaller anyway. If we over-compress our source image we might make our rendered image a little bit smaller. Or, it could come out a bit bigger. Either way, not good news.
(If we over-compress our source file, one of two things will happen. JPEG compression tends to make stuff fuzzy. Fuzzy images compress real well, so we might indeed shave a few percent off our file size. On the other hand, jagged JPEG artifacts don’t compress very well, so if that’s what dominates in our particular picture, our file might actually grow a little. Either way, our compression-to-quality comparison will suffer and we lose on both the quality front and the load time front.)
Reduced quality means we will have to compensate with reduced compression to get back to where we need to be. Reducing compression on our renditions will for-sure make our files bigger and increase load time.
JPEG adjustments
Despite what snake-oily software salesmen may claim, there are really only four common compression parameters we can adjust on a JPEG image. And a couple of them are no-brainers.
The no-brainers are Huffman table optimization and Discrete Cosine Transform method.
Optimizing the Huffman table saves a fair bit of file size at no expense in quality. The only reason you would not want to use an optimized table is if you are using a JPEG decoder that’s 25 years old or so. I haven’t seen a piece of software that can’t read optimized JPEGs in a couple of decades. So we don’t much need to worry about that one.
The discrete cosine transform method is a trade-off between CPU cycles spent encoding the picture and quality. Unless you have a computer from the early nineties, you will never notice the slightly slower encode times. And we get a tad better quality and/or page load time for our CPU cycles investment. We won’t worry about this, either.
The big slider
That leaves the big quality slider and its little brother, chromatic subsampling.
The big slider is pretty obvious. It controls the aggressiveness of JPEG compression. We’ve already talked about it.
Chromatic subsampling does for color what luminance scaling does for detail. More of it means less color detail and a more “JPEG-y” looking image.
Some programs allow you to manually choose more or less chrominance scaling and others simply mix in reasonable values on a sliding scale as you move the big quality slider.
ImageMagick (by default) and Photoshop both work the latter way. Chromatic subsampling is applied proportionately below a certain quality setting. (90, in the case of ImageMagick. In Photoshop it gets a little complicated. But, above “7” chroma subsampling is turned off.)
Ultimately, the image quality and compression ratio we get will depend on a balance of three factors: The two compression parameters and the content of the image itself.
Just why is it that we should use ImageMagick?
Because ImageMagick preserves the rights metadata (copyright, creator, and so on) embedded in your images when it makes renditions. That matters to Google, honest people who might want to use a picture they find on the web, and, of course, photographers and artists. For the time being, the alternative imaging library, GD, destroys that artist rights information.
So to make sure we’re rolling with the good guys we need to use ImageMagick. It’s easy, and it costs nothing in terms of load time or money. Go here for the complete low-down.
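If you want to nudge WordPress toward ImageMagick in code, a sketch like this in functions.php reorders the image editors WordPress will try. (WordPress already prefers the Imagick editor by default when the PHP extension is available, so this is mostly belt-and-suspenders.)

```php
// Try the Imagick-based editor first; fall back to GD only if Imagick fails.
add_filter( 'wp_image_editors', function( $editors ) {
	return array( 'WP_Image_Editor_Imagick', 'WP_Image_Editor_GD' );
} );
```

If the Imagick PHP extension simply isn’t installed on the server, no filter will help; that’s when you ask your host.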
It’s mostly the image
Of the three factors, in my opinion, the content of the image has the most dramatic effect. Some images will compress tons. Others, only the minimum. Some pictures can stand a lot of chroma subsampling and not need as much overall compression. Be a bit too aggressive and others turn to hazy, fuzzy, blocky mush.
If we all had the same image to publish over and over, we could hone this optimizing thing to a fine edge. We could adjust parameters for hours. Those so inclined could run image quality assessment algorithms. We’d squeeze every last kilobyte out of that file.
But real-world images vary wildly. We have to be ready for whatever comes along and we can’t be endlessly fussing about how we’re going to compress them.
Which brings us full circle.
WordPress’s “82” quality level default works pretty darn well
At 82, we see a drastic reduction in file size compared to a minimally-compressed source file and we don’t see much of a noticeable decrease in quality. Pretty good. And it works that way with most images, most of the time.
“The largest size your site will need, saved at a JPEG compression of 82 or higher” and let WordPress do its thing at “82” is going to “just work” for almost all of us, almost all of the time.
One more thing
Remember that rights metadata that we need to protect? That’s IPTC metadata. It’s no bother. The lot of it generally only adds four to six kilobytes to our file weight. That’s only about a millisecond at today’s desktop download speeds. Imperceptible. That’s so little time that unless you could have 50 pictures above the fold on your page (and don’t have lazy load) it just won’t matter.
Optimizing your metadata
But IPTC metadata isn’t the only metadata on your picture. You also (usually) have Exif metadata in your file. Exif metadata is mostly camera logging information. Time, camera settings, that sort of thing. But Exif metadata has everything but the kitchen sink thrown in – including at least one thumbnail version of the image. Exif metadata usually consumes about 20KB of disk space. And I’ve seen much more. That’s not huge, really – four milliseconds worth or so – but it could be a little bit significant. In all but rare cases, Exif data serves no purpose on a published photo. It’s useful early in an asset’s lifecycle. But at the point of publication – usually useless. I recommend ridding ourselves of it.
Depending on the software you use to prepare your pictures for uploading, deleting all the Exif metadata can be as easy as ticking a tickbox one time in a save-as template, or kind of a pain. If it’s much of a hassle, I’d probably opt to just leave it. In this post I go through actually preparing images, step by step. I’ll show you that tickbox.
If you do use one of those “image optimization” plugins (I don’t recommend it. They just aren’t necessary) you’ll still have to make the metadata the way you want it before you upload. The best ones can be set not to mess with metadata at all. But they aren’t sophisticated enough to distinguish the baby from the bathwater. The worst – most of them – just destroy everything.
More, more I say
What if you want to experiment with being more aggressive? Or maybe you are doing an art gallery site and your artists want better-looking images, load time be damned?
No problem. We can adjust the quality level WordPress tells ImageMagick (or GD) to use.
We cannot, at this point, as far as I have been able to divine, pass just any old argument we want to ImageMagick through WordPress. (Like, say, a chromatic scaling value.) We are stuck with ImageMagick’s default sliding mixture of luminance and chrominance scaling. But don’t worry. It will all be fine.
This snippet of code, pasted in functions.php, will tell the imaging library to change its compression quality setting:
add_filter( 'jpeg_quality', function( $arg ) { return 100; } );
Replace “100” with whatever value you want.
I have had decent results at 65. Files are significantly smaller than at 82. Image quality is noticeably worse but acceptable. (Or not. It’s all up to you.) Once upon a time, the WordPress default was 90. That might be worth a try if you’re underwhelmed by the quality at 82. Or go full hog at 100, if that’s the way you roll.
All that said, I use the default – 82.
Experiment?
Depending on what sort of content you publish, it might be worth experimenting. Or not.
If you do experiment, remember to run the experiment long enough to see results from a bunch of different images. Results vary dramatically from image to image.
(There are a bunch of posts on the internet about special recipes for making images smaller and “better” by asking ImageMagick to do all manner of transformations to the image that aren’t strictly compression. If we had the ability to freely “talk to” ImageMagick, some of those schemes might be fruitful avenues for experimentation. Just the right plugin, if somebody cares to write it, could maybe enable better communication with ImageMagick. But today we have responsive images, faster broadband, and probably better things to do with our time.)
What if you have a certain image that just got the short end of the stick? It’s either just too badly degraded or too darn big. Is there anything that can be done?
Why yes, since you ask, there is
Grabbing your image files with FTP
When WordPress makes your renditions, it stores them, and the originals, on your server in a directory at ~/wordpress/wp-content/uploads/year/month. (The year and month folders assume you left the default tick mark on “Organize my uploads into month- and year-based folders” in your Media Settings.)
You can FTP to your server, grab a set of files, and download them to your desktop. Once there, you can re-do the building of the renditions any way you want. Then, upload the new versions back, overwriting the ones that were already there. And your server will serve them.
Note that you’ll need to be careful to make your new renditions exactly the same dimensions as the ones you are replacing and make sure the filenames are exactly the same.
Don’t try to re-compress the renditions. That will just make a mess. Work with the “original” file – which, remember, is exactly as you uploaded it. That’s what ImageMagick used for its renditions. Or you could go all the way back to the source file you used at the very start if you need to.
This, by the way, is how those “image optimizing” plugins work. They grab files out of your upload folder, do stuff to them and put them back. Most of those plugins strike me as pretty snake oily. Many of them indiscriminately destroy the metadata you worked so hard to preserve. I tend to give them a wide berth.
How do we know if this stuff is working?
Let’s do some simple tests.
Is my site preserving metadata? Do I need to switch to ImageMagick?
- Upload an image that you know has IPTC metadata to your site and put it on a post or page. Make sure it’s small enough on the page that you won’t serve the original file by mistake.
- Preview or publish the page/post. (Preview will work OK.)
- Right-click and download the photo to your computer.
- View the image using either software of your choice or the IPTC’s online metadata viewer. (There’s a link to it in the footer of every page on this site, by the way.)
- Do you see metadata? Yes? You’re good. If not, contact your host’s customer care team and ask them to enable ImageMagick for you.
- If you need one, you can download a sample 3000px-wide image with metadata here.
Which rendition of my image is my server serving?
- View the page or post on whatever device you want, with the browser sized to whatever viewport you want.
- Right-click to download the image. You don’t even actually have to download the image. The Save-As dialog will pre-fill with your picture’s filename. The filename contains the dimensions. (If the filename is just what you uploaded, that’s the original file.)
Is my server (and theme) serving my original files?
- Put an image on a page or post. (I like to have nothing but the one image on the page for this, to limit potential clutter.)
- Publish or preview.
- Right-click on the picture and choose “Inspect”. You should be able to see the file that is being used by the browser as well as the srcset attribute. srcset is literally a menu of available renditions of the picture. WordPress is saying to the browser, “See anything here you like? Take whichever one you want.”
- Look through the list of pictures. If you see any originals listed, your setup does serve them. If not, well then, no.
- Remember that linking to the original when a visitor clicks on a picture is a different thing.
- (In Chrome, you’ll see a cool popup that tells you both the dimensions of the file served and the dimensions as rendered by the browser. Which saves the bother of measuring.)
What about progressive JPEGs?
WordPress doesn’t natively support them. If you upload one, the original file will be progressive but the renditions won’t. If we could pass the right argument to ImageMagick, it would be happy to make our renditions progressive. But alas, we can’t. If you really want progressive JPEGs, you’ll have to use one of those plugins. (Or do the FTP download, fix, and re-upload thing.) If you go the plugin route be careful about that metadata!
So there you have it
The “what” is incredibly simple. The “why”, not so much. Hopefully, whether you read only the first couple of paragraphs or all the way to here, this post helped simplify your life. If just a little bit.
For the “How”, check out this post. We’ll go through handling an image to upload to WordPress real-world, in real software.
Do you know of an image optimization plugin that’s socially responsible and accomplishes something that WordPress’s built-in image processing can’t? Do you know a way we can pass arguments to ImageMagick? Want to build a plugin that will actually optimize metadata, rather than massacring it? Your community needs you! Jump in the comments!
What amazing information, God bless you Carl!
Just found this article. Very thorough and well described! I was searching for an answer to whether more settings could be tweaked from ImageMagick, and it seems (from your research) this is not possible. Bummer. But I actually do use an older plugin (ImageMagick Sharpen Resized Images) that taps into ImageMagick’s sharpening features. It does a GREAT job of resharpening the soft resized images, but not sure at what point it’s applied and therefore if it is adding another round of compression along the way (plus it hasn’t been updated in years).
I’m curious if you have any thoughts on WebP conversions? It seems that WebP has a more up-to-date and advanced way of compressing images that retains much higher quality at the same filesize as a similar JPG. But I’m under the impression that most/all of the current plugins that do the WebP conversions are using the already resized & compressed JPG thumbnails, so just re-compressing the much lower quality thumbnails rather than starting from the high quality original. Garbage in…
If there was a method/plugin/function that could generate the WebP thumbnails from the original well-optimized original, I think you could simultaneously have much higher quality thumbnail images AND smaller filesizes. Since most browsers support WebP these days, a fallback JPG would be less & less utilized. Still searching for this possibility…
Thanks!
WEBP is interesting. Google has a utility that works with it pretty well. Well, sort of almost pretty well. ImageMagick has added some WEBP functionality, but nothing that preserves metadata yet. There is a fair amount of browser support, but you still have to serve backup JPEGs. Image editor support is scant yet. So, a work in progress, basically.
I lived through the JPEG2000 ordeal. Spent a bunch of time working out new workflows that would take advantage, including remote transmission – which was a bigger deal back then – and it was all for naught. So, for WEBP/FLIF/HEIF/FUIF/JPEGXL and whatever other wonders are lurking out in vaporware land, I’m in wait-and-see mode for a while. I have made some test images with WEBP and it does indeed look nice at smaller file sizes, compared to JPEG. (After nearly 30 years, I’d kind of hope so.)
All that said, the biggest bang-for-buck jump in image load time that I know of right now is still to nail the size of the served file. (and, of course, CDN and lazyload) The trade-off being server space for all those renditions. (Which is a thing that FLIF was going to address and the yet-newer formats may still. Sigh.)
For all its flaws, JPEG remains for the moment a pretty decent choice, IMHO.