Topic on Project:Support desk

enable compression of images to speed up the page load

Lucy Tech (talkcontribs)

Is there a lossy or lossless compression tool for MediaWiki to improve page load times? Something like Smushit.com. https://developers.google.com is telling me to 'enable compression' for some images to speed up the page load.

FriedhelmW (talkcontribs)

The HTTP protocol supports compression of any content.
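
As an illustration, here is a minimal PHP sketch (ob_gzhandler is PHP's built-in output handler; it checks the client's Accept-Encoding header and only gzips the response when the client supports it):

    <?php
    // Compress the whole response at the HTTP level, regardless of content type.
    // ob_gzhandler negotiates compression via the client's Accept-Encoding header.
    ob_start( 'ob_gzhandler' );
    echo '<html><body>Any content can be served compressed this way.</body></html>';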

Lucy Tech (talkcontribs)

Could you give me more information on how to use this, or links to pages that could help me? Thanks

Ciencia Al Poder (talkcontribs)

This page contains all the information you should know about HTTP compression:

88.130.114.249 (talkcontribs)

Lucy Tech did not ask for a way to do HTTP compression; they in fact want to compress the images themselves. Smushit does a kind of minification, which reduces the file size of the actual image. It does not do any HTTP compression (which would help in addition to already having smaller file sizes right from the start).

When you upload an image to MediaWiki, you would be able to compress it if you could run an ImageMagick command directly after the upload. Does MediaWiki support that?
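
Something along these lines might work, though it is only a rough, untested sketch: MediaWiki fires an 'UploadComplete' hook when an upload finishes, and you could shell out to ImageMagick from there. The quality setting and the JPEG-only filter are just example choices, and note that MediaWiki records the file's size and SHA-1 in the database, so rewriting the stored file afterwards can leave that metadata stale.

    # In LocalSettings.php — a sketch, not a ready-made recipe.
    $wgHooks['UploadComplete'][] = function ( $upload ) {
        $file = $upload->getLocalFile();
        $path = $file->getLocalRefPath();
        // Only touch JPEGs; strip metadata and re-encode at quality 85 (lossy).
        if ( $path !== false && preg_match( '/\.jpe?g$/i', $path ) ) {
            // mogrify rewrites the file in place.
            exec( 'mogrify -strip -quality 85 ' . escapeshellarg( $path ) );
        }
    };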

In my test, Smushit reduced the size of a test image by 3%; 97% of it was still there. A nice little start, but not very impressive. If you really want remarkably smaller images, maybe you should think about lowering the value of $wgMaxUploadSize. That would enable you to "force" small image sizes.
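
For instance, in LocalSettings.php (the 2 MB figure is only an example):

    # $wgMaxUploadSize is given in bytes.
    $wgMaxUploadSize = 2 * 1024 * 1024; # 2 MB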

Another aspect would be to use GZip to compress what is later transferred to the client. However, GZip is not very effective on images, because common formats like JPEG and PNG are already compressed. Usually, HTTP compression is used for things like JavaScript and CSS files, not for images.
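
You can check that yourself with a few lines of PHP; the file names below are placeholders, so point them at any real CSS file and JPEG. A typical run shrinks the CSS to a fraction of its size, while the JPEG barely changes.

    <?php
    // Compare how well gzip compresses a text asset versus a JPEG.
    foreach ( [ 'styles.css', 'photo.jpg' ] as $path ) {
        $data = file_get_contents( $path );
        $gz = gzencode( $data, 9 );
        printf( "%s: %d -> %d bytes (%.1f%% of original)\n",
            $path, strlen( $data ), strlen( $gz ),
            100 * strlen( $gz ) / strlen( $data ) );
    }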
