Requests for comment/Reducing image quality for mobile
Status: core patch 119661 merged
Many mobile devices with low bandwidth and slow processors could benefit from JPEG images served at reduced quality. Such images transmit much faster and may require less memory to process. This proposal adds a quality control feature to thumbnail image generation, which extensions can use whenever the benefit of lower bandwidth outweighs the need for high quality.
- Quality vs pixel size: image byte size can be reduced by lowering either the quality or the pixel size of the picture. Unlike quality, though, pixel size changes the overall layout of the page and should depend exclusively on the size and DPI of the screen; lowering quality does not affect page layout, yet brings a very significant byte-size reduction. Thus, code that has domain knowledge about screen sizes and layouts (e.g. Mobile Ext) could choose to reduce pixel size, whereas code that knows about the network infrastructure (e.g. Zero) could additionally reduce quality.
- Why JPEG: unlike PNG, JPEG can easily be compressed further without changing the thumbnail dimensions specified by page authors
- Target devices: this RFC focuses mainly on the mobile market, as it tends to have tighter bandwidth constraints, but since this is a generic change in core, it could also be used for desktop optimization.
All small images - We could reduce the quality of all smaller thumbnail images (e.g. if height or width is less than 300px).
- CONs - everyone is affected; people with good connections and good screens will get a worse experience
Varnish magic - Varnish could decide to serve different backend images for the same image URL based on whether the request comes from a Zero network. Varnish would rewrite the thumbnail URL by inserting the quality parameter (e.g. "-q30").
- PROs - Most flexible approach, does not require any Varnish infrastructure changes
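The rewrite itself is a simple string transformation. A minimal sketch of the idea in Python (the real thing would be VCL in Varnish; the "-q30" value and the URL shape are illustrative):

```python
import re

# Sketch of the URL rewrite the Varnish approach would perform (shown in
# Python rather than VCL): for requests identified as coming from a Zero
# network, insert a "-q30" quality value after the width prefix in the
# thumbnail file name. The negative lookahead keeps already-rewritten
# URLs untouched.
def rewrite_for_zero(url, is_zero_request):
    if not is_zero_request:
        return url
    # ".../image.jpg/100px-image.jpg" -> ".../image.jpg/100px-q30-image.jpg"
    return re.sub(r'/(\d+px)-(?!q\d+-)', r'/\1-q30-', url)

print(rewrite_for_zero('/thumb/a/ab/image.jpg/100px-image.jpg', True))
```

Because the quality value only appears in the rewritten URL, non-Zero traffic keeps hitting the existing thumbnail cache entries unchanged.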
Proposal - Core
Extend image URL syntax
- This part has been implemented as patch 119661 and needs review
The production backend generates the requested image if the file is missing (404). The URL is parsed with a regex to extract the desired image width. To pass a quality reduction parameter, we add an extra optional value "-qNNN"; because it is optional, the existing image cache stays intact.
The same approach has already been used for SVG's "language" parameter and for DjVu's "page" parameter.
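To make the parsing concrete, here is a small Python sketch of the idea. The actual parsing lives in MediaWiki's PHP thumbnail handler; the regex and function below are hypothetical illustrations only:

```python
import re

# Hypothetical sketch of 404-handler thumbnail-name parsing: extract the
# requested width and the optional "-qNNN" quality value. Names without
# a quality value (all existing cached thumbnails) still match.
THUMB_RE = re.compile(r'^(?P<width>\d+)px-(?:q(?P<quality>\d+)-)?(?P<name>.+)$')

def parse_thumb_name(thumb_name):
    """Return (width, quality, original file name); quality is None if absent."""
    m = THUMB_RE.match(thumb_name)
    if m is None:
        raise ValueError('unrecognized thumbnail name: %s' % thumb_name)
    quality = int(m.group('quality')) if m.group('quality') else None
    return int(m.group('width')), quality, m.group('name')

print(parse_thumb_name('100px-image.jpg'))      # existing URLs keep working
print(parse_thumb_name('100px-q30-image.jpg'))  # new optional quality value
```

Since "100px-image.jpg" and "100px-q30-image.jpg" are distinct file names, the two variants also get distinct cache entries.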
Quality parameter in Wiki markup
- There have been some objections to this section, so it might need to be either reworked or removed
Add a "quality" parameter to the image link wiki markup to specify the desired quality reduction, e.g.:
[[File:image.jpg|100px|quality=30]]
The above would render image.jpg with image quality set to 30% and width scaled to 100px. This parameter could be used by template authors to substantially reduce thumbnail file size.
There are currently several workflows for image link parsing, assuming the above wiki markup and generated HTML <img src=".../image.jpg/100px-q30-image.jpg" />:
- 404 enabled
- markup → HTML → browser requests image → image is missing (404) and is rendered on the fly using the URL as parameters
- 404 disabled
- markup → HTML, and the image file is created based on the markup parameters → browser requests the existing image
- remote repo
- markup → server calls remote repo via API → (TBD - still investigating)
Proposal - Zero
Most Zero network users operate older devices, frequently on slower networks. For Zero users, we propose a server-side DOM rewrite (already being done for external links and some images). The rewrite would remove the srcset img attribute and change the image src from the default thumbnail URL to one that includes the quality parameter.
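A minimal sketch of that rewrite in Python (the Zero extension would do this in PHP during its existing DOM pass; the regexes, the sample tag, and the "-q30" value are illustrative assumptions):

```python
import re

# Sketch of the proposed server-side rewrite: drop the srcset attribute
# so the browser cannot fetch a larger high-density variant, then insert
# a quality value into the thumbnail src.
def rewrite_img_tag(img_tag):
    img_tag = re.sub(r'\s+srcset="[^"]*"', '', img_tag)         # remove srcset
    return re.sub(r'/(\d+px)-(?!q\d+-)', r'/\1-q30-', img_tag)  # reduce quality

tag = ('<img src="//upload.example/thumb/image.jpg/100px-image.jpg" '
       'srcset="//upload.example/thumb/image.jpg/200px-image.jpg 2x" />')
print(rewrite_img_tag(tag))
```

Removing srcset first matters: otherwise the quality rewrite would also touch the 2x variants that are being dropped anyway.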
- Usage Assessment
If Zero replaces all images on all pages except the File: namespace, we are theoretically looking at creating a copy of every image used. Keep in mind, however, that most images used on wiki pages are already significantly scaled down from the original (images tend to be used in smaller frames), and reducing their quality to 30 should shrink the file size further - by my random sampling of several wiki images, down to 30-40% of the original size. The resulting images tend to be 5-10KB, which amounts to an absolute maximum of 100-200GB if every file on Commons (20.5 million) is actually converted.
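The upper bound above follows from simple arithmetic (assuming 5-10KB per converted thumbnail and 1GB = 10^6 KB):

```python
# Back-of-the-envelope storage estimate for converting every Commons file.
files = 20_500_000                 # files on Commons, per the estimate above
avg_kb_low, avg_kb_high = 5, 10    # assumed reduced-thumbnail size range, KB

low_gb = files * avg_kb_low / 1_000_000    # KB -> GB
high_gb = files * avg_kb_high / 1_000_000

print(low_gb, high_gb)  # -> 102.5 205.0
```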