Manual:File cache

MediaWiki has an optional simplistic scheme for caching the rendered HTML of article pages.

Domain and range
Caching is only done for users who:
 * are not logged in.
 * do not have their user_newtalk flag active.

This covers the vast majority of requests to the wiki!

Caching is only done for pages which:
 * are not special pages.
 * are not redirects.
 * are being viewed in their current version, as a plain page view (no diffs).

Validation
The modification time of the cached file is compared with the cur_touched field for the given entry in the cur table, as well as the global $wgCacheEpoch timestamp set in LocalSettings.php. If the file is at least as new as both of these, it is considered valid and is sent directly to the client. If it is older, or does not exist, parsing and rendering proceed as usual and the result is saved for future use.
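The check can be sketched as follows (a minimal illustration in Python rather than MediaWiki's actual PHP; the function name and parameters are hypothetical stand-ins for the file path, the page's cur_touched value, and $wgCacheEpoch):

```python
import os

def is_cache_valid(cache_path, cur_touched, cache_epoch):
    """Return True if the cached file is at least as new as both
    the page's cur_touched timestamp and the global cache epoch."""
    if not os.path.exists(cache_path):
        return False
    mtime = os.path.getmtime(cache_path)
    # Valid only if at least as new as both timestamps.
    return mtime >= cur_touched and mtime >= cache_epoch
```

If the function returns False, the page is rendered from wikitext as usual and the output is written back to the cache file.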

Invalidation
The entire cache can be invalidated by setting $wgCacheEpoch to the current time, or simply by deleting all files in the cache directory.

Individual pages are invalidated by updating their cur_touched fields. This should be done on article creation, edit saves, renames, and creation and deletion of linked articles (in order to update edit links).
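Both mechanisms amount to bumping a timestamp that the validity check compares against. A toy model (illustrative Python only; the class and attribute names are invented, mirroring $wgCacheEpoch and cur_touched):

```python
import time

class FileCacheState:
    """Toy model of the two invalidation mechanisms:
    a global epoch and a per-page cur_touched timestamp."""
    def __init__(self):
        self.cache_epoch = 0.0   # mirrors $wgCacheEpoch
        self.cur_touched = {}    # page id -> timestamp

    def invalidate_all(self):
        # Setting the epoch to "now" makes every existing cache
        # file older than the epoch, and therefore invalid.
        self.cache_epoch = time.time()

    def invalidate_page(self, page_id):
        # Bumping cur_touched invalidates just this page's cache file.
        self.cur_touched[page_id] = time.time()

    def is_valid(self, page_id, file_mtime):
        touched = self.cur_touched.get(page_id, 0.0)
        return file_mtime >= touched and file_mtime >= self.cache_epoch
```

In MediaWiki itself, invalidate_page corresponds to the cur_touched update performed on saves, renames, and link changes.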

Some cases are not yet handled properly, which probably includes:
 * creation/deletion of talk pages
 * updating of images
 * a browser 'reload' or 'refresh' just reloads the same cached page without updates
 * output variables from extensions such as Extension:Dynamic Page List or Extension:Random Selection will not change on browser refresh

Expiration
There should probably be some method of expiration for cached pages, particularly for pages containing variables (the current date, the number of articles, etc.).

Compression
Optionally, the cache may be compressed to save space and bandwidth. (This requires that zlib be enabled in the PHP config.)

If compression is enabled, the cache files are saved as .html.gz. Browsers that advertise support for gzip in their Accept-Encoding field will be given the gzipped version straight; for those browsers that don't, we unzip the data on the fly and send them the plaintext.
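The serving logic can be sketched like this (a simplified Python illustration; real header parsing is more involved, and the function name is hypothetical):

```python
import gzip

def serve_cached(gz_bytes, accept_encoding):
    """Given the gzipped cache file contents, return (body,
    content_encoding) to send, based on the client's
    Accept-Encoding header."""
    if "gzip" in (accept_encoding or ""):
        # Client supports gzip: send the compressed file as-is.
        return gz_bytes, "gzip"
    # Otherwise decompress on the fly and send plain HTML.
    return gzip.decompress(gz_bytes), "identity"
```

Storing only the gzipped file keeps disk usage down while still serving both kinds of clients from a single cached copy.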

A "Vary: User-Agent" header will be sent to tell proxy caches to be more careful about whom they resend data to. ("Vary: Accept-Encoding" would be more appropriate, but Internet Explorer refuses to cache pages so marked.)

Emergency fallback
If the wiki can't contact the database server, it will try to show the cached version of whatever page was requested, regardless of whether it is current, with a "database is down" message tacked onto it.

This has some limitations:
 * special pages are not covered in any way; only a warning message is shown
 * redirect pages are not cached, so clicking a link to a redirect doesn't go through to the final destination
 * attempts to use non-view actions result in a plain page view, which may be confusing
 * the MySQL connection timeout may make the wiki take prohibitively long to give up, particularly when using persistent connections and the database goes down after a connection was established.
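The fallback flow above can be sketched as (illustrative Python; fetch_from_db and read_cache are hypothetical callables standing in for MediaWiki internals):

```python
def render_page(title, fetch_from_db, read_cache):
    """Emergency-fallback sketch: if the database is unreachable,
    fall back to the cached copy (however stale) with a warning
    tacked on."""
    try:
        return fetch_from_db(title)
    except ConnectionError:
        cached = read_cache(title)
        if cached is None:
            # No cached copy either: just show the warning.
            return "<p>Sorry, the database is down.</p>"
        return cached + "<p>(Database is down; showing a cached copy.)</p>"
```

Note that, as listed above, this only helps for plain views of cached articles; special pages and redirects fall through to the bare warning.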


See also Cache strategy.