Compression/it
This page is about data compression as it relates to MediaWiki.
Output
Individual pages can be compressed over HTTP. Both the browser and the server must support it. It is on by default if PHP has zlib support enabled (no Apache modules are needed). The negligible CPU time spent compressing on the server is dwarfed by things like loading the PHP scripts, and the bandwidth savings are considerable.
See Manual:$wgUseGzip and Manual:$wgDisableOutputCompression for details.
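As a rough illustration, this is how gzip output compression can be enabled at the PHP level. It is a minimal sketch of the general mechanism only, not MediaWiki's own output handler:

 <?php
 // Minimal sketch: compress the whole output buffer with gzip if the
 // client supports it. ob_gzhandler checks the Accept-Encoding header
 // itself and falls back to uncompressed output otherwise.
 if ( extension_loaded( 'zlib' ) ) {
     ob_start( 'ob_gzhandler' );
 } else {
     ob_start();
 }
 
 echo "<html><body>... page content ...</body></html>";
 ob_end_flush();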
Articles
On or about 2004-02-20 the old table and archive table were changed to allow some articles in the history to be compressed. Old entries marked with old_flags="gzip" have their old_text compressed with zlib's deflate algorithm, with no header bytes. PHP's gzinflate() accepts this text directly; in Perl and other zlib bindings, set the window size to -MAX_WBITS to disable the header bytes.
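For illustration, a small PHP sketch of round-tripping text in the same format (raw deflate stream, no header bytes); the variable names are examples only, not MediaWiki code:

 <?php
 // Compress the way old_text is stored: zlib's deflate algorithm,
 // raw stream, no gzip/zlib header bytes.
 $oldText  = "Revision text as stored in old_text ...";
 $oldFlags = 'gzip';              // marker used in old_flags
 $stored   = gzdeflate( $oldText );   // raw deflate output
 
 // Reading it back: gzinflate() accepts the raw stream directly.
 // In other zlib bindings, pass a negative window size (e.g. -MAX_WBITS)
 // to get the same header-less behaviour.
 $recovered = gzinflate( $stored );
 assert( $recovered === $oldText );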
Page histories
It is also possible to compress the history table in a way that exploits the similarity of the data across the different versions of a page, such as Reverse diff version control. See History compression for some actual numbers.
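A toy sketch (not MediaWiki's actual history-compression code) of why exploiting the similarity between revisions pays off: compressing several nearly identical revisions together shrinks them far more than compressing each one separately.

 <?php
 // Three nearly identical revisions of the same page (made-up data).
 $base      = str_repeat( "Some paragraph of article text. ", 50 );
 $revisions = [
     $base,
     $base . "A small addition.",
     $base . "A small addition, then another edit.",
 ];
 
 // Compressing each revision on its own: the shared text is paid for every time.
 $separate = 0;
 foreach ( $revisions as $rev ) {
     $separate += strlen( gzdeflate( $rev ) );
 }
 
 // Compressing the revisions together lets deflate reuse the shared text.
 $together = strlen( gzdeflate( implode( "\n", $revisions ) ) );
 
 printf( "separate: %d bytes, together: %d bytes\n", $separate, $together );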
Cache compression
File cache discusses compressing the cached copies of pages. Now that the Wikimedia projects use Squid caches, it is unclear how much of this is obsolete.
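For context, a hypothetical sketch of serving a pre-gzipped file-cache copy directly when the client advertises gzip support; the paths and file names here are made up for illustration and are not MediaWiki's file cache layout:

 <?php
 // Hypothetical cached page and its pre-gzipped twin.
 $cached = '/var/cache/mediawiki/Main_Page.html';
 
 $acceptsGzip = isset( $_SERVER['HTTP_ACCEPT_ENCODING'] )
     && strpos( $_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip' ) !== false;
 
 header( 'Content-Type: text/html; charset=UTF-8' );
 header( 'Vary: Accept-Encoding' );
 
 if ( $acceptsGzip && is_readable( $cached . '.gz' ) ) {
     header( 'Content-Encoding: gzip' );
     readfile( $cached . '.gz' );   // send the compressed copy as-is
 } else {
     readfile( $cached );           // fall back to the uncompressed copy
 }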