Compression

This page is about data compression as it relates to MediaWiki.

Output
HTTP allows individual served pages to be compressed. Both the browser and the server must support it, and it is normally negotiated (an uncompressed version remains available). Compression is on by default if PHP has zlib support enabled (no Apache modules are required). The negligible CPU time spent compressing on the server is dwarfed by costs such as loading the PHP scripts, and the bandwidth savings are considerable.
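The trade-off described above is easy to check. This sketch (Python rather than PHP, purely for illustration; the sample page is made up) gzip-compresses a typical HTML response, as the HTTP layer does when both sides support it:

```python
import gzip

# A made-up but representative HTML page: markup-heavy and repetitive.
html = ("<html><body>"
        + "<p>Sample wiki paragraph text.</p>" * 200
        + "</body></html>").encode("utf-8")

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)
print(f"{len(html)} bytes -> {len(compressed)} bytes ({ratio:.0%} of original)")
```

Repetitive markup like this typically shrinks to a small fraction of its original size, which is why the bandwidth savings dominate the CPU cost.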

See and  for details.

Articles
On or about 2004-02-20 the table and  were changed to allow some of the data in the history table to be compressed. Old entries marked with  have their old_text compressed with zlib's deflate algorithm, without the header bytes. PHP's  will accept this text as-is; in Perl and the like, set the window size to  to strip the header bytes.
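The same header-less deflate format can be handled in Python's zlib module, shown here as an illustration (this is not MediaWiki code): a negative window-bits value tells zlib to produce or expect a raw stream with no header or checksum, which parallels the Perl note above.

```python
import zlib

text = b"Example old_text revision content."

# Produce a raw deflate stream (no zlib header/trailer), matching the
# header-less storage format described above.
co = zlib.compressobj(level=9, wbits=-zlib.MAX_WBITS)
raw = co.compress(text) + co.flush()

# A negative wbits on decompression likewise means "no header bytes".
restored = zlib.decompress(raw, wbits=-zlib.MAX_WBITS)
assert restored == text
```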

Page histories
It is also possible to compress the history table in a way that exploits the similarity between the data in different versions, as reverse-diff version control does. See History compression for some actual numbers.
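A rough illustration of why revision similarity matters (a sketch with made-up revisions, not the MediaWiki implementation): compressing several near-identical revisions together costs far less than compressing each one separately, because the compressor can reuse the text shared between versions.

```python
import zlib

base = "Lorem ipsum dolor sit amet, consectetur adipiscing elit. " * 50
# Three revisions that differ only slightly, as article histories usually do.
revisions = [base + f"Edit number {i}." for i in range(3)]

# Total size when each revision is compressed on its own.
separate = sum(len(zlib.compress(r.encode())) for r in revisions)
# Size when the revisions are concatenated and compressed as one stream.
together = len(zlib.compress("\x00".join(revisions).encode()))
print(separate, together)
```

Diff-based storage pushes the same idea further by keeping only the changes between adjacent versions.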

Cache compression
File cache discusses compression of the cached copies of pages. Now that the Wikimedia projects are using Squids, it is not clear how much of this is out of date.