MediaWiki file: compressOld.php
compressOld.php is a maintenance script that compresses the text of old page revisions using gzip.
If run with the option -t gzip, it compresses each revision text independently (including the current revision of every page) and saves it back to the same table record.
If run with the option -t concat, it compresses only the previous revisions of each page and keeps the current revision uncompressed. The texts of all previous revisions of a page are concatenated and saved to the first table record created for that page; the remaining intermediary records are converted into stubs pointing to that first record. Concatenation allows for much better compression, because successive revisions of a page usually share most of their text. (This is a simplified explanation; depending on page size and the options used, the resulting structure for a given page can be more complex after the script has run.)
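The compression gain from concatenation can be illustrated with a small sketch. This is not MediaWiki's actual storage code (which uses PHP history-blob objects); it simply compares compressing hypothetical revision texts one by one against compressing their concatenation:

```python
import zlib

# Hypothetical revision texts: successive revisions of a wiki page
# typically differ only slightly, so they share most of their content.
base = "== History ==\nThis page describes the history of the project.\n" * 10
revisions = [base + f"Edit number {i} added this line.\n" * i for i in range(1, 6)]

# "-t gzip" approach: compress each revision independently.
independent = sum(len(zlib.compress(r.encode())) for r in revisions)

# "-t concat" approach: concatenate the revisions, then compress once.
# Repetition across revisions lets the compressor emit back-references
# instead of literal text, which is why concatenation compresses better.
concatenated = len(zlib.compress("".join(revisions).encode()))

print(f"independent: {independent} bytes, concatenated: {concatenated} bytes")
```

With texts this repetitive, the concatenated blob comes out far smaller than the sum of the individually compressed revisions.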
Caution: After compression, the compressed texts are no longer searchable or replaceable via SQL scripts. For this reason, if all revisions are compressed, including the current ones, the Replace Text extension will no longer work, since it relies on SQL queries.
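The reason SQL search breaks is that a LIKE pattern matches raw bytes, and the compressed blob no longer contains the literal text. A simplified sketch (using zlib in place of MediaWiki's gzip storage):

```python
import zlib

text = "Replace Text works on plain wikitext like this sentence. " * 5
blob = zlib.compress(text.encode())

# A query such as SQL's LIKE '%wikitext%' scans raw bytes, so it matches
# the uncompressed text but will not match the compressed blob, where the
# phrase is encoded as Huffman symbols and back-references.
print(b"wikitext" in text.encode())  # substring visible in plain text
print(b"wikitext" in blob)           # almost certainly not in compressed bytes

# The text is still fully recoverable, just not byte-searchable in place.
assert zlib.decompress(blob).decode() == text
```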
php compressOld.php <database> [options...]
-t <type>          set compression type to either:
                     gzip: compress revisions independently
                     concat: concatenate revisions and compress them in chunks (the default)
-c <chunk-size>    maximum number of revisions in a concat chunk
-b <begin-date>    earliest date to check for uncompressed revisions. The date must be provided as a MediaWiki timestamp.
-e <end-date>      latest revision date to compress. The date must be provided as a MediaWiki timestamp.
-s <start-id>      the id to start from (referring to the text table for -t gzip, and to the page table for -t concat)
-n <end-id>        the page id to stop at (only used with the concat compression type)
--extdb <cluster>  store specified revisions in an external cluster (untested)
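The -b and -e options expect 14-digit MediaWiki timestamps of the form YYYYMMDDHHMMSS in UTC. A small sketch for producing one from a datetime (mw_timestamp is a hypothetical helper, not part of MediaWiki):

```python
from datetime import datetime, timezone

def mw_timestamp(dt: datetime) -> str:
    """Format an aware datetime as a 14-digit MediaWiki timestamp (UTC)."""
    return dt.astimezone(timezone.utc).strftime("%Y%m%d%H%M%S")

end = mw_timestamp(datetime(2014, 12, 31, 23, 59, 59, tzinfo=timezone.utc))
print(end)  # 20141231235959
```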
php compressOld.php -e 20141231235959
This will concatenate and compress all revisions created before January 1st, 2015, except the current revision of each page.