
Manual:CompressOld.php

From mediawiki.org

Details

The compressOld.php file is a maintenance script that compresses the text of old page revisions using gzip.

When run with the option -t gzip, it compresses each revision text (including the current revision of every page) and saves it back to the same table record.
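As a concrete sketch of a gzip run on a non-Wikimedia wiki (the start id of 0 is an illustrative value, not taken from this page):

```shell
# Compress every revision text independently with gzip,
# beginning at old_id 0 in the text table.
php ./maintenance/run.php compressOld -t gzip -s 0
```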

When run with the option -t concat, it compresses only the previous revisions of each page and keeps the current revision uncompressed. The texts of all previous revisions of a page are concatenated and saved to the first table record created for that page; the remaining intermediary records are converted into stubs pointing to that first record. The concatenation allows for better compression. (This is a simplified explanation; depending on page size and the options used, the resulting structure for a given page can be more complex after the script is run.)
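Since concat is the default compression type, a minimal concat run can omit the -t option entirely; the explicit form is shown here for clarity:

```shell
# Concatenate old revisions and compress them in chunks,
# leaving the current revision of each page uncompressed.
php ./maintenance/run.php compressOld -t concat
```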

Warning: Keep in mind that after compression, the compressed texts can no longer be searched or replaced via SQL scripts. For this reason, if all revisions are compressed, including the current ones, the Replace Text extension will no longer work, since it relies on SQL queries.
Warning: There is no script to uncompress revisions once they have been compressed.
Depending on your setup, the database might already be stored on-disk in a compressed format, in which case compressing revisions will probably not give you the benefits you want.

Usage

For a Wikimedia wiki:

php ./maintenance/run.php compressOld <database> [options...]

For a non-Wikimedia wiki:

php ./maintenance/run.php compressOld [options...]
In MediaWiki version 1.43.6 and earlier, you must invoke maintenance scripts using php maintenance/scriptName.php instead of php maintenance/run.php scriptName.

Options

Option/Parameter Description
-t <type> set the compression type to either:

gzip: compress all revisions independently
concat: concatenate old revisions and compress in chunks (default)

--extdb <cluster> store the specified revisions in an external cluster (untested)
Options for type gzip
-s <start-id> the old_id (from the text table) to start at
Options for type concat
-c <chunk-size> maximum number of revisions in a concat chunk; defaults to 20
-b <begin-date> earliest date to check for uncompressed revisions (must be provided as a MediaWiki timestamp)
-e <end-date> latest revision date to compress (must be provided as a MediaWiki timestamp)
-s <start-id> the page_id (from the page table) to start at
-n <end-id> the page_id (from the page table) to stop at

Example:

compressOld.php -e 20141231235959 

This will concatenate and compress all revisions (except the current revision of each page) that were created before January 1st, 2015.
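Options can also be combined to limit a concat run to a range of pages. In this sketch, the chunk size of 10 and the page ids 100 and 500 are illustrative values, not taken from this page:

```shell
# Concatenate and compress old revisions in chunks of at most 10,
# processing only pages with page_id between 100 and 500.
php ./maintenance/run.php compressOld -t concat -c 10 -s 100 -n 500
```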

The script will not try to recompress a revision that has already been compressed.

See also