Backing up a wiki


It is important to make regular backups of the data in your wiki. This page provides an overview of the backup process for a typical MediaWiki wiki; you will probably want to devise your own backup scripts or schedule to suit the size of your wiki and your individual needs.

Overview

MediaWiki stores important data in two places:

Database 
Pages and their contents, users and their preferences, metadata, search index, etc.
File System 
Software configuration files, custom skins, extensions, images (inc. deleted images) etc.

Consider making the wiki read-only before creating the backup - see $wgReadOnly (an example appears in the mysqldump section below). This makes sure all parts of your backup are consistent (although some of your installed extensions may write data nonetheless).

File transfer

Unless you have direct access to the server hosting the wiki (and even then), you will have to choose a method for transferring the backup files off the server, such as SCP/SFTP, FTP, or rsync.

Database

Most of the critical data in the wiki is stored in the database, which is typically straightforward to back up. When using the default MySQL backend, the database can be dumped into a script file which can be used later to recreate the database and all the data in it from scratch.

Mysqldump from the command line

The most convenient way to create a dump file of the database you want to back up is to use the standard MySQL dump tool mysqldump from the command line. Be sure to get the parameters right or you may have difficulty restoring the database. Depending on database size, mysqldump could take a considerable amount of time.

First, insert the following line into LocalSettings.php:

$wgReadOnly = 'Dumping Database, Access will be restored shortly';

This line can be removed as soon as the dump is completed.

Example of the command to run on the Linux/UNIX shell:

mysqldump -h hostname -u userid --password --default-character-set=whatever dbname > backup.sql

Substitute hostname, userid, dbname, and the character set as appropriate (--password without a value makes mysqldump prompt for the password). If no character set is specified, recent versions of mysqldump use utf8, while earlier versions used latin1 - but your wiki's database might be using binary. Check the LocalSettings.php file to find out which (usually under $wgDBTableOptions, as DEFAULT CHARSET); otherwise mysqldump may dump using the server's default character set rather than the wiki's. The dbname can also be found in LocalSettings.php, under $wgDBname.
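
For example, a quick way to check both settings from the shell (a sketch, assuming the wiki lives in /srv/www/htdocs/wiki as in the tar example further below):

# Show the configured database name and table options (including the character set)
grep -E '\$wgDBname|\$wgDBTableOptions' /srv/www/htdocs/wiki/LocalSettings.php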

Other parameters might be useful, such as:

--quote-names       : Quote identifiers within backtick characters
--hex-blob          : Dump binary columns using hexadecimal notation 

See mysqldump for a full list of command line parameters.
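
Putting these together, a typical invocation might look like the following (a sketch only; hostname, userid, dbname, and the binary character set are placeholders to substitute for your own wiki):

# Dump with explicit character set, quoted identifiers, and hex-encoded binary columns
mysqldump -h hostname -u userid --password --default-character-set=binary --quote-names --hex-blob dbname > backup.sql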

The output from mysqldump can instead be piped to gzip, for a smaller output file, as follows

mysqldump -h hostname -u userid --password dbname | gzip > backup.sql.gz
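
To restore a database from such a dump, feed the script back to mysql (a sketch, using the same placeholder names as above; the target database must already exist, empty):

# Restore from a plain SQL dump
mysql -h hostname -u userid --password dbname < backup.sql
# Restore from a gzipped dump
gunzip -c backup.sql.gz | mysql -h hostname -u userid --password dbname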

A similar mysqldump command can be used to produce XML output instead, by including the --xml parameter.

mysqldump -h hostname -u userid --password --xml dbname > backup.xml

and to compress the file with a pipe to gzip

mysqldump -h hostname -u userid --password --xml dbname | gzip > backup.xml.gz

Remember to also back up the file system components of the wiki that might be required, e.g. images, logo, and extensions.

Running mysqldump with Cron

Cron is the time-based job scheduler in Unix-like computer operating systems. Cron enables users to schedule jobs (commands or shell scripts) to run periodically at certain times or dates.

A sample command that you may run from a crontab may look like this:

nice -n 19 mysqldump -u $USER --password=$PASSWORD $DATABASE -c | nice -n 19 gzip -9 > ~/backup/wiki-$DATABASE-$(date '+%Y%m%d').sql.gz

The nice -n 19 lowers the priority of the process.

Use valid values for $USER, $PASSWORD, and $DATABASE.
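
A complete crontab entry running this nightly might look like the following (a sketch; the schedule and the example values are assumptions to adapt, and the % signs are escaped because cron treats an unescaped % as a newline):

# Define the placeholders once at the top of the crontab (example values)
USER=wikiuser
PASSWORD=secret
DATABASE=wikidb
# m h dom mon dow: run nightly at 03:00
0 3 * * * nice -n 19 mysqldump -u $USER --password=$PASSWORD $DATABASE -c | nice -n 19 gzip -9 > ~/backup/wiki-$DATABASE-$(date '+\%Y\%m\%d').sql.gz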

This will write a backup file with the date in the filename, so you would have a rolling set of dated backups. If you want to save the files and extensions as well, you might want to use one of the scripts in the Scripts section below.

Warning: Do not attempt to back up your MediaWiki database using mysqlhotcopy. The table format used by MediaWiki cannot be backed up with this tool, and it will fail silently!

If you want to add this task to cron through cPanel, you must escape the character "%"

/usr/bin/mysqldump -u $USER --password=$PASSWORD $DATABASE -c | /bin/gzip > ~/backup/wiki-$DATABASE-$(date '+\%Y\%m\%d').sql.gz

or you will get an error:

/bin/sh: -c: line 0: unexpected EOF while looking for matching `''
/bin/sh: -c: line 1: syntax error: unexpected end of file

Tables

Under close examination, some of the tables dumped have various degrees of temporariness: their data can be rebuilt, so although those tables need to be present in a proper dump, their data does not. Skipping that data saves disk space beyond what gzipping alone achieves. However, under certain circumstances the disadvantage of having to rebuild all this data may outweigh the saving in disk space (for example, on a large wiki where restoration speed is paramount).
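
A sketch of this approach, assuming objectcache and searchindex are among the rebuildable tables on your wiki (check your own schema before relying on this):

# Dump everything except the contents of the rebuildable tables...
mysqldump -u userid --password dbname --ignore-table=dbname.objectcache --ignore-table=dbname.searchindex > backup.sql
# ...then append their structure (without data) so the dump still recreates them
mysqldump -u userid --password --no-data dbname objectcache searchindex >> backup.sql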

See a mailing list thread about the topic.

Latin-1 to UTF-8 conversion

See the relevant section of the upgrading page for information about this process. Also see the talk page (Talk:Backing up a wiki) for more information about working with character sets in general.

PostgreSQL

You can use the pg_dump tool to back up a MediaWiki PostgreSQL database. For example:

pg_dump mywiki > mywikidump.sql

will dump the mywiki database to mywikidump.sql.

To restore the dump:

psql mywiki -f mywikidump.sql

You may also want to dump the global information, e.g. the database users:

pg_dumpall --globals > postgres_globals.sql
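
You may also find pg_dump's compressed custom format convenient, since it allows selective restores with pg_restore (a sketch, reusing the mywiki example name):

# Dump in compressed custom format
pg_dump -Fc mywiki > mywikidump.dump
# Restore into an existing, empty database
pg_restore -d mywiki mywikidump.dump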

SQLite

See the SQLite manual page for backup instructions.

phpMyAdmin

Make your wiki read-only by adding $wgReadOnly = 'Site Maintenance'; to LocalSettings.php.

Open phpMyAdmin in your browser, log in, and choose the wiki database (check LocalSettings.php if you're not sure). Select Export. Make sure all tables are selected, and make sure Structure is checked (it's important to maintain the table structure). Optionally check Add DROP TABLE to delete existing references when importing. Make sure Data is checked. Select zipped. Then click Go and save the backup file.

Afterwards, remove $wgReadOnly = 'Site Maintenance'; from LocalSettings.php.

Remember to also back up the file system components of the wiki that might be required, e.g. images, logo, and extensions.


File system

MediaWiki stores other components of the wiki in the file system where this is more appropriate than insertion into the database: site configuration files (e.g. LocalSettings.php), image files (including deleted images, thumbnails, and rendered math and SVG images, if applicable), skin customisations, extension files, etc.

The best method to back these up is to place them into an archive file, such as a .tar file, which can then be compressed if desired. On Windows, applications such as WinZip or 7-zip can be used if preferred.

For Linux variants, assuming the wiki is stored in /srv/www/htdocs/wiki

  tar zcvhf wikidata.tgz /srv/www/htdocs/wiki
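
To keep dated archives rather than overwriting a single file, you can include the date in the archive name (a sketch; GNU tar stores these paths without the leading slash, so extracting with -C / restores them to their original location):

  # Create a dated archive of the wiki directory
  tar zcvhf wikidata-$(date '+%Y%m%d').tgz /srv/www/htdocs/wiki
  # Restore an archive (replace the date with that of the archive to restore)
  tar zxvf wikidata-20131211.tgz -C /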

It should be possible to back up the entire "wiki" folder in "htdocs" if using XAMPP.

XML dump

It is also a good idea to create an XML dump in addition to the database dump. XML dumps contain the content of the wiki (wiki pages with all their revisions), without the site-related data (they do not contain user accounts, image metadata, logs, etc). XML dumps are independent of the database structure, and can be imported into future (and even past) versions of MediaWiki. They are also less likely to cause problems with character encoding, and can readily be processed by third party tools, which makes them a good fallback should your main database dump become unusable, and also as a means of redistributing content en masse.

To create an XML dump, use the command-line tool dumpBackup.php, located in the maintenance directory of your MediaWiki installation. Run php dumpBackup.php without any arguments to display a brief description of the syntax. You need to specify whether you want a full dump of the complete history of every page, or just the current contents of each page. Prior to MediaWiki 1.16: if an attempt to use dumpBackup.php fails with a message about insufficient permissions, ensure that you have a properly configured AdminSettings.php file; the MediaWiki manual has instructions on creating it.
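
For example (a sketch, run from the maintenance directory; --full and --current select between the two dump types described above):

# Full dump: every revision of every page
php dumpBackup.php --full > dump.xml
# Current revisions only
php dumpBackup.php --current > dump.xml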

You can also create an XML dump for a specific set of pages online, using Special:Export, although attempting to dump large quantities of pages through this interface will usually time out.

To import an XML dump into a wiki, use the command-line tool importDump.php. For a small set of pages, you can also use the Special:Import page via your browser (by default, this is restricted to the sysop group). As an alternative to dumpBackup.php and importDump.php, you can use MWDumper, which is faster, but requires a Java runtime environment.
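
A minimal import might look like this (a sketch; rebuildrecentchanges.php is a standard maintenance script, run here so the imported revisions show up correctly in recent changes):

# Import the dump, then rebuild derived data
php importDump.php < dump.xml
php rebuildrecentchanges.php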

See the XML dumps manual page for more information.

Without shell access to the server

If you have no shell access, then use the WikiTeam Python script dumpgenerator.py from a DOS, Unix or Linux command-line. To run the script see the WikiTeam tutorial.
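
An invocation might look like the following (a sketch based on the script's documented options; the API URL is a placeholder for your own wiki's api.php):

# Generate an XML dump (all revisions) plus an image dump via the API
python dumpgenerator.py --api=http://wiki.example.org/w/api.php --xml --images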

See also Meta:Data dumps.

Scripts

Warning: Use these at your own risk. Check your wiki's LocalSettings.php for which character set your wiki uses, and edit the script to suit.

  • Unofficial backup script by [[13]]; creates a backup of all files, a database dump, and an XML dump.
  • Backup script by [[14]]; [[15]] added the ability for the script to place the wiki into read-only mode during the database dump.
  • Backup script by [[16]]; [[17]] added a few lines to circumvent Kaotic's problem with ?>.
  • Backup script by [[18]]; [[19]] set the option for the default character set to binary, along with other minor tweaks.
  • Backup script by [[20]]; creates a backup of all files and the database, with optional backup rotation.
  • Backup Script for Windows - a script for backing up a Windows MediaWiki install. Note: has no restore feature.
  • Unofficial web-based backup script, [[21]], by [[22]] (allwiki.com); you can use it to back up your database, or use the backup files to recover the database; the operation is very easy.
  • WikiTeam tools - if you do not have server access (e.g. your wiki is on a free wiki farm), you can generate an XML dump and an image dump using WikiTeam tools (see some saved wikis).
  • Another backup script that dumps the database, files, and XML; puts the site into read-only mode; timestamps backups; and reads the character set from LocalSettings.php. The script does not need to be modified for each site to be backed up. Does not (yet) rotate old backups. Usage: backup.sh -d backup/directory -w installation/directory
