Manual:Backing up a wiki/de

It is important to make regular backups of the data in your wiki. This page provides an overview of the backup process for a typical MediaWiki wiki; you will probably want to devise your own backup scripts or schedules to suit the size of your wiki and your individual needs.

Overview
MediaWiki stores important data in two places:
 * Database: pages and their content, users and their preferences, metadata, the search index, etc.
 * File system: configuration files, custom skins, extensions, images (including deleted images), etc.

You should consider making your wiki read-only before creating the backup, so that it is protected from changes in the meantime (see Manual:$wgReadOnly). This ensures that all parts of the backup are complete and consistent.

Data transfer
The following options are available for transferring the backup off the server:

 * Non-private data can simply be published on archive.org and/or placed in a publicly accessible directory of your server.
 * SCP (or WinSCP), SFTP/FTP, or any other transfer protocol you are familiar with.
 * Your hosting provider may even offer a file manager interface accessible through your web browser; check with them.

Database
Most of the wiki's important data is stored in the database, which is usually easy to back up. If you are using the default MySQL backend, several tools are available for dumping the database into a file. The resulting script file can be used to recreate the database from scratch.
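For reference, a dump created this way can later be loaded back into an (empty) database with the standard mysql client. A minimal sketch, using the same placeholder values as in the examples below:

mysql -h hostname -u userid -p dbname < backup.sql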

Dumping the database with mysqldump from the shell
The most convenient way to create a dump file of the database you want to back up is to use the standard MySQL dump tool mysqldump from the command line. Be sure to get the parameters right or you may have difficulty restoring the database. Depending on database size, mysqldump could take a considerable amount of time.

First, add the following line to LocalSettings.php:

$wgReadOnly = 'Dumping Database, Access will be restored shortly';

This line can be removed again once the backup is complete.

An example of the command to run from the Linux/UNIX command line:

mysqldump -h hostname -u userid -p --default-character-set=whatever dbname > backup.sql

Substitute hostname, userid, whatever, and dbname as appropriate. All four may be found in your LocalSettings.php (LSP) file: hostname is listed under $wgDBserver (by default it is localhost); userid under $wgDBuser; the character set whatever under $wgDBTableOptions, where it is listed after DEFAULT CHARSET=; and dbname under $wgDBname. If no character set is specified, mysqldump will likely use the default of utf8, or, on an older version of MySQL, latin1. After running this command, mysqldump will prompt for the server password (which may be found under Manual:$wgDBpassword in LSP).

See mysqldump for a full list of command line parameters.

The output of mysqldump can be reduced in size by compressing it with gzip: redirect the standard output through a pipe (|) to gzip or another compressor.

mysqldump -h hostname -u userid -p dbname | gzip > backup.sql.gz

A similar mysqldump command can be used to produce XML output instead, by including the --xml parameter:

mysqldump -h hostname -u userid -p --xml dbname > backup.xml

To compress the file, pipe it to gzip:

mysqldump -h hostname -u userid -p --xml dbname | gzip > backup.xml.gz

Remember to also back up the file system components of the wiki that might be required, e.g. images, logo, and extensions.

Running mysqldump periodically from cron
Cron is the time-based job scheduler in Unix-like computer operating systems. Cron enables users to schedule jobs (commands or shell scripts) to run periodically at certain times or dates.

A sample command that you may run from a crontab may look like this:

nice -n 19 mysqldump -u $USER --password=$PASSWORD $DATABASE -c | nice -n 19 gzip -9 > ~/backup/wiki-$DATABASE-$(date '+%Y%m%d').sql.gz

The nice -n 19 lowers the priority of the process.

Use valid values for $USER, $PASSWORD, and $DATABASE. This writes a backup file with the current date in the filename, giving you a dated series of backups. If you want to save the files and extensions as well, one of the backup scripts listed under Scripts below may suit you better.

If you add this task through Cpanel's cron interface, you must escape the character "%" (standard cron treats an unescaped "%" in a command as a newline):

/usr/bin/mysqldump -u $USER --password=$PASSWORD $DATABASE -c | /bin/gzip > ~/backup/wiki-$DATABASE-$(date '+\%Y\%m\%d').sql.gz

Otherwise you will get an error:

/bin/sh: -c: line 0: unexpected EOF while looking for matching `''
/bin/sh: -c: line 1: syntax error: unexpected end of file
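Putting the pieces together, a complete crontab entry that runs the dump every night at 03:00 might look like the following; the schedule and the ~/backup target directory are assumptions to adapt to your setup (note the escaped percent signs):

0 3 * * * nice -n 19 mysqldump -u $USER --password=$PASSWORD $DATABASE -c | nice -n 19 gzip -9 > ~/backup/wiki-$DATABASE-$(date '+\%Y\%m\%d').sql.gz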

Tables
On close examination, some of the tables dumped turn out to be temporary to varying degrees. So, to save disk space (beyond just gzipping), those tables need to be present in a proper dump, but their data does not. However, in certain circumstances the disadvantage of having to rebuild all this data may outweigh the saving in disk space (for example, on a large wiki where restoration speed is paramount).

See mailing list thread mysql5 binary schema about the topic.
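As a sketch of this approach, you could dump the full table structure but skip the data of rebuildable tables. The table names below are assumptions based on the default MediaWiki schema without a table prefix; objectcache repopulates itself, and searchindex can be rebuilt with the rebuildtextindex.php maintenance script:

mysqldump -h hostname -u userid -p --no-data dbname > wiki_schema.sql
mysqldump -h hostname -u userid -p --ignore-table=dbname.objectcache --ignore-table=dbname.searchindex dbname > wiki_data.sql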

Latin-1 to UTF-8 conversion
See the relevant section of the upgrading page for information about this process. Also see the talk page for more information about working with character sets in general.

PostgreSQL
You can use the pg_dump tool to back up a MediaWiki PostgreSQL database. For example:

pg_dump mywiki > mywikidump.sql

will dump the mywiki database to mywikidump.sql.

To restore the dump: psql mywiki -f mywikidump.sql

You may also want to dump the global information, e.g. the database users:

pg_dumpall --globals > postgres_globals.sql
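Those globals can later be restored by feeding the file to psql while connected to an existing database, typically postgres; a sketch:

psql -f postgres_globals.sql postgres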

phpMyAdmin
Turn your wiki to read-only by adding the $wgReadOnly line shown above to LocalSettings.php.

Open your phpMyAdmin URL in the browser, log in, and choose the wiki database (check LocalSettings.php if you are not sure). Select Export. Make sure all items under Export are highlighted, and make sure Structure is highlighted (it is important to maintain the table structure). Optionally check Add DROP TABLE to delete existing references when importing. Make sure Data is checked. Select zipped. Then click Go and save the backup file.

Afterwards, remove the $wgReadOnly line from LocalSettings.php again.

Remember to also back up the file system components of the wiki that might be required, e.g. images, logo, and extensions.

HeidiSQL
HeidiSQL is similar to phpMyAdmin, but without the restrictions of phpMyAdmin's free version.

File system
MediaWiki stores other components of the wiki in the file system where this is more appropriate than insertion into the database, for example, site configuration files (LocalSettings.php, AdminSettings.php (finally removed in 1.23)), image files (including deleted images, thumbnails and rendered math and SVG images, if applicable), skin customisations, extension files, etc.

The best method to back up these files is to place them into an archive file, such as a .tar file, which can then be compressed if desired. On Windows, applications such as WinZip can be used.

For Linux variants, assuming the wiki is stored in /srv/www/htdocs/wiki:

tar zcvhf wikidata.tgz /srv/www/htdocs/wiki

It should be possible to back up the entire "wiki" folder in "htdocs" if using XAMPP.
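GNU tar strips the leading / from the stored paths at creation time, so the archive can later be unpacked back into place from the filesystem root. A sketch of the restore step (double-check the target before overwriting a live installation):

tar zxvf wikidata.tgz -C /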

Backup the content of the wiki (XML dump)
It is also a good idea to create an XML dump in addition to the database dump. XML dumps contain the content of the wiki (wiki pages with all their revisions), without the site-related data (they do not contain user accounts, image metadata, logs, etc).

XML dumps are less likely to cause problems with character encoding, are a fast way of transferring large amounts of content, and are easily used by third-party tools, which makes them a good fallback should your main database dump become unusable.

To create an XML dump, use the command-line tool dumpBackup.php, located in the maintenance directory of your MediaWiki installation. See Manual:dumpBackup.php for more details.
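For example, a full-history dump could be written like this, run from within the maintenance directory (the output filename is arbitrary):

php dumpBackup.php --full > dump.xml

Use --current instead of --full to include only the latest revision of each page.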

You can also create an XML dump for a subset of pages online using the Special:Export page, although attempting to dump large amounts of text or long revision histories with this tool will usually fail with a timeout.

To import an XML dump into a wiki, use the command-line tool importDump.php. For a small set of pages, you can also use the Special:Import page via your browser (by default, this is restricted to the sysop group). As an alternative to dumpBackup.php and importDump.php, you can use MWDumper, which is faster, but requires a Java runtime environment.
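A sketch of the corresponding import step, again run from the maintenance directory and with an assumed filename:

php importDump.php dump.xml
php rebuildrecentchanges.php

Running rebuildrecentchanges.php afterwards updates the recent changes data for the imported pages.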

See Manual:Importing XML dumps for more information.

Without shell access to the server
If you have no shell access, use the WikiTeam Python script dumpgenerator.py from a DOS, Unix, or Linux command line. To run the script, see the WikiTeam tutorial.
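A typical invocation, substituting your wiki's api.php URL for the example one (flags as documented by WikiTeam), looks roughly like this:

python dumpgenerator.py --api=http://wiki.example.org/w/api.php --xml --images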

See also Data dumps.

Scripts

 * Unofficial backup script by User:Duesentrieb.
 * Unofficial backup script by Flominator; creates a backup of all files and the database, with optional backup rotation.
 * User:Darizotas/MediaWiki Backup Script for Windows - a script for backing up a Windows MediaWiki install. Note: has no restore feature.
 * Unofficial web-based backup script, mw_tools, by Wanglong (allwiki.com); you can use it to back up your database, or use the backup files to recover the database; the operation is very easy.
 * WikiTeam tools - if you do not have server access (e.g. your wiki is in a free wiki farm), you can generate an XML dump and an image dump using WikiTeam tools (see some saved wikis).
 * Fullsitebackup
 * Another backup script that dumps the database, files, and XML; puts the site into read-only mode; timestamps backups; and reads the charset from LocalSettings.php. The script does not need to be modified for each site to be backed up. It does not (yet) rotate old backups.
 * Script to make periodic backups, mw_backup. This script will make daily, weekly and monthly backups of your database and images directory when run as a daily cron job.

See also

 * Manual:Restoring a wiki from backup
 * Manual:Moving a wiki
 * Manual:Upgrading
 * Manual:Restoring wiki code from cached HTML (if you don't have a successful backup)