Manual:Backing up a wiki
It is important to back up your wiki data regularly. This page gives an overview of the backup process for a typical MediaWiki-based wiki; you will probably want to devise your own backup scripts or schedule to suit the size of your wiki and your individual needs.
Overview
MediaWiki stores important data in two places:
- Database
- Pages and their content, user accounts and preferences, metadata, the search index, etc.
- File system
- Software configuration files, custom skins, extensions, images (including deleted images), etc.
Consider making the wiki read-only before creating the backup - see $wgReadOnly. This ensures that all parts of your backup are consistent (although some extensions you have installed may still write data anyway).
File transfer
You will have to choose a method for transferring files from the server where they are:
- Non-private data you can simply publish on archive.org and/or in a dumps/ directory of your webserver.
- SCP (or WinSCP), SFTP/FTP or any other transfer protocol you choose.
- The hosting company might provide a file manager interface via a web browser; check with your provider.
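For example, a dump directory could be pulled from the server with scp; this is just a sketch with a hypothetical host and paths, so adjust it to your own server layout:
scp -r user@wiki.example.org:~/backup/ ./wiki-backups/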
Database
Most of the critical data in the wiki is stored in the database. When using the default MySQL or MariaDB backend, the database can be dumped into a script file which can be used later to recreate the database and all the data in it from scratch.
MySQL
Automysqlbackup
See the package on Debian:
$ apt show automysqlbackup
[...]
Description: automysqlbackup creates backup every day, week and month for all of your MySQL database, to a configured folder. There's nothing to do but to install this package, and you'll rest assured that you have a way to go back in the history of your database.
[...]
Install the package:
# apt install automysqlbackup
All your databases will be saved in /var/lib/automysqlbackup/:
$ find /var/lib/automysqlbackup/
/var/lib/automysqlbackup/
/var/lib/automysqlbackup/weekly
/var/lib/automysqlbackup/weekly/my_wiki
/var/lib/automysqlbackup/weekly/my_wiki/my_wiki_week.18.2016-05-07_15h32m.sql.gz
/var/lib/automysqlbackup/monthly
/var/lib/automysqlbackup/daily
/var/lib/automysqlbackup/daily/my_wiki
Manual backup:
# automysqlbackup
Restore a database:
gunzip < /var/lib/automysqlbackup/weekly/my_wiki/my_wiki_week.18.2016-05-07_15h32m.sql.gz|mysql -uUSER -pPASSWORD my_wiki
For other distributions, see the project page on SourceForge.
Mysqldump from the command line
The most convenient way to create a dump file of the database you want to back up is to use the standard MySQL dump tool mysqldump from the command line. Be sure to get the parameters right or you may have difficulty restoring the database. Depending on database size, mysqldump could take a considerable amount of time.
First, insert the following line into LocalSettings.php:
$wgReadOnly = 'Dumping Database, Access will be restored shortly';
This can be removed as soon as the dump is completed.
Example of the command to run on the Linux/UNIX shell:
mysqldump -h hostname -u userid -p --default-character-set=whatever dbname > backup.sql
Substituting hostname, userid, whatever, and dbname as appropriate. All four may be found in your LocalSettings.php (LSP) file. hostname may be found under $wgDBserver; by default it is localhost. userid may be found under $wgDBuser, and whatever may be found under $wgDBTableOptions, where it is listed after DEFAULT CHARSET=. If whatever is not specified, mysqldump will likely use the default of utf8, or, if using an older version of MySQL, latin1. dbname may be found under $wgDBname.
After running this line from the command line mysqldump will prompt for the server password (which may be found under Manual:$wgDBpassword in LSP).
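For illustration, with typical values filled in (these values are assumptions - take the real ones from your own LocalSettings.php), the command might look like this:
mysqldump -h localhost -u wikiuser -p --default-character-set=utf8 my_wiki > backup.sql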
See mysqldump for a full list of command line parameters.
The output from mysqldump can instead be piped to gzip, for a smaller output file, as follows
mysqldump -h hostname -u userid -p dbname | gzip > backup.sql.gz
A similar mysqldump command can be used to produce XML output instead, by including the --xml parameter.
mysqldump -h hostname -u userid -p --xml dbname > backup.xml
and to compress the file with a pipe to gzip
mysqldump -h hostname -u userid -p --xml dbname | gzip > backup.xml.gz
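To restore such a dump later, feed it back to the mysql client; a minimal sketch using the same placeholders (the target database must already exist):
mysql -h hostname -u userid -p dbname < backup.sql
or, for a gzip-compressed dump:
gunzip < backup.sql.gz | mysql -h hostname -u userid -p dbname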
Remember to also backup the file system components of the wiki that might be required, e.g., images, logo, and extensions.
Running mysqldump with Cron
Cron is the time-based job scheduler in Unix-like computer operating systems. Cron enables users to schedule jobs (commands or shell scripts) to run periodically at certain times or dates.
A sample command that you may run from a crontab may look like this:
nice -n 19 mysqldump -u $USER --password=$PASSWORD $DATABASE -c | nice -n 19 gzip -9 > ~/backup/wiki-$DATABASE-$(date '+%Y%m%d').sql.gz
The nice -n 19 lowers the priority of the process.
Use valid values for $USER, $PASSWORD, and $DATABASE. This will write a backup file with the date in the filename, so you would have a rolling set of backups. If you want to save the files and extensions as well, you might want to use this one.
If you want to add this task to Cron through cPanel, you must escape the character "%":
/usr/bin/mysqldump -u $USER --password=$PASSWORD $DATABASE -c | /bin/gzip > ~/backup/wiki-$DATABASE-$(date '+\%Y\%m\%d').sql.gz
or you will get an error:
/bin/sh: -c: line 0: unexpected EOF while looking for matching `'' /bin/sh: -c: line 1: syntax error: unexpected end of file
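Putting this together, a complete crontab entry that runs the dump every night at 03:30 might look like the following sketch (the time, path and placeholder variables are illustrative; the % signs are escaped because cron otherwise treats them as line endings):
30 3 * * * nice -n 19 mysqldump -u $USER --password=$PASSWORD $DATABASE -c | nice -n 19 gzip -9 > ~/backup/wiki-$DATABASE-$(date '+\%Y\%m\%d').sql.gz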
Tables
Some of the tables dumped have different degrees of temporariness. So, to save disk space (beyond just gzipping), those tables need to be present in a proper dump, but their data can be omitted. However, under certain circumstances the disadvantage of having to rebuild all this data may outweigh the saved disk space (for example, on a large wiki where restoration speed is paramount).
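As a sketch of this approach, largely rebuildable data such as the search index can be dumped structure-only while everything else is dumped normally; the table names below (searchindex, objectcache) are assumptions based on a typical MediaWiki schema, so check your own database first:
mysqldump -u userid -p dbname --ignore-table=dbname.searchindex --ignore-table=dbname.objectcache > backup_data.sql
mysqldump -u userid -p --no-data dbname searchindex objectcache > backup_structure.sql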
See mailing list thread mysql5 binary schema about the topic.
Latin-1 to UTF-8 conversion
See the relevant section of the upgrading page for information about this process. Also see the talk page for more information about working with character sets in general.
PostgreSQL
You can use the pg_dump tool to back up a MediaWiki PostgreSQL database. For example:
pg_dump mywiki > mywikidump.sql
will dump the mywiki database to mywikidump.sql.
To restore the dump:
psql mywiki -f mywikidump.sql
You may also want to dump the global information, e.g. the database users:
pg_dumpall --globals > postgres_globals.sql
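pg_dump can also write its custom archive format, which is compressed and can be restored selectively with pg_restore; a brief sketch, again assuming the database is named mywiki:
pg_dump -Fc mywiki > mywikidump.dump
pg_restore -d mywiki mywikidump.dump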
SQLite
If your wiki is currently offline, its database can be backed up by simply copying the database file.
Otherwise, you should use the maintenance script php maintenance/sqlite.php --backup-to <backup file name>, which will make sure that the operation is atomic and there are no inconsistencies.
If your database is not really huge and the server is not under heavy load, users editing the wiki will notice nothing but a short lag.
Users who are just reading will not notice anything in any case.
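A dated backup file name makes it easier to keep several copies; a sketch (the target path is illustrative):
php maintenance/sqlite.php --backup-to /var/backups/wiki-$(date '+%Y%m%d').sqlite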
phpMyAdmin
Turn your wiki to read-only by adding $wgReadOnly = 'Site Maintenance'; to LocalSettings.php.
Open the browser to your phpMyAdmin link, log in, and choose the wiki database (check LocalSettings.php if you're not sure). Select Export. Make sure all items under Export are highlighted, and make sure Structure is highlighted (it's important to maintain the table structure). Optionally check Add DROP TABLE to delete existing references when importing. Make sure Data is checked. Select zipped. Then click on GO and save the backup file.[1]
Remove $wgReadOnly = 'Site Maintenance'; from LocalSettings.php.
Remember to also backup the file system components of the wiki that might be required, e.g. images, logo, and extensions.
External links
- For a tutorial, see Siteground: MySQL Export: How to backup a MySQL database using phpMyAdmin
HeidiSQL
HeidiSQL is similar to phpMyAdmin, but without any restrictions of phpMyAdmin's free version. HeidiSQL requires a direct database connection, where some hosts may only offer web interfaces (phpMyAdmin) to firewalled databases.
File system
MediaWiki stores other components of the wiki in the file system where this is more appropriate than insertion into the database, for example, site configuration files (LocalSettings.php, AdminSettings.php (finally removed in 1.23)), image files (including deleted images, thumbnails and rendered math and SVG images, if applicable), skin customisations, extension files, etc.
The best method to back these up is to place them into an archive file, such as a .tar file, which can then be compressed if desired. On Windows, applications such as WinZip or 7-Zip can be used if preferred.
For Linux variants, assuming the wiki is stored in /srv/www/htdocs/wiki:
tar zcvhf wikidata.tgz /srv/www/htdocs/wiki
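If you keep several file-system backups around, a dated archive name avoids overwriting the previous one; a sketch using the same path as above:
tar zcvhf wikidata-$(date '+%Y%m%d').tgz /srv/www/htdocs/wiki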
It should be possible to backup the entire "wiki" folder in "htdocs" if using XAMPP.
Backup the content of the wiki (XML dump)
It is also a good idea to create an XML dump in addition to the database dump. XML dumps contain the content of the wiki (wiki pages with all their revisions), without the site-related data (they do not contain user accounts, image metadata, logs, etc).[2]
XML dumps are less likely to cause problems with character encoding, as a means of transferring large amounts of content quickly, and can easily be used by third party tools, which makes XML dumps a good fallback should your main database dump become unusable.
To create an XML dump, use the command-line tool dumpBackup.php, located in the maintenance directory of your MediaWiki installation.
See Manual:dumpBackup.php for more details.
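For example, a typical invocation run from the installation directory looks like the sketch below; --full includes all revisions, while --current would dump only the latest revision of each page (see the manual page for the full option list):
php maintenance/dumpBackup.php --full > dump.xml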
You can also create an XML dump for a specific set of pages online, using Special:Export, although attempting to dump large quantities of pages through this interface will usually time out.
To import an XML dump into a wiki, use the command-line tool importDump.php.
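A sketch of such an import, run from the installation directory (importDump.php also accepts the dump on standard input; running rebuildrecentchanges.php afterwards is commonly recommended so the recent changes list reflects the imported revisions):
php maintenance/importDump.php dump.xml
php maintenance/rebuildrecentchanges.php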
For a small set of pages, you can also use the Special:Import page via your browser (by default, this is restricted to the sysop group).
As an alternative to dumpBackup.php and importDump.php, you can use MWDumper, which is faster, but requires a Java runtime environment.
See Manual:Importing XML dumps for more information.
Without shell access to the server
If you have no shell access, then use the WikiTeam Python script dumpgenerator.py from a DOS, Unix or Linux command-line. Requires Python v2 (v3 doesn't yet work).
To get an XML dump with edit histories, plus a dump of all images with their descriptions (but not the extensions or the LocalSettings.php configuration), run:
python dumpgenerator.py --api=http://www.sdiy.info/w/api.php --xml --images
Full instructions are at the WikiTeam tutorial.
See also Meta:Data dumps.
Scripts
- Unofficial backup script by Flominator; creates a backup of all files and the database, with optional backup rotation.
- User:Darizotas/MediaWiki Backup Script for Windows - a script for backing up a Windows MediaWiki install. Note: Has no restore feature.
- WikiTeam tools - if you do not have server access (e.g. your wiki is in a free wikifarm), you can generate an XML dump and an image dump using WikiTeam tools (see some saved wikis).
- Another backup script that: dumps DB, files (just pictures by default, option to include all files in installation), and XML; puts the site into read-only mode; timestamps backups; and reads the charset from LocalSettings. Script does not need to be modified for each site to be backed up. Does not (yet) rotate old backups. Also provides a script to restore a backup: restore.sh -a backup/directory/dated_archive.tar.gz -w installation/directory.
- Another unofficial MediaWiki backup script for Windows by Lanthanis that: exports the pages of specified namespaces as an XML file; dumps specified database tables; and adds further specified folders and files to a ZIP backup file.
Can be used with Windows task scheduler.
- mw_backup, a script to make periodic backups. It will make daily, weekly and monthly backups of your database and images directory when run as a daily cron job.
See also
- Help:Export is a quick and easy way to save all pages on your wiki.
- Manual:Restoring a wiki from backup
- Manual:Moving a wiki
- Manual:Upgrading
- Manual:Restoring wiki code from cached HTML - if no successful backup is available
- Exporting all the files of a wiki
References
- ↑ Manual_talk:Backing_up_a_wiki#Ubuntu_10.10_-_Step_by_Step_Instructions
- ↑ XML dumps are independent of the database structure, and can be imported into future (and even past) versions of MediaWiki.