Manual:System administration

This page documents common administrative tasks which you may wish to perform once your installation of MediaWiki is completed.

Toggle uploads on and off

 * See Configuring file uploads

Setting administrative rights
MediaWiki is a permissions-based wiki system. That means that users can only perform the actions they are permitted to. As declaring permissions individually for every user would be tedious and impractical, several user rights groups are pre-defined in MediaWiki. You can also declare new user groups as your needs require. Individual extensions may also require creating new user rights.

Regardless of how they are created, user groups are fully customizable, by modifying the $wgGroupPermissions associative array in your LocalSettings.php file.

Sysop (Administrator)
The most commonly used group. A user marked as 'sysop' can delete and undelete pages, block and unblock IPs, issue read-only SQL queries to the database, and use a shortcut feature on contributions pages that reverts a page to the previous contributor's revision. (See Wikipedia:Administrators for information on sysops on Wikipedia specifically.)

Bot
A registered bot account. Edits by an account with this flag set will not appear by default in Recentchanges; this is intended for mass imports of data without drowning out human edits. (Add &hidebots=0 to the URL to list changes made by bots as well.)

Choosing a design: Selecting a skin
Currently, the following skins are available:
 * Monobook (default)
 * Chick
 * Classic
 * Cologne Blue
 * MySkin
 * Nostalgia
 * Simple

Setting and Unsetting the default skin
In MediaWiki 1.1.0, to set the default skin, adjust the following array in ./languages/Language.php (0 = Standard, 1 = Nostalgia, 2 = CologneBlue):

/* Use the "skin" element for the default skin. */
$wgDefaultUserOptionsEn = array(
    "quickbar" => 1,
    "underline" => 1,
    "hover" => 1,
    "cols" => 80,
    "rows" => 25,
    "searchlimit" => 20,
    "contextlines" => 5,
    "contextchars" => 50,
    "skin" => 2,
    "math" => 1,
    "rcdays" => 7,
    "rclimit" => 50,
    "highlightbroken" => 1,
    "stubthreshold" => 0,
    "previewontop" => 1,
    "editsection" => 1,
    "editsectiononrightclick" => 0,
    "showtoc" => 1,
    "date" => 0
);

Localization
MediaWiki's interface has been translated into over 100 languages, and a translated interface can be set up with minimal effort. The internationalized message files for the interface are located in the /languages subdirectory of your MediaWiki installation.

To modify the default language of your wiki, edit your LocalSettings.php file: open it with a text editor, look for the $wgLanguageCode variable declaration, and set it to the two-letter language code of your choice. (The language code for a particular language is visible in the messages file of that language.)
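As a sketch, the edit can also be scripted; this assumes the installer-generated single-line $wgLanguageCode declaration, and "de" is just an example target code (the file below is a minimal stand-in for a real LocalSettings.php):

```shell
# Minimal stand-in for LocalSettings.php (your real file has much more in it).
printf '$wgLanguageCode = "en";\n' > LocalSettings.php

# Rewrite the declaration to the new two-letter code; -i.bak keeps a backup copy.
sed -i.bak 's/^\$wgLanguageCode = .*/$wgLanguageCode = "de";/' LocalSettings.php

grep '^\$wgLanguageCode' LocalSettings.php   # -> $wgLanguageCode = "de";
```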

Registered users will be able to override this setting by modifying their user preferences.

In MediaWiki 1.8 or older, if you change this after installation, you should run the maintenance/rebuildMessages.php script to rebuild the user interface messages (MediaWiki namespace). Otherwise, the interface will not be switched to the new language, or it will appear as a mix of the old and new languages.

Note that running that script will override any custom interface messages you may have created!

As features are continuously added to MediaWiki, interface files may not be fully translated. The degree to which each file is translated is documented in Localization statistics. If you translate part of the interface, please submit your changes to Bugzilla so other users can benefit from them.

Upgrading MediaWiki

 * See Manual:Upgrading

Getting data: Importing a database dump
If you want some data to play with in a local copy, you can get database dumps from several Wikimedia sites; see an overview at Database download and the download site.

Getting the required software
Required software: bzip2. See Compression for some interesting facts about the bz2 format.

You can get bzip2 for free at the following locations:
 * Microsoft Windows: a command-line Windows version of bzip2 is available for free under a BSD license. A GUI file archiver, 7-Zip, which can also open bz2-compressed files, is available for free as well.
 * GNU/Linux: most distributions ship with the command-line bzip2 tool. E.g. on Debian/GNU Linux it's apt-gettable by apt-get install bzip2.
 * MacOS X: ships with the command-line bzip2 tool.
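Once downloaded, an archive can be sanity-checked and decompressed from the command line; a minimal sketch (file names are illustrative, and a tiny stand-in dump is created so the example is self-contained):

```shell
# Tiny stand-in dump so the example is self-contained.
printf 'INSERT INTO cur VALUES (1);\n' > cur_table.sql
bzip2 -kf cur_table.sql          # compress; -k keeps the original, -f overwrites

bzip2 -t cur_table.sql.bz2       # integrity test: silent on success, nonzero exit on damage
bzip2 -dc cur_table.sql.bz2      # decompress to stdout, leaving the archive in place
```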

Getting the data
SQL dumps of Wikipedia and several other Wikimedia sites are available at en:Wikipedia:Database download and the download site. You can download them for free and use them according to the GNU Free Documentation License.

The dumps are (automatically?) created at least once a week, at least for the English Wikipedia. The total size of the database exceeded 3 Gigabytes at the end of November 2003, so you might want to start with smaller packages, e.g. only the current revision.

On the download site, you will find a list of available files, sorted by Wikimedia project (e.g. en.wikipedia.org for the English Wikipedia). For each Wikimedia site you'll find two files:
 * cur - includes just the current revisions of the articles and is significantly smaller.
 * old - includes all revisions of the articles and is usually quite huge.

For local testing purposes you need only one cur file for one WikiMedia site. Please note that these downloads do not include images or other binary material.

Checking the file integrity
There are md5 checksums attached to every file, e.g.

md5 3cca5e147d5e699dcbade2316f79c340

This information helps you to verify the integrity of your download. If the checksums of the source file and your local download don't match, the archive has probably been corrupted during the transfer.

md5sum computes a 128-bit checksum (or fingerprint or message-digest) for each specified file. The tool to generate the checksums is Free Software and available for all supported platforms:


 * GNU/Linux: md5sum is part of the former GNU text utilities (now GNU core utilities); they can be downloaded via the coreutils page, and every GNU/Linux distribution includes them. E.g. on Debian GNU/Linux, you can get them via apt-get install coreutils.
 * Microsoft Windows: e.g. md5sum.exe (48 kB);
 * MacOS X: The command-line tool /sbin/md5 is part of MacOS X.

Another widespread tool, openssl, can be used to generate MD5 checksums: openssl md5 the_filename
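As a sketch of the verification step, md5sum's -c mode compares a file against a checksum list (GNU coreutils; file names are illustrative, and the checksum file is generated locally here instead of being downloaded):

```shell
# Stand-in for a downloaded dump plus its published checksum file.
printf 'dump contents\n' > cur_table.sql.bz2
md5sum cur_table.sql.bz2 > cur_table.sql.bz2.md5   # normally you download this alongside the dump

# Verify: prints "cur_table.sql.bz2: OK" and exits 0 when the file is intact;
# a corrupted download makes it print FAILED and exit nonzero.
md5sum -c cur_table.sql.bz2.md5
```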

Importing the database dump
The following command will decompress the database dump 'cur_table.sql.bz2', output it to standard out, pipe this to the MySQL database 'wikidb', using the user 'wikiadmin' and the password 'adminpass':

bzip2 -dc cur_table.sql.bz2 | mysql -u wikiadmin -padminpass wikidb

Similarly, you import 'old_table.sql.bz2' into the MySQL database:

bzip2 -dc old_table.sql.bz2 | mysql -u wikiadmin -padminpass wikidb

This will take some time; depending on your system and the size of the database dump, you might have to wait for several minutes; e.g. on an SMP system with two AMD Athlon 1900+ CPUs and 1 GB RAM, it takes about ten minutes to process a 300 MB dump file. Please note that this command doesn't output any status information, neither when it's working nor when it's finished.
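Because the import pipeline is silent, a dry run of its decompression half can be reassuring. This sketch substitutes grep for the mysql client to count INSERT statements before committing to the real import (file names are illustrative):

```shell
# Build a tiny compressed stand-in dump so the example is self-contained.
printf 'INSERT INTO cur VALUES (1);\nINSERT INTO cur VALUES (2);\n' > cur_table.sql
bzip2 -f cur_table.sql                           # yields cur_table.sql.bz2

# Dry run: decompress to stdout and count statements instead of piping to mysql.
bzip2 -dc cur_table.sql.bz2 | grep -c '^INSERT'  # -> 2

# The real import pipes the same stream into the database:
# bzip2 -dc cur_table.sql.bz2 | mysql -u wikiadmin -padminpass wikidb
```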

After this, you have to issue the following command from the MediaWiki maintenance directory:

pre 1.5:    php rebuildLinks.php

since 1.5:  php refreshLinks.php

This will show something like this:

Rebuilding/Refreshing link tables (pass 1).
1000 of 348842 articles scanned.
2000 of 348842 articles scanned.
3000 of 348842 articles scanned.
4000 of 348842 articles scanned.
5000 of 348842 articles scanned.
6000 of 348842 articles scanned.
...

This script will rebuild the link tracking tables from scratch. This again takes quite a while, maybe even several hours, depending on the database size and server configuration.

Platform specific notes:
 * On Debian GNU/Linux, the PHP interpreter is called 'php4'.
 * On Microsoft Windows XP, you can use the program bunzip2 (download the Windows version) as follows: in a command prompt, run "bunzip2 xxx" where xxx is the compressed file.

If you're lucky, the script finishes without any problem.

Troubleshooting
If the script encounters errors, you might have run into one of the following problems:

Errors in the SQL dump.

ERROR 1030 at line 146: Got error 28 from table handler

or

ERROR 3 at line 166: Error writing file '/var/log/mysql.log' (Errcode: 28)

Errors like this might indicate that you've run out of disk space. Check /var/log/mysql.log and disk space (df -h)
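Free space can also be checked programmatically before starting a large import; a sketch using POSIX df -P (the 1 GiB threshold is arbitrary):

```shell
# Available space (in 1K blocks) on the filesystem holding the current directory.
# -P forces POSIX single-line output so awk can pick field 4 reliably.
avail_kb=$(df -Pk . | awk 'NR==2 {print $4}')
echo "available: ${avail_kb} KB"

# Warn if less than ~1 GiB is free before starting an import.
if [ "$avail_kb" -lt 1048576 ]; then
    echo "warning: low disk space" >&2
fi
```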

MySQL reports an error.

If you encounter error messages like the following when running the rebuildlinks script:

syntactical error in the sql statement "INSERT INTO rebuildlinks (rl_f_id,rl_f_title,rl_to) VALUES "

or:

1064: You have an error in your SQL syntax. Check the manual that corresponds to your MySQL server version for the right syntax to use near '' at line 1.

Please note that the rebuildlinks script was heavily rewritten during November and December 2003; you may wish to grab the current development branch from CVS (which we're running now on Wikipedia) or wait a few days for the new stable release. [Brion Vibber]

MySQL server not running:

If you encounter a message like:

ERROR 2002: Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)

check whether mysqld is running; if not, start it (e.g. /etc/init.d/mysql start) and run the script again.

Script finishes, but the counter on the Main_Page reports "0 articles"

When importing just the cur tables, rebuildlinks.php runs fine, but when accessing the Main_Page ("Hauptseite"), the wiki reports "0 articles" even though the pages exist; in this case, several thousand of them.

To rebuild the article count manually:

Start the MySQL monitor and enter your MySQL root password when prompted:

mysql -u root -p

Select the MediaWiki database:

mysql> use wiki

Then enter the following SQL statement:

mysql> SELECT @foo:=COUNT(*) FROM cur WHERE cur_namespace=0 AND cur_is_redirect=0 AND cur_text like '%[[%';

This will result in something like this:

+----------------+
| @foo:=COUNT(*) |
+----------------+
|          38189 |
+----------------+
1 row in set (2.96 sec)

Then enter another SQL statement:

mysql> UPDATE site_stats SET ss_good_articles=@foo;

This will result in something like this:

Query OK, 1 row affected (0.00 sec)
Rows matched: 1  Changed: 1  Warnings: 0

That's it; the counter on the Main_Page should now be correct. [Brion Vibber]

Other problems

tbd

Images and uploaded files
A collected archive of uploaded images is not yet available. (tbd: reason, workaround)


 * This would seem quite helpful. A single export that tar'd the db w/ the images/uploads. Especially if MediaWiki site maintainers just wanted to share their work/data.

Dumps in TomeRaider format
Database dumps are available also in TomeRaider format (in German).

This version contains all articles, but not meta information such as user pages or upload info, nor media such as pictures or music. Mathematical formulae entered in TeX notation appear as source code instead of rendered formulae.

Downloads are available from http://download.wikipedia.org/tomeraider/.

Example: Download of the German Wikipedia in TomeRaider format:


 * Palm version
 * PocketPC version
 * EPOC version

More information is available from Erik Zachte's website.

Dumps in Mobipocket format
Database dumps for some Wikipedias are available also in Mobipocket format (in German). This format might be used on PDAs running operating systems like Palm, Pocket PC, Windows CE, Franklin E-BookMan, and EPOC.


 * Download

The Mobipocket reader for Palm OS, Pocket PC, Symbian, Smartphones, Franklin eBookman, BlackBerry, and Windows can be downloaded free of charge from www.mobipocket.de.

Backing up data: Creating a database dump
You can create a backup of the live database using the mysqldump program. A simple way to run it is:

mysqldump --user=<user> --password=<password> <dbname> > wiki-mysql-dump.txt

where the parameters to mysqldump should match the database settings ($wgDBuser, $wgDBpassword, $wgDBname) in your LocalSettings.php file.

Depending on the size of your wiki, this could take a few seconds to a few minutes. During that time the database is locked, so parts of the wiki web interface may appear to hang or emit strange error messages.

Instead of doing this, I have chosen to write a small script that makes a "hot copy" of the databases:

#!/bin/sh
OLDAGE=10d
CURDATE=`date "+%Y.%m.%d"`
DESTDIR=/path/to/backup/destination
WIKIWEB=/path/to/html/wiki
echo "================================"
echo "Backing up the database files..."
cd $DESTDIR
/usr/local/bin/mysqlhotcopy -u wikiuser -p wikipass wikidb $DESTDIR
mv wikidb wikidb.$CURDATE
echo "================================"
echo "Checking for database corruption..."
cd wikidb.$CURDATE
/usr/local/bin/myisamchk -e -s *.MYI
cd ..
echo "================================"
echo "Creating tarball..."
tar -czpf wikidb.$CURDATE.tgz wikidb.$CURDATE
rm -rf wikidb.$CURDATE
echo "================================"
echo "Copying LocalSettings.php (if needed)..."
cp -nv $WIKIWEB/LocalSettings.php .
echo "================================"
echo "Incrementally copying the images..."
/usr/local/bin/rsync -av $WIKIWEB/images .
echo "================================"
echo "Removing old backups"
find $DESTDIR -name "wikidb.*.tgz" -ctime +$OLDAGE | xargs rm
echo "================================"
echo -n "backup script completed "
date

Notes from readers:
 * Use cp -Rpnv $WIKIWEB/images . to copy the images recursively.
 * The final find | xargs rm line makes rm complain when find returns nothing; it is better to use:
   find $DESTDIR -name "wikidb.*.tgz" -ctime +$OLDAGE -exec rm -f {} \;

(This might have some minor errors, I needed to make it more generic to publish it.) --NickT 01:04, 1 Sep 2004 (UTC)
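The old-backup cleanup step of such a script can be exercised in isolation; this self-contained sketch uses -mtime instead of -ctime, because file ages are simulated here with GNU touch -d (which sets only the modification time), and -exec instead of xargs so an empty match list is harmless:

```shell
mkdir -p backups

# Simulate one stale and one fresh backup archive (names are illustrative).
touch -d '20 days ago' backups/wikidb.2004.08.10.tgz
touch backups/wikidb.2004.08.30.tgz

# Delete archives older than 10 days; -exec avoids the "rm: missing operand"
# failure that 'find ... | xargs rm' produces when nothing matches.
find backups -name 'wikidb.*.tgz' -mtime +10 -exec rm -f {} \;

ls backups   # -> wikidb.2004.08.30.tgz
```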

If you encounter problems with this backup script, try the following:


 * Check the paths (`which mysqlhotcopy myisamchk')
 * Remove the -n option to cp.
 * Connect to wikidb as the mysql root user, or grant the LOCK TABLES privilege to wikiuser
 * Execute the script as root, or some other user with access to the mysql files

Note that mysqlhotcopy needs to be run as root or a user that can access the mysql files. Also, mysqlhotcopy locks the database tables, but the mediawiki installation does not grant the LOCK TABLES privilege to wikiuser. To grant the privilege, do

mysql> grant lock tables on wikidb.* to wikiuser;
mysql> flush privileges;

-- 84.188.216.116 02:39, 18 August 2005 (UTC)

Extending the database: Converting (importing) existing content
For existing sites it is always a tough task to migrate to a wiki structure; the process of wikifying existing content from text files, HTML websites, or even office documents can be automated, but you'll have to write appropriate scripts on your own. As far as we know, there are no ready-to-run scripts available for this purpose. Unlike commercial content management systems such as Hyperwave, or HTML editors such as Microsoft FrontPage, MediaWiki and other open-source wiki software do not include import filters. There are some exceptions, which are discussed in the following sections.

Converting content from a UseMod Wiki
Prior to MediaWiki (Wikipedia Software Phase III and Phase II), Wikipedia used the UseMod Wiki software written by Clifford Adams. UseModWiki is a Perl script which uses a database of text files to generate a WikiWiki site. Its primary access method is CGI via the Web, but it can also be called directly by other Perl programs. To convert an existing UseMod Wiki site, there is a script available in the /maintenance subdirectory of your MediaWiki source folder.

The storage format of UseMod Wiki is well documented: DataBase.

tbd

Converting content from a PHPWiki
If you only have a few pages to convert, and the content isn't sensitive, you might want to try WebForce's online markup converter.

For larger PHPWikis, Isaac Wilcox has written a Perl script to do the conversion. It converts all the commonly used markup (still not 100% of markup, but most PHPWikis will only need minor tweaks after conversion; patches are welcome). It's written for the MediaWiki 1.4.x database schema, though updating it to handle 1.5.x should be fairly easy (again, patches welcome).

Also see PhpWiki conversion for a solution that uses "sed".

Converting content from a database dump
tbd

Converting content from a CSV text file
tbd

Converting content from HTML text file
If you have only an HTML excerpt or a few pages to convert, you might want to try Diberri's html2wiki converter, which uses the HTML::WikiConverter Perl module from CPAN. For larger collections of files, one should probably use the module itself.

There are probably other HTML to Wiki markup converters.

See the section below for importing into MediaWiki.

Converting content from a MS-Word document
Try: Word2MediaWikiPlus

Converting content from other sources
If you are able and willing to do some scripting by yourself, it is possible to import almost any existing textual content with a documented file format into MediaWiki.

The key to this is to insert rows into the cur table. Specifically, you need to fill the fields cur.cur_title and cur.cur_text. They contain the page titles and the wiki text, respectively. This is enough to start with. However, the link tables must still be updated so that "What links here" and similar pages work.


 * Note: As of version 1.5.0, the text above is incorrect; the database schema for handling revisions has changed significantly. I might be inclined to work upon some conversion scripts in the future, however. Rob Church Talk 12:06, 12 October 2005 (UTC)

As an example, the following script may be of interest: it imports the public domain data from the CIA World Factbook 2002 into MediaWiki. It was written by Evan Prodromou (e-mail: evan(at)wikitravel(dot)org) for his own use on Wikitravel, the free, complete, up-to-date and reliable world-wide travel guide. It is licensed under the GNU General Public License and is available for download.

Please note that it's a one-time script; most paths and stuff are hard-coded, and lots of the code is for parsing the CIA World Factbook print pages, but it might serve as a good example of what can be done.


 * Does anybody know where this script is available? Matthewsim 23:27, 27 Aug 2004 (UTC)
 * found it on Wikitravel site Renmiri 21:57, 16 February 2006 (UTC)

Cleaning up the Database
Suppose you discover that you have run out of disk space and have to delete your local copy of the Wikipedia:

# df
Filesystem           1K-blocks      Used Available Use% Mounted on
/dev/hda1              2162884   2055124         0 100% /
# mysql -u wikiuser -p
mysql> use wikidb
mysql> drop database wikidb;
mysql> exit;
# df
Filesystem           1K-blocks      Used Available Use% Mounted on
/dev/hda1              2162884   1726896    326120  85% /

Manually: Copy each page whose history you want to discard to your clipboard, delete the page, and recreate it from the clipboard. This way everyone can see what you did, which might be perceived as nice or not-nice. It is also enormously time-consuming and prone to error.

Automatically: Study the database structure carefully and craft the right SQL statements to make the modifications you want. Please do tell people that you have modified something, so they know not to fully trust what the history says. If the install is public, archived copies may be used to detect such changes, which will put you in a bad light with anyone who can access that level of detail.

or

You could use mysqldump -uusername -ppassword databasename > file.sql and edit the dump with Emacs or whatever editor you have available.

Of course, if you are out of disk space, you couldn't dump to a local file. You could pipe the output to a remote file via ssh:

mysqldump -uusername -ppassword databasename | ssh user@host "cat > remote_file"

drop and create the database in MySQL

and

mysql -uusername -ppassword databasename < file.sql

to import the modified data.
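The "edit the dump" step can often be scripted instead of done by hand; as a sketch, here is how sed could strip all rows of one table from a dump (the table and file names are illustrative, and a tiny stand-in dump is created so the example is self-contained):

```shell
# Tiny stand-in dump so the example is self-contained.
cat > file.sql <<'EOF'
INSERT INTO cur VALUES (1,'Main_Page');
INSERT INTO old VALUES (1,'Main_Page','old revision');
INSERT INTO cur VALUES (2,'Sandbox');
EOF

# Drop every INSERT for the 'old' table, keeping everything else.
sed '/^INSERT INTO old /d' file.sql > file.edited.sql

cat file.edited.sql   # only the two 'cur' INSERTs remain
```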

Changing the second default title
A way to change the "second title" that appears on top of every page is to modify the MediaWiki:Tagline system message (in older versions, edit MediaWiki:Fromwikipedia).

Maintenance scripts
The Maintenance scripts are located in the 'maintenance' subdirectory. They include tools for the following purposes:

tbd

The .sql scripts in this directory are not meant to be run standalone, although they can be in some cases if you know what you're doing. Most of the time you'll want to run the .php scripts from the command line. You must run them from this directory, and the LocalSettings.php file in the directory above must point to the installation.

The scripts in the 'archive' subdirectory are for updating databases from older versions of the software.

In order to use the maintenance scripts, you have to create AdminSettings.php as shown in AdminSettings.sample.