Manual:System administration

This is part of the MediaWiki Documentation.

Introduction
This guidebook covers common administrative tasks on your public or local MediaWiki installation. These tasks are recommended, but not required, for running a MediaWiki site. It assumes that you have a fully working installation. This document is of no relevance to you if you just want to learn how to add or edit articles.

This guide consists of the following sections ... (tbd)

File List
A full installation of the MediaWiki software consists of the following files in your webserver's public directory:

(tbd: short descriptions of what these files do)


 * Article.php
 * Block.php
 * CacheManager.php
 * DatabaseFunctions.php
 * DefaultSettings.php
 * DifferenceEngine.php
 * FulltextStoplist.php
 * GlobalFunctions.php
 * Interwiki.php
 * LanguageEn.php
 * Language.php
 * LanguageUtf8.php
 * LinkCache.php
 * LinksUpdate.php
 * LocalSettings.php
 * LogPage.php
 * Math.php
 * MemCachedClient.inc.php
 * MemcachedSessions.php
 * Namespace.php
 * OutputPage.php
 * redirect.phtml
 * SearchEngine.php
 * SearchUpdate.php
 * Setup.php
 * SiteStatsUpdate.php
 * SkinCologneBlue.php
 * SkinFramed.php
 * SkinNostalgia.php
 * Skin.php
 * SkinStandard.php
 * SpecialAllpages.php
 * SpecialAncientpages.php
 * SpecialAsksql.php
 * SpecialBlockip.php
 * SpecialBooksources.php
 * SpecialContributions.php
 * SpecialDebug.php
 * SpecialEmailuser.php
 * SpecialImagelist.php
 * SpecialIpblocklist.php
 * SpecialListusers.php
 * SpecialLockdb.php
 * SpecialLonelypages.php
 * SpecialLongpages.php
 * SpecialMaintenance.php
 * SpecialMovepage.php
 * SpecialNeglectedpages.php
 * SpecialNewpages.php
 * SpecialPopularpages.php
 * SpecialPreferences.php
 * SpecialRandompage.php
 * SpecialRecentchangeslinked.php
 * SpecialRecentchanges.php
 * SpecialShortpages.php
 * SpecialSpecialpages.php
 * SpecialStatistics.php
 * SpecialUndelete.php
 * SpecialUnlockdb.php
 * SpecialUnusedimages.php
 * SpecialUpload.php
 * SpecialUserlogin.php
 * SpecialUserlogout.php
 * SpecialVote.php
 * SpecialWantedpages.php
 * SpecialWatchlist.php
 * SpecialWhatlinkshere.php
 * texvc.phtml
 * Title.php
 * UpdateClasses.php
 * User.php
 * UserTalkUpdate.php
 * UserUpdate.php
 * Utf8Case.php
 * Version.php
 * ViewCountUpdate.php
 * WatchedItem.php
 * wiki.phtml

If you are running a language other than English, there might be some other files as well:


 * LanguageDe.php - or other language files

Also, you will find two directories below your MediaWiki directory:


 * style
 * upload

In addition, the ../maintenance subdirectory of your MediaWiki source directory contains some tools, scripts, and SQL queries.

The .sql scripts in this directory are not meant to be run standalone, although they can be in some cases if you know what you're doing. Most of the time you'll want to run the .php scripts from the command line. You must run them from this directory, and the LocalSettings.php file in the directory above must point to the installation.
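The "run them from this directory" requirement is easy to get wrong. As an illustrative sketch (not part of MediaWiki), a small guard like the following can be put at the top of your own wrapper scripts to check the location before invoking a maintenance script:

```shell
# Sketch: verify we are inside the maintenance/ subdirectory, i.e. that
# LocalSettings.php exists one level up, before running any .php script.
check_location() {
  if [ -f ../LocalSettings.php ]; then
    echo "OK: LocalSettings.php found one level up."
  else
    echo "Error: run maintenance scripts from the maintenance/ subdirectory."
  fi
}
check_location
```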


 * build-intl-wiki.sql - Experimental: create shared international database for new interlinking code.
 * database.sql - SQL script to create database for wiki. This is run from the installation script which replaces the variables with their values from local settings.
 * indexes.sql - SQL to add non-unique indexes to Wikipedia database tables. This is read and executed by the install script; you should never have to run it by itself.
 * initialdata.sql - SQL to load database with initial values for testing. Most will be overwritten by install script.
 * interwiki.sql - Default interwiki prefixes, based more or less on the public interwiki map from MeatballWiki.
 * tables.sql - SQL to create the initial tables for the Wikipedia database. This is read and executed by the install script; you should never have to run it by itself. -- Only UNIQUE keys are defined here; the rest are added by indexes.sql.
 * users.sql - SQL script to create required database users with proper access rights. This is run from the installation script which replaces the password variables with their values from local settings.
 * wikipedia-interwiki.sql - For convenience, here are the *in-project* interwiki prefixes for Wikipedia.

The scripts in archive are for updating databases from older versions of the software.


 * apache-ampersand.diff


 * checktrans.php - Check to see if all messages have been translated into the selected language. To run this script, you must have a working installation, and it checks the selected language of that installation.
 * cleandb.php - Creates a new empty database; either this or the conversion script from the old format needs to be run, but not both.
 * fetchInterwiki.pl
 * importUseModWiki.php - Import data from a UseModWiki into a MediaWiki wiki.
 * rebuildall.php - Rebuild link tracking tables from scratch. This takes several hours, depending on the database size and server configuration.
 * rebuildlinks.inc
 * rebuildlinks.php - Rebuild link tracking tables from scratch. This takes several hours, depending on the database size and server configuration.
 * rebuildrecentchanges.inc
 * rebuildrecentchanges.php - Rebuild the recent changes table from scratch. This can take a while, depending on the database size and server configuration.
 * rebuildtextindex.inc
 * rebuildtextindex.php - Rebuild the fulltext search index from scratch. This can take a while, depending on the database size and server configuration.

Choosing a design: Selecting a skin
Currently, the following three skins are available:
 * Skin Standard
 * Skin Nostalgia
 * Cologne Blue

(tbd: Screenshots)

Skin Standard
tbd

Skin Nostalgia
tbd

Cologne Blue
Screenshots:



tbd

See also Cologne Blue skin problems.

Localisation

 * Language.php

Whether you want to start translating this file from scratch or update an existing translation, begin by downloading the full version of the latest Wikipedia package from the project's SourceForge page and translating the Language.php file, or update from the Language.php file in CVS. A copy of that file is NOT included on this page because it would always fall behind the most current version. (Many translators have ended up having to re-synchronize their translations with the CVS version.)


 * Locales for the Wikipedia Software.

A locale for a language version of Wikipedia is a file containing settings and translations specific to that version. The current versions are always available from the CVS source repository.

Getting data: Importing a database dump
If you want some data to play with in a local copy, you can get database dumps from several Wikimedia sites; see the overview at Database download and the download site.

Getting the required software
Required software: bzip2.

bzip2 is a freely available, patent free, high-quality data compressor. It typically compresses files to within 10% to 15% of the best available techniques, whilst being around twice as fast at compression and six times faster at decompression.

bzip2 compresses files using the Burrows-Wheeler block-sorting text compression algorithm, and Huffman coding. Compression is generally considerably better than that achieved by more conventional LZ77/LZ78-based compressors, and approaches the performance of the PPM family of statistical compressors.

You can get bzip2 for free at the following locations:
 * Microsoft Windows: a command-line Windows version of bzip2 is available for free under a BSD license (bzip2). A GUI file archiver, 7-Zip, which can also open .bz2 compressed files, is available for free (7-zip).
 * GNU/Linux: most distributions ship with the command-line bzip2 tool. E.g. on Debian GNU/Linux it's apt-gettable via apt-get install bzip2.
 * MacOS X: ships with the command-line bzip2 tool.
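Before downloading a multi-gigabyte dump, you can check that bzip2 works on your system with a quick round trip on a small file (the filenames here are arbitrary):

```shell
# Compress a small sample file, then decompress it to standard out.
echo "MediaWiki test" > /tmp/sample.txt
bzip2 -kf /tmp/sample.txt        # creates /tmp/sample.txt.bz2, keeps the original
bzip2 -dc /tmp/sample.txt.bz2    # prints the original contents
```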

Getting the data
SQL dumps of Wikipedia and several other Wikimedia sites are available at the download site. You can download them for free and use them according to the GNU Free Documentation License.

The dumps are (automatically?) created at least once a week, at least for the English Wikipedia. The total size of the database exceeded 3 gigabytes at the end of November 2003, so you might want to start with smaller packages, e.g. only the current revisions.

On the download site, you will find a list of available files, sorted by Wikimedia project (e.g. en.wikipedia.org for the English Wikipedia). For each Wikimedia site you'll find two files:
 * cur - includes just the current revisions of the articles and is significantly smaller.
 * old - includes all revisions of the articles and is usually quite huge.

For local testing purposes you need only one cur file for one Wikimedia site. Please note that these downloads do not include images or other binary material.

Checking the file integrity
There are md5 checksums attached to every file, e.g.

md5 3cca5e147d5e699dcbade2316f79c340

This information helps you to verify the integrity of your download. If the checksums of the source file and your local download don't match, the archive has probably been corrupted during the transfer.

md5sum computes a 128-bit checksum (or fingerprint or message-digest) for each specified file. The tool to generate the checksums is Free Software and available for all supported platforms:


 * GNU/Linux: md5sum is part of the former GNU text utilities (now GNU core utilities); they can be downloaded via the coreutils page, and every GNU/Linux distribution includes them. E.g. on Debian GNU/Linux, you can get them via apt-get install coreutils.
 * Microsoft Windows: e.g. md5sum.exe (48 kB);
 * MacOS X: ships with the md5 command-line tool, which serves the same purpose.
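Putting this together, here is a tiny helper (our own sketch, not a standard tool) that compares a file's md5 checksum with the value published on the download site:

```shell
# verify_md5 FILE EXPECTED - compare a file's md5 checksum with the
# published value and report whether they match.
verify_md5() {
  actual=$(md5sum "$1" | awk '{print $1}')
  if [ "$actual" = "$2" ]; then
    echo "checksum OK"
  else
    echo "checksum MISMATCH - download the file again"
  fi
}
```

For the example above, you would call: verify_md5 cur_table.sql.bz2 3cca5e147d5e699dcbade2316f79c340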

Importing the database dump
The following command will decompress the database dump 'cur_table.sql.bz2', output it to standard out, pipe this to the MySQL database 'wikidb', using the user 'wikiadmin' and the password 'adminpass':

bzip2 -dc cur_table.sql.bz2 | mysql -u wikiadmin -padminpass wikidb

Similarly, you import 'old_table.sql.bz2' into the MySQL database:

bzip2 -dc old_table.sql.bz2 | mysql -u wikiadmin -padminpass wikidb

This will take some time; depending on your system and the size of the database dump, you might have to wait several minutes; e.g. on an SMP system with two AMD Athlon 1900+ CPUs and 1 GB RAM, it takes about ten minutes to process a 300 MB dump file. Please note that this command doesn't output any status information, neither while it's working nor when it's finished.

After this, you have to issue the following command from the MediaWiki maintenance directory:

php rebuildlinks.php

This will show something like this:

Rebuilding link tables (pass 1).
1000 of 348842 articles scanned.
2000 of 348842 articles scanned.
3000 of 348842 articles scanned.
4000 of 348842 articles scanned.
5000 of 348842 articles scanned.
6000 of 348842 articles scanned.
...

This script will rebuild the link tracking tables from scratch. This again takes quite a while, maybe even several hours, depending on the database size and server configuration.

Platform specific notes:
 * On Debian GNU/Linux, the PHP interpreter is called 'php4'.
 * On Microsoft Windows XP, you can use the program bunzip2 (download the Windows version) as follows: in a DOS box, run "bunzip2 xxx", where xxx is the compressed file.

If you're lucky, the script finishes without any problem.

Troubleshooting
If the script encounters errors, you might have run into one of the following problems:

Errors in the SQL dump.

ERROR 1030 at line 146: Got error 28 from table handler

or

ERROR 3 at line 166: Error writing file '/var/log/mysql.log' (Errcode: 28)

Errors like these might indicate that you've run out of disk space. Check /var/log/mysql.log and your free disk space (df -h).

MySQL reports an error.

If you encounter error messages like the following when running the rebuildlinks script:

syntactical error in the sql statement "INSERT INTO rebuildlinks (rl_f_id,rl_f_title,rl_to) VALUES "

or:

1064: You have an error in your SQL syntax. Check the manual that corresponds to your MySQL server version for the right syntax to use near '' at line 1.

Please note that the rebuildlinks script was heavily rewritten during November and December 2003; you may wish to grab the current dev branch from CVS (which we're running now on Wikipedia) or wait a few days for the new stable release. [Brion Vibber]

MySQL server not running:

If you encounter a message like:

ERROR 2002: Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)

check whether mysqld is running; if not, start it (e.g. /etc/init.d/mysql start) and run the script again.
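A quick way to check whether the server is reachable is mysqladmin ping, which is part of the standard MySQL client tools; the wrapper below is our own sketch:

```shell
# Report whether a local MySQL server answers; suggest a start command if not.
check_mysql() {
  if mysqladmin ping >/dev/null 2>&1; then
    echo "MySQL server is up."
  else
    echo "MySQL server is not reachable; try: /etc/init.d/mysql start"
  fi
}
check_mysql
```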

Script finishes, but the counter on the Main_Page reports "0 articles"

When importing only the cur_ data, rebuildlinks.php runs fine, but when accessing the Main_Page ("Hauptseite"), it says that there are "0 articles", even though the wiki pages exist (in this case, several thousand pages).

To rebuild the article count manually:

Start the MySQL monitor and enter your MySQL root password when prompted:

mysql -u root -p

Select the MediaWiki database:

mysql> use wiki

Then enter the following SQL statement:

mysql> SELECT @foo:=COUNT(*) FROM cur WHERE cur_namespace=0 AND cur_is_redirect=0 AND cur_text like '%[[%';

This will result in something like this:

+----------------+
| @foo:=COUNT(*) |
+----------------+
|          38189 |
+----------------+
1 row in set (2.96 sec)

Then enter yet another SQL statement:

mysql> UPDATE site_stats SET ss_good_articles=@foo;

This will result in something like this:

Query OK, 1 row affected (0.00 sec)
Rows matched: 1  Changed: 1  Warnings: 0

That's it; now the counter on the Main_Page should be correct. [Brion Vibber]

Other problems

tbd

Images and uploaded files
A collected archive of uploaded images is not yet available. (tbd: reason, workaround)

Dumps in TomeRaider format
Database dumps are also available in TomeRaider format (in German).

This version contains all articles, but not meta information such as user pages or upload info, nor media such as pictures or music. Mathematical formulae entered in TeX notation appear as source code instead of rendered formulae.

Downloads are available from http://download.wikipedia.org/tomeraider/current/.

Example: Download of the German Wikipedia in TomeRaider format:


 * Palm version
 * PocketPC version
 * EPOC version

More information is available from Erik Zachte's website.

Dumps in Mobipocket format
Database dumps for some Wikipedias are also available in Mobipocket format (in German). This format can be used on PDAs running operating systems such as Palm OS, Pocket PC, Windows CE, Franklin eBookMan, and EPOC.


 * Download

The Mobipocket reader for Palm OS, Pocket PC, Symbian, Smartphones, Franklin eBookman, and Windows can be downloaded free of charge from www.mobipocket.de.

Backing up data: Creating a database dump
tbd

Extending the database: Converting (importing) existing content
For existing sites it is always a tough task to migrate to a wiki structure; the process of wikifying existing content from text files, HTML websites, or even office documents can be automated, but you'll have to write appropriate scripts on your own. As far as we know, there are no general-purpose, ready-to-run scripts available for this purpose. Unlike commercial content management systems such as Hyperwave or HTML editors such as Microsoft FrontPage, MediaWiki and other open source wiki software does not include import filters. There are some exceptions, which will be discussed in the following sections.

Converting content from a UseMod Wiki
Prior to the Phase II and Phase III (MediaWiki) Wikipedia software, Wikipedia used the UseModWiki software written by Clifford Adams. UseModWiki is a Perl script which uses a database of text files to generate a wiki site. Its primary access method is CGI via the Web, but it can also be called directly by other Perl programs. To convert an existing UseModWiki site, there is a script available in the /maintenance subdirectory of your MediaWiki source folder.

The storage format of UseMod Wiki is well documented: DataBase.

tbd

Converting content from a database dump
tbd

Converting content from a CSV text file
tbd

Converting content from other sources
If you are able and willing to do some scripting yourself, it is possible to import almost any existing textual content with a documented file format into MediaWiki.
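As a hypothetical sketch of such a conversion (the column names below are assumptions based on the 'cur' table of this software generation; check tables.sql in your maintenance directory before using anything like this), a shell loop can turn a directory of plain-text files into SQL INSERT statements:

```shell
# Create a sample input file so this sketch is self-contained.
mkdir -p /tmp/gr_pages
printf 'Hello from a [[converted]] page.\n' > /tmp/gr_pages/Example_page.txt

# Emit one INSERT per .txt file; the file name becomes the article title.
# Single quotes are doubled to escape them for SQL.
for f in /tmp/gr_pages/*.txt; do
  title=$(basename "$f" .txt | sed "s/'/''/g")
  text=$(sed "s/'/''/g" "$f")
  echo "INSERT INTO cur (cur_namespace, cur_title, cur_text) VALUES (0, '$title', '$text');"
done > /tmp/gr_import.sql
```

The result could then be fed to MySQL (e.g. mysql -u wikiadmin -p wikidb < /tmp/gr_import.sql), followed by the rebuild scripts from the maintenance directory.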

As an example, the following script, which imports the public domain data from the CIA World Factbook 2002 into MediaWiki, might be of some interest. It was written by Evan Prodromou (e-mail: evan(at)wikitravel(dot)org) for his own use on Wikitravel - the free, complete, up-to-date and reliable world-wide travel guide. It is licensed under the GNU General Public License and is available for download.

Please note that it's a one-time script; most paths and other details are hard-coded, and much of the code is for parsing the CIA World Factbook print pages, but it might serve as a good example of what can be done.

The Maintenance scripts
The maintenance scripts are located in the 'maintenance' subdirectory. They include tools for the following purposes:

tbd

The .sql scripts in this directory are not meant to be run standalone, although they can be in some cases if you know what you're doing. Most of the time you'll want to run the .php scripts from the command line. You must run them from this directory, and the LocalSettings.php file in the directory above must point to the installation.

The scripts in the 'archive' subdirectory are for updating databases from older versions of the software.

Updating the software
To update from one version of MediaWiki to another, do these steps...

tbd

Checking your success: Analysing Server Logfiles
tbd

Best Practices
(experiences needed!) tbd