Manual:System administration

This is part of the MediaWiki Documentation.


Introduction
This guidebook will help you with common administrative tasks on a public or local MediaWiki installation; these tasks are recommended, but not required, for running a MediaWiki site. It assumes that you have a fully working installation. This document is not relevant for you if you just want to learn how to add or edit articles.

This guide consists of the following sections ... (tbd)

File List
See full file list for a listing of the files that are in your webserver's public directory as part of the MediaWiki software, with descriptions for many of them.

Enabling and Disabling Uploads
In newer versions of the MediaWiki software (starting with version 1.1.0, released 2003-12-08), uploads are disabled by default for security reasons.

To enable uploads, you have to edit LocalSettings.php:

$wgDisableUploads = false;

Before you do this, please make sure that your upload directory is configured in a safe manner so it's not possible to upload and execute arbitrary PHP code.

Earlier versions of MediaWiki contained a bug that potentially allowed logged-in users to delete arbitrary files in directories writable by the web server user by manually submitting false form data; this has since been fixed.

As a reminder, disable PHP script execution in the upload directory!
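One way to do this, for Apache with mod_php, is a directive block like the following in httpd.conf (a minimal sketch; the path /var/www/wiki/images is an assumption, so adjust it to your actual upload directory):

<Directory "/var/www/wiki/images">
    # Do not execute PHP below the upload directory
    php_admin_flag engine off
    # Serve any remaining script-like files as plain text
    AddType text/plain .php .phtml .php3
</Directory>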

To disable uploads, you also have to edit LocalSettings.php:

$wgDisableUploads = true;

See also: Security Guide.

Getting administrative rights
MediaWiki does not yet have an interface for setting the user_rights field of user accounts. Assigning an account 'sysop' status has to be done manually by issuing an SQL query against the database. Usually you'll want to do something like this:

Go to the MySQL monitor:

mysql -u myaccount -p

Select the MediaWiki database:

mysql> use wikidb

Issue a SQL statement which changes the user_rights of a particular (existing) MediaWiki user account:

mysql> UPDATE user SET user_rights='sysop' WHERE user_name='The Username';
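To verify that the change took effect, you can read the field back (a quick check using the same table and columns as above):

mysql> SELECT user_name, user_rights FROM user WHERE user_name='The Username';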

The user_rights field is actually a comma-separated list; presently three values are recognized by the software:

Sysop (Administrator)
The most common use. A user marked as 'sysop' can delete and undelete pages, block and unblock IPs, issue read-only SQL queries against the database, and use a shortcut feature in a user's contributions list to revert a page to the previous contributor's revision. (See en:Wikipedia:Administrators for information on sysops on Wikipedia specifically.)

Developer
This is largely obsolete and will be removed from future versions of the software.

Bot
A registered bot account. Edits by an account with this flag set will not appear by default in Recentchanges; this is intended for mass imports of data without flooding human edits from view. (Add &hidebots=0 to the Recentchanges URL to list changes made by bots; see the example below.)
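For example, the full URL would look something like this (the hostname is illustrative; wiki.phtml is the 1.1-era entry point):

http://www.example.org/wiki.phtml?title=Special:Recentchanges&hidebots=0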

See also: Setting user rights in MediaWiki.

Choosing a design: Selecting a skin
Currently, the following skins are available:
 * Monobook
 * Skin Standard
 * Skin Nostalgie
 * Cologne Blue

(tbd: Screenshots)

Setting the default skin
In MediaWiki 1.1.0, to set a default skin, adjust the following array in Language.php:

(0 = Standard, 1 = Nostalgia, 2 = CologneBlue)


/* Use the "skin" element for the default skin. */
$wgDefaultUserOptionsEn = array(
	"quickbar" => 1, "underline" => 1, "hover" => 1,
	"cols" => 80, "rows" => 25, "searchlimit" => 20,
	"contextlines" => 5, "contextchars" => 50,
	"skin" => 2, "math" => 1, "rcdays" => 7, "rclimit" => 50,
	"highlightbroken" => 1, "stubthreshold" => 0,
	"previewontop" => 1, "editsection" => 1, "editsectiononrightclick" => 0, "showtoc" => 1,
	"date" => 0
);

Skin Standard
tbd

Skin Nostalgie
tbd

Cologne Blue
Screenshots: (tbd)

See also Cologne Blue skin problems.

Localisation

 * Language.php

Whether you want to start translating this file from scratch or to update an existing translation, start by downloading the full version of the latest Wikipedia package from the project's SourceForge page and translate the Language.php file, or update from the Language.php file in CVS. A copy of that file is NOT included on this page because it would always fall behind the most current version. (Many translators have ended up having to re-synchronize their translations with the CVS version.)


 * Locales for the Wikipedia Software.

A locale for a language version of Wikipedia is a file containing settings and translations specific to that version. The current versions are always available from the CVS source repository.

Getting data: Importing a database dump
If you want some data to play with in a local copy, you can get database dumps from several Wikimedia sites; see the overview at Database download and the download site.

Getting the required software
Required software: bzip2. See Compression for some interesting facts about the bz2 format.

You can get bzip2 for free at the following locations:
 * Microsoft Windows: a command-line Windows version of bzip2 is available for free under a BSD license. A GUI file archiver, 7-zip, which can also open bz2-compressed files, is available for free as well.
 * GNU/Linux: most distributions ship with the command-line bzip2 tool; e.g. on Debian GNU/Linux it can be installed with apt-get install bzip2.
 * MacOS X: ships with the command-line bzip2 tool.

Getting the data
SQL dumps of Wikipedia and several other Wikimedia sites are available at the download site. You can download them for free and use them according to the GNU Free Documentation License.

The dumps are (automatically?) created at least once a week, at least for the English Wikipedia. The total size of the database exceeded 3 gigabytes at the end of November 2003, so you might want to start with the smaller packages, e.g. only the current revisions.

On the download site, you will find a list of available files, sorted by Wikimedia project (e.g. en.wikipedia.org for the English Wikipedia). For each Wikimedia site you'll find two files:
 * cur - includes just the current revisions of the articles and is significantly smaller.
 * old - includes all revisions of the articles and is usually quite huge.

For local testing purposes you need only one cur file for one Wikimedia site. Please note that these downloads do not include images or other binary material.

Checking the file integrity
There are md5 checksums attached to every file, e.g.

md5 3cca5e147d5e699dcbade2316f79c340

This information helps you to verify the integrity of your download. If the checksums of the source file and your local download don't match, the archive has probably been corrupted during the transfer.

md5sum computes a 128-bit checksum (or fingerprint or message-digest) for each specified file. The tool to generate the checksums is Free Software and available for all supported platforms:


 * GNU/Linux: md5sum is part of the former GNU text utilities (now GNU core utilities); they can be downloaded via the coreutils page, and every GNU/Linux distribution includes them. E.g. on Debian GNU/Linux, you can get them via apt-get install coreutils.
 * Microsoft Windows: e.g. md5sum.exe (48 kB);
 * MacOS X: The command-line tool /sbin/md5 is part of MacOS X.

Another widespread tool, openssl, can also be used to generate MD5 checksums:

openssl md5 the_filename
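For example, on GNU/Linux you can recompute the checksum of a downloaded dump and compare the output against the published value (the file name is illustrative; the checksum shown is the example from above):

md5sum cur_table.sql.bz2
3cca5e147d5e699dcbade2316f79c340  cur_table.sql.bz2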

Importing the database dump
The following command will decompress the database dump 'cur_table.sql.bz2', output it to standard out, pipe this to the MySQL database 'wikidb', using the user 'wikiadmin' and the password 'adminpass':

bzip2 -dc cur_table.sql.bz2 | mysql -u wikiadmin -padminpass wikidb

Similarly, to import 'old_table.sql.bz2' into the MySQL database:

bzip2 -dc old_table.sql.bz2 | mysql -u wikiadmin -padminpass wikidb

This will take some time; depending on your system and the size of the database dump, you might have to wait several minutes. E.g. on an SMP system with two AMD Athlon 1900+ CPUs and 1 GB RAM, it takes about ten minutes to process a 300 MB dump file. Please note that this command doesn't output any status information, neither while it's working nor when it's finished.

After this, you have to run the following command from the MediaWiki maintenance directory:

php rebuildlinks.php

This will show something like this:

Rebuilding link tables (pass 1).
1000 of 348842 articles scanned.
2000 of 348842 articles scanned.
3000 of 348842 articles scanned.
4000 of 348842 articles scanned.
5000 of 348842 articles scanned.
6000 of 348842 articles scanned.
...

This script will rebuild the link tracking tables from scratch. This again takes quite a while, maybe even several hours, depending on the database size and server configuration.

Platform specific notes:
 * On Debian GNU/Linux, the PHP interpreter is called 'php4'.
 * On Microsoft Windows XP, you can use the program bunzip2 (download the Windows version) as follows: in a DOS box, run "bunzip2 xxx" where xxx is the compressed file.

If you're lucky, the script finishes without any problem.

Troubleshooting
If the script encounters errors, you might have run into one of the following problems:

Errors in the SQL dump.

ERROR 1030 at line 146: Got error 28 from table handler

or

ERROR 3 at line 166: Error writing file '/var/log/mysql.log' (Errcode: 28)

Errors like this might indicate that you've run out of disk space. Check /var/log/mysql.log and your free disk space (df -h).

MySQL reports an error.

If you encounter error messages like the following when running the rebuildlinks script:

syntactical error in the sql statement "INSERT INTO rebuildlinks (rl_f_id,rl_f_title,rl_to) VALUES "

or:

1064: You have an error in your SQL syntax. Check the manual that corresponds to your MySQL server version for the right syntax to use near '' at line 1.

Please note that the rebuildlinks script was heavily rewritten during November and December 2003; you may wish to grab the current development branch from CVS (which we're running now on Wikipedia) or wait a few days for the new stable release. [Brion Vibber]

MySQL server not running:

If you encounter a message like:

ERROR 2002: Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)

check whether mysqld is running; if not, start it (e.g. /etc/init.d/mysql start) and run the script again.

Script finishes, but the counter on the Main_Page reports "0 articles"

When importing only the cur table, rebuildlinks.php runs fine, but when accessing the Main_Page ("Hauptseite" in this German example) it says that there are "0 articles", even though the wiki pages, in this case several thousand of them, exist.

To rebuild the article count manually:

Start the MySQL monitor and type in your MySQL password:

mysql -u root -p

Select the MediaWiki database:

mysql> use wikidb

Then enter the following SQL statement:

mysql> SELECT @foo:=COUNT(*) FROM cur WHERE cur_namespace=0 AND cur_is_redirect=0 AND cur_text like '%[[%';

This will result in something like this:

+----------------+
| @foo:=COUNT(*) |
+----------------+
|          38189 |
+----------------+
1 row in set (2.96 sec)

Then enter another SQL statement:

mysql> UPDATE site_stats SET ss_good_articles=@foo;

This will result in something like this:

Query OK, 1 row affected (0.00 sec)
Rows matched: 1  Changed: 1  Warnings: 0

That's it; the counter on the Main_Page should now be correct. [Brion Vibber]

Other problems

tbd

Images and uploaded files
A collected archive of uploaded images is not yet available. (tbd: reason, workaround)

Dumps in TomeRaider format
Database dumps are also available in TomeRaider format (in German).

This version contains all articles, but not meta information such as user pages or upload info, nor media like pictures or music. Mathematical formulae entered in TeX notation appear as source code instead of rendered formulae.

Downloads are available from http://download.wikipedia.org/tomeraider/current/.

Example: Download of the German Wikipedia in TomeRaider format:


 * Palm version
 * PocketPC version
 * EPOC version

More information is available from Erik Zachte's website.

Dumps in Mobipocket format
Database dumps for some Wikipedias are also available in Mobipocket format (in German). This format can be used on PDAs running operating systems like Palm OS, Pocket PC, Windows CE, Franklin eBookMan, and EPOC.


 * Download

The Mobipocket reader for Palm OS, Pocket PC, Symbian, Smartphones, Franklin eBookman, and Windows can be downloaded free of charge from www.mobipocket.de.

Backing up data: Creating a database dump
You can create a backup of the live database using the mysqldump program. A simple way to run it is:

mysqldump --user=wikiuser --password=wikipass wikidb > wiki-mysql-dump.txt

where the user, password, and database name given to mysqldump should match the $wgDBuser, $wgDBpassword, and $wgDBname settings in your LocalSettings.php file.

Depending on the size of your wiki, this could take a few seconds to a few minutes. During that time, the database is locked, so parts of the wiki web interface will appear to hang, or may spew out weird error messages.

Instead of doing this, I have chosen to write a small script that makes a "hot copy" of the databases:

#!/bin/sh
OLDAGE=10d
CURDATE=`date "+%Y.%m.%d"`
DESTDIR=/path/to/backup/destination
WIKIWEB=/path/to/html/wiki

echo "================================"
echo "Backing up the database files..."
cd $DESTDIR
/usr/local/bin/mysqlhotcopy -u wikiuser -p wikipass wikidb $DESTDIR
mv wikidb wikidb.$CURDATE

echo "================================"
echo "Checking for database corruption..."
cd wikidb.$CURDATE
/usr/local/bin/myisamchk -e -s *.MYI
cd ..

echo "================================"
echo "Creating tarball..."
tar -czpf wikidb.$CURDATE.tgz wikidb.$CURDATE
rm -rf wikidb.$CURDATE

echo "================================"
echo "Copying LocalSettings.php (if needed)..."
cp -nv $WIKIWEB/LocalSettings.php .

echo "================================"
echo "Incrementally copying the images..."
/usr/local/bin/rsync -av $WIKIWEB/images .
# cp -Rpnv $WIKIWEB/images .

echo "================================"
echo "Removing old backups"
find $DESTDIR -name "wikidb.*.tgz" -ctime +$OLDAGE | xargs rm

echo "================================"
echo -n "backup script completed "
date

(This might have some minor errors, I needed to make it more generic to publish it.) --NickT 01:04, 1 Sep 2004 (UTC)

Extending the database: Converting (importing) existing content
For existing sites it is always a tough task to migrate to a wiki structure; the process of wikifying existing content from text files, HTML websites, or even office documents can be automated, but you'll have to write appropriate scripts on your own. As far as we know, there are no ready-to-run scripts for general users available for this purpose. Unlike commercial content management systems like Hyperwave or HTML editors like Microsoft FrontPage, MediaWiki and other open-source wiki software do not include import filters. There are some exceptions, which will be discussed in the following sections.

Converting content from a UseMod Wiki
Before MediaWiki (Phase III of the Wikipedia software) and its Phase II predecessor, Wikipedia used the UseMod Wiki software written by Clifford Adams. UseModWiki is a Perl script which uses a database of text files to generate a wiki site. Its primary access method is CGI via the web, but it can also be called directly by other Perl programs. To convert an existing UseMod Wiki site, there is a script available in the /maintenance subdirectory of your MediaWiki source folder.

The storage format of UseMod Wiki is well documented: DataBase.

tbd

Converting content from a database dump
tbd

Converting content from a CSV text file
tbd

Converting content from other sources
If you are able and willing to do some scripting yourself, it is possible to import almost any existing textual content with a documented file format into MediaWiki.

The key to this is to insert rows into the cur table. Specifically, you need to fill the fields cur.cur_title and cur.cur_text, which contain the page titles and the wiki text, respectively. This is enough to start with; however, the link tables must still be updated so that the "What links here" and similar pages will work.
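A minimal sketch of such an insert (the column names are from the 1.x schema; the title, text, and timestamp values are purely illustrative, and depending on your schema version further NOT NULL columns may need to be filled):

mysql> INSERT INTO cur (cur_namespace, cur_title, cur_text, cur_timestamp)
    -> VALUES (0, 'Example_Page', 'Imported text with a [[wiki link]].', '20040101000000');

Afterwards, run rebuildlinks.php from the maintenance directory so that the link tables match the imported content.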

As an example, the following script might be of some interest: it imports the public domain data from the CIA World Factbook 2002 into MediaWiki. It was written by Evan Prodromou (e-mail: evan(at)wikitravel(dot)org) for his own use on Wikitravel, the free, complete, up-to-date and reliable worldwide travel guide. It is licensed under the GNU General Public License and is available for download.

Please note that it's a one-time script; most paths and the like are hard-coded, and much of the code is for parsing the CIA World Factbook print pages, but it might serve as a good example of what can be done.


 * Does anybody know where this script is available? Matthewsim 23:27, 27 Aug 2004 (UTC)

Cleaning up the Database
Suppose you discover that you have run out of disk space and have to delete your local copy of the Wikipedia:

# df
Filesystem           1K-blocks      Used Available Use% Mounted on
/dev/hda1              2162884   2055124         0 100% /
# mysql -u wikiuser -p
mysql> use wikidb
mysql> drop database wikidb;
mysql> exit
# df
Filesystem           1K-blocks      Used Available Use% Mounted on
/dev/hda1              2162884   1726896    326120  85% /

Manually: Copy each page whose history you want to discard to your clipboard, delete the page, and recreate it from the clipboard. This way everyone can see what you did, which might be perceived as nice or not nice. It is also enormously time-consuming and prone to error.

Automatically: Study the database structure carefully and craft the right SQL statements to make the modifications you want (see the sketch below). Please do tell people that you have modified something, so they know that the right policy is not to fully trust anything the history says. If the installation is public, archived copies may be used to detect such changes, which will put you in a bad light with anyone who can access that level of detail; this in turn widens the so-called "digital gap" between those who know you are a crook, those who guess you are a crook, and those who have no clue.
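For example, a sketch of dropping all stored history revisions of a single page in one statement (table and column names are from the 1.x schema; the page title is illustrative; make a backup first):

mysql> DELETE FROM old WHERE old_namespace=0 AND old_title='Some_Page';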

or

You could use mysqldump -uusername -ppassword databasename > file.sql and edit that with Emacs or whatever tool you have available.

Of course, if you are out of disk space, you can't dump to a local file, but you can pipe the output to a remote file via ssh:

mysqldump -uusername -ppassword databasename | ssh user@host "cat > remote_file"

Then drop and re-create the database in MySQL, and run

mysql -uusername -ppassword databasename < file.sql

to import the modified data.

Changing the second default title
How can I change the second default title shown on every new page ("aus Wikipedia, der freien Enzyklopädie" in the German version, or "from Wikipedia, the free Encyclopaedia" in the English one)?

If you are using $wgUseDatabaseMessages, edit MediaWiki:fromwikipedia. If not, edit LanguageDe.php, which can be found in the languages subdirectory: find the line that defines the string for 'fromwikipedia' and change it.
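For illustration, the relevant entry in the language file looks roughly like this (a sketch; the exact array it lives in differs between versions):

'fromwikipedia' => 'aus Wikipedia, der freien Enzyklopädie',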

The Maintenance scripts
The Maintenance scripts are located in the 'maintenance' subdirectory. They include tools for the following purposes:

tbd

The .sql scripts in this directory are not meant to be run standalone, although in some cases they can be if you know what you're doing. Most of the time you'll want to run the .php scripts from the command line. You must run them from this directory, and the LocalSettings.php file in the directory above must point to your installation.
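For example, to run one of the .php scripts (the installation path is illustrative):

cd /var/www/wiki/maintenance
php rebuildlinks.php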

The scripts in the 'archive' subdirectory are for updating databases from older versions of the software.

Updating the software
See instead: Upgrading MediaWiki. Note: this section is out of date.

To update from one version of MediaWiki to a newer version, you can use the update.php script, which is located in the MediaWiki source directory (not in the maintenance subdirectory!).


 * Get the most recent version from the SourceForge project page.


 * If you're new to MediaWiki, it's suggested that you keep the different versions of the MediaWiki software in separate source directories, e.g.

mediawiki-1.1.0       ; most recent release
mediawiki-20031117    ; older release

However, if you like to live dangerously, you might simply overwrite the old files.

Untar the file:
tar xvfz mediawiki-1.1.0.tar.gz

This will give you a directory structure like this:

mediawiki-1.1.0/
mediawiki-1.1.0/.cvsignore
mediawiki-1.1.0/AdminSettings.sample
mediawiki-1.1.0/docs/
mediawiki-1.1.0/docs/deferred.doc
mediawiki-1.1.0/docs/design.doc
mediawiki-1.1.0/docs/globals.doc
... on and on for a few screens ...
mediawiki-1.1.0/templates/montparnasse.tpl
mediawiki-1.1.0/templates/paddington.tpl
mediawiki-1.1.0/texvc.phtml
mediawiki-1.1.0/update.php
mediawiki-1.1.0/Version.php
mediawiki-1.1.0/wiki.phtml


 * Copy your edited versions of LocalSettings.php and AdminSettings.php to the newly created directory.
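For example (the paths are illustrative):

cp /var/www/wiki/LocalSettings.php /var/www/wiki/AdminSettings.php mediawiki-1.1.0/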


 * Become root (or another user who is allowed to write to the MediaWiki destination directory, e.g. /var/www/wiki).

su root

Run the update script from the MediaWiki source directory:

php update.php

Note: on Debian GNU/Linux, run php4 update.php instead.

This will result in something like this:

Copying files... ok
Checking database for necessary updates...
...ipblocks table is up to date.
...already have interwiki table
...indexes seem up to 20031107 standards
Adding linkscc table... ok
Initialising "MediaWiki" namespace...done
Done.

Note: update.php is no longer included in MediaWiki.

You might need to rebuild the links:
php rebuildlinks.php

This will result in something like:

This script may take several hours to complete. If you abort during that time, your wiki will be in an inconsistent state. If you are going to abort, this is the time to do it.

Press control-c to abort (will proceed automatically in 15 seconds)
Rebuilding link tables.
Setting AUTOCOMMIT=1
Locking tables
Deleting old data in links table.
Deleting old data in brokenlinks table.
Deleting old data in imagelinks table.
Finding number of articles to process... 349383
Finding highest article id
Starting processing
purgefreq = 1000
.................

Troubleshooting:
If you run into an error like:

Fatal error: Allowed memory size of 8388608 bytes exhausted at (null):0 (tried to allocate 43 bytes) in /var/www/wiki/Title.php on line 45

increase the memory allowed to your PHP pages in php.ini.
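For example, in php.ini (32M is an arbitrary but usually sufficient value; the 8388608 bytes in the message above correspond to the 8M default):

memory_limit = 32M

Restart your web server afterwards so that the change takes effect.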

Checking your success: Analysing Server Logfiles
tbd

Best Practices
(experiences needed!) tbd