Apache configuration

MediaWiki architecture

The Apache web server is the recommended web server for use with MediaWiki. Other servers may work, but they are not as well tested or documented.

PHP as Apache Module
MediaWiki is written to use PHP as an Apache module. If you can browse the wiki out of the box with URLs of the form example.com/wiki/index.php/Main_Page, then chances are your PHP is already configured as a module.

Be sure to enable mod_php in the directory that contains the MediaWiki scripts, but remember to disable it in the upload directory, so that visitors cannot upload and execute arbitrary code on your system!
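As a sketch, the upload directory can be protected with a block like the following in your Apache configuration. The path /var/www/wiki/images is an assumption; substitute your actual upload directory:

```apache
# Hypothetical path: adjust to your actual upload directory.
<Directory /var/www/wiki/images>
    # Turn off the PHP engine so uploaded .php files are never executed
    php_admin_flag engine off
    # Serve script-like uploads as plain text instead
    AddType text/plain .php .phps
</Directory>
```

The php_admin_flag directive only takes effect when PHP runs as an Apache module, which is the case discussed in this section.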

RedHat/Fedora-based Linux

Install PHP:

 * yum install php

Reload httpd:

 * service httpd reload

Debian-based Linux

Install Apache 2 and the PHP 5 module:

 * apt-get install apache2 libapache2-mod-php5

Enable the PHP 5 module:

 * a2enmod php5

PHP as CGI
If PHP is running as a CGI, then there are a couple of things you can do to get it working.

First, change $wgArticlePath in LocalSettings.php to the "ugly" form of URL rewriting:

 $wgArticlePath = "$wgScript?title=$1";

You can still utilize MediaWiki's short URLs by inserting this code into an .htaccess file and putting it in the wiki root (i.e. /wiki/.htaccess):

 Options FollowSymLinks ExecCGI
 RewriteEngine On
 RewriteCond %{REQUEST_FILENAME} !-f
 RewriteCond %{REQUEST_FILENAME} !-d
 RewriteRule ^(.+)$ /path/to/your/wiki/index.php?title=$1 [L,QSA]

Lastly, if you want to stop the ugly URLs from being printed within MediaWiki, change $wgArticlePath to the following form to get the short URLs there as well:

 $wgArticlePath = "$wgScriptPath/$1";
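Putting the pieces together, the relevant LocalSettings.php fragment for a PHP-as-CGI setup might look like the sketch below. The /wiki path is an assumption; use whatever path your wiki is installed under:

```php
<?php
# Sketch of the relevant LocalSettings.php lines, assuming the
# wiki lives under /wiki (adjust the path to your own setup).
$wgScriptPath  = "/wiki";
$wgScript      = "$wgScriptPath/index.php";

# "Ugly" form: always works under CGI, no rewrite rules needed.
# $wgArticlePath = "$wgScript?title=$1";

# Short form: requires the .htaccess rewrite rules shown earlier.
$wgArticlePath = "$wgScriptPath/$1";
```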

This seems to work for me, but I do not know if this will work with the /stats or not. HolisticEarth 05:28, 6 July 2006 (UTC)

mod_rewrite
URL rewriting is recommended to make some of your URLs look much nicer. This is very installation-specific, but see the Rewrite_rules page for some pointers.
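For illustration, a common short-URL layout puts the scripts under /w and serves articles under /wiki; under that assumption, the main Apache configuration might contain something like:

```apache
# Hypothetical layout: scripts under /w, article URLs under /wiki.
RewriteEngine On
# Pass /wiki/Some_Title through to the real entry point
RewriteRule ^/wiki/(.*)$ /w/index.php?title=$1 [L,QSA]
# A bare /wiki goes to the main page
RewriteRule ^/wiki$ /w/index.php [L]
```

This corresponds to $wgScriptPath = "/w" and $wgArticlePath = "/wiki/$1" in LocalSettings.php.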

mod_alias
As an alternative to mod_rewrite, you may use mod_alias with similar effects. In this case it is assumed that MediaWiki is not installed under Apache's document root, but in some other directory, e.g. /usr/local/lib/mediawiki. Then you can add lines similar to the following example to your Apache configuration:

 Alias /mediawiki/ /usr/local/lib/mediawiki/
 Alias /wiki/ /usr/local/lib/mediawiki/index.php/
 Alias /wiki /usr/local/lib/mediawiki/index.php/

 <Directory /usr/local/lib/mediawiki>
     Options MultiViews
     # allow sub-directories to restrict usage via .htaccess
     AllowOverride Limit
     Order allow,deny
     Allow from all
 </Directory>

 <Directory /usr/local/lib/mediawiki/images>
     Options MultiViews
     AllowOverride None
     Order allow,deny
     Allow from all
     # avoid execution of PHP scripts in upload directory
     AddHandler None .php .phps
     ForceType text/plain
 </Directory>

As in the mod_rewrite example, $wgScriptPath is now "/mediawiki", and $wgArticlePath is "/wiki/$1".

Once you have made the changes to map index.php to the shorter URL, you may need to clear your web browser's cache and the objectcache table to ensure that all links correctly point to the shorter URL. In some cases, the old links to index.php in combination with the Alias configuration above will cause MediaWiki to try to edit index.php itself, rather than the page or section you are really trying to edit.
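Clearing the objectcache table amounts to a single SQL statement against the wiki's database. The table name below assumes no table prefix is configured:

```sql
-- Run against your wiki's database.
-- If your tables use a prefix ($wgDBprefix), include it,
-- e.g. mw_objectcache instead of objectcache.
DELETE FROM objectcache;
```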

Note that this section on mod_alias was not written by an Apache/PHP expert, so you should check whether it meets your security requirements.

Patches
If you're using URL rewriting and want to be able to use the ampersand (&) in page titles, you'll need to patch Apache to properly escape the character when generating the query string. A patch for Apache 1.3.26 is available as maintenance/apache-ampersand.diff in the MediaWiki source. (No patch is yet available for Apache 2.0.x.)

Change your mod_rewrite config like so:

 RewriteEngine On
 RewriteMap ampescape int:ampescape
 RewriteRule ^/wiki/(.*)$ /w/wiki.phtml?title=${ampescape:$1} [L]
 RewriteRule ^/wiki$ /w/wiki.phtml

This way, /wiki/AT&T correctly becomes /w/wiki.phtml?title=AT%26T instead of /w/wiki.phtml?title=AT&T, which breaks up into "title=AT" and a useless "T".

Robots exclusion file
You probably don't want spiders trying to download every dynamically generated page... see robots.txt
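As a sketch, a robots.txt in the document root that keeps spiders out of the script path might look like the following. It assumes the short-URL layout used earlier (scripts under /w, readable article URLs under /wiki), so crawlers can still index articles:

```
User-agent: *
Disallow: /w/
```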

Throttling
See request throttling.
