Apache configuration

[https://httpd.apache.org/ Apache] is the webserver most commonly used with MediaWiki.

PHP as Apache Module
MediaWiki is written to use [[Special:MyLanguage/PHP config|PHP]] as an Apache module. PHP is probably configured as a module if your wiki's URLs look like this: example.com/index.php/Main_Page

You can check which configuration and version of PHP you have by viewing your wiki's Special:Version page, or with phpinfo.
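If you cannot reach the Special:Version page, a one-line PHP script shows the same information. This is a minimal sketch (the file name info.php is arbitrary); delete the file once you are done, since it exposes server details:

<?php
// Prints the PHP version, the Server API ("Apache 2.0 Handler" for a module,
// "CGI/FastCGI" when PHP runs as CGI) and the loaded configuration.
phpinfo();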

RedHat/Fedora-based Linux
Install PHP:
 * yum install php php-xml

Reload httpd:
 * service httpd reload
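To confirm that the PHP module is actually loaded, you can list Apache's loaded modules; mod_php typically registers as php_module or php7_module depending on the version (the grep pattern here is an assumption, adjust as needed):

httpd -M | grep -i php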

Debian-based Linux
Install the Apache 2 PHP 5 module:
 * apt-get install apache2 libapache2-mod-php5 php5-cli php-apc php5-mcrypt

On Ubuntu 16.04.2 LTS, install instead:

 * sudo apt-get install libapache2-mod-php

Enable the PHP 5 Apache module:
 * a2enmod php5

Restart Apache:
 * service apache2 restart
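After the restart, you can verify that the module is active in the same way (apache2ctl is Debian's wrapper around the httpd binary):

apache2ctl -M | grep -i php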

PHP as CGI
If PHP is running as a CGI, you will have "ugly" URLs by default, but you can still implement [[Special:MyLanguage/Manual:Short URL|short URLs]].

CGIWrap
If you have your own server running Apache and are running [[#PHP as CGI|PHP as CGI]], you can install [https://github.com/cgiwrap/cgiwrap CGIWrap]. This tool enables CGI scripts to run as a different user from the one the Apache server runs as.

That way, you can create a dedicated user for your MediaWiki pages. Installing CGIWrap is beyond the scope of this document, especially since you must compile it for your own server. However, as a quick guideline, you can follow these steps:

 * Create a wiki user:

useradd -M -s /sbin/nologin wikiuser

 * Have a cgi-bin folder containing CGIWrap (for example, /home/myuser/cgi-bin). Once everything is configured, keep only the cgiwrap binary there; move the debug versions to another folder in case you ever need them. Your cgiwrap file should be accessible only to Apache (chown and chmod it accordingly).

 * Inside the cgi-bin folder, create a symbolic link to the MediaWiki root:

ln -s /home/myuser/public_html/wiki /home/myuser/cgi-bin/wikilink

 * In your wiki's .htaccess file, add the following definitions:

AddHandler php-wrapper .php
Action php-wrapper /cgi-bin/cgiwrap/wikiuser/wikilink

 * Finally, chown and chmod all the .php files of your MediaWiki folder so that they are accessible solely by wikiuser, as sketched below.
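As a rough sketch of that last step, assuming the paths and the wikiuser name from the examples above:

# Make wikiuser own the wiki tree, then restrict the PHP files to that user only.
# CGIWrap runs the scripts as wikiuser, so mode 600 is sufficient for them.
chown -R wikiuser:wikiuser /home/myuser/public_html/wiki
find /home/myuser/public_html/wiki -name '*.php' -exec chmod 600 {} +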

The files will be accessible as usual. You do not need to include any cgi-bin prefix in your paths; this is taken care of transparently for you.

I strongly suggest you start out with /cgi-bin/cgiwrapd/... as your php-wrapper, as its debugging output will show precisely what is and is not working. I also strongly suggest you do not delete your CGIWrap source folder until everything works perfectly, as this is a real trial-and-error process that can take a long time. However, it is worth your time: your MediaWiki will run in its own separate process, under its own uid, unable to interfere with any other uid. The inverse is also true, except for root, which can read anything anywhere.

mod_alias / mod_rewrite
The recommended method of [[Special:MyLanguage/Manual:Short URL|beautifying URLs]] involves [https://httpd.apache.org/docs/current/mod/mod_alias.html mod_alias]. Other methods use [https://httpd.apache.org/docs/current/mod/mod_rewrite.html mod_rewrite] instead.
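As a minimal sketch of the mod_alias approach (the install path /var/www/html/mediawiki and the /wiki prefix are assumptions; see the Short URL manual for the full procedure), add to the Apache configuration:

Alias /wiki "/var/www/html/mediawiki/index.php"

and tell MediaWiki about the new article path in LocalSettings.php:

$wgArticlePath = "/wiki/$1";
$wgUsePathInfo = true;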

mod_security
mod_security has been known to cause problems with MediaWiki. If you get errors seemingly at random, check your error log to see whether mod_security is causing them.
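If mod_security turns out to be the culprit and you cannot fix the offending rules, one blunt workaround is to disable its rule engine for the wiki directory. A sketch assuming ModSecurity 2.x and a wiki at /var/www/html/mediawiki (this goes in the server configuration, not .htaccess, and it removes mod_security's protection for the wiki, so weigh the trade-off first):

<IfModule security2_module>
<Directory "/var/www/html/mediawiki">
SecRuleEngine Off
</Directory>
</IfModule>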

Thread stack size
The stack size for each Apache thread is [https://httpd.apache.org/docs/current/mod/mpm_common.html#ThreadStackSize configurable], and the default varies between operating systems. To run MediaWiki on Windows it may be necessary to increase the stack size (if there are problems), as the 1 MB default is small and can [https://bugs.php.net/bug.php?id=47689 cause stack overflows] during PHP script execution. The following httpd.conf setting will set the stack size to about 8 MB (roughly a typical Linux default):
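This sketch assumes the Windows MPM (mpm_winnt); the value is in bytes:

<IfModule mpm_winnt_module>
ThreadStackSize 8388608
</IfModule>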

Spiders and bots
You really should use a robots.txt file to tell well-behaved spiders not to download dynamically generated pages (edit pages, for instance).

This can reduce the load on your webserver, preserve your bandwidth, and prevent duplicate content issues with search engines.
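A minimal sketch of such a robots.txt (this assumes short URLs are configured so that articles are served under a path such as /wiki/, while index.php handles edit and history views; without short URLs this rule would block the articles themselves):

# Keep well-behaved crawlers away from dynamically generated index.php URLs.
User-agent: *
Disallow: /index.php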

However, malicious bots could tie up your webserver and waste your bandwidth by downloading a large volume of pages extremely quickly.

[[m:Special:MyLanguage/Request throttling|Request throttling]] can help protect against this.