Apache configuration

Apache is the recommended server for running MediaWiki.

PHP as an Apache module
MediaWiki is written to use PHP as an Apache module. Your PHP is probably configured as a module if your wiki's URLs look like:

example.com/index.php/Main_Page

You can check which PHP configuration and version you have by viewing your wiki's Special:Version page, or with [$link phpinfo].

RedHat/Fedora-based Linux server
Install PHP:

 yum install php php-xml

Reload httpd:

 service httpd reload

Debian-based Linux server
Install the apache2 php5 module:

 apt-get install apache2 libapache2-mod-php5 php5-cli php-apc php5-mcrypt

On Ubuntu 16.04.2 LTS:

 sudo apt-get install libapache2-mod-php

Enable the apache2 php5 module:

 a2enmod php5

Restart Apache:

 service apache2 restart

PHP as CGI
If PHP runs as CGI, you will get "ugly" URLs by default, but you can still set up short URLs.

CGIWrap
If you have your own server running Apache and are running PHP as CGI, you can install CGIWrap. This tool enables you to run the Apache server as a different user for CGIs.

That way, you can create a new user for your MediaWiki pages. Installing CGIWrap is beyond the scope of this document, especially since you must compile it for your own server. However, as a quick guideline, you can follow these steps:

 * Create a new user for MediaWiki:

 useradd -M -s /sbin/nologin wikiuser

 * Have a cgi-bin folder containing CGIWrap (for example /home/myuser/cgi-bin). Once everything is configured, keep only cgiwrap; move the debug versions to another folder in case you ever need them. Your cgiwrap file should be accessible only to Apache (chown and chmod it accordingly).
 * Inside the cgi-bin folder, create a symbolic link to the MediaWiki root:

 ln -s /home/myuser/public_html/wiki /home/myuser/cgi-bin/wikilink

 * In your wiki's .htaccess file, add the following definitions:

 AddHandler php-wrapper .php
 Action php-wrapper /cgi-bin/cgiwrap/wikiuser/wikilink

 * Finally, chown and chmod all the .php files of your MediaWiki folder so that they are accessible solely by wikiuser.

The files will be accessible as usual. You do not need to specify in your path any cgi-bin, as this is transparently taken care of for you.

I strongly suggest you start out with /cgi-bin/cgiwrapd/... as your php-wrapper, as it will show precisely what is currently working. I also strongly suggest you do not delete your CGIWrap source folder until everything works perfectly, as this is a real trial-and-error process that can take a long time. However, it is worth your time: your MediaWiki will run in its own separate process, under its own uid, unable to interfere with any other uid. The inverse is also true, except for root, which can read anything anywhere.

mod_alias / mod_rewrite
The recommended method for short URLs involves mod_alias. Another way to do it is to use mod_rewrite instead.
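As an illustrative sketch, assuming the wiki's files live in /var/www/w and articles are served under the /wiki prefix (both paths are assumptions about your layout), the two approaches can look like this:

```apacheconf
# mod_alias: map article URLs onto MediaWiki's entry point
# (install path /var/www/w and article prefix /wiki are assumptions)
Alias /wiki /var/www/w/index.php

# mod_rewrite alternative to the Alias line above:
# RewriteEngine On
# RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/w/index.php [L]
```

Whichever you choose, $wgArticlePath in LocalSettings.php must match the chosen prefix (e.g. "/wiki/$1").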

mod_security
mod_security has been known to cause problems with MediaWiki. If you get seemingly random errors, check your error log to see whether mod_security is causing them.
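If mod_security does turn out to be the culprit, one option is to switch off its rule engine for the wiki alone. A sketch assuming mod_security 2.x (the directive names differ in 1.x), placed in the wiki's VirtualHost or Directory block:

```apacheconf
# Disable mod_security request filtering for MediaWiki only
<IfModule mod_security2.c>
    SecRuleEngine Off
</IfModule>
```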

VisualEditor and Subpages
In order to prevent errors contacting the Parsoid server, AllowEncodedSlashes NoDecode must be added to the wiki's VirtualHost config block (or to the general server config if VirtualHosts are not used).
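For example, in a VirtualHost block (the server name and paths here are placeholders for your own setup):

```apacheconf
<VirtualHost *:80>
    ServerName wiki.example.com
    DocumentRoot /var/www/w

    # Pass encoded slashes (%2F) in subpage titles through undecoded,
    # so requests involving Parsoid keep their original page titles
    AllowEncodedSlashes NoDecode
</VirtualHost>
```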

Thread stack size
The stack size for each Apache thread is configurable, and the default varies between operating systems. To run MediaWiki on Windows it may be necessary to increase the stack size (if there are problems), as the 1 MB default is small and can cause stack overflows during PHP script execution. The following httpd.conf setting sets the stack size to about 8 MB (roughly a typical Linux default):
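On Windows, Apache's winnt MPM exposes the stack size through the ThreadStackSize directive; a sketch of the setting described above (8388608 bytes, about 8 MB):

```apacheconf
# httpd.conf — only takes effect under the Windows MPM
<IfModule mpm_winnt_module>
    ThreadStackSize 8388608
</IfModule>
```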

Spiders and bots
You really should use a robots.txt file to tell well-behaved spiders not to download dynamically generated pages (edit pages, for instance). This can reduce the load on your webserver, preserve your bandwidth, and prevent duplicate content issues with search engines. However, malicious bots could tie up your webserver and waste your bandwidth by downloading a large volume of pages extremely quickly. Request throttling can help protect against this.
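A minimal robots.txt sketch, assuming short URLs where readable pages live under /wiki/ and index.php (edit views, histories, diffs) is reached under /w/ (both prefixes are assumptions about your layout):

```
User-agent: *
# Block dynamically generated views served through /w/index.php;
# articles under /wiki/ remain crawlable
Disallow: /w/
```

Note that robots.txt only restrains well-behaved crawlers; throttling of misbehaving bots has to happen server-side.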