Apache configuration

Apache is the web server most commonly used with MediaWiki.

PHP as an Apache module
MediaWiki is written to use PHP as an Apache module. Your PHP installation is probably already configured as a module if your URLs begin like this:

example.com/index.php/Main_Page

You can check which configuration and which PHP version are in use by visiting your wiki's Special:Version page, or with phpinfo.
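A quick way to check is a one-line PHP file (the name info.php here is just an example; delete the file afterwards, as it exposes server details):

 <?php phpinfo();

Visiting example.com/info.php then shows the full PHP configuration; the "Server API" row tells you whether PHP is running as an Apache module or as CGI.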

Red Hat/Fedora-based Linux
Install PHP:

 yum install php php-xml

Reload httpd:

 service httpd reload

Debian-based Linux
Install the apache2 php5 module:

 apt-get install apache2 libapache2-mod-php5 php5-cli php-apc php5-mcrypt

On Ubuntu 16.04.2 LTS (which ships PHP 7 instead of PHP 5):

 sudo apt-get install libapache2-mod-php

Enable the apache2 php5 module:

 a2enmod php5

Restart Apache:

 service apache2 restart

PHP as CGI
If PHP is running as CGI, you will get "ugly" URLs by default, but you can still implement short URLs.
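In practice this means page links default to the example.com/index.php?title=Main_Page form rather than example.com/index.php/Main_Page.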

CGIWrap
If you have your own server with Apache and are running PHP as CGI, you can install CGIWrap. This tool lets CGIs run under a different user than the Apache server.

That way, you can create a new user just for the MediaWiki pages. Installing CGIWrap is beyond the scope of this document, especially considering that it must be compiled for the specifics of your own server. As guidance, however, you can follow these steps:

 * Create a user for Wikimedia:

 useradd -M -s /sbin/nologin wikiuser

 * Have a cgi-bin folder containing CGIWrap (for example, in /home/myuser/cgi-bin). Once everything is configured, keep only cgiwrap; move the debug versions to another folder in case you ever need them. Your cgiwrap file should be accessible only to Apache (chown and chmod accordingly).
 * Inside the cgi-bin folder, create a symbolic link to the Wikimedia root:

 ln -s /home/myuser/public_html/wiki /home/myuser/cgi-bin/wikilink

 * In your wiki's .htaccess file, add the following definitions:

 AddHandler php-wrapper .php
 Action php-wrapper /cgi-bin/cgiwrap/wikiuser/wikilink

 * Finally, chown and chmod all the .php files of your Wikimedia folder so that they are accessible solely by wikiuser.

The files will be accessible as usual. You do not need to include any cgi-bin in your paths, as this is taken care of transparently for you.

I strongly suggest you start out with /cgi-bin/cgiwrapd/... as your php-wrapper, as it will show precisely what is currently working. I also strongly suggest you do not delete your CGIWrap source folder until everything works perfectly, as this is a real trial-and-error process that takes a long time. It is worth your time, though: your MediaWiki will run in its own separate process, under its own uid, unable to interfere with any other uid. The inverse also holds, except for root, which can read anything anywhere.

mod_alias / mod_rewrite
The recommended method for prettifying URLs is to use mod_alias. Other methods use mod_rewrite instead.
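As a minimal sketch, assuming MediaWiki is installed in /w under the document root and short URLs should appear under /wiki (both paths are placeholders; adjust them to your layout), the mod_alias approach is a single directive in the server or VirtualHost configuration:

 Alias /wiki /var/www/html/w/index.php

The mod_rewrite equivalent would be:

 RewriteEngine On
 RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/w/index.php [L]

In either case, $wgArticlePath in LocalSettings.php must match (e.g. "/wiki/$1").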

mod_security
mod_security sometimes causes problems with MediaWiki. If you get seemingly random errors, check the error log to see whether it is the cause.
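If mod_security turns out to be the culprit, one common workaround is to disable it for the wiki directory from .htaccess; a sketch, assuming mod_security 1.x (version 2.x uses SecRuleEngine Off instead):

 <IfModule mod_security.c>
 SecFilterEngine Off
 SecFilterScanPOST Off
 </IfModule>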

VisualEditor and Subpages
In order to prevent errors contacting the Parsoid server, AllowEncodedSlashes NoDecode must be added to the wiki's VirtualHost config block (or to the general server config if VirtualHosts are not used).
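A minimal sketch of such a block (hostname and paths are placeholders):

 <VirtualHost *:80>
 ServerName wiki.example.com
 DocumentRoot /var/www/html
 AllowEncodedSlashes NoDecode
 </VirtualHost>

NoDecode makes Apache pass encoded slashes (%2F) in subpage titles through to MediaWiki instead of decoding or rejecting them.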

Thread stack size
The stack size of each Apache thread is configurable, and the default varies across operating systems. To run MediaWiki on Windows it may be necessary to increase the stack size if problems occur, as the 1 MB default is small and can cause stack overflows during PHP script execution. The following httpd.conf setting raises the stack size to 8 MB (roughly the typical Linux default):
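On Windows the relevant MPM is mpm_winnt, which provides the ThreadStackSize directive; a sketch (the value is in bytes, and 8388608 bytes = 8 MB):

 <IfModule mpm_winnt_module>
 ThreadStackSize 8388608
 </IfModule>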

Spiders and bots
You really should use a robots.txt file to tell well-behaved spiders not to download dynamically generated pages (edit pages, for instance). This reduces the load on your web server, preserves your bandwidth, and prevents duplicate-content issues with search engines. However, malicious bots can tie up your web server and waste your bandwidth by downloading a large number of pages extremely quickly; request throttling can help protect against this.
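A minimal sketch of such a robots.txt, assuming the common layout where readable pages are served under /wiki/ and index.php (and therefore all edit, history, and other dynamically generated URLs) lives under /w/:

 User-agent: *
 Disallow: /w/

With that layout, article pages remain crawlable while every index.php?action=edit style URL is blocked.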