Apache configuration





PHP as an Apache module

MediaWiki is written to use PHP as an Apache module; in a typical installation, PHP is probably already configured this way.


You can check which configuration and version of PHP you have by viewing your wiki's Special:Version page, or with phpinfo().
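As a quick command-line sketch (assuming the php and apachectl binaries are on your PATH), you can also check from a shell which way PHP runs:

```shell
# Report the SAPI for this invocation. The command line always reports
# "cli"; to see the web SAPI, view a page containing <?php echo PHP_SAPI; ?>
# served through Apache ("apache2handler" means module, "cgi-fcgi" means CGI).
php -r 'echo PHP_SAPI, PHP_EOL;'

# List loaded Apache modules and look for mod_php (the exact module
# name varies by PHP version, e.g. php7_module):
apachectl -M 2>/dev/null | grep -i php
```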

Red Hat / Fedora-based Linux

Install PHP:

# yum install php php-xml

Reload httpd:

# service httpd reload
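On Red Hat family systems that use systemd (RHEL/CentOS 7 and later), the equivalent is the systemctl form; a sketch assuming the service is named httpd:

```shell
# systemd replaces the legacy "service" wrapper on newer releases:
systemctl reload httpd
```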
Debian-based Linux

Install the apache2 php5 module:

# apt-get install apache2 libapache2-mod-php5 php5-cli php-apc php5-mcrypt
On Ubuntu 16.04.2 LTS:
# apt-get install libapache2-mod-php

Enable the apache2 php5 module:

# a2enmod php5

Restart Apache:

# service apache2 restart

PHP as CGI

If PHP runs as a CGI program, you will get "ugly" URLs by default; see Short URL for how to change this.


If you have your own server running Apache and are running PHP as CGI, you can install CGIWrap. This tool enables you to run the Apache server as a different user for CGIs.

That way, you can create a new user for your MediaWiki pages. Installing CGIWrap is beyond the scope of this document, especially since you must compile it for your own server. However, as a quick guideline, you can follow these steps:

  • Create a Wikimedia user:
useradd -M -s /sbin/nologin wikiuser
  • Have a cgi-bin folder containing CGIWrap (for example, /home/myuser/cgi-bin). Once everything is configured, keep only cgiwrap; move the debug versions to another folder in case you ever need them. Your cgiwrap file should be accessible only to Apache (chown and chmod accordingly).
chown apache:apache cgiwrap
chmod 500 cgiwrap
  • Inside the cgi-bin folder, create a symbolic link to the Wikimedia root.
ln -s /home/myuser/public_html/wiki /home/myuser/cgi-bin/wikilink
  • In your wiki's .htaccess file, add the following definitions:
AddHandler php-wrapper .php
Action php-wrapper /cgi-bin/cgiwrap/wikiuser/wikilink
  • Finally, chown and chmod all the .php files of your Wikimedia folder to be accessible solely by wikiuser.
find . -name \*.php -exec chown wikiuser:wikiuser {} \;
find . -name \*.php -exec chmod 500 {} \;
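As a quick sanity check after the chown/chmod step (a sketch using GNU find's symbolic -perm tests), the following lists any .php file that other users can still read; it should print nothing when run inside the MediaWiki directory:

```shell
# List any .php file that is still readable by group or other,
# i.e. whose permissions were not tightened to 500 (r-x------):
find . -name '*.php' \( -perm -g+r -o -perm -o+r \) -print
```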

The files will be accessible as usual. You do not need to include any cgi-bin component in your paths, as this is taken care of transparently.

I strongly suggest you start out with /cgi-bin/cgiwrapd/... as your php-wrapper, as it will show precisely what is currently working. I also strongly suggest you do not delete your CGIWrap source folder until everything works perfectly, as this is a real trial-and-error process that can take a long time. However, it is worth your time, as your MediaWiki will run in its own separate process, under its own uid, unable to interfere with any other uid. The inverse is also true, except for root, which can read anything anywhere.

mod_alias / mod_rewrite

The recommended method of beautifying URLs involves mod_alias. Other methods use mod_rewrite instead.
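As a minimal sketch of the mod_alias approach (the /wiki prefix and the filesystem path are assumptions; adjust them to your installation), the Apache side can be as small as a single Alias directive:

```apache
# httpd.conf / virtual host: map /wiki/Page_title onto the entry script.
# /var/www/mediawiki is an example path; use your real install path.
Alias /wiki "/var/www/mediawiki/index.php"
```

With $wgArticlePath = "/wiki/$1"; set in LocalSettings.php, MediaWiki then emits the short form in its own links; see the Short URL manual page for the full recipe.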


ModSecurity has been known to cause problems with MediaWiki. If you get errors seemingly at random, check your error log to see whether it is causing problems.


The stack size for each Apache thread is configurable, and the default varies between operating systems. To run MediaWiki in Windows environments it may be necessary to increase the stack size (if there are problems), as the 1 MB default is small and can cause stack overflows during PHP script execution. The following httpd.conf setting sets the stack size to about 8 MB (a typical Linux default):

<IfModule mpm_winnt_module>
ThreadStackSize 8388608
</IfModule>


You really should use a robots.txt file to tell well-behaved spiders not to download dynamically generated pages (edit pages, for instance).

This can reduce the load on your webserver, preserve your bandwidth, and prevent duplicate content issues with search engines. However, malicious bots could tie up your webserver and waste your bandwidth by downloading a large volume of pages extremely quickly. Request throttling can help protect against this.
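A minimal robots.txt along these lines (assuming the default /index.php entry point, with articles served under a separate short-URL path such as /wiki/) keeps well-behaved crawlers away from the dynamically generated views:

```
# robots.txt: block the script entry point (edit, history, diff views);
# article pages under /wiki/ remain crawlable.
User-agent: *
Disallow: /index.php
```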