Apache configuration


The Apache webserver is the recommended web server for use with MediaWiki. Other servers may work, but they are untested and unsupported.

PHP
You'll want to set up PHP as an Apache module; without it, the wiki scripts won't be executed at all.

Be sure to enable mod_php in the directory that contains the MediaWiki scripts, but remember to disable it in the upload directory, so that visitors can't upload and execute arbitrary code on your system!
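As a sketch, assuming the wiki scripts live in /var/www/html/w and uploads in /var/www/html/w/images (hypothetical paths; adjust to your layout), the relevant httpd.conf stanzas might look like:

```apache
# Scripts directory: PHP execution enabled
<Directory "/var/www/html/w">
    php_flag engine on
</Directory>

# Upload directory: prevent uploaded files from being run as PHP
<Directory "/var/www/html/w/images">
    php_admin_flag engine off
    RemoveType .php .phtml
</Directory>
```

php_admin_flag (rather than php_flag) ensures the setting can't be re-enabled by an .htaccess file in the upload directory.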

mod_rewrite
URL rewriting is recommended to give pages clean, readable URLs. So far this covers only the paths used to view regular pages. For instance, if in LocalSettings.php your $wgScript is '/w/wiki.phtml' and $wgArticlePath is '/wiki/$1':

RewriteEngine On
RewriteRule ^/wiki/(.*)$ /w/wiki.phtml?title=$1 [L]
RewriteRule ^/wiki$ /w/wiki.phtml

This will internally change /wiki/Some_Title to /w/wiki.phtml?title=Some_Title, which gives the script the variables it needs.

Patches
If you're using URL rewriting and want to be able to use the ampersand (&) in page titles, you'll need to patch Apache to properly escape the character when generating the query string. A patch for Apache 1.3.26 is available as maintenance/apache-ampersand.diff in the MediaWiki source. (No patch is yet available for Apache 2.0.x.)

Change your mod_rewrite config like so:

RewriteEngine On
RewriteMap ampescape int:ampescape
RewriteRule ^/wiki/(.*)$ /w/wiki.phtml?title=${ampescape:$1} [L]
RewriteRule ^/wiki$ /w/wiki.phtml

This way, /wiki/AT&T correctly becomes /w/wiki.phtml?title=AT%26T instead of /w/wiki.phtml?title=AT&T, which would break up into "title=AT" and a useless "T" parameter.

Robots exclusion file
You probably don't want spiders trying to download every dynamically generated page; see robots.txt for how to exclude them.
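A minimal robots.txt, assuming the script path /w/ and article path /wiki/ from the rewrite example above, would let spiders index readable article URLs while blocking the dynamically generated script URLs:

```
User-agent: *
Disallow: /w/
```

Place this file at the web root (e.g. /var/www/html/robots.txt) so it is served as /robots.txt.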

Throttling
See request throttling.