Manual:Short URL

This Short URL help page describes how to configure a Dreamhost-hosted MediaWiki install to use shortened URLs instead of the default longer URLs that include "index.php?title=".


 * Original URL example: wikisite.tld/w/index.php?title=Main_Page
 * Shortened URL example: wikisite.tld/wiki/Main_Page

This page describes only one possible configuration for achieving short URLs. Depending on your preferences and specific needs, you may need to adjust these suggestions accordingly.

Note that for the following settings to work, the MediaWiki installation directory ("/w" in this example) must have a different name from the desired virtual directory ("/wiki" in this example). If you installed to the directory name you'd like to use in the shortened URLs, either rename the directory or delete and reinstall MediaWiki to a different directory through the Dreamhost Panel.
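If you did install MediaWiki into the directory name you want for the short URLs, the rename can be done from a shell. A minimal sketch (it uses a throwaway directory created by mktemp so it is safe to run as-is; substitute your real web root in practice):

```shell
# Sketch: simulate renaming the install directory so the name "wiki"
# is freed up for use as the virtual path. All paths are hypothetical.
SITE="$(mktemp -d)/wikisite.tld"   # stands in for your site's web root
mkdir -p "$SITE/wiki"              # stands in for the existing MediaWiki install
mv "$SITE/wiki" "$SITE/w"          # the real files now live in /w; /wiki is free
```

After the rename, remember to update any absolute paths in LocalSettings.php that still point at the old directory name.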

File: [web root]/.htaccess
Create or edit the .htaccess file in the root web directory (not the wiki directory) as follows:

Options +FollowSymLinks
RewriteEngine On

# Change "wiki/" below to match your desired virtual path.
# Change "w/" to match your actual installation directory.
RewriteRule ^wiki/(.*)$ /w/index.php?title=$1 [PT,L,QSA]
RewriteRule ^wiki/*$ /w/index.php [L,QSA]

# Include this section to prevent Dreamhost /stats from breaking
# (see http://wiki.dreamhost.com/Mod_rewrite)
RewriteCond %{REQUEST_URI} ^/stats/(.*)$ [OR]
RewriteCond %{REQUEST_URI} ^/failed_auth.html$
RewriteRule ^.*$ - [L]
# end of /stats fix
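The first RewriteRule is a plain regex capture: everything after "wiki/" becomes the value of the title parameter. Its effect can be sketched with sed (the path is just an illustrative example):

```shell
# What the first RewriteRule does to an incoming short URL path:
# capture the part after "wiki/" and hand it to index.php as the title.
echo "wiki/Main_Page" | sed -E 's|^wiki/(.*)$|/w/index.php?title=\1|'
# → /w/index.php?title=Main_Page
```

The QSA flag additionally appends any query string from the original request (e.g. ?action=edit), which the sed sketch does not model.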

File: [wiki root]/LocalSettings.php
Add the following to the LocalSettings.php file:

# Short URL stuff
# Change "wiki/" below to match your desired virtual path.
$wgArticlePath = "/wiki/$1";
$wgUsePathInfo = true;
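Appending the two settings can also be done from a shell. A sketch (it writes to a temporary file standing in for [wiki root]/LocalSettings.php, so it is safe to run; on a live install, back the real file up first):

```shell
# Sketch: append the short-URL settings to LocalSettings.php.
# LS is a temp file here; substitute your real [wiki root]/LocalSettings.php.
LS="$(mktemp)"
cat >> "$LS" <<'EOF'
$wgArticlePath = "/wiki/$1";
$wgUsePathInfo = true;
EOF
grep -c '^\$wg' "$LS"   # → 2 (both settings present)
```

The quoted heredoc delimiter ('EOF') keeps the shell from expanding $wgArticlePath and $1 before they reach the file.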

robots.txt
If you have implemented short URLs so that wiki article links look like http://wikisite.tld/Articlename or http://wikisite.tld/wikidir/Articlename, preserve bandwidth on your server by preventing search engine spiders from crawling all your wiki's action pages (edit, history, discuss, etc.). Put the following into a file named 'robots.txt' at the root of the site's web directory (/home/yourusername/sitename.org/robots.txt):

User-agent: *
Disallow: /index.php

If your URLs look like http://wikisite.tld/wikidir/Articlename, you can be more specific like this instead:

User-agent: *
Disallow: /wikidir/index.php

(Adjust wikidir to whatever your wiki directory is; leave it out if your wiki lives at the root of your site.)

Note that this will block search engines from everything on your site whose path contains /index.php or /wikidir/index.php, depending on which of the above you use. If you run other PHP applications at that location besides the short-URL MediaWiki, keep that in mind.
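Writing and checking the file can be done in one step from a shell. A sketch (it uses a temporary directory standing in for /home/yourusername/sitename.org, so it is safe to run as-is):

```shell
# Sketch: create robots.txt at the (hypothetical) site root and verify it.
ROOT="$(mktemp -d)"   # stands in for /home/yourusername/sitename.org
cat > "$ROOT/robots.txt" <<'EOF'
User-agent: *
Disallow: /index.php
EOF
grep '^Disallow' "$ROOT/robots.txt"   # → Disallow: /index.php
```

Because crawlers only fetch /robots.txt from the top level, writing the file anywhere below the site root has no effect.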

The robots.txt file is only valid when located at the top level of your site (/home/yourusername/sitename.org/robots.txt).


 * robots.txt validation tool: http://tool.motoricerca.info/robots-checker.phtml