Manual:Short URL/Prevent bots from crawling index.php

If you are using short URLs (see note below), you can make sure that search engines only index actual wiki pages, without indexing action views (such as edit or history pages, whose URLs still go through index.php, e.g. /index.php?title=Some_page&action=edit).

Create a file named robots.txt in the root of your MediaWiki installation with the following content.
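A minimal robots.txt for this purpose might look like the following (assuming index.php sits directly under your web root; adjust the path if MediaWiki is installed in a subdirectory such as /w/):

```
User-agent: *
Disallow: /index.php
```

With short URLs configured, normal page views (e.g. /wiki/Some_page) do not match this rule and remain crawlable, while every action view routed through index.php is excluded.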

Note: Creating a robots.txt file with Disallow: /index.php in the root of your MediaWiki installation without setting up short URLs first will block all pages from being indexed. This is because, without short URLs, every page URL still contains index.php, so the Disallow rule matches every page.

If you are using long URLs, follow the link below.