Manual:Short URL/Prevent bots from crawling index.php

You can make sure that search engines only index actual wiki pages, without indexing action views (such as edit or history pages, whose URLs go through index.php).

Create a file named robots.txt in the root of your MediaWiki installation with the following content:

User-agent: *
Disallow: /index.php
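You can check what effect these two directives have before deploying them. The following sketch uses Python's standard urllib.robotparser module to parse the rules above and test two example URLs (the example.org host and page titles are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Parse the same rules the robots.txt file will contain.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /index.php",
])

# An action view served through index.php is blocked...
print(rp.can_fetch("*", "https://example.org/index.php?title=Main_Page&action=edit"))

# ...while a short URL for the same page is still crawlable.
print(rp.can_fetch("*", "https://example.org/wiki/Main_Page"))
```

Note that the Disallow rule is a path prefix: it matches any URL whose path begins with /index.php, regardless of the query string.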

Note: Creating a robots.txt file with Disallow: /index.php without first setting up Manual:Short URL will block all pages from being indexed. This is because every page URL on your wiki will still contain index.php, so the Disallow rule matches every page.
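For context, a typical short URL configuration serves articles at a path that does not contain index.php. The fragment below is a minimal sketch for Apache with mod_rewrite, assuming MediaWiki is installed under /w and articles should be served at /wiki/Title (both are conventional choices, not requirements):

```apache
# Apache site configuration (requires mod_rewrite).
# Assumes the wiki lives in /w and articles are served at /wiki/Title.
RewriteEngine On
RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/w/index.php [L]
```

With matching settings in LocalSettings.php ($wgScriptPath = "/w"; $wgArticlePath = "/wiki/$1";), page URLs no longer contain index.php, so the robots.txt rule above blocks only action views.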