Thread:Project:Support desk/Robot.txt File/reply

Hi!

The robots.txt file must always be placed in the domain root. It does not matter where anything else lives (e.g. where you have installed MediaWiki); robots.txt must always be reachable directly in the webroot of the domain.

To disallow special pages, put this in the file (each directive on its own line):

User-agent: *
Disallow: /view/Special

This assumes that your wiki pages are served under that path, e.g. Special:RecentChanges at /view/Special:RecentChanges.
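If you want to check that the rule behaves as intended before crawlers hit your site, Python's standard-library robots.txt parser can evaluate it locally. This is just a sketch; the example.org URLs and the /view/ path prefix are assumptions matching the rule above, so substitute your own domain and article path.

```python
from urllib import robotparser

# The rule from the reply above; /view/ is an assumed short-URL prefix.
rules = """\
User-agent: *
Disallow: /view/Special
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Special pages under /view/Special... are blocked for all user agents.
print(rp.can_fetch("*", "https://example.org/view/Special:RecentChanges"))  # False

# Ordinary pages remain crawlable.
print(rp.can_fetch("*", "https://example.org/view/Main_Page"))  # True
```

Note that Disallow matches by path prefix, so this single rule covers every page whose path begins with /view/Special, including all localized Special: pages that share that prefix.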