Extension:NoRobots

No Robots

Release status: stable
Implementation: Tag
Description: Protect individual pages from being indexed by search engines.
Latest version: n/a (2006-03-31)
MediaWiki: 1.5 and 1.6
License: No license specified
Download: see below

The NoRobots extension lets you prevent search engines from indexing individual pages on your wiki. If you want to keep search engines from indexing every page in your wiki, see Robots.txt instead.
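
For comparison, a minimal robots.txt placed at the web server's document root asks all well-behaved crawlers to stay away from the whole site; adjust the Disallow path if your wiki is served from a subdirectory such as /wiki/:

User-agent: *
Disallow: /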

This extension does not work with recent MediaWiki versions, but on current releases an administrator can set per-page robot policies with $wgArticleRobotPolicies instead.
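
On those newer versions, an entry along the following lines in LocalSettings.php applies a robot policy to specific pages; the page titles here are only illustrations:

$wgArticleRobotPolicies = array(
    # Example titles only -- list the pages you want to keep out of search indexes.
    'Main Page'              => 'noindex,nofollow',
    'Project:Internal notes' => 'noindex,follow'
);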

Installation

1. Create a file called extensions/NoRobots.php with these contents:

<?php
# NoRobots -- provides a <norobots> tag that marks a page as noindex,follow.
if ( !defined( 'MEDIAWIKI' ) ) {
    echo "This is an extension to the MediaWiki package and cannot be run standalone.\n";
    die( -1 );
}

$wgExtensionCredits['other'][] = array(
    'path'        => __FILE__,
    'name'        => 'NoRobots',
    'version'     => '1.0',
    'author'      => '',
    'url'         => 'https://www.mediawiki.org/wiki/Extension:NoRobots',
    'description' => 'Protect individual pages from being indexed by search engines.'
);

$wgExtensionFunctions[] = 'noRobots_Install';

# Register the <norobots> parser tag.
function noRobots_Install() {
    global $wgParser;
    $wgParser->setHook( 'norobots', 'noRobots_Render' );
}

# Tag callback: ask search engines not to index this page but still follow its links.
function noRobots_Render( $source, $argv ) {
    global $wgOut;
    $wgOut->setRobotpolicy( 'noindex,follow' );
    # The tag itself produces no visible output.
    return '';
}
?>

2. Add this to LocalSettings.php:

require_once("extensions/NoRobots.php");

Usage

Add "<norobots></norobots>" to any page you don't want indexed. That's it!