Extension:NoRobots

No Robots

Release status: stable

Implementation: Tag
Description: Protect individual pages from being indexed by search engines.
Latest version: n/a (2006-03-31)
MediaWiki: 1.5 and 1.6
License: No license specified
Download: see below


The NoRobots extension lets you prevent search engines from indexing individual pages on your wiki. If you want to prevent search engines from indexing all pages in your wiki, see Robots.txt instead.
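For the whole-wiki case, a minimal robots.txt in the web server's document root looks roughly like this (assuming the wiki is served from the site root):

User-agent: *
Disallow: /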

This extension does not work with recent MediaWiki versions; on a modern wiki, an administrator can set per-page robot policies with $wgArticleRobotPolicies instead.
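For example, an administrator could add something like the following to LocalSettings.php; the page titles here are only placeholders, and the keys must be full page titles including the namespace:

$wgArticleRobotPolicies = array(
	'Main Page'      => 'noindex,nofollow',
	'Talk:Main Page' => 'noindex,follow'
);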

Installation

1. Create a file called extensions/NoRobots.php with these contents:

<?php
if ( !defined( 'MEDIAWIKI' ) ) {
	echo "This is an extension to the MediaWiki package and cannot be run standalone.\n";
	die( -1 );
}

$wgExtensionCredits['other'][] = array(
	'path'        => __FILE__,
	'name'        => 'NoRobots',
	'version'     => '1.0',
	'author'      => '',
	'url'         => 'https://www.mediawiki.org/wiki/Extension:NoRobots',
	'description' => 'Protect individual pages from being indexed by search engines.'
);

$wgExtensionFunctions[] = 'noRobots_Install';

// Register the <norobots> tag with the parser.
function noRobots_Install() {
	global $wgParser;
	$wgParser->setHook( 'norobots', 'noRobots_Render' );
}

// Tag callback: ask search engines not to index this page (while still
// following its links), and output nothing in place of the tag itself.
function noRobots_Render( $source, $argv ) {
	global $wgOut;
	$wgOut->setRobotpolicy( 'noindex,follow' );
	return '';
}

2. Add this to LocalSettings.php:

require_once("extensions/NoRobots.php");

Usage

Add "<norobots></norobots>" to any page you don't want indexed. That's it!