Manual:$wgDefaultRobotPolicy

Robot policies: $wgDefaultRobotPolicy
Allows specifying the default robot policy for all pages on the wiki.
Introduced in version: 1.12.0 (r30602)
Removed in version: still in use
Allowed values: (text string appropriate for a "robots" meta tag)
Default value: 'index,follow'

Details

Allows specifying the default robot policy for all pages on the wiki. The default is to encourage indexing and following of links. This can be overridden on a per-namespace and/or per-article basis.

For example:

$wgDefaultRobotPolicy = 'noindex,nofollow';
$wgNamespaceRobotPolicies = array( NS_MAIN => 'index,follow' );

would forbid indexing and following of links on all pages outside the main namespace.
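Individual pages can be overridden in the same way with $wgArticleRobotPolicies, which takes page titles as keys. A minimal LocalSettings.php sketch building on the example above ('Main Page' is only an illustrative title, not part of the original example):

$wgDefaultRobotPolicy = 'noindex,nofollow';
$wgNamespaceRobotPolicies = array( NS_MAIN => 'index,follow' );
# Per-article override: hide this one page from search engines again.
# 'Main Page' is an illustrative title; the key is the full page title.
$wgArticleRobotPolicies = array( 'Main Page' => 'noindex,nofollow' );

Per-article policies take precedence over the namespace and site-wide defaults for the pages they name.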

See also

Robot policy
Files: robots.txt
Attributes: nofollow, noindex
Configuration settings: $wgArticleRobotPolicies, $wgDefaultRobotPolicy, $wgExemptFromUserRobotsControl, $wgNamespaceRobotPolicies