Manual:$wgDefaultRobotPolicy

From MediaWiki.org
Robot policies: $wgDefaultRobotPolicy
Allows specifying the default robot policy for all pages on the wiki
Introduced in version: 1.12.0 (r30602)
Removed in version: still in use
Allowed values: (text string appropriate for a "robots" meta tag)
Default value: 'index,follow'

Settings: Alphabetical | By function

Details

Allows specifying the default robot policy for all pages on the wiki. The default is to encourage indexing and following of links. This can be overridden on a per-namespace and/or per-article basis.

For example:

$wgDefaultRobotPolicy = 'noindex,nofollow';
$wgNamespaceRobotPolicies = array( NS_MAIN => 'index,follow' );

would forbid indexing and following of links on all pages outside the main namespace.
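The per-article override mentioned above works the same way, via $wgNamespaceRobotPolicies' companion setting $wgArticleRobotPolicies, which maps page titles (including any namespace prefix) to policies. A minimal sketch; the page titles below are illustrative, not taken from this page:

```php
// Disallow indexing wiki-wide, then re-open specific pages.
$wgDefaultRobotPolicy = 'noindex,nofollow';
$wgArticleRobotPolicies = array(
    'Main Page'        => 'index,follow',   // expose the main page to crawlers
    'Help:Contents'    => 'index,follow',   // note the namespace prefix in the key
);
```

Per-article policies take precedence over both the namespace and the default policy, so this combination indexes only the two listed pages.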

See also


Robot policy (Manual:Robot policy)
Files: robots.txt (Manual:Robots.txt)
Attributes: nofollow (Manual:Nofollow), noindex (Manual:Noindex)
Configuration settings: $wgArticleRobotPolicies, $wgDefaultRobotPolicy, $wgExemptFromUserRobotsControl, $wgNamespaceRobotPolicies