Manual:$wgDefaultRobotPolicy

Robot policies: $wgDefaultRobotPolicy
Allows specifying the default robot policy for all pages on the wiki
Introduced in version: 1.12.0 (r30602)
Removed in version: still in use
Allowed values: (text string appropriate for a "robots" meta tag)
Default value: 'index,follow'

Other settings: Alphabetical | By function

Details

Allows specifying the default robot policy for all pages on the wiki. The default is to encourage indexing and following of links. This can be overridden on a per-namespace and/or per-article basis.

For example:

$wgDefaultRobotPolicy = 'noindex,nofollow';
$wgNamespaceRobotPolicies = array( NS_MAIN => 'index,follow' );

would forbid indexing and following of links on all pages outside the main namespace.
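The per-article override mentioned above uses $wgArticleRobotPolicies (listed under "See also"). As a hedged sketch, the following LocalSettings.php fragment combines it with the defaults from the example; the page title used here is purely illustrative:

```php
// In LocalSettings.php: discourage indexing wiki-wide,
// but re-enable it for the main namespace, as in the example above.
$wgDefaultRobotPolicy = 'noindex,nofollow';
$wgNamespaceRobotPolicies = array( NS_MAIN => 'index,follow' );

// Per-article override: keep one specific main-namespace page
// out of search indexes while still letting crawlers follow its links.
// 'Sandbox' is a hypothetical page title, not one assumed by MediaWiki.
$wgArticleRobotPolicies = array(
    'Sandbox' => 'noindex,follow',
);
```

Policies are resolved from most to least specific: a per-article entry wins over a per-namespace entry, which in turn wins over $wgDefaultRobotPolicy.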

See also

External link

Robot policy
Files: robots.txt
Properties: nofollow, noindex
Configuration settings: $wgArticleRobotPolicies, $wgDefaultRobotPolicy, $wgExemptFromUserRobotsControl, $wgNamespaceRobotPolicies