Manual:$wgDefaultRobotPolicy

Robot policies: $wgDefaultRobotPolicy
Allows specifying the default robot policy for all pages on the wiki
Introduced in version: 1.12.0 (r30602)
Removed in version: still in use
Allowed values: (text string appropriate for a "robots" meta tag)
Default value: 'index,follow'
Other settings: Alphabetical List | List by Function

Details

Specifies the default robot policy for all pages on the wiki. The default, 'index,follow', encourages search engines to index pages and follow their links. This can be overridden on a per-namespace basis with $wgNamespaceRobotPolicies and/or on a per-page basis with $wgArticleRobotPolicies.

For example:

$wgDefaultRobotPolicy = 'noindex,nofollow';
$wgNamespaceRobotPolicies = [ NS_MAIN => 'index,follow' ];

would forbid indexing and following of links on all pages outside the main namespace.
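Individual pages can be exempted the same way. A minimal sketch for LocalSettings.php, keeping the wiki indexable by default while hiding one page from search engines (the page title "Sandbox" here is a hypothetical example, not from the original page):

```php
<?php
// In LocalSettings.php: keep the sitewide default permissive,
// but exclude the "Sandbox" page (hypothetical title) from indexing.
$wgDefaultRobotPolicy = 'index,follow';
$wgArticleRobotPolicies = [
	'Sandbox' => 'noindex,nofollow',
];
```

Page titles in $wgArticleRobotPolicies are matched against the full title, including the namespace prefix where applicable.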

See also

Robot policy (Manual:Robot policy)
Files: robots.txt (Manual:Robots.txt)
Attributes: nofollow (Manual:Nofollow), noindex (Manual:Noindex)
Configuration settings: $wgArticleRobotPolicies, $wgDefaultRobotPolicy, $wgExemptFromUserRobotsControl, $wgNamespaceRobotPolicies