Manual:Maxlag parameter/zh

If your MediaWiki installation runs on a replicated database cluster (as Wikimedia's does), heavy editing can cause the replica servers to lag behind the master. One way to mitigate this is to have all bots and maintenance tasks automatically stop whenever the lag exceeds a certain threshold. MediaWiki 1.10 introduced the maxlag parameter, which lets client software handle this situation. The mechanism changed in version 1.27: only API requests use it.

When making an API request, you can pass maxlag as an integer number of seconds, either in the URL query string or in the POST body. For example, [/w/api.php?action=query&titles=MediaWiki&format=json&maxlag=1 this link] shows metadata about the page "MediaWiki" unless the lag is greater than 1 second, while [/w/api.php?action=query&titles=MediaWiki&format=json&maxlag=-1 this one] (with -1 at the end) shows the actual lag instead of the metadata.
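As a sketch, building such a request URL in Python with the standard library (the function name is illustrative; the endpoint path and page title are just the example values above):

```python
from urllib.parse import urlencode

def build_query_url(base_url, title, maxlag):
    """Build an action API query URL that asks the server to abort
    if replica lag exceeds `maxlag` seconds."""
    params = {
        "action": "query",
        "titles": title,
        "format": "json",
        "maxlag": maxlag,  # integer number of seconds
    }
    return base_url + "?" + urlencode(params)

# Tolerate at most 1 second of replication lag:
url = build_query_url("/w/api.php", "MediaWiki", 1)
# -> /w/api.php?action=query&titles=MediaWiki&format=json&maxlag=1
```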

If the specified lag is exceeded at the time of the request, a 503 status code is returned (or 200 during API requests; see T33156), with a response body in the following format:

 Waiting for $host: $lag seconds lagged

The following HTTP headers are set:
 * Retry-After: a recommended minimum number of seconds that the client should wait before retrying
 * X-Database-Lag: The number of seconds of lag of the most lagged replica
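These two headers are enough to decide whether, and for how long, a client should wait before retrying. A minimal Python helper as a sketch (the function name and the plain dict of headers are assumptions of this sketch, not part of any library):

```python
def lag_wait_seconds(status, headers, minimum=5.0):
    """Return the number of seconds to wait before retrying a
    replication-lag error, or None if this is not a lag error.

    X-Database-Lag marks a replication-lag response, and Retry-After
    gives the server's recommended minimum wait."""
    if status not in (200, 503) or "X-Database-Lag" not in headers:
        return None  # not a replication-lag response
    return max(float(headers.get("Retry-After", minimum)), minimum)
```

A client would call this on every error response and sleep for the returned duration before retrying.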

Recommended usage for Wikimedia wikis is as follows:

 * If you receive a lag error, pause your script for at least 5 seconds before retrying. Be careful not to go into a busy loop.
 * Use maxlag=5 (5 seconds). This is an appropriate non-aggressive value, set as default value on Pywikibot. Higher values mean more aggressive behaviour, lower values are nicer.
 * It's possible that with this value, you may get a low duty cycle at times of high database load. That's OK, just let it wait for off-peak. We give humans priority at times of high load because we don't want to waste their time by rejecting their edits.
 * Unusually high or persistent lag should be reported on irc.freenode.net.
 * Interactive tasks (where a user is waiting for the result) may omit the maxlag parameter. Noninteractive tasks should always use it. See also API:Etiquette.
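The recommendations above can be sketched as a simple retry loop. Here `fetch` is a caller-supplied stand-in for whatever HTTP call the client actually makes (it returns a status, headers, and body), not a real library function:

```python
import time

def call_with_maxlag(fetch, max_retries=5, min_wait=5.0, sleep=time.sleep):
    """Retry a lagged API call, waiting at least `min_wait` seconds
    between attempts instead of busy-looping.

    `fetch` must return a (status, headers, body) tuple."""
    for _ in range(max_retries):
        status, headers, body = fetch()
        if "X-Database-Lag" not in headers:
            return body  # success, or a non-lag error for the caller
        # Honour Retry-After, but never wait less than 5 seconds.
        wait = max(float(headers.get("Retry-After", min_wait)), min_wait)
        sleep(wait)  # give the replicas time to catch up
    raise RuntimeError("replicas still lagged after %d attempts" % max_retries)
```

Giving up after a bounded number of attempts keeps the client from looping indefinitely during a long period of high load.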

Note that the caching layer (Varnish or squid) may also generate error messages with a 503 status code, due to timeout of an upstream server. Clients should treat these errors differently, because they may occur consistently when you try to perform a long-running expensive operation. Repeating the operation on timeout would use excessive server resources and may leave your client in an infinite loop. You can distinguish between cache-layer errors and MediaWiki lag conditions using any of the following:

 * The X-Database-Lag header is distinctive to replication lag errors in MediaWiki
 * Varnish errors carry no Retry-After header
 * The X-Squid-Error header should be present in squid errors
 * The response body in replication lag errors will match the lag-message format shown above
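These header-based distinctions can be sketched as a small classifier (the function name and return labels are illustrative, not part of any API):

```python
def classify_503(headers):
    """Distinguish a MediaWiki replication-lag error from a
    cache-layer (Varnish/squid) error, using the header differences
    listed above."""
    if "X-Database-Lag" in headers:
        return "replication-lag"  # MediaWiki lag error: safe to retry after waiting
    if "X-Squid-Error" in headers:
        return "cache-error"      # squid error, e.g. upstream timeout
    if "Retry-After" not in headers:
        return "cache-error"      # Varnish errors carry no Retry-After
    return "unknown"
```

Only the "replication-lag" case should be retried automatically; repeating a request that timed out in the cache layer may waste server resources.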

For testing purposes, you may intentionally make the software refuse a request by passing a negative value, such as in the following URL: [/w/api.php?action=query&titles=MediaWiki&format=json&maxlag=-1 /w/api.php?action=query&titles=MediaWiki&format=json&maxlag=-1].

The maxlag parameter is checked by the MediaWiki software itself, and also applies to the action API.