Thread:Project:Support desk/Mediawiki Hanging with no errors./reply

The on-request job queue in 1.22 is, simply put, a failure.

> Setting $wgJobRunRate to 0 or 0.01 slows the growth, but 1 causes it to grow by the thousands

I don't understand this. What is growing here: refreshLinks jobs or php-cgi processes? Setting $wgJobRunRate to 0 should eliminate the problem of spawning a new PHP process for each request. Of course, that would make the number of queued jobs grow as you edit your pages. The solution would be to run runJobs.php manually (e.g. from cron), or to set $wgPhpCli to false and $wgJobRunRate to a low value (0.1, 0.01).
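The two options above can be sketched in LocalSettings.php roughly like this (the cron schedule and wiki path are just placeholders for your own setup):

```php
<?php
// LocalSettings.php — sketch of the two approaches above.

// Option A: never run jobs on page requests; drain the queue
// yourself from cron instead, e.g.:
//   */5 * * * * php /path/to/wiki/maintenance/runJobs.php --maxjobs 100
$wgJobRunRate = 0;

// Option B: keep a low run rate, but set $wgPhpCli to false so
// MediaWiki executes jobs within the request itself instead of
// spawning a separate php-cgi process for each one.
$wgPhpCli = false;
$wgJobRunRate = 0.01; // on average, one job per 100 requests
```

Pick one approach or the other; with Option A the queue only shrinks when cron fires, so size the --maxjobs value to how fast your wiki accumulates edits.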

Googlebot is known to bring down entire sites when crawling at a gazillion pages per second. And your setup (running PHP as CGI) is not helping, because CGI is slower than running PHP as an Apache module. Be sure to have caching enabled; the file cache in particular may help.
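Enabling the file cache is a couple of lines in LocalSettings.php; the directory path here is just an example and must be writable by the web server:

```php
<?php
// LocalSettings.php — serve rendered pages to anonymous visitors
// (including crawlers) from static files instead of re-parsing
// every page on each hit.
$wgUseFileCache = true;
$wgFileCacheDirectory = "$IP/cache"; // must be writable by the web server
$wgShowIPinHeader = false;           // required: no per-user output in cached pages
```

With this in place, repeat hits from a crawler mostly cost a file read rather than a full PHP render, which matters even more under CGI.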

Apparently you can tell Google to slow down the crawl rate for your site: https://support.google.com/webmasters/answer/48620