Manual talk:Edit throttling/LQT Archive 1

Moved from the article core by Evan; moved to the talk page by w:User:Chinasaur.

Leaky bucket implementation
Rather than an explicit limit, why not give each user/IP a leaky bucket, initialized to allow 10 edits when that user's first edit happens? You could also have several buckets: one to limit edits per hour to 100, as well as one limiting edits per minute to e.g. 10. Pakaran 18:06, 6 Feb 2004 (UTC)

For those unfamiliar with the concept, the idea is that each edit requires a "token." Tokens not yet used are stored in a "bucket" (in practice an integer variable), which "overflows" and wastes tokens when it reaches a specific value; tokens are added every N seconds. This is a common way to control excessive load in routers and the like. We could have one bucket for a short-term limit, for example 12 edits in 30 seconds, with one token added every 5 seconds and a cap of 6 in the bucket (checking the math: 6 stored plus 6 refilled over 30 seconds gives 12), and another bucket for a long-term limit, e.g. per hour. Pakaran 18:10, 6 Feb 2004 (UTC)
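For illustration, here is a minimal sketch of the token-bucket scheme described above, in Python. The class, parameter values, and function names are mine for the sake of the example; this is not anything from MediaWiki itself.

    import time

    class TokenBucket:
        """One token per edit; the bucket refills at a fixed rate up to a cap,
        and any tokens beyond the cap overflow and are wasted."""

        def __init__(self, capacity, refill_seconds):
            self.capacity = capacity              # bucket size; overflow beyond this is wasted
            self.refill_seconds = refill_seconds  # one token added every N seconds
            self.tokens = capacity                # start full, so a new user gets a burst allowance
            self.last_refill = time.monotonic()

        def allow_edit(self):
            """Consume a token if one is available; otherwise reject the edit."""
            now = time.monotonic()
            refilled = int((now - self.last_refill) / self.refill_seconds)
            if refilled > 0:
                self.tokens = min(self.capacity, self.tokens + refilled)
                self.last_refill += refilled * self.refill_seconds
            if self.tokens > 0:
                self.tokens -= 1
                return True
            return False

    # Two buckets per user/IP, as proposed: both must permit the edit.
    short_term = TokenBucket(capacity=6, refill_seconds=5)    # 6 stored + 6 refilled = 12 edits in 30 s
    long_term = TokenBucket(capacity=10, refill_seconds=36)   # ~100 edits/hour sustained

    def edit_allowed(buckets):
        # Charge every bucket (no short-circuit), then require all to have had a token.
        results = [bucket.allow_edit() for bucket in buckets]
        return all(results)

Keeping one short-term and one long-term bucket per user/IP allows brief bursts while still capping the sustained rate.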

See SurgeProtector. Known technique.


 * Uh... that page seems to be about all kinds of protection against "surges", not edit floods in particular, and about different techniques for doing that. Ward's wiki seems to have some kind of edit-flooding protection based on a percentage of recent changes, which is pretty interesting. --Evan 03:40, 7 Feb 2004 (UTC)

Recent changes protection
Note also that for most Wikipedians, Special:Recentchanges is still an important way of keeping track of what is going on at the moment. (When a wiki becomes very active, it becomes impossible to keep up with the activity from Recentchanges, and people rely more on their Watchlist, as I observe.) An edit throttle would keep a newbie from cluttering Recentchanges by making too many edits to the same page in a short time, and would encourage using preview or writing at length in one edit instead.

An edit throttle is, for this reason, one of the strongly desired features among some Japanese Wikipedians. Tomos 03:27, 20 May 2004 (UTC)

Proxied robots?
But by using proxy servers, wouldn't it be possible for a bot to randomize, or at least distribute, its attacking IPs and thus effectively bypass the throttle? --w:User:Chinasaur


 * Sure, but at this level of vandalism you are certainly looking at a systematic effort for commercial gain (or at the very least ideological fervor). To gain access to this range of IPs, one either needs to control a network of trojaned proxy servers or have upstream access to something like the MIT internet link (i.e. administrator access to a class A network). This means the vandal is adding the same (or similar) text to all his edits; no one goes to this trouble just to put dirty words on a lot of pages. MediaWiki already has a feature that allows certain URLs (or arbitrary character strings?) to be added to a list of banned edits.
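Sketched in Python, that kind of banned-edit check amounts to matching the submitted text against a pattern list. The patterns and names below are hypothetical, and this is not MediaWiki's actual code.

    import re

    # Hypothetical blocklist; a real deployment would load this from a maintained list.
    BANNED_PATTERNS = [
        re.compile(r"http://pills\.example\.com", re.IGNORECASE),  # a spam URL
        re.compile(r"buy cheap v[i1]agra", re.IGNORECASE),         # a spam phrase
    ]

    def edit_is_banned(new_text):
        """Reject an edit whose text matches any banned pattern."""
        return any(pattern.search(new_text) for pattern in BANNED_PATTERNS)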

"Turing tests" for all edits?
Adding "Turing tests" (e.g. pictures with garbled text that the user must type) as a requirement for all edits should stop all robots except very sophisticated ones and be of little inconvenience to users, with the exception of blocking out blind users. Blind users may however be handled by allowing slow registered editing without tests.


 * This would be a great feature to have, especially for anonymous users. I've been getting frustrated by spam on my wiki lately. Thanks!