User:Krinkle/Patrolling

Preamble
The only initiatives backed by the Wikimedia Foundation are about prevention and restriction (AbuseFilter and FlaggedRevs). And though patrolling is part of MediaWiki core, it has no workable interface and is actually disabled on the biggest wikis for bogus reasons.

Initiatives
We currently rely completely on community-driven initiatives. To name a few:


 * Cobi's ClueBot: Seems like a candidate to integrate into the cluster and perhaps expose via AbuseFilter to prevent an edit if it scores above a certain threshold – instead of running on Labs and reverting edits milliseconds after they are saved.


 * STiki, Huggle and the like: Standalone programs that need to be installed on a computer. They rely on irc.wikimedia.org. They naturally don't integrate into any usable on-wiki workflow and are not accessible from the web (though Huggle is working on a web-app version, it would still be standalone, not integrated).


 * RTRC, LiveRC and the like: Gadgets that implement an interface for the core patrolling feature. They are basically an enhanced version of Special:RecentChanges, featuring a live-reloading queue of edits and/or page creations, various filters (e.g. only show edits by anonymous users between 3 and 4 PM), an inline preview of each edit, and the ability to mark it as patrolled.


 * CVN's database, IRC bot channels and SWMT: Similar to tools like Huggle and RTRC, except the feed of "potentially interesting events" is output via IRC instead of through a web interface or standalone application. One interesting aspect is that the CVN has a public API to its database, which contains a shared watchlist for all patrollers (both per-wiki and global) and a blacklist of users and IP addresses whose activity should be watched closely. Most items on the blacklist are maintained automatically by SWMTBot: whenever a user is blocked on a wiki, the user is blacklisted in CVN so that their activity on other wikis is highlighted. To my knowledge this has been the most valuable system (and also the only system) for catching cross-wiki vandalism and repeat offenders. The blacklist duration is typically double the duration of the block. That way, if a vandal is blocked on, say, nl.wikipedia.org and during that block moves on to de.wikipedia.org or Commons, the CVN patrollers will pay extra attention to his edits. This is especially useful for more subtle vandalism that took a while to catch on one wiki and will now be caught immediately on the second wiki, thanks to the CVN database through which sysops from different projects and languages collaborate.
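The ClueBot idea above – scoring an edit before it is saved instead of reverting it afterwards – could be sketched roughly as follows. This is a toy illustration, not ClueBot's actual classifier or AbuseFilter's real hook; the function names, heuristic and threshold are all made up:

```python
# Hypothetical sketch: gate an edit on a vandalism score before save,
# instead of reverting it milliseconds afterwards. All names invented.

BLOCK_THRESHOLD = 0.95  # assumed score above which the edit is prevented


def score_edit(old_text: str, new_text: str) -> float:
    """Stand-in for ClueBot's classifier; here a trivial word heuristic."""
    added = set(new_text.split()) - set(old_text.split())
    bad_words = {"spam", "vandal"}
    if not added:
        return 0.0
    return len(added & bad_words) / len(added)


def should_prevent(old_text: str, new_text: str) -> bool:
    """Return True if the edit should be disallowed, AbuseFilter-style."""
    return score_edit(old_text, new_text) > BLOCK_THRESHOLD


print(should_prevent("a fine article", "buy spam now"))  # False (score 1/3)
print(should_prevent("a fine article", "spam"))          # True (score 1.0)
```

The real decision function would of course be ClueBot's trained model; the point is only that the same score that currently triggers a revert could instead feed a pre-save filter.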
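The SWMTBot blacklisting rule described above (blacklist for double the block's duration, highlight the user's edits on every wiki until it expires) is simple enough to sketch. The data structures here are hypothetical; CVN's real schema differs:

```python
from datetime import datetime, timedelta

# Sketch of the CVN auto-blacklist rule: when a user is blocked on one
# wiki, blacklist them globally for twice the block duration so their
# edits elsewhere are highlighted. Hypothetical, not CVN's actual code.

blacklist: dict[str, datetime] = {}


def on_block(user: str, blocked_at: datetime, block_duration: timedelta) -> None:
    """SWMTBot-style handler: blacklist until double the block duration."""
    blacklist[user] = blocked_at + 2 * block_duration


def is_highlighted(user: str, now: datetime) -> bool:
    """Should this user's edits on *any* wiki be flagged for patrollers?"""
    expiry = blacklist.get(user)
    return expiry is not None and now < expiry


# A 3-day block starting 1 May keeps the user blacklisted until 7 May.
on_block("Vandal1", datetime(2012, 5, 1), timedelta(days=3))
print(is_highlighted("Vandal1", datetime(2012, 5, 4)))  # True
print(is_highlighted("Vandal1", datetime(2012, 5, 8)))  # False
```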

And all of these (except for RTRC) either have no way of keeping track of what has already been reviewed, or they implement their own database to track which edits have been reviewed. They should be using MediaWiki's rc_patrolled flag, which would let users pick any tool while still working off the same shared list – instead of duplicating efforts. It would also make the information available to the wiki interface, statistics, other extensions, etc.
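Setting rc_patrolled is already exposed through the MediaWiki API's action=patrol module, so tools would not even need direct database access. As a minimal sketch, here is how such a request could be constructed; only the parameter dict is built (fetching the patrol token and actually POSTing to api.php are left out):

```python
# Sketch: instead of a private per-tool database, tools could mark edits
# reviewed via MediaWiki's action=patrol API module, which sets the
# rc_patrolled flag. Only the request parameters are constructed here.


def build_patrol_request(rcid: int, token: str) -> dict:
    """Parameters for a POST to /w/api.php marking one recentchanges row patrolled."""
    return {
        "action": "patrol",
        "rcid": str(rcid),
        "token": token,  # patrol token, obtained separately from the API
        "format": "json",
    }


req = build_patrol_request(123456789, "abc+\\")
print(req["action"], req["rcid"])  # patrol 123456789
```

Any tool issuing this call contributes to the same queue every other tool (and Special:RecentChanges itself) sees.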

Workflow
Here is a case study of the review workflow on Dutch Wikipedia:

Proposal
Officially start focusing on counter-vandalism and the reviewing of contributions. Create a team within Features (or extend the EE team) to work on the review-related infrastructure.

Projects

 * Web-compatible changes feed: Implement a modern service for listening to recent changes events, with machine-readable (preferably JSON) information about each event (both recent changes and log events). Keep irc.wikimedia.org for backwards compatibility, possibly re-implemented as a listener on this very feed. See also RFC/Structured data push notification support for recent changes.
 * Extension:ActivityMonitor: An extension inspired by RTRC. The special page would open a socket to the changes feed and start populating the queue based on the given filters.
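The two projects fit together: ActivityMonitor consumes the JSON feed and filters it into a queue. A rough sketch, assuming a line-delimited JSON event shape – the field names here are invented for illustration (the RFC would define the real schema), and the example filter is the one mentioned earlier (anonymous editors between 3 and 4 PM):

```python
import json
from datetime import datetime

# Sketch of an ActivityMonitor-style consumer: parse JSON change events
# from the proposed feed and keep only those matching the user's filters.
# Event field names are invented; the RFC would define the real schema.


def matches_filters(event: dict) -> bool:
    """Example filter: edits by anonymous users between 15:00 and 16:00."""
    ts = datetime.fromisoformat(event["timestamp"])
    return event.get("anon", False) and 15 <= ts.hour < 16


def consume(lines):
    """Yield events from a line-delimited JSON feed that pass the filters."""
    for line in lines:
        event = json.loads(line)
        if matches_filters(event):
            yield event


feed = [
    '{"type": "edit", "user": "203.0.113.5", "anon": true, "timestamp": "2012-05-01T15:30:00"}',
    '{"type": "edit", "user": "Example", "anon": false, "timestamp": "2012-05-01T15:45:00"}',
]
queue = list(consume(feed))
print(len(queue))  # 1 – only the anonymous edit passes
```

In the extension the `feed` list would of course be a live socket, and the matched events would populate the special page's queue.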