Extension:Moderation

The Moderation extension provides protection against vandalism for small and medium wikis.

This is one of the most effective anti-vandal protection methods and has very little impact on legitimate users.

Introduction

 * How does it work?
 * 1) Every edit (or image upload) by a new user is sent to a moderation queue.
 * 2) Until a moderator approves the edit, the page remains unchanged. Pending edits appear in neither the page history nor RecentChanges.
 * 3) The user can see his/her edit and continue editing his/her own version of the page.


 * How do the admins moderate?
 * 1) A new special page is provided (Special:Moderation). It's much like RecentChanges, but has "Approve", "Reject", "Approve all" and "Reject all" buttons.
 * 2) Rejected edits go into the rejected archive.
 * 3) Approved edits are applied normally.
 * 4) Logs of "who approved what" are maintained. Only the moderators can see them.
 * 5) If an edit conflict is detected and it can't be resolved automatically, the moderator has a merge button to apply the edit manually.


 * Why is it good?
 * 1) New users are not discouraged by annoying [[Special:MyLanguage/Captcha|captchas]], phone number verifications, etc. They edit normally, as they would in MediaWiki without moderation.
 * 2) Blocks become practically obsolete. And blocks are problematic: consider the chance of hitting a legitimate user with a range block, or the inability to accept good edits from an erratic user who occasionally has the urge to vandalize a page or two.
 * 3) Vandalism driven by "wanting to be noticed" is discouraged. No one will sit for 5 hours hunting for fresh proxies to make the admins angry once it's known that none of those actions cause a problem.
 * 4) Vandalism methods like "vandalizing one page from two accounts to prevent one-click rollback" are no longer effective.
 * 5) The wiki can operate in anonymous networks like Tor or I2P.

Alternatives


Does MediaWiki have other counter-vandalism methods? In brief - not really.

MediaWiki was developed for Wikipedia. At any given time, Wikipedia has hundreds of volunteers willing to revert vandalism in real time. Almost no other wiki has that advantage. MediaWiki's built-in counter-vandalism idea is that vandalizing takes more time than reverting it. Normally that's true, but it does a poor job of discouraging vandalism, and the admins still have to check for vandalism often, even if the reverting itself doesn't take much of their time.

There are three known methods of fighting vandalism:


 * 1) Make all edits hard. For example, Lurkmore.to imposes a strong captcha on all edits from new users, and it takes a lot of edits to finally be able to edit without the captcha. Therefore a vandal has to spend a lot of time to make a handful of edits.
 * The obvious minus is that all legitimate users have to solve the captcha as well, which can discourage minor edits like spelling fixes.
 * 2) Enforce user identification - for example, login via Facebook. If the social network verifies that all its users have a valid mobile phone number, then each vandalism attempt requires the vandal to go to a shop and buy a new SIM card. This method is extremely effective, but it eliminates anonymous editing and turns away users who don't have an account in any supported social network.
 * A strong minus of this method is the impact on users' privacy. In non-democratic countries, editing a page on politics can result in the government trying to identify and persecute the user. For example, Lurkmore.to was contacted by the Russian [[wikipedia:ru:Главное управление по противодействию экстремизму МВД России|anti-extremist special force]] with demands to disclose information about the authors of the pages about Ramzan Kadyrov and the Molotov cocktail.
 * 3) Mitigate the results of vandalism. For example, a user can create 100 pages with offensive titles, but they can all be deleted with two clicks in [[Special:MyLanguage/Extension:Nuke|Extension:Nuke]]. The Moderation extension belongs to this category.

Is this extension stable?
This extension is stable. It has been deployed in production on Russian Uncyclopedia (absurdopedia.net) since November 2014.

The extension has an [[Manual:PHP unit testing/Writing unit tests for extensions|automated testsuite]] with significant coverage ([https://github.com/edwardspec/mediawiki-moderation/blob/master/README.testsuite phpunit] and [https://github.com/edwardspec/mediawiki-moderation/blob/master/tests/selenium/README Selenium]). Every change to Moderation is automatically tested on:
 * 1) the newest version of MediaWiki,
 * 2) MediaWiki 1.27 (LTS).

Please read the files KNOWN_LIMITATIONS, [https://github.com/edwardspec/mediawiki-moderation/blob/master/TODO TODO] and WONT_DO for all known issues. Feel free to [[Extension talk:Moderation|contact the author]] if you have any questions.

What's the difference from FlaggedRevs or Approved Revs?
Extension:FlaggedRevs and Extension:ApprovedRevs hide the bad revisions only from readers. The vandal edits will still exist in the history and RecentChanges, and all editors will stumble upon them when they try to edit a vandalized page. Therefore administrators still have to revert vandalism quickly.

On the other hand, Moderation completely eliminates vandal edits: non-approved revisions are simply not created in page history, etc. This ensures that not only readers, but also other editors won't see the vandal edits in any of the pages.

In short, (1) FlaggedRevs is for quality control but doesn't help against persistent vandalism. (2) Moderation is specifically against vandalism and renders it completely ineffective.

Parameters for LocalSettings.php

 * $wgModerationEnable: If set to false, new edits are applied as usual (not sent to moderation). Default: true.
 * $wgModerationTimeToOverrideRejection: Time (in seconds) after which a rejected edit can no longer be approved. Default: 2 weeks. Note: old rejected edits are NOT deleted (moderators can always look at them in the Rejected folder even after this time has elapsed).
 * $wgModerationOnlyInNamespaces: If set to an array of namespace numbers, moderation is only enabled in these namespaces (edits in other namespaces bypass moderation). Default (empty array): moderation is enabled everywhere.
 * $wgModerationIgnoredInNamespaces: If set to an array of namespace numbers, non-automoderated users can bypass moderation in these namespaces. Default (empty array): moderation can't be bypassed anywhere.
 * $wgModerationNotificationEnable: If true, a notification email is sent to $wgModerationEmail each time an edit is queued for moderation. Default: false.
 * $wgModerationNotificationNewOnly: If true, only notify about new pages (not about edits to existing pages). Default: false.
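Taken together, the settings above might look like this in LocalSettings.php. This is a sketch, not copied from the extension's documentation: the namespace choices and the email address are placeholders, and the wfLoadExtension() line assumes the extension is registered via extension.json.

```php
// Load the extension (assumes extension.json registration; older installs
// may instead need require_once "$IP/extensions/Moderation/Moderation.php").
wfLoadExtension( 'Moderation' );

$wgModerationEnable = true;                            // default
$wgModerationTimeToOverrideRejection = 14 * 24 * 3600; // 2 weeks (default)

// Placeholder example: moderate only articles and the Project namespace.
$wgModerationOnlyInNamespaces = [ NS_MAIN, NS_PROJECT ];

// Placeholder example: let non-automoderated users bypass moderation
// on talk pages (uncomment to use).
# $wgModerationIgnoredInNamespaces = [ NS_TALK, NS_USER_TALK ];

// Email a moderator whenever something enters the queue.
$wgModerationNotificationEnable = true;
$wgModerationEmail = 'moderator@example.com';  // placeholder address
$wgModerationNotificationNewOnly = false;      // notify about edits too
```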

Additional anti-vandalism tips
In order to prevent vandalism, the following additional measures should be applied:


 * 1) Please [[Special:MyLanguage/Manual:User rights|restrict]] the [[Special:MyLanguage/Help:Moving a page|renaming (moving)]] of pages to a trusted group (not just "automoderated"), because it can be used for difficult-to-revert vandalism.
 * 2) Registering new accounts with offensive names is still a way for a vandal to show up in RecentChanges. A simple solution is to remove the newusers log from RecentChanges.
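Both tips can be expressed in LocalSettings.php with standard MediaWiki settings ($wgGroupPermissions, $wgLogRestrictions). The sketch below makes assumptions: the chosen groups are examples, and the 'moderation' right name is a guess at the right defined by this extension, so check the rights actually available on your wiki.

```php
// Tip 1: restrict page moves (renames) to sysops.
// By default, MediaWiki grants the 'move' right to all registered users.
$wgGroupPermissions['user']['move'] = false;
$wgGroupPermissions['user']['move-subpages'] = false;
$wgGroupPermissions['sysop']['move'] = true;

// Tip 2: hide the account-creation log from RecentChanges by restricting
// it to holders of a given right (restricted log types are not shown in
// RecentChanges). The right name 'moderation' is an assumption.
$wgLogRestrictions['newusers'] = 'moderation';
```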

Recommended use / good practices
The following good-practices are advised:


 * 1) Only vandalism should be rejected. Not-so-good edits made with good intentions (e.g. adding excessive plot details to a Wikipedia article about a film) are better approved and then reverted as usual. This way the author is not offended and the text is preserved in the page history, viewable by anyone.
 * 2) Any user who is deemed legitimate (has made N good edits) should be granted the "automoderated" flag.
 * 3) Granting the "automoderated" flag automatically (e.g. by edit count) is NOT recommended, as it motivates vandals to make many very minor edits (e.g. adding interwiki links). Better to grant the flag manually for one good edit than to grant it for 30 useless edits made for the count.
 * 4) Abstain from using [[Special:MyLanguage/Manual:Block and unblock|blocks]]. Don't [[Special:MyLanguage/Help:Protected pages|protect]] pages "just in case", except maybe for important templates.
 * 5) Allow the full rehabilitation of users with a bad editing history. Their useful edits to articles should be approved, no matter how many times they were blocked. At the same time, trolling on talk pages should be rejected, as should purposely low-quality edits.

Compatibility with other extensions

 * 1) Extension:Moderation should be enabled last in LocalSettings.php, because it aborts at least the [[Special:MyLanguage/Manual:Hooks/PageContentSave|PageContentSave]] hook.
 * 2) Extension:Moderation fully supports Extension:CheckUser, meaning that if the CheckUser extension is enabled, any approved edit will have the correct IP, user agent and XFF saved in the checkuser tables.
 * 3) Extension:Moderation is fully compatible with [[Special:MyLanguage/Extension:VisualEditor|Extension:VisualEditor]] and [[Special:MyLanguage/Extension:MobileFrontend|Extension:MobileFrontend]]. Theoretically it should also work with other API-based editors.
 * 4) [[Special:MyLanguage/Extension:Flow|Extension:Flow]] will work, but edits in Flow forums will bypass moderation.
 * Moderation of Flow forums should be implemented in Extension:Flow itself. These forums use a non-text [[Special:MyLanguage/content model|content model]], which is not supported by Moderation.
 * 5) (a) Upload extensions like [[Special:MyLanguage/Extension:MultiUpload|Extension:MultiUpload]] and (b) uploads via the API are only supported in MediaWiki 1.28+.
 * This can't be implemented for MediaWiki 1.27, because it doesn't have the [[Special:MyLanguage/Manual:Hooks/UploadVerifyUpload|UploadVerifyUpload]] hook.
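The load-order requirement from item 1 can be sketched as follows. The other extensions shown are illustrative examples, not requirements; the point is only that Moderation comes after everything else, so its hook handlers run after those of other extensions.

```php
// All other extensions first...
wfLoadExtension( 'CheckUser' );
wfLoadExtension( 'VisualEditor' );
wfLoadExtension( 'MobileFrontend' );

// ...and Moderation last, so that its PageContentSave handler
// (which can abort the save) is registered after everyone else's.
wfLoadExtension( 'Moderation' );
```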