Extension:Moderation/cs

The Moderation extension provides anti-vandalism protection for small and medium-sized wikis.

This is one of the most effective anti-vandal protection methods and has very little impact on legitimate users.

Introduction

 * How does it work?
 * 1) Every edit (or image upload) by a new user is sent to a moderation queue.
 * 2) Until a moderator approves the edit, the page remains unchanged. Pending edits appear neither in the page history nor in RecentChanges.
 * 3) The user can see their edit and continue editing their own version of the page.

 * How do the admins moderate?
 * 1) A new special page is provided (Special:Moderation). It's much like RecentChanges, but has "Approve", "Reject", "Approve all" and "Reject all" buttons.
 * 2) Rejected edits go into the rejected archive.
 * 3) Approved edits are applied normally.
 * 4) Logs of "who approved what" are maintained. Only the moderators can see them.
 * 5) If an edit conflict is detected and it can't be resolved automatically, the moderator has a merge button to apply the edit manually.

 * Why is it good?
 * 1) New users are not discouraged by annoying captchas, phone number verifications, etc. They edit normally, like they would in MediaWiki without moderation.
 * 2) Blocks become practically obsolete. And blocks are not good: consider the chance of hitting a legitimate user with a range block, or the inability to allow good edits from a not-very-adequate user who sometimes has the urge to vandalize a page or two.
 * 3) Vandalism out of "wanting to be noticed" is discouraged. No one will sit for 5 hours hunting for fresh proxies to make an admin angry when it's known that none of those actions cause any trouble.
 * 4) Vandalism methods like "vandalizing one page from two accounts to prevent one-click rollback" are no longer effective.
 * 5) The website can operate in anonymous networks like Tor or I2P.
 * 6) Users can hide their mistakes from the revision history, and even from moderators, by fixing them in time. Since an edit is only permanently recorded upon approval, users can also correct botched edit summaries.

Alternatives
Does MediaWiki have other counter-vandalism methods? In brief, not really.

MediaWiki was developed for Wikipedia. At any given time, Wikipedia has hundreds of volunteers willing to revert vandalism in real time. Almost no other wiki has that kind of advantage. MediaWiki's built-in counter-vandalism idea is that vandalizing takes more time than reverting it. Normally that's true, but it does a poor job of discouraging vandalism, and the admins still have to check for vandalism often, even if the reverting itself doesn't take much of their time.

There are three known methods of fighting vandalism:

 * 1) Make all edits hard. For example, Lurkmore.to imposes a strong captcha on all edits from new users, and it takes a lot of edits to finally be able to edit without the captcha; a vandal therefore has to spend a lot of time to make a handful of edits. The obvious minus is that all legitimate users have to bypass the captcha as well, which can discourage minor edits like spelling fixes.
 * 2) Enforce user identification - for example, login via Facebook. If the social network verifies that all its users have a valid mobile phone number, then each vandalism attempt requires the vandal to go to the shop and buy a new SIM card. This method is extremely effective, though it eliminates anonymous editing and turns away users who don't have an account in any supported social network. A strong minus of this method is the impact on users' privacy: in non-democratic countries, editing a page on politics can result in the government trying to identify and persecute the user. For example, Lurkmore.to was contacted by the Russian "anti-extremist special force" with demands to disclose information about the authors of the pages about Ramzan Kadyrov and the Molotov cocktail.
 * 3) Mitigate the results of vandalism. For example, a user can create 100 pages with offensive titles, but they can all be deleted in two clicks. The Moderation extension belongs to this category.

Is this extension stable?
This extension is stable. It has been deployed in production on Russian Uncyclopedia (absurdopedia.net) since November 2014.

The extension has an automated testsuite with significant coverage (phpunit and Selenium). Every change to Moderation is automatically tested on:


 * 1) the newest version of MediaWiki
 * 2) MediaWiki 1.35 (LTS)
 * 3) MediaWiki 1.31 (legacy LTS)

Please read the files KNOWN_LIMITATIONS, TODO and WONT_DO for all known issues. Feel free to contact the author if you have any questions.

What's the difference from FlaggedRevs or Approved Revs?
FlaggedRevs and Approved Revs don't prevent bad revisions from being created; they only hide them from readers. The vandal edits still exist in the page history and in RecentChanges, and every editor will stumble upon them when trying to edit a page that was vandalized. Therefore editors still have to revert vandalism quickly.

On the other hand, Moderation completely eliminates vandal edits: non-approved revisions are simply not created in page history, etc. This ensures that not only readers, but also other editors won't see the vandal edits in any of the pages.

In short, (1) FlaggedRevs is for quality control but doesn't help against persistent vandalism. (2) Moderation is specifically against vandalism and renders it completely ineffective.

Installation
For modern versions of MediaWiki (1.35+), use the following instructions:
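The original command listing was not captured here; a typical installation sketch (assuming the extension's GitHub repository, edwardspec/mediawiki-moderation) looks roughly like this:

```php
// 1) In a shell, from the wiki's extensions/ directory:
//      git clone https://github.com/edwardspec/mediawiki-moderation.git Moderation
// 2) Then add this line to LocalSettings.php:
wfLoadExtension( 'Moderation' );
// 3) Finally, run the database updater so the extension's tables are created:
//      php maintenance/update.php
```

After this, Special:Moderation becomes available to users in the moderator group.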

Installation for older versions of MediaWiki
For MediaWiki 1.31-1.34, replace the above-mentioned "git clone" command with the following:

For MediaWiki 1.27-1.30, replace the above-mentioned "git clone" command with the following:

For MediaWiki 1.23-1.26, replace the above-mentioned "git clone" command with the following:

These versions may still receive security fixes (if any), but not new features.

Parameters for LocalSettings.php

 * $wgModerationEnable: If set to false, new edits are applied as usual (not sent to moderation). Default: true.
 * $wgModerationTimeToOverrideRejection: Time (in seconds) after which a rejected edit can no longer be approved. Default: 2 weeks. Note: old rejected edits are NOT deleted; moderators can still see them in the Rejected folder even after this time has elapsed.
 * $wgModerationOnlyInNamespaces: If set to an array of namespace numbers (e.g. [ 0, 6 ] for the main and File namespaces), moderation is only enabled in these namespaces (edits in other namespaces bypass moderation). Default (empty array): moderation is enabled everywhere.
 * $wgModerationIgnoredInNamespaces: If set to an array of namespace numbers, non-automoderated users can bypass moderation in these namespaces. Default (empty array): moderation can't be bypassed anywhere.
 * $wgModerationNotificationEnable: If true, a notification email is sent to $wgModerationEmail each time an edit is queued for moderation. Default: false.
 * $wgModerationNotificationNewOnly: If true, only notify about new pages (not about edits to existing pages). Default: false.
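Taken together, a LocalSettings.php fragment using these options might look like this (the namespace numbers and email address are placeholders to adapt to your wiki):

```php
// Send edits by new users to the moderation queue (default behavior).
$wgModerationEnable = true;

// Rejected edits can be approved for up to 2 weeks (value is in seconds).
$wgModerationTimeToOverrideRejection = 14 * 24 * 3600;

// Only moderate the main (0) and File (6) namespaces.
$wgModerationOnlyInNamespaces = [ 0, 6 ];

// Email the moderators whenever an edit is queued.
$wgModerationNotificationEnable = true;
$wgModerationEmail = 'moderators@example.com'; // placeholder address
```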

See also: #Configuration options ONLY for pre-publish review (options not recommended for 95% of wikis).

Additional anti-vandalism tips
In order to prevent vandalism, the following additional measures should be applied:


 * 1) Restrict the renaming (moving) of pages to a trusted group (not just "automoderated"), because page moves can be used for difficult-to-revert vandalism.
 * 2) Registering new accounts with offensive names is still a way for a vandal to make themselves seen in RecentChanges. A simple solution is to remove the newusers log from RecentChanges:
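Both measures above are plain MediaWiki configuration; a sketch for LocalSettings.php (the group name "trusted" is a placeholder, and $wgLogRestrictions hides a log type from RecentChanges for anyone without the named right):

```php
// Tip 1: only a trusted group may rename (move) pages.
// "trusted" is a placeholder group name; choose or create your own.
$wgGroupPermissions['user']['move'] = false;
$wgGroupPermissions['trusted']['move'] = true;

// Tip 2: hide the "newusers" log from RecentChanges.
// Its entries become visible only to users with the "moderation" right.
$wgLogRestrictions['newusers'] = 'moderation';
```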

Recommended use / good practices
The following good practices are advised:

 * 1) Only vandalism should be Rejected. Not-so-good edits made with good intentions (e.g. adding excessive plot details to the wiki's article about a movie) are better Approved and then reverted as usual, with a reason in the edit summary. This way the author is not offended, and the text is preserved in the page history, viewable by anyone for transparency and editor accountability.
 * 2) Any user deemed legitimate (one who makes N good edits) should be added to the automoderated group.
 * 3) Adding users to the automoderated group automatically (e.g. by edit count) is NOT recommended, as it motivates vandals to make many very minor edits (e.g. adding interwikis). Better to promote them manually for one good edit than for 30 useless edits made for the count.
 * 4) Abstain from protecting pages "just in case", except maybe for important templates.
 * 5) Allow the full rehabilitation of users with a bad editing history. Their useful edits to articles should be allowed, no matter how many times they were blocked. At the same time, trolling on talk pages should be rejected, as should purposely low-quality edits.
 * 6) Note that an editor who appears to resubmit a rejected edit does not necessarily intend to edit-war: they might have changed their pending edit without noticing that it had been rejected in the meantime.
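Promotion to the automoderated group is done manually via Special:UserRights. If your wiki's administrators can't already assign that group, a sketch using MediaWiki's standard $wgAddGroups/$wgRemoveGroups settings:

```php
// Let sysops manually add and remove the "automoderated" group
// through Special:UserRights.
$wgAddGroups['sysop'][] = 'automoderated';
$wgRemoveGroups['sysop'][] = 'automoderated';
```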

Non-recommended use: Moderation as pre-publish review extension
Moderation is first and foremost an anti-vandalism tool, but some wikis use it for quality control. For example, a wiki of scientific works might choose to:


 * 1) Not Approve any edits until they meet the strict quality standards of the field.
 * 2) Not Reject edits that are not yet good enough, so that the author can continue improving them for as long as necessary.

Pros of this approach:


 * 1) A new page appears as a fully reviewed, correctly formatted document with no typos, etc.
 * 2) No one except the author and the moderators sees the imperfect revisions.

Cons:


 * 1) Other users can't improve the article until it is Approved. In fact, they won't even know that it exists.
 * 2) Pending changes don't have an "edit history": Moderation stores only one pending change per Page/User pair. That's inconvenient if you are preparing a page for publication for weeks. A user can even accidentally delete necessary text from their pending revision, and it won't be recoverable.

Configuration options ONLY for pre-publish review
The following parameters are only needed when using Moderation for pre-publish review. They are not recommended for 95% of wikis; when following the best practices above, they are not needed at all.

 * $wgModerationPreviewLink: If true, a Preview link is shown on Special:Moderation. Default: false. Why not recommended? When following the best practices, you would never Reject a good change just because it is formatted poorly. Whether an edit is good, you can tell from the "diff" link; the "Preview" link only tells you how the page is formatted, which shouldn't affect your decision.
 * $wgModerationEnableEditChange: If true, moderators can modify the text of pending changes before Approving them. Default: false. Why not recommended? It's easy to mess up: a moderator can accidentally delete the text of a pending edit (and it won't be recoverable). Furthermore, these changes are not attributed to the moderator; after approval, it looks as if the original author made the edit this way, which is creepy.
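If a wiki nevertheless opts into pre-publish review, both options are plain booleans in LocalSettings.php:

```php
// Show a "Preview" link next to each pending edit on Special:Moderation.
$wgModerationPreviewLink = true;

// Allow moderators to edit pending changes before approving them.
// Caution: such changes are not attributed to the moderator.
$wgModerationEnableEditChange = true;
```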

Compatibility with other extensions
 * 1) Extension:Moderation should be enabled last in LocalSettings.php, because it intercepts and aborts the hook that saves edits.
 * 2) Extension:Moderation fully supports Extension:CheckUser, meaning that if the CheckUser extension is enabled, any approved edit will have the correct IP, user agent and XFF saved in the checkuser tables.
 * 3) Extension:Moderation is fully compatible with Extension:VisualEditor and Extension:MobileFrontend. Theoretically it should also work with other API-based editors (currently very few extensions provide one).
 * 4) Extension:StructuredDiscussions (also known as Flow) and Extension:CommentStreams will work, but edits in Flow/CommentStreams forums will bypass moderation.
 * Moderation of Flow forums should be implemented in Extension:StructuredDiscussions itself. These forums use a non-text "content model", which is not supported by Moderation.
 * The CommentStreams extension misinterprets "edit was queued for moderation" as an error, which can only be fixed in Extension:CommentStreams itself.
 * 5) Extensions that modify several slots of Multi-Content Revisions (not just the main slot, as MediaWiki itself does) are not yet supported.
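The load-order requirement above can be sketched in LocalSettings.php (the extension names other than Moderation are just examples):

```php
// Load other extensions first...
wfLoadExtension( 'VisualEditor' );
wfLoadExtension( 'CheckUser' );

// ...and Moderation last, so that the hook with which it
// intercepts edit saving is registered after theirs.
wfLoadExtension( 'Moderation' );
```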