Moderator Tools/December update


Since July, the team has been working on defining what we mean by ‘content moderation’, chatting with administrators from across Wikimedia projects, and selecting the communities we want to focus our user research on. We’re just about to start focused user interviews on the Tamil and Ukrainian Wikipedias, and wanted to provide a quick update on our progress so far. Samwalton9 (WMF) (talk) 10:17, 2 December 2021 (UTC)

Defining content moderation

As a brand-new Product team, we first needed to define our scope - what kinds of processes and use cases will we research and seek to improve, and which will we leave to other WMF teams?

We have defined ‘content moderation’ as processes which affect or regulate the content on a Wikimedia project, but aren't themselves direct contributions to content or actions taken against specific user(s).

This is a high-level definition, primarily intended to constrain us to thinking about processes which relate to actions taken on content rather than on users. In this sense, processes like page protection, deletion, reporting, and recent changes patrolling are all in scope, but user reporting and blocking aren’t. We’re also not focusing on processes which add new content to projects, like adding citations to unreferenced content.

Lessons so far

We have spoken to more than a dozen editors, mostly administrators, from various Wikimedia projects, in addition to some developers working on community-maintained content moderation tools. We’ve also been busy reading reports and papers, and learning about past projects in this space. This has given us a good sense of the content moderation landscape, particularly on Wikipedia. Below we’ve summarised some of the common or important points we’ve learned so far:

[Image: Twinkle is a popular gadget providing many content moderation features missing from MediaWiki. It is only available on a few Wikimedia projects.]
  • There are numerous gaps in content moderation functionality in MediaWiki - Buttons like Undo, Rollback, Delete, and Protect provide basic functionality for core content moderation processes on Wikipedia. But editors engaged in content moderation are almost always using other tools and software to work effectively, most of it community-maintained (or not maintained at all). Functionality like batch deletion, content reporting, patrolling new edits, restoring an earlier page version, and maintenance tagging is largely provided by gadgets and user scripts on large projects. On smaller projects, these kinds of features are either missing or cumbersome to carry out, requiring editors to attempt importing gadgets from other wikis - a process which often results in frustration or failure (see the cross-wiki import sketch after this list).
  • Some content moderation processes are in place on a large number of projects - Workflows like discussing whether an article should be deleted, tagging articles with maintenance notices, and requesting speedy deletion are in place on a wide variety of Wikimedia projects. While these processes are largely similar in nature, each project has specific nuances, and on some projects - particularly smaller ones - they may not be very active or well developed. Reporting content for deletion is a good example - on most projects there is a venue where any editor can nominate a page for deletion, and other editors can then share their view on whether the article should be deleted or not. Sometimes this works more or less like a vote or discussion, and other times articles need to go through another discussion process first. This workflow has stood out to us because, on almost all projects, this venue is hard for newer editors to find and use, and usually requires a multi-step process.
  • There are many community-maintained tools for reporting content - Relating to the previous finding, deletion discussions are usually hard to start. A typical process requires an editor to add a tag to the page in question, then fill out a template on the deletion discussion page, then list that discussion on another page, then notify the article creator. Many communities have created user scripts or gadgets to improve this reporting process. Examples include Twinkle, NominateForDel, and FastButtons. These tools allow editors to report content they think should be removed from their project in one or two clicks, far more efficiently than via the manual process (see the nomination sketch after this list). On some wikis almost all deletion discussions are created via these tools.
  • PageTriage is only available on English Wikipedia - There have been a number of requests to make the PageTriage extension available on other Wikipedia projects (T50552), in addition to numerous improvement requests for the extension more generally. Patrolling new pages is a common workflow on many Wikimedia projects, and PageTriage is a well-used extension on English Wikipedia. It’s less clear how strong a need there is for this tool on smaller projects. A new page patrol project on Spanish Wikipedia was active for some time, but suffered from a high workload per contributor. When the Community Tech team investigated the PageTriage extension, they found that it would likely require rewriting from the ground up to serve projects beyond English Wikipedia.
  • Flagged Revisions is impactful but poorly understood - On the majority of Wikimedia projects, if a user makes an edit, that edit is immediately displayed in the article to all readers. On some projects, however, the Flagged Revisions extension (also called Pending Changes when used in a more limited configuration) requires all edits to be reviewed before they are displayed in the live version of a page. Examples of projects using Flagged Revisions range from large projects like the German and Russian Wikipedias through to smaller projects like Icelandic Wiktionary. Flagged Revisions represents a substantial change in the way projects are moderated, both in terms of the workflow for content moderators and for new editors. Despite this potentially large impact, the effects of Flagged Revisions are poorly understood, with just a few reports on individual projects (e.g. 2008 report, 2010 results, 2019 analysis). We’re interested in learning more about the impact of Flagged Revisions - deployment of the extension is currently blocked due to an expectation that the negative impacts outweigh the positive, but the data appears inconclusive.
  • Content moderation tools drive away good-faith editors - Numerous reports show that a significant factor in new editor retention rates is whether a user’s content is moderated in some way (e.g. their edit is undone or their article is deleted). But this doesn’t necessarily need to be the case - the effect on new editors is amplified by the tools being used and the messages being sent. In Sue Gardner’s blog post collating the reasons women don’t edit Wikipedia, users noted a feeling of exclusion and paranoia because the messages they received after their content was reverted came across as mean-spirited. With 80% of all first warning messages delivered by “bots or power tools”, we could imagine exploring ways to reduce this effect, for example with more welcoming messages when a user’s contributions are moderated.
  • Small projects with steady growth may not require new tools - One of our assumptions going into this project was that small Wikimedia projects might feel overwhelmed by content moderation duties. Because the active editing community is small, and wants to focus on building basic content and the processes and policies around them, we assumed that vandalism would be a substantial issue. From speaking to some admins on small Wikipedia projects, however, this may not be the case. We spoke to one admin who said that moderation didn’t really take much of their editing time, and another who noted that they could check the recent changes feed once every few weeks and feel confident they’d caught all the vandalism on their project. This has helped us understand where we should focus our efforts.
[Image: When registration became a requirement on pt.wiki, the kinds of content moderation required shifted, for example an order of magnitude change in the number of page protections.]
  • Banning anonymous editing could have a substantial impact - The team is also tracking progress on restricting editing to logged-in editors, as trialled on Portuguese Wikipedia and Farsi Wikipedia. Data indicates that this may not have a substantially negative effect on editor growth, but can lighten the content moderation workload for active editors. If this change is adopted by more communities, we want to be prepared for the ways it might change the content moderation landscape for active editors.
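To make the gadget-import point above concrete, here is a minimal sketch of the kind of loader line an editor on a smaller wiki might add to their personal common.js to borrow a gadget from another project. The page and gadget names are illustrative, and this is exactly the sort of copy-and-hope step that often ends in frustration, because the borrowed script may depend on templates, messages, or helper scripts that only exist on its home wiki.

```typescript
// Sketch of a personal user script (e.g. User:Example/common.js) that loads a
// gadget maintained on another wiki. `mw` is the MediaWiki front-end object
// available on every wiki page; it is declared here only to keep TypeScript happy.
declare const mw: { loader: { load: (url: string) => void } };

// Load Twinkle's code directly from English Wikipedia. If the source page is
// renamed, restructured, or relies on enwiki-only templates and helper scripts,
// this silently stops working on the importing wiki.
mw.loader.load(
  'https://en.wikipedia.org/w/index.php?title=MediaWiki:Gadget-Twinkle.js' +
  '&action=raw&ctype=text/javascript'
);
```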
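To illustrate what the reporting tools described above are automating, here is a rough sketch of the chain of edits a one-click ‘nominate for deletion’ script performs on the editor’s behalf. It assumes an English-Wikipedia-style articles-for-deletion setup running as an on-wiki user script; the template names, page titles, and the nominateForDeletion helper are illustrative, not the actual implementation of Twinkle or any other tool.

```typescript
// Rough sketch of the edits a one-click "nominate for deletion" script chains
// together. Runs as an on-wiki user script, where mw.Api handles edit tokens.
declare const mw: any; // MediaWiki front-end API object, available on wiki pages

async function nominateForDeletion(page: string, reason: string): Promise<void> {
  const api = new mw.Api();
  // The naming convention for discussion subpages varies from wiki to wiki.
  const discussion = `Wikipedia:Articles for deletion/${page}`;

  // 1. Tag the article itself so readers and watchers can see the nomination.
  await api.postWithEditToken({
    action: 'edit', title: page,
    prependtext: '{{subst:afd1}}\n', // template name is wiki-specific
    summary: `Nominating for deletion: ${reason}`,
  });

  // 2. Create the discussion subpage containing the nominator's rationale.
  await api.postWithEditToken({
    action: 'edit', title: discussion, createonly: true,
    text: `{{subst:afd2|pg=${page}|text=${reason}}}`,
    summary: 'Creating deletion discussion',
  });

  // 3. List the new discussion on the daily log page so other editors find it.
  await api.postWithEditToken({
    action: 'edit',
    title: 'Wikipedia:Articles for deletion/Log/Today', // placeholder; a real script builds the dated title
    appendtext: `\n{{${discussion}}}`,
    summary: `Listing [[${discussion}]]`,
  });

  // 4. Notify the article's creator on their talk page.
  await api.postWithEditToken({
    action: 'edit',
    title: 'User talk:ArticleCreator', // placeholder; a real script looks up the first revision's author
    appendtext: `\n{{subst:afd notice|${page}}}`, // notification template is also wiki-specific
    summary: 'Deletion discussion notification',
  });
}
```

Even in this simplified form the nomination is four separate edits across three or four different pages, which is why communities without a local script for it find the manual route so cumbersome.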

Focused research and interviews

Building on this groundwork, we’re now starting the process of more focused interviews with administrators and content patrollers on two medium-sized Wikimedia projects. We hope that these interviews will give us a much more granular view of the problems editors are facing, and will lead us to more specific use cases and problems for our developers to work on solving.

We have selected Ukrainian Wikipedia and Tamil Wikipedia for our research. They were selected because they are medium-sized projects which have opted out of global sysop support. We think this puts them in an interesting position: they receive less global editor support, but perhaps don’t have the same number of tools and processes as the largest projects. We hope this will highlight the most pressing issues for content moderators.

Our interviews will be taking place over the next couple of months, and we’ll be asking participants to walk us through the kinds of content moderation they do, the tools they’re using, and the areas that cause difficulty or stress.

If you would like to speak with us about moderating content on your Wikimedia project, please let us know! While we’re focusing our research on two projects, we’re happy to chat with any editors who engage in the processes we’re looking into. You can message us via Talk:Moderator Tools.

We’ll take all of this and write a report identifying what we see as the priority areas for Product development in this space. The report will be shared with the wider community for feedback before being actioned.