
Moderator Tools/Content Moderation in Medium-Sized Wikimedia Projects


During 2021/22, the Moderator Tools team carried out research to understand the needs of content moderators in medium-sized Wikimedia projects. This page summarizes the findings and recommendations from the final report, which you can read in full here.

In this report, ‘content moderation’ refers to the policies and processes which govern the content of a Wikimedia project, rather than the direct contribution of content itself. This includes processes like patrolling new edits, using administrator tools to delete or protect pages, writing policies, and categorization and maintenance tagging tasks.

Do the findings below match your needs and the situation on your Wikimedia project, or not? Let us know on the talk page so we can continue developing our understanding of what technical projects we should work on.

From our first round of research we identified content moderation on mobile web as the priority problem for content moderators. We want to learn more about desired improvements to the mobile web interface over the coming months. Please see Content moderation on mobile web for information and questions we have about potential improvements to mobile web.

Research and interviews


This research focused on "medium-sized" Wikimedia projects. We primarily interviewed editors from projects which were not in the top ~10 by size, with a specific focus on understanding content moderation on the Tamil and Ukrainian Wikipedia projects. Our interviewees were predominantly administrators on at least one project, though we also spoke to patrollers, stewards, and tool developers.

While we ultimately hope to work on products which solve needs for a wide range of community members, we wanted to focus on underserved communities when defining our overall direction, as these communities tend not to have received product investment in the past and have fewer technical volunteers to fill the gaps.

Findings


Major technical obstacles


One theme was unmistakable: none of the content moderators we interviewed carried out moderation tasks on mobile. Even on Tamil Wikipedia, with a high percentage of mobile web pageviews (87%) and mobile editors (45.2%), none of the administrators we spoke to regularly edited on mobile. The reason was clear: moderation on mobile is so poor as to be practically unusable. It lacks many of the most basic features available via the desktop interface, such as undoing edits from a diff, and even when those features are present they are generally not optimized for mobile. This poses an accessibility and equity issue for communities where smartphones are the most common computing device, or where access to desktops and laptops is intermittent or unreliable due to emergencies or crises.
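As an aside, figures like the pageview percentages above can be reproduced from the public Wikimedia Analytics (AQS) REST API. The following is a minimal sketch rather than the report's methodology; the project, date range, and use of the `requests` library are our own illustrative assumptions.

```python
# Minimal sketch: estimate the mobile-web share of pageviews for a project
# using the public Wikimedia Analytics (AQS) REST API. The project and date
# range below are illustrative assumptions, not figures from the report.
import requests

AQS = "https://wikimedia.org/api/rest_v1/metrics/pageviews/aggregate"
HEADERS = {"User-Agent": "moderation-research-sketch/0.1 (example script)"}

def monthly_views(project: str, access: str, start: str, end: str) -> int:
    """Sum monthly pageviews for one access method (e.g. 'mobile-web')."""
    url = f"{AQS}/{project}/{access}/user/monthly/{start}/{end}"
    resp = requests.get(url, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return sum(item["views"] for item in resp.json()["items"])

if __name__ == "__main__":
    project, start, end = "ta.wikipedia.org", "2021010100", "2021123100"
    mobile = monthly_views(project, "mobile-web", start, end)
    total = monthly_views(project, "all-access", start, end)
    print(f"mobile-web share of pageviews: {mobile / total:.1%}")
```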

Additionally, moderation tools are rarely well documented and are difficult to discover. Processes like maintenance tagging and nominating an article for deletion are obscure, and usually require multiple steps to engage with. Some features requested by our interviewees were already provided by tools such as the Title Blacklist and Abuse Filter, which suggests that these tools are either not widely known or not well documented enough for administrators to learn how to use them on their own. On top of this, some of these tools are very powerful and can therefore cause tremendous accidental harm, which may discourage adoption by new administrators due to the high perceived risk and a lack of opportunities to safely learn how to use them.
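Whether a wiki is actually using a tool like AbuseFilter can be checked through the standard MediaWiki Action API, which also hints at the discoverability problem: the data is public but rarely surfaced. Below is a minimal sketch that lists a wiki's enabled filters; the choice of wiki is an assumption, and private filters are hidden from unauthenticated requests.

```python
# Minimal sketch: list a wiki's enabled abuse filters via the MediaWiki
# Action API (list=abusefilters). The target wiki is an illustrative choice.
import requests

API = "https://ta.wikipedia.org/w/api.php"
HEADERS = {"User-Agent": "moderation-research-sketch/0.1 (example script)"}

params = {
    "action": "query",
    "list": "abusefilters",
    "abfshow": "enabled",
    "abfprop": "id|description|hits",
    "abflimit": "max",
    "format": "json",
}
resp = requests.get(API, params=params, headers=HEADERS, timeout=30)
resp.raise_for_status()
for f in resp.json()["query"]["abusefilters"]:
    print(f'#{f["id"]:>4}  hits={f.get("hits", 0):>6}  {f["description"]}')
```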

Major social hurdles


Nearly every respondent in our report noted that administrators on their project felt overworked and understaffed. They pointed to strict requirements for gaining adminship as one major barrier to finding new admins. Our interviewees from Ukrainian Wikipedia also highlighted the difficulty of onboarding new administrators: administrators gain a large range of impactful tools, but little guidance on how best to use them. This leads to hesitancy both among potential administrator candidates and in the wider community voting on new administrators.

Another theme was the invisibility of many content moderation tasks, which makes performing them feel less rewarding. This invisibility may also make it harder to encourage editors to consider taking on moderation work themselves. It is likewise a problem for administrators seeking to review their own work, or for anyone trying to monitor administrator actions, since getting data on administrative actions is theoretically possible but currently difficult in practice. As a result, it is hard for anyone to know which kinds of actions are most common, or to assess their impact.
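The underlying data does exist in each wiki's public logs and can be retrieved through the Action API's list=logevents module; what is missing is convenient aggregation and presentation. As a rough sketch, counting recent administrator actions by type might look like the following, with the wiki, log types, and page size chosen purely for illustration.

```python
# Minimal sketch: tally recent administrator actions from a wiki's public
# logs via the Action API (list=logevents). Wiki, log types, and the
# per-request limit are illustrative assumptions.
from collections import Counter
import requests

API = "https://uk.wikipedia.org/w/api.php"
HEADERS = {"User-Agent": "moderation-research-sketch/0.1 (example script)"}

def recent_log_entries(letype: str, limit: int = 500) -> list[dict]:
    params = {
        "action": "query",
        "list": "logevents",
        "letype": letype,  # one log type per request
        "lelimit": limit,
        "format": "json",
    }
    resp = requests.get(API, params=params, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()["query"]["logevents"]

counts = Counter()
for letype in ("delete", "block", "protect"):
    for ev in recent_log_entries(letype):
        counts[f'{ev["type"]}/{ev["action"]}'] += 1

for action, n in counts.most_common():
    print(f"{action:<20} {n}")
```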

Categorizing moderation capacity by wiki


Content moderator needs vary by project size, and a small admin pool does not automatically mean that a community is understaffed. Categorizing wikis by administrative capacity is therefore more complex than merely counting the size of their administrator user group, or even looking at the number of edits made per month. The metrics we settled on included comparing the ratio of monthly active administrators to monthly active editors against the number of edits per month, as well as reviewing a wiki's policies in a few common areas (a rough sketch of how the activity figures can be approximated follows the list below). Common content moderation policies include:

  • Speedy deletion
  • Deletion discussions
  • Administrator election and removal procedures
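As an illustration of the activity metric mentioned above, the counts involved can be approximated from public APIs: meta=siteinfo reports editors active in the last 30 days, and list=allusers restricted to the sysop group with auactiveusers does the same for administrators. This is our own sketch, not the report's methodology, and the target wiki is an illustrative assumption.

```python
# Minimal sketch: approximate the active-admin to active-editor ratio for
# one wiki from the Action API. "Active" here means MediaWiki's built-in
# definition (activity in the last 30 days), which only approximates the
# "monthly active" figures discussed above.
import requests

API = "https://ta.wikipedia.org/w/api.php"  # illustrative choice of wiki
HEADERS = {"User-Agent": "moderation-research-sketch/0.1 (example script)"}

def get(params: dict) -> dict:
    params = {"action": "query", "format": "json", **params}
    resp = requests.get(API, params=params, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()["query"]

stats = get({"meta": "siteinfo", "siprop": "statistics"})["statistics"]
active_admins = get({
    "list": "allusers",
    "augroup": "sysop",
    "auactiveusers": 1,
    "aulimit": "max",
})["allusers"]

print(f"active admins: {len(active_admins)}")
print(f"active editors: {stats['activeusers']}")
print(f"ratio: {len(active_admins) / stats['activeusers']:.3f}")
```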

We determined that most smaller wikis either have no stated policies on content moderation topics (even if they carry out actions such as page deletion), or copy them verbatim from larger wikis. The ability to tailor such policies to a specific wiki - both adapting the policy and translating it into the wiki's language - may therefore be taken as a signal of growing moderation capacity.

In addition to this, we also looked at other characteristics such as the percentage of editors who primarily edit from mobile, the wiki's article count, and the geographic and linguistic coverage of the project. We wanted to avoid selecting partner wikis that were culturally similar, since borrowing policy from closely related wikis (by language or culture) is a common practice, and we wanted to compare two different moderation contexts for this report.

Product recommendations

[Image: Page history for an article with Advanced mobile contributions turned on.]

Our primary product recommendation is to improve content moderation on the mobile web. We arrived at this conclusion from our main finding that moderation on mobile is practically impossible even for very basic tasks. The current experience poses a barrier to participation for new administrators and an equity issue for many growing communities. One in four contributors to Wikimedia projects edits primarily from a mobile device, a figure which rises to 40-60% in some emerging markets. These editors cannot meaningfully participate in advanced editing or moderation because they lack access to laptop or desktop devices. Even for moderators whose primary device is not a smartphone, phones are common secondary computing devices, so improving mobile moderation would benefit desktop-first editors as well.

Our aim would be to bring content moderation functionality in the mobile skin up to parity with the desktop editing experience. We would do so by building on the Advanced Mobile Contributions project, and incorporating these features into the default experience - with improved user interfaces where needed - rather than requiring editors to opt in to an advanced editing mode.

Working within the mobile web interface could also be a beneficial starting point for further work in this space. Issues of accessibility and discoverability are accentuated here, making it a good avenue for developing hypotheses to solve other challenges.