Moderator Tools/Content Moderation in Medium-Sized Wikimedia Projects/ja

Throughout the 2021/22 fiscal year, the team has been researching the needs of content moderators on medium-sized Wikimedia projects. This page includes a summary of the findings and recommendations from the final report, which you can read in full here.

In this report, "content moderation" refers to enforcing the content policies and processes of a particular Wikimedia project, as distinct from directly contributing content. This might include processes like patrolling new edits, using administrator tools to delete or protect pages, writing policies, and categorization and maintenance tagging tasks.

Do the findings below match your needs and the situation on your Wikimedia project, or not? Let us know on the talk page so we can continue developing our understanding of what technical projects we should work on.

From our first round of research we identified content moderation on mobile web as the priority problem for content moderators. We want to learn more about desired improvements to the mobile web interface over the coming months. '''Please see Content moderation on mobile web for information and questions we have about potential improvements to mobile web.'''

Research and interviews
This research focused on "medium-sized" Wikimedia projects. We primarily interviewed editors from projects which were not in the top ~10 by size, with a specific focus on understanding content moderation on the Tamil and Ukrainian Wikipedia projects. Our interviewees were predominantly administrators on at least one project, though we also spoke to patrollers, stewards, and tool developers.

While we hope to ultimately work on products which solve needs for a wide range of community members, we wanted to focus on underserved communities when defining our overall direction, as these tend to not have received product investment in the past and have fewer technical volunteers to fill the gaps.

Major technical obstacles
One very obvious theme was that none of the content moderators we interviewed carried out moderation tasks on mobile. Even on Tamil Wikipedia, with a high percentage of mobile web pageviews (87%) and mobile editors (45.2%), none of the administrators we spoke to regularly edited on mobile. The reason was very clear: moderation on mobile is so poor as to be practically unusable. It lacks many of the most basic features available via the desktop interface, such as undoing edits from a diff, and even when those features are present they are generally not optimized for mobile. This poses an accessibility and equity issue for communities where smartphones are the most common computing device, or in areas where access to desktops and laptops is intermittent or unreliable due to emergencies or crises.

Additionally, moderation tools are rarely well documented and are difficult to discover. Processes like maintenance tagging and nominating an article for deletion are obscure, and usually take multiple steps to engage with. Some features requested by our interviewees were already provided by tools such as the Title Blacklist and Abuse Filter. This suggests that these tools are either not widely available, or that their functions are not well documented enough for administrators to learn how to use them on their own. On top of this, some of these tools are very powerful and therefore have the potential to cause tremendous accidental harm. This may discourage adoption by new administrators due to high perceived risk and a lack of opportunities to safely learn how to use them.

Major social hurdles
Nearly every respondent in our report noted that administrators on their project felt overworked and understaffed. They pointed to strict requirements for gaining adminship as one major barrier to finding new admins. Additionally, our interviewees from Ukrainian Wikipedia also highlighted the difficulty of onboarding new administrators. Administrators gain a large range of impactful tools, but little guidance on how best to leverage them. This leads to hesitancy both for potential administrator candidates and for the wider community voting on new administrators.

Another theme was how invisible many content moderation tasks are, which leads to the feeling that performing them is less rewarding. Implicit in this was the point that the invisibility of most administration work may also contribute to difficulty in encouraging editors to think about performing moderator work themselves. This is also a problem for administrators seeking to review their own work, or for anyone trying to monitor administrator actions, since getting data on administrative actions is theoretically possible but difficult in practice right now. Therefore, it is hard for anyone to know what kinds of actions are most common, and to assess their impact.

Categorizing wikis by moderation capacity
Content moderator needs vary by project size, and a small admin pool does not automatically mean that a community is understaffed. Therefore, categorizing wikis by administrative capacity is more complex than merely counting the size of their administrator user group or even looking at the number of edits made per month. The metrics we settled on included comparing the ratio of monthly active administrators to monthly active editors against the number of edits per month, as well as undertaking a review of a wiki's policies in a few common areas. Common content moderation policies include:


 * Speedy deletion
 * Deletion discussions
 * Administrator election and removal procedures
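As a rough illustration, the capacity metric described above could be computed like this. The wiki names and figures in this sketch are purely hypothetical placeholders, not data from the report:

```python
# Hypothetical sketch of the capacity comparison: the ratio of monthly
# active administrators to monthly active editors, viewed alongside the
# monthly edit volume each admin pool must cover.

wikis = {
    # wiki: (active_admins, active_editors, edits_per_month) -- made-up numbers
    "example-wiki-a": (12, 300, 40_000),
    "example-wiki-b": (25, 1_000, 150_000),
}

for name, (admins, editors, edits) in wikis.items():
    admin_editor_ratio = admins / editors
    edits_per_admin = edits / admins
    print(f"{name}: admin/editor ratio {admin_editor_ratio:.3f}, "
          f"~{edits_per_admin:,.0f} edits per active admin per month")
```

A wiki with a low admin/editor ratio but a high edit volume per admin would be a candidate for the "understaffed" category, whereas raw admin counts alone would not reveal this.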

We found that most smaller wikis either have no written content moderation policies (even though they do carry out processes like page deletion in practice) or have copied them wholesale from larger wikis. Developing policies tailored to a wiki's own needs, whether by adapting adopted policies or translating them into the wiki's own language, can itself be read as a sign that a community is working to grow its moderation capacity.

In addition to this, we also looked at other characteristics such as percentage of majority-mobile editors, article count of the wiki, and the geographic and linguistic coverage of the project. We wanted to avoid selecting partner wikis that were culturally similar since borrowing policy from closely-related wikis (by language or culture) is a common practice, and we wanted to compare two different moderation contexts for this report.

Product recommendations
Our primary product recommendation is to improve content moderation on the mobile web. We arrived at this conclusion from our main finding that moderation on mobile is practically impossible even for very basic tasks. This poses a barrier to participation for new administrators, and raises an equity issue for many growing communities. One in four contributors to Wikimedia projects primarily edits on a mobile device, and in several emerging markets this figure rises to 40-60%. These are editors who cannot meaningfully participate in advanced editing or moderation due to their lack of access to laptop or desktop devices. Even for editors whose primary device is not mobile, smartphones have become a common secondary computing device, so improving moderation on mobile should also benefit desktop-first editors.

Our aim would be to bring content moderation functionality in the mobile skin up to parity with the desktop editing experience. We would do so by building on the Advanced Mobile Contributions project, and incorporating these features into the default experience, with improved user interfaces where needed, rather than requiring editors to opt in to an advanced editing mode.

Working on the mobile web interface may also be a useful starting point for future work in this space. Issues of accessibility and discoverability are accentuated here, making it a good avenue for developing hypotheses to solve other challenges.