The Moderator Tools team is exploring the content moderation tool needs of medium-sized Wikimedia projects as part of a cross-departmental pilot project in the Product department.
We want to understand which tools and processes are missing or hard to use on projects that are growing substantially, so that we can prioritise Product investment. We believe these communities face particular stresses, as a small number of administrators and content patrollers find themselves needing to review a growing number of edits while also building processes and workflows that are already established in larger communities.
The team's focus is on content moderation processes, including page protection, deletion, reporting, and recent changes patrolling, rather than user reporting and moderation, which is more within the purview of the Anti-Harassment Tools and Trust and Safety Tools teams.
While our focus isn't on the largest Wikimedia projects, we know that those communities also have content moderation tool needs. Where possible we will prioritise work that benefits the largest number of contexts, but we can't make any guarantees until we have done further research.
In the 2021/22 Annual Plan (July-June), the team will focus on design research for the first six months (July - December 2021). We want to hear from a wide range of editors about the problems they're facing in keeping the content on their projects reliable and trustworthy. This research will then inform our work in the second half of the year, as we start prioritising software development.
If you have something you want to share with us about this project, please feel free to leave a message on the talk page or contact a team member.