Contributors/Strategy/WE1.1

From mediawiki.org

This page describes the work Wikimedia Foundation teams are doing as part of the Wiki Experiences 1.1 objective key result for Q1 and Q2 of the 2025–2026 fiscal year: increase the rate at which newcomers publish constructive edits on mobile web.

This work supports the longer-term Contributor Strategy.

To stay updated on this work, and to help shape it, we recommend adding the project pages linked below to your watchlist. This page will be reserved for high-level and relatively infrequent updates.

Objective

Increase the rate at which editors with ≤100 cumulative edits publish constructive edits on mobile web[1] by 4%,[2] as measured by controlled experiments (by the end of Q2).

Success criteria

Because shifting a global metric like constructive edits takes longer than the six months this work spans, we will evaluate the extent to which this work has been successful in the following way:

On a per-platform basis, we will calculate the proportion of interventions we deployed and evaluated through controlled experiments that met or exceeded the ≥4% target increase in constructive edits.[1]
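
The per-platform calculation above can be sketched as a small script. This is an illustrative sketch only: the intervention IDs and measured increases below are hypothetical placeholders, not real experiment results, and the 4% threshold comes from the objective stated on this page.

```python
def success_rate(results, target=0.04):
    """Proportion of evaluated interventions whose measured relative
    increase in constructive edits met or exceeded the target."""
    evaluated = [r for r in results if r["evaluated"]]
    if not evaluated:
        return 0.0
    met = sum(1 for r in evaluated if r["relative_increase"] >= target)
    return met / len(evaluated)

# Hypothetical example: three interventions evaluated on mobile web.
mobile_web = [
    {"id": "A", "evaluated": True, "relative_increase": 0.05},
    {"id": "B", "evaluated": True, "relative_increase": 0.02},
    {"id": "C", "evaluated": True, "relative_increase": 0.04},
]
print(success_rate(mobile_web))  # → 0.6666666666666666
```

Here two of the three hypothetical interventions meet the ≥4% target, so the per-platform success proportion would be reported as 2/3.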

Projects

ID Hypothesis Documentation Status
WE1.1.1 If we prompt new(er) volunteers pasting text from an external site to confirm whether they wrote the content they are attempting to add, then we will see a ≥10% decrease in the percentage of new content edits new(er) volunteers publish that are reverted on the grounds of WP:COPYVIO (and related policies). Paste Check In progress
WE1.1.2 If we deliver an initial beta version of the Revise Tone Structured Task, we can evaluate whether the Edit Check framework can technically support proactive suggestions launched from the Suggested Edits feed. Revise Tone Done
WE1.1.3 If the Editing Team makes an initial set of edit suggestions available within VE via a URL parameter and invites ≥10 newcomers and patrollers to offer feedback about it, we will learn what improvements would need to be made before running a controlled experiment to evaluate the intervention's impact. VE Suggestion Mode MVP In progress
WE1.1.4 If we deploy Reference Check at en.wiki through a controlled experiment, we will see a ≥4% increase in the constructive edits new(er) volunteers publish and learn whether there is sufficient support among patrollers and moderators to enable the feature more widely. Reference Check In progress
WE1.1.5 If we test a progression system via design prototypes with newcomers, then we can identify which types of milestones, guidance, and recognition are perceived as most motivating, and use these insights to finalize a design for future pilot wiki experimentation. Progression System Done
WE1.1.6 If we investigate the top technical, social, and behavioral barriers and enablers to mobile web editing through user research and data analysis, we will generate at least 3 actionable insights that close key knowledge gaps and strengthen our ability to confidently prioritize product investments for the second half of FY25/26 and beyond. Mobile web editing research Done
WE1.1.7 If we analyze a predetermined set of leading indicators ≥2 weeks after the start of the Tone Check A/B test, we will be able to identify what – if any – facets of the end-to-end experience need to be adjusted or investigated before we can be confident evaluating the impact of the feature. Tone Check Done
WE1.1.8 If we apply the Tone Check model to published articles, we will learn whether we can identify the ≥10,000 tone issues (each with a probability score of 0.8 or higher) that are needed to build a high-quality (accuracy ≥ 70%) pool of suggestions to help guide editors in improving article tone. Revise Tone Done
WE1.1.9 If we improve the "Add a Link" model infrastructure and pipelines using Airflow and code fixes, we'll be able to scale the "Add a Link" structured task from 275 languages to 300[3] languages, support all of the top 20 biggest Wikipedias, maintain higher quality predictions over time through automated retraining, and learn whether scaling "Add a Link" will boost constructive edits among newcomers on wikis where this structured task wasn't previously supported. Add a Link Done
WE1.1.10 If we interview ~10 experienced volunteers at en.wiki and fr.wiki who write AbuseFilters (and other gadgets/scripts/templates/edit notices) to automate patrolling/moderation workflows, we will identify ≥3 patterns/needs in the following areas that will help shape the value proposition of community-authored Edit Checks:
  1. Explore whether/how experienced contributors and tool writers/modifiers think of their tooling activity as encoding policy,
  2. Discuss how they feel about AI/LLMs and other such technologies being used in general, and within the on-wiki moderation space, and
  3. Understand their reactions after being shown the three current main Edit Checks (Tone, Reference, and Paste).
Understanding how volunteers "program policy" and their perception of Edit Check Done
WE1.1.11 If we distribute a survey to ≥500 successful newcomers[i] and obtain high-quality data that is representative of the broader successful newcomer population, we will be able to identify ≥4 actionable insights we can use to prioritize what aspects of the onboarding experience to improve. Research:Successful_Newcomers_Survey_2025 In progress
WE1.1.12 If we enable ≥3 volunteers to evaluate ≥30 sample edits each, for each of the 10 new languages we are seeking to scale Tone Check to, we will learn how often volunteers agree with model predictions and be able to decide which new wikis to approach about deploying Tone Check to. Tone Check In progress
WE1.1.13 Given that we scaled "Add a Link" to 100% of newer volunteers at English Wikipedia, newcomer constructive activation and retention will improve, which will increase constructive edits made by newer volunteers by ≥4%. Add a Link Done
WE1.1.14 If we prompt new(er) volunteers to consider the tone they are writing in when an AI model detects them using non-neutral language, then we will see a ≥4% decrease in the percentage of new content edits new(er) volunteers publish that are reverted on the grounds of WP:NPOV (and related policies). Tone Check In progress
WE1.1.15 If we conduct unmoderated usability tests of 2–3 prototypes for the logged-out editing prompt/UX, we will identify which design is most likely to cause more newcomers to create accounts, based on the call to action we observe test participants choosing most often when presented with each design variant. Redesign the Logged-Out Warning Message on Mobile In progress
WE1.1.17 If we develop a task generation engine for the Revise Tone structured task, integrate our recent learnings about which content to include or filter out, and provide pipelines that automatically refresh the task list, we'll enable a qualitative evaluation of the tasks generated and an A/B experiment that tests whether this type of task helps newcomer editors to make more constructive edits. Revise Tone In progress
WE1.1.18 If we analyze a predetermined set of leading indicators ~2 weeks after the start of the Revise Tone Structured Task A/B test, we will be able to identify what – if any – facets of the end-to-end experience need to be adjusted or investigated before we can be confident evaluating the impact of the feature. Revise Tone In progress
WE1.1.19 If we enable people on mobile web to edit any article section, regardless of which edit icon they first tap, then the newcomer mobile edit abandonment rate will decrease by #% because they will be able to more easily locate the content they tapped edit intending to change. Run a controlled experiment to address section editing dead-end (mobile web) In progress
WE1.1.20 If the Growth team scales the "Add a Link" task to at least 10 additional Wikipedias, then we can complete leading indicator analysis to confirm that the task is performing as expected and identify any wikis that may require further review. Add a Link In progress
WE1.1.21 If we deploy the new Add-a-Link model version to both newly onboarded wikis and wikis currently using Add-a-Link, then the Growth team will be able to roll out Add-a-Link as a structured task to wikis where it did not previously exist, and wikis that already had the task will receive an updated model with fresher training data and improved offline performance. Add a Link In progress

Background

Newcomers struggle to start editing successfully, particularly on mobile devices, where screen space is limited and attention is often fragmented.

Some grow tired of the context-gathering, patience, and trial and error required to contribute constructively. And some people, like logged-out readers, have yet to encounter a compelling opportunity to try.

Relatedly, experienced volunteers grow weary of the volume of preventable mistakes newcomers make.

WE 1.1 will address these issues by:

  1. Surfacing edit suggestions
  2. Offering actionable in-edit guidance
  3. Building more task-specific editing workflows

At the core of these efforts is the need for scalable ways to:

  • Detect how in-progress edits and existing content could be improved
  • Clarify the steps and concepts required to publish constructive edits

To grow this capacity, we will continue to experiment with machine learning to learn how it can best serve editors, across roles and experience levels.

References

  1. Where constructive edits are defined as edits to pages in any Wikipedia main namespace that are not reverted within 48 hours of being published.
  2. Analysis that informed the improvement target this work is meant to meet: T389403 § 10960480
  3. "300 languages" is an estimated number. More investigation is needed before we will consider this target finalized.