Edit Review Improvements

Edit Review Improvements is a project of the Collaboration Team, which is researching ways to reduce the negative effects current edit-review processes can have on new editors to the wikis.

Most edit-review and patrolling tools were designed to safeguard content quality and fend off bad actors—both vitally important missions.

A body of research (see Related documents below), however, suggests that these processes, particularly when they involve automated or semi-automated tools, can have the unintended consequence of discouraging and even driving away good-faith new editors.

To solve this problem, Collaboration Team is investigating ways to separate good-faith new users from current edit-review workflows and, ultimately, to provide a supportive review process that helps new users become productive contributors.

Problem
Research shows that for new wiki editors in particular, “being reverted predicts both a decrease in activity and a reduction in the probability of survival” as editors. At the same time, the increasing use of automated and semi-automated edit-review tools has brought about an increase in the rejection of good-faith newcomers. The use of these tools “significantly increases the negative effect of rejection on desirable newcomer retention.”

The above notwithstanding, edit-review tools are essential for vandalism fighters and others working to maintain wiki integrity and quality. How can we help and retain new users while maintaining the productivity of vandalism fighters and other edit reviewers?

Goals
Ensure good-faith new editors have more constructive, less discouraging experiences of edit and article review. By providing richer data about recent changes, enable patrollers and edit-reviewers of all types to work more efficiently and to pursue diverse interests (e.g., fighting vandalism, supporting new users) in a more effective and targeted way.

Ultimately, this project aims to improve editor retention, an objective that aligns well with the overall goals of the Wikimedia Foundation 2016-17 Annual Plan, developed in close consultation with the user community.

The approach tracks in particular with the goals the Annual Plan lays out for the Product Team, which promise, among other things, to “Invest in new types of content…curation and collaboration tools.”

Solutions
This project is in a research phase and no concrete product plans have been formalized as of this writing (in June 2016).

To begin to address the problems of struggling but good-faith newcomers, however, a good first step will be to ensure that we can find them.

The Collaboration Team’s quarterly goal for the first quarter of 2016-2017 (July-September 2016) is to “create a process that enables edit-reviewers to identify the edits of good-faith new users, so that these edits can be reviewed separately.”

To achieve this, we propose to analyze recent changes using data from a variety of sources, including and most notably the machine-learning program ORES (Objective Revision Evaluation Service).

ORES’s good-faith model, trained on human judgment, can find 95% of good-faith edits with 98% accuracy.

ORES can also predict edits that will be reverted and those that are damaging to the wikis.
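As an illustration, ORES exposes these models through a public scoring API. The sketch below builds a request URL for the goodfaith and damaging models and parses a response; the revision ID and the probability values shown are hypothetical, and the response is abbreviated to the fields discussed here.

```python
import json

# Hypothetical revision ID; the URL pattern follows the ORES v3 scoring API.
rev_id = 123456
url = ("https://ores.wikimedia.org/v3/scores/enwiki/"
       f"?models=goodfaith|damaging&revids={rev_id}")

# An abbreviated, illustrative response body for that request.
sample_response = json.loads("""
{
  "enwiki": {
    "scores": {
      "123456": {
        "goodfaith": {"score": {"prediction": true,
                                "probability": {"true": 0.93, "false": 0.07}}},
        "damaging":  {"score": {"prediction": true,
                                "probability": {"true": 0.81, "false": 0.19}}}
      }
    }
  }
}
""")

# Pull out the two probabilities a review tool would act on.
scores = sample_response["enwiki"]["scores"][str(rev_id)]
goodfaith_p = scores["goodfaith"]["score"]["probability"]["true"]
damaging_p = scores["damaging"]["score"]["probability"]["true"]
print(goodfaith_p, damaging_p)  # -> 0.93 0.81
```

In this example the edit is probably well-intentioned (goodfaith 0.93) but also probably harmful (damaging 0.81), exactly the combination the next paragraph is concerned with.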

While research shows that new editors are particularly vulnerable to rejection, there’s also evidence that edit-review and even rejection can be a powerful learning experience for newcomers. For reviewers interested in supporting new users, then, a stream of edits that are a) likely to be reverted but which were b) made in good faith will, we hope, represent a string of teachable moments.
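That combination can be expressed as a simple filter over per-edit scores. The sketch below assumes each edit carries ORES-style probabilities for the goodfaith and damaging models; the threshold values are illustrative, not part of the project’s design.

```python
def teachable_moments(edits, goodfaith_min=0.8, damaging_min=0.7):
    """Select edits that were (a) probably made in good faith but
    (b) are probably damaging and so likely to be reverted --
    candidates for a supportive review rather than a plain revert."""
    return [e for e in edits
            if e["goodfaith"] >= goodfaith_min
            and e["damaging"] >= damaging_min]

edits = [
    {"rev_id": 1, "goodfaith": 0.95, "damaging": 0.85},  # good-faith mistake
    {"rev_id": 2, "goodfaith": 0.10, "damaging": 0.90},  # likely vandalism
    {"rev_id": 3, "goodfaith": 0.97, "damaging": 0.05},  # fine as-is
]
print([e["rev_id"] for e in teachable_moments(edits)])  # -> [1]
```

Only the first edit is surfaced: the likely vandalism stays in the ordinary anti-vandalism queue, and the unproblematic edit needs no special attention.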

The edit analysis described above may be presented to users in a number of ways, including:

 * On a dedicated page similar to PageTriage's Special:NewPagesFeed.
 * On Recent Changes and/or Watchlist, via filters.
 * In machine-readable feeds that can be ingested by edit-review programs (e.g., Huggle, Snuggle, STiki).

Current activity
To visualize possible product directions, the Collaboration Team is exploring design concepts while continuing to research the issues. To better gauge the size of the problem and to track progress, we’re working to define and measure new-editor retention.

Design Research is organizing and conducting interviews with users touched by this issue in various ways, to better understand their motivations and workflows. Groups who will be interviewed in the near term include: anti-vandalism patrollers, recent changes patrollers, Teahouse hosts, Welcoming Committee members, and AfC reviewers. The Research and Data team is working to make predictions better by refining the accuracy of prediction models.

The project was discussed at Wikimania 2016, in June.

Future plans
Creating the streams/pages of “teachable moments” described above has the potential to establish edit-review as a new space for instructing and supporting new editors.

The mere existence of such a platform, however, won’t in itself ensure that this new practice will take root.

To truly have an impact on newcomer retention, interventions may be required at multiple points in the editing and review cycles: before publication, to spot problems and enable authors to seek help; during review, to facilitate a constructive process; and even after review, to help new users overcome rejection and learn from their experiences.

In addition to exploring ideas for intervening at various points, we’re pursuing answers to questions such as these: How can we bring reviewers to this new activity? What would make reviewers most effective in the job of supporting newcomers during edit review? How can we make the process rewarding for reviewers, so that they stay involved?

The counter-vandalism community also has an important role to play in this arena.

Richer data about edits and editors should make patrollers of all types not only more discriminating about which edits might be in good faith, but also more efficient at their job of combating harm.

It will be important to work closely with vandalism fighters and others to understand how their processes and tools might best be adapted to realize these potential gains.

Principles
As we pursue this project, the following principles will guide our planning.

 * Smart but human. Use technology to support rather than replace human interaction. Artificial intelligence can provide analysis, but humans should make decisions.
 * Cross-community. Find solutions that will work across language groups and projects, rather than building wiki-specific tools.
 * Platform, not feature. Seek solutions that are extensible and reusable by current and future community-created and WMF tools.
 * Mobile. Although edit-review is not currently popular on mobile, consider mobile users carefully in our plans.
 * Adoption. In addition to creating new technology, focus on finding ways to encourage reviewers to adopt and continue to use the new tools.
 * Integration. In seeking new solutions, build on and integrate with existing practices whenever possible.
 * Incremental approach. As we move into this new area, proceed incrementally to each milestone and then evaluate where to go next.
 * Participatory design. Collaborate with editors and tool developers already working in this space.

Related documents

 * Grants:IdeaLab/Fast and slow new article review
 * Research:Newcomer survival models