Article feedback/Version 5

This is the home page for Version 5 of the Article Feedback Tool. The Wikimedia Foundation is developing this new feature as an "on-ramp" to engage readers to contribute to Wikipedia -- and to become editors over time.

Objectives
In October 2011, Wikimedia kicked off the product planning process for new and alternative methods of providing feedback on the quality of articles, including ideas like a moderated free-text comment queue for suggestions. We have engaged Fabrice Florin as product development consultant to manage this project, as well as OmniTI in Maryland for the software development of these new features.

Our overall goals are to 1) measure quality, 2) provide the editing community with meaningful feedback on the development of articles, and 3) provide lightweight contribution mechanisms as a gateway to editing. The idea is to try new experiments alongside the existing article feedback tool (i.e., to replace it on some reasonably representative subset of articles with an alternative implementation, gather data, and iterate).

We invite the Wikipedia community (as well as all Wikimedians) to participate in this experiment. Together, we hope to create and test new collaborative tools toward these objectives:
 * engage readers to participate more on Wikipedia
 * give editors new tools to improve article quality
 * encourage readers to become editors over time
 * invite a collaboration between editors and readers
 * experiment with outsourcing web development

The first implementations of the Article Feedback Tool (Versions 1-4) focused on the dual objectives of participation and quality. The existing tool is intended to provide both a quantitative measurement of article quality and an on-ramp for contribution (i.e., editing). Based on the research conducted by WMF, the tool shows promise as an on-ramp for contribution. As implemented, the tool appears to provide a reasonable measurement of quality along some dimensions (Complete and Trustworthy), while other dimensions of quality (Objective and Well-written) tend to show a lower correlation with ratings.

Based on this research and on input from the Wikipedia editing community (examples are here and here), the next version of the tool will focus even more heavily on participation. Editors told us it would be valuable to know what readers were looking for. Version 5 of the Article Feedback Tool will focus on finding ways for readers to contribute productively to building the encyclopedia.

For example, we will try a version that asks readers, "Did you find what you were looking for?" We will also invite them to add a comment or suggestion for improvement. Even if a reader doesn't become an editor, the hope is that they will contribute productively by letting the editing community know what was missing from the article. As we did in the first few phases of the project, we will also invite them to make the edit themselves.

Throughout this project, we will continue to test various ways of measuring quality, though we may do this based on more implicit data. In the example above, the percentage of "yes" responses could be an indicator of article quality, even though we don't ask readers to explicitly evaluate the quality of the article.
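As a rough illustration of this implicit measure, the share of "yes" answers can be computed directly from logged responses. This is only a hypothetical sketch (the data format and function name are assumptions, not the actual AFT schema or code):

```python
# Hypothetical sketch: estimate an implicit quality signal from
# "Did you find what you were looking for?" responses.
# The list-of-strings format is an assumption, not the real AFT schema.

def yes_ratio(responses):
    """Return the fraction of 'yes' answers, or None if there is no data."""
    if not responses:
        return None
    yes = sum(1 for r in responses if r == "yes")
    return yes / len(responses)

# Example: 3 of 5 readers found what they were looking for.
sample = ["yes", "no", "yes", "yes", "no"]
print(yes_ratio(sample))  # 0.6
```

A real implementation would also need to account for sampling bias (readers who answer at all may not be representative) and for low-traffic articles where a handful of responses would make the ratio unreliable.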

Feature Requirements

Key features for AFT V5 include:
 * new feedback forms
 * calls to action
 * feedback page
 * moderation tools
 * expanded feedback

In the first phase of this project (Oct.-Dec. 2011), we will create and test four different ways to extend the current rating tool:
 * Option 1: Share your feedback - Did you find what you were looking for? Add a comment
 * Option 2: Make a suggestion - Suggest an improvement, ask a question, report a problem, or give praise to the editors
 * Option 3: Review this page - Rate this article. Add a comment or suggestion for improvement.
 * Option 4: Edit this page - Help improve Wikipedia (call to action, for comparison purposes)

We plan to A/B test these four options against the current rating tool to find out which is most effective for engaging readers, supporting editors, and improving article quality.
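One common way to run such an A/B test is deterministic bucketing: a visitor identifier is hashed into one of the experimental conditions, so each reader consistently sees the same form across visits. The sketch below is only illustrative (the bucket names, function, and hashing scheme are assumptions, not the actual AFT implementation):

```python
import hashlib

# Hypothetical bucketing sketch -- not the actual AFT V5 code.
# "control" is the current rating tool; the others map to Options 1-4.
BUCKETS = ["control", "option1", "option2", "option3", "option4"]

def assign_bucket(visitor_id):
    """Deterministically map a visitor ID to one experimental condition."""
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return BUCKETS[int(digest, 16) % len(BUCKETS)]

# The same visitor always lands in the same bucket:
print(assign_bucket("reader-42") == assign_bucket("reader-42"))  # True
```

Deterministic hashing avoids storing per-visitor state while still splitting traffic roughly evenly across conditions.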

We are also developing other features to make reader feedback more useful, including:
 * Feedback page - See all the posts for this article, vote them up or down (moderated by editors and administrators)
 * Expanded feedback (optional) - Tell us more: suggest improvements, rate article qualities or share your expertise.
 * Talk page features - Tools to feature the best feedback on the article talk page, with a recommended check list for editors

For a preview of what these forms and pages might look like, check out our latest slides, which include simple wireframes for key touchpoints, as well as a project plan and other useful exhibits.

We plan to develop more features in a second phase (Jan.-March 2012), based on community ideas and first phase results. We are starting a wish-list of features and open issues for that second phase on this Ideas page.

Metrics
To track our progress, we aim to answer these research questions:

Usage
 * How many people posted a rating?
 * How many people posted a comment?
 * How many people posted more feedback?
 * Which version of the tool was used most?
 * Which feedback input was used most?

Engagement
 * How many people posted an edit?
 * How many people signed up?
 * How many people logged in?
 * How many people shared their email?

Satisfaction
 * Did readers find the tool useful?
 * Did editors find the tool useful?
 * Which tool version (or input) was found most useful? Why?

Quality
 * How did reader ratings compare to editor ratings? to expert ratings?
 * How many feedback posts were voted up or down by the community?
 * How many feedback posts were featured or hidden by editors?
 * How many feedback posts led editors to email the contributor?
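To compare reader ratings against editor or expert ratings, one standard approach is to correlate per-article mean ratings from each group. The sketch below uses a plain Pearson correlation with made-up numbers; the function and data are illustrative assumptions, not actual AFT results:

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists of ratings."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-article mean ratings (1-5 scale); not real data.
reader_ratings = [4.1, 3.2, 4.8, 2.5]
editor_ratings = [3.9, 3.0, 4.5, 2.8]
print(round(pearson(reader_ratings, editor_ratings), 3))
```

A value near 1.0 would suggest reader ratings track editor judgments; lower values for a given dimension (as seen with Objective and Well-written in earlier versions) would suggest readers and editors evaluate that dimension differently.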

This section will be expanded and moved to a separate Metrics requirements page in the coming days.

For now, here are some of the most recent AFT metrics we have collected from the Article Feedback Tool V4 in November 2011.

Working on this week
Here are some of the tasks we are working on this week (Oct. 31-Nov. 3):
 * Get developers started on Wikimedia development environment
 * Write updated requirements and specs
 * Develop first feedback form
 * Continue community outreach, and recruit workgroup
 * Set up process for integrating community feedback in development
 * Host second discussion with workgroup (IRC or Skype?)
 * Finalize metrics and data collection process

Schedule
The first phase of Version 5's development is timetabled below; those segments marked in green are the ones where we will be asking for community feedback and participation. These dates are still tentative and subject to change, so please check this page often for schedule updates. Even if you cannot participate right away, there will be more opportunities to contribute in future stages of the project.

Project Team
Here are the Wikimedia team members who are working on this project at this time:
 * VP Engineering / Product	Erik Moeller
 * Senior Product Manager	Howie Fung
 * Producer / Designer	Fabrice Florin
 * Features Engineering Director	Alolita Sharma
 * Research Analyst	Dario Taraborelli
 * UX Designer	Brandon Harris
 * Code Review / Testing	Roan Kattouw
 * Community Outreach	Oliver Keyes

(Note that Fabrice and Oliver are Wikimedia contractors; all others are WMF employees.)

Here are the team members from OmniTI, our development partner:
 * VP Business Development  Leon Fayer
 * Project Manager  Yoni Shostak
 * PHP Developer 1  Reha Sterbin
 * PHP Developer 2  Greg Chiasson
 * UI Developer/Designer  Sean Heavey