Article feedback/Version 5

This is the home page for Version 5 of the Article Feedback Tool. In coming days, we will be expanding this page with more details and links to related pages for this project.

Objectives
In October 2011, Wikimedia kicked off the product planning process for new and alternative methods of gathering feedback on the quality of articles, including ideas like a moderated free-text comment queue for suggestions. We have engaged Fabrice Florin as product development consultant to manage this project, as well as OmniTI in Maryland to develop the software for these new features.

Our overall goals are to 1) measure quality, 2) provide the editing community with meaningful feedback on the development of articles, and 3) provide lightweight contribution mechanisms as a gateway to editing. The idea is to run new experiments alongside the existing article feedback tool (i.e., to replace it on a reasonably representative subset of articles with an alternative implementation, gather data, and iterate).

We invite the Wikipedia community (as well as all Wikimedians) to participate in this experiment. Together, we hope to create and test new collaborative tools toward these objectives:
 * engage readers to participate more on Wikipedia
 * give editors new tools to improve article quality
 * encourage readers to become editors over time
 * invite a collaboration between editors and readers
 * experiment with outsourcing web development

The previous implementations of the Article Feedback Tool (Versions 1-4) focused on the dual objectives of participation and quality. The existing tool is intended to provide a quantitative measurement of article quality as well as an on-ramp for contribution (i.e., editing). Based on the research conducted by WMF, the tool shows promise as an on-ramp for contribution. As implemented, the tool appears to provide a reasonable measurement of quality along some dimensions (Complete and Trustworthy), while other dimensions of quality (Objective and Well-written) tend to show a lower correlation with ratings.

Based on this research and input from the Wikipedia editing community (examples are here and here), the next version of the tool will focus even more heavily on participation. Editors told us it would be valuable to know what readers are looking for. Version 5 of the Article Feedback Tool will focus on finding ways for readers to contribute productively to building the encyclopedia.

For example, we will try a version where we ask readers "Did you find what you were looking for?" If the reader selects "no," we will ask them to fill out a form describing what they were looking for. Even if this reader doesn't become an editor, the hope is that they will contribute productively by letting the editing community know what was missing from the article. As we did in the first few phases of the project, we will also invite them to make the edit themselves.

Quality remains an important consideration, and we will continue to test various ways of measuring it. We may, however, do this implicitly. In the above example, the percentage of "yes" responses could be an indicator of article quality, even though we don't ask the reader to evaluate the quality of the article explicitly.
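To illustrate this implicit metric, here is a minimal sketch of how the share of "yes" responses could be computed per article. The function name and the input format (a simple list of "yes"/"no" strings) are illustrative assumptions, not the production data schema:

```python
from collections import Counter

def found_rate(responses):
    """Share of "yes" answers to "Did you find what you were looking for?".

    `responses` is a list of "yes"/"no" strings for one article.
    Returns None when there are no answers, rather than guessing.
    """
    counts = Counter(responses)
    total = counts["yes"] + counts["no"]
    return counts["yes"] / total if total else None

print(found_rate(["yes", "no", "yes", "yes"]))  # 0.75
```

Tracked over time, a rising found-rate could be read as a proxy for improving article quality without ever showing readers a rating widget.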

Feature Requirements
Key features for AFT V5 will include:
 * new feedback forms
 * calls to action
 * expanded feedback
 * feedback page
 * moderation tools

In the first phase of this project (Oct.-Dec. 2011), we will create and test four different ways to extend the current rating tool:
 * Option 1: Share your feedback - Did you find what you were looking for? Add a comment
 * Option 2: Make a suggestion - Suggest an improvement, ask a question, report a problem or give praise to the editors
 * Option 3: Review this page - Rate this article. Add a comment or suggestion for improvement.
 * Option 4: Edit this page - Help improve Wikipedia (call to action, for comparison purposes)

We plan to A/B test these four options against the current rating tool, to find out which is most effective for engaging readers, supporting editors and improving article quality.
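One common way to run such an A/B test is to assign each reader deterministically to a bucket, so the same reader always sees the same variant. The sketch below uses hash-based assignment; the bucket names and reader-ID format are illustrative assumptions, not the actual implementation:

```python
import hashlib

# One control bucket (current rating tool) plus the four test options.
BUCKETS = ["control", "option1", "option2", "option3", "option4"]

def assign_bucket(reader_id: str) -> str:
    """Deterministically map a reader to one test bucket.

    Hashing the ID keeps a reader in the same bucket across visits,
    while distributing readers roughly evenly across buckets.
    """
    digest = hashlib.sha256(reader_id.encode("utf-8")).hexdigest()
    return BUCKETS[int(digest, 16) % len(BUCKETS)]
```

Because assignment depends only on the ID, no per-reader state needs to be stored to keep the experiment consistent.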

After readers post their feedback, they will see one of these calls to action:
 * take a survey (link to survey page)
 * edit this article (if logged in)
 * sign up or log in (if logged out)
 * get email notifications (if their post is used)

We will also develop new features and touchpoints to make reader feedback more useful, including:
 * Expanded feedback (optional) - Tell us more: suggest improvements, rate article qualities or share your expertise.
 * Feedback page - See all the posts for this article, vote them up or down (moderated by editors and administrators)
 * Talk page features - Tools to feature the best feedback on the article talk page, with a recommended checklist for editors

For a preview of what these forms and pages might look like, see the design materials for this project, which include simple wireframes for key touchpoints, as well as a project plan and other useful exhibits.

In coming days, a complete set of feature requirements will be posted here.

We plan to develop more features in a second phase (Jan.-March 2012), based on first phase results. We are starting a wish-list of features for that second phase and will link it here shortly.

Here are some of the community feature ideas which we have categorized so far:

Phase 1: (Q4 2011)
 * Comment box
 * "Did you find what you were looking for?"
 * Checkboxes for suggested improvements
 * Make rating tool more visually compact
 * Hide AFT for recently created pages
 * Different calls to action

Phase 1.5: (Q4 2011)
 * Dashboard with recent feedback and page metrics
 * Allow permanent linking of page scores
 * Courtesy diff link to the rated revision

Phase 2: (Q1 2012)
 * Ratings details (counts, distribution, averages over time)
 * Promote (or post) feedback/comments to talk page
 * Let registered users track pages they rated
 * Comments feed per article (for editors) – RSS/API?

Metrics
To track our progress, we aim to answer these research questions:

Usage
 * How many people posted a rating?
 * How many people posted a comment?
 * How many people posted more feedback?
 * Which version of the tool was used most?
 * Which feedback input was used most?

Engagement
 * How many people posted an edit?
 * How many people signed up?
 * How many people logged in?
 * How many people shared their email?

Satisfaction
 * Did readers find the tool useful?
 * Did editors find the tool useful?
 * Which tool version (or input) was found most useful? Why?

Quality
 * How did reader ratings compare to editor ratings? to expert ratings?
 * How many feedback posts were voted up or down by the community?
 * How many feedback posts were featured or hidden by editors?
 * How many feedback posts led editors to email the contributor?

Working on this week
Here are some of the tasks we are working on this week (Oct. 24-28):
 * Announce project to Wikimedia community
 * Write updated requirements and specs
 * Develop first feedback form
 * Start community outreach, and recruiting workgroup
 * Host first discussion with workgroup (IRC or Skype?)
 * Set up metrics and data collection process

Schedule
Here are some key milestones for this first phase:
 * Phase 1 Development Start: 	20-Oct-11
 * First Requirements: 	28-Oct-11
 * First Community Outreach: 	27-Oct-11
 * First Mockups: 	3-Nov-11
 * Second Community Outreach: 	3-Nov-11
 * Updated Requirements: 	4-Nov-11
 * First Partial Code Delivery: 	4-Nov-11
 * First Partial Code Review: 	8-Nov-11
 * First Feedback Form Testing: 	10-Nov-11
 * Development/Design Iterations: 	through 21-Nov-11
 * First Feedback Page Testing: 	23-Nov-11
 * First Release Code Delivery: 	23-Nov-11
 * Release Code Review Start: 	28-Nov-11
 * Release Code Testing Start: 	1-Dec-11
 * First Code Release: 	7-Dec-11
 * A/B Testing Monitoring/Tweaks: 	through 21-Dec-11
 * Second Code Release: 	14-Dec-11
 * Phase 1 Development End: 	23-Dec-11
 * User Survey: 	6-Jan-12
 * Phase 1 Report: 	16-Jan-12

Workgroup
Link to workgroup page

Project Team
Here are the Wikimedia team members who are working on this project at this time:
 * VP Engineering / Product	Erik Moeller
 * Senior Product Manager	Howie Fung
 * Producer / Designer	Fabrice Florin
 * Features Engineering Director	Alolita Sharma
 * Research Analyst	Dario Taraborelli
 * UX Designer	Brandon Harris
 * Code Review / Testing	Roan Kattouw
 * Community Outreach	Oliver Keyes

(Note that Fabrice and Oliver are Wikimedia contractors; all others are WMF employees.)

We will add team members from OmniTI, our development partner, in coming days.