Article feedback/Version 5

This is the project overview page for Version 5 of the Article Feedback Tool. The Wikimedia Foundation is developing this new feature as an "on-ramp" to engage readers to contribute to Wikipedia -- and become editors over time.

See also: the feature requirements page, the technical design page, and the data and metrics plan.

Goals
In October 2011, Wikimedia kicked off the product planning process for new and alternative methods of providing feedback on article quality, including ideas like a moderated free-text comment queue for suggestions. The Foundation engaged Fabrice Florin as a product development consultant to manage this project, and OmniTI in Maryland to develop the software for these new features.

Our overall goals are to 1) measure quality, 2) provide the editing community with meaningful feedback on the development of articles, and 3) provide lightweight contribution mechanisms as a gateway to editing. The idea is to try new experiments alongside the existing article feedback tool (i.e. to replace it on a reasonably representative subset of articles with an alternative implementation, gather data, and iterate).

We invite the Wikipedia community (as well as all Wikimedians) to participate in this experiment. Together, we hope to create and test new collaborative tools towards these objectives:
 * engage readers to participate more on Wikipedia
 * give editors new tools to improve article quality
 * encourage readers to become editors over time
 * invite a collaboration between editors and readers
 * experiment with outsourcing web development

The previous implementations of the Article Feedback Tool (Versions 1-4) focused on the dual objectives of participation and quality. The existing tool is intended to provide a quantitative measurement of article quality as well as an on-ramp for contribution (i.e., editing). Based on the research conducted by WMF, the tool shows promise as an on-ramp for contribution. As implemented, the tool appears to provide a reasonable measurement of quality along some dimensions (Completeness and Trustworthy), while other dimensions (Objective and Well-written) tend to show a lower correlation with ratings.

Based on this research and the input from the Wikipedia editing community (examples are here and here), the next version of the tool will focus even more heavily on participation. Editors told us that it would be valuable if they knew what readers were looking for. Version 5 of the Article Feedback Tool will focus on finding ways for readers to contribute productively to building the encyclopedia.

For example, we will try a version where we ask readers "Did you find what you were looking for?" We will also invite them to add a comment or suggestion for improvement. Even if these readers don't become editors, the hope is that they will contribute productively by letting the editing community know what was missing from the article. As we did in the first few phases of the project, we will also invite them to make the edit themselves.

Throughout this project, we will continue to test various ways of measuring quality. We may, however, do this based on more implicit data. In the above example, the percentage of "yes" responses could be an indicator of article quality, even though we don't ask readers to explicitly evaluate the quality of the article.
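As a simple illustration, the implicit signal described above could be aggregated per article as a "yes-rate". This is a hypothetical sketch only: the function name and input format are assumptions for illustration, not the actual AFT data schema.

```python
from collections import Counter

def yes_rate(responses):
    """Fraction of 'yes' answers to "Did you find what you were looking for?"

    `responses` is a list of 'yes'/'no' strings collected for one article.
    Returns None when there is no feedback to aggregate, so articles with
    no responses are not confused with articles rated 0.
    """
    if not responses:
        return None
    counts = Counter(responses)
    return counts["yes"] / len(responses)

# Example: 3 of 4 readers found what they were looking for.
print(yes_rate(["yes", "no", "yes", "yes"]))  # 0.75
```

A metric like this would of course need volume thresholds and traffic normalization before it could be compared across articles.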

Features


Key features for AFT V5 include the following (see also the full feature requirements page):
 * new feedback forms
 * calls to action
 * feedback page
 * moderation tools
 * expanded feedback

In the first phase of this project (Oct.-Dec. 2011), we will create and test four different ways to extend the current rating tool:
 * Option 1: Share your feedback - Did you find what you were looking for? Add a comment
 * Option 2: Make a suggestion - Suggest an improvement, ask a question, report a problem or give praise to the editors
 * Option 3: Review this page - Rate this article. Add a comment or suggestion for improvement.
 * Option 4: Edit this page - Help improve Wikipedia (call to action, for comparison purposes)

We plan to A/B test these four options against the current rating tool, to find out which is most effective for engaging readers, supporting editors and improving article quality.

We are also developing other features to make reader feedback more useful, including:
 * Feedback page - See all the posts for this article, vote them up or down (moderated by editors and administrators)
 * Expanded feedback (optional) - Tell us more: suggest improvements, rate article qualities or share your expertise.
 * Talk page features - Tools to feature the best feedback on the article talk page, with a recommended check list for editors

We plan to develop more features in a second phase (Jan.-March 2012), based on community ideas and first-phase results. We are starting a wish-list of features and open issues for that second phase on this Ideas page. For a preview of what these forms and pages might look like, check our latest slides, which include simple wireframes for key touchpoints, as well as a project plan and other useful exhibits.

Metrics
AFT v.5
 * Metrics and research questions that will be used to test AFT v.5, as well as a detailed plan with the breakdown of the different tests that we will run, can be found on this page.

AFT v.4
 * We collected a number of high-level usage metrics (November 2011) from the current version of AFT as a baseline before starting the deployment of AFT v.5.
 * Several dashboards with real-time data collected from AFT v.4 are available from the toolserver:
   * Global daily ratings and conversions
   * Volume of daily ratings per article
   * Most frequently rated articles
 * Detailed reports from the analysis of data collected via AFT v.4 are available on this page.

Schedule
The first phase of Version 5's development is timetabled below; the segments marked in green are those where we will be asking for community feedback and participation. These dates are still tentative and subject to change, so please check this page often for schedule updates. Even if you cannot participate right away, there will be more opportunities to contribute in future stages of the project.

This week
Here are some of the tasks we are working on this week (Nov. 21-Nov. 23):
 * develop all three feedback forms and one call to action, and start on the feedback page
 * finalize phase 1.0 test plan, as well as prepare phase 1.5 development plan
 * write copy for all phase 1.0 text prompts in forms, feedback page, calls to action and help
 * identify a subset of articles to be tested in phase 1.0 (mix of high-traffic and low-traffic pages)
 * schedule next month's IRC chats, post newsletter, update talk page with new ideas from community

Team
Here are the Wikimedia team members who are working on this project at this time:
 * VP Engineering / Product	Erik Moeller
 * Senior Product Manager	Howie Fung
 * Producer / Designer	Fabrice Florin
 * Features Engineering Director	Alolita Sharma
 * Senior Research Analyst	Dario Taraborelli
 * Research Consultant	Aaron Halfaker
 * UX Designer	Brandon Harris
 * Code Review / Testing	Roan Kattouw
 * Community Outreach	Oliver Keyes

(note that Fabrice, Aaron and Oliver are Wikimedia contractors -- all others are WMF employees)

Here are the team members from OmniTI, our development partner:
 * VP Business Development  Leon Fayer
 * Project Manager  Yoni Shostak
 * Developer  Reha Sterbin
 * Developer  Greg Chiasson
 * UI Developer/Designer  Sean Heavey