Article feedback/Extended review

Because of the lack of a standard, readily available tool to create and store quality reviews of Wikipedia content, several groups and organizations have created their own ad-hoc tools for this purpose.

This page describes a standard system to conduct open quality review of Wikipedia content, and surface the results on a quality page.

This system is primarily intended for Wikipedia, but could also be used on other Wikimedia projects.

Note: This page describes the system's requirements. The wireframes are meant to give a general idea of how the pieces fall together. They do not supersede the MediaWiki style guide, which remains the authoritative reference for developers who create user interfaces.

System
The system will be implemented as a MediaWiki extension: it will make integration with Wikipedia user accounts easier, and review data will be stored locally. The Article feedback tool provides an existing framework that can be extended to support a more detailed quality review process.

Invitation e-mail
E-mail is the safest assumption we can make about the partner organization's infrastructure.

Organization members will get an invitation by e-mail to confirm their affiliation to the organization on Wikipedia; the invitation will contain a link that leads them to a special "Credentials attachment page" to confirm their affiliation (see below). The link will contain a token to identify the partner organization (mandatory), and if possible other information to prefill the fields.

If the partner organization agrees to provide Wikimedia with a structured document containing this information (e.g. a CSV file), Wikimedia will use a script to generate a list of tokens, send the confirmation e-mails, and import the tokens into the Wikimedia database.

If the partner organization prefers not to share this information with Wikimedia, Wikimedia will provide them with the script, or a modified version of it, to complete the first two steps (token generation and e-mail confirmation). Wikimedia can then complete the final step (token import into the database) using the same script or a modified version of it.

Because partner organizations can use a variety of platforms, the script needs to be highly portable and easy to use.
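As an illustration of what such a script could look like, the Python sketch below reads a CSV of organization members, assigns each a random one-time token alongside the shared organization token, and builds the confirmation link for the invitation e-mail. The CSV field names, the link parameters, and the confirmation page URL are assumptions for illustration, not part of this specification.

```python
import csv
import io
import secrets
from urllib.parse import urlencode

def generate_tokens(member_csv_text, org_token):
    """Read member rows (email, real_name, credentials, ...) and assign
    each member a random one-time token, alongside the shared token that
    identifies the partner organization."""
    reader = csv.DictReader(io.StringIO(member_csv_text))
    rows = []
    for member in reader:
        rows.append({
            "email": member["email"],
            "real_name": member.get("real_name", ""),
            "credentials": member.get("credentials", ""),
            "org_token": org_token,                      # mandatory: identifies the organization
            "member_token": secrets.token_urlsafe(16),   # one-time confirmation token
        })
    return rows

def invitation_link(base_url, row):
    """Build the confirmation link embedded in the invitation e-mail,
    pointing to the credentials attachment page (URL is hypothetical)."""
    params = {"org": row["org_token"], "token": row["member_token"]}
    return base_url + "?" + urlencode(params)
```

Because the script only depends on the Python standard library, the same code can run unchanged on the partner organization's side (token generation and e-mail sending) or on the Wikimedia side (token import).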

Credentials attachment page

 * If the user is logged in, the page displays the fields required for the confirmation. If they prefer to use an alternate account for reviews, they can also create one and attach their credentials and affiliation to it. The fields are:
  * Organization (non-editable, filled)
  * Real name (mandatory, editable, filled if possible)
  * Credentials (optional, editable, filled if possible)
  * Link to university page or biography (optional, editable, filled if possible)
 * If the user isn't logged in, the page offers the possibility to log in or sign up, then confirm the affiliation.

Once the credentials are attached to the account, the user will be able to edit the information they entered in their preferences.

Location
Because articles can grow fairly long, it is better to let the reviewer scroll through the article while they are reviewing it, keeping the review fields always visible. Furthermore, the review interface should not hide the content being reviewed; the reviewer needs to be able to read the article while writing their review (this means the review should not be displayed in a modal overlay).

Two locations will be built and tested:
 * In the left sidebar: the engagement rating (and a link to the reviews page) will always be shown. When the user submits a rating, the sidebar will expand to show the rest of the review form.
 * As a right-side fixed-position feedback tab: when the feedback tab is clicked, it opens a column on the right side of the page that contains the review form.

Ideally, both columns should be resizable.

A/B testing these locations and comparing their success will help determine which one is best. Criteria may be: user engagement, quality of reviews, community perception.

Detailed description
The following table lists the default issues, along with a longer description to be used as a tooltip.

Possible future steps
The following will not be built for the first version of the tool, but may be considered for future iterations.
 * Different engagement ratings, to be A/B tested later: other Likert scales, a binary flag (e.g. thumbs up/down) or a question ("Was it useful?", "Did you find what you were looking for?")
 * Allow the reviewer to disclose a conflict of interest (e.g. if they have significantly edited the article themselves, which could be automatically assessed, or if they're particularly biased on the topic itself).

Quality page and review management
A special page will be built to display and act on reviews. The page will contain several spaces, for example organized in the form of tabs.

The spaces will be:
 * A summary screen with navigable charts, to display aggregated data.
 * A list of reviews, where users can act on them
 * A list of praising reviews
 * The recycle bin of reviews.

Summary screen
The final type and form of the charts remain to be determined. The first charts will show the evolution of reviews over time. Reviews include praise, issues and abuse reports (which are issues, but should be visually recognizable).

Two main views will be built: one that plots reviews versus time (days, months, etc.), and one that plots them versus article revisions. Both charts will have an interactive component to allow the user to get more information, for example by displaying an information bubble on mouse hover. The additional information can contain links to a specific article revision, or to a set of reviews made on a specific day.
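A minimal sketch of the aggregation behind the reviews-versus-time view, assuming each review record carries a kind (praise, issue, or abuse report) and a date; these field names are illustrative only:

```python
from collections import Counter
from datetime import date

def reviews_per_day(reviews):
    """Count praise, issue and abuse-report reviews per day, so the three
    series can be plotted separately (abuse reports are issues, but must
    remain visually recognizable in the chart)."""
    series = {"praise": Counter(), "issue": Counter(), "abuse": Counter()}
    for review in reviews:
        series[review["kind"]][review["date"]] += 1
    return series
```

The reviews-versus-revisions view would use the same shape of aggregation, keyed by revision ID instead of date.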

List of reviews
Users will have the ability to sort or filter the list of reviews for a given article, for example by date (to show the latest review first), by reviewer, by usefulness, by status, etc.



Triagers will be responsible for assessing the incoming reviews and acting depending on their content. The goal will be to surface the particularly relevant content from the quantity of reviews. Existing processes for treating inappropriate text (personal attacks, personally identifiable non-public information, libel, etc.) will continue to apply.

Actions available for reviews:
 * Mark as patrolled: The review doesn't require follow-up, is unspecific, or mentions an issue that was resolved.
 * Move to the recycle bin: The review consists of spam, nonsense, or tests.
 * Promote to the talk page (and thus autopatrol): The review is relevant, useful, and will raise an actionable issue that needs to be addressed.
 * (for administrators only) Delete / restore

Automatic actions:
 * Automatically patrol reviews that consist only of ratings (no text)
 * Automatically patrol reviews that were promoted to the talk page.
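The two automatic rules above could be expressed as a small function applied to each incoming or updated review; the field names used here are assumptions:

```python
def auto_patrol(review):
    """Apply the two automatic patrol rules: reviews consisting only of
    ratings (no free text) and reviews already promoted to the talk page
    are marked as patrolled. Other reviews are left for manual triage."""
    has_text = bool(review.get("text", "").strip())
    if not has_text or review.get("promoted_to_talk"):
        review["status"] = "patrolled"
    return review
```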

Promotion of the review to the talk page
Constructive criticism and particularly relevant reviews about an article will be manually promoted to the talk page. Each community will need to agree on guidelines for promoting a review to the talk page.


 * The review will indicate that it was promoted to the talk page, when and by whom.
 * The text appearing on the talk page will contain:
  * A link to the review
  * The name of the reviewer, and the date of the review
  * The free-text comment of the review
  * The name of the promoter, and the date of the promotion
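A sketch of how the talk-page text could be rendered from a promoted review; the section heading, the link target, and the field names are all hypothetical:

```python
def promotion_text(review, promoter, promotion_date):
    """Render the wikitext block posted to the talk page when a review is
    promoted. The Special page name in the link is a placeholder."""
    return (
        "== Promoted review ==\n"
        "* Review: [[Special:ExtendedReview/{id}|review #{id}]]\n"
        "* Reviewer: {reviewer}, reviewed on {date}\n"
        "* Comment: {comment}\n"
        "* Promoted by {promoter} on {pdate}\n"
    ).format(
        id=review["id"],
        reviewer=review["reviewer"],
        date=review["date"],
        comment=review["comment"],
        promoter=promoter,
        pdate=promotion_date,
    )
```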

Praise and recycle bin
The "Praise" and "Recycle bin" lists will be similar to the list of reviews, except that they will display a subset of it. The actions will be similar; some will be disabled or reversed depending on the context.

Possible future steps
The following will not be built for the first version of the tool, but may be considered for future iterations.
 * Public list of one's reviews: Being able to showcase one's work on Wikipedia is a factor encouraging the participation of some "experts". This could take the form of a special page (e.g. )
 * If the volume of reviews warrants it, investigate automated review aggregation
 * Ability to filter reviews by source of knowledge and/or verified credentials.
 * A quality indicators box on the article page itself, to show information quality to readers. The system could use heuristics to provide a summary of quality information "at a glance". Plug-ins could be added to display other kinds of quality information or tools. The box could be an entry point leading to more detailed information such as reviews.

API
Every piece of data that is available through the user interface can be accessed through the API, as can every action that can be performed.
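For example, fetching the reviews of a page might look like the following query against the MediaWiki action API; the list module name and its parameters (`extendedreviews`, `ertitle`, `erlimit`) are purely illustrative, since the actual API module is yet to be defined:

```python
from urllib.parse import urlencode

API_ENDPOINT = "https://en.wikipedia.org/w/api.php"  # example wiki

def build_review_query(page_title, limit=50):
    """Build an action=query request URL for a hypothetical
    'extendedreviews' list module (module and parameter names are
    placeholders, following MediaWiki API naming conventions)."""
    params = {
        "action": "query",
        "list": "extendedreviews",   # hypothetical module name
        "ertitle": page_title,
        "erlimit": limit,
        "format": "json",
    }
    return API_ENDPOINT + "?" + urlencode(params)
```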