Article feedback/Research/Rater expertise

A summary of the analysis of rater expertise data as of May 2011 can be downloaded here.

Version 2 of the Article Feedback Tool allows raters to optionally specify whether they consider themselves "experts" on the topic of the article they are rating. Although this feature relies entirely on self-identified expertise, the rationale for this initial experiment is that tracking expertise will allow us to:
* capture quality feedback generated by different categories of users;
* target different categories of raters with potentially different calls to action.

We have collected rater expertise data since the deployment of AFT v.2, which was applied to a sample of approximately 3,000 articles of the English Wikipedia. We observed the volume of ratings submitted by different categories of users, as well as the breakdown of users by expertise type.

Despite significant changes in the UI in AFT v.2 and a different sample of articles to which the tool was applied, the proportion of ratings by user category has remained unchanged since AFT v.1. A sample of approximately 3,000 articles of the English Wikipedia generated almost 50,000 unique ratings in 7 weeks, the vast majority of which (95.2%) were submitted by anonymous readers (or editors who were not logged in when the ratings were submitted). Ratings of multiple articles by the same anonymous user were rare (2.2%), but we submit this might be an artifact of the small number of articles in the sample (less than 0.01% of the total number of articles in the English Wikipedia).