Contributors/Metrics

Like any team, the Editing department will do better if we have a set of clear, well-thought-out performance indicators that we can use to evaluate our work.

Key metrics
All of these metrics are currently calculated using manual queries on our editor-month table. Historical values are available in an online spreadsheet.
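
As a rough illustration of what one of those manual queries does, here is a minimal Python sketch that recomputes global active editors from a CSV export of the editor-month table. The file name, the column names (month, user_id, content_edits), and the five-edit threshold are assumptions for illustration, not the real schema.

```python
# Sketch only: global active editors per month from an assumed editor-month export.
import csv
from collections import Counter, defaultdict

def global_active_editors(path, threshold=5):
    """Count distinct editors with >= threshold content edits per month, summed across wikis."""
    edits = defaultdict(int)  # (month, user_id) -> content edits across all wikis
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            edits[(row["month"], row["user_id"])] += int(row["content_edits"])
    return dict(Counter(month for (month, _user), n in edits.items() if n >= threshold))

if __name__ == "__main__":
    print(global_active_editors("editor_month.csv"))
```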

Consumers

 * WMF quarterly reports
 * Editing team quarterly reviews
 * Office wiki product updates page
 * Wikistats 2.0?

Considerations
Some of these metrics, like global active editors and global mobile edits, are mostly out of our control. If we see, say, the global active editors number trend upwards over six months around the time we roll out a headline new feature, we have almost no way to know whether the new feature was responsible, and many reasons to suspect that it wasn't (because so many other things influence the metric). Whether they go up, go down, or stay flat, such metrics are interesting but not actionable. As Aaron Halfaker puts it, we might build a great windmill and write it off as a failure because the wind isn't blowing (or the reverse).

So, these numbers have limitations as metrics for our department. However, there's no question that it's highly important for the WMF and for the movement as a whole to have these numbers, and it doesn't seem that there's any better team to take responsibility for them than us.

However, there may still be opportunities to pick better metrics within these constraints. For example, focusing on rates (like the rate at which editors who've registered become contributors) instead of absolute numbers would take the high-level "winds" (e.g. people's interest in Wikimedia projects) as given and focus instead on how efficiently we convert that interest into our desired outcomes.
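
For instance, a registration-to-editor activation rate could be computed as below; the function and the numbers are purely illustrative, not actual data.

```python
# Illustrative only: a rate-based metric that treats overall registration volume
# as given and measures how well we convert it. The example numbers are made up.
def activation_rate(registrations, new_editors):
    """Share of newly registered accounts that went on to edit."""
    return new_editors / registrations if registrations else 0.0

print(f"{activation_rate(500_000, 60_000):.1%}")  # hypothetical month -> 12.0%
```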

In addition, Editing contains a fairly heterogeneous group of teams. It's not clear that one metric, or four, can be an actionable guide for all of them at once.

Barometer projects
Sometimes it's necessary or desirable to track individual projects rather than global numbers. However, it's practically impossible to track hundreds of projects individually, so it's helpful to have a short list of specific projects to focus on. There are a number of considerations involved. One very simple solution is the six Wikipedias in the official languages of the United Nations, which offer:
 * Large size, which makes the data less noisy.
 * A mix of mature and developing projects.
 * Global diversity.
 * A mix of Latin and non-Latin scripts.

The list, with each wiki's May 2015 active contributor count and its rank by that measure:
 * English Wikipedia (31 601, 1st)
 * French Wikipedia (4 602, 3rd)
 * Spanish Wikipedia (4 318, 4th)
 * Russian Wikipedia (3 315, 6th)
 * Chinese Wikipedia (2 378, 8th)
 * Arabic Wikipedia (944, 12th)
 * Should we also include Commons? It's not clear that most metrics will apply well to it.
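
If we adopt these, the same kind of query can simply be restricted to the barometer wikis. A minimal sketch, reusing the assumed editor-month export and column names from the sketch above; database names follow the usual convention.

```python
# Sketch only: per-wiki active editor counts limited to the six barometer wikis.
import csv
from collections import Counter

BAROMETER_WIKIS = {"enwiki", "frwiki", "eswiki", "ruwiki", "zhwiki", "arwiki"}

def barometer_active_editors(path, threshold=5):
    """Editors with >= threshold content edits, counted per (wiki, month)."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["wiki"] in BAROMETER_WIKIS and int(row["content_edits"]) >= threshold:
                counts[(row["wiki"], row["month"])] += 1
    return dict(counts)
```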

Apr–Jun 2015 quarterly report
Here.
 * Participation
   * Sign-ups
   * New editors
   * Active editors
 * Content
   * New articles
   * Edits