Wikimedia Apps/Short descriptions/Research

To understand how users of the Android app used this feature to contribute, and for an initial assessment of its effect on Wikidata content and on the workload of recent changes patrollers on Wikidata, we analyzed data from two sources: the (publicly available) edit history on Wikidata, and an EventLogging instrumentation (Schema:MobileWikiAppEdit). Below are the results from (mostly) the former.

The feature had been available in the alpha version of the app since late 2016 (and there were earlier test implementations, also on iOS). On February 10, 2017, it was enabled in the beta version on the Russian, Hebrew and Catalan Wikipedias, and on February 28 rolled out in the production release, still restricted to these three languages. On April 24, it was activated for all but the ten largest languages, and on July 5 for all except English. By June 21, 20,193 description edits had been made using the feature since the beta rollout. By May 5, 2018, that number had risen to 168,695, with 8.5% of them having been reverted.
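For reference, the kind of revert detection underlying these figures can be sketched as follows. This is a minimal illustration of the common identity-revert heuristic over per-revision content hashes; the function name, its inputs, and the window size are assumptions for the sketch, not the exact logic of the analysis notebooks:

```python
def was_reverted(hashes, index, window=5):
    """Identity-revert heuristic: the edit at `index` counts as reverted if a
    later revision (within `window` steps) restores the exact content of a
    revision that preceded the edit. `hashes` holds the per-revision content
    hashes (e.g. SHA-1) of one page's history, oldest first."""
    earlier = set(hashes[:index])
    for later in hashes[index + 1 : index + 1 + window]:
        if later in earlier:
            return True
    return False

# Illustrative history: the fourth revision restores the second one's content.
history = ["h0", "h1", "h2", "h1"]
print(was_reverted(history, 2))  # True: the "h2" edit was undone
print(was_reverted(history, 1))  # False: "h1" was restored, not reverted
```

Note that this heuristic is exactly why "reverted" and "vandalism" are not the same thing (see the caveats below): any edit undone for any reason is counted.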

Results from the initial rollout (Russian, Hebrew and Catalan)
From the time of the language-limited beta rollout on February 10 until April 12, 5,891 description edits were made using the app, by between 20 and 57 distinct users (including IPs) each day. For the three languages enabled in production, edits from the app already made up a substantial share of all manual description edits (i.e. edits from the standard web interface, measured by excluding edits from easily detectable bots and edits made via external tools such as QuickStatements or reCH): 28.9% in Russian, 60.8% in Hebrew, and 28.1% in Catalan (referring to the period from March 1 to April 12).
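The share computation above can be sketched along these lines. The record fields, the bot/tool-detection heuristics, and the sample data are illustrative assumptions, not the exact filters used in the notebooks:

```python
# Sketch: classify description edits and compute the app share per language.
# In the real analysis the records would come from the Wikidata edit history
# (e.g. via the replica databases accessible from PAWS).

def is_manual(edit):
    """Exclude easily detectable bots and external tools (illustrative heuristics)."""
    if edit["is_bot"]:
        return False
    if edit["comment"].startswith("#quickstatements"):  # tool-tagged edit summary
        return False
    return True

def app_share(edits, language):
    """Fraction of manual description edits in `language` that came from the app."""
    manual = [e for e in edits if e["language"] == language and is_manual(e)]
    app = [e for e in manual if e["via_app"]]
    return len(app) / len(manual) if manual else 0.0

# Illustrative sample records, not real figures:
sample = [
    {"language": "ru", "is_bot": False, "comment": "", "via_app": True},
    {"language": "ru", "is_bot": False, "comment": "", "via_app": False},
    {"language": "ru", "is_bot": True,  "comment": "", "via_app": False},
    {"language": "ru", "is_bot": False, "comment": "#quickstatements", "via_app": False},
]
print(app_share(sample, "ru"))  # 0.5: two manual edits, one of them via the app
```

The bot and quickstatements records are dropped first, so the denominator contains only the two manual edits.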

4.6% of the app edits made between February 10 and April 12 were reverted, which is higher than the rate for manual Wikidata description edits in general (e.g. 1.2% in the week from February 1 to February 7, 2017). On the other hand, this revert rate is considerably lower than e.g. on the English Wikipedia (8.1% for all edits including bots, and 29.0% for anonymous edits, per data from November 2015). Here is a language-specific breakdown for the three languages that were enabled first in the production version:

Results until August 2017 (rollouts to all but English)
54,047 description edits were made via the app from February 10 to August 27, 2017; 5.5% of them were reverted. The revert rate was much lower for logged-in users (2.0%) than for anonymous users (12.6%, which is nevertheless still far below the aforementioned 29.0% for anonymous edits on the English Wikipedia).
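The split by user type can be computed in the same vein. In this sketch, anonymous editors are identified by an IP-address username check, and the field names and sample records are illustrative assumptions:

```python
import ipaddress

def is_anonymous(username):
    """Treat IP-address usernames as anonymous editors (illustrative heuristic)."""
    try:
        ipaddress.ip_address(username)
        return True
    except ValueError:
        return False

def revert_rates(edits):
    """Revert rate overall and split into logged-in vs. anonymous edits."""
    def rate(subset):
        return sum(e["reverted"] for e in subset) / len(subset) if subset else 0.0
    anon = [e for e in edits if is_anonymous(e["user"])]
    logged_in = [e for e in edits if not is_anonymous(e["user"])]
    return {"all": rate(edits), "logged_in": rate(logged_in), "anonymous": rate(anon)}

# Illustrative sample, not real data:
sample_edits = [
    {"user": "ExampleUser", "reverted": False},
    {"user": "ExampleUser", "reverted": False},
    {"user": "203.0.113.5", "reverted": True},
    {"user": "203.0.113.5", "reverted": False},
]
print(revert_rates(sample_edits))  # {'all': 0.25, 'logged_in': 0.0, 'anonymous': 0.5}
```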

During that time, 50 users made more than 100 edits each (one of them almost 2,200). This shows that the feature is capable of attracting power users. However, we have been seeing a slight downward trend in the overall number of daily edits after each rollout, which indicates a need to address the problem that users who like adding new descriptions can soon run out of easily findable opportunities to add more (T164606).

Here are the edit numbers and revert ratios by language for app description edits during that timespan, for the ten languages that had seen the most edits at that point:

During August 2017, 22.6% of the edits made via the app were Wikidata description edits. Most of these (16.4 percentage points of the 22.6%) added a new description rather than changing an existing one.

Some notes and caveats

 * As always, it is important to note that not all reverted edits are vandalism.
 * Conversely, as with all Wikidata edits, not all vandalism may have been detected and reverted.
 * Members of the Reading team (in particular Dmitry and Tilman) were monitoring the edits from the app frequently, to make sure the feature was not becoming too disruptive, and in the process reverted many vandalism edits themselves (in a volunteer capacity). This drove up the revert rate compared to the baseline of normal description edits that did not receive such extra scrutiny. Specifically, of the 4.6% of edits that were reverted, 1.4 percentage points were reverted by members of the Reading team.
 * The Wikipedia revert rates are included above as a useful informal comparison, which however needs to be taken with various grains of salt (e.g. the cited total revert rates for Wikipedia do not exclude bots and are thus likely lower, and they cover all namespaces instead of just one particular content type).
 * Data sources, calculations and further details are documented in several public PAWS notebooks (split due to technical issues): 1, 2, 3, 4.