Reading/Readers contributions via Android

About
 What kinds of tasks could be done by readers or casual contributors that would support editors at the same time?

After the work on the community of readers and the user interaction consultation, the Reading team would like to move forward with a conversation that helps the team frame a theme of products where readers actively contribute some of the required editing tasks. Below, we have created wireframes to demonstrate a subset of the ideas mentioned in our first consultation on this subject.

You can check the Reading team's thoughts on the topic here; below is a list of suggested tasks and their evaluation against some criteria. Please feel free to add more ideas, add to the criteria, or comment on the talk page.

Kindly note that all ideas need organization and planning around content moderation, something we would need to discuss seriously if any of these ideas are to move towards implementation.

Recording audio titles:
Adding an appropriate audio micro-contribution capability to the mobile apps, where there is an increasing readership, as a way to foster more usage of and contributions to audio content on mobile. New activities might inspire the existing reader community with new experiences and convert more mobile readers to editors.
 * Goal

When I come across an article with a hard-to-pronounce title that I am familiar with, I want to contribute an audio sample of my pronunciation so that other readers unfamiliar with IPA are able to hear the correct pronunciation.
 * User Story

User contributes a recorded pronunciation of the article's title using their mobile device's microphone (e.g. en:Eyjafjallajokull, which can be linked in Wikidata, e.g. D:Q39651#P443).
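Once the recording is uploaded to Commons, linking it to the item uses the standard Wikibase `wbcreateclaim` API with the P443 ("pronunciation audio") property. A minimal sketch of building those request parameters is below; the class and method names are illustrative, and the CSRF token/upload steps are assumed to have happened elsewhere.

```java
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Sketch: build the wbcreateclaim parameters for attaching an
 * already-uploaded Commons pronunciation file to a Wikidata item
 * via P443 ("pronunciation audio"). Helper name is hypothetical;
 * the parameter names follow the standard Wikibase action API.
 */
public class PronunciationClaim {

    public static Map<String, String> buildClaimParams(
            String itemId, String commonsFileName, String csrfToken) {
        Map<String, String> params = new LinkedHashMap<>();
        params.put("action", "wbcreateclaim");
        params.put("entity", itemId);        // e.g. "Q39651"
        params.put("property", "P443");      // pronunciation audio
        params.put("snaktype", "value");
        // commonsMedia values are JSON-encoded strings:
        // the file name without the "File:" prefix
        params.put("value", "\"" + commonsFileName + "\"");
        params.put("token", csrfToken);
        params.put("format", "json");
        return params;
    }
}
```

These parameters would then be POSTed to the wiki's `api.php` endpoint by the app's networking layer.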

View the PDFs showing the upload and moderation workflow of this idea, or alternatively view an Interactive mockup (on Invision).
 * How do we know we’ve been successful?
 * Increase in quality (i.e., non-vandalism) audio contributions
 * Increase in usage of the listen feature

Image contributions:
Adding an appropriate image via direct upload through the Android app. The entry point for the image upload could follow one of the two scenarios elaborated below:
 * Goal

Add image via Nearby


If a user has allowed our app to access their geolocation, the app could send them a notification that there is a Wikipedia entry near them that needs an image. This is a handy feature that you can opt into if you are sightseeing and have time, and it could also work well for an organized photo walk or a hackathon-style activity.
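The client-side part of this notification reduces to a filter: given the user's location and a set of candidate articles (coordinates plus a "has lead image" flag, obtainable from the MediaWiki geosearch and pageimages APIs), keep only image-less articles within some radius. A sketch under those assumptions, with an illustrative data class and haversine distance:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Sketch: filter nearby articles that are missing a lead image.
 * The Article class and the radius threshold are illustrative
 * assumptions, not the app's actual data model.
 */
public class NearbyImageCandidates {

    static class Article {
        final String title;
        final double lat;
        final double lon;
        final boolean hasLeadImage;

        Article(String title, double lat, double lon, boolean hasLeadImage) {
            this.title = title;
            this.lat = lat;
            this.lon = lon;
            this.hasLeadImage = hasLeadImage;
        }
    }

    /** Great-circle distance in kilometres (haversine formula). */
    static double distanceKm(double lat1, double lon1, double lat2, double lon2) {
        final double earthRadiusKm = 6371.0;
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * earthRadiusKm * Math.asin(Math.sqrt(a));
    }

    /** Keep only image-less articles within radiusKm of the user. */
    static List<Article> needingImages(double userLat, double userLon,
                                       List<Article> candidates, double radiusKm) {
        List<Article> result = new ArrayList<>();
        for (Article a : candidates) {
            if (!a.hasLeadImage && distanceKm(userLat, userLon, a.lat, a.lon) <= radiusKm) {
                result.add(a);
            }
        }
        return result;
    }
}
```

In practice the geosearch API already supports a radius parameter, so the server could do most of this filtering; the sketch just shows the logic that decides when a notification is worth firing.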

You can check an Interactive mockup of the idea in the link.

Add/edit lead image to the article:
The idea: if someone is reading an article that is missing a lead image and they have a suitable image on their device, they can add it to the article.

This was requested in the 2016 community wishlist: https://meta.wikimedia.org/wiki/2016_Community_Wishlist_Survey/Categories/Mobile_and_apps#Uploading_from_a_mobile_phone

You can check an Interactive mockup of the idea in the link.

Moderation queue for image submissions
Supporting relatively high-visibility image contributions via mobile devices also requires the ability to moderate those contributions in the app, through the introduction of a dedicated moderation queue/dashboard.

Mockup of a proposed Moderation queue

You can check an Interactive mockup of the idea in the link.

Lead image editing:
Goal: Improve the quality of the lead image feature, where images are sometimes poorly cropped.

This is a lower-impact image feature that improves the display of an article's lead image by allowing users to update the crop of, or choice among, images already present in the article.

A. Update the crop of the lead image on an article

B. Change the lead image to another image within the article

You can check an Interactive mockup of the idea in the link.

Article Feedback:
The idea is to give a voice to readers on a platform that is predominantly used by readers (the mobile apps), and to increase reader contributions from mobile app users via micro-contributions. By introducing a low-effort feedback mechanism for readers that gives value back to editors (providing recognition and an impetus to improve the quality of articles for their audience), readers then benefit from seeing others’ feedback, which may help inform their reading choices, and secondarily they become more aware of the editorial aspect of Wikipedia.

A specific version of this was proposed and discussed in the 2016 community wishlist: https://meta.wikimedia.org/wiki/2016_Community_Wishlist_Survey/Categories/Reading#Readers_comments_and_vote

You can check an Interactive mockup of the idea in the link.

Variation: Readers thank Editors
A variation of this idea is to allow users to thank editors at the end of an article. This has essentially the same benefits as Article feedback in showing appreciation to editors and promoting the fact that Wikipedia is editable, but it addresses some of the concerns raised in the past about users voting negatively based on an article's topic rather than its content quality, by removing the Good/Bad voting aspect.

You can check an Interactive mockup of the idea in the link.

Questions about articles
Make use of a quick-contribution format by asking short questions, suitable for mobile readers, that help improve Wikipedia content. We would do this by engaging the existing reader community with new experiences that allow them to contribute to improving Wikipedia.

This project overall asks humans to review and categorize article content. The resulting output could be used for two Wikimedia initiatives:

A. Wikilabels for mobile
Users ‘label’ or categorize articles and edits to help ‘train’ Wikipedia’s machine learning system (ORES).

Currently, there are three ‘campaigns’ asking for labeling in Wikilabels:
 * 1) Article topic – categorize an article as Academic and/or Pop culture by reading the article.
 * 2) Edit type – categorize the likely intention(s) of an edit, and whether it was an addition, modification, and/or removal, by looking at the diff of an article edit.
 * 3) Edit quality – categorize whether an edit was damaging and whether it was made in good faith by looking at the diff of an article edit.

For categorization on mobile devices by readers, it seems only apt to start with the first (article topic) campaign, since the majority of readers would be unfamiliar with edit diffs, and reviewing diffs would be outside the normal workflow of a mobile reader.
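The article-topic campaign above is a multi-select question (Academic and/or Pop culture), so the reader's answer reduces to a small label object submitted back to Wikilabels. A minimal sketch; the field names here are illustrative, not the exact Wikilabels schema:

```java
import java.util.EnumSet;
import java.util.Set;

/**
 * Sketch: serialize a reader's article-topic answer into a label
 * payload. Field names ("task", "academic", "pop_culture") are
 * assumptions for illustration, not the real Wikilabels form schema.
 */
public class ArticleTopicLabel {

    enum Topic { ACADEMIC, POP_CULTURE }

    static String toLabelJson(int taskId, Set<Topic> topics) {
        StringBuilder sb = new StringBuilder();
        sb.append("{\"task\":").append(taskId).append(",");
        sb.append("\"academic\":").append(topics.contains(Topic.ACADEMIC)).append(",");
        sb.append("\"pop_culture\":").append(topics.contains(Topic.POP_CULTURE)).append("}");
        return sb.toString();
    }
}
```

Because the answer is a couple of booleans rather than free text, it needs no moderation queue of its own and fits a one-tap mobile interaction.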

You can check an Interactive mockup of the idea in the link.

B. WikiGrok
Users confirm metadata about an article; the confirmed data would eventually be stored in Wikidata.

The variation in this mockup is that readers are asked optional questions in a section at the end of the article, since the presumption is that the knowledge gleaned from reading the article makes them both more engaged and better able to answer metadata-type questions.

You can check an Interactive mockup of the idea in the link.

How do we measure success of this idea?
 * Reduction in the Wikilabels queue for article categorizations, or
 * Increase in the quality and content of article metadata in Wikidata

Report an issue
Help empower mobile readers to make minor contributions that improve the quality of article content and display, which may ultimately convert some readers into more active editors.

You can check an Interactive mockup of the idea in the link.