Wikimedia Apps/Team/iOS/Breaking Down the Wall


Overview

Making Wikipedia Legible

This year the iOS team is working toward the goal of breaking down the wall between our communities and editing processes and the average reader. Our focus is on three potential experimental user experiences for the article view in the app, each intended to help readers understand how Wikipedia works without losing trust in the encyclopedia.

The intent of this work is to help readers better understand key elements of the Wikimedia content process, to empower them to participate, and to improve their ability to discern between trustworthy and untrustworthy content on the platform, all without losing brand trust.

The Public Trust

Trust in our brand and trust in the quality of our content are vital to our mission and movement strategy. At the same time, we also want people to understand the basic editing model. Knowing the basics of how Wikipedia "works" is key to both effective participation and consumption. Research shows that, even in places where Wikipedia is well known, the open and participatory nature of the system is not well known. But research also shows that, when first informed about the open nature of the editing systems, many users lose trust in the content and brand. This is particularly evident in research from places newer to the internet. This makes sense: recall hearing for the first time that anyone on the internet can edit Wikipedia. You would naturally be suspicious of how valid the content could be.

But we also know from our own experience that the system does mostly work, and that there are specific reasons why: safeguards like references to reliable sources and transparent editing history. This also aligns with our Perspective on Trust, which explains the value of owning up to your faults and mistakes when building and re-building trust for any consumer brand. From these two findings (that trust initially decreases, but increases with greater understanding and transparency) we conclude that there is a trough of disillusion that we need to support users through, and that ultimately, given more legibility into the process, they will both understand and trust Wikipedia more.

Discernment in a time of Disinformation

Our aim, then, is to help users understand how Wikipedia works: breaking down the wall between merely reading content and knowing how it's made. But we do this work in a larger media landscape, one in which knowing what information to trust, and understanding the biases of your information sources, is more critical, but also more difficult, than ever. So while nurturing trust in Wikipedia and savvy consumption of its content are our goals, we do not want to "push" users to trust all the content they read on Wikipedia. As Wikimedians, our goal is to enable users to decide for themselves what content is trustworthy. Rather than increasing trust in all content, our goal is to increase readers' ability to discern which content is trustworthy and which is more questionable, by making the ways that content is generated more legible.

By focusing on "discernment" of good and bad information as a skill to nudge and develop in users, rather than on "raising trust", we believe we will actually increase trust in the brand: not by hiding our weaknesses and deficiencies, but by placing them in context and showing that you, too, can fix them. We also believe this work could have impact beyond our projects: if we can find ways to make users savvier consumers of information and more aware of concepts such as verifiability, we could help them better navigate the new landscape of disinformation.

Product Theory

Through clear design and enriched presentation of article meta-data we can:

  1. Increase readers' understanding of Wikipedia as a participatory and active system
  2. Improve readers' discernment of the trustworthiness of the content they are reading
  3. Improve readers' overall confidence in Wikipedia as a source

Program Goals

  • Run experiments on English Wikipedia to determine if our product theory has merit
  • Build prototype UIs and support APIs based on collaborative design and development processes
  • Use variant testing methods to expose these prototypes to iOS app users
  • Measure and report findings in order to guide future investment in this strategic area

Methodology

For this program we'll deliver limited experimental versions of potential new experiences, presenting user surveys before and after a user is exposed to each experimental feature. We'll use these survey results, along with engagement metrics for those features, to determine whether any of the product theories are supported.

Baseline Survey

Because of the interrelation between system understanding, quality discernment, and trust, we are asking about user attitudes in all three areas as part of the survey. More complete information about the content of the survey can be found here. After a short delay, to allow some basic reading and assessment of the article, the user will be presented with the option to take a short survey. The survey will be presented as a full-screen call to action using the existing app announcements system.

Targeting Articles

In order to reduce the number of variables and potential confounding factors, and also to limit the cost and scope of prototype development, we will only survey and variant-test users on a limited set of high-volume articles. This allows us to "fix" one variable (the content) while still randomly sampling users. Article selection will be based on two factors, both derived from pageview counts: "evergreen" articles, which have high pageview ranks over a longer period (a year or more), and "highly topical" articles, which have high pageview ranks in the recent past. This gives us two known, but different, contexts in which to compare user reactions and results, and represents two of the most common consumption patterns on Wikipedia.
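As a minimal sketch of how this two-factor selection might work: the function below splits candidate articles into the "evergreen" and "highly topical" sets from two rank tables. The article names, rank values, and the top-100 threshold are all illustrative assumptions; the real selection would be driven by actual pageview data.

```python
# Sketch: classify candidate articles as "evergreen" or "highly topical"
# from pageview ranks. Data and thresholds are hypothetical.

def select_targets(yearly_ranks, recent_ranks, top_n=100):
    """Split articles into evergreen (highly ranked over a year or more)
    and topical (highly ranked recently, but not evergreen)."""
    evergreen = {a for a, r in yearly_ranks.items() if r <= top_n}
    topical = {a for a, r in recent_ranks.items()
               if r <= top_n and a not in evergreen}
    return evergreen, topical

# Hypothetical rank data (lower rank = more pageviews)
yearly = {"United States": 3, "Cat": 40, "Election 2020": 800}
recent = {"Election 2020": 5, "United States": 10, "Cat": 120}

evergreen, topical = select_targets(yearly, recent)
print(sorted(evergreen))  # ['Cat', 'United States']
print(sorted(topical))    # ['Election 2020']
```

An article that ranks highly in both windows counts as evergreen, so the two target sets stay disjoint and each surveyed user falls into exactly one context.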

Exposure

In all cases the experimental UI will be exposed to users in the context of a targeted article. The experiments may vary slightly in how they are first shown to the users, but in order to, again, limit the number of variables, we won't be presenting calls to action outside the articles or using other means of raising awareness of these features. This may limit or slow the amount of data collection, but allows us to focus on the core product theory first.

Post Survey and Measurement

After a user has exited an experimental feature, they will again be prompted to take our survey. In addition, we'll use engagement analytics (for opted-in users) to track interactions. Together, these two data sets of self-reported qualitative responses and quantified engagement metrics will provide a measurement of both user interest and impact for each prototype. We'll use these measurements to analyze if, and how, our product theories are supported.
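The paired pre/post comparison at the heart of this measurement can be sketched as follows. The user IDs, the 1-5 response scale, and the simple mean-shift statistic are assumptions for illustration; the actual analysis would be defined by the survey instrument.

```python
# Sketch: average per-user change on a survey question, comparing
# baseline responses to post-exposure responses. Only users who
# answered both surveys are included. Data is hypothetical.

def mean_shift(pre, post):
    """Mean of (post - pre) over users present in both surveys."""
    paired = [post[u] - pre[u] for u in pre if u in post]
    return sum(paired) / len(paired) if paired else 0.0

pre  = {"u1": 3, "u2": 4, "u3": 2}   # baseline scores (1-5 scale)
post = {"u1": 4, "u2": 4, "u3": 4}   # scores after using the feature
print(mean_shift(pre, post))  # 1.0
```

Pairing responses per user, rather than comparing group averages, controls for the fact that users who opt into the post survey may differ from the baseline population.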

Planned Experiments

Based on an extensive review of existing research in this area, the team used a design sprint process to conceive three initial experimental experiences on English Wikipedia. Each user story is given below along with an initial design proposal. Much more detail can be found on the Phabricator tickets.

Source Navigator

Development task: https://phabricator.wikimedia.org/T241254

Story: As a reader of Wikipedia, I would like to know more about the citations used in the articles that I read so that I can learn more about how Wikipedia is built using reliable sources and explore similar articles or resources.

Concept: Provide richer and more consumable information about the citations in an article, in order to highlight the importance of sourcing and citations for verifiability, and therefore for the trustworthiness of content.

Initial Mock:

Article Inspector

Development task: https://phabricator.wikimedia.org/T241255

Story: As a reader of Wikipedia, I would like to be able to learn more about a specific area of the article that I am reading, without having to stray too far from my current reading experience, so that I can better understand how articles are written.

Concept: Building on the "Who Wrote That" tool, developed in response to this Community Wishlist item, this experience allows the user to highlight specific text in an article and learn about its editing history, making the evolution and origins of specific content legible.

Initial Mock:

135 Article inspection mode - highlight.png

Article as a Living Document

Development task: https://phabricator.wikimedia.org/T241253

Story: As a reader of Wikipedia, I would like to see how an article has changed over time so that I can better understand how Wikipedia articles evolve over time without being overwhelmed by too much granular information.

Concept: By showing a more visual presentation of key events in an article's editing history, we can help users better consume the changes being made to an article, gain more insight into how it has evolved, and understand more clearly that articles are evolving, collaborative efforts. We think this is particularly relevant for highly controversial or current-events articles, which undergo frequent change.

Initial Mock:

A - 102 Living document.png


Further Reading

  1. The Product Department's perspective on trust, written to prepare for the Medium-Term Plan, summarizes significant research and puts forth our initial theories on content and brand trust: Wikimedia Product/Perspectives/Trust
  2. Most readers don’t know Wikipedia is editable https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2017/Sources/Brand_awareness,_attitudes,_and_usage_-_Executive_Summary
  3. Learning the system is open and participatory for the first time decreases trust in the content and brand: https://upload.wikimedia.org/wikipedia/commons/0/0c/New_Readers_research_findings_presentation%2C_September_2016.pdf slide 83
  4. Clark, Malcolm, Ruthven, Ian, Holt, Patrik, & Song, Dawei (2012). Looking for Genre: the Use of Structural Features During Search Tasks with Wikipedia. IIiX 2012 - Proceedings of the 4th Information Interaction in Context Symposium: Behaviors, Interactions, Interfaces, Systems. doi:10.1145/2362724.2362751
  5. Cox, Joseph. “Why People Trust Wikipedia More Than the News.” Vice, 11 Aug. 2014, https://www.vice.com/en_us/article/ae37ee/in-defense-of-wikipedia.
  6. Kittur, Aniket et al. “Can you ever trust a wiki?: impacting perceived trustworthiness in wikipedia.” CSCW (2008).
  7. Lucassen, T., Muilwijk, R., Noordzij, M. L. and Schraagen, J. M. (2013), Topic familiarity and information skills in online credibility evaluation. J Am Soc Inf Sci Tec, 64: 254-264. doi:10.1002/asi.22743
  8. Mothe, Josiane and Gilles Sahut. “How trust in Wikipedia evolves: a survey of students aged 11 to 25.” Inf. Res. 23 (2018): n. Pag.
  9. Okoli, C. , Mehdi, M. , Mesgari, M. , Nielsen, F. Å. and Lanamäki, A. (2014), Wikipedia in the Eyes of Its Beholders. J Assn Inf Sci Tec, 65: 2381-2403. doi:10.1002/asi.23162
  10. Teun Lucassen, Jan Maarten Schraagen, The influence of source cues and topic familiarity on credibility evaluation, Computers in Human Behavior, Volume 29, Issue 4, 2013, Pages 1387-1392, ISSN 0747-5632, https://doi.org/10.1016/j.chb.2013.01.036.
  11. "One easy way to make Wikipedia Better", The Atlantic: https://www.theatlantic.com/technology/archive/2016/04/wikipedia-open-access/479364/
  12. "We've witnessed a halving of journalists since 2008, while the number of corporate communications execs has tripled. In sum, the ratio of bullshit/spin to watchdogs has increased sixfold." - https://www.profgalloway.com/wewtf-part-deux

Give Feedback

Updates