Wikimedia Apps/Team/iOS/Breaking Down the Wall

NARRATIVE

Making Wikipedia Legible

Helping people understand basic aspects of Wikipedia’s content generation.

Intro to Content Trust

Summarize from the Perspectives doc and the design deck.

The Relationship Between the Two, and the Valley of Distrust

Content trust and system understanding are related, but not linearly.

Discernment in a Time of Disinformation

Focus on the concept of “discerning” which information is good or bad, as a skill to subtly nudge and develop in users.

Learning from Wikipedia, Learning about Wikipedia, Learning about Learning

It’s good for us to be understood, and it’s good for the user to become a more informed consumer.

GOAL

Run experiments, build prototype UIs, measure, then invest and bring successful approaches to other platforms and languages.

This is analogous to how for-profit social networks optimize for certain types of engagement and brand feeling, but here the optimization is for something that helps us, the user, and maybe even society.

THEORY

Through clear design and enriched presentation, and among users exposed to the experience, we can:

 * 1) Increase users’ understanding of Wikipedia as a participatory and active system
 * 2) Improve their discernment about how trustworthy the content they are reading is (“verified and vetted NPOV reference information”)
 * 3) Improve their overall confidence in Wikipedia as a source

METHOD

Measuring Understanding, Discernment, and Confidence

Baseline Survey

Targeting Users

Exposure
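As a sketch of what exposure assignment could look like in the app: a deterministic bucket keyed on a stable per-install identifier keeps each user in one arm (control or one of the planned experiments below) across sessions. Everything here is hypothetical; the arm names, the identifier source, and the function are illustrative assumptions, not the app’s actual implementation.

  import Foundation

  // Hypothetical arms: control (the "you can edit" baseline) plus the
  // three planned experiments listed below.
  enum ExperimentArm: String, CaseIterable {
      case control
      case sourceInspector
      case articleQualityReport
      case livingArticleHistory
  }

  // Deterministically map a stable install identifier to an arm, so a user
  // sees the same experience on every launch with no server round trip.
  func assignArm(installID: UUID) -> ExperimentArm {
      let arms = ExperimentArm.allCases
      // Fold the UUID's 16 bytes into a stable integer. Swift's Hashable
      // is seeded per process launch, so it can't be used for bucketing.
      let folded = withUnsafeBytes(of: installID.uuid) { bytes in
          bytes.reduce(0) { ($0 &* 31) &+ Int($1) }
      }
      return arms[((folded % arms.count) + arms.count) % arms.count]
  }

Deterministic assignment lets the post-exposure survey be joined to an arm without storing extra state; a real rollout would presumably also want a holdout fraction and a remote kill switch.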

Planned Experiments


 * 1) Source Inspector
 * 2) Article Quality Report
 * 3) Living Article History

Post-Exposure Survey and Measurement
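One way to make the measurement concrete: score each construct (understanding, discernment, confidence) from Likert items in both surveys, then compare each arm’s baseline-to-post change against the control arm. A minimal sketch, reusing the hypothetical ExperimentArm above, with a simple mean difference standing in for whatever statistical analysis is actually chosen:

  // Hypothetical per-participant construct scores (e.g. Likert-item means).
  struct SurveyScores {
      let understanding: Double
      let discernment: Double
      let confidence: Double
  }

  struct ParticipantResult {
      let arm: ExperimentArm
      let baseline: SurveyScores   // from the baseline survey
      let post: SurveyScores       // from the post-exposure survey
  }

  // Mean baseline-to-post change in one construct for a given arm.
  func meanDelta(_ results: [ParticipantResult],
                 arm: ExperimentArm,
                 metric: (SurveyScores) -> Double) -> Double {
      let deltas = results
          .filter { $0.arm == arm }
          .map { metric($0.post) - metric($0.baseline) }
      return deltas.isEmpty ? 0 : deltas.reduce(0, +) / Double(deltas.count)
  }

  // An experiment's effect on, say, confidence is its delta minus control's:
  //   meanDelta(results, arm: .sourceInspector) { $0.confidence }
  //       - meanDelta(results, arm: .control) { $0.confidence }

The point is only that the two surveys imply a per-construct, delta-versus-control metric; significance testing and controls for exposure intensity are left to the real analysis.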

BIBLIO/FURTHER READS

FEEDBACK

UPDATES

People don’t know it’s editable [pulling from https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/2017/Sources/Brand_awareness,_attitudes,_and_usage_-_Executive_Summary]

When they do know, they don’t trust it (research deck, slide 83: https://upload.wikimedia.org/wikipedia/commons/0/0c/New_Readers_research_findings_presentation%2C_September_2016.pdf)

Broadened participation brings a perceived risk to quality. It is not clear this is universally true: “many eyes make bugs shallow” argues otherwise, as do our values and especially the movement strategy. There is a belief that not everyone is qualified; we work under the theory that everyone has something to add but needs routing, support, and policing to raise the chances that their contribution is appropriate. We can test these assumptions, and iOS is a nicely controlled platform for doing so.

We don’t want to increase trust per se (New Readers research cites that it has no impact on consumption), but rather understanding of the signals and systems wikis use to convey both changeability and reliability.

This is the precondition of participation (the opening of the funnel), with KPIs focused on that "ripeness for editor conversion".

The base case is simply telling people they can edit. We will do this to establish the baseline, but it carries risks that will need management:


 * risk of loss of trust in content (resulting in loss of consumption, evangelism, and/or donations)
 * risk of vandalism and low-quality edits

"We've witnessed a halving of journalists since 2008, while the number of corporate communications execs has tripled. In sum, the ratio of bullshit/spin to watchdogs has increased sixfold." - https://www.profgalloway.com/wewtf-part-deux