User:Tgr (WMF)/claims

One of Wikipedia's greatest social inventions is the neutral point of view, a policy which, together with verifiability, turns contested facts about reality into consensual facts about claims about reality (what a certain group thinks about some issue, how influential that view is, how large the group is, etc.), making the process of coming up with claims fairly unbiased and controversy-free. This invention is underutilized because the smallest unit of information on Wikipedia is the article, not the claim. Articles introduce more potential for bias (the process of deciding which claims are notable enough to fit into the article is much less objective and leaves a lot of room for applying soft power), more potential for misinformation and disinformation (sourcing becomes much harder to police at the article level because it is unclear which source belongs to which claim), accessibility barriers (a significant fraction of readers are looking for "quick facts", i.e. claims, and find a wall of text instead), and degraded discourse (the talk page of a large article tends to be an unbrowsable mess, and anything that gets archived is lost forever).

Of course, articles are what an encyclopedia is about: they contextualize claims and organize them into a comprehensible narrative, so the trade-offs above are absolutely necessary. Still, the article-writing process involves the creation of claims; much as with infobox data, it is wasteful to store them only as unidentifiable substrings of the article text and to discard them forever when that text gets edited out of the article. Providing structured storage for claims (a rough sketch of what such a claim record might look like follows the list below) would enable:

  • A searchable archive of past debates, so we can avoid duplication of effort as the same arguments for the same claims get discussed over and over. Large wikis have some workarounds for this (invisible comments in the article text warning people not to add a certain claim without reading past discussions; FAQ pages summarizing past discussion of the most common claims), but these don't work very well and take a lot of effort to maintain.
  • A centralized discussion place for each claim (which might be shared between many articles and many languages).
  • Verifiability controls which are much easier to police, both manually and automatically. (This can include many things, from just making the correspondence between a piece of article text and specific locations in specific sources obvious, to patrol flags for trusted editors actually verifying the claim in the sources, to dealing with clarifications, corrections and retractions.)
  • More space for giving a nuanced explanation of a claim, without the terseness constraints of an article.
  • More accessible talk pages where one can actually find past discussion of the specific claim they want to add/change/remove, see which claims were the most contentious / resulted in the most discussion, etc. Also the possibility to automatically build FAQs or indexes of false claims (e.g. by making it possible to add refutations alongside normal sources, and to mark claims as refuted / unreliable).
  • Small content chunks which could be exposed to readers directly, in formats that are friendly to mobile and to fast consumption:
    • readers sharing facts on social media (this is already happening, but in ways that are hostile to verifiability and authorship tracking, and that do not invite readers to take part in the discussion)
    • translation or voice/video versions of key facts are easier to produce and maintain than whole-article equivalents
    • if we can annotate claims with extra information, we might be able to guess which claim a search query is about
  • A contribution workflow that's more amenable to microcontributions (no need to load the whole article into a mobile editing window) and is less adversarial (claims could serve as a staging platform for articles where content can be discussed before it becomes visible).
  • Indexing claims with stable identifiers (much as Wikidata indexes concepts and things with stable identifiers), which could expand to cover claims not directly relevant to Wikipedia articles, and could be used by external projects or potential sister projects for all kinds of things (fact-checking, combating fake/biased news by annotating the claims news articles make, exposing the logical structure of how claims depend on other claims).
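
To make the above more concrete, here is a minimal sketch of what a structured claim record could look like, written in Python purely for illustration. Everything in it is an assumption rather than a proposed schema: the field names (claim_id, statement, sources, status, discussion_page, used_in) and the status values are hypothetical stand-ins for whatever data model the community would actually settle on.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class ClaimStatus(Enum):
    """Hypothetical patrol states a claim could move through."""
    UNVERIFIED = "unverified"  # nobody has checked the claim against its sources yet
    VERIFIED = "verified"      # a trusted editor confirmed the claim in the cited sources
    DISPUTED = "disputed"      # currently under discussion
    REFUTED = "refuted"        # refutations outweigh the supporting sources


@dataclass
class SourceLocation:
    """Ties the claim to an exact place in a source, so verification is easy to police."""
    source_id: str             # e.g. a citation identifier, DOI, or URL
    locator: str               # page, section, or timestamp within the source
    refutes: bool = False      # True if this source argues against the claim


@dataclass
class Claim:
    """A claim as its own addressable unit, independent of any single article."""
    claim_id: str                          # stable identifier, analogous to a Wikidata item ID
    statement: str                         # the claim itself, in plain language
    explanation: str = ""                  # longer, nuanced context that would not fit an article
    sources: List[SourceLocation] = field(default_factory=list)
    status: ClaimStatus = ClaimStatus.UNVERIFIED
    discussion_page: Optional[str] = None  # one central discussion place shared by all uses
    used_in: List[str] = field(default_factory=list)  # articles/languages that embed the claim
```

With records along these lines, patrol tools could query directly for unverified or refuted claims, articles in any language could reference a claim by its identifier instead of restating it, and the shared discussion page would keep all past debate about the claim in one findable place.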

Overall, this could increase trust in Wikipedia by making the article-writing process more transparent, better understood (by virtue of becoming less time-consuming to investigate), and maybe less biased and more accurate (via easier verification); improve community efficiency and health (by segregating the more and less contentious parts of contributing to articles on controversial topics and lowering the barrier to entry); make our content format more friendly to mobile and social media; and maybe make Wikimedia the standard platform in a whole new knowledge space (claims/facts).

See also the slides from the presentation of this idea at WikiConference North America 2019.