Wikimedia Product/Needs from APIs

Note: Some of this material was written in June 2017, and plans/thoughts/timelines may have shifted since the last major revision.

This document invites discussion on how to prioritise the implementation of consistent remote APIs in Wikimedia's infrastructure for content distribution and creation. We would like to standardise on Parsoid-backed remote APIs for the mobile web and both app experiences, and create a new JavaScript-based isomorphic frontend (e.g., with Preact) that is a client of these standardised APIs and other Wikimedia-hosted remote APIs. Longer term, we are also interested in using this scheme on desktop.

To be clear, this document does not take third-party MediaWiki installations into account. This is not to invalidate that use case; it is simply not the one we are focused on here.

When we use terms like "services" we are speaking generally. Audiences products depend on the RESTBase API, which is in turn backed by specific services such as Parsoid and Citoid, but in this document the term does not refer to any specific technology except where noted.
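
To make that layering concrete, here is a minimal sketch of a client consuming two RESTBase endpoints, one backed by Parsoid and one by Citoid. The /api/rest_v1/ routes are the public ones; the helper function names are ours, and error handling is minimal.

    // Minimal sketch of a client of the RESTBase layer. Assumes a fetch()
    // implementation (the browser's, or e.g. node-fetch on the server).
    const BASE = 'https://en.wikipedia.org/api/rest_v1';

    // Parsoid-backed: full page HTML, annotated for editing and transformation.
    async function getPageHtml(title) {
      const res = await fetch(`${BASE}/page/html/${encodeURIComponent(title)}`);
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      return res.text();
    }

    // Citoid-backed: citation metadata for a URL, DOI, ISBN, etc.
    async function getCitation(query) {
      const res = await fetch(`${BASE}/data/citation/mediawiki/${encodeURIComponent(query)}`);
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      return res.json();
    }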

Introduction

With the proliferation of Internet-connected mobile devices over the last 10 years, users have gradually shifted from accessing the Internet through a Web browser on a desktop computer to a wide variety of products and services across many devices and contexts. As a result, Internet software architecture has evolved from a monolithic structure, in which a single executable handled requests and returned uniform HTML pages to browsers, to an architecture based on discrete remote APIs that deliver smaller pieces of content to be aggregated and transformed on the device or the server for different contexts.

Wikipedia and our platform MediaWiki are no different. We have seen shifts to mobile in our access patterns both for consumers and contributors. We need to provide Wikipedia consumption and contribution experiences on an increasing number of channels and platforms.

Also, as we look to extend our communities in locations where mobile Internet access is dominant, we will see similar requirements emerging for our platforms.

The evolution from a single desktop destination to the proliferation of Wikipedia content and actions across platforms, form factors, and tools changes what client applications ask of MediaWiki. Remote APIs allow us to address many of these new use cases:

  • Design, create, and maintain multiple user experience variations with minimal effort
    • Example: multiple ways to view, filter, and react to the "recent changes" feed (see the sketch after this list)
  • Migrate to a new technology/platform
    • Example: the apps rely entirely on APIs. When the next big platform hits, these will likely continue to be essential
  • Develop and maintain features with greater speed and stability
    • Example: a core reason app development is much faster
  • Nurture an ecosystem of third party tools and applications
    • Example: everything on Wikimedia Cloud VPS
  • Facilitate the embedding of Wikipedia knowledge throughout the Internet
    • Example: Siri, Kindle, Google's Knowledge panel…
  • Pass information between wikis:
    • Example: Structured data search on Commons using Wikidata Query Service
  • Provide users with a faster experience
    • Example: progressive page loading
  • Scale services independently of each other
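
As a concrete example of the first use case above, a new "recent changes" view needs nothing beyond a public API call; each experience can then filter and present the same data differently. This is a hedged sketch using the standard Action API list=recentchanges module; the function name and parameter choices are illustrative.

    // Sketch: one of many possible "recent changes" clients, built on the
    // Action API (action=query&list=recentchanges). Assumes a fetch()
    // implementation (the browser's, or e.g. node-fetch on the server).
    async function recentChanges(limit = 25) {
      const url = 'https://en.wikipedia.org/w/api.php?' + new URLSearchParams({
        action: 'query',
        list: 'recentchanges',
        rcprop: 'title|user|timestamp|comment',
        rclimit: String(limit),
        format: 'json',
        origin: '*', // CORS header for anonymous browser clients
      });
      const res = await fetch(url);
      const data = await res.json();
      return data.query.recentchanges; // each UI filters and renders its own way
    }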

To the extent that components of MediaWiki are not yet exposed as remote APIs, the above is not possible without a great deal of redundant effort. We have identified some high-level issues faced by Audiences:

  • It has been difficult to use modern front end tooling for building Web user interfaces
  • Reading clients do not share a common intermediated content layer (and use HTML from different sources), which has required duplicated effort on each platform
  • Editing and reading clients likewise do not share a common intermediated content layer (and use HTML from different sources)
  • The current Action API infrastructure is largely designed and developed for content editing and curation, which is orthogonal to the needs of the presentation layer; this makes it difficult to serve mobile use cases (bandwidth, processing, and power constraints). A sketch of the contrast follows this list.
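
To make that last point concrete: a mobile reading client that only renders a title, a teaser, and a lead image should not have to download and parse a full page. The sketch below uses the public REST page summary endpoint; the fields we pick out are illustrative of a bandwidth-conscious client, not a prescribed schema.

    // Sketch: a mobile client fetching only what it renders, via the REST
    // summary endpoint, instead of requesting a full parse. Assumes fetch().
    async function getSummary(title) {
      const res = await fetch(
        `https://en.wikipedia.org/api/rest_v1/page/summary/${encodeURIComponent(title)}`);
      const s = await res.json();
      return {
        title: s.title,
        extract: s.extract, // short plain-text teaser
        thumbnail: s.thumbnail && s.thumbnail.source, // may be absent
      };
    }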

Based on these needs and our software development processes, the Audiences team would like to begin a discussion on how to prioritise the implementation of consistent remote APIs in Wikimedia's infrastructure for content distribution and creation. We will document efforts to date and propose a path forward, highlighting open issues and tradeoffs as we go. We look forward to working with our CTO, the Technology team, and our communities on extending MediaWiki to support our current and future contributors and consumers.

Prior Art

A great deal of prior work exists around remote APIs and services: we have a Services team in Technology, a services team (Reading Infrastructure) in Readers, and a Parsing team in Contributors, and MediaWiki has had the Action API for many years. There is certainly no interest in starting from scratch; we are looking for a mutually agreed architecture plan that we can all work towards. This is as much about understanding and formalizing the various entry points into MediaWiki as it is about remote APIs themselves. Not everything needs to be a service, and prioritizing which functionality would benefit most from being broken out as a separable API is a core assumption of this proposal.

Proposed Direction

We would like to get to a standardized Page Content Service with Parsoid-backed markup for the different mobile experiences (apps, web). For the mobile web, our goal is an isomorphic front end (e.g., in the style of Preact or Vue.js) with a clear separation from the content APIs. This provides the benefits mentioned in the introduction, plus a few more (a minimal sketch follows the list):

  • Developers can write the same JavaScript on the server and the client, which is easier to maintain.
  • It allows sharing effort with the mobile apps by consuming the same services and sharing libraries.
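
A minimal sketch of the isomorphic idea, assuming Preact plus preact-render-to-string; PageView and renderPage are illustrative names, not an existing codebase:

    // Shared view code: the same component runs on the server and the client.
    const { h } = require('preact');
    const renderToString = require('preact-render-to-string');

    // Illustrative component rendering Parsoid-provided HTML for a page.
    function PageView(props) {
      return h('article', { class: 'page' },
        h('h1', null, props.title),
        h('div', { dangerouslySetInnerHTML: { __html: props.html } }));
    }

    // Server side: emit markup for the first paint.
    function renderPage(title, parsoidHtml) {
      return renderToString(h(PageView, { title: title, html: parsoidHtml }));
    }

    // On the client, the same PageView is mounted over the server-sent markup
    // (with Preact's render/hydrate, depending on version), so the view logic
    // is written once.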

Longer term, we would like to disentangle the mobile web from MediaWiki PHP (trimming down or moving away from the MobileFrontend "MFE" extension), and we are also interested in using this scheme on the desktop.

Current Art

There are many loosely connected initiatives across Audiences toward this high-level goal. While the current development is not based on a singular plan or document, this has not prevented cooperation and collaboration between multiple teams.

The Editing team is reducing the number of pieces of software that edit wikitext (VE, Flow, CX); the technical solution is a standalone HTML editing system with specific integration technologies, which can be easily integrated with Readers' solutions for page layout. There is a limited amount of PHP in VisualEditor; it is mostly JavaScript. All integrations of VE are API based, with a few judicious hacks to patch in the extra data which we don't get from Parsoid.
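
The shape of that integration is roughly the round trip below. The REST routes are the public ones; the save step goes through the Action API. This is a sketch only, with auth and error handling omitted.

    // Sketch of the HTML-centric edit round trip used by VE-style clients.
    const REST = 'https://en.wikipedia.org/api/rest_v1';

    // 1. Fetch editable Parsoid HTML (annotated with data-mw attributes).
    async function fetchEditableHtml(title) {
      const res = await fetch(`${REST}/page/html/${encodeURIComponent(title)}`);
      return res.text();
    }

    // 2. After the client edits the DOM, convert the HTML back to wikitext.
    async function htmlToWikitext(title, html) {
      const res = await fetch(
        `${REST}/transform/html/to/wikitext/${encodeURIComponent(title)}`, {
          method: 'POST',
          body: new URLSearchParams({ html }), // form-encoded by fetch()
        });
      return res.text();
    }

    // 3. The resulting wikitext is then saved through the Action API
    //    (action=edit), which handles tokens, permissions, and history.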

The Parsing team is working on minimising differences between Parsoid and PHP parser output, with lots of pixel-diff testing to spot them; most differences are already fixed or are being fixed (the current effort to replace Tidy will ensure fixes to markup that improve Parsoid's compatibility). One conflict needs addressing: Readers want the least possible HTML, while Contributors want full-data HTML for editing. (Refer to phab:T164033 and phab:T78676; note that large differences can exist, in no small part due to the references section on some articles.)

In Readers, the mobile apps already use the Mobile Content Service, and Readers has established a services team (Reading Infrastructure) to build more APIs, in strong partnership with the Technology Services team. There are several initiatives underway or under consideration around services here.

We believe that incremental solutions along a longer arc are not only desirable but the only way to go, so we would like to continue the current development model, albeit with more consistent and mutually agreed upon goals across the organization and community.

The Parser

There has been a lot of pressure over the years (from the beginning, even) to make Parsoid part of the monolithic core. When we started, this was not at all possible (for performance reasons, and because there was no HTML5 parser in PHP). But five years down the line, with a changed context (HHVM + RemexHTML), a PHP version of Parsoid is no longer infeasible, though it is not a trivial task. Whether we ought to take it on is a different question.

In any case, even if Parsoid becomes part of core, the API that Parsoid provides will remain unchanged, and the change would not affect the rest of the services that consume Parsoid output. Parsing#E:_Evaluate_feasibility_of_porting_Parsoid_to_PHP has some information about this.
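
Put differently, consumers couple to the HTTP contract rather than to the implementation, so a PHP-hosted Parsoid behind the same routes would be invisible to them. A trivial sketch of that dependency (the route is the public one; the function is illustrative):

    // Consumers depend on the route and response format, not on whether
    // Parsoid runs as a Node.js service or inside MediaWiki core (PHP).
    async function parsoidHtml(domain, title) {
      const res = await fetch(
        `https://${domain}/api/rest_v1/page/html/${encodeURIComponent(title)}`);
      return res.text(); // same contract either way
    }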

With Tim and Kunal moved to Platform, the porting experiment and prototyping are on hold until Parsoid and PHP parser output reach a greater degree of feature compatibility.

The document Parsing/Notes/Two Systems Problem details some of the thinking around parser replacement.

Challenges, trade-offs and open issues (incomplete)

  • Reading and editing static content pages isn't the only issue for multi-device development; special pages proliferate, and often need a lot of manual attention to coax onto other platforms
  • Community concerns (3rd party v. custom experiences v. WMF only)
  • What is the future of web tech in apps?
    • Lots of smaller initiatives; most interest is in Instant Apps, but they are moving slowly and there is no evidence yet that Instant Apps are taking off. Progressive Web Apps are gaining traction, though.
  • We should architecturally acknowledge the differences between reading and editing. It’s ok.
  • There is consensus that HTML is still appropriate as a wire format; Trevor believes this is a good thing because *everybody* is working on optimizing it. We need to confirm this.
  • The Mobile Content Service is in some ways a compatibility service for clients; this results in some architectural duplication
  • Wikitext changes are not impactful to this discussion and are out of scope; the long-term arc for where the Parsing team wants wikitext to go will affect this for the better (for example, by enabling micro-edits at much finer granularities than a section), but nothing in the immediate or short term.

See Also