Parsoid/API

Parsoid converts MediaWiki's Wikitext to XHTML5 + RDFa and back.

Common HTTP headers supported in all entry points

 * Accept-Encoding : Please accept gzip; responses compress well.
 * Cookie : Cookie header that will be forwarded to the MediaWiki API, which makes it possible to use Parsoid with private wikis. Setting a cookie implicitly disables all caching for security reasons, so do not send one for public wikis if you care about caching.
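As a minimal sketch, the two common headers can be assembled like this in Python. The helper name and the cookie value are illustrative, not part of the API:

```python
# Sketch: assembling the common headers for a Parsoid request.
# `parsoid_headers` is a hypothetical helper, not a Parsoid API.

def parsoid_headers(cookie=None):
    """Headers recommended for every Parsoid request.

    Setting a cookie disables Parsoid's caching, so pass one only
    when talking to a private wiki.
    """
    headers = {"Accept-Encoding": "gzip"}  # responses compress well
    if cookie is not None:
        headers["Cookie"] = cookie  # forwarded to the MediaWiki API
    return headers

print(parsoid_headers())
print(parsoid_headers("session=abc123"))
```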

Entry points
The first path component in these examples is the configured wiki id, as available in the siteinfo API request: 'enwiki', 'frwiki', 'dewiki', etc. The {page} component is the canonical page name with spaces replaced by underscores: 'Main_Page', 'Barack_Obama', etc.


 * GET : Get HTML for a given page revision. Example: /enwiki/Main_Page?oldid=598252063.
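A minimal sketch of building such a request path in Python; the helper name is hypothetical, and the path shape is inferred from the example above:

```python
# Sketch: building the GET path for a page's HTML.
# `revision_html_url` is an illustrative helper, not a Parsoid API.

def revision_html_url(wikiid, page, oldid=None):
    """Return the request path for fetching a page's HTML from Parsoid.

    `page` must already be the canonical name with underscores,
    e.g. 'Main_Page'. Without `oldid`, the latest revision is served.
    """
    path = f"/{wikiid}/{page}"
    if oldid is not None:
        path += f"?oldid={oldid}"
    return path

print(revision_html_url("enwiki", "Main_Page", 598252063))
# → /enwiki/Main_Page?oldid=598252063
```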
 * POST : Convert passed-in wikitext or HTML. The {page} path component should be provided if available; both it and the oldid are needed for clean round-tripping of HTML retrieved earlier.
   * HTML to wikitext:
     * html : HTML to serialize to wikitext
     * oldid : the revision id this is based on (if any)
   * Wikitext to HTML:
     * wt : Wikitext to parse to HTML
     * body (optional) : boolean flag; only return the HTML body.innerHTML instead of a full document
   For example, using the cURL command-line tool:
   $ curl localhost:8000/localhost/Main_Page -d wt="Hello world" -d body=1
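The form fields for the two conversion directions can be sketched as follows. The field names (wt, html, oldid, body) come from the parameter list above; the helper names are illustrative and the HTTP transport itself is left out:

```python
# Sketch: form payloads for the two POST conversion directions.
# The helper functions are hypothetical; only the field names
# (wt, html, oldid, body) come from the Parsoid entry-point docs.

def wikitext_to_html_form(wikitext, body_only=False):
    """Form fields for parsing wikitext to HTML."""
    form = {"wt": wikitext}
    if body_only:
        form["body"] = "1"  # return only body.innerHTML
    return form

def html_to_wikitext_form(html, oldid=None):
    """Form fields for serializing HTML back to wikitext."""
    form = {"html": html}
    if oldid is not None:
        form["oldid"] = str(oldid)  # revision the HTML is based on
    return form

print(wikitext_to_html_form("Hello world", body_only=True))
# → {'wt': 'Hello world', 'body': '1'}
```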

Convenience method:
 * GET without an oldid : Get HTML for the latest page revision; redirects to the full oldid form above.

There are additional form-based debugging tools available on the server's index page (e.g. http://parsoid-lb.eqiad.wikimedia.org/). These are not part of the API and can change or disappear at any time.