Continuous integration/Documentation generation


Documentation is automatically generated and published to the documentation site. For increased flexibility and security, the static sites are generated inside labs instances, so we had to set up a two-step process to move the material from the labs instance that runs the job to the production server that makes it publicly available. This page documents the workflow, part of the technical implementation, and how to define a new job.


Zuul configuration[edit]

Our Zuul layout reacts to two kinds of Gerrit events, which are matched by two different pipelines:

  Gerrit event    Zuul pipeline   Description
  change-merged   postmerge       A change is merged by Gerrit
  ref-updated     publish         A reference is changed (tip of a branch moves, tags are modified)

ref-updated could also cover changes being merged, but that event is not associated with a Gerrit change number, which prevents us from reporting back to the Gerrit interface. We thus use postmerge to report back in Gerrit, so the user knows the documentation has been generated, and the publish pipeline to handle reference updates matching ^refs/tags/.

In both cases (change-merged or ref-updated) we trigger the same job to generate the documentation for any branch or tag. We thus need to namespace the documentation based on the branch name or the tag name. The information is carried differently between the two events, and the reference format differs slightly between branch updates and tags. The conversion logic is handled by a Zuul Python function which is associated with all the publish jobs. It injects into the Gearman function (and thus the Jenkins job environment) a variable DOC_SUBPATH which represents the version. Examples:

  • change merged on REL1_24 branch: DOC_SUBPATH = REL1_24
  • refs/tags/1.24.0 updated: DOC_SUBPATH = 1.24.0
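The conversion above can be sketched as follows. This is a minimal illustration of the branch/tag logic only; set_doc_subpath is a hypothetical helper name, not the actual Zuul parameter function:

```python
import re

def set_doc_subpath(branch_or_ref):
    """Derive DOC_SUBPATH from a branch name or a tag reference."""
    tag = re.match(r'^refs/tags/(.+)$', branch_or_ref)
    if tag:
        # ref-updated event on a tag: strip the refs/tags/ prefix.
        return tag.group(1)
    # change-merged event: the branch name is used as-is.
    return branch_or_ref

print(set_doc_subpath('REL1_24'))           # REL1_24
print(set_doc_subpath('refs/tags/1.24.0'))  # 1.24.0
```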

Reference: Gerrit change 173049

We can thus reuse that parameter to easily namespace the jobs per version.

Jenkins job builder definitions[edit]

Most of the logic is defined directly in the Jenkins Job Builder publish.yaml configuration file.

In a job definition, a builder defines the steps to execute. We provide a builder macro called doc-publish that takes care of transferring the generated files to the documentation web server. It takes two parameters:

  1. docsrc Directory holding the documentation files, relative to the workspace (without trailing slash)
  2. docdest Destination directory under the documentation root

Example job definition:

# Typical PHP project
- job-template:
    name: myproject-publish
    builders:
     - global-setup
     - doxygen
     - doc-publish:
        docsrc: 'doc/html'
        docdest: myproject
     - global-teardown

# Typical JS project
- job-template:
    name: myproject-publish
    builders:
     - global-setup
     - jsduck
     - doc-publish:
        docsrc: 'docs'
        docdest: myproject
     - global-teardown

This will invoke the relevant build script and publish the specified directory at the configured destination on the documentation site.

To namespace the documentation based on the project version, use the Zuul-generated DOC_SUBPATH (derived from the branch or tag name): simply insert it in the docdest parameter. You will also need to invoke the assert-env-doc_subpath builder. Example for MediaWiki (the mediawiki-core-doxygen-publish job):

- job:
    name: 'mediawiki-core-doxygen-publish'
    builders:
     - assert-env-doc_subpath
     - zuul-cloner:
         projects: >
     # Build the documentation under build/doc
     - shell: |
         rm -rf build/doc
         mkdir -p build/doc
         TARGET_BASEDIR="$WORKSPACE/build/doc" /srv/deployment/integration/slave-scripts/tools/
     # Publish from build/doc to mediawiki-core/$DOC_SUBPATH/php
     - doc-publish:
         docsrc: 'build/doc'
         docdest: 'mediawiki-core/$DOC_SUBPATH/php'

This publishes the documentation under mediawiki-core/$DOC_SUBPATH/php, so each branch or tag gets its own versioned path (for example, mediawiki-core/REL1_24/php for the REL1_24 release branch).


Publishing workflow[edit]

The documentation is ultimately published on a production machine (gallium, as of February 2014). It is generated on labs instances that are part of the integration labs project and that are not allowed to communicate with production machines.

To solve this, we created an intermediary instance, integration-publishing.eqiad.wmflabs. The doc-publish macro running on the labs instance rsyncs the generated content to that instance, under the doc rsync container, into a uniquely named subdirectory (reusing ZUUL_UUID). The macro then triggers the publish-doc job with that unique identifier; the job rsyncs from the intermediary instance to the production machine, thus publishing the documentation.
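The two-hop transfer can be simulated locally as follows. All directories and names here are stand-ins (shutil.copytree plays the role of rsync; the real jobs rsync via the daemon on integration-publishing.eqiad.wmflabs):

```python
import pathlib
import shutil
import tempfile

workspace = pathlib.Path(tempfile.mkdtemp())  # Jenkins workspace on the labs instance
staging = pathlib.Path(tempfile.mkdtemp())    # intermediary rsync area
docroot = pathlib.Path(tempfile.mkdtemp())    # production document root
zuul_uuid = "abc123"                          # normally injected by Zuul

# The build step leaves the generated files in the workspace.
src = workspace / "build" / "doc"
src.mkdir(parents=True)
(src / "index.html").write_text("<html>docs</html>")

# Hop 1: doc-publish pushes into a uniquely named staging subdirectory,
# reusing ZUUL_UUID so concurrent builds cannot collide.
shutil.copytree(src, staging / zuul_uuid)

# Hop 2: publish-doc pulls that same subdirectory to the final destination.
shutil.copytree(staging / zuul_uuid, docroot / "myproject")

print((docroot / "myproject" / "index.html").exists())  # True
```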

The integration-publishing.eqiad.wmflabs rsync daemon is reachable by the other integration labs instances since they are in the same project. The production slave gallium is allowed to connect since it has a public IP and can reach labs; the other production slave, lanthanum, has a private IP and thus cannot reach labs per policy. Hence all such jobs should be tied to the contintLabsSlave label.

Updating the site[edit]

The "static" content of the site lives in the git repository integration/docroot. Most of the pages use light PHP to show a table of contents, include a footer, etc.

See also[edit]

  • Continuous integration/Entry points describes the conventions for extension automation. If you follow these when developing your extension, then (with a lot of CI wizardry) tests run and documentation is generated "automagically".