Continuous integration/Documentation generation
Documentation is automatically generated and published to doc.wikimedia.org. For increased flexibility and security, the static sites are generated inside labs instances; they are then fetched from the instance to the CI production host and pushed with rsync to the documentation host (doc1001.eqiad.wmnet since January 9th, 2019). This page documents the workflow, part of the technical implementation, and how to define a new job.
| Gerrit event | Zuul pipeline | Description |
|---|---|---|
| change-merged | postmerge | When a change is merged by Gerrit |
| ref-updated | publish | A reference is changed (tip of a branch moves, a tag is created or modified) |
ref-updated could also cover changes being merged, but that event is not associated with a Gerrit change number, which prevents us from reporting back to the Gerrit interface. We thus use the postmerge pipeline to report back in Gerrit, so the user knows the documentation has been generated, and the publish pipeline to handle matching reference updates.
In both cases (change-merged or ref-updated) we trigger the same job to generate the documentation for any branch or tag. We thus need to namespace the documentation under doc.wikimedia.org based on the branch name or the tag name. The information is carried differently between the two events, and the reference is slightly different between branch updates and tags. The conversion logic is implemented by a Zuul Python function which is associated with all the publish jobs. It injects into the Gearman function (and thus the Jenkins job environment) a variable DOC_SUBPATH which represents the version. Examples:
- change merged on the REL1_24 branch: DOC_SUBPATH = REL1_24
- refs/tags/1.24.0 updated: DOC_SUBPATH = 1.24.0
Reference: Gerrit change 173049
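The conversion described above can be sketched in Python. This is a hypothetical illustration, not the actual Zuul parameter function; the event field names (type, change, branch, refName) are assumptions modeled on Gerrit's event stream:

```python
def doc_subpath(event):
    """Derive DOC_SUBPATH from a Gerrit event dict (illustrative sketch).

    For change-merged events the branch name is used directly; for
    ref-updated events the refs/tags/ or refs/heads/ prefix is stripped.
    """
    if event.get('type') == 'change-merged':
        return event['change']['branch']
    ref = event['refName']
    for prefix in ('refs/tags/', 'refs/heads/'):
        if ref.startswith(prefix):
            return ref[len(prefix):]
    return ref

# The two examples from this page:
assert doc_subpath({'type': 'change-merged',
                    'change': {'branch': 'REL1_24'}}) == 'REL1_24'
assert doc_subpath({'type': 'ref-updated',
                    'refName': 'refs/tags/1.24.0'}) == '1.24.0'
```

Either way the job ends up with a single version string it can splice into the publication path.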
We can thus reuse the DOC_SUBPATH parameter to easily namespace the jobs per version. As an example, https://doc.wikimedia.org/mediawiki-core/ has documentation for both release branches and tags.
Jenkins job builder definitions
In a job definition, a builder defines what steps to execute. We provide a builder macro called doc-publish that takes care of transferring the generated files to the web server behind https://doc.wikimedia.org/. It takes two parameters:
- docsrc: directory holding the documentation files, relative to the workspace (without a trailing slash)
- docdest: destination directory under https://doc.wikimedia.org/
Example job definition:
# Typical PHP project
- job-template:
    name: myproject-publish
    builders:
      - doxygen
      - doc-publish:
          docsrc: 'doc/html'
          docdest: myproject

# Typical JS project
- job-template:
    name: myproject-publish
    builders:
      - jsduck
      - doc-publish:
          docsrc: 'docs'
          docdest: myproject
This will invoke the build scripts (doxygen and jsduck) and publish their results (respectively in doc/html and docs) to https://doc.wikimedia.org/myproject/.
To namespace the documentation based on the project version, use the Zuul-generated DOC_SUBPATH (derived from the branch or tag name): insert it in the docdest parameter. You will also need to invoke the assert-env-doc_subpath builder. Example for MediaWiki (the mediawiki-core-doxygen-publish job):
- job:
    name: 'mediawiki-core-doxygen-publish'
    builders:
      - assert-env-doc_subpath
      - zuul-cloner:
          projects: >
            mediawiki/core
            mediawiki/vendor
      # Build the documentation under /build/doc
      - shell: |
          rm -rf build/doc
          mkdir -p build/doc
          TARGET_BASEDIR="$WORKSPACE/build/doc" /srv/deployment/integration/slave-scripts/tools/mwcore-docgen.sh
      # Publish from build/doc to mediawiki-core/$DOC_SUBPATH/php
      - doc-publish:
          docsrc: 'build/doc'
          docdest: 'mediawiki-core/$DOC_SUBPATH/php'
This publishes the master branch documentation at https://doc.wikimedia.org/mediawiki-core/master/php/, and release branch documentation at, for example, https://doc.wikimedia.org/mediawiki-core/REL1_24/php/.
The documentation is ultimately published on doc.wikimedia.org, which is a production machine (doc1001.eqiad.wmnet since January 2019). It is generated on labs instances that are part of the integration labs project, which are not allowed to communicate with production machines.
A job, publish-to-contint1001, executes on the CI Jenkins production node and fetches the artifacts from the build workspace using rsync + ssh. The content is then rsynced to doc1001.eqiad.wmnet (rsync://doc1001.eqiad.wmnet/doc/).
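The two-hop flow can be simulated locally with a minimal Python sketch. This is purely illustrative: the real jobs use rsync over ssh (hop 1) and the rsync daemon module rsync://doc1001.eqiad.wmnet/doc/ (hop 2); here shutil.copytree stands in for rsync and all directory names are placeholders:

```python
import shutil
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
workspace = root / 'build-instance' / 'workspace' / 'build' / 'doc'
staging = root / 'ci-host' / 'staging'
dochost = root / 'doc-host' / 'doc' / 'myproject'

# A labs build instance generates the static site in its workspace
workspace.mkdir(parents=True)
(workspace / 'index.html').write_text('<html>docs</html>')

# Hop 1: the CI production host fetches artifacts from the build workspace
shutil.copytree(workspace, staging)
# Hop 2: the content is pushed to the documentation host
shutil.copytree(staging, dochost)

print((dochost / 'index.html').read_text())  # → <html>docs</html>
```

The indirection exists because labs instances may not talk to production machines directly, so the CI production host acts as the intermediary.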
Updating the doc.wikimedia.org site
The "static" content of the site lives in the git repository integration/docroot. Most of the pages use light PHP to show a table of contents, include a footer, etc. The docroot repository is cloned on doc1001.eqiad.wmnet at /srv/docroot; this is done by the puppet profile profile::doc.
After a change is merged, one has to update the docroot:
ssh doc1001.eqiad.wmnet sudo -u doc-uploader git -C /srv/docroot pull
- Continuous integration/Entry points describes the conventions for extension automation. If you follow these when developing your extension, then (with a lot of CI wizardry) tests run and documentation is generated "automagically".