Continuous integration/Documentation generation
Documentation is automatically generated and published to doc.wikimedia.org.
For increased flexibility and security, the static sites are generated inside WMCS instances.
The docs are then fetched from the instance to the CI production host, and pushed with rsync to the documentation host (doc.discovery.wmnet as of March 2022).
This page documents the workflow, part of the technical implementation, and how to define a new job.
Zuul
Our Zuul layout file (zuul/layout.yaml) reacts to two kinds of Gerrit events, which are matched by two different pipelines:
Gerrit event | Zuul pipeline | Description
---|---|---
change-merged | postmerge | When a change is merged by Gerrit
ref-updated | publish | A reference is changed (tip of branch changes, tag modifications)
ref-updated could also cover changes being merged, since the target branch is updated, but the event is not associated with a Gerrit change number, which prevents us from reporting back to the Gerrit interface. We thus use:
- the postmerge pipeline to report back in Gerrit, so the user knows the documentation has been generated
- the publish pipeline, which solely handles reference updates matching ^refs/tags/
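For orientation, here is a minimal sketch of how two such pipelines can be declared in a Zuul v2 layout file. The manager and the exact trigger options shown are assumptions for illustration, not a copy of Wikimedia's production configuration; see zuul/layout.yaml in integration/config for the real definitions.

# Illustrative sketch only; options are assumptions.
pipelines:
  - name: postmerge
    manager: IndependentPipelineManager
    trigger:
      gerrit:
        # Fired when Gerrit merges a change; the event carries the change
        # number, so the job result can be reported back in Gerrit.
        - event: change-merged
  - name: publish
    manager: IndependentPipelineManager
    trigger:
      gerrit:
        # Fired when a reference changes; restricted to tag references.
        - event: ref-updated
          ref: ^refs/tags/.*$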
In both cases (change-merged or ref-updated) we trigger the same job to generate the documentation for any branch or tag (see the sketch at the end of this section). We thus need to namespace the documentation under doc.wikimedia.org based on the branch name or the tag name. The information is carried differently between the two events, and the reference is slightly different between branch updates and tags. The conversion logic is handled by a Zuul Python function which is applied to all the publish jobs. It injects into the Gearman function (and thus into the Jenkins job environment) a variable DOC_SUBPATH which represents the version.
Example:
- A change merged on the REL1_35 branch: DOC_SUBPATH = REL1_35
- refs/tags/1.35.0 updated: DOC_SUBPATH = 1.35.0
Reference: Gerrit change 173049
We can thus reuse that DOC_SUBPATH parameter to easily namespace the jobs per version.
As an example, wmdoc:mediawiki-core/ has documentation for both release branches (wmdoc:mediawiki-core/REL1_37/) and tags (wmdoc:mediawiki-core/1.36.2).
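Putting the pieces together, the same publish job is attached to both pipelines for a given repository. A hypothetical project stanza in zuul/layout.yaml might look like the following; the repository name is an example only, and the job name reuses the myproject-php-publish job defined further down this page:

# Illustrative sketch only; names are examples.
projects:
  - name: examples/myproject
    postmerge:
      - myproject-php-publish
    publish:
      - myproject-php-publish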
Configuring for your MediaWiki extension or skin
If you are looking to get documentation published from a MediaWiki extension or skin, you should use one of the standard templates (which work for both extensions and skins, despite the name):
- PHP: mwext-doxygen-publish
- JavaScript: extension-javascript-documentation

Note: If you wish to publish both PHP and JavaScript code documentation, you will need to use a non-standard template for one, as they will both try to use the same target directory on doc.wikimedia.org; instead, use generic-node18-browser-coverage-publish for the JavaScript, which will not conflict.
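These templates are attached to your repository in the CI configuration (zuul/layout.yaml in integration/config). A minimal, hypothetical sketch of such an entry follows; the repository name is an example, and the real layout for your extension or skin will typically carry additional test templates:

# Illustrative sketch only; the repository name is an example.
projects:
  - name: mediawiki/extensions/ExampleExtension
    template:
      # Publish PHP (Doxygen) documentation to doc.wikimedia.org.
      - name: mwext-doxygen-publish
      # Publish JavaScript documentation to doc.wikimedia.org.
      - name: extension-javascript-documentation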
Jenkins Job Builder definitions
As said above, you should not need a bespoke job; use the standard templates instead. If you find you need to do something unusual, however, please raise a Phabricator ticket so we can help you.
Most of the logic is defined in the Jenkins Job Builder file jjb/job-templates.yaml and other configuration files.
In a job definition, a builder defines what steps to execute. We provide a builder macro called doc-publish that takes care of transferring the generated files to the web server of doc.wikimedia.org. It takes two parameters:
- docsrc: Directory holding documentation files, relative to the workspace (without trailing slash)
- docdest: Directory under doc.wikimedia.org
Example job definition:
# Bespoke PHP project
- job:
    name: myproject-php-publish
    node: Docker
    concurrent: false
    triggers:
      - zuul
    builders:
      - docker-log-dir
      - docker-src-dir
      - docker-run-with-log-cache-src:
          image: docker-registry.wikimedia.org/releng/doxygen:1.9.8-s2
      - doc-publish:
          docsrc: 'doc/html'
          docdest: myproject
    publishers:
      - teardown

# Bespoke JS project
- job:
    name: myproject-js-publish
    node: Docker
    triggers:
      - zuul
    builders:
      - setup
      - docker-run-with-log-cache-src:
          image: 'docker-registry.wikimedia.org/releng/node18-test:0.2.0-s2'
          args: 'doc'
      - doc-publish:
          docsrc: 'docs'
          docdest: myproject
    publishers:
      - teardown
This will invoke the build scripts (doxygen and npm run doc) and publish their results (respectively in doc/html and docs) to wmdoc:myproject/.
To namespace the documentation based on the project version, use the Zuul-generated DOC_SUBPATH (derived from the branch or tag name): simply insert it in the docdest parameter. You will also need to invoke the assert-env-doc_subpath builder.
Example for MediaWiki (mediawiki-core-doxygen-publish job):
- job:
    name: 'mediawiki-core-doxygen-publish'
    builders:
      - assert-env-doc_subpath
      - zuul-cloner:
          projects: >
            mediawiki/core
            mediawiki/vendor
      # Build the documentation under build/doc
      - shell: |
          rm -rf build/doc
          mkdir -p build/doc
          TARGET_BASEDIR="$WORKSPACE/build/doc" /srv/deployment/integration/slave-scripts/tools/mwcore-docgen.sh
      # Publish from build/doc to mediawiki-core/$DOC_SUBPATH/php
      - doc-publish:
          docsrc: 'build/doc'
          docdest: 'mediawiki-core/$DOC_SUBPATH/php'
This publishes the documentation at wmdoc:mediawiki-core/master/php/, as well as release branch documentation such as wmdoc:mediawiki-core/REL1_34/php/ and tagged release documentation such as wmdoc:mediawiki-core/1.34.0/php/.
Architecture
For an architecture overview and runbooks, see wikitech:Doc.wikimedia.org.
See also
- Continuous integration/Entry points describes the conventions for extension automation. If you follow these when developing your extension, then (with a lot of CI wizardry) tests run and documentation is generated "automagically".
- GitLab/Publishing docs