SQL/XML Dumps/A dump job using an existing MediaWiki script

Things to consider
* Does your dump job run via a MediaWiki maintenance script (in core or an extension), or via some other command such as mysqldump or a custom script?
* Does your dump job script write compressed output directly?
* Does your dump job script produce progress messages that can be used to judge the percentage of entries processed, or to derive an ETA for when the job will complete?

Your job may integrate slightly differently from this example, depending on your answers to the above questions.
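On that last question: if your script does emit percent-complete messages, turning them into an ETA is straightforward. Here is a minimal sketch (the function name and the assumption of roughly linear progress are ours, not part of the dumps codebase):

```python
import time

def estimate_seconds_left(start_time, percent_done):
    """Turn a percent-complete figure from a progress message into an
    estimate of seconds remaining, assuming roughly linear progress.
    Both the name and the approach here are illustrative only."""
    if percent_done <= 0:
        return None   # no progress yet; no meaningful estimate
    elapsed = time.time() - start_time
    return elapsed * (100.0 - percent_done) / percent_done
```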

Code of the module
First, we need the code for the new dump job Python script, sample_job.py:

While we're here, we might as well see how it works. At this point, nothing here should be surprising.

Comments are inline so that anyone who checks out the repo can study this example and see how it works.
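The real sample_job.py is in the repo; as a stand-in, here is a rough sketch of the shape such a script takes when the job wraps a MediaWiki maintenance script. Every name, path, and flag below is an illustrative assumption, not the actual dumps code: build the command line for one wiki, run it, and compress its stdout.

```python
import gzip
import subprocess

def build_maintenance_command(php, mwscript, maint_script, wiki):
    """Build the argv for invoking a MediaWiki maintenance script.
    All arguments (php binary, wrapper script, script name, wiki db)
    are illustrative stand-ins."""
    return [php, mwscript, maint_script, "--wiki", wiki]

def run_and_compress(command, output_path):
    """Run the command and gzip its stdout into output_path, reading
    in chunks so large dumps don't have to fit in memory."""
    with gzip.open(output_path, "wb") as outfile:
        proc = subprocess.Popen(command, stdout=subprocess.PIPE)
        for chunk in iter(lambda: proc.stdout.read(65536), b""):
            outfile.write(chunk)
        return proc.wait()
```

If your script writes compressed output directly (one of the questions above), the gzip wrapper here would of course be unnecessary.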

Wiring it in
Next we need to make the job known to the infrastructure. We do this by adding an entry for it in the dumpitemlist.py module:
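The actual registration code is in dumpitemlist.py in the repo; the sketch below only illustrates the general pattern, with every class and config name invented for the example: a job object is appended to the list of dump items, gated on a config flag.

```python
class SampleJob:
    """Stand-in for a dump job class; the real base class and
    constructor signature are defined in the dumps repo."""
    def __init__(self, name, description):
        self.name = name
        self.description = description

def get_dump_items(config):
    """Build the list of jobs for a run. Both the function and the
    'sample_job_enabled' setting are hypothetical names."""
    items = []
    if config.get("sample_job_enabled", False):
        items.append(SampleJob("samplejob", "sample job for illustration"))
    return items
```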

Because this job is just for purposes of illustration and should not be run in a production environment, we also added a config switch that lets you disable jobs on all runs on all wikis; see the commit if you're interested in more details. Ordinarily you won't have to worry about that, since jobs you add will be jobs you want run :-)
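The disable-everywhere switch amounts to filtering the job list against a config setting before any run. A sketch of that idea, with a made-up setting name (see the commit for the real one):

```python
def filter_skipped_jobs(job_names, config):
    """Drop any job whose name appears in a hypothetical
    comma-separated 'skipjobs' setting applied to all wikis."""
    skipped = set(config.get("skipjobs", "").split(","))
    return [name for name in job_names if name not in skipped]
```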

Testing
Now we run it:
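A run for a single small wiki looks something like the command assembled below. The worker script name, flag names, and config path are assumptions for illustration; check the repo's own documentation for the real invocation.

```shell
# Hypothetical invocation: run only the new job for one small test wiki.
CONFIGFILE="confs/wikidump.conf.sample"   # illustrative config path
WIKI="samplewiki"                          # illustrative wiki db name
CMD="python3 ./worker.py --configfile $CONFIGFILE --job samplejob $WIKI"
echo "$CMD"
```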

Output check
And finally we check the output:
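Beyond eyeballing the file, a quick sanity check is that the output exists, is valid gzip, and is nonempty. The helper and the throwaway demo file below are illustrative, not part of the dumps code:

```python
import gzip
import os
import tempfile

def output_looks_ok(path):
    """Sanity check: the file exists, is valid gzip, and is nonempty."""
    try:
        with gzip.open(path, "rb") as infile:
            return len(infile.read(1)) == 1
    except OSError:   # missing file or not actually gzip data
        return False

# Demonstrate against a throwaway file standing in for real job output;
# the filename and contents are purely illustrative.
demo_path = os.path.join(tempfile.mkdtemp(), "samplewiki-sample.gz")
with gzip.open(demo_path, "wb") as out:
    out.write(b"1\tMain_Page\n")
```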

Go forth and do likewise!
The end. Obligatory cute puppies link for reading this through to the end: