Wikimedia Release Engineering Team/Quarterly review, February 2014/Notes


Slide 3 [Last quarter - Status]

Monthly review of postmortems for actions.

  • expose fatal


Process review done with photos online.

Slide 5 [Next quarter (diff)]

Having db slaves in beta match what is in production, so that we catch problems like the one Flow recently hit when moving to production. The L10n team has been requesting this for a few months as well.

Slide 7 [Next quarter - Deployment tooling]

Process through all (useful) pain points from the Dev/Deploy review session explicitly (Greg): begin discussions (on list and/or wiki), complete discussions, distill requirements and next steps, prioritize.

Scap incremental improvements:

  • Step 1: Refactor existing scap scripts to enhance maintainability and reveal the hidden complexity of the current solution (Bryan)
  • Step 2: Create a matrix of tool requirements per software stack (MW, Parsoid, ElasticSearch) (Greg)
  • Use the above matrix to add/fix functionality in scap (or related) tooling for ONE software stack, prioritized by cross-stack use (Bryan)

Looking at the red post-its on the flowcharts pictured above and starting to chip away at that list.

  • Question about Trebuchet.
  • Ryan and Andrew are working on Trebuchet for Analytics work. That is progressing concurrently with the work of reworking scap in Python. Trebuchet is still a possibility for MediaWiki deployment.

Trebuchet is basically a wrapper around salt and would let us replace dsh.
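As an illustration of the difference (host group and grain names below are placeholders, not our actual configuration), the same fan-out command under dsh versus salt looks roughly like:

```
# dsh runs a command over a static, locally maintained host group:
dsh -g mediawiki-appservers -- /usr/local/bin/sync-common

# salt (which Trebuchet wraps) targets minions via the salt master,
# e.g. by grain, so no static host lists need to be kept in sync:
salt -G 'deployment_target:mediawiki' cmd.run '/usr/local/bin/sync-common'
```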

  • new scap is very modular
  • Python based \O/ - can be plugged into Trebuchet in the future.

The operations and engineering mailing lists are jointly the official channel.

ACTION: Greg to send periodic updates about scap refactoring

Slide 8 [Next quarter - Hiring]

Complete hiring and train new Test Infrastructure Engineer
Complete hiring and train new QA Automation Engineer


Slide 9 [Next quarter - Browser Testing]

Goal: use the API to create test data for given tests at run time. (Jeff, Chris, Željko) Create users and data at runtime so that browser tests can assume certain content exists.
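A minimal sketch of what this could look like: building an `action=edit` request for the MediaWiki action API (a real API module) to create a fixture page before a browser test runs. The target URL, page title, and token value here are placeholders.

```python
import urllib.parse

# Placeholder target; a test run would point this at the wiki under test.
API_URL = "https://en.wikipedia.beta.wmflabs.org/w/api.php"

def edit_params(title, text, token):
    """POST body for action=edit (real MediaWiki API parameters)."""
    return {
        "action": "edit",
        "title": title,
        "text": text,
        "token": token,      # CSRF edit token, fetched from the API first
        "createonly": "1",   # fail instead of clobbering an existing page
        "format": "json",
    }

# A browser test's setup step would POST this to API_URL (e.g. with
# python-requests) before driving the browser at the new page.
print(urllib.parse.urlencode(edit_params("QA test page", "fixture text", "dummytoken")))
```

The same pattern extends to `action=createaccount` for creating test users at runtime.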

Goal: create the ability to test headless (Željko, Jeff, Chris) We have been using PhantomJS, but it has been hard because of VE. Now using Firefox in headless mode, which has been working well. This allows local Jenkins runs. Issues with PhantomJS are mostly filed upstream, and are months-old bugs. PhantomJS is Qt-based, and the WebKit version it bundles is a bit old.

Goal: run versions of tests compatible with target test environments (Chris, all)


  • Mobile is starting to do more test-driven development, i.e. tests are written before the code, which might not even be in production yet. Mobile also has a lot of different target environments.

Chris: MobileFrontend and Flow are pushing more and more features

Chris: highest priority for the next quarter is to come up with a solution to the test-versioning problem

Ongoing: Continue to move shared code to a shared repo (e.g. login). Continue to maintain tests and keep them green (e.g. connection issues).

Slide 10 [Beta Cluster]

Goal: Emulate the production DB more closely. This could have demonstrated the Flow problem before it was deployed.

Had to pick one; it is going to be a DB overhaul plus a master/slave setup.
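For reference, MediaWiki's real `$wgDBservers` setting is what expresses a master/slave split: writes go to the first entry, reads are spread over the rest by load weight. A sketch (hostnames and weights are placeholders, not the actual beta config):

```php
// LocalSettings.php sketch -- placeholder hosts, not beta's real topology.
$wgDBservers = [
    [ 'host' => 'deployment-db1', 'dbname' => $wgDBname,
      'user' => $wgDBuser, 'password' => $wgDBpassword,
      'type' => 'mysql', 'load' => 0 ],   // master: writes only, no read load
    [ 'host' => 'deployment-db2', 'dbname' => $wgDBname,
      'user' => $wgDBuser, 'password' => $wgDBpassword,
      'type' => 'mysql', 'load' => 1 ],   // slave: takes the read traffic
];
```

Running beta with this shape would surface replication-lag and read-from-slave bugs (like the Flow one) before code reaches production.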

Bryan: what about using real hardware / a more robust setup? Adding more test clusters for different purposes (e.g. performance).

Greg-g: could use MediaWiki vagrant as well. Might want to use vagrant for staging/testing purposes.

For CI we would need isolated VMs to run tests in. Devs need an easy way to test their code in an environment close to production BEFORE proposing a change (i.e. by using a Vagrant image that lets them get their own clone of the production environment).

Yuvi created labs/vagrant. You can also teach Vagrant to interact with OpenStack.
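One hedged sketch of what "teaching Vagrant to talk to OpenStack" could look like, using a community OpenStack provider plugin; the plugin's option names, endpoint URL, flavor, and image below are all assumptions/placeholders:

```ruby
# Vagrantfile sketch -- values are placeholders, not Labs' real config.
Vagrant.configure('2') do |config|
  config.vm.box = 'dummy'
  config.vm.provider :openstack do |os|
    os.username = ENV['OS_USERNAME']
    os.api_key  = ENV['OS_PASSWORD']
    os.flavor   = /m1.small/
    os.image    = /ubuntu-precise/
    os.endpoint = 'https://openstack.example.org:5000/v2.0/tokens'
  end
  # Reuse the same mediawiki-vagrant Puppet provisioning either way.
  config.vm.provision :puppet
end
```

The appeal is that the same provisioning code serves both local developer VMs and Labs instances.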

Goal: Use beta labs as a testing ground for the above deployment tooling work

ACTION: Greg convene conversation with labs folks post migration re labs-vagrant (including OpenStack API etc)

    • Greg: get hashar in loop as well please for CI purposes :-D

Slide 11 [Next quarter - Dependencies]

Ops: Deployment tooling

Greg: who is going to +2 things? Where do we draw the line between platform and ops? ACTION: Explore root for Bryan. Reach out to Mark B.

Slide 12 [Next quarter - Dependencies]

MW Core: Deployment tooling

All of engineering: Vagrant

ACTION: Have a plan for Vagrant

    • Mobile has a mobile specific roadmap

ACTION: determine fit within test infra explicitly

Two things:

   * Make MW Vagrant performant, reliable, and useful to developers, with someone to lead saying "no" to some features
   * Factor Vagrant into the test infrastructure so that it can be provisioned into Labs

Erik: I'm not convinced that we need to push forward on features

Ori: there's also a middle set of features that folks like Arthur have been driving that are more complicated than adding roles, such as adding Varnish.

Chris Steipp: for release engineering side of things

  • Are Mark & Markus (the external contractors releasing MW) represented in this meeting?
    • (Explicit choice made, but yes, there should be something similar to this with them included)
  • We need to figure out QA for our tarballs.

Chris McMahon: browser tests are being enhanced to support targeting a fresh wiki install (instead of being smoke tests for WMF prod/beta).
Rob: sec/general releases would be done automatically (i.e. via Jenkins).

ACTION: add MW release tarball as goal in next quarterly review

Slide 13 [Questions]

Any positions for 2014-15 we should open? Are we doing enough to promote NON-browser testing? (unit testing, integration testing)

  • See also: TitleValue RFC

Ori: we lack the experience needed for unit testing, e.g. mocking.

  • Structured exceptions in log
  • Try to identify the parts of the code that are biting us the most
    • via exposing the relationship between eg fatals and bugs and git blame

Deeper unit testing of frontend code

ACTION: figure out whether we need a central developer to generate metrics on unit tests, maintain the framework, etc.

  • Convene a meeting of experienced (unit testing) devs [needs owner]
  • Maybe reach out to the Wikidata team, which ended up writing a lot of tests for their extensions and even mw/core.