Selenium/Ruby/Workshops/Search features

Automated Tests in Plain English

 * Q: Why automate tests at the browser level?
 * A: They find bugs!


 * Q: How do we know what to test?
 * A: Browser tests are most useful for:
 * Features under development and near release. We want them to work properly in all browsers when they are deployed.
 * Features heavy in JavaScript. Different browsers interpret JavaScript in different ways, which is a potential source of error.
 * Features that require navigation through the application. Unit tests don't navigate.
 * High risk features. If something breaks often, we should have a test for it.
 * UploadWizard is a good example of a feature that is fragile when it is being changed.
 * Features that need to work all the time. Wikipedia doesn't work if Edit or Search is broken.


 * Q: How can I help?
 * A: If you know how something should work, you can tell us how to test it. You don't have to be a programmer, either.

Examples are Always Handy
Three different bugs, three different root causes
 * A global problem
 * A browser-specific problem
 * A configuration problem

A recent collaboration: https://github.com/wikimedia/qa-browsertests/blob/master/features/guided_tour.feature. This is a great test: it's JavaScript-heavy and involves navigating through the application.
 * Paired with Matt Flaschen of E3 on an acceptance test.
 * A little while later, the test found a pretty important bug: https://bugzilla.wikimedia.org/show_bug.cgi?id=45781
 * Note that Stephen Walling is actively testing this feature. Browser tests find bugs when humans aren't looking.
 * We have video of all of our test runs. This one clearly shows the error: https://saucelabs.com/jobs/076ed6588f314cfea816f4d70f2fe826

An older test: https://github.com/wikimedia/qa-browsertests/blob/master/features/aftv5.feature
 * This particular feature worked correctly in every browser except for IE9: https://bugzilla.wikimedia.org/show_bug.cgi?id=42551

A simple test that found a configuration bug unrelated to software: https://github.com/wikimedia/qa-browsertests/blob/master/features/math.feature
 * After the EQIAD data center migration, this test discovered that the "texvc" library had not been migrated to the new data center.

A recent collaboration with some challenges: https://github.com/wikimedia/qa-browsertests/blob/master/features/accept_language.feature
 * Paired with Siebrand Mazeland of the Language team. This test is a work in progress; the way the browser determines whether the Serbian language is presented in Latin characters or Cyrillic characters is complicated.
 * We're telling the browser that we want Serbian in Latin characters, but the browser shows us pages in Serbian using Cyrillic characters.

We have Mobile browser tests too: https://github.com/wikimedia/mediawiki-extensions-MobileFrontend/tree/master/tests/acceptance/features
 * They need some love

Demo time!
(screencast? hangout?)
 * WMF is preparing to make some changes to how Search works in Wikipedia
 * Search is pretty important! We need some good tests for Search.
 * Both desktop and mobile browsers need search tests.

Search demo: create Scenarios for Search tests
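
As a sketch of what the demo might produce, here is a hypothetical Cucumber scenario for Search. The wording, page names, and steps are illustrative only, not actual tests from the qa-browsertests repository:

```
Feature: Search

  Scenario: Search suggestions appear for an existing article
    Given I am at the main page
    When I type Selenium into the search box
    Then a list of suggested articles should appear

  Scenario: Searching for an exact title goes to that article
    Given I am at the main page
    When I search for Selenium
    Then I should be taken to the Selenium article
```

Plain-language scenarios like these are the starting point; the Ruby step definitions that drive the browser come later.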

Work with us
Who should contribute tests to the backlog?


 * Developers who want their features under development tested in the browser (for example E3 and GuidedTour)
 * Product/Project managers responsible for particular features (for example Language and AFTv5)
 * Wikipedians who depend on some feature or features to continue to function correctly (for example Math extension, UploadWizard, and we can even support experimental projects like PDBHandler)
 * Software testers interested in world-class test automation (we welcome community testers)

Željko and Chris pair program on tests every Friday (at least!). You can help us decide what to work on next, and you can join in if you want. Pairing is the best way to write tests.

We are a small team. Contributing to our test backlog not only helps ensure that our software works correctly, it gives YOU skills you can take away.

Pair! Collaborate! Drop us tests on the backlog if you're too busy to talk in person. Given/When/Then mastery is not necessary; we can work with any plain language description of how a software feature should work.
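
For example, a plain description such as "a page that contains math markup should show the formula as a rendered image" might become something like this (hypothetical wording, not the actual math.feature):

```
Scenario: Math markup renders
  Given I am viewing a page that contains math markup
  Then the formula should be displayed as a rendered image
```

If you can describe the behavior in a sentence, we can turn it into a scenario together.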