Selenium/Ruby/Browser testing user satisfaction survey

The goal of the survey was to get a baseline of user satisfaction with our browser-testing infrastructure and tooling. The target audience was the members of a few mailing lists: wikitech-l, engineering and qa. The survey was created as a Google Docs form. It had 5 sections, and each section had a few questions. The last question in each section was a free-form text field, so participants could leave any feedback. Free-form answers are summarized, as required by the survey privacy statement. Most of the questions used a simple 5-level linear scale, from 1 to 5, or from :'( to :D:

:'( = 1
:( = 2
:| = 3
:) = 4
:D = 5

In the results below, each line shows the rating, the number of responses, and the rounded percentage of respondents for that question. For example, ":'( 1 6%" means that one respondent (1/16, about 6%) gave the lowest rating.

18 people participated in the survey. Each question received between 8 and 16 answers. All details (including all questions) can be found in Phabricator task T131123.

Jenkins

Takeaways:

  • Survey participants are fairly happy with the stability and speed of the mwext-mw-selenium jobs.
    • Note that right now some mwext-mw-selenium jobs are failing with hard-to-reproduce failures, and some jobs take hours to run.
  • Most people also know how to use the continuous integration entry points for Ruby (Rake) and JavaScript (Grunt); a sketch of such an entry point follows below.
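
For context, the Rake entry point typically bundles the individual checks behind one default task, so both CI and developers only need to run "bundle exec rake". Below is a minimal sketch of such a Rakefile; the task set is an assumption for illustration, not the actual CI configuration. Grunt plays the analogous role on the JavaScript side, with a Gruntfile registering a default task.

    # Rakefile -- illustrative sketch, not the actual CI configuration.
    require 'rspec/core/rake_task'
    require 'rubocop/rake_task'

    RSpec::Core::RakeTask.new(:spec)  # runs the RSpec suite
    RuboCop::RakeTask.new(:rubocop)   # runs the Ruby style checks

    # CI invokes a single entry point: bundle exec rake
    task default: [:rubocop, :spec]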

Survey comments (paraphrased):

  • The majority of the failures were related to CI itself, not to actual failures of the tested code. Participants have trouble debugging such failures and do not understand how our continuous integration works.

Gerrit-triggered Selenium+Jenkins jobs (mwext-mw-selenium, run after patch set submission)

Jobs are stable enough (n=16)

:'( 1 6%
:( 2 13%
:| 2 13%
:) 9 56%
:D 2 13%

Jobs are fast enough (n=16)

:'( 1 6%
:( 4 25%
:| 4 25%
:) 6 38%
:D 1 6%

Time-triggered Selenium+Jenkins jobs (selenium*, run daily)

Jobs are stable enough (n=16)

:'( 2 13%
:( 3 19%
:| 5 31%
:) 5 31%
:D 1 6%

Jobs are fast enough (n=16)

:'( 1 6%
:( 2 13%
:| 4 25%
:) 8 50%
:D 1 6%

I know how to use continuous integration entry points

Rake (Ruby) (n=16)

:'( 3 19%
:( 0 0%
:| 7 44%
:) 3 19%
:D 3 19%

Grunt (JavaScript) (n=16)

:'( 1 6%
:( 2 13%
:| 4 25%
:) 3 19%
:D 6 38%

Ruby

Takeaways:

  • It is interesting that slightly more people are happy with the Ruby testing tools (mediawiki_selenium, Cucumber, RSpec) than are unhappy with them.
  • The majority of people did not have strong feelings either way. We expected more people to be unhappy with the Ruby testing tools.

Survey comments (paraphrased):

  • Ruby is not the optimal choice for testing tools; another language that is more popular with MediaWiki developers would be a better choice. Some libraries we use are poorly documented.
  • Ruby is not the best language in our context; Node.js would be a better choice.
  • Having fewer languages in the stack would simplify things, but Ruby has good testing tools, so the decision is not easy (a short sketch of these tools follows below).
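
To make the tool names concrete, here is a minimal sketch of the style these tools encourage: a Cucumber step definition driving a page object, as used with mediawiki_selenium. The step wording, the page class, and its helper method are hypothetical, not taken from an actual test suite.

    # features/step_definitions/login_steps.rb -- illustrative sketch.
    # visit/on come from the page-object gem that mediawiki_selenium
    # builds on; LoginPage below is a hypothetical page object.
    Given(/^I am on the login page$/) do
      visit(LoginPage)
    end

    When(/^I log in as "(.*?)"$/) do |user|
      on(LoginPage).login_with(user, ENV['MEDIAWIKI_PASSWORD'])
    end

    # A matching (hypothetical) page object, mapping element locators
    # to generated accessor methods.
    class LoginPage
      include PageObject
      page_url 'index.php?title=Special:UserLogin'
      text_field(:username, id: 'wpName1')
      text_field(:password, id: 'wpPassword1')
      button(:login, id: 'wpLoginAttempt')

      def login_with(user, password)
        self.username = user
        self.password = password
        login
      end
    end

RSpec, the third tool asked about, is a plain-Ruby spec framework without the Gherkin layer.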

How do you feel about

mediawiki_selenium (n=15)

:'( 1 7%
:( 3 20%
:| 6 40%
:) 4 27%
:D 1 7%

Cucumber (n=14)

:'( 1 7%
:( 2 14%
:| 5 36%
:) 4 29%
:D 2 14%

RSpec (n=15)

:'( 2 13%
:( 2 13%
:| 6 40%
:) 4 27%
:D 1 7%

JavaScript

Takeaways:

  • The JavaScript testing framework Malu is in its early stages of development, and this survey was probably the first time some of the participants had heard of it. That might explain why Malu got the highest share of :| (3) responses: 11 participants (69%), more than any single answer received on any other question. Participants were generally positive towards Mocha and QUnit.

Survey comments (paraphrased):

  • Did not hear about Malu before and did not find much documentation when they went looking. No opinion on Mocha. QUnit does the job well, but has its problems.
  • Not familiar with Malu but likes the idea. Prefers Mocha to QUnit, but is not sure whether adding another testing framework would cause problems.

How do you feel about

Malu (n=16)

:'( 2 13%
:( 0 0%
:| 11 69%
:) 2 13%
:D 1 6%

Mocha (n=15)

:'( 1 7%
:( 0 0%
:| 5 33%
:) 6 40%
:D 3 20%

QUnit (n=15)

:'( 2 13%
:( 1 7%
:| 5 33%
:) 4 27%
:D 3 20%

Selenium

Takeaways:

  • It is surprising that participants were fairly happy with writing tests.
  • It was no surprise that feelings were different when it came to fixing failed tests.

Survey comments (paraphrased):

  • A lot of time is spent fixing tests because the testing environment is not stable and the tests are not well written, the latter caused by a lack of understanding of the framework (a common example follows below).
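
One frequent source of both instability and badly written tests is timing: a step interacts with an element before the page has finished rendering. The sketch below shows the usual fix, an explicit wait, in Watir (which mediawiki_selenium wraps); the browser setup and the element ID are assumptions for illustration.

    require 'watir'
    browser = Watir::Browser.new :firefox   # hypothetical setup

    # Brittle: fails intermittently when the button renders slowly.
    browser.button(id: 'save').click

    # More robust: explicitly wait until the element is present
    # before interacting with it (Watir 6+ syntax).
    Watir::Wait.until(timeout: 10) { browser.button(id: 'save').present? }
    browser.button(id: 'save').click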

How do you feel about

Writing tests (n=16)

:'( 3 19%
:( 1 6%
:| 3 19%
:) 7 44%
:D 2 13%

Fixing failed tests (n=15)

:'( 2 13%
:( 4 27%
:| 4 27%
:) 1 7%
:D 4 27%

Getting help

Takeaways:

  • Our documentation (at mediawiki.org) is neither good enough nor up to date (for example, the Browser testing page), but participants were happier with it than expected.
  • The most helpful channel appears to be IRC (#wikimedia-releng), followed to some degree by Phabricator (#browser-tests, #browser-tests-infrastructure) and the mailing lists (qa).

Survey comments (paraphrased):

  • Documentation is not great. The getting-started documentation targets Mac. The getting-started tutorial should be as simple as possible, and the rest of the documentation should be moved to other pages.
  • Would like to know more about the testing infrastructure, preferably as a session.
  • Hard to get help; a better process for getting help is needed. Pair programming is hard to schedule.

Rate your experience getting help via the below methods

mediawiki.org (n=16)

:'( 1 6%
:( 3 19%
:| 6 38%
:) 5 31%
:D 1 6%

IRC (n=16)

:'( 0 0%
:( 1 6%
:| 3 19%
:) 7 44%
:D 5 31%

Phabricator (n=16)

:'( 0 0%
:( 2 13%
:| 8 50%
:) 5 31%
:D 1 6%

Mailing list (n=16)

:'( 1 6%
:( 1 6%
:| 10 63%
:) 4 25%
:D 0 0%

Documentation

Takeaways:

  • Documentation is important. There is never enough of it, and what is out there gets outdated quickly.
  • Both our getting-started documentation (example: Setup instructions) and our general documentation (example: Browser testing) got more positive than negative answers.

How do you feel about

Getting started documentation (n=16)

:'( 0 0%
:( 4 25%
:| 6 38%
:) 4 25%
:D 2 13%

Documentation on how to do testing (n=16)

:'( 0 0%
:( 2 13%
:| 8 50%
:) 4 25%
:D 2 13%

I need help

Takeaways:

  • We have offered training in the past, but we were not sure what type of training people prefer.
  • Based on the answers, participants like pairing, on-line workshops, and in-person workshops about equally.
  • That is true for both getting-started sessions and advanced sessions (like fixing failed tests).

I need help

I need help with getting started with testing (n=9)

I would like to pair with somebody. 5 56%
I would like to attend an on-line workshop. 5 56%
I would like to attend an in-person workshop (hackathon, conference). 5 56%
Other 0 0%

I need help with fixing failed tests (n=8)

I would like to pair with somebody. 4 50%
I would like to attend an on-line workshop. 4 50%
I would like to attend an in-person workshop (hackathon, conference). 4 50%
Other 0 0%
