Code review management/July 2011 training

July 19, 2011 2:30pm-5:15pm PDT in the WMF San Francisco office.

Training and discussion of code review. Stepped through documents like the Security for developers document and other training material for reviewers. One goal: to spin off improvements in those documents, and create a training rubric that volunteers and staff can use to train at hackathons.


 * 2:30pm-3:15pm - 10 minute talks + a little slack
 * RobLa - 5 min intro
 * Roan - the basics
 * Brion - general philosophy - depth of review, etc.
 * Tim (on the phone) - stability and performance + general philosophy
 * 3:15pm-3:30pm - Discussion/Q&A
 * 3:30pm-4:15pm - code review mini-sprint.
 * 4:15pm-4:45pm
 * Trevor - unit testing
 * Brandon - UI considerations
 * Chad - security
 * 4:45pm-5:00pm Discussion/wrapup

Etherpad: http://etherpad.wikimedia.org/CodeReviewTraining2011

Notes below, copied from Etherpad.

RobLa: intro

 * Idea is to get everyone familiar with CR, even if you're not gonna be doing it yourself
 * We're going to be doing these regularly
 * Previously a bottleneck in our dev process: we let the backlog grow, then sprinted to clear it
 * Current CR trunk stats: http://toolserver.org/~robla/crstats/crstats.trunkall.html
 * Monthly/weekly deployments means we need a way smaller backlog

Roan — basics of code review

 * Special:Code - our code review tool (Roan showing http://www.mediawiki.org/wiki/Special:Code/MediaWiki )
 * MW extension that integrates with svn — gives a list of revisions, along with their authors, status, comments, diffs, etc
 * Click on a revision ID to get the diff, modified paths, comments


 * Comments are used to point out errors, omissions, and suggested improvements in the first instance — MediaWiki sends you a magic email


 * Each revision has a status, which can be modified by people in the appropriate group.
 * new = Default state after committing. Also used if a revision was fixme and the committer thinks they resolved it (needs re-review)
 * fixme = Wrong; has to be fixed ASAP. The reviewer must leave a comment explaining the issue
 * reverted = It has been reverted in a follow-up revision
 * resolved = Issues are CONFIRMED fixed in another follow-up revision by the original reviewer (not just addressed) — you usually can't set your own code as resolved.
 * ok = OK (don't do it to your own code! CodeReview won't let you anyway)
 * deferred = We don't think this revision needs review (a nicer way of saying it - what would be a reason we don't think a revision needs review?)
 * old = Old code that has been superseded (e.g. old REL1_18 branch merge revisions), or is really old and there was no benefit from actually reviewing the code


 * Each revision can also have tags.


 * Common tags: http://www.mediawiki.org/wiki/Code_review_tags
 * Full list of tags: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/tags
 * scaptrap — this might cause issues with updating on Wikimedia sites (configuration changes, schema changes, parser caches, dependencies). There will usually be a code comment that explains the problem. Before you deploy any revisions, check for scaptrap tags!
 * http://www.mediawiki.org/wiki/Special:Code/MediaWiki/tag/scaptrap
 * Review assignment: tag = committer name
 * Branch names (1.17, 1.17wmf1, 1.18, etc) — needs backporting to branch
 * schema — includes a schema change
 * Tags are free form and can be added as needed (also: if you remove the last usage of a tag, it disappears from use)


 * Follow-up revisions: the association is automatically detected if you mention the revision as rXXXXX in the commit summary, if you use the same bug number (in the format "bug X"), or if you manually add an association
 * You get an email if anybody makes a follow-up revision, or if somebody comments on a follow up revision.
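The auto-association patterns above can be sketched as text matching. This is a hypothetical illustration (the CodeReview extension is PHP and its real parsing may differ in detail); the function name is made up.

```python
import re

def find_associations(commit_summary):
    # Hypothetical sketch of the two textual patterns CodeReview detects:
    # rXXXXX revision mentions and "bug X" numbers. The extension's real
    # parsing may differ in detail.
    revs = [int(n) for n in re.findall(r"\br(\d+)\b", commit_summary)]
    bugs = [int(n) for n in re.findall(r"\bbug (\d+)\b", commit_summary,
                                       re.IGNORECASE)]
    return revs, bugs

revs, bugs = find_associations("Follow-up to r12345: fixes regression (bug 678)")
print(revs, bugs)
```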

todo: add watch for CR https://bugzilla.wikimedia.org/show_bug.cgi?id=22094


 * Sign-offs: you can record that you've inspected or tested a revision. Contrary to the name, it is NOT a "sign-off" that marks the revision as OK (taking responsibility for it so that nobody else looks at it). It's designed to lower the bar to participation in code review.
 * You can search by path, author, status, tags with various interface elements.
 * Link your account (or get someone with "Coder" rights to do it for you)! See the new committer guide for how to do so.
 * You need "Coder" rights to be able to change revision statuses

Deployment

 * We run a separate branch, currently 1.17wmf1. It's a cleaned-up snapshot of trunk at a particular time, with access protection. MFT = merge from trunk
 * The short version of how to deploy a particular code revision: do a subversion merge from trunk to /branches/wmf/BRANCH, do an svn up on fenari, then run the push script.

Brion and his TARDIS
Time And Relative Dimension In Space. Whoever this is needs educating ^^
 * Target, Attack, Revert, Defend, Insist, Stop?
 * Lots of current stuff is based on old habits.
 * We used to run "directly off trunk" — we just didn't run svn up.
 * Code review was done by replying to the CVS/SVN email notifications

How to review code:
 * See what the committer is trying to do
 * Verify that it does what it is claiming to do
 * Verify that we want to do what the committer wants to do
 * Check that the code will be maintainable in the future — tests if appropriate (parser)
 * Check for security, performance issues, and other consequences

Examples:
 * Coding conventions and maintainability ( e.g. opaque boolean parameters)
 * Make sure that we have tests for things
 * Consistent coding and comment style (with the surrounding file)
 * phpDocumentor comments (web-based documentation is automatically generated, and IDEs also suck them in) — DO IT
 * Certain people will love you more
 * Make sure you have visibility modifiers
 * Don't go changing visibility for old code -- it can break stuff!
 * Refactors — check if you are breaking extensions
 * grep is your friend! (ack is better) (Syntactical grep is also useful)
 * Or use http://toolserver.org/~krinkle/wikimedia-svn-search/ if you don't want to svn-checkout GBs of code and wait 5+ minutes

Tim's security and performance talk

 * MySQL query performance! People often write queries that are not indexed and/or scan a lot of rows. You need to understand what MySQL is doing when it executes your query
 * Example: namespace filter boxes all over the place. In a lot of cases the WHERE page_namespace=N is not indexed. MySQL might have to scan a lot of rows to satisfy the LIMIT, potentially scanning 100k rows to get 50
 * Adding unused indexes damages write performance, so just adding an index doesn't necessarily make things faster. Discuss DB-related things on IRC with e.g. Tim, Domas or Asher. People who aren't MySQL experts but can help with indexing questions include Chad (^demon), Sam (Reedy), Roan (RoanKattouw), Ian (raindrift)
 * Use EXPLAIN to find out information about your query -- tells you useful things like which indexes are being used, whether there's a filesort, etc.
 * Remember that MySQL will only use one index per table per query. If you're matching on multiple columns, you may need a concatenated index.
 * Also remember that ORDER BY needs to be indexed too, usually even more so than WHERE
 * Long timeouts can also take the site down, if the service you're hitting goes down and every request times out
 * Security
 * Read Security for developers. It's shorter than it was before; it's been split up into per-vulnerability subpages.
 * See Xkcd http://xkcd.com/327/ for SQL injection example
 * Tim explains SQL injection, and text protocol injection in general
 * Criteria for reverting:
 * broken interfaces (because it infects other code)
 * Breaking tests, unless you're around to fix it immediately
 * If you're committing, try to be on IRC so that you can be tracked down as appropriate :-)
 * Erik: tests reduce need for human involvement -- if code fails a test, that's enough to set it to fixme
 * We need a central place that lists *ALL* types of tests that we're expected to write, and how to write them.
 * Read mw:Manual:JavaScript unit testing and mw:Manual:PHP unit testing
 * For more of a narrative, see http://www.mediawiki.org/wiki/Requests_for_comment/Unit_testing
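Tim's indexing point can be demonstrated in a few lines. This is a minimal sketch: sqlite3 stands in for MySQL so it runs standalone (MySQL's EXPLAIN output looks different, but the index-vs-full-scan principle is the same), and the simplified "page" table and index name are hypothetical.

```python
import sqlite3

# Simplified stand-in for MediaWiki's page table, with rows spread
# across 16 namespaces.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE page (page_id INTEGER PRIMARY KEY,"
           " page_namespace INTEGER, page_title TEXT)")
db.executemany("INSERT INTO page (page_namespace, page_title) VALUES (?, ?)",
               [(i % 16, "Title_%d" % i) for i in range(1000)])

def plan(sql):
    # sqlite's analogue of MySQL's EXPLAIN: shows whether the query
    # walks an index or scans the whole table.
    return " ".join(row[-1] for row in db.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT page_title FROM page WHERE page_namespace = 4 LIMIT 50"
before = plan(query)   # no index on page_namespace: full table scan
db.execute("CREATE INDEX page_ns ON page (page_namespace)")
after = plan(query)    # now satisfied via the page_ns index
print(before)
print(after)
```

The same before/after comparison with MySQL's EXPLAIN is how you verify that a namespace filter like the one in Tim's example actually hits an index.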

Break
Things that came up:
 * The "todo" tag: a milder form of "fixme"
 * Not marking fundraising code as "deferred"
 * Review process for non-MediaWiki code - example: Drupal/CiviCRM modules
 * Run "php -l" before you commit
 * (better yet, test thoroughly before you commit)
 * Sumana can hum the law and order theme

Testing with PHPUnit (Trevor)
Write tests! Having untested code is dangerous.
 * Unit tests are like penguins. Integration tests are like parties. They're both like mexican fighting fish. Einstein invented gravity.
 * Keep them separate, do not mix unit testing with integration testing.
 * Use the right and specific assertions (which are like cooks) for the right test. Don't use assertTrue( foo == 5 ). Otherwise the test report will give useless data like "test foo didn't return true but false", whereas you want "test foo didn't return 5, but 12".
 * Use sentence case descriptions. There is no problem with having long descriptions.
 * Inventing:
 * Write tests: write a test for the behavior you want to live up to. It will fail at first, but it gives you a nice view of what needs to be done and how you imagine it will work.
 * Write code.
 * Run tests.
 * Refactoring:
 * Before you refactor, change the tests ("break them"). Then perform the actual changes, and when you're done, run the tests.

Q: How and where to write and run tests for Extensions ? A: http://www.mediawiki.org/wiki/Manual:PHP_unit_testing#Writing_Unit_Test_for_Extensions

Q: Parser tests for code and extensions..? (not PHPUnit) A: Put them in the extension folder and add them to the registry through a wgHook (which one?)

Read the docs! Read the manual. We need to unify & organize the docs more. If all of us spent 5 min on it, it would be way better.

Use PHPUnit in a standard way, not a creative way.

We've extended the base class, so your test classes will be extending MediaWikiTestCase

Brandon on UI review
Brandon should be reviewing things like "I wanna add 10 new checkboxes" or "make the background of the page black"

example: recently someone checked in a change to trunk that changed the behavior of the search box to focus-follows-mouse. It got deployed to translatewiki.net and users hated it

we should avoid this sort of thing

such as "log me in globally" checkboxes

we should tag them with "design" or something like that

when you see a change that does something like that, we should talk about it

types of things we wanna focus on:


 * not so much SMW. We do not deploy it.

Most important:
 * ones readers face, go into an extension we have deployed in all wikis
 * readers of en & de
 * editors anywhere

the fewer users, the less it's on our radar. Pending Changes has fewer users than FlaggedRevs.

Search box, on the other hand, top priority

[did not catch Rob's question] - what do you look for when you're reviewing UI changes?

often all Brandon does is look at the screenshot provided?

Rob asks other reviewers: have you noticed yourselves turning the other way because you don't want to review UI things at all?

Brion says he will usually at least whine about UI issues; Trevor says he will revert things

Erik asks: Brandon, you have started a style guide on MW, and an asset collection of buttons, etc. what's your plan?

Should we start a HIG? Human_interface_guidelines

There is a design doc/style guide out there, has been for about a month, generated conversation, we can make changes.

Way forward: Timo is writing an extension in which he modifies the HTMLForm class to generate forms in style-guide format. Once it's done, we'll backport it into trunk, so anything that uses HTMLForm will magically pick all this up.

Timo: a class in MW called HTMLForm makes it easy to generate forms in a standardized way. Right now it uses the browser's native styling

This extension is a lot of work: so many forms! Basically creating a new class that will replace the HTMLForm class? Replace output automatically?

Andrew Garrett: "I wrote HTMLForm and want to help"

jorm: going forward, how do we enforce this? We have volunteers & they may not wanna do it this way. I'd feel hypocritical if we said: from now on, do it using HTMLForm or we will revert you.

In the future, take forms & start porting them into HTMLForm... the login form is a good example; legacy things that would not be that big a deal

Aaron: hashar added a RecentChanges checkbox.... recommendations.... is that ? jorm: I'd want to look at that. Tag it as design.

Is there a tag already? No, but we should have one

We have to, in the long run, develop... a practice? Lots of people have to have this practice

Sumana provides this link on Launchpad's UI review process: https://dev.launchpad.net/UI/Reviews

a small review team will be a bottleneck. The style guide & HTMLForm will help train lots of nerds in UI review, in the long run

did not catch that?

At Google, you get certified to write in certain languages before you can commit

Brandon likes the idea of having a measure of trust. It is a complicated thing, but we could try to make it as easy as possible.

[did not catch that response]

We ought to start having design review BEFORE code commit: "we're gonna redo this, this is how it will behave," and get design documents.

Aaron: community-driven stuff like PendingChanges... hard to resist what community wants!

PendingChanges is special. Design by committee leads to poor design. 2500 people voting on design: should not happen! 3-4 people in a room pondering data & making a decision: that is what we want.

Erik: it's common that, if you open the possibility of having a feature/option, you'll get some people who say "I'd like that!" So an important piece: maybe not a cabal, but we have to scrutinize -- "we have too many preferences!" In Pending Changes, the entire spec evolved by community. What we'd like: the community suggests, via Bugzilla or something, "we'd like this feature," & then we talk, build an interactive prototype? We don't want a mix of spec, rough UI, and voting. Bad process. Have to push back. We do have 300 user testing -- usertesting.com accounts? tests? Get a quick UX test by real human beings! You get a video back. We did this extensively with UploadWizard; it was very helpful, and it's something we can use to show the community how real people interact with something they want. Proof of problems surfacing.

Chad on security

 * Read http://www.mediawiki.org/wiki/Security_for_developers before you commit anything else

XSS: Cross-site scripting

 * Happens when you don't escape your HTML output — you need to distinguish between putting Wikitext, HTML and text into your page.
 * Wikitext needs to be parsed using OutputPage::parse
 * text needs to be escaped with htmlspecialchars
 * HTML needs to be sanitised if it's user input, otherwise can be output properly
 * In general: handle your escaping with the Html or Xml classes.
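The escaping rules above are about PHP (htmlspecialchars, the Html/Xml classes); here is the escape-at-output principle sketched in Python with the stdlib's html.escape. The link_to_user() helper and the page path are hypothetical.

```python
import html

def link_to_user(user_name):
    # Hypothetical fragment builder. user_name is untrusted input, so it
    # is escaped at output time -- the same job htmlspecialchars() and
    # the Html/Xml classes do in MediaWiki.
    return '<a href="/wiki/User:%s">%s</a>' % (
        html.escape(user_name, quote=True),
        html.escape(user_name),
    )

evil = '"><script>alert(1)</script>'
rendered = link_to_user(evil)
print(rendered)  # the payload survives only as inert text, not as markup
```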

CSRF: Cross-site request forgery

 * Happens when you have a write action form that can be submitted off-site
 * Any write action should require an "edit token"; otherwise any website on the internet can perform that action using the session of any user who visits their site
 * See http://www.mediawiki.org/wiki/Cross-site_request_forgery#Defending_MediaWiki_against_CSRF_attacks
 * Use nice messages — you sometimes get token failures for legitimate users who have lost their cookies. Don't just exit or throw an exception.
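A minimal sketch of the edit-token idea, in Python. MediaWiki's actual tokens are tied to session data differently (see the linked page); the function names and token scheme here are hypothetical.

```python
import hashlib
import hmac
import secrets

SECRET = secrets.token_bytes(32)  # server-side secret; hypothetical setup

def edit_token(session_id):
    # A token derived from the session: an off-site attacker can make the
    # victim's browser *send* a request, but cannot read or guess this value.
    return hmac.new(SECRET, session_id.encode(), hashlib.sha256).hexdigest()

def check_token(session_id, submitted):
    # Constant-time comparison. On failure, show a friendly "please
    # resubmit" message -- legitimate users lose cookies sometimes.
    return hmac.compare_digest(edit_token(session_id), submitted)

session = "abc123"
ok = check_token(session, edit_token(session))   # legitimate form post
forged = check_token(session, "deadbeef")        # off-site guess fails
print(ok, forged)
```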

Register Globals
An evil feature of PHP, deprecated for years and removed entirely as of PHP 5.4. \o/

Anything you submit in the query string would be set as a global variable within the PHP script. An awful design decision. Basically: NEVER use an uninitialised global. ALWAYS explicitly set it at the beginning of your application to some default value. You can't just check if it's initialised or not, because a malicious user could set it themselves via the query string. Easy to protect against, but dangerous if you don't. Many third-party users on shared hosts suffer from this being left on.

Easy to protect against by always defining variables. Never just check whether a variable is set and only set it if it is not: if an attacker passes it as a query parameter, the script will skip the check and end up using whatever was passed.
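The defense translates to any language: initialize everything explicitly and never let request data create variables. A sketch in Python (the setting names are hypothetical):

```python
from urllib.parse import parse_qs

def get_settings(query_string):
    # The register_globals defense, sketched in Python: every setting
    # gets an explicit default, and only known keys are overridden from
    # the request. Nothing the client sends can create a variable you
    # did not declare.
    settings = {"skin": "vector", "limit": 50}  # hypothetical defaults
    params = parse_qs(query_string)
    if "limit" in params:
        settings["limit"] = int(params["limit"][0])
    return settings

cfg = get_settings("limit=10&is_admin=1")
print(cfg)  # "is_admin" never appears: it was not declared
```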

SQL Injections
Tim already talked about this. Mainly: the best thing is to always use our DB wrappers. We have methods for building queries, insert, delete, etc. We probably have (or should have) a wrapper for everything. Ensure input to the DB is always escaped as it should be.
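The wrapper principle boils down to parameterized queries. A runnable sketch using Python's sqlite3 (MediaWiki's PHP wrappers like select()/insert() do the escaping in the same spirit; the table and helper here are hypothetical):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE user (user_name TEXT)")
db.execute("INSERT INTO user VALUES ('Alice')")

def find_user(name):
    # The driver binds the value; user input never becomes SQL text.
    return db.execute("SELECT user_name FROM user WHERE user_name = ?",
                      (name,)).fetchall()

print(find_user("Alice"))
print(find_user("' OR '1'='1"))  # classic payload, treated as a literal name
```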

Sanitize data, input, output.

One last minor thing: dynamic code generation. Please never use eval, create_function, or the /e modifier to preg_replace. They execute whatever you give them as code, and you may not have sanitized it as well as you thought you did.

The same sorts of issues exist in parameters when shelling out. Escape parameters for that; always do it if you need to shell out to an external binary
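Shell escaping, sketched in Python (MediaWiki's PHP equivalent for quoting is wfEscapeShellArg(); the hostile filename here is made up, and this assumes a POSIX system with printf on the PATH):

```python
import shlex
import subprocess

filename = "page; rm -rf /"  # hostile input, hypothetical

# Safest: pass an argument vector, so no shell ever parses the input.
out1 = subprocess.run(["printf", "%s", filename],
                      capture_output=True, text=True).stdout

# If a shell string is unavoidable, quote every argument first -- the
# same job wfEscapeShellArg() does in MediaWiki.
out2 = subprocess.run("printf %s " + shlex.quote(filename),
                      shell=True, capture_output=True, text=True).stdout

print(out1)
print(out2)
```

Either way the semicolon and rm never reach a shell as commands; they come out as literal text.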

Questions?

RobLa: XSS: in particular, it's easy to underestimate the impact. Consider: once someone can put arbitrary JS into a page, they can do things on behalf of that user, including malicious things. XSS is not necessarily a problem on its own, depending on severity, but it is an entry vector to other things. Ghost stories requested of Tim. Tim: the CentralNotice extension didn't have CSRF protection. It allowed arbitrary HTML to be inserted onto every page on every wiki! Worst-case scenario: use this to insert JS into every page, & within 5 minutes all visiting stewards would have viewed it and run the injected script. An attacker could use it, running as privileged users, to de-sysop every user on every wiki and vandalize all websites. No ability to revert it.

RobLa: that was a combo: a potential XSS vuln and a CSRF vuln we had. Sort of a tricky code review case. Miscommunication occurred: some saw the XSS, missed the CSRF.

Tim: "Stored XSS" -- use a vuln to insert malicious code into the db, where it persists. "As nasty as you can imagine." Blocking all users... using everyone's bandwidth to attack the site.

XSS allows account takeover more generally.

Tim: XSS vulns let you get passwords. You can rewrite all the page HTML, so you could put in a "you've been logged out" page and gather usernames, passwords, emails, etc. via phishing.

RobLa: moral of the story: one can downplay XSS attacks. Let's not. Also, often ignored and worth repeating: validate your input & escape your output. Maybe you can skip the first part; never skip the second. The reason, from a code review perspective: it's sometimes hard to check the first part, but easy to check the escaping. It's not good enough for code to be secure; it has to be obvious that it's secure. Otherwise the reviewer might think "surely someone has considered this"

There are methods, classes, and protections for developers: the Sanitizer class, the Html & Xml classes, all the DB methods. They are there. Use them & do not try to do it yourself

Conclusion
Arthur: last year, during the fundraiser, all fundraiser code coming through Code Review was marked deferred. We're still doing that; let's stop, and ensure fundraiser code gets reviewed. We're trying to be more self-sufficient & build capacity for code review, but we will need help

Also: we have set up a separate code repo for non-MW code. We request review help from people who know security for Drupal, Drupal modules, & CiviCRM

Advice from someone: you should do cross-reviews as much as possible. The PayPal Pro Gateway, for example

Tim: I've reviewed Payflow stuff; it's not that hard for someone who is familiar with MW. Drupal is PHP, so it will feel mostly familiar

CodeReview extension: does it support multiple repositories? How should we work this? There are many unreviewed revisions, but some are separate from MW...

Inside the Wikimedia repo: CiviCRM and other non-MW fundraising stuff, also integrated with the CodeReview tool. On the other hand, CentralNotice & DonationInterface are MW things the fundraising team works on *in* MediaWiki

The Toolserver summary stats refer to the /mediawiki/trunk directory, not anything else — so the precious statistics aren't being "polluted" by the fundraising people. In short: don't defer it!

T-H-A-N-K-S!