User talk:AKlapper (WMF)/Code Review

Re: Recognize active CR+1 users based on korma statistics
Where are the CR+1/CR-1 statistics on that page? The second table is about +2/-2 (seemingly excluding self-merges); the first table seems to cover any "Actions", as the header states, so it presumably also counts the number of patchsets submitted, as well as any comment without a CR tag. Nemo 12:34, 3 January 2016 (UTC)
 * Thanks for catching this. I've clarified in that section that these statistics do not exist yet. --AKlapper (WMF) (talk) 18:53, 4 January 2016 (UTC)

Re: Developers/Maintainers
Why do you call the page outdated? Nemo 12:38, 3 January 2016 (UTC)
 * The main issue I see with that page is that links to teams are practically useless: you can only add individuals to a Gerrit patch, not a team. When there is only a team in the maintainer column, it usually means the component is unmaintained (that is, getting stuff reviewed is hard). Nemo 12:39, 3 January 2016 (UTC)
 * I call the page outdated because it is. I don't consider it fixable, as it's broken by design. It lists deprecated Analytics projects, it lists undeployed stuff (like Nagios or Nimsoft), it lists stuff long-finished without any specific 'project' (like VirtualRestService), it lists people as maintainers who aren't active anymore (like Chris, Ryan) or moved on (like Chad, Timo), it's unclear to me how "Key extensions" were defined, or where those random items under "MediaWiki core" come from (what is "Extension support"? "Security"? What is "General/Unknown"?). And "Anyone can list themselves as in training [...], signaling that they want [...] achieve maintainership" might create wrong expectations, as no one will follow up on it. --AKlapper (WMF) (talk) 22:32, 4 January 2016 (UTC)
 * Other than the Analytics repositories, which nobody really understands and which were originally not included in the page, the rest follows pretty clear criteria. All core components are included; "key extensions" are defined by a very objective metric. It's possible that we're missing some recently created component due to the messiness of Phabricator; that's T76942.
 * As for the names you mention, the mere fact that they changed their main focus doesn't automatically mean there are currently better reviewers, although perhaps there are a couple of rows to update. Names of WMF employees are, of course, the most likely to quickly become outdated.
 * The other point you make is correct, of course, but it has nothing to do with the list, which merely documents reality. Nobody has an obligation to review code, so there can't be a list of people one can "expect" to follow up. Nemo 10:20, 5 January 2016 (UTC)

Number of reviewers
Re "two reviewers find an optimal number of defects": we should encourage people to remove themselves from the reviewer list when they are certain they won't review the patch. A lot of noise and wasted time is created by people failing to keep their dashboards clean... sadly, the Gerrit cleanup day made no change to the total number of open review requests (open changesets multiplied by the number of reviewers on each). It would be nice to know how many people even look at their dashboards, maybe by checking the Gerrit web server logs. Nemo 10:20, 5 January 2016 (UTC)
 * Thanks. Added. --AKlapper (WMF) (talk) 18:06, 12 January 2016 (UTC)
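For clarity, the "total number of open review requests" metric mentioned above could be computed from Gerrit's REST API output. A minimal sketch, assuming the open changes were fetched with the `DETAILED_LABELS` option (e.g. `GET /changes/?q=status:open&o=DETAILED_LABELS`), so that each change carries its reviewer list under `labels.Code-Review.all`; the instance and query are illustrative assumptions:

```python
def review_request_count(changes):
    """Sum, over all open changesets, of the number of reviewers on each.

    `changes` is the JSON list returned by Gerrit's changes endpoint when
    queried with o=DETAILED_LABELS; reviewers then appear per change under
    labels -> "Code-Review" -> "all".
    """
    total = 0
    for change in changes:
        reviewers = change.get("labels", {}).get("Code-Review", {}).get("all", [])
        total += len(reviewers)
    return total
```

Note that when fetching the response over HTTP, Gerrit prefixes its JSON with the `)]}'` line, which has to be stripped before parsing.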

Clustering/sequencing
Do these groupings resonate with you? --KSmith (WMF) (talk) 00:00, 6 January 2017 (UTC)
 * Culture
   * 1.6 Weak review culture
   * 1.10 No culture to improve changesets by other contributors
 * Resourcing
   * 1.2 Lack of enough skillful, available, confident reviewers and mergers
   * 1.3 Under-resourced or unclear responsibilities
   * 1.7 Workload of existing reviewers
 * Skill/knowledge
   * 1.1 Unstructured review approach
   * 1.5 Unhelpful reviewer comments
   * 1.8 Poor quality of contributors' patches
 * Documentation/communication
   * 1.4 Hard to identify good reviewer candidates
   * 1.9 Hard to realize a repository is unmaintained
   * 1.11 Hard to find related patches
   * 1.12 Lack of synchronization between teams