Scrum of scrums/2016-07-20

= 2016-07-20 =

Reading Web

 * No update, working on language switcher on mobile web

iOS native app

 * 5.0.5 is heading to regression today; expected release to the App Store later this week or early next week
 * Development of 5.1 is in progress
 * Planning of 5.2 is in progress

Android native app

 * Feed is released in beta! (Follow-up bugfix beta release is cooking as we speak.)
 * We are starting work on the navigation overhaul.
 * Heads-up to RelEng: we are going to talk this week about whether we have bandwidth this quarter to transition to Differential code reviews.

Mobile Content Service

 * First public feed endpoints are deployed: aggregated + smart random

Community Tech

 * Patch for numeric sorting is ready for review (https://gerrit.wikimedia.org/r/#/c/299108/)
 * Will roll out on a test wiki first. We need another test wiki before English Wikipedia (preferably one already using UCA collation).
 * Fixed security bug in Pageviews Analysis
 * Architecture Committee RFC meeting about Cross-wiki watchlist back-end today 2pm
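As background on why the collation patch matters, here is a minimal Python sketch (purely illustrative, not the patch itself, which is PHP/MediaWiki) of numeric-aware sorting versus plain lexicographic sorting:

```python
import re

def natural_key(s):
    # Split into digit and non-digit runs so numbers compare by value,
    # e.g. "Page 2" sorts before "Page 10" instead of after it.
    return [int(tok) if tok.isdigit() else tok.lower()
            for tok in re.split(r"(\d+)", s)]

titles = ["Page 10", "Page 2", "Page 1"]
print(sorted(titles))                   # lexicographic: ['Page 1', 'Page 10', 'Page 2']
print(sorted(titles, key=natural_key))  # numeric-aware: ['Page 1', 'Page 2', 'Page 10']
```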

Collaboration

 * Blocked - None
 * Blocking - Krinkle would like us to stop using buildCssLinks to pave the way for a refactoring. Otherwise, no change.
 * Updates
 * Turned off Echo transition flags, now that the maintenance scripts are done. This should improve performance and avoid unexpected side effects.
 * Worked on Echo features (such as the animation when notifications move in the list) and bug fixes.
 * Flow security fixes merged to master; they were already on the cluster.

Parsing

 * In collaboration with Services (Marko) & Ops (Giuseppe), we transitioned Parsoid to be based on service-runner. Parsoid deploys will resume tomorrow / Monday.
 * Tim is working on the HHVM segfault in the preprocessor that Giuseppe reported in a security bug.
 * Scott & Tim are working on a PHP-only Tidy replacement, which is close to done.
 * OCG (Offline Content Generator) outage this week due to unrelated proxy misconfiguration: T140789

VisualEditor

 * Blocked: None.
 * Blocking: None known.
 * Update: Quiet week, mostly working on bugs and the new wikitext editor. The CustomData extension dependency has been removed in master from all three remaining Wikivoyage extensions that used it; we will be able to un-deploy CustomData from production in the next few weeks once the dust settles.

Discovery

 * Blocking: none
 * Blocked: none
 * logstash.wikimedia.org upgraded to latest Kibana version
 * TextCat A/B test results are in: https://commons.wikimedia.org/wiki/File:Report_on_Cirrus_Search_TextCat_AB_Test_-_Language_Detection_on_English,_French,_Spanish,_Italian,_and_German_Wikipedias.pdf
 * TLDR: Success
 * TextCat demo has new design: https://tools.wmflabs.org/textcatdemo/
 * GeoSearch launched: https://www.mediawiki.org/wiki/Help:CirrusSearch#Geo_Search
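For the curious, a hedged sketch of what a geo search request could look like through the standard MediaWiki action API, using the `nearcoord:` keyword described on the linked help page (the coordinates and radius below are made-up example values):

```python
from urllib.parse import urlencode

# Assumed: list=search routes the srsearch string through CirrusSearch,
# and "nearcoord:<radius>,<lat>,<lon>" restricts results geographically
# per Help:CirrusSearch#Geo_Search. Values here are illustrative only.
params = {
    "action": "query",
    "list": "search",
    "srsearch": "nearcoord:10km,52.5200,13.4050 museum",
    "format": "json",
}
url = "https://en.wikipedia.org/w/api.php?" + urlencode(params)
print(url)
```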

Interactive

 * Launched maps on meta, cawiki, hewiki, mkwiki
 * This Friday (July 22) in Seattle - data visualization hackathon https://www.mediawiki.org/wiki/DataViz_Seattle_hackathon
 * This Weekend - the whole team is in Seattle for State of the Map US conference

Analytics

 * Issues with the EventBus deployment and new schemas: the service was rejecting events and had to be restarted.
 * Reconstructing edit history from the MediaWiki database: we're pretty sure it is possible, but will know better after this week.
 * Scaling the pageview API: our new cluster has issues loading data and compacting at the same time (we needed to change the compaction strategy from the old scheme).

Research

 * Memory issues on scb1001/1002 related to ORES have been partially addressed.
 * Lowered the number of uwsgi processes
 * Periodic restart of celery workers to address a memory leak https://phabricator.wikimedia.org/T140020
 * Exploring changing the model from Random Forest to Gradient Boosting https://phabricator.wikimedia.org/T139963
 * We'll be seeking dedicated hardware. (It would be great if anyone in Ops could reach out to help with that process.)
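The periodic-restart workaround can also be expressed declaratively: celery can recycle each worker process after a fixed number of tasks, bounding how much leaked memory accumulates. A hypothetical config fragment (setting names from celery 3.x; the values are assumed, not what ORES actually uses):

```python
# Hypothetical celery 3.x config fragment illustrating the workaround:
# recycling worker processes reclaims memory leaked by long-lived workers.
CELERYD_MAX_TASKS_PER_CHILD = 100  # respawn each worker after 100 tasks (assumed value)
CELERYD_CONCURRENCY = 4            # keep the process count low, mirroring the uwsgi reduction
```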

Services

 * Feed endpoints deployed, but we need to revisit using `/feed/featured`, as it takes a long time for MCS to compute
 * https://en.wikipedia.org/api/rest_v1/?doc#!/Feed/get_feed_featured_yyyy_mm_dd
 * Parsoid move to service-runner and service::node completed
 * service-template-node v0.4.0 is out - please update soon.
 * security issue addressed
 * new feature - automatic metrics collection
 * Marko out next week
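For consumers of the new endpoint, the date-based path shape (`/feed/featured/yyyy/mm/dd`) comes straight from the REST API docs linked above; a small sketch that builds the URL for a given day (the helper name is ours, not part of any client library):

```python
from datetime import date

def featured_feed_url(d, host="https://en.wikipedia.org"):
    # Path shape taken from the REST API docs: /feed/featured/yyyy/mm/dd,
    # with zero-padded month and day.
    return "{}/api/rest_v1/feed/featured/{:04d}/{:02d}/{:02d}".format(
        host, d.year, d.month, d.day)

print(featured_feed_url(date(2016, 7, 20)))
# → https://en.wikipedia.org/api/rest_v1/feed/featured/2016/07/20
```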

Security

 * Verifying T140366
 * Reviewing 296699
 * Drafting/editing security team job descriptions
 * Request security reviews: https://www.mediawiki.org/wiki/Wikimedia_Security_Team/Security_reviews
 * MediaWiki 1.27.1 security release planned for early August

RelEng

 * Blocking
 * Android transition to Differential code review
 * Blocked
 * None
 * Updates
 * Zuul upgraded this week, should address a bunch of issues
 * New SWAT deploy process is going OK; reminder to install the X-Wikimedia-Debug browser extension (https://wikitech.wikimedia.org/wiki/X-Wikimedia-Debug) if you're putting things up for SWAT
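The same header the extension sets can be attached to a scripted request; a hedged sketch that only builds the request without sending it (the header value below is a placeholder, not a real backend name, so consult the wikitech page for valid values):

```python
import urllib.request

# Illustrative only: attach an X-Wikimedia-Debug header to a request,
# as the browser extension does. "backend=EXAMPLE" is a placeholder value.
req = urllib.request.Request(
    "https://en.wikipedia.org/wiki/Special:BlankPage",
    headers={"X-Wikimedia-Debug": "backend=EXAMPLE"},
)
# urllib normalizes header names to capitalized form internally.
print(req.get_header("X-wikimedia-debug"))
```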

Fundraising Tech

 * Civi post-upgrade bugfixes
 * more work on batch contact de-duplication
 * CentralNotice deployed (last week), watching closely for glitches
 * No longer serving CN modules on special pages and action=edit (https://phabricator.wikimedia.org/T139439)
 * Upgraded payments to MW 1.27 (LTS!)
 * Still hoping to get closer to master, but this buys us a lot of time
 * Killed ancient homegrown form template engine (-9,000 LOC!)
 * Experimenting with scrutinizer-ci
 * Pivoting to ActiveMQ replacement work
 * Building out new servers
 * No blockers

TechOps

 * Blocking: None
 * Blocked:
 * https://phabricator.wikimedia.org/T135483 - HHVM crashes - raised to UBN! after issue recurrence. Currently no one owns the ticket.
 * looks like there's already a patch at https://gerrit.wikimedia.org/r/299710 -- [cscott, for parsing (and Tim Starling, who wrote the patch)]
 * Updates:
 * Insecure (non-HTTPS) POST traffic blocked completely as of yesterday, may see reports of broken bots/tools - https://phabricator.wikimedia.org/T105794
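For bot/tool maintainers hit by the new block, the fix is usually just upgrading the URL scheme before sending; a hypothetical helper (the function name is ours) that does exactly that:

```python
from urllib.parse import urlsplit, urlunsplit

def force_https(url):
    # Plain-HTTP POSTs to Wikimedia sites are now rejected, so rewrite
    # the scheme to https and leave everything else untouched.
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

print(force_https("http://en.wikipedia.org/w/api.php"))
# → https://en.wikipedia.org/w/api.php
```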

Wikidata

 * No blockers.
 * Back into the regular two-week Scrum sprint. Tying up loose ends to get stuff done.
 * Reworking jQuery-based UI code (minimizing the code base).
 * Still working on structured data for Commons.