Topic on Talk:Reading/Web/Projects/Barack Obama in under 15 seconds on 2G/Quarter 3

Some observations from a controlled Chrome desktop UA

ABaso (WMF)

Using a Chrome desktop UA with fresh incognito instances, here were some observations.
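
For scripting similar runs, here is a minimal sketch of one way to emulate the same "Regular 2G" conditions (roughly 250 kbps) with Selenium and the Chrome DevTools Protocol and read Navigation Timing. The 300 ms latency and 50 kbps upload values are assumptions based on Chrome's preset, the byte count is only an approximation from Resource Timing, and this is not necessarily the exact tooling used for the numbers below.

<syntaxhighlight lang="python">
# Reproduction sketch (assumes Selenium 4 with chromedriver on PATH).
from selenium import webdriver

options = webdriver.ChromeOptions()
options.add_argument("--incognito")  # fresh session, no warm browser cache
driver = webdriver.Chrome(options=options)

# Throttle the connection to roughly 250 kbps via the DevTools Protocol.
driver.execute_cdp_cmd("Network.enable", {})
driver.execute_cdp_cmd("Network.emulateNetworkConditions", {
    "offline": False,
    "latency": 300,                         # ms round trip (assumed preset value)
    "downloadThroughput": 250 * 1024 // 8,  # bytes/second
    "uploadThroughput": 50 * 1024 // 8,     # assumed preset value
})

driver.get("https://en.m.wikipedia.org/wiki/Barack_Obama")

# Navigation Timing gives domcontentloaded/load; Resource Timing approximates
# the subresource bytes transferred.
results = driver.execute_script("""
    var t = performance.timing;
    var bytes = performance.getEntriesByType('resource')
        .reduce(function (sum, e) { return sum + (e.transferSize || 0); }, 0);
    return {
        domContentLoaded: (t.domContentLoadedEventEnd - t.navigationStart) / 1000,
        load: (t.loadEventEnd - t.navigationStart) / 1000,
        subresourceBytes: bytes
    };
""")
print(results)
driver.quit()
</syntaxhighlight>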

Production edge cached page

"Regular 2g" connection (250kbps) for a *cached* page

854KB transferred, domcontentloaded 26.73s, load 28.79s

https://en.m.wikipedia.org/wiki/Barack_Obama

Subsequent requests load similarly.


Production edge cachebusted page

"Regular 2g" connection (250kbps) for a cachebusted page

854KB transferred, domcontentloaded 27.62s, load 29.36s

https://en.m.wikipedia.org/wiki/Barack_Obama?cb=20160413f

Subsequent requests with unique URLs for cachebusting perform similarly.


Beta (uncached edge)

"Regular 2g" connection (250kbps) for a cachebusted page (because it's beta, plus URL was unique)

455KB transferred, domcontentloaded 14.04s, load 15.96s

https://en.m.wikipedia.org/wiki/Barack_Obama?mobileaction=beta&cb=20160413d

Subsequent requests with unique URLs for cachebusting perform similarly.

Note: beta includes a srcset lead image; without it, the beta page would load somewhat faster and use less data.

In other words, with the enhancements in place on beta, page load time is dramatically lower on a controlled network connection.

Is it possible to use a sample of articles to conduct similar analyses under controlled network conditions?
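
Something along these lines might work, as a rough sketch (the titles below are placeholders, and measure_throttled() stands in for the throttled-browser measurement sketched above): build unique, cachebusted production and beta URLs for each sampled title and record bytes transferred, domcontentloaded, and load for each.

<syntaxhighlight lang="python">
# Sketch only: sample_titles is a placeholder; a real run would draw a random
# sample of articles and measure each URL with the Selenium setup shown earlier.
import time
from urllib.parse import quote

sample_titles = ["Barack_Obama", "Germany", "Cat"]  # placeholder sample

def cachebusted_url(title, beta=False):
    """Mobile URL with a unique query string so the edge cache is bypassed."""
    beta_param = "mobileaction=beta&" if beta else ""
    return "https://en.m.wikipedia.org/wiki/%s?%scb=%d" % (
        quote(title), beta_param, int(time.time() * 1000))

for title in sample_titles:
    for beta in (False, True):
        url = cachebusted_url(title, beta=beta)
        print(url)
        # measure_throttled(url)  # record bytes, domcontentloaded, load as above
</syntaxhighlight>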

Regarding the data captured in the wild, it's conceivable that a reduction in bytes transferred (via srcset and the navbox payload reduction) means users on poorly served networks who previously failed to retrieve pages can now complete page loads, bringing them into the sample. This is in line with some of the hypotheses listed at https://www.mediawiki.org/wiki/Reading/Web/Projects/Performance/Removal_of_secondary_content_in_production#Analysis.

Reply to "Some observations from a controlled Chrome desktop UA"