Talk:Contributors/Projects/Editing performance


About this board

2.207.24.243 (talkcontribs)

@ESanders (WMF): I think you flipped two things in the legend of the fourth graph.

ESanders (WMF) (talkcontribs)

I think it's correct; the y-axis has just been inverted.

Reply to "Legend flipped?"

Serious data error, the report needs to be fixed or withdrawn

Alsee (talkcontribs)

@Deskana (WMF), @Matma Rex, there is a severe data error in the report. Specifically, the graph Time_to_interactive does not accurately reflect time-to-interactive for the 2017Editor.

Using the 2017Editor in Chrome on Win7, I captured the beacon reports (action.init.timing, action.ready.timing, action.loaded.timing) while testing the page en:List_of_members_of_the_Lok_Sabha_(1952–present). Reported init.timing was about half a second, reported ready.timing was under 4 seconds, reported loaded.timing was over 100 seconds, and the actual user-experienced time to an interactive state was over 100 seconds. For the 2017Editor, the ready.timing value does not remotely reflect Time_to_interactive. The 2017Editor's actual time-to-interactive appears to equal loaded.timing, or maybe slightly longer.

This may help explain the disconnect between the WMF's perception of 2017Editor performance and user-reported performance. Also, do not overlook the #Caveats sample bias: anyone who edits larger pages, or has otherwise experienced poor performance, has probably shut off the Beta option. (Performance appears to be significantly worse in Firefox.) So the values collected for the 2017Editor are going to skew low, independent of the error in measuring time_to_interactive.


Which start/end timestamps are being used?

Alsee (talkcontribs)

I've been checking the new instrumentation. I assume the values being used come from the t= value in backend-timing: "D=###### t=################".

At the moment I'm assuming that the start time is being taken from the first request: https://en.wikipedia.org/wiki/PAGENAME?action=edit. The part I'm particularly unsure about is which request is being used for the final timestamp. The most obvious candidate is the https://en.wikipedia.org/wiki/PAGENAME?action=edit#null request, which is usually the final request. However, I find that I can start working in the wikitext editor a lot faster than that, so it seems very possible you're taking the final timestamp from a different request. If so, which one should I be using?
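For reference, the backend-timing header quoted above can be picked apart with a short sketch. This assumes the Apache mod_headers convention (D = server processing time in microseconds, t = request timestamp in microseconds since the Unix epoch); the helper name and the example digits are invented:

```python
import re

def parse_backend_timing(header):
    """Parse a Backend-Timing header such as 'D=123456 t=1520000000000000'.

    Assumes the Apache mod_headers convention: D is the time the server
    spent producing the response (microseconds) and t is a timestamp in
    microseconds since the Unix epoch. Returns (duration_s, epoch_s).
    """
    m = re.match(r"D=(\d+) t=(\d+)", header)
    if m is None:
        raise ValueError("unrecognized Backend-Timing header: %r" % header)
    return int(m.group(1)) / 1e6, int(m.group(2)) / 1e6

# Example with made-up digits (the real header's digits are masked above):
duration_s, epoch_s = parse_backend_timing("D=123456 t=1520000000000000")
```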

Whatamidoing (WMF) (talkcontribs)

backend-timing is an unrelated measurement.  It shows how long it took the servers to produce the page.

The measurement used for this report is browser time, not server time.  

You can see a similar, but not identical, measurement automatically reported at the dashboard for VisualEditor total load time (VisualEditor only).  Over the course of thousands of edits, it looks like the median is running around two and a half seconds.

Alsee (talkcontribs)

Unless the Grafana dashboard you linked has been changed to a completely different system, the data there is garbage. About a year ago I did extensive testing, both locally and with dedicated online website-testing services. Actual VE load times were almost double what the software was reporting.

If backend-timing is unrelated, could you please identify the name or location of the value that is being used? I should be able to find it in the browser network logs, given a reasonable pointer to where or what to look for.

Whatamidoing (WMF) (talkcontribs)

As I said, the data for this report:

  1. is not what is in that Grafana dashboard, and
  2. is what the user's browser self-reports.

I believe that it uses "performance.now", but I don't know any of the details.
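As a rough sketch of what such browser self-reported timing looks like (a Python analogue using a monotonic clock; the real instrumentation is browser-side JavaScript, and the variable names here are invented):

```python
import time

# Rough analogue of the browser-side measurement: read a monotonic
# clock at "init", and again once the editor is usable.
start = time.perf_counter()        # ~ performance.now() when loading starts
# ... editor loading and initialization would happen here ...
ready_ms = (time.perf_counter() - start) * 1000   # ~ "action.ready.timing"
```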

Matma Rex (talkcontribs)

I'm afraid you won't find these values in the network logs. They are not directly related to when we load things.

If you really want to watch network logs, look for URLs starting with "https://en.wikipedia.org/beacon/event" (adjust the domain if you're testing on a different wiki), as that is how this data is logged. The data is in the query parameters, as JSON; https://meta.wikimedia.org/wiki/Schema:Edit has a brief description of the format. The actions "init", "ready" and "loaded" correspond to start time, time to interactive and time to fully loaded in this document.

Note that these are only recorded for 1 out of every 16 edit sessions (on average, randomly selected).

Alsee (talkcontribs)

Thanks @Matma Rex. Unfortunately I can't seem to find anything going to that URL, or anything similar. I tried many dozens of editor loads, so even if it's only sent for 1 in 16 sessions, I should have caught it with statistical certainty. I also installed an extra extension to make sure I was capturing and searching all browser requests. Can you give me any additional help in capturing these values?

I'm experiencing a 70+ second load time before the 2017Editor becomes responsive, for the article en:List_of_members_of_the_Lok_Sabha_(1952–present). It's not a network-speed issue; it's CPU-bound. I would like to verify that the timing is actually reporting 70+ seconds. The timing would be grossly wrong if it's taken before the overloaded CPU frees up.

Matma Rex (talkcontribs)

It works for me. The thing about randomness is that it may never happen even if you try it infinitely many times… Or maybe you have them blocked with AdBlock or something? I recommend trying with a more convenient, shorter page, until you know for sure how to capture them.

I was just able to see the following requests being made after refreshing the page https://en.wikipedia.org/wiki/Nico_Verlaan?veaction=editsource a few times:

https://en.wikipedia.org/beacon/event?%7B%22event%22%3A%7B%22version%22%3A1%2C%22action%22%3A%22init%22%2C%22editor%22%3A%22wikitext-2017%22%2C%22platform%22%3A%22other%22%2C%22integration%22%3A%22page%22%2C%22page.id%22%3A23690117%2C%22page.title%22%3A%22Nico_Verlaan%22%2C%22page.ns%22%3A0%2C%22page.revid%22%3A716846292%2C%22editingSessionId%22%3A%220f853d16be975303%22%2C%22user.id%22%3A6423215%2C%22user.editCount%22%3A1269%2C%22mediawiki.version%22%3A%221.31.0-wmf.24%22%2C%22action.init.type%22%3A%22page%22%2C%22action.init.mechanism%22%3A%22url%22%2C%22action.init.timing%22%3A422%7D%2C%22revision%22%3A17541122%2C%22schema%22%3A%22Edit%22%2C%22webHost%22%3A%22en.wikipedia.org%22%2C%22wiki%22%3A%22enwiki%22%7D;
https://en.wikipedia.org/beacon/event?%7B%22event%22%3A%7B%22version%22%3A1%2C%22action%22%3A%22ready%22%2C%22editor%22%3A%22wikitext-2017%22%2C%22platform%22%3A%22other%22%2C%22integration%22%3A%22page%22%2C%22page.id%22%3A23690117%2C%22page.title%22%3A%22Nico_Verlaan%22%2C%22page.ns%22%3A0%2C%22page.revid%22%3A716846292%2C%22editingSessionId%22%3A%220f853d16be975303%22%2C%22user.id%22%3A6423215%2C%22user.editCount%22%3A1269%2C%22mediawiki.version%22%3A%221.31.0-wmf.24%22%2C%22action.ready.timing%22%3A460%7D%2C%22revision%22%3A17541122%2C%22schema%22%3A%22Edit%22%2C%22webHost%22%3A%22en.wikipedia.org%22%2C%22wiki%22%3A%22enwiki%22%7D;
https://en.wikipedia.org/beacon/event?%7B%22event%22%3A%7B%22version%22%3A1%2C%22action%22%3A%22loaded%22%2C%22editor%22%3A%22wikitext-2017%22%2C%22platform%22%3A%22desktop%22%2C%22integration%22%3A%22page%22%2C%22page.id%22%3A23690117%2C%22page.title%22%3A%22Nico_Verlaan%22%2C%22page.ns%22%3A0%2C%22page.revid%22%3A716846292%2C%22editingSessionId%22%3A%220f853d16be975303%22%2C%22user.id%22%3A6423215%2C%22user.editCount%22%3A1269%2C%22mediawiki.version%22%3A%221.31.0-wmf.24%22%2C%22action.loaded.timing%22%3A943%7D%2C%22revision%22%3A17541122%2C%22schema%22%3A%22Edit%22%2C%22webHost%22%3A%22en.wikipedia.org%22%2C%22wiki%22%3A%22enwiki%22%7D;

After URI-decoding you can find the following information in JSON data:

  • "action.init.timing": 422
  • "action.ready.timing": 460
  • "action.loaded.timing": 943
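The decoding above can be sketched in Python. The helper name and the cut-down payload are invented for illustration; only the URL shape and the field names ("action", "action.ready.timing") follow the requests quoted above:

```python
import json
from urllib.parse import quote, unquote

def beacon_timings(url):
    """Extract the action and its timing from a /beacon/event URL.

    The whole Schema:Edit JSON payload is percent-encoded as the query
    string (with a trailing ';' in the requests quoted above).
    """
    payload = json.loads(unquote(url.split("?", 1)[1].rstrip(";")))
    event = payload["event"]
    action = event["action"]  # "init", "ready" or "loaded"
    return action, event["action.%s.timing" % action]

# A cut-down payload with the same shape as the "ready" beacon above:
event = {"event": {"action": "ready", "action.ready.timing": 460},
         "schema": "Edit", "wiki": "enwiki"}
url = ("https://en.wikipedia.org/beacon/event?"
       + quote(json.dumps(event, separators=(",", ":"))) + ";")

print(beacon_timings(url))  # ('ready', 460)
```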

I will not be spending my time finding out exactly how slow "List of members of the Lok Sabha (1952–present)" is; I can see it's definitely slow.

Matma Rex (talkcontribs)

I've been working on something entirely unrelated today and discovered that these requests are not sent if you have Do Not Track turned on in your browser – that might be the reason why you're not seeing them.

Alsee (talkcontribs)

Ouch. After your previous comment I opened over four hundred article-edit tabs trying to get it to show up, bringing the odds of statistical failure to somewhere around one in a trillion. That's not counting my earlier efforts.

When I turned off DoNotTrack the values appeared almost immediately. Thanks.
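For what it's worth, the odds estimate above can be checked in one line (assuming each load is an independent 1-in-16 draw):

```python
# With 1-in-16 sampling, the chance that none of 400 independent
# editor loads gets selected for a beacon:
p_miss_all = (15 / 16) ** 400
# On the order of 1e-11: vanishingly small, so the beacons were
# clearly being suppressed rather than unluckily unsampled.
```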

Alsee (talkcontribs)

There is an extremely relevant example of performance measurement from a YouTube engineer that illustrates how the statistics being collected are garbage.

It's worth reading the link, but in short: YouTube pages were bloated and inefficient. The base page was 1.2 MB and many dozens of requests before you could even begin viewing a video. The engineer optimized the base page to under 0.1 MB and just 14 requests. When he tested it, it was much, much faster, so he sent it live. A week later he checked the performance stats and was shocked to discover that average load times had gotten WORSE. Nothing made sense, and he was going crazy trying to figure out why. Then a colleague came up with the answer:

For much of the world, the original (slow) YouTube page was absolutely unusable. The original metrics included only people who were already getting acceptable performance. When the lightweight version was released, news spread like wildfire across Southeast Asia, South America, Africa, Siberia, and elsewhere, that YouTube WORKED now! The metrics got worse because they now included vast numbers of people who couldn't use the slow version at all.

In this case the 2017Editor is the slow version, and the normal wikitext editor is the fast version. Collecting this kind of bulk data and comparing the results is based on an assumption that the sample populations are actually comparable. However, as illustrated above, that assumption isn't true. Anyone who found the 2017Editor unusable has stopped using it. The data for the 2017Editor is going to be skewed towards people who get better-than-average performance, and grossly skewed towards people who only edit small pages. Anyone who tries to edit a large page with the 2017Editor is going to turn the damn thing off.

Whatamidoing (WMF) (talkcontribs)
Alsee (talkcontribs)

I see that performance improvements have been a major initiative this quarter.

What efforts, if any, have been devoted to improving loading and other performance of our current primary editor? Or has the WMF neglected our current primary tools, and been comparing an optimized 2017Editor against an unoptimized current wikitext editor?

Matma Rex (talkcontribs)

We made some improvements to the loading of WikiEditor, mostly removing old unused code – workarounds for browsers we no longer support, and remains of long-disabled editor experiments from 2010.

When it comes to loading performance, the slowest thing about the "old" wikitext editor is that you have to reload the whole page to open it. (This applies both to WikiEditor, and the super-old "core" editing toolbar, see Editor for reference.) Unfortunately, this is not really something we can change, as it would break countless gadgets and user scripts that run only once after the page loads and see if the editor is open, and make a lot of people very angry at us.

I don't think there's much we can do about the loading speed of WikiEditor. Apart from this big problem, it is already fast.

Post-load performance: Unusable.

Alsee (talkcontribs)

When I try to load EN:Intercollegiate_sports_team_champions in the 2017Editor, the timing stats between the first and last requests have been {30, 33, 33, 35, 40} seconds. The editor is theoretically accepting input, however the actual elapsed time is OVER ONE HUNDRED SECONDS before I can actually get one new character to show up. And when I do type one character, the CPU thread chokes at maximum for over a minute. Then I can type one more character, and the browser goes unresponsive for another minute. Every character locks up the browser for over a minute.

On a smaller article such as EN:New_York_(state), the editor theoretically works, however it's still incapable of real-time response. It can't keep up with even moderate typing speed. Nothing shows up as I type until the screen updates a half-dozen keypresses at a time. That is not usable.

Matma Rex (talkcontribs)

I filed a bug about this when I saw your comment (T188965) and investigated quickly. The problem had actually been noticed by another engineer (Ed) a few days before and was already fixed.

Some code verifying the correctness of operations on the document, which was supposed to run only inside the test suite, was running during normal editing; that was the reason for the insane slowness.

The fix is live on the Wikimedia wikis already, so you can test it yourself. The typing performance is still poor for long articles, but no longer completely unusable.

Alsee (talkcontribs)

@Matma Rex sorta-thanks. The slow typing is fixed.

However, testing EN:Intercollegiate_sports_team_champions in two different browsers (Chrome and a Firefox variant):

  • Time to load to the first editable state ranges from poor compared to the normal wikitext editor (14–20 seconds) to awful (over 40 seconds).
  • The sequence CTRL-A CTRL-X ranges from unusable (a time-out error and over 40 seconds) to stupid-unusable (multiple time-out errors and over 2 minutes).
  • CTRL-V (pasting the original content back into the empty editor) is completely broken. In both browsers I waited over 5 minutes through umpteen time-out errors, then had to kill the tabs.

The real bug is whoever forced the 2017Editor to be a mode inside of VisualEditor.

Alsee (talkcontribs)
Reply to "Preview performance"