Talk:Moderator Tools/Automoderator/Testing

What is your overall perception of Automoderator's accuracy?

 * I checked 5 batches (150 edits) for hrwiki. Can't complain about accuracy, but sensitivity is rather low: very (very!) few edits will ever be caught. Seems like a lot of unnecessary computation for little gain (in terms of patroller time). Note that hrwiki makes heavy use of AbuseFilter nowadays, so many bad edits never reach the main namespace. Ponor (talk) 13:20, 15 November 2023 (UTC)

Among the different caution levels, which did you feel was the best?
This will be a compromise between the number of edits Automoderator catches, and its accuracy rate.
 * Medium level (0.98) seemed safe, but to get any useful action I'd go with 0.975. And I still think the most meaningful action would be to slow down a vandal (captcha, obligatory edit summary, throttle), rather than reverting one or two edits a day. Ponor (talk) 13:26, 15 November 2023 (UTC)
 * I have evaluated 7 sheets (210 edits). 0.975 seemed to be a good compromise, with ~96% accuracy and recall around 50%. (Having 0.99 at 0.5 would be really awesome.) --Matěj Suchánek (talk) 13:29, 16 November 2023 (UTC)
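The threshold trade-off described above (higher caution level means higher precision but lower recall) can be sketched with a small, illustrative calculation. This is not Automoderator's actual code; the scores and vandalism labels below are invented for the example:

```python
# Hypothetical per-edit revert scores paired with human judgements
# (True = the edit really was vandalism). Purely illustrative data.
scored_edits = [
    (0.99, True), (0.985, True), (0.978, True), (0.976, False),
    (0.97, True), (0.95, False), (0.6, True), (0.3, False),
]

def precision_recall(edits, threshold):
    """Treat every edit scoring >= threshold as a revert candidate."""
    flagged = [label for score, label in edits if score >= threshold]
    vandal_total = sum(1 for _, label in edits if label)
    if not flagged:
        return 1.0, 0.0
    true_positives = sum(flagged)  # count of correctly flagged edits
    return true_positives / len(flagged), true_positives / vandal_total

for t in (0.99, 0.98, 0.975):
    p, r = precision_recall(scored_edits, t)
    print(f"threshold {t}: precision {p:.2f}, recall {r:.2f}")
```

Lowering the threshold from 0.99 to 0.975 catches more vandalism (recall rises) at the cost of some false positives (precision drops), which is exactly the compromise the caution levels ask testers to weigh.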

Did you notice any patterns in the edits Automoderator decided to revert?
We're particularly interested in false positives, so that we can improve Automoderator's accuracy.
 * Not really, because there were so few. I would like to see it catch more typical (anonymous) editor vandalism: infobox removals, removal or alteration of the bolded intro words, random number changes, etc. Ponor (talk) 13:29, 15 November 2023 (UTC)
 * I haven't really noticed any patterns, just a handful of false positives, where even an edit reverting vandalism received a 0.98. --Vít Karásek (talk) 22:07, 15 November 2023 (UTC)
 * Most decisions to revert (at 0.975) were instances of really obvious vandalism. --Matěj Suchánek (talk) 13:29, 16 November 2023 (UTC)

Requests for data for other wikis

 * Because I'm most familiar with hrwiki recent changes and vandalism patterns and am willing to do some tests, could you please generate data for my wiki? Thanks, Ponor (talk) 11:32, 24 October 2023 (UTC)
 * @Ponor Absolutely - I've filed T349606. Samwalton9 (WMF) (talk) 11:58, 24 October 2023 (UTC)
 * Thanks, Samwalton9. Given that the helpfulness of Automoderator (# Automoderator reverts / # daily reverts), if I'm reading it correctly, is below 1% for the four listed wikis, I'm thinking maybe you should focus more on wikis that do not have their edit stream already heavily filtered by AbuseFilter. I'd ask (at least) administrators on those wikis to help you with this. Reach out to shwiki, for example; I've heard they had a recent increase in vandalism, and they do not have many filters in use.
 * Also, I'd still like Automoderator to be able to slow down any suspect vandals (make them wait, make them fill in an edit summary), rather than revert their edits. Immediate reverts usually mean rage and more vandalism. Ponor (talk) 12:10, 24 October 2023 (UTC)
 * @Ponor Thanks for sharing your thoughts. One thing to note with this data is that 'reverts' captures many different kinds of edits which take a page back to an earlier state, so this number isn't exclusively anti-vandalism reverts. We did some analysis of existing anti-vandalism bots where we constrained the comparison to reverts which happen within 24 hours, as an estimate which might be closer to only being anti-vandalism reverts. Comparing my data sources, it looks like 'fast' reverts are approximately 50% of daily reverts. That increases the % a little, but I think you're right that tools like AbuseFilter will be impacting this - thanks for the suggestion to reach out to wikis like shwiki, I think that's a great idea. Samwalton9 (WMF) (talk) 12:51, 24 October 2023 (UTC)
 * I'm interested in data for cswiki (500,000+ articles, 150–250 vandal edits a day). And I'm happy to bring more people familiar with patrolling. --Matěj Suchánek (talk) 16:38, 24 October 2023 (UTC)
 * I've filed a Phabricator task for this at T349832. CLo (WMF) (talk) 15:33, 26 October 2023 (UTC)
 * I have generated datasets for hrwiki and cswiki. Thank you for participating in the testing. --KCVelaga (WMF) (talk) 06:49, 14 November 2023 (UTC)
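The "fast revert" estimate Samwalton9 describes above (counting only reverts that happen within 24 hours, as a proxy for anti-vandalism reverts) can be sketched as follows. This is an illustrative sketch, not the actual analysis code, and the timestamps are hypothetical examples:

```python
from datetime import datetime, timedelta

# Hypothetical (original edit time, revert time) pairs for one day's reverts.
reverts = [
    (datetime(2023, 10, 1, 10, 0), datetime(2023, 10, 1, 10, 5)),   # fast
    (datetime(2023, 10, 1, 12, 0), datetime(2023, 10, 3, 9, 0)),    # slow
    (datetime(2023, 10, 2, 8, 0), datetime(2023, 10, 2, 20, 0)),    # fast
]

def fast_revert_share(pairs, window=timedelta(hours=24)):
    """Fraction of reverts that landed within `window` of the reverted edit."""
    fast = sum(1 for edited, reverted in pairs if reverted - edited <= window)
    return fast / len(pairs)

print(f"fast reverts: {fast_revert_share(reverts):.0%} of all reverts")
```

Under this proxy, roughly half of daily reverts counting as "fast" would double the estimated share of anti-vandalism work, which is the adjustment mentioned in the comment above.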

Batch testing impressions

 * I've noticed quite a number of bot edits in the data for hrwiki. Because there are so many of them in comparison with user (esp. anonymous user) edits, is it possible that the training set is somewhat skewed? A bot fixing a word/pattern in thousands of articles is not an uncommon situation; I'd exclude bot edits from any consideration. Ponor (talk) 13:40, 15 November 2023 (UTC)
 * Just like a bot can fix a word/pattern, an anonymous user can too (though only in a few articles). Sure, Automod should ignore bots, but it's reasonable to include some bot edits in the training data as negative samples. --Matěj Suchánek (talk) 13:29, 16 November 2023 (UTC)