Topic on Talk:ORES

Question from a student doing independent research

Summary by FeralOink

I am marking this question as fully resolved. To summarize, a computer science student at an American university asked Wikimedia about how it detects undisclosed paid edits. The student is using a particular framework for his own project of detecting paid Wikipedia edits, and wished to validate its accuracy against the findings of Wikimedia's own methods of undisclosed paid edit detection.

An employee of Wikimedia responded to the inquiry, providing links to recent public data releases, descriptions of ORES and the predictive model used for detection, ORES parameter settings, and information about API use. The student replied, acknowledging that the response was helpful and adequate, and gave permission for the conversation to be posted openly on mediawiki.org.

Halfak (WMF) (talkcontribs)

The following is an email conversation I had with Mark Wang about ORES. I'm posting it here so that it might have some long-term usefulness. As you can see at the end of the thread, Mark agreed to me posting this publicly.

On Sat, Nov 17, 2018 at 8:50 PM Wang, Mark <> wrote:

Hi Scoring Platform Team!

I'm Mark, a CS student at Brown University. I'm experimenting with applying Snorkel (a framework for leveraging unlabeled data with noisy label proposals) to detect paid Wikipedia edits. I've got a few selfish requests / questions for you guys.

Snorkel code: https://github.com/HazyResearch/snorkel
Snorkel paper: https://arxiv.org/pdf/1711.10160.pdf

Some selfish questions:

1) Is it possible for me to have access to the edits and page-stats data that you work with? I can scrape them myself (with a reasonable crawl rate), but of course, it's less convenient and I'll end up working with less data.

2) How do you represent revisions? I'm thinking about using character embeddings here. What are some methods that worked well for you guys? And what should I probably not try?

3) What features seem to be strongly informative in your models for detecting low-quality edits?

4) Any additional recommendations /advice?

Thank you in advance for your time,
Mark Wang


On Mon, Nov 19, 2018 at 4:49 PM Aaron Halfaker <ahalfaker@wikimedia.org> wrote:

Hi Mark!

Thanks for reaching out! Have you seen our recent data release of known paid editors? https://figshare.com/articles/Known_Undisclosed_Paid_Editors_English_Wikipedia_/6176927

1) I'm not sure what page stats you are looking for, but you can see the features we use in making predictions by adding a "?features" argument to an ORES query. For example, https://ores.wikimedia.org/v3/scores/enwiki/21312312/damaging?features shows the extracted features and an "is this edit damaging" prediction for https://en.wikipedia.org/wiki/Special:Diff/21312312
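For a concrete example, here's a minimal Python sketch of that query. The exact nesting of the JSON response (wiki -> "scores" -> revision ID -> model) is an assumption based on the v3 endpoint above, so check it against a live response:

import requests

def fetch_damaging_features(rev_id, wiki="enwiki"):
    # Hit the v3 endpoint with "?features=" appended, as in the URL above.
    url = "https://ores.wikimedia.org/v3/scores/{0}/{1}/damaging".format(wiki, rev_id)
    resp = requests.get(url, params={"features": ""})
    resp.raise_for_status()
    # Assumed response layout: {wiki: {"scores": {rev_id: {"damaging": {...}}}}}
    return resp.json()[wiki]["scores"][str(rev_id)]["damaging"]

result = fetch_damaging_features(21312312)
print(result["score"]["prediction"])  # the "is this edit damaging" call
for name, value in sorted(result["features"].items()):
    print(name, value)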

2) A revision is represented as a vector that we feed into the prediction model. We do a lot of manual feature engineering, but we use vector embeddings for topic modeling. We're actually looking into using our current word2vec strategies to implement better damage detection too. See https://phabricator.wikimedia.org/T197007
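As a rough illustration of the embedding idea (not our actual pipeline; the toy corpus and gensim settings below are made up), a revision's text can be reduced to the mean of its word vectors:

import numpy as np
from gensim.models import Word2Vec

# Toy tokenized revisions standing in for real revision text.
corpus = [["paid", "editing", "is", "undisclosed"],
          ["this", "edit", "adds", "references"]]

# vector_size is the gensim 4.x parameter name ("size" in older releases).
model = Word2Vec(corpus, vector_size=50, window=5, min_count=1, seed=0)

def revision_vector(tokens, model):
    # Average the vectors of in-vocabulary tokens into one fixed-size vector.
    vectors = [model.wv[t] for t in tokens if t in model.wv]
    return np.mean(vectors, axis=0) if vectors else np.zeros(model.vector_size)

print(revision_vector(["paid", "editing"], model).shape)  # (50,)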

3) Here's an output of our feature importance weights for the same model, as estimated by sklearn's GradientBoosting model. (A sketch of how such weights can be read from a fitted model follows the list.)

feature.log((temporal.revision.user.seconds_since_registration + 1)) 0.131
feature.revision.user.is_anon 0.036
feature.english.dictionary.revision.diff.dict_word_prop_delta_sum 0.033
feature.revision.parent.markups_per_token 0.029
feature.revision.parent.words_per_token 0.028
feature.revision.parent.chars_per_word 0.027
feature.log((wikitext.revision.parent.ref_tags + 1)) 0.026
feature.revision.diff.chars_change 0.026
feature.revision.user.is_patroller 0.026
feature.english.dictionary.revision.diff.dict_word_prop_delta_increase 0.025
feature.log((wikitext.revision.parent.chars + 1)) 0.023
feature.log((AggregatorsScalar(<datasource.tokenized(datasource.revision.parent.text)>) + 1)) 0.023
feature.log((AggregatorsScalar(<datasource.wikitext.revision.parent.words>) + 1)) 0.023
feature.revision.parent.uppercase_words_per_word 0.022
feature.log((wikitext.revision.parent.wikilinks + 1)) 0.021
feature.log((wikitext.revision.parent.external_links + 1)) 0.02
feature.log((wikitext.revision.parent.templates + 1)) 0.02
feature.wikitext.revision.diff.markup_prop_delta_sum 0.02
feature.english.dictionary.revision.diff.non_dict_word_prop_delta_sum 0.02
feature.log((AggregatorsScalar(<datasource.wikitext.revision.parent.uppercase_words>) + 1)) 0.018
feature.revision.diff.tokens_change 0.018
feature.log((wikitext.revision.parent.headings + 1)) 0.017
feature.wikitext.revision.diff.markup_delta_sum 0.015
feature.revision.diff.words_change 0.015
feature.english.dictionary.revision.diff.dict_word_delta_sum 0.015
feature.english.dictionary.revision.diff.dict_word_prop_delta_decrease 0.015
feature.english.dictionary.revision.diff.non_dict_word_prop_delta_increase 0.015
feature.revision.diff.markups_change 0.014
feature.english.dictionary.revision.diff.dict_word_delta_increase 0.014
feature.wikitext.revision.diff.markup_prop_delta_increase 0.013
feature.wikitext.revision.diff.markup_delta_increase 0.012
feature.wikitext.revision.diff.number_prop_delta_sum 0.011
feature.wikitext.revision.diff.number_prop_delta_increase 0.011
feature.english.dictionary.revision.diff.non_dict_word_delta_sum 0.011
feature.wikitext.revision.diff.number_delta_increase 0.01
feature.revision.diff.wikilinks_change 0.01
feature.revision.comment.has_link 0.01
feature.english.dictionary.revision.diff.dict_word_delta_decrease 0.01
feature.revision.page.is_mainspace 0.009
feature.wikitext.revision.diff.number_delta_sum 0.009
feature.wikitext.revision.diff.markup_prop_delta_decrease 0.008
feature.english.dictionary.revision.diff.non_dict_word_prop_delta_decrease 0.008
feature.revision.page.is_articleish 0.007
feature.revision.diff.external_links_change 0.007
feature.revision.diff.templates_change 0.007
feature.revision.diff.ref_tags_change 0.007
feature.english.informals.revision.diff.match_prop_delta_sum 0.007
feature.english.informals.revision.diff.match_prop_delta_increase 0.007
feature.wikitext.revision.diff.number_prop_delta_decrease 0.006
feature.revision.comment.suggests_section_edit 0.006
feature.english.dictionary.revision.diff.non_dict_word_delta_increase 0.006
feature.wikitext.revision.diff.markup_delta_decrease 0.005
feature.revision.user.is_bot 0.005
feature.revision.user.is_admin 0.005
feature.english.badwords.revision.diff.match_prop_delta_sum 0.005
feature.wikitext.revision.diff.number_delta_decrease 0.004
feature.wikitext.revision.diff.uppercase_word_prop_delta_sum 0.004
feature.revision.diff.headings_change 0.004
feature.revision.diff.longest_new_repeated_char 0.004
feature.english.badwords.revision.diff.match_prop_delta_increase 0.004
feature.english.informals.revision.diff.match_delta_increase 0.004
feature.english.dictionary.revision.diff.non_dict_word_delta_decrease 0.004
feature.wikitext.revision.diff.uppercase_word_delta_sum 0.003
feature.wikitext.revision.diff.uppercase_word_prop_delta_increase 0.003
feature.revision.diff.longest_new_token 0.003
feature.english.informals.revision.diff.match_delta_sum 0.003
feature.wikitext.revision.diff.uppercase_word_delta_increase 0.002
feature.wikitext.revision.diff.uppercase_word_prop_delta_decrease 0.002
feature.english.badwords.revision.diff.match_delta_sum 0.002
feature.english.badwords.revision.diff.match_delta_increase 0.002
feature.wikitext.revision.diff.uppercase_word_delta_decrease 0.001
feature.english.informals.revision.diff.match_prop_delta_decrease 0.001
feature.revision.page.is_draftspace 0.0
feature.revision.user.has_advanced_rights 0.0
feature.revision.user.is_trusted 0.0
feature.revision.user.is_curator 0.0
feature.english.badwords.revision.diff.match_delta_decrease 0.0
feature.english.badwords.revision.diff.match_prop_delta_decrease 0.0
feature.english.informals.revision.diff.match_delta_decrease 0.0
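
Weights like those above can be read out of any fitted scikit-learn gradient boosting model via its feature_importances_ attribute; the sketch below uses synthetic data and made-up feature names rather than the actual ORES model:

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic stand-in data; the real model is trained on edit features.
X, y = make_classification(n_samples=500, n_features=5, random_state=0)
feature_names = ["feature_{0}".format(i) for i in range(5)]  # hypothetical names

model = GradientBoostingClassifier(random_state=0).fit(X, y)

# feature_importances_ is the estimate behind a listing like the one above.
ranked = sorted(zip(feature_names, model.feature_importances_),
                key=lambda pair: pair[1], reverse=True)
for name, weight in ranked:
    print("{0}\t{1:.3f}".format(name, weight))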

4) You'll note that time since registration and is_anon are strongly predictive. They don't overwhelm the predictions -- we can still differentiate good from bad among newcomers and anonymous editors. But the model generally doesn't predict that an edit by a very experienced editor is bad, regardless of what's actually in the edit. The more we can move away from relying on is_anon and seconds_since_registration, the more we'll be targeting the things that people do -- rather than targeting them for their status. See section 7.4 of our systems paper for a more substantial discussion of this problem.

-Aaron


On Mon, Nov 19, 2018 at 6:47 PM Wang, Mark <> wrote:

Thanks a bunch for your help, Aaron! This is all very informative.

One more question from me: May I borrow your features? And if so, is the API the preferred way for an outsider to access them?

Thanks again, Mark


On Tue, Nov 20, 2018 at 11:07 AM Aaron Halfaker <ahalfaker@wikimedia.org> wrote:

Say, I'd like to save this conversation publicly so that others might benefit from it. Would you be OK with me posting our discussion publicly on a wiki?

On Tue, Nov 20, 2018 at 10:06 AM Aaron Halfaker <ahalfaker@wikimedia.org> wrote:

Yes. That is a good method for accessing the features. You'll notice that the features that the API reports are actually just the basic reagents for the features the model uses.

For example, we have features like this:

  • words added
  • words removed
  • words added / words removed
  • log(words added)
  • log(words removed)
  • etc.

In all of these features, the basic foundation is "words added" and "words removed" with some mathematical operators on top, so we only report those two via the API. To see the full set of features for our damage detection model, see https://github.com/wikimedia/editquality/blob/master/editquality/feature_lists/enwiki.py
See also a quick overview I put together on feature engineering: https://github.com/wikimedia/revscoring/blob/master/ipython/feature_engineering.ipynb
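To make the "reagents" idea concrete (the numbers below are invented for illustration), the model's features are just mathematical compositions of the base values the API reports:

import math

# Hypothetical base "reagent" values, as the API might report them.
words_added = 12
words_removed = 3

# Derived model features are compositions of those base values.
derived = {
    "words_added": words_added,
    "words_removed": words_removed,
    "added_per_removed": words_added / max(words_removed, 1),
    "log_words_added": math.log(words_added + 1),
    "log_words_removed": math.log(words_removed + 1),
}
for name, value in sorted(derived.items()):
    print("{0} = {1:.3f}".format(name, value))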

If I wanted to extract the raw feature values for the English Wikipedia "damaging" model, I'd install the "revscoring" library (pip install revscoring) and then run the following code from the base of the editquality repo:

$ python
Python 3.5.1+ (default, Mar 30 2016, 22:46:26) 
[GCC 5.3.1 20160330] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from editquality.feature_lists.enwiki import damaging
/home/halfak/venv/3.5/lib/python3.5/site-packages/sklearn/cross_validation.py:41: DeprecationWarning: This module was deprecated in version 0.18 in favor of the model_selection module into which all the refactored classes and functions are moved. Also note that the interface of the new CV iterators are different from that of this module. This module will be removed in 0.20.
  "This module will be removed in 0.20.", DeprecationWarning)
>>> from revscoring.extractors import api
>>> import mwapi
>>> extractor = api.Extractor(mwapi.Session("https://en.wikipedia.org"))
Sending requests with default User-Agent.  Set 'user_agent' on mwapi.Session to quiet this message.
>>> list(extractor.extract(123456789, damaging))
[True, True, False, 10.06581896445358, 9.010913347279288, 8.079927770758275, 3.4965075614664802, 2.772588722239781, 5.402677381872279, 2.70805020110221, 1.791759469228055, 2.1972245773362196, 7.287484510532837, 0.3940910755707484, 0.009913258983890954, 0.06543767549749725, 0.0, 2.0, -2.0, 0.04273504273504275, 0.15384615384615385, -0.1111111111111111, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1, 1, False, False, False, False, True, False, False, 11.305126087390619, False, False, 0, 0, 0, 0.0, 0.0, 0.0, 0, 0, 0, 0.0, 0.0, 0.0, 0, 0, 0, 0.0, 0.0, 0.0, 0, 0, 0, 0.0, 0.0, 0.0]

This extracts the features for this edit: https://en.wikipedia.org/w/index.php?diff=123456789

-Aaron


Hi Aaron:

Thank you so much! This is all so helpful. And of course, feel free to publicize any of our conversations.


Mark