API talk:Usercontribs

Just new pages
How can I create a list of pages that were created by a user (i.e., just contributions with the flag new="")? --BaseBot (talk) 20:46, 1 June 2012 (UTC)
 * I don't think you can. You can get the flags by using "ucprop=flags", but then you'll have to filter your result to select just those with the flag "new". --R'n'B (talk) 16:51, 2 June 2012 (UTC)
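To illustrate the client-side filtering suggested above, here is a minimal sketch. It assumes the JSON output format, where (as with the XML flags) each page-creating contribution carries a "new" key on the item; the endpoint URL and the helper names (`fetch_contribs`, `created_pages`) are just illustrative, not part of the API.

```python
# Sketch: list pages a user created by filtering contributions
# client-side, since the API has no "new pages only" parameter here.
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"  # example endpoint

def fetch_contribs(user, limit=500):
    """Fetch one batch of contributions, with flags included."""
    params = urllib.parse.urlencode({
        "action": "query", "format": "json", "list": "usercontribs",
        "ucuser": user, "uclimit": limit, "ucprop": "title|flags",
    })
    with urllib.request.urlopen(f"{API}?{params}") as resp:
        return json.load(resp)["query"]["usercontribs"]

def created_pages(contribs):
    """Keep only the titles of contributions that carry the 'new' flag."""
    return [c["title"] for c in contribs if "new" in c]
```

With continuation (see the next section) this would need to be repeated until the full history has been scanned.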

uccontinue?
I'm a little baffled about the usage of uccontinue. In what way does this allow me to continue to grab more data when it's available? For example, I was trying to retrieve all of a user's past revision ids (more than 500). Other than copying and pasting the ucstart given at the bottom of the result into the url and going to that page as well, I can't figure out how uccontinue works. Could there be a little more detail on uccontinue usage somewhere? Might be a silly question, but I'd appreciate the help! RachulAdmas (talk) 19:45, 25 July 2012 (UTC)
 * Hi. There is more info on this at API:Query. --R'n'B (talk) 21:16, 25 July 2012 (UTC)
 * Thanks for the response! It's just that it looks like usercontribs can't be used as a generator, so I'm unsure of how to keep grabbing data when there's more (like when there are more than 500 revisions). How should I go about continuing the query when I can't use usercontribs as a generator? Thanks again for the help, and if there's a more appropriate place for me to be posting questions, just let me know! RachulAdmas (talk) 14:50, 26 July 2012 (UTC)
 * Well, you can continue a query even without using a generator. Look at the example on API:Usercontribs -- [//en.wikipedia.org/w/api.php?action=query&list=usercontribs&ucuser=Catrope&uclimit=3&ucdir=newer api.php?action=query&list=usercontribs&ucuser=Catrope&uclimit=3&ucdir=newer] gives you a "query" element that contains a "usercontribs" element that contains three revision items, and also a "query-continue" element that contains a "usercontribs" element with a "ucstart" value. You need to add this "ucstart" key/value pair to your query string and submit it again to get the next 3 revisions (of course, in a real application, you probably would use a higher limit than 3). --R'n'B (talk) 15:08, 1 August 2012 (UTC)
 * P.S. I don't know why the API is still using "ucstart" instead of "uccontinue". A best practice for programming this would be to take whatever key is contained in the "query-continue" element and add it to your query string; that way, if the API is changed to use a different key, your program won't break. --R'n'B (talk) 15:14, 1 August 2012 (UTC)
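The generic approach described above can be sketched as follows, assuming the older "query-continue" response shape described in the example (with format=json). The `merge_continue` helper is hypothetical; the point is that it copies back whatever keys the server returns, rather than hard-coding "ucstart".

```python
# Sketch of a generic continuation loop: fold whatever keys the API
# returns in its continuation element into the next request, so the
# client keeps working even if the key name changes.
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"  # example endpoint

def merge_continue(params, data):
    """Copy the continuation keys from one response into the next
    request's parameters; return False when there is nothing more."""
    cont = data.get("query-continue", {}).get("usercontribs")
    if not cont:
        return False
    params.update(cont)  # e.g. {"ucstart": "2012-07-26T14:50:00Z"}
    return True

def all_contribs(user, limit=500):
    """Fetch every contribution by repeating the query until done."""
    params = {
        "action": "query", "format": "json", "list": "usercontribs",
        "ucuser": user, "uclimit": limit, "ucdir": "newer",
    }
    results = []
    while True:
        url = f"{API}?{urllib.parse.urlencode(params)}"
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        results.extend(data["query"]["usercontribs"])
        if not merge_continue(params, data):
            break
    return results
```

Each pass appends one batch of results and re-submits the query with the continuation value added, exactly as the example above does by hand.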

Wikiget

 * Wikiget, a Unix command-line tool to retrieve user contributions, with filtering options. -- Green Cardamom (talk) 01:49, 23 November 2016 (UTC)

No duplicates
Hi, is there a parameter where you can specify that all duplicates should be stripped out from the result set, so that you get n (as specified in ) different records? --Minilexikon (talk) 09:17, 8 June 2017 (UTC)


 * I think  would have that effect. – Robin Hood   (talk)  09:27, 8 June 2017 (UTC)
 * Doesn't that mean that if someone else changed the page after me, it wouldn't appear in the result set? -- Minilexikon (talk) 09:49, 8 June 2017 (UTC)
 * I think it would, yeah. I honestly haven't worked with this module much. You may just have to filter the results client-side. – Robin Hood  (talk)  20:29, 8 June 2017 (UTC)
 * I guess, it isn't meant to work this way, is it? --Minilexikon (talk) 20:33, 8 June 2017 (UTC)
 * In my opinion, a good, well-designed API ought to handle this on the server side. Minilexikon (talk) 09:43, 30 June 2017 (UTC)
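Until the server offers such a parameter, the client-side filtering suggested above is a one-pass job. This is a sketch (the helper name `dedupe_by_title` is illustrative): keep the first contribution seen for each title, which with the default newest-first ordering means keeping the most recent edit per page.

```python
def dedupe_by_title(contribs):
    """Keep only the first contribution per page title, preserving
    the order of the input list (newest-first by default)."""
    seen = set()
    unique = []
    for c in contribs:
        if c["title"] not in seen:
            seen.add(c["title"])
            unique.append(c)
    return unique
```

Note that filtering client-side means a request for n results may yield fewer than n distinct pages, so continuation may still be needed to fill the quota.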

Did multiple ucuser just break?
I'm sure this worked a little while back, but I can no longer specify ucuser=User1|User2 using either a literal | or %7C. The query just times out. It works fine with one user, so I can merge client-side if necessary. Did that change? DavidBrooks (talk) 02:25, 6 October 2018 (UTC)

I'd really like to know where this fad comes from of calling things "broken" when they merely work differently from what someone envisions. A timeout doesn't mean something is broken. API:Usercontribs, just like Special:Contributions, is simply incredibly inefficient due to its underlying design. So if Special:Contributions can time out with just one user, one can imagine what will happen if several users, or too many revisions, are queried. 15:31, 6 October 2018 (UTC)


 * (replying to comment from User:197.235.78.130) My apologies for the implication. I would just instead observe that a specific two-user query has worked unfailingly over many months in the recent past. It now times out every time I try it (about 20 attempts over several days) but a single-user query always returns. I wonder if there has been a change that anyone has noticed to make performance significantly (one might say reliably) worse, and whether it would be a good plan to revert to a client-side merge from now on. DavidBrooks (talk) 17:37, 8 October 2018 (UTC)
 * Well, there isn't anywhere near enough information to answer your question. The wiki in question might be a third-party wiki, hosted on a laptop or a supercomputer, and it might or might not have changed its configuration and lowered its timeouts. Just because the query is similar (or even the same), it doesn't mean the underlying data is being retrieved in the same way. In any case, one possible answer is here: https://phabricator.wikimedia.org/T33197#356905 . 09:51, 9 October 2018 (UTC)
 * Again, apologies. I came here from English Wikipedia. Here is the hanging URL (apologies for any line break)

GET /w/api.php?action=query&format=xml&list=usercontribs&ucuser=DavidBrooks%7CDavidBrooks-AWB&uclimit=20&ucnamespace=0&ucprop=title%7Ctimestamp%7Ccomment%7Cflags&continue= HTTP/1.1
 * Removing the "%7CDavidBrooks-AWB" results in an immediate reply, and all the continuations run to completion. I'll try to report this at the thread you cited. Thanks. DavidBrooks (talk) 14:05, 9 October 2018 (UTC)
 * For Wikimedia wikis, a new timeout was added for long-running queries: https://phabricator.wikimedia.org/T97192. As noted previously, that special page and its associated API are inefficient, so more API calls will now time out, as intended. However, multiple ucuser values still work; https://en.wikipedia.org/w/api.php?action=query&format=json&list=usercontribs&continue=-%7C%7C&ucuser=%E2%80%8ERishin%20Chatterjee%7C%20%E2%80%8EDinesh%20sulaniya&ucnamespace=2 currently works perfectly. You could file a bug report if you believe your query should work, and maybe they'll try to optimize it, but that seems unlikely. The alternative is indeed joining the results client-side. 197.235.208.193 14:46, 9 October 2018 (UTC)
 * As of a few hours ago, my test case has started returning data as before. Either it's a good morning, or an update has been pushed. Thanks anyway for responding to my (somewhat ill-formed) question. DavidBrooks (talk) 16:20, 11 October 2018 (UTC)
 * It is probably due to the switchover (https://meta.wikimedia.org/wiki/Tech/Server_switch_2018). But the performance issue hasn't really been fixed; see https://commons.wikimedia.org/w/api.php?action=query&format=json&list=usercontribs&ucuser=F%C3%A6%7CFaebot for an example. 12:46, 12 October 2018 (UTC)
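The client-side join mentioned in this thread amounts to querying each user separately and interleaving the results. A sketch, under the assumption that each contribution item carries an ISO 8601 "timestamp" field (which sorts correctly as a plain string); the helper name `merge_by_timestamp` is illustrative:

```python
def merge_by_timestamp(*result_sets):
    """Merge per-user contribution lists into a single list ordered
    newest-first by the ISO 8601 'timestamp' field, approximating what
    a combined ucuser=User1|User2 query would return."""
    merged = [c for results in result_sets for c in results]
    # ISO 8601 timestamps sort lexicographically, so no parsing needed.
    merged.sort(key=lambda c: c["timestamp"], reverse=True)
    return merged
```

Running one query per user keeps each request cheap for the server, at the cost of an extra round trip per user and the small merge step above.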