Manual talk:Pywikibot/

The following discussion has been transferred from Meta-Wiki.
Any user names refer to users of that site, who are not necessarily users of (even if they share the same username).


Is autonomous_problems.dat intended to be usable as an argument for the -file: option? My version of the bot (04-22-2005 CVS) seems not to like it. It'd be of great help if it could be used; I wrote a simple script to perform the task, but at the cost of losing quite a lot of stored information. Taragui 14:52, 6 May 2005 (UTC)

probable bug

I run this bot on th (Thai) and en (English) Wikipedia. On the English Wikipedia, instead of edits appearing under my bot username, my IP address is shown in the edit history (as for a logged-out user). On the Thai Wikipedia it worked fine for me.

crashing bug

Python 2.5, Windows XP MCE with all service packs, just updated; I get this error:

NOTE: [[es:Of Vengeance and Violence]] does not exist
Getting 60 pages from wikipedia:sv...
Sleeping for 5.7 seconds, 10 Dec 2006 15:59:47 (UTC)
Dump en (wikipedia) saved
Traceback (most recent call last):
  File "C:\pyWikipedia\pywikipedia\", line 1552, in <module>
  File "C:\pyWikipedia\pywikipedia\", line 1264, in run
  File "C:\pyWikipedia\pywikipedia\", line 1238, in queryStep
  File "C:\pyWikipedia\pywikipedia\", line 1234, in oneQuery
  File "C:\pyWikipedia\pywikipedia\", line 513, in workDone
    if self.hasnondisambig(
  File "C:\pyWikipedia\pywikipedia\", line 363, in hasnondisambig
    if not pl.isDisambig():
  File "C:\pyWikipedia\pywikipedia\", line 699, in isDisambig
    for tn in self.templates():
  File "C:\pyWikipedia\pywikipedia\", line 1219, in templates
    thistxt = self.get()
  File "C:\pyWikipedia\pywikipedia\", line 487, in get
    raise self._getexception

If I start from the same place right before a crash, it dies again. Could it be failing to get sv.wikipedia? ST47 (en:User:ST47) 16:03, 10 December 2006 (UTC)

Incorrect (?) inclusion of subheadings

I had a short exchange with a bot operator. The Wikipedia bot account w:User:SieBot undid some changes I made to the w:Albert article: [1]. The Mandarin update is fine, but the German and Swedish changes are incorrect. I don't think the subheadings are appropriate for those linked articles -- the articles are each about the name; using the subheadings would be appropriate only if the English article were simply a list of the people with the name, I think. -- JHunterJ 20:27, 7 April 2007 (UTC)

Working with the logfile

Could someone explain how to use this code for languages other than English: "python -warnfile:english_treelang.log"? And in particular, for example, how to use a file such as "warning-wikipedia-kw"? Thank you very much! Benoni 10:20, 1 June 2007 (UTC)

Well, the answer seems to be to use the command: "python -lang:xx" Benoni 11:17, 1 June 2007 (UTC)

Blocking unwanted links

We need a per-page way for local editors to specify that a given interwiki link on a given page is unwanted, and a default configuration of the bots that will understand and honor such a request. The bots should never be allowed to make edits against local consensus of human editors. So we need a way for human editors to tell the bot what not to do.

The case I have in mind is w:en:Ingria, which is visited almost daily by a dozen different interwiki bot accounts and gets a link added to the highly controversial ru-sib wiki, which always gets reverted. There is a local consensus among the editors of the article that this link is not wanted. I'm not going to delve into the policies and politics of why one should or shouldn't link to ru-sib, or whether or not it is okay for editors of one language wiki to boycott another. The point is simply that it shouldn't be up to the bot to decide.

In this particular case the issue could be solved by having all bot operators set their bots to "-neverlink ru-sib", but we can't really expect all bot owners to know about this case in advance. Also, there might be cases where the rejection is not of a whole language but specific to individual pages. So, we really need something on the page to stop the bot, and we need the bot to understand that in its default configuration.

As this constant revert war against a dozen different bots is getting disruptive, I very strongly urge developers to come up with a technical solution. Otherwise, I might take the step of blocking every interwiki bot on enwiki that runs without "-neverlink ru-sib". Fut.Perf. 09:25, 19 June 2007 (UTC)
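A per-page opt-out could be implemented as a marker that the bot checks before adding a link. The template name `{{nointerwiki|xx}}` below is entirely hypothetical (no such template or default bot behaviour exists); this is only a sketch of how a bot could honour local consensus:

```python
import re

# Hypothetical per-page opt-out marker. Neither the template name nor this
# behaviour exists in the framework; this sketches how a bot could check
# a page's wikitext before adding an interwiki link to a given language.
NOLINK_RE = re.compile(r'\{\{\s*nointerwiki\s*\|\s*([a-z-]+)\s*\}\}', re.I)

def blocked_languages(wikitext):
    """Return the set of language codes this page has opted out of."""
    return {m.group(1).lower() for m in NOLINK_RE.finditer(wikitext)}

def may_add_link(wikitext, lang):
    """True if local editors have not blocked links to this language."""
    return lang.lower() not in blocked_languages(wikitext)
```

A default configuration honouring such a marker would make the per-language `-neverlink` workaround unnecessary for cases like Ingria.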

Maybe we should share our skip files, or have a wiki page that lists pages that bots should skip? TaBaZzz 16:07, 24 August 2008 (UTC)

Wybot mishandles iw links on templates

Apparently this is where I'm supposed to report that en:User:Wybot is misbehaving. It's incorrectly inserting/removing interwiki links on templates. See [2] and [3]. The first change causes the interwiki links to be included on all transcluded pages, while the second change removed valid interwiki links intended for the main template and inserted other interwiki links that should have been wrapped in a noinclude section to avoid being included on the main template. --PEJL 08:06, 16 September 2007 (UTC)

Actually, this is something better posted at en:User talk:WonYong. EVula // talk // // 08:08, 16 September 2007 (UTC)
I've left a note there referring to this discussion. I was merely following the instruction at User:Wybot that said: "Non-administrators can report misbehaving bots to Wikipedia:Administrators' noticeboard/Incidents.", generated by {{Emergency-bot-shutoff}}. If that instruction is inappropriate, perhaps it should be changed. --PEJL 08:10, 16 September 2007 (UTC)
Oh~ I am sorry. I run " -continue -autonomous". What happened? I stopped my interwiki bot. Is it an error in the bot program? I use the pywikipedia bot from SVN; the version is new. What happened?? I am not a programmer. I don't know why, how, etc. :( -- WonYong (talk contribs count logs email) 08:23, 16 September 2007 (UTC)
(I'm not sure if this is the right place to post; if not, feel free to move the discussion to someplace more appropriate, but please notify me.) One of the issues seems to be (at least in the case of the second link) the fact that the IW links are kept on a transcluded subpage (along with other documentation). Apparently, this confuses the bot. This may be bad practice, I don't know, but it certainly simplifies life when dealing with a protected template. The other issue, where it moves the IW links outside of the noinclude tags appears to simply be a bug in the bot. Xtifr tälk 08:42, 16 September 2007 (UTC)

working with categories

How can I run the bot to add interwiki links to categories, e.g. newly created categories, or subcategories which arise from a given category? --A1 21:50, 11 December 2007 (UTC)

You can use -cat:"Stubs" to add interwiki links to Category:Stubs. --Plasmarelais 17:21, 29 September 2009 (UTC)

Running on template space on EN

Is there a way that we could get an interwiki bot running on EN that will add links to the /doc subpage and recognize other iw links from templates using this pattern too? I know that there have been operators who have attempted this, but because of the special nature of the templates most of the edits had to be reverted. —Dispenser 03:47, 3 February 2008 (UTC)

Removal of links

Can someone please explain on what basis the bot decides that links are to be removed? How can this ever be done safely without human confirmation? There is no one-to-one correspondence between articles in different language Wikipedias, so linking can't be expected to be one-to-one either. (User:Kotniski from en:WP)

It seems to remove some based on whether both articles are categorised/tagged as disambiguation pages. This sometimes has unhelpful results.
I would say that the pywiki bot SoxBot III removed these interwiki links incorrectly on 1 March 2008. Having noticed the first one, I checked a dozen or so others from that date and found the others.
[4] Removed 2 valid interwikis, perhaps because the EN is tagged as disam but the others are not.
[5] Removed valid disambiguation to disambiguation link, perhaps because the EN has disambiguation subcategories/specific templates rather than the general disambiguation category/template.
[6] Removed redirects rather than replacing them.
I raised these with the editor who ran the bot and he referred me here. User:Fayenatic_london on en:WP

The above is not meant to be an answer to the question raised by user:Kotniski, but an addition to it with specific examples of problems. Please would someone who is able to improve the program look into these faults and answer us? - User:Fayenatic_london on en:WP, 23 July 2008

A problem like the one above was reported today on enwiki at en:Wikipedia:Administrators' noticeboard/Incidents#SassoBot. While looking at the code of, I discovered this change that was put in by a_engels in November, 2008. It allows the hard cases (DAB in one language and non-DAB in the other) to be skipped. So the bot will either do nothing or allow the operator to fix the pages up manually. EdJohnston 04:16, 27 February 2009 (UTC)

Is there any hope of correcting this bug, because it's still removing good iw? Regards, --Klemen Kocjančič (Talk - Fast reply) 22:12, 9 February 2011 (UTC)

Algorithm used

Can someone explain how the bot works in some cases? For example, it "finds that an interwiki link has to be changed" — how does the bot recognize this? Another case: it "finds that an interwiki link is to be removed" — when does that happen? Thanks DixonD 11:20, 14 August 2009 (UTC)

I guess the bot finds interwiki links needing a change either by following redirects or in the following situation: en:A and de:A were correctly connected, then de:A was moved to de:B and kept its interwiki link. Later the bot finds de:B pointing to en:A, but en:A still linking to de:A, which is a broken link. With one direction working, it can repair the broken one.
The bot finds links to be deleted if none of the cases above applies, or if a page links to a page that was moved or deleted without a redirect. Another case is that the interwiki link is incorrect or points to a page that never existed. --Plasmarelais 17:27, 29 September 2009 (UTC)
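The repair case described above can be sketched as a small consistency pass over a link table. This is a toy model, not the actual interwiki.py graph code: pages are plain 'lang:Title' strings and only one working direction of each link is trusted:

```python
def repair(links):
    """links maps 'lang:Title' -> set of 'lang:Title' interwiki targets.

    Toy version of the bot's reconciliation: for every link X -> Y where
    Y is a known page, make Y link back to X, dropping any stale link Y
    holds to a *different* page in X's language (the de:A -> de:B move
    case described above)."""
    fixed = {page: set(targets) for page, targets in links.items()}
    for src, targets in links.items():
        src_lang = src.split(':', 1)[0]
        for dst in targets:
            if dst not in links:
                # target missing (moved/deleted without redirect);
                # the real bot would queue this link for removal
                continue
            fixed[dst] = {t for t in fixed[dst]
                          if t.split(':', 1)[0] != src_lang or t == src}
            fixed[dst].add(src)
    return fixed
```

With `{'de:B': {'en:A'}, 'en:A': {'de:A'}}` as input, the working de:B → en:A direction is used to repoint en:A's stale de:A link at de:B.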

where do we put the option?

  • If you add the option -start, the bot will go through the pages alphabetically, starting at the word specified. If you want to start at the letter B, for example, you can use " -start:B". In particular, if you want to do the whole Wiki, you can use " -start:!". I don't know where we put the option (which scripts, or
  • When I run this script on "vi", I receive this report:

Which page to check: Mù tạc
Traceback (most recent call last):

 File "C:\Python25\pywikipedia\", line 2389, in <module>
 File "C:\Python25\pywikipedia\", line 2356, in main
   singlePageTitle = pywikibot.input(u'Which page to check:')
 File "C:\Python25\pywikipedia\", line 8494, in input
   data = ui.input(question, password)
 File "C:\Python25\pywikipedia\userinterfaces\", line 241, in input
   text = unicode(text, config.console_encoding)

TypeError: decoding Unicode is not supported

How can I fix this?--Tranletuhan 05:58, 10 March 2010 (UTC)
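The crash comes from `unicode(text, config.console_encoding)` being called on text that is already a unicode object; in Python 2, passing an encoding argument is only valid when decoding a byte string. A guard of the following shape avoids it (shown in Python 3 terms, where `bytes`/`str` play the roles of Python 2's `str`/`unicode`):

```python
def to_text(data, encoding='utf-8'):
    """Decode console input only if it is still a byte string.

    Decoding text that is already decoded is exactly what raised
    'TypeError: decoding Unicode is not supported' above.
    """
    if isinstance(data, bytes):
        return data.decode(encoding)
    return data  # already text; decoding again would raise TypeError
```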

The option -start:B is not to be put into any script. You have to add it after the name of the script when you type in the command line. Or did I get you wrong? --Plasmarelais 17:08, 2 April 2010 (UTC)


I suggest that bots do not add content to pages which do not interwiki-link to other Wikipedias. I had the tedious task of cleaning up after a vandal some weeks ago, but the job was complicated by a flood of bots that came and added wikilinks. Thus I could not use the normal rollback mechanism; instead I had to go through 6-7 steps of undoing for each page. The problem is possibly bigger in small Wikipedias (like the Faroese) as we do not have the same manpower as big-pedias. Quackor 06:25, 31 May 2010 (UTC)

Possible problem

When running on the Hebrew Wikipedia, I sometimes get the following message and the bot stops. What is the problem and how can it be fixed?

[[he:חגורת האסטרואידים]] gives new interwiki [[zh-min-nan:Si¢-he?k-chhe
Dump he (wikipedia) appended.
Traceback (most recent call last):

File "C:\pywikipedia\", line 2514, in <module>

File "C:\pywikipedia\", line 2488, in main

File "C:\pywikipedia\", line 2248, in run

File "C:\pywikipedia\", line 2221, in queryStep

File "C:\pywikipedia\", line 2217, in oneQuery

File "C:\pywikipedia\", line 1453, in batchLoaded
   pywikibot.output(u"%s: %s gives new interwiki %s" % (self.originPage.title(asLink=True), page.aslink(True), linkedPage.aslink(True)))

File "C:\pywikipedia\", line 7554, in output
   ui.output(text, toStdout = toStdout)

File "C:\pywikipedia\userinterfaces\", line 221, in output
   self.printColorized(text, targetStream)

File "C:\pywikipedia\userinterfaces\", line 174, in printColorized
   self.printColorizedInWindows(text, targetStream)

File "C:\pywikipedia\userinterfaces\", line 140, in printColorizedInWindows
   targetStream.write(text[:tagM.start()].encode(config.console_encoding, 'replace'))

IOError: [Errno 42] Illegal byte sequence
10:25, 6 November 2010 (UTC)
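The final `IOError: Illegal byte sequence` means the Windows console code page cannot represent some characters of the zh-min-nan title. Encoding with the `'replace'` error handler before writing avoids the crash; the sketch below only illustrates that idea, with an in-memory buffer standing in for the real console stream:

```python
import io

def safe_write(stream, text, encoding):
    """Write text to a byte stream, substituting '?' for characters the
    console code page cannot represent instead of raising an error."""
    stream.write(text.encode(encoding, 'replace'))

# usage sketch: an ASCII-only 'console' receiving a zh-min-nan title
console = io.BytesIO()
safe_write(console, u'Si\u00f3-he\u030dk', 'ascii')
```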

Interwiki on documentation subpage

I was warned that my bot has duplicated some interwiki links. There were links on the subpage and the bot added them again on the main template page. Is there any way to fix this? Firilăcroco discuție / talk 16:04, 10 January 2011 (UTC)

Could you please explain that in a little more detail? Maybe with a link to the pages, then I'd understand better, I think. Thx, --Plasmarelais 17:49, 17 January 2011 (UTC)
I believe that Firilăcroco is talking about links moved to documentation subpages of templates. Actually, I posted a solution to the tracker about a month ago, but it is still not reviewed and can be used only with manual review of every edit made by the bot. --DixonD 13:49, 21 January 2011 (UTC)

Feature required on Commons

What about the Sister templates? JackPotte 19:53, 25 January 2011 (UTC)

Accessing MediaWiki:Mainpage

I was trying to run today and encountered the following error. It occurred right after starting the script:

Traceback (most recent call last):
  File "c:\Python26\pywikipedia\", line 2081, in <module>
    mainpagename = site.mediawiki_message('mainpage')
  File "c:\Python26\pywikipedia\", line 4991, in mediawiki_message
    tree = XML(decode)
  File "<string>", line 85, in XML
SyntaxError: not well-formed (invalid token): line 142, column 251

I ran it on the German Memory Alpha. To me it seems that there is a problem with MediaWiki:Mainpage. The scripts run fine as soon as I comment out the following part of (line 2081):

        # ensure that we don't try to change main page
        try:
            site = wikipedia.getSite()
            mainpagename = site.mediawiki_message('mainpage')
            globalvar.skip.add(wikipedia.Page(site, mainpagename))
        except wikipedia.Error:
            wikipedia.output(u'Missing main page name')

So how should I modify the script to make it run like it always used to? I guess there could be a connection to recent updates of the MediaWiki software and/or the new skin. Thanks for any help, --Plasmarelais 22:52, 1 February 2011 (UTC)
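Rather than commenting the block out, the `except` clause could be widened so that a malformed API reply merely skips the main-page protection instead of aborting. The sketch below uses stub site objects in place of the real `wikipedia.Site`, and the exception types caught are an assumption; the real code catches only wikipedia.Error, which is why the `SyntaxError` escaped:

```python
class BrokenSite:
    """Stub for a site whose MediaWiki:Mainpage API reply is malformed."""
    def mediawiki_message(self, key):
        raise SyntaxError('not well-formed (invalid token)')

class GoodSite:
    """Stub for a site that answers normally."""
    def mediawiki_message(self, key):
        return 'Hauptseite'

def main_page_name(site, fallback=None):
    """Fetch the main page name, tolerating a malformed API response."""
    try:
        return site.mediawiki_message('mainpage')
    except (SyntaxError, ValueError, KeyError):
        # could not parse the reply: skip main-page protection, don't die
        return fallback
```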

Solution: update to Python 2.7 and the newest framework. --Plasmarelais 17:00, 12 February 2011 (UTC)

removal of multiple interwikis in Wikisource

Several Wikisource subdomains allow multiple interwiki links pointing to the same language, because a text can have several editions and/or several translations in other languages. But the default behaviour of is to remove these "surplus" interwikis without even asking for confirmation; see here for example (I was running it in manual mode). Is it possible to have some workaround for this? Something like "are you sure you want to remove this link? y/n". Thanks in advance. Candalua 11:38, 26 February 2011 (UTC)

I confirm that this is a blocker for some developers. JackPotte 19:42, 28 February 2011 (UTC)
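A workaround of the kind asked for could route each same-language duplicate through a confirmation callback before dropping it. No such hook exists in the script; this only sketches the proposed behaviour:

```python
def prune_duplicates(links, confirm):
    """links: list of (lang, title) interwiki links.

    Keep the first link per language unconditionally; for each further
    link in an already-seen language, ask confirm(lang, title) and drop
    the link only if the operator approves -- so Wikisource-style
    multiple editions survive unless explicitly removed."""
    seen = set()
    kept = []
    for lang, title in links:
        if lang in seen and confirm(lang, title):
            continue  # operator approved the removal
        seen.add(lang)
        kept.append((lang, title))
    return kept
```

In autonomous mode the callback would simply always answer "no", leaving the duplicates alone.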


Looks like hints aren't working: when I use -hint:all it still asks for at least one interwiki link. Any way to fix this? Thanks, 20:27, 30 May 2011 (UTC)

name space

With the force option, when the bot comes across a different namespace it will delete that interwiki link. In my opinion it could check the other interwiki links and, instead of deleting, fix that interwiki link to the correct namespace. Reza1615 05:08, 9 August 2011 (UTC)

Need Assistance

I am not sure what to do to fix robots adding interwiki links that clearly should not be added. For example, SZS on shwp clearly is not talking about Sayonara, Zetsubou-Sensei. When I removed the interwiki link manually from jawp, the robot re-added the link. Do I have to delink from both sides? Penwhale (talk) 09:09, 16 July 2012 (UTC)

extracting interwiki links

Is there a possibility to extract the interwiki links? I mean, instead of changing the page, just return: these are all the interwiki links found... and they are not controversial. --Sk!d (talk) 22:01, 30 September 2012 (UTC)
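Extracting the links without editing can be approximated by parsing the wikitext directly. The regex and the tiny language whitelist below are deliberate simplifications; the framework's own parser additionally handles case and whitespace variants and distinguishes categories and images properly:

```python
import re

# tiny illustrative subset of valid language codes
LANGS = {'de', 'en', 'fr', 'sv', 'nan'}

# lowercase prefix + title, e.g. [[de:Beispiel]]; the lowercase-only
# prefix class already skips [[Category:...]] and [[Image:...]]
IW_RE = re.compile(r'\[\[\s*([a-z-]+)\s*:\s*([^\]|]+?)\s*\]\]')

def interwiki_links(wikitext):
    """Return (lang, title) pairs found in the wikitext, edit-free."""
    return [(code, title) for code, title in IW_RE.findall(wikitext)
            if code in LANGS]
```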

Bug with i18n translation

Hello, for a few days now my bot has always stopped with an error message:

Traceback (most recent call last):

 File "", line 2584, in <module>
 File "", line 2558, in main
 File "", line 2291, in run
 File "", line 2269, in queryStep
 File "", line 1718, in finish
   if self.replaceLinks(page, new, bot):
 File "", line 1833, in replaceLinks
 File "", line 2342, in compareLanguages
   mcomment += i18n.twtranslate(insite.lang, commentname) % changes
 File "C:\Python27\pywikipedia\pywikibot\", line 304, in twtranslate
   raise TranslationError("No English translation has been defined for TranslateWiki key %r" % twtitle)

pywikibot.i18n.TranslationError: No English translation has been defined for TranslateWiki key 'interwiki-modifying-from'

I updated my bot recently (revision 10749) and I don't know exactly why it crashed. Could someone check? --Gdgourou (talk) 07:08, 26 November 2012 (UTC)
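The error means the key 'interwiki-modifying-from' has no English master message in that revision's i18n data, and `twtranslate` refuses to continue without one. The fallback logic can be seen in miniature below (toy message table and simplified exception, not the real i18n files):

```python
# toy message table standing in for pywikibot's i18n data
MESSAGES = {
    'en': {'interwiki-modifying': 'Robot: Modifying %s'},
    'fr': {'interwiki-modifying': 'Robot : Modifie %s'},
}

def twtranslate(lang, key):
    """Translate key for lang, falling back to English; refuse to work
    if even the English master message is missing, mirroring the
    TranslationError raised above."""
    if key not in MESSAGES.get('en', {}):
        raise KeyError('No English translation has been defined for %r' % key)
    return MESSAGES.get(lang, {}).get(key, MESSAGES['en'][key])
```

So the crash is a data problem (a missing message in that revision), not something wrong with the bot's own configuration.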

How to link 2 French speaking articles to one English speaking article?

Bonjour. Interlanguage link bots seem not to allow duplication of interlanguage links.

For example: Mango is a single English language article, whereas the French Wikipedia has a page for the fruit and one for the tree.

Bots will just keep one interlanguage link, so one of the two French speaking articles will not be interlanguage linked.

Plenty of pages don't carry interlanguage links that would be so useful for people who translate or want to compare various articles in various languages. Can this be worked around?

I'm new and don't know where to look. This topic must have been discussed before. I apologise if this is not the place or if a solution is easy to find. If you don't know how this can be worked around, could you please point me to users who may know? Very many thanks in advance for your help. Best, Edouard Albert (talk) 12:08, 13 December 2012 (UTC)

Hi again, anyone know where I should look for this information please? Edouard Albert (talk) 20:03, 16 February 2013 (UTC)

zh-min-nan -> nan

I'm not sure where to report this problem, but it seems that pywikipediabot is still using "zh-min-nan" as the interwiki code for "". Maybe it should be changed to "nan", as MediaWiki released a related update many years ago (though the URL has not been fixed yet...). Luuva (talk) 16:23, 17 January 2013 (UTC)

Running on Memory Alpha

Hi @all! I keep trying to run on, but the script always runs into the following error:

======Post-processing [[de:Aamin Marritza]]======
Updating links on page [[nl:Aamin Marritza]].
Changes to be made: Robot: Adding [[de:Aamin Marritza]]

NOTE: Updating live wiki...
Page [[nl:Aamin Marritza]] saved
Dump de (memory-alpha) appended.
Traceback (most recent call last):
  File "ma\", line 143, in <module>
    run_python_file(fn, argv, argvu)
  File "ma\", line 67, in run_python_file
    exec(compile(source, filename, "exec"), main_mod.__dict__)
  File "ma\scripts\", line 2610, in <module>
  File "ma\scripts\", line 2584, in main
  File "ma\scripts\", line 2334, in run
  File "ma\scripts\", line 2312, in queryStep
  File "ma\scripts\", line 1767, in finish
    if self.replaceLinks(page, new):
  File "ma\scripts\", line 1990, in replaceLinks
    status, reason, data = page.put(newtext, comment=mcomment)
TypeError: 'NoneType' object is not iterable

Does anybody have an idea what to do? Updating to Python 2.7.5 and the newest pywikibot framework didn't help anything. I'm getting desperate about this... --Plasmarelais (talk) 19:45, 27 May 2014 (UTC)
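The failing line unpacks `status, reason, data = page.put(...)`, so the crash means `put()` returned `None` instead of the usual 3-tuple. A caller-side guard would look roughly like this (stub page object; the names and the fallback tuple are assumptions, not the framework's API):

```python
def save_page(page, text, comment):
    """Call page.put() but tolerate an implementation returning None
    instead of the expected (status, reason, data) 3-tuple."""
    result = page.put(text, comment=comment)
    if result is None:
        # treat a missing return value as an unreported-but-done save
        return (0, 'no status reported', None)
    return result

class StubPage:
    """Stub mimicking the buggy code path where put() returns None."""
    def put(self, text, comment=None):
        return None
```

The proper fix is of course in the framework itself (as in the gerrit change mentioned below), not in every caller.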

In the Wikia community they gave me the hint to use compat version 8888. I did so, and now the script runs fine. --Plasmarelais (talk) 20:42, 5 June 2014 (UTC)
@Plasmarelais: Could you please try the latest version of the script in core? I fixed what I believe was the same issue in gerrit:132711. whym (talk) 12:25, 6 June 2014 (UTC)
I'll check that asap. --Plasmarelais (talk) 19:45, 25 June 2014 (UTC)

Trouble with (or at least being exposed by) -start

Using Python 3.7.6. The wikis themselves are at Fandom. Got my user and family files set up, successfully used interwiki with a few specific pages to see that it was making the right edits in the right places. However, now trying to make it take care of many pages at once, it throws up on me.

I:\pywikibot>python interwiki -start:!
NOTE: Number of pages queued is 0, trying to add 50 more.
Retrieving 50 pages from hpwiki:ja.
Dump ja (hpwiki) appended.
Traceback (most recent call last):
  File "", line 297, in <module>
    if not main():
  File "", line 292, in main
    run_python_file(filename, [filename] + args, argvu, file_package)
  File "", line 96, in run_python_file
  File ".\scripts\", line 2577, in <module>
  File ".\scripts\", line 2553, in main
  File ".\scripts\", line 2265, in run
  File ".\scripts\", line 2239, in queryStep
  File ".\scripts\", line 2228, in oneQuery
    for page in gen:
  File "I:\pywikibot\pywikibot\", line 3374, in preloadpages
    for pagedata in rvgen:
  File "I:\pywikibot\pywikibot\data\", line 2983, in __iter__
    for result in super(PropertyGenerator, self).__iter__():
  File "I:\pywikibot\pywikibot\data\", line 2823, in __iter__
    for result in self._extract_results(resultdata):
  File "I:\pywikibot\pywikibot\data\", line 2997, in _extract_results
    self._update_old_result_dict(d, data_dict)
  File "I:\pywikibot\pywikibot\data\", line 3018, in _update_old_result_dict
    'continued API result had an unexpected type: %s' % type(v))
AssertionError: continued API result had an unexpected type: <class 'dict'>
CRITICAL: Exiting due to uncaught exception <class 'AssertionError'>

It seems to have some connection with the actual number of pages. I tried changing it from starting at ! to starting at a place late in the alphabet, and it successfully found and dealt with a couple dozen pages. JoshuaJSlone (talk) 17:47, 28 July 2019 (UTC)

I tried searching for help in a few other places. I came to a partial solution that's workable enough for me, so I'll leave the info here. The problem seems related to the "query continue" part of the API, which has changed between MediaWiki versions. I couldn't figure out how to force Pywikibot to act appropriately for an older version, so I decided to try to find an old version of Pywikibot altogether. I had trouble finding old versions with downloads of every file the standard modern download has, but found enough old versions of files that I decided to just try other versions of the file. The one from 2018-05-05 is mostly successful; it doesn't give the error above, anyway. I do occasionally get a couple of other types of error, but it gets enough done between occurrences that I can get my interwiki links taken care of without too much extra hassle on my part. JoshuaJSlone (talk) 19:21, 30 July 2019 (UTC)
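The assertion fires while merging continued API results: the merge code expected every continued property value to be a list it could extend, but some MediaWiki versions return certain properties as dicts. A merge that tolerates both shapes looks roughly like this (a simplified sketch, not Pywikibot's actual `_update_old_result_dict`):

```python
def merge_continued(old, new):
    """Merge a continued API page record `new` into `old` in place,
    handling both list- and dict-valued properties."""
    for key, value in new.items():
        if key not in old:
            old[key] = value
        elif isinstance(value, list):
            old[key].extend(value)   # e.g. more langlinks entries
        elif isinstance(value, dict):
            old[key].update(value)   # dict-shaped continuation data
        # scalars (pageid, title, ...) are assumed unchanged
```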