API talk:Query

first pages of WP have a swear word
I visited http://en.wikipedia.org/w/api.php?action=query&generator=allpages&gaplimit=4 and was greeted with a swear word. Jidanni 20:26, 19 October 2007 (UTC)
 * !!!Fuck You!!! is about a music album. This is just another Wikipedia article, nothing the API can do about that. --Catrope 20:39, 19 October 2007 (UTC)

prop=info
I think prop=info should be documented here. Can anyone fix the problem? --87.6.112.238 16:39, 1 December 2007 (UTC)
 * It's documented here, and also linked from this page ("Page information"). --Catrope 14:43, 3 December 2007 (UTC)

Suggestion: start and limit versus start and until
Having to specify a starting name or title plus a limit makes it necessary to start yet another query, using the last name determined from the previous one, whenever the number of items returned does not meet the caller's needs.

To make programming easier, we could also allow specifying the name or title of the last item wanted. This can sometimes, but not always, be achieved using "prefixindex". --Purodha Blissenbach 14:14, 11 March 2008 (UTC)
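For illustration, here is a minimal sketch of the client-side workaround the suggestion above would make unnecessary: keep paging with a start parameter and a limit, stopping once the last wanted title has been passed. The page-fetching function is a stub standing in for real API requests; the names here are illustrative, not part of the API.

```python
# Sketch: emulate an "until" parameter client-side by paging with a
# start value and discarding everything past the last wanted title.
def collect_until(fetch_page, start, until):
    results, cont = [], start
    while cont is not None:
        titles, cont = fetch_page(cont)   # returns (titles, next start value)
        for t in titles:
            if t > until:                 # passed the last wanted title
                return results
            results.append(t)
    return results

# Stub standing in for one API request returning a batch of titles.
pages = [["A", "B", "C"], ["D", "E", "F"], ["G", "H", "I"]]
def fetch_page(cont):
    idx = {"A": 0, "D": 1, "G": 2}[cont]
    nxt = ["D", "G", None][idx]
    return pages[idx], nxt

print(collect_until(fetch_page, "A", "E"))  # ['A', 'B', 'C', 'D', 'E']
```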

Is there a non-obsolete alternative?
Suggestion: note in the article that "api.php is a non-obsolete alternative to query.php".
 * I don't think that's necessary. You're supposed to get here through API, where that's said already. --Catrope 20:01, 28 March 2008 (UTC)

limit = max ?
Hi, would it be possible to have a special "max" value for limit parameters ? That would just tell the API to use the maximum authorized value for the user. This would be useful for example for tools that can be run both by regular users, bots or sysops. --NicoV 18:26, 7 July 2008 (UTC)
 * That feature has existed for ages. It could be that it's not documented here, I'll check. --Catrope 20:37, 7 July 2008 (UTC)
 * Oups, thanks for pointing that out and putting a link to the other documentation :) --NicoV 18:56, 8 July 2008 (UTC)
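To illustrate the feature discussed above, here is a sketch of a query URL that uses the special "max" value for a limit parameter, so the server substitutes the highest limit the caller is allowed. The helper name is made up for illustration; only the parameters are from the API.

```python
from urllib.parse import urlencode

# Hypothetical helper: build an allpages query that asks the server for
# the caller's maximum allowed limit instead of hard-coding a number.
def build_allpages_query(api_url="https://en.wikipedia.org/w/api.php"):
    params = {
        "action": "query",
        "list": "allpages",
        "aplimit": "max",   # server substitutes the caller's maximum
        "format": "json",
    }
    return api_url + "?" + urlencode(params)

print(build_allpages_query())
```

This way the same code works unchanged whether it is run by a regular user, a bot, or a sysop.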

Limits confusion
There seem to be at least two kinds of limits which this documentation is not clear about.


 * The limit on the number of titles
 * API:Query
 * Specifying titles through the query string is limited to 51 titles per query (or 501 for those with the apihighlimits right, usually bots and sysops).
 * The limits for just about everything else
 * API:Query
 * API:Query - Lists

It seems the former does not have features such as "max", the "limits" element, or "query-continue".

Can this be clarified by somebody who knows? In particular I'd like my code to deterministically know how many titles it can query for each site no matter whether it is run as bot or normal etc. &mdash; Hippietrail 16:35, 2 August 2008 (UTC)
 * You're right that there's no query-continue for the limit on titles (a limit that applies to all multivalue parameters, i.e. parameters that allow multiple values separated by the | character); excess titles will just be ignored. To determine which limit applies to you, run a userinfo query to see whether you've got the apihighlimits right. --Catrope 22:10, 3 August 2008 (UTC)
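A sketch of the deterministic approach asked about above: check the client's rights once, then split long title lists into request-sized batches. The batch sizes are an assumption (50 and 500 are common; the section above quotes 51 and 501, and the exact numbers may vary by MediaWiki version), so verify them against meta=userinfo&uiprop=rights at runtime.

```python
# Sketch: split a long list of titles into request-sized chunks.
# The chunk size depends on whether the client has the apihighlimits
# right; the exact numbers are an assumption, check your wiki's limits.
def chunk_titles(titles, has_highlimits=False):
    size = 500 if has_highlimits else 50
    for i in range(0, len(titles), size):
        yield "|".join(titles[i:i + size])  # multivalue separator is |

batches = list(chunk_titles(["Page %d" % n for n in range(120)]))
print(len(batches))  # 120 titles -> 3 batches of at most 50
```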

Tokens cannot be obtained through JSON Callback mode
I understand why this limitation exists, but please, it would be nice if it were written in the documentation! I lost a day of experimentation for my degree thesis with various communication methods, only to find out (by examining the source) that tokens cannot be obtained when there is a JSON callback. MediaWiki always returned an error. I hope that this discussion will help others googling this to find info on the issue. --Bobo italy 11:04, 24 February 2009 (UTC)
 * Added here. --Catrope 19:43, 24 February 2009 (UTC)
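A small sketch of the constraint described above, from the client's side: a token request must not carry a callback parameter, or the API will refuse to return the token. The helper and the guard are illustrative; the parameter names follow the old intoken style of token fetching discussed here.

```python
from urllib.parse import urlencode

# Sketch: build a token-fetching request, refusing JSON callback mode,
# since the API will not hand out tokens to a callback (the token would
# be exposed to any third-party page embedding the script).
def build_token_request(callback=None):
    if callback is not None:
        raise ValueError("tokens cannot be obtained in JSON callback mode")
    params = {
        "action": "query",
        "prop": "info",
        "intoken": "edit",
        "titles": "Main Page",
        "format": "json",
    }
    return urlencode(params)

print(build_token_request())
```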

continue and Generators
I am missing information about how to continue through a query when I use a generator. It does not seem right to add both continue params to the query URL. Must I first add the continue param from the non-generator module, and only once that is exhausted use the generator's continue? Is that right? And why does the query-continue sometimes appear at the top and sometimes at the end of the query result? Thanks for any information 80.143.85.173 14:00, 23 May 2009 (UTC)
 * You're right, you should first continue the 'regular' module, then the generator; I'll add this information. The query-continue element sometimes being on top and sometimes on the bottom may be weird, but not very relevant. --Catrope 21:54, 23 May 2009 (UTC)
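The continuation order described above can be sketched as follows, using the old query-continue protocol and generator=allpages with prop=links as an assumed example (module and parameter names are for illustration): exhaust the regular module's continue first, then move on to the generator's.

```python
# Sketch: pick the next request's continue parameters, continuing the
# 'regular' module (prop=links -> plcontinue) before the generator
# (generator=allpages -> gapcontinue), as described above.
def next_request(base_params, query_continue):
    params = dict(base_params)
    # query_continue maps module name -> its continue parameters, e.g.
    # {"links": {"plcontinue": "..."}, "allpages": {"gapcontinue": "..."}}
    if "links" in query_continue:        # regular module first
        params.update(query_continue["links"])
    elif "allpages" in query_continue:   # only then the generator
        params.update(query_continue["allpages"])
    return params

base = {"action": "query", "generator": "allpages", "prop": "links"}
print(next_request(base, {"links": {"plcontinue": "123|0|Foo"},
                          "allpages": {"gapcontinue": "Bar"}}))
```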

Can a generator generate titles for a list query?
I want to identify articles in a category that aren't in a list. This would be trivial, except that the list might link to an article in the category via a redirect. What it boils down to is this: I need a list of redirects to articles in a given category.

The obvious solution is to use generator=categorymembers with list=backlinks. But this doesn't work for me. The documentation doesn't say this can't be done, but I note that there is not a single example in the documentation, of a generator that passes generated titles to a list query: every single generator example passes the generated titles to a prop query.

Can this be done? If so, please add an example to the documentation; if not, please update the documentation to explicitly say this is not possible.

Hesperian 02:13, 1 July 2009 (UTC)
 * To me this is pretty obvious from the docs on generators, which say that the generated pages are substituted for the titles parameter, whereas list=backlinks uses the bltitle parameter. Also, the latter accepts only one title while a generator may generate more than one title. In short: no, this cannot be done in a single request using generators, you'd have to run a separate list=backlinks query for each title. --Catrope 11:31, 1 July 2009 (UTC)
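The workaround described in the reply above can be sketched like this: since list=backlinks accepts a single bltitle, issue one backlinks request per category member rather than trying to feed a generator into the list module. The helper name is made up; the parameters (including blfilterredir=redirects, which restricts results to redirects) are from the backlinks module.

```python
from urllib.parse import urlencode

# Sketch: one list=backlinks request per title, filtered to redirects,
# since generators cannot feed the bltitle parameter of a list module.
def backlinks_queries(titles, api_url="https://en.wikipedia.org/w/api.php"):
    for title in titles:
        params = {
            "action": "query",
            "list": "backlinks",
            "bltitle": title,
            "blfilterredir": "redirects",  # only redirects to the page
            "format": "json",
        }
        yield api_url + "?" + urlencode(params)

urls = list(backlinks_queries(["Foo", "Bar"]))
print(len(urls))  # one request per title
```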

Character issues with meta=siteinfo
After receiving errors in the Collection extension indicating that the site info could not be retrieved, I narrowed the problem down to the MediaWiki API calls made by the Collection extension. The errors occurred while using the mwserve renderer, which creates a PDF from a collection of pages.

The API call that causes the problem is in the file ApiQuerySiteinfo.php, under the function appendGeneralInfoOriginal. The offending code is below:

 $mainPage = Title::newFromText(wfMsgForContent('mainpage'));
 $data['mainpage'] = $mainPage->getPrefixedText();
 $data['base'] = $mainPage->getFullUrl();

The line that causes problems is the one that accesses the $mainPage object to get the full URL. Commenting only that line out of the file makes the API command run successfully and produce XML output to the screen.

The result of running the following Api query: api.php?action=query&meta=siteinfo&format=xmlfm presents the following output: ï»¿

According to the Joomla Forum, this is the result of a Byte Order Mark that is not interpreted correctly. Yes, "ï»¿" is the Byte Order Mark (BOM) of the Unicode Standard. Specifically it is the hex bytes EF BB BF, which form the UTF-8 representation of the BOM, misinterpreted as ISO 8859/1 text instead of UTF-8.

Probably what it means is that you are using a text editor that is saving files in UTF-8 with the BOM, when it should be saving without the BOM. It could be PHP files that have the BOM, in which case they'd appear as literal text on your page. Or it could be translated text you pasted into Joomla! edit windows.

I do not know why others are not having the same problem. I have verified that MySQL is using UTF-8. I experimented with $wgShowHostnames, with no effect. I set $wgDBmysql5 from 'true' to 'false' as a test, with no effect (NOTE that changing this setting is not recommended, and it is actually supposed to force names to UTF-8). I also verified that no PHP settings for Apache were changed in mbstring or other areas.

I have spent considerable time researching various threads on the web and none could provide help or a solution. My (pathetic) resolution was to finally hard-code the URL into the API file. After that, the calls from the Collection extension work fine and PDF output is correctly produced.

--David, 9/30/2009 3:06pm (-5)
 * Apparently, you've edited some file (LocalSettings.php?) with Windows Notepad or some other screwy text editor that added a BOM to it. You'll have to find and remove it. I recommend using something like Notepad++ in the future. 62.140.253.6 19:17, 30 September 2009 (UTC)
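As a practical follow-up to the reply above, here is a small sketch for checking whether a PHP or configuration file starts with a UTF-8 BOM (the hex bytes EF BB BF, which render as "ï»¿" when misread as ISO 8859-1) and stripping it:

```python
import codecs

# Sketch: strip a leading UTF-8 BOM (EF BB BF) from file contents,
# the likely cause of the stray "ï»¿" output described above.
def strip_bom(data: bytes) -> bytes:
    if data.startswith(codecs.BOM_UTF8):   # b'\xef\xbb\xbf'
        return data[len(codecs.BOM_UTF8):]
    return data

print(strip_bom(b"\xef\xbb\xbf<?php echo 'hi';"))  # b"<?php echo 'hi';"
```

Running this over LocalSettings.php and any locally edited extension files, and rewriting the ones that change, would remove the stray bytes without having to hunt for the offending file by hand.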