API talk:Main page


Renaming this page, front doors to the APIs

This page is no longer a Main_page for APIs (plural); it's an introduction to the MediaWiki action API, our biggest and oldest API. The next step is to rename the page to more accurately reflect its role. The {{API}} navigation template already links to it under the "MediaWiki APIs" heading and as "Introduction and quick start" (I just now renamed the latter from "Quick start guide"). My choice is:

API:Introduction to the action API

which accurately if incompletely describes its content, so unless someone has a better suggestion I'll rename the page to that. API:Main_page will of course redirect to the new name.

Alternatives:

  • trimming to "API:Action API" makes the page mysterious
  • expanding "API:Introduction to the MediaWiki action API" feels unnecessarily long
  • "API:Action API introduction and quick start" is even more informative and would match its item in the {{API}} navigation template, but is too wordy. :-)

Note that for certain developers, API:Web APIs hub is a better introduction to "the APIs that let you access free open knowledge on Wikimedia wikis." Eventually I think "Web APIs hub" should be the destination of the heading MediaWiki APIs in the {{API}} navigation template.

This is part of phab:T105133, "Organize current and new content in the API: namespace at mediawiki.org".

-- SPage (WMF) (talk) 20:50, 31 August 2015 (UTC)

The current aliases, as seen in the redirects. Cpiral (talk) 18:55, 19 January 2016 (UTC)

Server returned HTTP response code: 500 for URL: https://www.wikidata.org/w/api.php

When I try to log in to Wikidata I keep getting a 500 error. I don't have any problems logging in to Wikipedias. --jobu0101 (talk) 20:43, 20 January 2016 (UTC)

wgWikibaseItemId from history page

Hi. I would like to retrieve the Wikidata ID of a Wikipedia article from its revision history page. The command mw.config.get( 'wgWikibaseItemId' ) only works from the main article page (it returns null from the history page). Any advice or solution? Thanks in advance. --H4stings (talk) 09:00, 4 June 2016 (UTC)

This is not an API question. You might be able to reach the wikidata people by filing a task or posting to their mailing list. --Krenair (talkcontribs) 18:08, 4 June 2016 (UTC)
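For illustration only (this is not from the original exchange): the same item ID is also exposed through the action API's pageprops property, which does not depend on which page view you are looking at. A minimal Python sketch, assuming the requests library and the English Wikipedia endpoint, with "Car" as a placeholder title:

import requests

# Fetch the Wikidata item ID attached to an article via the pageprops property.
resp = requests.get("https://en.wikipedia.org/w/api.php", params={
    "action": "query",
    "prop": "pageprops",
    "ppprop": "wikibase_item",
    "titles": "Car",              # placeholder article title
    "format": "json",
    "formatversion": "2",
})
page = resp.json()["query"]["pages"][0]
print(page.get("pageprops", {}).get("wikibase_item"))  # prints the Q-number, or None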

List of all pages edited by a user

Hi, any suggestions for the following request? How to get:
- all page titles or page IDs (including or excluding older page versions) that a user X has edited so far,
- optionally limited to discussion pages?

I looked at and tried the API sandbox, but query list=allpages does not have the relevant attributes.

Thank you for your help and all the best, --Liuniao (talk) 07:08, 6 July 2016 (UTC)

You'll probably want to look at API:Usercontribs. To limit to only discussion pages, you would need to first know the namespace IDs for all the talk namespaces, which you can get from API:Siteinfo (e.g., [1]). For argument's sake, let's say you have no custom namespaces on your wiki and you want to see my talk page contributions. Your ultimate query would look like this:
<url to api.php on your wiki>?action=query&list=usercontribs&ucuser=RobinHood70&ucnamespace=1|3|5|7|9|11|13|15
If you wanted only the most recent edit on each page, then you'd add &ucshow=top to that (or use &uctoponly= if your wiki is 1.22 or older). Robin Hood  (talk) 18:52, 6 July 2016 (UTC)
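For illustration, a rough Python sketch of the query above, assuming the requests library; the endpoint is a placeholder for api.php on your wiki. It follows the standard continuation mechanism and collects each distinct title once:

import requests

API = "https://example.org/w/api.php"      # placeholder: api.php on your wiki
params = {
    "action": "query",
    "list": "usercontribs",
    "ucuser": "RobinHood70",
    "ucnamespace": "1|3|5|7|9|11|13|15",    # the talk namespaces on a default wiki
    "uclimit": "max",
    "format": "json",
}

titles = set()
while True:
    data = requests.get(API, params=params).json()
    for contrib in data["query"]["usercontribs"]:
        titles.add(contrib["title"])        # one entry per edit; the set removes duplicates
    if "continue" not in data:
        break
    params.update(data["continue"])         # follow the standard continuation parameters

print(sorted(titles))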

What is the maximum URL length I can use with the Wikipedia API?

Sometimes I get an HTTP 414 error (Request-URI Too Long) when I pass very long requests (with more than 50 page titles). I limit it roughly, but I'd like to know the exact limit. Does anyone know? Нирваньчик (talk) 00:20, 17 July 2016 (UTC)

There's no byte limit that I'm aware of in the MediaWiki software; the limit usually comes from the servers themselves. It's often set to 8k (8192 bytes), but different servers could be using different sizes. I would hope that all the MW servers are the same, whatever that might be, but there are no guarantees. The only way to be sure would be to track sizes that succeed or fail on whichever server(s) you're using. Unless, of course, you can get in touch with a server admin and ask them to find out directly.
You should also be aware that there is a limit built into MediaWiki on the number of titles you can put in a single query, and that limit is typically 50 titles unless you have higher permissions (typically bot or admin permissions), in which case it rises to 500. There are also a couple of special cases where the number is lower, but those are noted on the individual API pages. Whether it's the normal limit or a smaller one, though, you shouldn't get a 414 error from MediaWiki; it should just spit out a warning in the output. Robin Hood  (talk) 01:36, 17 July 2016 (UTC)
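For illustration, one way to avoid the 414 problem entirely is to send the query as a POST and split the title list into batches of at most 50 (the normal limit mentioned above). A rough Python sketch, assuming the requests library and the English Wikipedia endpoint, with a placeholder title list:

import requests

API = "https://en.wikipedia.org/w/api.php"
titles = ["Title %d" % i for i in range(1, 201)]   # placeholder title list

def batches(seq, size=50):
    # yield chunks of at most `size` titles (the normal per-query limit)
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

for chunk in batches(titles):
    # sending the query as a POST keeps the title list out of the URL entirely
    data = requests.post(API, data={
        "action": "query",
        "prop": "info",
        "titles": "|".join(chunk),
        "format": "json",
    }).json()
    print(len(data["query"]["pages"]), "pages returned in this batch")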

Question

What does API stand for?--Mr. Guye (talk) 23:27, 6 April 2017 (UTC)

Application Programming Interface. It's a generic computer term for how other programs can interact with yours. In this case, the API is how you can control a wiki to do things like editing pages, retrieving information, etc. Robin Hood  (talk) 04:27, 7 April 2017 (UTC)
See w:API. --Tacsipacsi (talk) 18:40, 7 April 2017 (UTC)

all pages that have new content or are deleted

I already asked this (but not very clearly). In order to create a fresh "pages-meta-current" dump to work with, one currently has to make many API requests just to get a list of recently changed pages. Is it possible to get all titles that have recent changes (after TS XXXX-XX-XXTXX:XX:XXZ) in one pass, i.e. a list of all pages that changed for any reason: edited, created, moved from another title, uploaded, imported from another project, merged with another title, or even deleted? The idea is to get all titles for which we will then request the newer revision, as well as all titles to delete from the current dump. The returned XML for each title should include at least:

  • type="newedit" or "deleted"
    • "deleted" will include:
      • pages deleted using the delete command
      • pages deleted because they were moved and the user chose not to leave a redirect
    • "newedit" will include:
      • new pages (which will include the page an uploaded file creates)
      • edited pages
      • imported pages
      • pages that were moved where the user chose to leave a redirect (the original title, not the new one)
      • pages that were merged and now have new content
  • ns=
  • title=
  • timestamp=

This would be a good API for getting a fresh "pages-meta-current" dump, and it would be very useful to most bots, since they could have a more recent dump to work with in fewer steps. --Xoristzatziki (talk) 10:23, 16 June 2017 (UTC)

Have you tried Grabbers? If you have a working MediaWiki database with an imported dump, grabNewText.php will update it to the current state. It uses API:RecentChanges and API:Logevents and handles moves, deletions, restores and uploads (updating the page and revision tables; it doesn't actually upload files). Imports may not be handled, but I can add that for you. --Ciencia Al Poder (talk) 08:58, 17 June 2017 (UTC)
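For illustration, a rough Python sketch of the two-query approach grabNewText.php is described as taking, assuming the requests library and the English Wikipedia endpoint, with a placeholder timestamp. It collects edited/created titles from list=recentchanges and deleted titles from list=logevents; moves, imports and merges would need additional logevents queries, and continuation is omitted for brevity:

import requests

API = "https://en.wikipedia.org/w/api.php"
SINCE = "2017-06-01T00:00:00Z"    # placeholder timestamp

# Titles edited or created since the timestamp.
changed = requests.get(API, params={
    "action": "query",
    "list": "recentchanges",
    "rctype": "edit|new",
    "rcprop": "title|timestamp",
    "rcstart": SINCE,
    "rcdir": "newer",
    "rclimit": "max",
    "format": "json",
}).json()["query"]["recentchanges"]

# Titles with delete-log entries since the timestamp; moves, imports and merges
# would need further queries with other letype values.
deleted = requests.get(API, params={
    "action": "query",
    "list": "logevents",
    "letype": "delete",
    "leprop": "title|type|timestamp",
    "lestart": SINCE,
    "ledir": "newer",
    "lelimit": "max",
    "format": "json",
}).json()["query"]["logevents"]

print(len(changed), "edited/created and", len(deleted), "deleted (first batch only)")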
Beyond the fact that I do not use an actual database, let alone a MediaWiki database, the point is (if, of course, this is easily done) that everyone should be able to get the changed titles (meaning edited in any way, new by any means, or deleted for any reason) from one single API, without needing yet another script (plus another bunch of parameters, plus RDBMS maintenance). Apart from that, thanks for the information about the existence of Grabbers; I must take a look at it in time. --Xoristzatziki (talk) 15:50, 18 June 2017 (UTC)

Getting history of titles for a page

I have old links (for example [[Car]] from 2014-03-03) and want to get the corresponding wikitext (as of 2014-03-03). It's simple to look up the revision history for the page currently titled [[Car]], but that is not what I am after; I want the revision history for the page which was titled Car on 2014-03-03. (The content of [[Car]] could have been moved to [[Vehicle]] on 2015-03-10, and a new [[Car]] could have been created, so the current history is useless.)

As far as I can tell, there is no way to do this short of "replaying" all log actions involving page names and hoping the replay doesn't diverge from reality? The database does not seem to retain the old page titles.

Is there any (other) way? Loralepr (talk) 19:55, 25 August 2017 (UTC)

You can read logs to see renamings. --wargo (talk) 09:41, 26 August 2017 (UTC)
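For illustration, a small Python sketch of reading the move log for one title through list=logevents, assuming the requests library and the English Wikipedia endpoint; as the reply below points out, this only covers renames, not deletions, restores or history merges:

import requests

API = "https://en.wikipedia.org/w/api.php"

# Move-log entries where a page titled "Car" was renamed.
events = requests.get(API, params={
    "action": "query",
    "list": "logevents",
    "letype": "move",
    "letitle": "Car",
    "leprop": "title|type|user|timestamp|details",
    "lelimit": "max",
    "format": "json",
}).json()["query"]["logevents"]

for ev in events:
    # the rename target is reported in the entry's details ("params" in the JSON)
    print(ev["timestamp"], ev["title"], "->", ev.get("params"))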
Yes, but it's not just renamings; it would require reapplying deletions, restores, page history merges, and possibly a few more things as well. On top of that, the log table only goes back to 2004, and there is no dump from that point in time to apply these log events to, so I would have to apply the log events backwards from the current state. To add to the fun, these old log events are missing the page ID they applied to; that was only added later. Loralepr (talk) 10:07, 26 August 2017 (UTC)

It seems you've already answered your own question. MediaWiki is a very complex tool, and there is certainly no way to account for all its edge cases, such as database corruption or someone fiddling directly with the database and thereby completely breaking the history of pages. There are even more problems (https://phabricator.wikimedia.org/T39591 , https://phabricator.wikimedia.org/T41007). For a naive scenario, looping through logevents will get the data in most cases.

There is certainly no API that can account for all such cases and actions related to the titles and moves of a page. That would require a separate analysis tool that goes through the database or dumps to recreate a new database, which could then be queried to get that data. Such a tool would also need to be aware of how the MediaWiki database schema has changed over time.

Even then, there will still be edge cases where the tool will not show the correct data. There are also hidden revisions that won't show up in the database, so it is truly impossible to get some revisions at a specific date. 10:57, 26 August 2017 (UTC)

Thanks, that is basically what I figured out; my main hope was that I had overlooked some API or some dump containing this data, but no luck there. I also had the idea of taking all old article dumps as (consistent?) snapshots, but they get very sparse very quickly (Archive.org: 2015: 9 dumps, 2014: 2 dumps, 2013: 0, 2012: 0, 2011: 1 dump). Loralepr (talk) 11:38, 26 August 2017 (UTC)