Also read the quick start guide. It answers some questions not answered here and points to other useful pages.
- Read this FAQ.
- Try to find the answer to your question in the API documentation here, in the API sandbox, or on the self-documenting API home page.
- If you can't find the answer on the web, ask on the mediawiki-api mailing list.
There is no hard-and-fast limit on read requests, but we ask that you be considerate and try not to take a site down. Most system administrators reserve the right to block you unceremoniously if you endanger the stability of their site.
If you make your requests in series rather than in parallel (i.e. wait for one request to finish before sending the next, so that you are never making more than one request at a time), then you should definitely be fine. Also try to combine things into a single request. For example: specify multiple '|'-separated titles in the titles parameter instead of making a new request for each title; use a generator instead of making a request for each result from another request.
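As a sketch of the combined-request pattern, the following builds one query URL covering three titles at once (the endpoint and page titles are placeholders; any MediaWiki installation's api.php works the same way):

```python
from urllib.parse import urlencode

# Placeholder endpoint; substitute the wiki you are actually querying.
API = "https://en.wikipedia.org/w/api.php"

# One request covering three pages instead of three separate requests.
params = {
    "action": "query",
    "prop": "info",
    "titles": "|".join(["Foo", "Bar", "Baz"]),  # '|'-separated titles
    "format": "json",
}
url = API + "?" + urlencode(params)
print(url)
```

The same idea applies to any multi-value parameter: batch the values into one call rather than looping over single-value requests.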
Parsing of revisions
While it is possible to query for results from a specific revision number, this is an expensive operation for the servers. To retrieve a specific revision, use the 'oldid' parameter.
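For instance, a request for the parsed content of one specific revision might look like this (the revision ID 123456 and the endpoint are placeholders):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"  # placeholder endpoint

# Fetch one specific revision by its ID via the 'oldid' parameter.
params = {"action": "parse", "oldid": 123456, "format": "json"}
url = API + "?" + urlencode(params)
print(url)
```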
Use maxlag parameter
If your task is not interactive (i.e. a user is not waiting for the result), you should use the maxlag parameter. This prevents your task from running when the load on the servers is high. Higher values mean more aggressive behavior; lower values are nicer.
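A minimal sketch of a non-interactive request using maxlag=5, a commonly used value (the endpoint and title are placeholders). When replication lag exceeds this value, the servers return a 'maxlag' error instead of a result, and the client should sleep and retry:

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"  # placeholder endpoint

def build_query(titles, maxlag=5):
    """Build a query URL that asks the servers to refuse the request
    when replication lag exceeds `maxlag` seconds."""
    params = {
        "action": "query",
        "titles": "|".join(titles),
        "maxlag": maxlag,
        "format": "json",
    }
    return API + "?" + urlencode(params)

url = build_query(["Foo"])
print(url)
```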
Use a descriptive User-Agent header
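As a sketch, assuming the common convention of including a tool name, version, and contact information (all placeholders here), a descriptive User-Agent can be set like this:

```python
from urllib.request import Request

# Placeholder tool name, version, URL, and contact address.
USER_AGENT = "MyWikiBot/1.0 (https://example.org/mybot; bot@example.org)"

req = Request(
    "https://en.wikipedia.org/w/api.php?action=query&format=json",
    headers={"User-Agent": USER_AGENT},  # identifies your client to the site
)
print(req.get_header("User-agent"))
```

A distinctive User-Agent lets system administrators contact you about a misbehaving client instead of simply blocking it.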
Thinking about performance generally
If you are getting results more slowly than you would like, see the Performance guidelines to help you think about performance generally. If you find that reading via the API, rather than directly from the databases, is limiting your client's performance, consider running it on Wikimedia's Toolforge.