API talk:Client code

Is there an objective-C library? 74.202.39.3 19:05, 14 January 2013 (UTC)

= Client library evaluation =

As part of my OPW internship project, I'll be developing standards for MediaWiki-recommended client libraries. As part of this process I'm evaluating existing libraries in Python, Perl, Ruby, JavaScript, and Java. While this is in progress, I'll post my initial evaluations on this page. Fhocutt (talk) 20:22, 20 May 2014 (UTC)

Initial screening criteria

 * Mandatory
 ** Has it been updated in the last 12 months (i.e. since May 2013)?
 *** Listed these separately on API:Client code. Only further evaluating currently supported ones. Fhocutt (talk) 20:22, 20 May 2014 (UTC)
 ** Does it, at a minimum, handle logins/cookies/continuations? (Even "syntactic sugar" libraries should do these things.)
 * Warning signs
 ** Does it have a lot of open bugs/pull requests, especially compared to the number closed?
 ** Does it provide inadequate documentation, code samples, or tests?
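The continuation criterion is the one clients most often get wrong, so it is worth spelling out: the action API signals an incomplete result set with a `continue` object whose keys must be merged verbatim into the next request's parameters. A minimal sketch in Python (the helper name `merge_continue` is mine, not taken from any library under review):

```python
def merge_continue(params, response):
    """Build the parameter dict for the next request of a continued query.

    The MediaWiki action API marks an incomplete result with a 'continue'
    object; the client copies its keys into the previous request's
    parameters unchanged. No 'continue' key means the query is complete.
    """
    if "continue" not in response:
        return None  # query finished; nothing left to fetch
    next_params = dict(params)
    next_params.update(response["continue"])
    return next_params
```

A library that only resends the original parameters, or that hard-codes `apcontinue`-style keys for one module, will break on query modules that use different continuation fields.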

Ruby

 * mediawiki/ruby/api, https://gerrit.wikimedia.org/r/#/admin/projects/mediawiki/ruby/api - Ruby API client library in active development as of April 2014
 * wikipedia-client - Ruby framework using the API.
 * MediaWiki::Gateway - Ruby framework for the API. Maintained, tested up to MediaWiki 1.22, compatible with Wikimedia wikis. Has not been in active development since 2012 but docs say patches are welcome.


 * mediawiki/ruby/api comments:
 * Only one open issue that I could find (Bugzilla/Gerrit)
 * Does not yet cover most common API functions, but does appear to handle logins, tokens, and cookies
 * Has tests and one code sample, but next to no documentation


 * wikipedia-client comments:
 * Does not appear to handle continuations
 * 5 open/16 closed issues (3 issues are pull requests < 3 weeks old)
 * Has tests.
 * Has minimal documentation
 * Does not handle cookies or login, but offers abstraction and ease of use for common GET API calls

--Fhocutt (talk) 22:39, 20 May 2014 (UTC)
 * MediaWiki::Gateway comments:
 * Handles login, cookies, continuations, queries, editing/moving/protecting
 * Provides a selection of example scripts: https://github.com/jpatokal/mediawiki-gateway/tree/master/samples
 * 10 open/43 closed issues (open ones are mostly from 2011-2012)
 * Docs exist. No tests.
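Since "handles login and cookies" comes up for every library above, here is what it amounts to at the HTTP level: a two-step token handshake, with cookies carried across both requests. A sketch in Python purely for illustration (the function name is mine; `session` can be a requests.Session, which persists the cookies, or any object with the same get/post interface):

```python
def api_login(session, api_url, username, password):
    """Log in via the action API's two-step token handshake.

    Step 1 fetches a login token (meta=tokens, type=login); step 2 posts
    it back with the credentials via action=login. The session object
    must carry cookies between the two requests for the login to stick.
    """
    reply = session.get(api_url, params={
        "action": "query", "meta": "tokens",
        "type": "login", "format": "json",
    })
    token = reply.json()["query"]["tokens"]["logintoken"]
    reply = session.post(api_url, data={
        "action": "login", "lgname": username,
        "lgpassword": password, "lgtoken": token,
        "format": "json",
    })
    return reply.json()["login"]["result"] == "Success"
```

With a real session the same cookies then authenticate later requests; note that modern wikis expect a bot password or OAuth rather than the account's main password.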

Python

 * Pywikibot - A collection of Python scripts. Seems up to date as of Nov 2013 (per IRC)
 * [//github.com/mwclient/mwclient mwclient] - A Python library that makes most of the API functions accessible. ([//pypi.python.org/pypi/mwclient/ PyPI])
 * wikitools - Provides several layers of abstraction around the API. Should be up to date ([//pypi.python.org/pypi/wikitools PyPI])
 * Wikipedia - A Python library that makes it easy to access and parse data from Wikipedia. (PyPI)
 * [//github.com/ianweller/python-simplemediawiki simplemediawiki] - A simple, no-abstraction interface to the API. Handles cookies and other extremely basic things. Python 2.6+ and 3.3+ compatible. ([//pypi.python.org/pypi/simplemediawiki PyPI])
 * [//github.com/legoktm/supersimplemediawiki supersimplemediawiki] - Similar to simplemediawiki, but does not handle tokens or compression.
 * Pattern - a web mining module with classes for handling MediaWiki API requests; handles continuations
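To ground the comparison, a complete no-abstraction query loop (roughly the niche simplemediawiki and supersimplemediawiki occupy) is only a few lines. A Python sketch; the function name and the injectable `fetch` callable are my own illustration, not any of these libraries' APIs:

```python
import json
import urllib.parse
import urllib.request

def api_query(params, fetch=None):
    """Yield each JSON page of an action=query, following 'continue'.

    `fetch` maps a parameter dict to a decoded JSON response; the default
    GETs the English Wikipedia API. Passing a fake fetch makes the loop
    testable without network access.
    """
    if fetch is None:
        def fetch(p):
            url = ("https://en.wikipedia.org/w/api.php?"
                   + urllib.parse.urlencode(p))
            with urllib.request.urlopen(url) as response:
                return json.load(response)
    params = dict(params, action="query", format="json")
    while True:
        data = fetch(params)
        yield data
        if "continue" not in data:
            return
        params = dict(params)
        params.update(data["continue"])
```

Everything beyond this loop (edit tokens, rate limiting, error retries) is where the higher-abstraction libraries above differentiate themselves.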

Perl

 * MediaWiki::Bot - A higher-level Perl module with read and write functions. Easily extensible with plugins, for example to provide administrator functions. Updated Jan 2014.
 * Documentation Wikibook (very out of date, bug filed: https://github.com/MediaWiki-Bot/MediaWiki-Bot/issues/56)
 * source code on Github
 * Google Code project (updated 2011)
 * Client scripts (updated 2010)

(Fhocutt (talk) 21:31, 20 May 2014 (UTC))
 * MediaWiki::Bot comments:
 * The most recent release is on CPAN; the most recent development version is on GitHub.
 * Handles logins and cookies; supports continuations for image search but only searches up to max limits for the other methods.
 * Has tests, has documentation, has code samples, has 13 open/43 closed issues. Most of the open issues are c. 2011.

JavaScript

 * https://github.com/macbre/nodemw - Node.js client, actively maintained as of May 2014.
 * mediawiki.api.js - A module that ships with MediaWiki core; abstracts many API calls into simple one-liners (uses jQuery.ajax internally).
 * jQuery.ajax - Not specifically made for the MediaWiki API, but most API queries are very simple, taking only one or two lines.
 * mediawiki-js (npm) - An ultra-light, vanilla JavaScript wrapper for the MediaWiki API, for use in the browser
 * Node.js MediaWiki module - A JavaScript framework of standard requests (e.g. log in, log out, read, edit, etc.) as well as a general wrapper method. Includes some helpful stuff like throttling.
 * Reactive Extensions for JavaScript - also not specifically made for the MediaWiki API but supports throttling and MediaWiki API calls
 * WikiJS - a simple Node.js library that serves as an interface to MediaWiki

Java

 * Bliki Engine - Java Wikipedia API - very complete. Can convert wikicode to HTML, DocBook or PDF. Has a helper library for API calls.
 * JavaWikiBotFramework - a Java library that makes almost all API functions accessible. On GitHub: https://github.com/eldur/jwbf.
 * Wiki.java — a simple one-class API implementation
 * WPCleaner — a Java editing tool that includes a package for MediaWiki API.


 * Bliki/MediaWikiAPISupport comments:


 * JavaWikiBotFramework comments:
 * Appears to be fairly full-featured; definitely handles login, cookies, continuations
 * 2 open issues, 11 closed, no pending pull requests
 * Has documentation, some code samples, tests


 * Wiki.java comments:
 * Handles login, cookies, continuations, variety of GET/POST requests
 * Has documentation (JavaDoc and extended documentation), code sample, and tests
 * In active development as of May 2014; 6 open issues (most from 2010), 47 closed.


 * WPCleaner comments:
 * WPCleaner is a stand-alone tool with a package that handles the MediaWiki API. It has tests and a JavaDoc but it's not clear to me what that package specifically does. -Fhocutt (talk) 00:10, 21 May 2014 (UTC)
 * Issue reporting is on the talk pages, which get responses