API talk:Login

Login / lg
How is the password verified by the server? How is the password sent? Plaintext / encrypted? What encryption is used?

Perhaps someone more knowledgeable can fill in this table. Thanks a lot in advance, Shinobu 04:49, 20 December 2006 (UTC)


 * From what I can tell, HTTPS is not supported. Also, since the authentication is form-based, the only option for sending the password is plain text. HTTP itself supports Basic and Digest authentication as standard schemes (signalled via status code 401), although Digest is not used all that often. Basic is effectively plain text; Digest sends an MD5 digest of the password, so it is effectively encrypted. All of this is moot because MediaWiki doesn't use either.
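 * To illustrate why Basic is effectively plain text, here is a Python sketch (the credentials are the classic RFC example pair, nothing MediaWiki-specific): the header is just base64, which anyone sniffing the connection can decode.

```python
import base64

# HTTP Basic authentication: the credentials are only base64-encoded,
# never encrypted, so they are trivially reversible.
def basic_auth_header(username: str, password: str) -> str:
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Authorization: Basic {token}"

print(basic_auth_header("Aladdin", "open sesame"))
```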


 * The problem is that if anything but plaintext is received by the server, you are limited in how the passwords can be stored in the database. For instance, if you use Digest authentication based on MD5, the stored version of the password has to be based on the MD5 hash of the password, since the server can't check against another hashing/crypting scheme without reprocessing the original password. In the case of MediaWiki, this would actually work, since its database value is based on an MD5 hash of the plain text, but I'm not sure that there is consistent browser support for Digest authentication. The other issue is that form-based authentication allows you to have a pretty HTML form for posting and lets you implement things like "Remember me", which HTTP authentication doesn't support (regardless of whether you use Basic or Digest). Hope that helps. Mike Dillon 16:38, 22 December 2006 (UTC)


 * What I said about Digest authentication is a little inaccurate; since it isn't relevant to what MediaWiki does, you can just ignore my correction if you aren't interested.


 * The real problem is that without access to a plaintext version of the password on the server, the server can't reproduce the expected response in the challenge/response protocol. This is because the response is based on the MD5 hash of not just the password, but other information as well. Since MediaWiki only stores the MD5 hash of the password, it can't generate the expected hash when the other components of the HA1 hash are factored in (see Digest authentication for more details).
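 * For the curious, the computation described above can be sketched in Python (a simplified RFC 2617 Digest without qop; this is what a server would need to reproduce, not anything MediaWiki implements):

```python
import hashlib

def md5_hex(s: str) -> str:
    return hashlib.md5(s.encode()).hexdigest()

# Simplified RFC 2617 Digest (no qop). The server must reproduce this
# value, which requires HA1 = MD5(username:realm:password); storing
# only MD5(password), as MediaWiki does, is not enough.
def digest_response(username, realm, password, method, uri, nonce):
    ha1 = md5_hex(f"{username}:{realm}:{password}")  # needs the plaintext password
    ha2 = md5_hex(f"{method}:{uri}")
    return md5_hex(f"{ha1}:{nonce}:{ha2}")
```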


 * There are further complications if MediaWiki has been configured with the  setting, since it actually double-hashes to get the value that's stored in the database in that case. Since MD5 is a one-way hash, there is no way for the server to get back to the original password. All it can do is verify that what the user sent as the password can be re-hashed to the same value. Mike Dillon 04:42, 24 December 2006 (UTC)
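 * The verify-by-rehashing idea can be sketched like this (the "salt-hash" layout below is an assumption for illustration, not necessarily the exact format MediaWiki stores):

```python
import hashlib

def md5_hex(s: str) -> str:
    return hashlib.md5(s.encode()).hexdigest()

# Sketch of the double-hashing idea described above. The salt format
# here is an assumption, not necessarily MediaWiki's exact layout.
def salted_hash(password: str, salt: str) -> str:
    return md5_hex(salt + "-" + md5_hex(password))

def verify(candidate: str, salt: str, stored: str) -> bool:
    # MD5 is one-way: the server can only re-hash what the user sent
    # and compare; it can never recover the original password.
    return salted_hash(candidate, salt) == stored

stored = salted_hash("hunter2", "1234")
```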

So in summary, the password is always sent in plaintext?

@From what I can tell, HTTPS is not supported: I think HTTPS is supported: Main Page (https), so I should think that only URLs are passed unencrypted, right? POST data and responses should be encrypted. Unfortunately, it doesn't seem to know that I already signed in over HTTP, so apparently when you use the HTTPS login you have to do everything over HTTPS. The question is: is the API supported over HTTPS?

@form-based, the only option for sending the password is plain text: One could calculate the MD5 hash using JavaScript, or one could use https. Considering I've already sent my password in plaintext, it should at least be possible to verify and/or change it without doing so again. Sending the password in plaintext again and again seems even less wise than doing it once.

I'm pretty new at this, but sending passwords in plaintext seems pretty dumb to a layman. Not warning about it even more so. Shinobu 08:03, 24 December 2006 (UTC)


 * I didn't know about secure.wikimedia.org. I'm not sure that you can log in to that server and get a cookie that will work on the non-HTTPS version, though. I just tried to use: https://secure.wikimedia.org/wikipedia/en/w/api.php?action=login&format=xml&lgname=Username&lgpassword=Password&lgdomain=en.wikipedia.org and I got a cookie for "secure.wikimedia.org", which means that all traffic must go through that domain instead of en.wikipedia.org. Also, since the server sets the "secure" flag on the cookie, normal HTTP libraries won't send it to plain HTTP URLs on secure.wikimedia.org, so you have to take the SSL overhead on every request to stay logged in. If you aren't trying to use the cookies returned, you should be able to use the lgtoken value returned in the API response against en.wikipedia.org, but I haven't tested that. I didn't look into what is done on the server side with the login token.


 * As for what is encrypted when using HTTPS, everything is encrypted, including the URL. All that can be seen by sniffing is that you're sending data to the server's IP address using SSL. You can't see any headers or the HTTP request body or URL, since they're part of the SSL payload.


 * Regarding the use of JavaScript to send an MD5 hash, it could be done, but the MediaWiki software would have to support it and you'd have to find JavaScript code to do the hashing.


 * I'm not sure what you mean about sending the password again and again. It should only be sent to the login action. After that, you either use the cookies returned or the login token in the API response to send what is effectively a session id, not your password. Mike Dillon 20:31, 24 December 2006 (UTC)
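 * The flow just described can be sketched in Python (parameter names follow this API; the credentials are placeholders and the actual network calls are commented out): the password is sent exactly once, and the cookie jar carries the session afterwards.

```python
import urllib.parse
import urllib.request
from http.cookiejar import CookieJar

# The password goes over the wire once, in the action=login request;
# afterwards the session cookie kept in the jar (or the lgtoken)
# identifies you. Credentials below are placeholders.
API = "https://en.wikipedia.org/w/api.php"

jar = CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

login_body = urllib.parse.urlencode({
    "action": "login", "format": "xml",
    "lgname": "Username", "lgpassword": "Password",
}).encode()

# opener.open(API, login_body)  # log in once; session cookie lands in `jar`
# opener.open(API + "?action=query&list=watchlist&format=xml")  # cookie sent automatically
```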

Regarding encryption on secure.wikimedia.org, I just saw this post on English Wikipedia's technical village pump. The poster claims that the "null" cipher is possibly being used for the HTTPS server, in which case even that traffic would not be encrypted. I just checked: the browser said it was using AES 256 and the stream looked encrypted in a packet capture, so this is mainly just an FYI. I'm not sure if there is any documentation on the cipher used by the HTTPS server. Mike Dillon 00:23, 27 December 2006 (UTC)


 * There have been some additions to the aforementioned discussion from Brion VIBBER. The null encryption thing is not true, but he mentions some other quirks of the secure.wikimedia.org server. See en:Wikipedia:Village pump (technical). Mike Dillon 15:33, 27 December 2006 (UTC)

@you'd have to find JavaScript code to do the hashing: There is a module to do this: w:User:Lupin/md5-2.2alpha.js. One should be careful about what to hash, though, because otherwise the hash itself could be sniffed and replayed, defeating the purpose of sending a hash.

As much as I understand this to be a complicated issue, I still think it needs to be fixed. Shinobu 19:06, 28 December 2006 (UTC)


 * I just tried the following:
 * Log in using API to secure.wikimedia.org
 * Note the lgusername, lgtoken, and lguserid fields in the response
 * Request a watchlist against en.wikipedia.org using those values
 * Unfortunately, I received . As I said before, I haven't looked into how the tokens work to see what the problem might be, but I would think this should work. Of course, at that point your session can be hijacked since you're sending the token over the wire unencrypted, but there's really no way to avoid this except for using SSL for every request. At least nobody can steal your password...


 * I also noticed that I get the same token back for subsequent login requests. Mike Dillon 22:13, 30 December 2006 (UTC)


 * I am also trying the exact same thing as Mike Dillon said above. I've gone through the same steps with the same lack of result. I'm passing lgusername, lgtoken and lguserid in the POST, and verifying this in my app before it goes out. I'm extremely new to the API, but getting the hang of it rapidly, except for this hiccup. Any progress here? Eddieroger 05:58, 13 February 2007 (UTC)


 * Hope this helps: an example of Atom authentication (the only remaining problem is that passwords are kept in the clear by the server, but the transaction is secure). I don't know if there is an updated version. w:fr:Leafcat

I hate reopening old issues, but I still can't get login working without using cookies. I am able to login with result code of "success," but when I pass those parameters back in to the API along with my request, I still get wlnotloggedin errors. I'm testing this against en.wikipedia.org as a normal non-bot user. Can anyone confirm that it works at all? Eddieroger 23:16, 4 August 2007 (UTC)


 * Same problem (in PHP):

 $wiki = 'http://en.wikipedia.org/';
 $url = $wiki . 'w/api.php?action=login&lgname=' . urlencode($username) . '&lgpassword=' . urlencode($password) . '&format=xml';
 $http = http_parse_message(http_get($url))->body; /* Works */
 $xml = simplexml_load_string($http);
 /* The token parameters must form one continuous query string, with no line break inside it */
 $login = "&lgtoken={$xml->login['lgtoken']}&lgusername={$xml->login['lgusername']}&lguserid={$xml->login['lguserid']}";
 $url = $wiki . 'w/api.php?action=query&list=watchlist' . $login;
 $watchlist = http_parse_message(http_get($url))->body; /* */

 * Carpetsmoker 06:02, 27 August 2007 (UTC)

It seems I am having the same problem. When trying to issue a recentChanges query from a PHP program, I get very different results than when I log in with a browser and issue the same query from the browser window. There seem to be problems with login and token passing in the URL query string, but it works fine if I use cookies. Formatting cookies in PHP is nowhere near as easy as formatting a query string :-( 128.221.197.20 14:30, 11 September 2007 (UTC)
 * Try Snoopy, a PHP class that makes this stuff easier. --Catrope 15:05, 11 September 2007 (UTC)
 * I believe the secure and insecure servers require separate logins.

POST?
Am I getting it right that currently login works only via GET and you have to disclose your login in the URL? MaxSem 10:47, 21 November 2007 (UTC)
 * No. Both GET and POST work just fine for all API modules. --Catrope 13:43, 21 November 2007 (UTC)
 * No, certain API modules such as login require that you use POST.

User-Agent
Hello. For a few days now I've had a problem connecting my robot to the API. Checking the code, I see that I receive "Scripts should use an informative User-Agent string with contact information, or they may be IP-blocked without notice.". But the error is not embedded in XML, so everything crashes. Is there a way to embed this message in the XML?

My second problem is that I don't know what "informative User-Agent string with contact information" means. I really don't know where I can ask for help; maybe it is outside the topic of this page. I don't know. 82.246.189.168 20:56, 18 February 2010 (UTC)
 * See User-Agent_policy. Tisane 12:04, 6 May 2010 (UTC)
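 * As an illustration of what the policy asks for, here is a Python sketch; the bot name, URL, and e-mail address are placeholders, so substitute your own:

```python
import urllib.request

# An "informative User-Agent string with contact information" means a
# descriptive client name and version plus a URL or e-mail where the
# operator can be reached. The values below are placeholders.
req = urllib.request.Request(
    "https://en.wikipedia.org/w/api.php?action=query&meta=siteinfo&format=xml",
    headers={"User-Agent": "ExampleBot/1.0 (https://example.org/bot; bot-owner@example.org)"},
)
```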

New problem with API:login
Hello, I've been using api.php for a while now for a set of tools around my bots (on fr.wikipedia.org). Today I have a problem logging in (maybe it existed before, but my token had expired, so only now does my program really attempt a login). When I try to log in, I get a "NeedToken" reply with an associated token. I don't understand why, because this part of the code worked for months without any problem. I also tried adding lgtoken=faab82e4158d864b9f06304c9cb870ec (in this example) to the POST data (with the lgname and lgpassword), with exactly the same effect: it still replies "NeedToken" and gives a new token value.

Any clues to help me? If it helps, this part of the code is an independent shell script (Linux) using curl, code that has been in use for at least 3 months by now.

Regards, Hexasoft 09:57, 9 April 2010 (UTC)
 * Details: login action is performed this way:
 * (-d option is the POSTed data)
 * I also tried passing "lgname=Hexabot&lgpassword=mypassword&lgtoken=the_token_in_previous_reply" without success.
 * Hexasoft 10:02, 9 April 2010 (UTC)


 * Hi, the login procedure changed a few days ago, and every tool needs to be modified to work with it. The new procedure is described here. I did it for my own tool without any problem. --NicoV 11:21, 10 April 2010 (UTC)
 * Ok! Thanks. I applied this method and my login now works. Regards, Hexasoft 11:49, 10 April 2010 (UTC)

I experienced the same problem with my bot on Wikia after they updated the software. Unfortunately, when I try to get the token in the first request I only receive the token itself but not the cookieprefix or sessionid for the second request (which also returns a NeedToken error). I checked the HTTP headers and they don't send a cookie, either. It would be great if someone who fixed the problem for his bot could tell me how he got the sessionid from the response. Have a nice weekend, 24.182.167.38 20:43, 17 April 2010 (UTC)
 * Send a first login request (with lgpassword and lgname) to receive the token, then send a second login request (with lgpassword, lgname and lgtoken).
 * http://fr.wikipedia.org/w/api.php?lgpassword=XXXXX&action=login&lgname=ZZZZ&format=xml
 * http://fr.wikipedia.org/w/api.php?lgtoken=YYYYY&lgpassword=XXXXX&action=login&lgname=ZZZZ&format=xml
 * --NicoV 15:49, 18 April 2010 (UTC)
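 * The two requests above can be sketched in Python (the helper name is made up; the token value reuses the example from earlier in this thread; remember that the session cookies from the first response must be sent back with the second request):

```python
import urllib.parse
import xml.etree.ElementTree as ET

# Given the XML from the first login request (result="NeedToken"),
# build the body for the second request: same credentials plus lgtoken.
def second_request_body(first_response_xml: str, name: str, password: str) -> str:
    login = ET.fromstring(first_response_xml).find("login")
    assert login.get("result") == "NeedToken"
    return urllib.parse.urlencode({
        "action": "login", "format": "xml",
        "lgname": name, "lgpassword": password,
        "lgtoken": login.get("token"),
    })

sample = '<api><login result="NeedToken" token="faab82e4158d864b9f06304c9cb870ec"/></api>'
body = second_request_body(sample, "Hexabot", "secret")
```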
 * Thank you for your fast response; I fixed my mistake. In case anyone else has the same problem, my script is written in JavaScript and AJAX and I was using a PHP proxy to relay the request to the wiki server. The wiki server returned the correct response including the HTTP cookies back to the proxy but my proxy only returned the XML part of the response, not the session cookies. Therefore, on my computer I did not see any cookies when I was looking at the HTTP headers. I had to change the proxy script so that the cookie would be returned as well. Thank you again for your quick answer. --24.182.167.38 19:17, 18 April 2010 (UTC)

I am doing the same thing. I make a second call with the lgtoken value set. How do you pass the cookie in PHP? What I did was set CURLOPT_HEADER to TRUE so that I would get the headers back, then parse them for the session id, and then do this in the second API call:

curl_setopt ($curl_obj2, CURLOPT_COOKIE, 'wiki_session='.$sessionId);

But it's very unpredictable. It works sometimes and not others. Is there some way to do this without using Snoopy? I'm working in a corporate environment and use of open source isn't encouraged unless it's really well established. It's a big thing that I'm using MediaWiki at all. --Jogjayr 23:13, 8 June 2010 (UTC)
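 * An alternative to hand-parsing the raw header is to let a standard cookie parser extract the session id; a Python sketch for illustration, using the wiki_session cookie name from the snippet above:

```python
from http.cookies import SimpleCookie

# Parse a Set-Cookie header value with the standard library instead of
# string-munging; "wiki_session" is the cookie name used above.
cookie = SimpleCookie()
cookie.load("wiki_session=abc123; path=/; HttpOnly")
session_id = cookie["wiki_session"].value
```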

Throttling setting
How does one change the throttling setting to something other than 5 logins per 300 seconds? Thanks, Tisane 13:20, 6 May 2010 (UTC)

Take a look here: Manual:$wgPasswordAttemptThrottle JRWR / wiki@jrwr.co.cc
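For reference, in MediaWiki versions of this era the setting is a count/seconds pair in LocalSettings.php; a sketch with illustrative values (check the manual page above for the exact format in your version):

```php
// LocalSettings.php -- example values, not a recommendation:
// allow at most 10 login attempts per 600 seconds.
$wgPasswordAttemptThrottle = array( 'count' => 10, 'seconds' => 600 );
```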

is there a possibility to...
...check if someone is online (logged in) or not? And is it possible to add this to a wiki running MediaWiki 1.15.1? --80.144.42.121 12:18, 11 August 2010 (UTC)
 * Extension:WhosOnline. Max Semenik 12:51, 11 August 2010 (UTC)
 * Yeah... I know this function, but WhosOnline works with edits. I'm thinking of something that works with the Log in/out button... --80.144.42.121 13:13, 11 August 2010 (UTC)

Automate Browser Login
Is it possible to use this API to login on behalf of the browser?

I have successfully performed a login and login confirmation, but that just allows my server to communicate with the Wiki via the API calls.

What I really wanted to do was to log in on behalf of the user (my web site was going to store their mediawiki username and password, with their permission), then allow him or her to continue browsing. However, as my script can't set the username, userid, session and token cookies for a different domain, I'm beginning to think this can't be done.

Any suggestions?

Thanks
 * I don't think you can do that very easily. (You could probably do something by forcing your users at site A to load an image from your MediaWiki site that sets the cookie based on information received from site A's server, like how CentralAuth does multi-wiki logins, but that would probably require you to write an auth extension.) Bawolff 22:12, 19 October 2010 (UTC)

logging in with FauxRequest
Hello, here is a newbie question: I'm trying to upload an image to my own MediaWiki installation from an extension, and I need to log in to do that via the API. However:

 $request = new FauxRequest( array(
     'action' => 'login',
     'lgname=' => 'mylogin',
     'lgpassword' => 'mypassword',
 ) );
 $api = new ApiMain( $request );
 $api->execute;

...returns as result only 'NoName', supposedly meaning that lgname is not defined. What am I missing?

rotsee 13:11, 17 April 2011 (UTC)