Release status: beta
|Implementation||Parser function, User activity|
|Description||Allows a list of currently active users to be embedded into a page using a template|
|Author(s)||Aran Dunkley (Nad)|
|Latest version||1.0.10 (2010-12-14)|
|MediaWiki||broken in 1.20.2|
|License||GNU Lesser General Public License|
The CurrentUsers extension allows a list of currently active users to be embedded into a page using a template. The list contains an entry for each logged-in user who has accessed the wiki within an expiry period. The last two list items show the number of anonymous users and bots who have accessed the wiki within the expiry period.
Each user's last access time, along with their name or IP address, is recorded in a file called CurrentUsers.txt, which resides in the same directory as the main CurrentUsers.php script.
Note that currently the CurrentUsers.txt file must exist for the extension to work.
Create a directory called CurrentUsers in your wiki's extensions directory. Then copy the source code from OrganicDesign:Extension:CurrentUsers.php and save it into the newly created directory as a file called CurrentUsers.php. Also create an empty file there called CurrentUsers.txt.
Note: The permissions of this file need to be set so that it is writable by the web server.
Add an include statement into your LocalSettings.php file to include the downloaded CurrentUsers.php file as usual, for example:
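The exact line depends on where you installed the extension; assuming the directory layout described above, a typical include would be:

```php
// In LocalSettings.php, after the main settings
// ($IP is MediaWiki's install path, set by MediaWiki itself):
require_once( "$IP/extensions/CurrentUsers/CurrentUsers.php" );
```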
Create a template article called Template:CurrentUsers which will define the layout of the list items. The first parameter is the time, the second is the username and the third is the number of guests (which is only used on the last line). Here's the one we use on our site, which lists the items as a bullet list with the names linking to the associated user pages.
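As an illustration only (not necessarily the template used on that site), a minimal template matching the description above could look like this, with `{{{3|}}}` defaulting to empty so the guest count only appears when the third parameter is passed:

```wikitext
* {{{1}}}: [[User:{{{2}}}|{{{2}}}]] {{{3|}}}
```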
To add the list to an article, use the following parser function syntax:
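The default parser-function name is 'currentusers' (see the configuration variables below), so the simplest call is presumably:

```wikitext
{{#currentusers:}}
```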
Currently the bot distinction is based on which clients have requested the robots.txt file, and it only works if you have friendly URLs enabled, so that the request gets treated as an article title request.
Most of the robots out there do not obey the robots exclusion standard, and so either never read the file to see what its rules say, or read it only rarely for any particular site. There are many databases of user-agent strings which could be downloaded periodically, and some new web services are cropping up which allow dynamic querying of user-agent strings. Incorporating one of these would make the distinction between guests and robots more objective. The problem can never really be solved, though, because many bots purposely try to look identical to a normal browser.
Here are some global variables which affect the operation of the extension. These should be set in your LocalSettings.php file after the include of the script.
|$egCurrentUsersMagic||'currentusers'||The default parser-function name|
|$egCurrentUsersTemplate||'CurrentUsers'||The template article to use for formatting each entry in the list|
|$egCurrentUsersTimeout||60||The number of minutes of inactivity after which a user or guest is removed from the internal cache file|
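For example, to shorten the inactivity timeout (the variable name is from the table above; the value of 30 is illustrative):

```php
// In LocalSettings.php, after including CurrentUsers.php:
require_once( "$IP/extensions/CurrentUsers/CurrentUsers.php" );
$egCurrentUsersTimeout = 30; // drop users from the list after 30 minutes of inactivity
```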
- Extension:WhosOnline - a similar extension which updates the database instead of a file