Talk:How does MediaWiki work?

To learn how MediaWiki works, try out your skills in the Sandbox.

Edits
Since editing is off, please link "farm" to wiki farm. I'd like to define this term somehow, and I'm beginning the research. Anyone is free to remove this note once the link's in. -- Sysy / (talk) 21:57, 9 May 2006 (UTC)


 * A server farm is different from a wiki farm. A server farm is a group of servers distributing the same content (thus reducing load and increasing speed) whilst a wiki farm is a set of wiki installations on the same server, which are in some way linked, either by using a linked database, sharing uploaded files, sharing a single login or some other feature along those lines.  --HappyDog 22:22, 9 May 2006 (UTC)


 * So what do Wikimedia run? A wiki farm and a server farm? A wiki farm is one or more wikis linked in some manner, sharing configuration, common code, or at the other end, users and media (and even content). robchurch | talk 00:18, 31 May 2006 (UTC)


 * Exactly - they run both: 'More than one' wikis that share common data, whose load is then distributed over 'more than one' servers that are connected to the internet.  A wiki farm can exist on a single server, and a server farm can host a single wiki.  In WM's case, both are true. --HappyDog 00:59, 31 May 2006 (UTC)

Offline browser?
Is there an offline browser I can run on my laptop while traveling, so that I can submit the edits I made in transit to the server once I get to my destination and back online? Pce3@ij.net 15:46, 31 May 2006 (UTC)
 * Install a server (Apache or whatever you like) on your offline laptop, as well as PHP and MySQL if you want to do debugging. Use your normal browser, but point it to "localhost" rather than a domain (e.g., http://localhost/test_wiki/Main_Page ). --63.249.44.12 04:50, 13 June 2006 (UTC)


 * None that I'm aware of. A standard browser certainly doesn't provide the necessary functionality: each of your edits would be based on potentially outdated pages, and the software would need to flag such differences before completing the submission (otherwise other people's work would be overwritten unintentionally). The best thing I can describe currently would be to take a copy of the raw pages on your laptop and edit them directly. Once you're back online, check that no one else has edited the page since you took your copy, and incorporate any differences into your local copy. Once you're sure a particular page is up to date with everyone's edits (including your own), edit the page and paste it back. Not ideal under any circumstances. --Barthax 09:57, 13 June 2006 (UTC)
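The snapshot-and-merge routine described above can be sketched with standard diffutils tools. The filenames here are hypothetical, and fetching the current server version is left as a manual step:

```shell
# Before going offline: keep a pristine snapshot next to your working copy.
cp Main_Page.wiki base.wiki    # the page exactly as downloaded
cp Main_Page.wiki mine.wiki    # the copy you edit while in transit

# ... edit mine.wiki offline ...

# Back online: save the current server version as theirs.wiki, then do a
# three-way merge of your edits against whatever changed while you were away.
diff3 -m mine.wiki base.wiki theirs.wiki > merged.wiki
# Overlapping edits are marked with <<<<<<< / >>>>>>> blocks for manual review.
```

If no one touched the same lines, `diff3 -m` merges both sets of changes cleanly; otherwise the conflict markers show exactly where a human decision is needed before pasting the result back into the wiki.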

Help!
I do not know how to open the tar.gz file I just downloaded. What other files do I need? --Joseph Staleknight
 * You must unzip the file with, for example, en:7-Zip (the tar.gz file is a package which contains all the files), put the contents in a directory on your server, run index.php and follow the instructions. See also meta:Help:Installation. ~ Seb35 16:45, 1 July 2006 (UTC)
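On a Unix-style server you can skip 7-Zip and unpack the archive with tar directly. The filename below is hypothetical; substitute whichever release you downloaded:

```shell
# -x extract, -z gunzip, -f from this file; unpacks into mediawiki-1.6.7/
tar -xzf mediawiki-1.6.7.tar.gz
ls mediawiki-1.6.7/index.php    # the script you then open in your browser
```

Move (or symlink) the extracted directory into your web server's document root before browsing to index.php.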

SVG
Please replace the bitmap picture of servers with Image:Wikimedia Servers.svg. Dake 15:24, 22 February 2007 (UTC)


 * Done --HappyDog 15:37, 19 April 2007 (UTC)

Unnecessary comma
"MediaWiki is free server-based software, that is licensed under the GNU General Public License (GPL)" please remove the comma 66.195.208.29 03:34, 28 March 2007 (UTC)


 * Done --HappyDog 15:36, 19 April 2007 (UTC)

Which version of GPL
Shouldn't this page list the GPL version (v1, v2 or v3)?
 * It's GPLv2, per latest SVN. Superm401 00:54, 28 August 2008 (UTC)

Which database engine?
The INSTALL file of the current tarball specifies that either MySQL or PostgreSQL may be used. Many people such as myself avoid MySQL like the plague, and if the About page mentioned PostgreSQL as well as MySQL, it would no doubt attract more users. Not to mention that it simply would be more accurate.

Wikipedia Article Stats
Hello,

When I read a MediaWiki/Wikipedia article, I would like to see information along the lines of:


 * How many have read this article (all time-total)
 * How many have read this article today / last week / last month / last year
 * What are the top 100 (xxx) articles that have been read today
 * Which are the top 100 articles based on a per-country IP analysis (e.g. top 100 articles read from US IP addresses etc.)
 * Most clicked links (and click numbers) in a given article today/this week/this month
 * etc.

Ideally this would be accessed via a new "article stats" tab at the top of the page for per-page stats, with links to an aggregated stats page for the big picture.

Of course, there are limitations regarding how much information could be provided. One could provide sufficiently granular data to be exported into a spreadsheet for the real stats addicts, and only show a high-level summary of the most interesting data on the web stats page. If performance considerations are a hurdle, then perhaps version 1 of this stats page would just provide basic info while its performance impact is monitored, and future updates could incrementally improve the level of information available until performance became a problem.

Providing this level of metadata about what was being viewed, how often, etc. would add another dimension to the amount of value/information that MediaWiki/Wikipedia provides.

Besides individual curiosity, I believe that stats would also further improve the visibility of Wikipedia in the media, since stats always provide another dimension to identifying the current "zeitgeist". For example, it would be useful to be able to report that "there were 100,000 views of the Eliot Spitzer article on Wikipedia today".

Any thoughts, ideas, comments about this?

Nish --165.228.153.24 00:15, 19 March 2008 (UTC)

Greg -- 5.21.2008

Misspelling
editprotected scaleable -> scalable Av16ar 11:01, 24 April 2008 (UTC)
 * Done. i Alex  20:27, 11 May 2008 (UTC)