Talk:How to become a MediaWiki hacker

The Top Stuff
I think some detailed software component lists and version numbers would be useful for the target operational configuration and the "standard" development environment deemed best for neophytes.

More later after I read carefully and attempt some download and installation.

w:user:mirwin


 * Version numbers have been addressed in the README file at SourceForge. w:user:mirwin

"You'll find cryptic instructions in the INSTALL file in the source. Try to follow them.

If you want to set up a local copy of the existing database to play with, first create an empty database with MySQL and run the 'createdb.php' script in the 'maintenance' subdirectory (make sure it's configured appropriately!). Note that the maintenance scripts include files from the main source directory; either set up an include directory for PHP or just copy the files in."

Could someone please elaborate a little on this for someone who has never used PHP and MySQL before (and Apache only a few times)? More specifically: mysqld and Apache (with the php4 module) are running (on Mac OS 10.1.5), but I have no idea how to proceed from there. The Wikipedia software is in a subdirectory of my public folder (my test.php there is executed). mysqld is run as safe_mysqld by user "mysql". At the moment I get the following error:

"Warning: Failed opening 'Setup.php' for inclusion (include_path='.:/usr/lib/php') in /Users/elian/Sites/wikipedia/maintenance/createdb.php on line 6

Fatal error: Undefined class name 'title' in /Users/elian/Sites/wikipedia/maintenance/createdb.php on line 7"

What are the next steps to be done? What has to be configured now? --Elian


 * 1) Note that the maintenance scripts include files from the main source directory; either set up an include directory for PHP or just copy the files in. Simplest: copy everything in the 'maintenance' directory to the main source directory. Or, play with your php.ini and set up an include path.
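For the php.ini route, the directive to play with is include_path. A sketch, using Elian's path from the error message above as the MediaWiki source directory (adjust to your own layout):

```ini
; php.ini -- let the maintenance scripts find Setup.php and friends
include_path = ".:/usr/lib/php:/Users/elian/Sites/wikipedia"
```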


 * I have no php.ini (this is Mac OS X :-(). The other solution worked.


 * 1) Create the database:
 * mysqladmin create wikidb #(you may need the -u and -p options on mysql calls, for the root user)
 * (edit buildusers.sql to have the desired database name and passwords)
 * mysql wikidb < buildusers.sql
 * 2) Set up the tables:
 * php createdb.php

Thanks, Brion. One step closer to the goal :-) wiki.phtml gets displayed now! There are some strange problems left (editing is possible in iCab but not in Mozilla and IE), but I'll try to solve them alone first ;-) --Elian

Everything works now except the upload of files. I suppose I have to tweak permissions a little bit. --Elian

Next questions (sorry, just learning PHP, I still have some difficulty understanding the source code):
 * How do I make a sysop? I tried: update user set user_rights ='is_sysop' where user_name ='Test'; commit; but this change does not appear in the browser. What else do I have to add or do? --Elian
 * How do I switch everything to another language?

Sysops: it used to be "is_sysop" in phase II days, but now it's just "sysop". For developer access too (site locking, non-SELECT queries in the Asksql page), make it "sysop,developer".
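Applied to the UPDATE shown above, that becomes something like (assuming the same user table layout):

```sql
UPDATE user SET user_rights = 'sysop,developer' WHERE user_name = 'Test';
COMMIT;
```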

Language: set the language code in LocalSettings.php, for instance for Esperanto:
 * $wgLanguageCode = "eo";

You'll probably also want to set the interwiki prefix if you're emulating or duplicating one of the per-language wikis:
 * $wgLocalInterwiki = "eo";

For Esperanto, Polish, Czech, Japanese, Korean, Chinese, Russian, or just for kicks, you'll want to configure it for utf-8 encoding also:
 * $wgInputEncoding       = "UTF-8";
 * $wgOutputEncoding      = "UTF-8";

To keep string functions like ucfirst from mis-munging utf-8 strings, I also set the locale:
 * setlocale(LC_ALL, "en_US.UTF-8");

I don't know if that's appropriate for non-Linux systems. Note that there should be some tweaks to the database configuration too, but we don't do them yet, so just ignore that for now. ;)
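To see concretely what the mis-munging is about, here's a small Python illustration (not MediaWiki code): byte-oriented string functions count and slice bytes, not characters, so the first "character" they touch in a multi-byte UTF-8 string is only part of a code point.

```python
# "ĉ" (U+0109, used in Esperanto) is one character but two bytes in UTF-8.
word = "ĉapelo"
raw = word.encode("utf-8")

print(len(word))  # 6 characters
print(len(raw))   # 7 bytes -- a byte-oriented ucfirst sees one extra "character"

# Slicing off just the first byte, as a byte-wise ucfirst would, yields
# half a code point, which is not valid UTF-8 on its own:
try:
    raw[:1].decode("utf-8")
except UnicodeDecodeError:
    print("first byte alone is not valid UTF-8")
```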

If you want to set up a language that doesn't have a LanguageXx.php file yet, make a stub one first! --Brion VIBBER 02:08 Nov 15, 2002 (UTC)

I am not {yet} a Wikipedia hacker, but I have experience with Apache, PHP, Zope and ZWiki. I run a few ZWikis, fronted by Apache with mod_gzip enabled, which improves performance greatly, especially over high-latency or packet-lossy connections. Because most wiki pages are short plain text, gzipped wiki pages can often be transmitted in 1 or 2 TCP/IP packets instead of many more for non-gzipped pages. A thought: if the Wikipedia server has some CPU to spare, has anyone considered trying mod_gzip? -- Sydhart


 * Could be done... we could do this either with mod_gzip or PHP's gzip output filter (requires recompiling php with zlib support). About how much would this tend to affect CPU usage? --Brion VIBBER 04:40 Dec 10, 2002 (UTC)

Could someone explain how the edit conflict system works, ie what the wpEdittime variable is all about? I ask as I'm playing with a script I'm writing, which conceivably could create a nice gui interface for wikipedia editing (but at the moment I'm just playing). Thanks Smelialichu 18:41 Jan 9, 2003 (UTC)


 * When the edit form is generated, the wpEdittime field contains the timestamp of the last edit of the article (if it exists). When saving, it's compared with the current timestamp of the page, and if it's different, the edit conflict screen is given. --Brion 19:01 Jan 9, 2003 (UTC)
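That flow can be sketched like this (a Python sketch, not the actual PHP; the field name follows the form field discussed here):

```python
def generate_edit_form(page_timestamp):
    """Building the edit form stamps wpEdittime with the page's last-edit time."""
    return {"wpEdittime": page_timestamp, "wpTextbox1": "...article text..."}

def save_edit(form, current_page_timestamp):
    """On save, a timestamp mismatch means someone else saved in between."""
    if form["wpEdittime"] != current_page_timestamp:
        return "edit conflict"
    return "saved"

form = generate_edit_form("20030109180000")
print(save_edit(form, "20030109180000"))  # saved
print(save_edit(form, "20030109185500"))  # edit conflict
```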


 * Thanks, so to stop coming up with the edit conflict I just need to submit the current value of wpEdittime with the rest? Smelialichu 19:09 Jan 9, 2003 (UTC)


 * Yup; you can grab it out of the edit form. Also, have you taken a look at Wikipedia Client & related pages? A more machine-friendly wiki interface would help things like this. --Brion 19:14 Jan 9, 2003 (UTC)


 * I have had a look through there. The machine-friendly interface seems like a good idea (it isn't too hard to grab the needed info from the edit pages, but search, edit histories etc. seem a bit more complex, and susceptible to breakage if users use different themes or a minor HTML change is made), but I'm certainly not up to writing anything to answer all the needs listed at Wikipedia Client. I'm just mucking about really, trying to hone my Python skills. But I guess due to Python's reusable, object-oriented nature, anything I did write could be useful, for the basic login, logout and post-article functions anyway. Thank you for your quick and helpful replies! Smelialichu 19:29 Jan 9, 2003 (UTC)

Has anyone managed to work with the codebase under Windows? Advice would be much appreciated -- I'm having problems logging in with SSH -- Tarquin 18:50 Jan 27, 2003 (UTC)


 * For months I did all my work on it on a Windows 2000 box. I used the cvs and ssh that come with Cygwin. --Brion VIBBER 19:13 Jan 27, 2003 (UTC)

Help!

I have absolutely no knowledge whatsoever of the software required to set up a wiki using the Wikipedia software, nor the skills to learn it. But I do have a great desire to write an open project management standard using this wiki (future project site: http://openPM.net). Volunteers are already lined up. I also have a commercial hosting account with PHP and MySQL.

Could someone help me set up a wiki for that project?

Thanks, Mkoval


 * The Wikipedia software is under active development. It changes constantly, it's full of bugs, it has a lot of Wikipedia-specific things in the code that have to be changed for third-party use. I strongly discourage people from using it at the moment who aren't programmers and aren't interested in helping out with development.


 * That's not to say you shouldn't use it if you do have someone willing and able to babysit it and keep up with fixes and help with generalizing things. Just understand what you're getting into: this isn't a turnkey product, it's an experimental prototype that's being tweaked, improved, and occasionally scrapped and rebuilt. --Brion VIBBER 19:58 21 Apr 2003 (UTC)

becoming a hacker....
hi folks,

I've tried 'mysql -u wikiuser -p wikidb <table.sql' but mysql said: 'ERROR 2006 at line 45: MySQL server has gone away'.

Beforehand I had set up a user wikiuser with a password and granted all permissions on wikidb.

What's wrong?

Thanks mark


 * This means that the connection to the server was broken, but doesn't tell you why. The server could have crashed (sometimes it will restart itself, which will be mentioned in the server's error log), or perhaps the connection timed out. Or, maybe it's just buggy.


 * If you're running an old (3.x) MySQL, consider upgrading to the current version. Try tweaking the configuration to increase the timeout value and the maximum packet size. --Brion VIBBER 03:08, 30 Nov 2003 (UTC)
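The two knobs Brion mentions live in the [mysqld] section of my.cnf (my.ini on Windows); the values below are examples, not recommendations:

```ini
[mysqld]
# raise this if a single INSERT in the dump exceeds the default packet size
max_allowed_packet = 16M
# seconds before the server drops an idle connection
wait_timeout = 28800
```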

problems importing mediawiki
Hi Folks again,

Thank you, Brion, for your hints. Here is my state:

I'm running MySQL 4.0.16-nt (the up-to-date recommended production release) on Windows XP.

Now I copied the contents of my-huge.cnf into WinMySQLadmin's "my.ini setup" field, made sure no information was duplicated, and saved the modifications. Then I restarted the MySQL daemon.

The values from my-huge.cnf should be large enough, but on 'mysql -u wikiuser -p wikidb <table.sql' mysql again said: 'ERROR 2006 at line 45: MySQL server has gone away'.

Hmm. Any suggestions? Thank you all. Mark


 * The server writes out to an error log which might have more useful information that isn't being reported to the client. In a Unix installation this would be in /usr/local/mysql/var or similar... Can you check this? --Brion VIBBER 22:17, 1 Dec 2003 (UTC)

tar and unzip
Somewhere here the unzip and tar commands used to unpack the .tar.gz file need to be explained... it's hard to get anywhere with INSTALL and its cryptic installation guide without getting past first base... would someone please add the appropriate commands for doing that to the main article.

I figured out that you could use Winzip to unzip the .gz part, but I don't really feel like reading the man pages on tar tonight.

Sometimes it seems that open source people forget that there are many talented programmers out there that don't know much about CVS, tar, and gz files... let's lower the bar for entry.

Ok, feeling a teensy bit sheepish, I figured out something useful... if you take the tar[1] extension and make it just .tar, then WinZip can figure it out and extract the tar.

Still, it would be nice to have the command line for unix spelled out.


 * The last time I opened a .tar.gz in WinZip it helpfully offered to open the .tar archive after decompressing it. However I haven't used WinZip much in a long long time, it may have changed behavior. Could you explain what "the tar[1] extension" means? It occurs to me that your browser may be screwing you over by renaming the file in strange ways as it downloads. What browser are you using?


 * Anyway, typical tar command line:
 * tar zxvf mediawiki-1.1.0.tar.gz
 * Hooray for cryptic 1970s-era command lines! --Brion VIBBER 11:34, 4 Jan 2004 (UTC)

SQL dumps
Then, once you've got the SQL dumps for the language you want, import them like so:

That's only if you're trying to create a clone of wikipedia, right? If you're just wanting to set up a blank encyclopedia project you'd skip this step, right? 170.35.224.64 20:09, 10 Mar 2004 (UTC)

Command line user creation
I'm trying to set up a shell script that will create Wiki users, and it seems it should just be a simple matter of MySQL insertions. However, I can't get the passwords to hash correctly. I've looked at the functions in User.php but I'm afraid I don't completely understand them.

What would be the best way to accomplish this? 67.171.79.204 13:12, 24 Aug 2004 (UTC)
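If memory serves, the User.php of that era hashed passwords as an MD5 over the user ID plus an inner MD5 of the password when $wgPasswordSalt was on, and a plain MD5 otherwise; verify against encryptPassword() in your copy of User.php before relying on this. A Python sketch of that scheme:

```python
import hashlib

def md5_hex(s: str) -> str:
    return hashlib.md5(s.encode("utf-8")).hexdigest()

def old_mediawiki_password(user_id: int, password: str, salted: bool = True) -> str:
    """Sketch of the circa-2004 scheme -- check your User.php's encryptPassword()."""
    if salted:
        return md5_hex(f"{user_id}-{md5_hex(password)}")
    return md5_hex(password)

print(old_mediawiki_password(1, "secret"))
```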

First Hack (+ Problems)
I did my first very basic hack of my own today. Before releasing it on the main entry I want to present it here, hoping some questions get answered and some hints added.

Add the Special Page "FirstTry"

 * 1. Open the file SpecialPage.php in the includes dir and insert a new line into the $wgSpecialPages array with the value "FirstTry" in both fields.
 * 2. Create a new file called SpecialFirstTry.php in the includes dir.
 * 3. In this file, define the function wfSpecialFirstTry and declare $wgOut global so you can use the wiki output.
 * 4. Now use the addHTML method of the $wgOut object to produce a basic output: $wgOut->addHTML("Hello World");
 * 5. Go to your Special Pages and enjoy :)

Failures and next Steps
Phew. I'm not sure there isn't already a hacking guide somewhere in the wider wiki area. If there is, please tell me where!

If there isn't, the guide next needs some class information, so that a user can use these global classes instead of doing all the SQL and output work in their own code.

Another problem I could not fix was the < > problem in the displayed title. I really couldn't find it!

tharo@bahamut.de

--> moved to hacking for your first time, with attributions. -Hillgentleman 08:54, 12 December 2007 (UTC)

Lost connection --- suggestion
I often encounter this:

A database error has occurred Query: SELECT cur_id FROM cur WHERE cur_namespace=0 AND cur_title='King_of_England' Function: LinkCache::addLink Error: 2013 Lost connection to MySQL server during query

Backtrace:
 Database.php line 196, in wfdebugdiebacktrace
 DatabaseFunctions.php line 36, in database::query
 LinkCache.php line 136, in wfquery
 Title.php line 595, in linkcache::addlinkobj
 Skin.php line 1480, in title::getarticleid
 Parser.php line 1280, in skincologneblue::makelinkobj
 Parser.php line 810, in parser::replaceinternallinks
 Parser.php line 98, in parser::internalparse
 OutputPage.php line 226, in parser::parse
 refreshLinks.inc line 35, in outputpage::addwikitext
 rebuildlinks.php line 29, in refreshlinks

It would be pretty easy to change the code to reconnect to the database if the connection is lost and resume rebuilding the links. Since rebuilding the links is a long process, having to restart it is usually a pain.
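The suggested change amounts to wrapping the query in reconnect-and-retry logic. A language-agnostic sketch (Python, with a stand-in connection class rather than the real Database.php code):

```python
class FlakyConnection:
    """Stand-in for a DB connection that drops once, then recovers."""
    def __init__(self):
        self.dropped = False

    def query(self, sql):
        if not self.dropped:
            self.dropped = True
            raise ConnectionError("Lost connection to MySQL server during query")
        return f"ok: {sql}"

def query_with_reconnect(make_connection, sql, max_retries=3):
    """Reconnect and retry instead of dying, so a long link rebuild can resume."""
    conn = make_connection()
    for _ in range(max_retries):
        try:
            return conn.query(sql)
        except ConnectionError:
            conn = make_connection()  # reconnect and pick up where we left off
    raise RuntimeError("gave up after retries")

conn = FlakyConnection()
print(query_with_reconnect(lambda: conn, "SELECT cur_id FROM cur LIMIT 1"))
# ok: SELECT cur_id FROM cur LIMIT 1
```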

How to add keywords for meta tags in HTML Header
How can I add keywords for meta tags in the HTML header? I have found no way to do so, and there seems to be a mechanism that automatically picks some words from the page, which results in confusing keywords.

thx, marc

Answer: editing meta data, adding keywords
Look for the file OutputPage.php in the /includes directory. Search for 'KEYWORDS'. The best way to add keywords is to append them in a call like this:

$wgOut->addMeta('KEYWORDS', $a.', ,...');

Good luck Edward

THE ERROR
Parse error: syntax error, unexpected T_VARIABLE, expecting T_FUNCTION in OutputPage.php on line 946

What I did
(Cut out of OutputPage.php, somewhere close to the end of the file)

 $wgOut->addHTML( "\n $r \n" );
 }
 $wgOut->addMeta('KEYWORDS', $a.'Testword1, Testword2');
 /**
  * This function takes the title (first item of mGoodLinks), categories, existing and broken links for the page
  * and uses the first 10 of them for META keywords
  */

(The parse error comes from the addMeta() call sitting at class scope, right after a method's closing brace; "unexpected T_VARIABLE, expecting T_FUNCTION" means PHP expected a method declaration there. Move the call inside the method, before its closing brace.)

External uses for Parser.php
I am hoping someone familiar with the Mediawiki system might bear with me here if it is at all possible to address the following, as it could, I think, be a potentially powerful use of wiki technology in a way that it has not yet, to my knowledge, been used.

The file Parser.php is mentioned on the article page (along with Skin.php) as being the page that processes the WikiText. I was wondering whether anyone here who knew may be so kind as to offer any tips to someone who is not that familiar with the MediaWiki system about how this file might be accessed by a PHP script to convert wiki text (as from one's own MediaWiki database setup) into HTML? I am very interested to design such a script so as to be able to combine the content of various wiki pages into a tabular format without having all the toolbar content and the like appearing in each cell.

Beyond this, I would like to specifically transclude all these processed wikitext pages of HTML into iframes (I know that frames have to be re-enabled within one's copy of the MediaWiki software first). Within these iframes, one could, by using JavaScript and the DOM, add individual backward and forward links, as well as add some bare-bones PHP links for wiki features such as "edit page". Of course, for this to be effective, one would also need to figure out how to have the script adapt the results of the parser (or modify the parser itself) in order to have the "edit page" or other links lead to views which also did not include the toolbars, headers, etc.

This would, in essence, offer the possibility for Mediawiki software to become a virtual (tabular) collaborative database (without needing to alter the code too much, I would venture). One could enhance this functionality with features such as "show recent changes for the page in this row/column/table", if one could add the feature to Mediawiki (if it does not exist yet) of showing recent changes for a set of specified pages.

Another (simpler) question for another use is what code within the Parser.php file (or elsewhere) could be included in a non-Mediawiki PHP page to create a link which detected whether a page at the wiki site had content or not. Of course one would need to call (and have access to) the wiki's database to do this (could a lower-level public database access be put into Mediawiki for those who wished to simply query this data of whether a page had been created or not?), but I could see many uses for adding such smart-links to other pages. One could automatically create these smart links within one's own PHP pages which linked to the relevant Wikipedia, Wiktionary, etc. pages, based on the title of one's own page (or wiki page). For example, one could design a script which, whenever calling a particular wiki page of one's own (or other type of page), would also automatically add links to the relevant Wikipedia, Wiktionary (not to mention Google, etc.) pages, if they existed (and creating the orange links to allow new pages to be created if they had not).


 * I've since hacked out some of my own code to do the latter. See here. But I think the earlier problem would be much more difficult. And it'd still be nice to find some way to get lower-level public access to the Wikimedia databases in order for this smart-link script to work for linking to Wikipedia, etc. Brettz9 09:42, 12 May 2005 (UTC)


 * Not to mention this would be useful as an interwiki option (e.g., when linking from Wikipedia to other sister projects or other Mediawiki pages and one wanted a smart-link to save the hassle of having to visit an uncreated page (or go through an extra step to edit a page))

Thanks...

Brettz9 08:39, 12 May 2005 (UTC)

Support for MS-SQL Server
I work in an MS dominated corporate environment, but have downloaded the XAMPP package and MediaWiki-1.4.4 to give it a whirl. Is there any work being done to support MS-SQL as the SQL backend to MediaWiki ?


 * No, there is no such work being done. --brion 06:04, 31 May 2005 (UTC)

I guess a slightly different way of wording this question would be - is there any SQL syntax particular to MySQL that would stop it working out of the box on MS-SQL?


 * MySQL has a much larger feature set in PHP than MS-SQL. Many simple calls are just a matter of replacing "my" with "ms" in the PHP function names, whereas some of the queries MediaWiki undoubtedly uses will not port over that easily.

Hi there, I'm developing an MS-SQL plugin for my company (Siemens). It's nearly done. When my work is ready, further information will be posted! Sven (sven.schuberth.ext(at)siemens.com)

updating...

Has anyone managed to use MS-SQL as their data backend already? Is there support now? Mr. Sven, any news from you? I also work in a pro-MS company... so I am trying to dig up whether MediaWiki can now support MS-SQL... hope to hear responses...

Which PHP file is the right one?
I have installed my own wiki.


 * 1) On the left side, in the navigation bar, there are some links. How can I change these links, add some more, and move other pages in and out of the wiki?

Post an answer here or write to kaikotzian@gmail.com

A: Open the page named "MediaWiki:Sidebar" in your wiki.


 * 1) I also want to know how I can change the Donations page. I don't want to change it in the freely editable part of the site; I want to change it in the PHP source code, because I want to add a fixed, non-editable text and a PayPal donation form. But I can't find a file like donation.php or fundraising.php in my FTP directory.

Post an answer here or write to kaikotzian@gmail.com

A: I did it like this: create a "DonateExtension.php" or similar in the extensions dir that looks something like this:

 $wgExtensionFunctions[] = "wfDonateExtension";
 
 function wfDonateExtension() {
     global $wgParser;
     $wgParser->setHook( "donate", "renderDonate" );
 }
 
 function renderDonate( $input ) {
     $output = "Your actual HTML code ,, bla ..<form action=\"https://www.paypal.com/cgi-bin/webscr\" ... etc";
     return $output;
 }

and then insert the "donate" tag in any page (which you can link to from the Sidebar, for example). Mutante 15:40, 25 May 2007 (UTC)

Adding a Header/Footer
Most of the edits and configurations I understand well enough. I have been searching just to find out how to add a header and footer to the wiki. It would be great if I could just add a header using a PHP include or even just the raw code. I have tried editing the default MonoBook.php, but no matter what I put in, it doesn't seem to be parsed at all. You would think it would be easy to just add a header/banner at the very top of the page and a footer at the bottom.

Can anyone help me with this? Thank you.

I'm also looking for something like this... I found this link: http://mail.wikipedia.org/pipermail/mediawiki-l/2005-September/006670.html

cheers www.nathanwaters.com

How do I edit the "[edit]" tags on a wiki page?
I am trying to change a few things in my wiki and I don't really know where to start. First I want to change the "[edit]" tags in each section of the wiki, and beside each tag I want to add a few more links. Which file contains this code?

Thanks!

Emacs settings
The mediawiki code uses lots of tabs. I've found the following useful in ~/.emacs to accommodate this (using php-mode.el from sourceforge).

 (require 'php-mode)
 (setq-default tab-width 4)
 (add-hook 'php-mode-user-hook
           (lambda ()
             (setq c-basic-offset tab-width)
             (setq-default indent-tabs-mode t)
             (setq backward-delete-char-untabify-method nil)))

I don't know a good place to record this for future generations (and me when I kill my .emacs), so I've left it here. Lupin 20:13, 17 December 2005 (UTC)

Database access
I have installed my own wiki. How do I use the functions in DatabaseFunctions.php to create a special page? How do I get a query executed on the database and the result displayed on a special page?


 * Check out: Writing a new special page, but I would suggest browsing Database.php and figuring out functions you need. Ambush Commander 21:40, 5 May 2006 (UTC)

MIME Types upload at Windows
I have installed MediaWiki on a Windows XP server here at my job. Is there a program or add-on that makes it possible to upload other types of files, like Excel files and PDF files?

Cubajoe 10:52, 2 March 2006 (UTC)


 * If you are using it on an intranet, you can configure the installation to accept 'xls' and 'pdf' files in LocalSettings.php. Ambush Commander 21:39, 5 May 2006 (UTC)
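Concretely, in a reasonably recent MediaWiki this is the $wgFileExtensions array in LocalSettings.php (check the variable names against your version):

```php
# LocalSettings.php -- allow Excel and PDF uploads on an intranet wiki
$wgEnableUploads  = true;
$wgFileExtensions = array( 'png', 'gif', 'jpg', 'jpeg', 'xls', 'pdf' );
```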

Matrix at MediaWiki
I have installed MediaWiki on a Windows XP server here at my job. Is there a program or add-on that makes it possible to create a matrix with columns and rows (like Excel) inside an article?

Cubajoe 10:52, 3 March 2006 (UTC)


 * These are called tables; you can learn more about them at Help:Table. Spreadsheet-style functionality, like summing rows, is experimentally provided in some extensions. Ambush Commander 21:37, 5 May 2006 (UTC)
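For reference, a minimal wiki table looks like this (full syntax at Help:Table):

```
{| class="wikitable"
! Product !! Quantity
|-
| Apples || 5
|-
| Oranges || 3
|}
```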

SVN or CVS?
In the patch section, SVN is mentioned in the example, and it made me happy to see it is used for MediaWiki. However, it seems that this is the only place it is mentioned; everywhere else there is CVS. :-( So, is CVS what is used now? Will it perhaps change to SVN at some point? 163.9.100.136 07:34, 5 May 2006 (UTC)


 * They're probably old links. MediaWiki uses SVN, you can browse it here: http://svn.wikimedia.org/viewvc/mediawiki/trunk/ . If you get a chance, you should probably update all those links. Ambush Commander 21:34, 5 May 2006 (UTC)

Light version for mediawiki ?
Hi, I've installed MediaWiki on my web page (http://wikipoll.free.fr) but there is a PHP memory_limit of 8M. It seems that's not enough to edit pages now. The line "ini_set( 'memory_limit', '20M' );" doesn't seem to have any effect (I suppose the php.ini specifies 8M and I don't have access to it). Is there a way to disable some functions of MediaWiki so that it requires less memory? Is there a way around the 8M memory limit? Does anyone have an idea why pages with accents (éàê...) in their title only display "Internal server error"? Is it related?

My webpage worked perfectly yesterday, and I didn't change anything... So weird -.- --81.57.3.212 01:19, 26 May 2006 (UTC)


 * You probably shouldn't try to use MediaWiki on free web hosting; it isn't well adapted to extremely restrictive environments (it's quite robust, but there's only so much it can do). Furthermore, the internal server error is a webserver misconfiguration... and out of your control. Bug your host, but since it's free, don't expect too much. -- Ambush Commander (Talk) 20:16, 26 May 2006 (UTC)

Revamping this page
I'm going to start cutting out a lot of the info on how to learn PHP and MySQL. If you're not well versed in those, it's going to be difficult to do much of anything. Maybe a few links, but that's it. I'm also going to put in info on configuring MediaWiki for Windows. -- Ambush Commander (Talk) 16:10, 18 June 2006 (UTC)

Host multiple wikis
I have a wiki site and am planning on expanding a bit. I want users to be able to create their own wikis. I can think of 3 ways to do this:


 * 1) Create a separate set of tables for each wiki (with a prefix)
 * 2) Have each new wiki be part of a main wiki but require it to be in a particular category (this would mean I would have to write some code to enable searching within a category)
 * 3) Have each new wiki be part of a main wiki but create a new namespace for each wiki.

I don't like option 1 because I want users to be able to search through all wikis at once (I would have to have a separate search for each wiki with option 1). Plus, it seems like a lot of overhead. Option 2 seems like it might not be that efficient, as I don't think categories were designed for this purpose. So, I'm leaning toward option 3. However, I guess I could end up with 1000 namespaces, so maybe there is some way I could define them in a table rather than in a PHP file.

So, what do you think? Is option 3 the proper way to do this? Or is there some other way that I haven't thought of?

Complete Tutorial
I know somewhat about PHP, but when you're talking about the command line and /usr/bin/php I'm confused, so I need help. Also, what program is used to view and edit PHP files properly? I had downloaded it on my old computer, but I think the binary download was just txt files, and the folder the installer made had a couple of exes, but they just looked like command prompt windows. --Melab 01:43, 25 April 2008 (UTC)

need help for idea
http://en.wikipedia.org/wiki/User_talk:92.143.4.107#another_brilliant_idea

the value of automated tests
Since I am not actually writing code for mediawiki, you may want to ignore me, but I believe, based on my experience, it is naive to consider writing automated tests a mere nuisance. For any program with a long lifetime, it is simply more efficient to have automated tests. It has been known for a long time, and it is certainly true of mediawiki, that most of the work writing code for the program occurs in the "maintenance" phase, after the program is put on line. When a piece of code is added, or a piece of code is changed, it is common to perform some manual tests, as well as performing a code review, before placing it in production. The great contribution of junit and its testing framework relatives is that they make it no harder to write tests that can be kept than it is to perform throwaway, manual tests. Having a body of automated tests greatly reduces the costs and risks of making any changes to the code later, that is where these tests pay off, over and over, and, with a good testing framework, for a minimal investment. A suite of automated tests also serves as a set of working examples of how to use the existing code, and can often make it much easier for a programmer new to the program to become productive.

Forgive me, you have probably heard all this before. I was saddened to see the recent update on the status of automated testing for mediawiki (which said, in essence, that there are not any functional tests except some for the parser, and that a lot of prior effort has apparently been abandoned). As a long term admirer and user of mediawiki, I want to put in my two cents worth of encouragement, and say that I believe, on the basis of my experience, that you will gain improved productivity for each automated test that you add. --AJim 00:31, 17 October 2009 (UTC)