Project:WikiProject Extensions/MediaWiki Workshops/MediaWiki 1.19/Transcript

[19:04:35] okay - so welcome to the MediaWiki Workshop: Preparing extensions for MediaWiki 1.19 happening in this IRC room - Project:WikiProject Extensions/MediaWiki Workshops
[19:04:55] this is our first go at one of these - and towards the end we'll have a chance to talk more about future ones, etc.
[19:05:08] basically we're going to start by mentioning some changes coming in MW1.19
[19:05:11] where to find that information
[19:05:26] and then we'll move into questions and answers both about MW1.19 and other things that may be lingering on your mind :)
[19:05:49] be warned that any long topics may be asked to be tabled either for offline or a future workshop - such as ResourceLoader which will require more in-depth explanations
[19:06:20] finally we'll wrap up with some discussion about these sessions in general - and general open thoughts on how to help support the extension developer community
[19:06:53] thanks for creating & publicizing this, varnent
[19:07:01] we don't usually have a strictly formal process or rules - so feel free to chime in with questions as we go along - if it's off topic - I may chime in that I'll re-raise it at a more appropriate time later to keep things flowing
[19:07:05] Yes, many many thanks
[19:07:09] sumanah: thanks for the help along the way :)
[19:07:32] you can also PM me directly with anything you may want to ask but aren't sure about timing, etc.
[19:07:41] for reference: Extensions is the homepage for MediaWiki extensions
[19:08:09] more reference links:
[19:08:20] and if you're looking for ideas for extensions to implement, Extension requests and the enhancement request category in Bugzilla are useful
[19:08:30] *olenz_ opens the beer bottle.
[19:08:32] this whole effort is a part of a larger project on helping extensions and extension developers - you're all now invited to chime in and participate: Project:WikiProject Extensions
[19:09:03] there's also a companion project for SysAdmins if that's more your flavor: Project:WikiProject SysAdmins
[19:09:10] so, 1.19
[19:09:28] The release notes: https://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/RELEASE-NOTES-1.19?view=markup but the summary is at 1.19
[19:09:29] so here's some of the upcoming changes you'll want to know about
[19:09:48] we'll open it to Q&A - and core developers are invited to chime in as items are posted
[19:09:53] RL2 isn't being released with 1.19 fyi, hasn't been merged yet.
[19:10:11] that's good to know actually - lol - was on my list
[19:10:18] so in MW1.19:
[19:10:28] Bumped MySQL version requirement to 5.0.2
[19:10:38] Gadgets are always loaded through ResourceLoader, which means each module executes in a local (new) scope by default.
[19:10:42] can anyone speak briefly to that?
[19:11:07] I'm pretty sure that's a RL2 thing, which isn't part of this release
[19:11:10] I've removed it from the page
[19:11:18] okay - lol - good to know :)
[19:11:27] next item -
[19:11:30] New common*.css files usable by skins instead of having to copy piles of generic styles from MonoBook or Vector's css.
[19:11:59] are there any immediate questions or additional info on that item in regards to skin extensions or skin developers
[19:12:03] That should theoretically make it easier for folks to create new skins without massively re-inventing wheels
[19:12:12] (mglaser, JeroenDeDauw, Yaron, just letting you know this extensions meeting is happening right now here)
[19:12:39] i'm following :)
[19:12:52] Hello, I'm here.
[19:12:54] Dantman: ^^^ skin questions may show up about now :)
[19:12:59] bawolff: is use of it pretty straightforward - or anything folks should know about using them?
[19:13:24] *bawolff hasn't actually looked at said commit
[19:14:04] but i believe it allows people to use some of the css that's unlikely to change between skins, so they only have to worry about the stuff that's likely to change
[19:14:19] it appears pretty straightforward - so if folks have questions - we can address those later - ty bawolff :)
[19:14:21] Yay DRY code
[19:14:23] Is this documented anywhere?
[19:14:58] *varnent starting a post-workshop to-do list for the wikiproject - adding documentation for this topic
[19:15:34] The default user signature now contains a talk link in addition to the user link. - neat but not sure anything folks need to integrate or conform to :)
[19:16:01] Better timezone recognition in user preferences. - helpful to know for extensions using time zones or elaborate time functions
[19:16:31] The interwiki links table can now be accessed also when the interwiki cache is used (used in the API and the Interwiki extension).
[19:16:51] anything on that which folks may need to know about?
[19:16:56] The timezone thing is just detecting daylight savings properly when filling in from browser. Unlikely to affect extension devs
[19:18:01] you can visit API:Meta and Extension:Interwiki for more info on that if you'd like
[19:18:03] interwiki cache thing will make life easier for certain people developing gadgets aimed at wmf wikis, but unlikely to affect extension developers (unless they need to get a list of all interwikis, which seems unlikely)
[19:18:25] The command-line installer supports various RDBMSes better.
[19:18:30] Extensions can now participate in the extraction of titles from URL paths.
[19:18:49] those are the listed items - ended on that one to allow folks to speak to it - we'll shortly move to Q&A
[19:19:06] Is there a dosumentation of the CLI Installer and how to extend it / hookt into it?
[19:19:07] oops - sorry - I lied - I do have more items - but I'll let folks speak to that first
[19:19:33] sorry for the typos :)
[19:19:41] Osnard: What would you be looking to do with the CLI installer?
[19:19:58] Use it for automated wiki setup in a wikifarm.
[19:20:19] I am just curious.
[19:20:52] Amgine: You'd probably be better off creating your own maintenance script for that. Take a look at addWiki.php in WikimediaMaintenance
[19:21:22] It may be interesting if one would try to create some kind of distribution
[19:21:31] I know that gicode is interested in working on a wiki family related extension San Francisco Hackathon January 2012/Teams
[19:21:52] a wiki family management extension so they can store and load configuration sets in a database
[19:22:28] varnent: what does 'participate in the extraction of titles from URL paths' mean? Is there any specific function?
[19:22:34] Let's say I want to bundle some extensions together with MW and have some extra steps in the installer to set them up.
[19:22:38] ^demon was working on a new configuration system, he should be getting back to that post-1.19-release and post-git-migration
[19:22:53] Osnard: so, that kind of extension management is something that would be very very welcome as a big project
[19:22:54] is it about the ContextRequest?
[19:22:57] toniher_casa: excellent question - we'll be getting to that next :)
[19:23:07] varnent: :D
[19:23:19] *bawolff thought the installer could install extensions already?
[19:23:41] It can, if they are present in extensions/
[19:23:58] bawolff: yes but can it configure them?
[19:24:07] while it would be excellent if Chad had time to do everything on his plate, he doesn't, so if an experienced MediaWiki developer has time to take it on, that would be welcome. But it's a very big and complicated project to get MediaWiki to the state where installing and managing extensions in an existing installation works as well as we want
[19:24:48] sumanah: Agreed. It's something that should be in core and done by an experienced developer.
[19:25:14] and Chad's been meaning to get to config management for as long as I have been at WMF and the 1.19 deployment + release + the git migration won't all be over for several months.
[19:25:32] *johnduhart nods
[19:25:44] so it sounds like the conclusion is that documentation in this context does not exist - the overall applications are being explored more at the Hackathon - and long-term dream implementation of this overall topic is on the roadmap - but not a high-level priority given the scope and skillsets needed
[19:26:06] :)
[19:26:20] <tr|nn|> will there be videos of the Hackathon for those who cannot be there?
[19:26:20] <Amgine> yet
[19:26:29] Osnard: but to get back to your original question: the documentation of the CLI is, IIRC, fairly poor. It's very much a second-class citizen. I would welcome someone taking on the task of enhancing its documentation, and it could probably use an owner/champion for its robustness, given how often people end up using it while troubleshooting it
[19:26:50] For the configure stuff, is there at least a page explaining what the goals are, what to do differently from extension:Configure?
[19:26:54] tr|nn|: there are many hackathons. varnent, what hackathon are you talking about?
[19:27:11] sumanah: I know that gicode is interested in working on a wiki family related extension San Francisco Hackathon January 2012/Teams
[19:27:18] was including that :)
[19:27:32] *varnent adding CLI documentation to post-workshop to-do list for wikiproject
[19:27:36] In regards to CLI. Much of it is self documenting - running the script with --help will give you basic info on what it does and what the options are
[19:27:39] tr|nn|: the San Francisco hackathon will have about three tutorials and we're aiming on videotaping all of them. But there are going to be dozens of individual teams collaborating on various projects and we're not taping or streaming that.
[19:27:53] bawolff: Have it suck less. Right now it generates the PHP config for MW. That needs to be reworked to allow complete configuration loading from db.
[19:28:14] family management sounds interesting. I'll be there in SF
[19:28:17] <tr|nn|> sumanah: it would be a great resource to have those tutorials available. thanks
[19:28:38] mglaser: you can sign up on San Francisco Hackathon January 2012/Teams to say you want to work on that team with gicode
[19:28:45] any other CLI comments?
[19:28:58] if not, can anyone speak to "Extensions can now participate in the extraction of titles from URL paths."
[19:29:38] tr|nn|: Guillaume Paumier has also done some very good work which he is continuing on How to become a MediaWiki hacker, and I recently got a volunteer to redo API:Main page -- so there is improvement happening in mediawiki.org documentation, and we welcome your aid
[19:29:50] (sorry for derailing)
[19:29:56] no worries :)
[19:30:48] too bad dantman isn't here, i believe that's also his code
[19:31:10] <Amgine> Hexmode.
[19:31:23] Amgine:
[19:31:32] <Amgine> cli installer
[19:31:47] ?
[19:31:51] Hexmode: can you speak a little to "Extensions can now participate in the extraction of titles from URL paths."
[19:32:23] *varnent thinking our expert on that may not have been able to attend
[19:32:45] varnent: I know you asked me about that before, but I don't really know what that means
[19:32:49] :(
[19:32:54] okay - no problem :)
[19:33:10] I'll commit to tracking down Dantman and getting some documentation about it online by the end of this weekend
[19:33:11] Basically there's a new hook WebRequestPathInfoRouter
[19:33:29] That would be this https://svn.wikimedia.org/viewvc/mediawiki?view=revision&revision=104274
[19:33:33] Which can extend how mediawiki processes urls
[19:33:41] varnent: see bawolff & johnduhart ^
[19:33:51] there we go :)
[19:34:13] Basically before we had an action paths feature which allowed people to do things like mywiki.com/edit/foo instead of mywiki.com/index.php?title=foo&action=edit
[19:34:34] From what i understand the path router stuff is all that stuff taken one step further
[19:35:01] Where one could specify more complicated patterns in urls to be turned into query parameters that mediawiki understands
[19:35:30] that makes sense - I'll work with Dantman on some documentation so the scope of its abilities are available to folks
[19:35:53] *bawolff hopes that was accurate, i'm not super familiar with that area of the code
[19:36:14] would this relate to POST data as well?
[19:36:38] <Osnard> This is very interesting. I have to admit that I haven't heard about this feature in MW before...
[19:37:00] I imagine such paths would still accept POST requests
[19:37:17] helloo
[19:37:28] Osnard: as an example of action paths, see the url structure of Server admin log vs http://wikitech.wikimedia.org/edit/Server_admin_log
[19:38:10] <Osnard> bawolff: cool :)
[19:38:12] <olenz_> that does sound interesting
[19:38:40] <olenz_> i do not exactly see the advantage, besides probably nicer URLs?
[19:38:55] *bawolff to be honest doesn't either...
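The WebRequestPathInfoRouter hook described above might be used roughly like this. This is a hedged sketch only: the hook name and revision come from the discussion, but the handler name is hypothetical and the pattern syntax is an assumption based on how MediaWiki 1.19's PathRouter handles $wgActionPaths, so it may differ from the real API.

```php
<?php
// Sketch, assuming MediaWiki 1.19+: map mywiki.com/edit/Foo to
// index.php?title=Foo&action=edit, mirroring the action-paths example
// bawolff gives above. Goes in an extension setup file.
$wgHooks['WebRequestPathInfoRouter'][] = 'myExtOnPathInfoRouter';

/**
 * @param $router PathRouter The router that turns URL paths into query
 *        parameters that MediaWiki understands.
 * @return bool Always true so other handlers still run.
 */
function myExtOnPathInfoRouter( $router ) {
	// "$1" is the wildcard that becomes the title parameter.
	$router->add( '/edit/$1', array( 'action' => 'edit' ) );
	return true;
}
```

Under this sketch, more complicated patterns (the "several GET parameters masked as a path" case raised later) would just be additional add() calls with more parameters in the second argument.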
[19:39:11] <olenz_> Although nicer URLs are a good reason
[19:39:21] I think there might be some stuff there related to alternate entry points
[19:39:26] <Osnard> I like nicer URLs
[19:39:39] I think much of the code in img_auth.php was simplified as a result of that
[19:39:44] I believe it relates to this (could be wrong): Requests for comment/Entrypoint Routing and 404 handling
[19:40:14] So theoretically an extension could introduce a new entry point, and re-use much of the path info code (why an extension would want to do that i don't know)
[19:40:45] <olenz_> But isn't this something that has to be handled mostly on the level of the web server?
[19:40:56] <olenz_> Apache, not MW?
[19:41:04] it might also help with specialpages where you have several GET parameters that you want to mask as a path
[19:41:07] varnent: I haven't seen that page before, but it does sound like it describes this feature
[19:41:15] (by the way, olenz_, what are extensions that you work on?)
[19:41:17] sumanah: Do you know where the beautiful soup people hang out on irc?
[19:41:26] e.g. SpecialSomething/user/mglaser/namespace/Help
[19:41:54] <tr|nn|> mglaser: I guess a similar approach would be to use $par passed to the execute function
[19:41:56] multichill: I don't know whether there are enough to make an IRC community. There's a newsgroup/Google Group where you get a fairly fast response, especially if you tell me when you have posted something & I can shout to the next room to alert Leonard. :)
[19:42:08] it looks like there were some specific extensions in mind for this - forum and 404 page related
[19:42:13] olenz_: well on the apache level, it's concerned with making sure urls go to the correct script, and then the php script (mediawiki) takes the url, and does some stuff based on that
[19:42:16] <olenz_> sumanah: some fixes on some extensions, and a so-far private extension on BibTeX
[19:42:51] *bawolff really hopes he didn't say anything inaccurate in regards to that feature. I'm really not very familiar with it
[19:43:16] bawolff: thank you for speaking to it - and I'll accept any hassle for any inaccuracies :)
[19:43:18] olenz_: my group is interested in bibtex, I'd be interested in learning what you're doing sometime
[19:43:36] any follow-up questions? again, apologies for lack of documentation - will hopefully have that linked from MediaWiki 1.19 within the week
[19:43:53] <olenz_> in general, the path is a hierarchy, while the query params are each a flat name
[19:43:58] tr|nn|: right, I was just wondering if the new feature would make things easier.
[19:44:02] <olenz_> so that might make some differences
[19:44:31] <Amgine> <prefers class WebRequest::getQuery, getValues>
[19:44:34] <Amgine> etc
[19:44:40] <Nikerabbit> you've been mostly talking about the path system?
[19:44:57] <Nikerabbit> I haven't seen anything using it yet
[19:44:58] Nikerabbit: not necessarily mostly - but most recently
[19:45:01] Does anyone have questions about context and how to use it?
[19:45:33] great, if not, the final items are Internationalization related - more info at MediaWiki 1.19
[19:45:35] olenz_: Actually, afaik mediawiki sees the part after index.php (aka /foo/bar in index.php/foo/bar) as a simple flat parameter, and not as a hierarchical structure
[19:45:50] *varnent we're now officially moving to the general Q&A part btw
[19:45:59] <olenz_> oops, sorry
[19:46:02] Because if I see anyone writing new SpecialPages or API modules and using globals I'll be mad ;)
[19:46:04] <olenz_> Go on, varnent
[19:46:05] johnduhart: I've been meaning to add examples to manual:$wgTitle and similar pages giving examples of what to use instead
[19:46:38] bawolff: Haven't we all been meaning to write documentation? :p
[19:46:43] :P
[19:46:49] folks are already doing a great job of asking questions - and I'm thrilled about that - so at this point, you may chime in with questions - please be mindful of not mixing more than one subject - and I may ask to sideline an item to keep the flow easier :)
[19:47:09] nekohayo: ping :-) re your extensions question
[19:47:18] oh hai
[19:47:42] So, I maintain wiki.pitivi.org, and over the last 2 years we have been heavily spammed. The situation is now "under control" with a strict account policy, and I managed to recently purge the 1000+ pages and 500+ spam images (including all history).
[19:47:50] However, 800+ fake user accounts remain. I'd like some advice on how to purge them (including history). There is an extension called UserAdmin, but it seems to work *only* with mediawiki 1.16 and to be unmaintained.
[19:48:14] We don't recommend deleting user accounts, ever.
[19:48:27] spam user accounts I couldn't care less :)
[19:48:36] Still applies.
[19:48:51] <Amgine> rename to spamaccount0000001, etc.
[19:49:08] I have heard from several MediaWiki administrators that they tried AbuseFilter & other extensions and still get too much spam to deal with, and need help
[19:49:39] doing that for 800 accounts is just as much work as deleting them, with no gained clarity in https://wiki.pitivi.org/index.php?title=Special:ListUsers&limit=1500
[19:49:51] nekohayo: If you don't clean everything up when you delete users, you'll end up with inconsistencies in your database which could lead to more problems down the road.
[19:49:53] note that pitivi wiki has less than ten actual users
[19:50:11] nekohayo: icky solution, but you could always rev delete them
[19:50:29] nekohayo: You could use Extension:User_Merge_and_Delete to merge them all into one account
[19:50:51] nekohayo: https://wiki.pitivi.org/index.php?title=Special:ActiveUsers
[19:51:16] *bawolff has to go, happy extensioning everyone
[19:51:18] One admin writes: "After opening up my wiki for free registration, I immediately got three spam accounts. One turned into real spam being injected into and then cleaned from the wiki. None of the methods aside from preventing self-registration seem to work against spam."
[19:51:18] Isn't this really an administration question?
[19:51:33] johnduhart: I suggested nekohayo come in and ask for help during this meeting
[19:51:33] well,
[19:51:45] johnduhart: it has extensions applications I'd say
[19:51:54] my question is an administration question at its core,
[19:51:59] but yeah - hopefully we'll mimic a session like this for sysadmins as well :)
[19:52:03] experienced MW extensions developers are usually also admins
[19:52:08] Did we already do a session like this for upgrading to 1.18?
[19:52:28] but it also is a question of "what is your policy regarding admin extensions, and extensions that seem to be mostly unmaintained"
[19:52:34] cneubauer: no - so feel free to ask beyond just MW1.19 - you may ask any questions in general related to developing extensions
[19:52:44] cneubauer: no we did not, this is the first ever session like this
[19:53:01] nekohayo: that's actually a really good question - and one I think we're just starting to explore
[19:53:05] for the "DeletePagePermanently" extension for example I had to patch it with some random suggestions in the talk page
[19:53:14] and I found that to be a little bit funny in terms of user experience
[19:53:19] :)
[19:53:36] nekohayo: this is an effort to try and find all those extensions and utilize new templates to improve things like that - Project:WikiProject Extensions/Projects/Page_Drive
[19:53:49] okay, excellent, how do I add messages to the message cache dynamically now that addMessage is gone?
[19:54:01] cneubauer: You don't.
[19:54:03] ;)
[19:54:06] hmm
[19:54:11] for example - Template:Incompatible and Template:Archived extension have just recently been created to help with archiving unmaintained extensions, etc.
[19:55:02] there is also more clarity on Manual:Developing extensions about what is expected of developers hosting extensions on MW.org
[19:55:09] cneubauer: want to go into what it is you're looking to do?
[19:55:16] then maybe johnduhart will be less dismissive ;-)
[19:55:18] johnduhart: really? you're not joking?
[19:55:25] cneubauer: Really.
[19:55:36] cneubauer: talk about the extension you work on
[19:55:41] ouch
[19:56:24] well I have a number of legacy extensions that used addMessage. I'm upgrading them to use an i18n file where possible but some of them add dynamic messages...
[19:56:29] nekohayo: so I think in general the answer is - many others agree there are a lot of extensions in need of archiving and many in need of updating (which should be noted to other developers and sysadmins on their respective pages) - I think until now there's been a lot of concern about stepping on the toes of those respective extensions' developers - the modifications I mentioned and templates are an effort to work through the concerns and come to some solutions
[19:56:36] cneubauer: Dynamic how?
[19:57:12] varnent, I have to say I've been impressed by wordpress' extension management system lately
[19:57:25] but I guess you might say that's a completely different target audience :)
[19:57:31] nekohayo: I agree - I also like how Drupal handles things
[19:57:33] like the message changes based on the input to a callback
[19:57:53] I am hereby jealous of how WP started off as something that ordinary admins were supposed to install & maintain; MediaWiki's legacy is a tough one in some ways
[19:58:00] (WP here meaning WordPress)
[19:58:08] but that's offtopic
[19:58:12] nekohayo: I think we're trying to get things moved in those directions - and in a way that works for MediaWiki (direct clones of those efforts would be unlikely to work within the Wikimedia communities)
[19:58:26] cneubauer: You'd want to use message parameters for that.
[19:58:39] WP has a nifty community-based compatibility report for each extension for each release, and their system can actually search and install extensions directly from the admin interface (but I understand this is not exactly trivial to do)
[19:58:57] johnduhart: hmm, gotcha. so $1 or something
[19:59:02] cneubauer: Yup
[19:59:02] sounds reasonable
[19:59:14] nekohayo: you saw me talking earlier about the task that would take an experienced developer a lot of time? that sort of dashboard.
[19:59:40] nekohayo: but like sumanah said - the original intents of the two projects provide for varying base needs and foundations to work from as the projects are more widely used - I'm not sure the original developers of MediaWiki envisioned a community of literally thousands of sites outside the Wikimedia umbrella running, and in part depending on, the software - although I could be wrong :)
[19:59:46] it's now been about an hour since we started, so I need to direct my attention elsewhere for a little bit...
[20:00:00] varnent: you are not wrong. MediaWiki architecture document/text gives the history.
[20:00:01] cneubauer: Then when you use the message you'd populate the params with wfMessage('msg', 'paramone', ....) or whatever flavor of message generation you prefer :)
[20:00:07] are there any other immediate questions?
[20:00:16] otherwise I'd also like to open for feedback on these sessions
[20:00:21] I have some questions for extension developers
[20:00:34] johnduhart: those would be great as well
[20:00:37] Next: it looks like in 1.18, QuickTemplate was replaced by BaseTemplate and you now need to call printTrail explicitly to get the resource loader stuff to load at the bottom of the page. Is that documented somewhere for skin developers?
[20:00:42] folks, you can subscribe to @MediaWikiMeet on Twitter or Identi.ca to be told of upcoming meetings like this (plus bug triages and hackathons and trainings)
[20:00:44] <Forestft> What recommendations are there for a tool that can be used to inspect variables and function calls within MediaWiki without disrupting the wiki itself? I ask this because installing a debugger sounds like a hassle, so I'm writing an incredibly basic special page that I can edit to investigate the contents of various variables, etc.
[20:00:56] *johnduhart will hold his question for now ;)
[20:01:06] Forestft: Good question!
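The message-parameter approach suggested to cneubauer above could look like the following sketch. The message key and variable names are hypothetical; the point is that a $1 placeholder in a static i18n entry replaces the removed addMessage() for dynamic text (wfMessage() is 1.18+).

```php
<?php
// In the extension's i18n file (e.g. MyExtension.i18n.php): the message
// is registered statically, with $1 standing in for the dynamic part.
$messages['en'] = array(
	'myextension-status' => 'The job finished with status: $1',
);

// At runtime, the value that would once have been baked into a
// dynamically added message is passed as a parameter instead:
$out->addWikiMsg( 'myextension-status', $statusFromCallback );

// or, with the wfMessage() builder mentioned in the discussion:
$text = wfMessage( 'myextension-status', $statusFromCallback )->text();
```

For extensions that must also run on pre-1.18 wikis, wfMsg()/wfMsgExt() take parameters the same way, so the i18n file needs no change.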
[20:01:12] <Forestft> (it's a little off topic, but it's been bugging me)
[20:01:28] would you be willing to announce future ones on wikitech-l as well?
[20:01:40] Forestft: I just added a debugging toolbar to MediaWiki in 1.19, you can enable it using $wgDebugToolbar
[20:01:48] <Forestft> Johnduhart: thanks. I was worried it was too noobish.
[20:02:10] cneubauer: I've got your question on my list - will make sure we get to it
[20:02:14] <tr|nn|> lleworden: +1
[20:02:18] johnduhart: great to hear!
[20:02:20] <olenz_> johnduhart: Whoa!
[20:02:25] varnent: Thanks
[20:02:33] Forestft: It's still a work in progress but you can call it using MWDebug::log('My message')
[20:02:37] leeworden: yes - if we didn't this time - that was an accidental oversight on my part
[20:02:45] I'm planning on adding variable dumping real soon
[20:02:50] <Forestft> That sounds terrific. Will MWDebug work in 1.18?
[20:02:50] leeworden, tr|nn|, I'm fairly sure these were mentioned on mediawiki-l and wikitech-l
[20:03:00] Forestft: No, it's a 1.19 thing :(
[20:03:05] <tr|nn|> johnduhart: that sounds great!
[20:03:28] leeworden: tr|nn|: and varnent is enough of a publicity hound that he will do so again, I am sure :-) also mediawiki-enterprise
[20:03:35] Forestft: You could create a wrapper in your extension to see if MediaWiki is 1.19, and then call MWDebug if it is
[20:03:42] sumanah: for sure, that's how I heard about it - it'd be helpful to me to announce future ones there too, not only on twitter and identi.ca.
[20:03:50] yes - I can promise you'll hear about the future ones :)
[20:03:51] thanks!
[20:03:52] and we wll.
[20:03:53] will*
[20:04:04] <Forestft> JohnDuhart: that sounds good. In the meantime, I'll just use my little special page. Thanks.
[20:04:19] okay - cneubauer: Next: it looks like in 1.18, QuickTemplate was replaced by BaseTemplate and you now need to call printTrail explicitly to get the resource loader stuff to load at the bottom of the page. Is that documented somewhere for skin developers?
[20:04:29] Dantman: ^^ in case you came in :)
[20:05:17] cneubauer: I think the simple answer is that there is not yet any documentation for it
[20:05:38] ok
[20:05:38] cneubauer: as unsatisfying as that is...
[20:05:41] thanks
[20:05:56] If you're interested you can see the debug toolbar in use here: http://test.wikimedia.deployment.wmflabs.org/wiki/Wikipedia:Current_events
[20:05:58] that said - I'll also commit to nudging Dantman about one for that
[20:06:07] (the /topic used to mention a meeting that is now past)
[20:07:06] Now, my questions to extension developers:
[20:07:32] Are you aware of ContextSource and what it does? And are you using it?
[20:07:47] <Nikerabbit> yes, no
[20:07:53] <Amgine> no, no
[20:07:56] <tr|nn|> no, no
[20:07:56] No, no
[20:08:01] No, no
[20:08:13] no, no
[20:08:19] <Osnard> no, no
[20:08:32] <Forestft> #debug toolbar: sweet!
[20:08:33] johnduhart: good question!
[20:08:38] hehehe
[20:08:51] <Forestft> no, no
[20:09:00] and I'll just leave this here as I go /away for a bit: mediawiki-l
[20:09:23] So, in a nutshell: ContextSource is a replacement for our global variables, like $wgRequest and $wgUser
[20:09:34] Yea!
[20:09:35] ialex: we're talking about Context
[20:09:57] Let me pull up an example
[20:10:08] *varnent adding ContextSource as a future workshop topic
[20:10:15] johnduhart: this is helpful - please continue :)
[20:10:35] http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/ServerAdminLog/specials/SpecialAdminLog.php
[20:10:48] $out = $this->getOutput;
[20:11:16] So, instead of getting the OutputPage with $wgOut, I'm calling the getOut function
[20:11:16] johnduhart: yes?
[20:11:16] <toniher_casa>	varnent, thanks, I have a tricky (I think) case here about how to use it: Special:Code/MediaWiki/107877 [20:11:29] 	ialex: Just in case you want to chime in here [20:11:31] 	johnduhart: getOutput, not getOut [20:11:36] 	ialex: heh [20:12:01] 	https://svn.wikimedia.org/svnroot/mediawiki/trunk/phase3/includes/context/IContextSource.php [20:12:12] 	That's the interface for all context sources [20:12:19] 	All of them implement those methods [20:12:37] 	toniher_casa: excellent - ty - noted  :) [20:13:18] 	The reason we want to use Contexts is because, A. globals are evil and B. It allows for much easier testing, since we don't have to fiddle with globals to insert test data [20:13:25] 	ialex: right? [20:13:55] <Osnard>	So SpecialPage implements IContextSource? [20:14:09] 	Osnard: Yup [20:14:12] 	johnduhart: much more A that B for me [20:14:15] 	Question: in what version of MediaWiki was ContextSource introduced? [20:14:21] 	1.18 [20:14:23] 	ialex: same :p [20:14:32] <tr|nn|>	nice! :) [20:14:34] <Osnard>	What else does implement this interface? [20:14:50] 	Osnard: As of 1.19 API modules have context [20:15:08] 	Osnard: I don't think SpecialPage implements the interface directly, but it has the same methods (except the new getWikiPage) [20:15:27] 	right [20:15:53] <Osnard>	Is the implementation in SpecialPage different from the ones in the API modules? [20:16:02] 	I don't have a listing of what and where it's available right now, you'll have to poke around. 
[20:16:08] Osnard: Nope, same thing
[20:16:27] <Osnard> Sounds like duplicate code
[20:16:39] Action doesn't implement it either, but it could easily, by adding getWikiPage
[20:16:57] Osnard: How so?
[20:17:13] Osnard: SpecialPage::getTitle is not compatible with ContextSource's one
[20:17:58] <Osnard> johnduhart: We seem to have the same implementation in SpecialPage and the API modules
[20:18:26] <Osnard> johnduhart: Maybe I'm getting this wrong
[20:18:30] Osnard: How is that a bad thing, consistency should be embraced
[20:18:45] Osnard: Are you thinking we just copypasted the methods in?
[20:19:15] <Osnard> johnduhart: It sounded like this, yes.
[20:19:22] Ah
[20:19:23] Osnard: https://svn.wikimedia.org/svnroot/mediawiki/trunk/phase3/includes/context/ContextSource.php
[20:19:31] ApiBase extends ContextSource
[20:19:38] So really you can add a... one second
[20:20:05] <Osnard> Okay :)
[20:20:10] johnduhart: did you have other questions? that was helpful :)
[20:20:30] So really you can add context to anything by extending ContextSource
[20:20:33] <Osnard> No thanks. Didn't want to interrupt
[20:20:35] *johnduhart had to open the door for someone
[20:21:31] So does anyone see themselves using this? Or have questions about when and how to use it?
[20:21:44] *varnent sees self using this
[20:21:46] thanks for the insight folks, gotta run for now
[20:21:51] Nikerabbit: You said you knew about it but didn't use it, why's that?
[20:22:05] nekohayo: thank you for stopping by!
[20:22:10] <olenz_> Well, how about a RL primer?
[20:22:22] johnduhart: someone changed SpecialPage to use ContextSource but it was reverted
[20:22:23] nekohayo: a wikipage will also be posted on Project:WikiProject Extensions/MediaWiki Workshops with follow-up info on this session
[20:22:26] What about parser functions and such? Do they have a way of accessing ContextSource?
[20:22:50] 	for my own archives I posted a summarized log of my discussion on my particular issue in the comments of http://jeff.ecchi.ca/blog/2012/01/12/spring-clean-up-in-january/ [20:23:08] 	excellent [20:23:10] 	greenmanrb: no, Parser is different [20:23:13] <Nikerabbit>	johnduhart: mostly because of old habit and BC issues [20:23:25] <tr|nn|>	compatibility with older mw releases could be a reason to not use ContextSource -- is there a statistic out there about usage of different mw-versions? [20:23:39] <Nikerabbit>	I'm still targeting 1.17 and newer (which is a lot less than most extension devs) [20:23:51] 	Nikerabbit: heh, and yeah that's a problem with CS. Hopefully in a year that'll be a different story [20:23:52] 	Does that mean that for now, parser functions and similar extensions will still have to use the old globals? [20:23:54] 	Can I call ContextSource statically? [20:24:01] 	mglaser: no [20:24:02] 	mglaser: No [20:24:21] 	Ok. [20:24:32] 	mglaser: the point of ContextSource is to get it from somewhere, otherwise it's useless [20:24:33] 	mglaser: You can get a Context with RequestContext::getMain if you cannot get a context. [20:24:56] 	RequestContext::getMain should be avoided as much as possible [20:24:58] 	johnduhart that's what I was aiming at. thanks [20:25:09] 	And I agree with ialex [20:25:41] 	so it's not a way to get rid of all globals. Esp. the configuration variables remain. right? [20:25:42] 	But it's still possible to get an object if you can't inherit for some reason. OK; thanks. [20:25:50] 	mglaser: correct [20:26:13] 	tr|nn|: http://s23.org/wikistats/ theoretically has some stats - but I can't speak to its stability / accuracy / etc [20:26:16] 	gotit [20:26:20] <Forestft>	Thanks for the help, all. Got to go. 
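(Editor's note: the last-resort pattern ialex and johnduhart describe — accept a context from the caller, and fall back to RequestContext::getMain only when none can reach the code — might be sketched like this. The function name is hypothetical; this is an illustration, not code from the discussion.)

```php
<?php
// Sketch: a hypothetical function showing the backwards-compatible
// fallback. Prefer a context passed in by the caller; use
// RequestContext::getMain() only when no context is available,
// since (per the discussion above) it should be avoided where possible.
function showGreeting( IContextSource $context = null ) {
	if ( $context === null ) {
		// Last resort for code that cannot inherit or receive a context.
		$context = RequestContext::getMain();
	}
	$context->getOutput()->addWikiText(
		'Hello, ' . $context->getUser()->getName()
	);
}
```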
[20:26:30] 	mglaser: Maybe one day ;) [20:26:38] 	;) [20:26:52] 	johnduhart: that should be another object, like say SiteContext :) [20:26:59] 	I want to be mindful of people's time - we're absolutely welcome to continue this - and I encourage folks to do so - but I also want to thank everyone for showing up - we'll be hosting more of these in the future [20:27:17] 	RequestContext should extend SiteContext in that case, imho [20:27:40] 	ialex: Well ^demon was working on configuration [20:27:41] 	There will be more info on future sessions, and recap info on this one, available at: Project:WikiProject Extensions/MediaWiki Workshops   - you're also encouraged to edit that wikipage with your ideas, etc.  Everyone is also welcome to participate in Project:WikiProject Extensions [20:27:48] 	johnduhart: I know [20:27:51] 	I understand that RequestContext::getMain should be avoided when possible, but is using it still preferable to using the old globals? [20:28:00] *varnent 	continue about your regular conversing now :) [20:28:05] 	greenmanrb: not really [20:28:27] *ialex 	even prefers old globals over RequestContext::getMain [20:28:48] 	ialex: It's really a fallback for BC, right? [20:28:54] 	johnduhart: yes [20:29:03] 	right [20:29:18] 	until we can pass RequestContext everywhere we need [20:29:27] <tr|nn|>	thx varnent for the stats-page -- it is still loading ;) [20:29:37] 	but this will take a long, long time [20:29:50] 	OK, understood. [20:30:21] 	I'm popping back in a moment to say: those of you who develop extensions, you are welcome to put them in the main Wikimedia SVN repository [20:30:25] 	Commit access requests [20:30:31] *sumanah 	goes back out to a spreadsheet [20:32:35] 	Well, anything else? [20:33:14] 	varnent: Think it's time to wrap up [20:34:12] <Amgine> [20:34:13] 	agreed [20:34:20] 	Thanks again for holding this. Great info [20:34:31] 	so with that we're officially wrapped on this session - thank you all for attending!!! 
:) [20:34:34] *johnduhart 	has to catch a bus [20:34:38] *johnduhart 	floats away [20:34:48] 	thanks _varnent [20:34:49] <tr|nn|>	thx to everyone for this workshop :) [20:35:06] <olenz_>	bye, and thanks guys! [20:35:08] <tr|nn|>	please keep the workshops coming regularly! [20:35:08] <Osnard>	yes, thanks everybody [20:35:24] 	thank you - we will - this was a great first trial run :) [20:37:59] 	bye for now [20:39:32] <Osnard>	bye!