The Wikimedia annual development community meet-up — the Wikimedia Hackathon — was held in Zürich, Switzerland in 2014 from May 9-11.
It was a long weekend filled with hacking anything related to MediaWiki or one of the Wikimedia projects (and sometimes other things, too). The Hackathon events are completely open; we welcome both seasoned and new developers, as well as people working on MediaWiki, tools, pywikibot, gadgets, extensions, templates, etc.
The venue and accommodation were both at the same site: Youth Hostel Zürich, Switzerland. All conference rooms were available to us, including a nice yard. We had a big main conference hall and three workshop rooms, and there was a beer bar in the foyer of the Youth Hostel.
Regular contributors plus some newcomers hack together with a shared goal.
Many regular users end up reading one article on Wikipedia (reached through Google Search) plus the other articles they need as prerequisites.
It would be interesting to have a recommendation engine for Wikipedia, Wikiquote and Wikisource.
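To make the idea concrete, here is a deliberately tiny sketch of one possible approach, assuming articles are represented by the set of pages they link to (all article names and link data below are made up): candidates are ranked by Jaccard similarity of their link sets with the article just read.

```python
# Toy sketch of a link-overlap recommender (all data hypothetical):
# each article maps to the set of pages it links to, and we rank
# candidates by Jaccard similarity with the article just read.

def jaccard(a, b):
    """Jaccard similarity of two sets (0.0 when both are empty)."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def recommend(just_read, link_sets, limit=3):
    """Rank other articles by link overlap with the one just read."""
    source = link_sets[just_read]
    scored = [
        (jaccard(source, links), title)
        for title, links in link_sets.items()
        if title != just_read
    ]
    scored.sort(reverse=True)
    return [title for score, title in scored[:limit] if score > 0]

links = {
    "Zürich": {"Switzerland", "Limmat", "Canton of Zürich"},
    "Geneva": {"Switzerland", "Rhône", "Canton of Geneva"},
    "Limmat": {"Switzerland", "Zürich", "Canton of Zürich"},
    "Tokyo": {"Japan", "Honshu"},
}
print(recommend("Zürich", links))  # Limmat shares the most links
```

A production version would of course work from the real pagelinks tables and need to handle scale, but the ranking idea stays the same.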
Our goal is to make a handful of high-priority Puppet roles available in MediaWiki-Vagrant, bringing developer instances of Vagrant closer to a production-like environment. The ultimate idea is to provide a uniform, developer-focused instance to MediaWiki engineers to facilitate better testing of new and existing code, while also making it easy for new engineers to get started hacking on MediaWiki and reducing the friction of bringing new features to production. The first hurdle will be to provide some crucial production services in MediaWiki-Vagrant, which will be our focus for the Hackathon. Possible services to focus on include
The idea is to bring the pin map templates, GeoHack, WikiMiniAtlas, OpenStreetMap, Extension:Maps, Special:Nearby, mobile, the coordinates extension and so on closer together in a way that works for mobile, desktop, micro and macro contributions. It would be handy if we could do some exploratory work beforehand to create a more focused effort during the hackathon: Zürich Hackathon 2014/Geo Namespace.
There are topics for meetings/sprints with a wider developer/designer/GLAM advocate community within the Wikimaps project. We are hoping to set up a corner for continuous workshopping, plus a kick-off discussion session near the beginning of the event.
Template:Map for the archival metadata, geolocation and temporal properties of maps, for inclusion in the GWToolset.
Help us make the documentation of our APIs better! We're in the process of building a Data & Developer Hub that can help inspire developers everywhere to build new things using data available from Wikimedia projects and be a central location with documentation that is clear and helpful.
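As a taste of what such documentation has to explain, here is a minimal sketch of building a MediaWiki action API request with only the standard library. The endpoint and parameters (`action=query`, `prop=info`, `titles`, `format=json`) are the public, documented ones; no request is actually sent here.

```python
# Sketch: constructing a MediaWiki action API query URL.
# The API separates multiple page titles with '|'.
from urllib.parse import urlencode

def build_query_url(endpoint, titles):
    """Build an action=query URL asking for basic page info as JSON."""
    params = {
        "action": "query",
        "prop": "info",
        "titles": "|".join(titles),
        "format": "json",
    }
    return endpoint + "?" + urlencode(params)

url = build_query_url("https://en.wikipedia.org/w/api.php",
                      ["Zürich", "MediaWiki"])
print(url)
```

Clear, central documentation means a developer can go from this URL to a working tool without reverse-engineering parameter names.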
Bug triaging refers to improving the quality of bug reports and enhancement requests in bugzilla.wikimedia.org. The exact triage topic of this sprint remains to be defined (no coding is planned, but nobody would stop you either). Which bug reports in which area would you like to look at together? Or bring your existing bug reports - we can give them a shot or discuss best practices for triaging them and keeping them in good shape!
The GLAMwiki Toolset makes it easier for libraries, archives and museums to make large contributions of content to Wikimedia Commons. Get to know the tool and help us create a prioritized development roadmap for it.
We also need additional developers to help us continue to improve the tool and handle incoming bugs. If you’re interested please come by and ask us how you can get involved.
We plan to publish another release candidate of MediaWiki 1.23 during the Hackathon. However, there are still a few known bugs: see Bugzilla. The hackathon will be a great chance to bring a few people together to tackle these issues.
Snuggle (en:WP:Snuggle) is a late (or at least not the earliest) adopter of OAuth, and that's a shame. There are also requests to extend support to the Portuguese and Italian Wikipedias. In this sprint we'll discuss & design UI changes to support an OAuth handshake with MediaWiki. In parallel, we'll also be extending Snuggle's configuration to support the IT and PT wikis. Italian and Portuguese Wikipedians are needed to help with localization.
A lot of open government data is becoming available with interesting content that can be used in Wikimedia projects, for example: data from polls and referendums, the population of cities and municipalities, or the public art in cities (interesting for Wiki Loves Public Art). How can the data from these portals be transferred to the Wikimedia projects automatically? For example, a script on the tool server could check daily whether some numbers have changed; if so, it transfers the data to Wikidata, and from there the data is included in the infoboxes of the Wikipedia articles. Or how can lists of public art or of other objects (e.g. fountains) on Wikipedia be updated automatically when a new object appears or an object has been removed?
As a concrete task, we will promote the integration of demographic data from the open data platform () into Wikidata, so that the data can be used in the articles about the specific districts, e.g. in the German article on the Rathaus district.
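The daily check described above could look something like the following sketch. All names and figures are hypothetical; a real script would fetch from the open data portal and write via the Wikidata API, while this only shows the change-detection step: compare freshly fetched figures against the values last pushed and report only what changed.

```python
# Hypothetical sketch of the daily change check for open-data sync.
# A real script would fetch `fresh_fetch` from the portal and push
# `updates` to Wikidata; here both sides are toy dictionaries.

def changed_values(previous, current):
    """Return {key: new_value} for entries that are new or different."""
    return {
        key: value
        for key, value in current.items()
        if previous.get(key) != value
    }

last_pushed = {"Rathaus": 3100, "Lindenhof": 980}
fresh_fetch = {"Rathaus": 3170, "Lindenhof": 980, "City": 1250}

updates = changed_values(last_pushed, fresh_fetch)
print(updates)  # only the changed and new figures need an edit
```

Pushing only the diff keeps the bot's edit volume (and the pressure on Wikidata) proportional to what actually changed.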
RACHEL (www.rachel.worldpossible.org) now reaches over one million users worldwide who lack access to reliable internet. The project started as a weekend prototype by just a few Cisco employees years ago. Please consider lending your web development or Perl skills (dev.worldposible.org), or your Linux ideas, to help us enhance this fantastic product (as seen on BBC, CNN, RaspberryPi.org and more). Currently, including a foreign-language Wikipedia requires running a second webserver dedicated to Wikipedia, an inefficient use of resources in places where they are scarce. Help us integrate foreign-language Wikipedias without running a second server (biblioteca.worldpossible.org).
Jeremy Schwartz <firstname.lastname@example.org>
Wikibase architecture overview
An introduction to the Wikibase architecture and components, aimed at potential new contributors. Wikibase is the software behind the Wikidata project. In this session you will get a high-level overview of the wider Wikibase codebase, which functionality can be found where, and how the different parts interact. Novice developers can attend; no special knowledge is required.
Full title: Clean Code, and other requirements for contributing to Wikibase
This is an introduction to the topic of clean code. It covers basic design principles, effective use of tests, and many general best practices. Want to know how to write code that is easier to maintain? Want to avoid spending so much time in the debugger? Want to write code that reads like well written prose? Want to become more effective at the craft of software development? Then this session is definitely for you.
This introduction is broad and covers a lot of ground. While many topics will not be covered in the depth they deserve, references to further material will be provided. The focus is on the most common problem points and practical solutions.
A lot of the topics that will be covered are part of the contribution guidelines for the Wikibase software. Examples from Wikibase will also be used in places, though this session is by no means Wikibase specific.
Is mobile a mystery to you? Can't get MobileFrontend set up? Don't understand why your favourite desktop features don't work on mobile? Want to know how to work on the latest Wikipedia apps? Do you want to make mobile things but don't know how? Does one of your projects look great on a desktop and terrible on a mobile device?
This workshop will be an unstructured session where you can ask all these questions and get some answers. Members of the mobile team will be available to answer your questions and give you hands on help on anything you need.
mediawiki.ui implements the evolving "Agora" visual style for buttons and forms in MediaWiki software. Learn how to apply it in your extensions and gadgets to deliver an attractive, consistent appearance. Actual real-life visual designers will be on hand to give you advice and gather feedback about what controls would be useful.
Flow is a modern discussion and collaboration system for WMF wikis. However, Flow pages aren't talk pages, so bots and semi-automated editing tools (e.g., Huggle, Twinkle) will have to adapt to handle them. Come explore the new Flow API and help us make improvements that will support some of the most important processes on our projects!
Introduction to using MediaWiki-Vagrant to manage a development environment for hacking on MediaWiki. We'll do a really quick high level look at what Vagrant is and how MediaWiki-Vagrant uses and extends it. Then we'll learn just enough Puppet to understand what "roles" are and how to use them to configure your MediaWiki-Vagrant instance. Participants will then get hands on by creating roles to install and configure extensions used on Commons that are not yet available in MediaWiki-Vagrant.
An event-long workshop with Coren; hands-on migration, debugging and creation of tools on the Tool Labs. Join in and leave at any time; there will be impromptu and planned breakaway tutorials on various related topics as interest demands (planned: using gridengine tutorial, database query optimization (including federated tables), how to deploy web services).
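For the query-optimization breakaway mentioned above, the core idea can be shown in a few lines: the same lookup goes from a full table scan to an index search once an index exists. This self-contained illustration uses Python's sqlite3 purely for demonstration (Tool Labs databases are MariaDB, and the table is made up), but the principle carries over.

```python
# Demonstration of why indexes matter, using SQLite's query planner.
# Table and data are hypothetical; the point is the plan change.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE edits (user TEXT, page TEXT)")
conn.executemany("INSERT INTO edits VALUES (?, ?)",
                 [("alice", "Zürich"), ("bob", "Geneva")])

def plan(sql):
    """Return SQLite's query plan description for a statement."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(row[-1] for row in rows)

query = "SELECT page FROM edits WHERE user = 'alice'"
before = plan(query)                      # full table scan
conn.execute("CREATE INDEX idx_user ON edits (user)")
after = plan(query)                       # now searches via the index
print(before)
print(after)
```

On the replicated production tables, checking the plan (with `EXPLAIN` on MariaDB) before running a tool's query is often the difference between milliseconds and minutes.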
Pywikibot is the most popular framework for running bots, but how can we help improve it? There are several ways to help, including reporting and solving bugs, bug wrangling and triage, and development areas such as support for Wikibase and Wikidata, porting functionality from compat to core, and network optimization (in order to reduce pressure on WMF servers).
Multichill: I would like to see a session on the future of Pywikibot (like this), but I'm not sure we have the right people at this hackathon. Maybe we should move that part to the Wikimania hackathon.
Antoine "hashar" Musso would like to set up an integration test that runs Pywikibot against MediaWiki core master and wmf branches.
whym - interested in adding more tests for scripts I use.
Nikerabbit is there to explain how the Translate extension works. Let's fix your most wanted (or hated) Translate bugs together or work on new features.
Possible bigger topics to work on, depending on the interest:
Repository management. Translatewiki.net uses a collection of shell scripts to manage all the repositories that translations are exported to and imported from. The lack of better repository management is blocking further progress in areas like automation of imports and exports.
How will page translation work with Parsoid and VisualEditor?
Let's provide more useful statistics about translation activity in a nicer format.
Translation memory improvements.
The Translate extension provides a translation editor, support for many different types of content, translation memory, statistics and lots more. It is also a very old and big extension, even while adopting new technologies like Composer and a grid-based CSS interface.
This session will show how to refactor parts of MediaWiki core in order to improve modularity. The focus will be on backwards compatibility and testability. The TitleValue RFC will serve as an example.
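The pattern behind the TitleValue RFC can be sketched in a few lines. This is a language-neutral illustration in Python (the RFC itself concerns PHP, and the class and method names below are simplified): a dumb, immutable value object plus a separate formatting service whose dependencies are injected rather than read from global state, which makes both trivially testable.

```python
# Sketch of the value-object + service split that TitleValue
# exemplifies. Names are illustrative, not the real MediaWiki API.
from typing import NamedTuple

class TitleValue(NamedTuple):
    """Immutable page identifier: namespace id plus DB-key text.
    It has no behavior and no knowledge of configuration."""
    namespace: int
    dbkey: str

class TitleFormatter:
    """Service that renders TitleValues for display. Namespace names
    are injected, so tests never need a global wiki configuration."""
    def __init__(self, namespace_names):
        self.namespace_names = namespace_names

    def format(self, title):
        prefix = self.namespace_names.get(title.namespace, "")
        text = title.dbkey.replace("_", " ")
        return f"{prefix}:{text}" if prefix else text

formatter = TitleFormatter({0: "", 2: "User"})
print(formatter.format(TitleValue(2, "Example_Name")))  # User:Example Name
```

Because the formatter takes its namespace table through the constructor, a unit test can exercise it with a two-entry dictionary instead of bootstrapping an entire wiki, which is exactly the kind of testability win the session is about.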
It's been two years since we set up the current infrastructure around developer tools with Gerrit and Jenkins. Not everyone's happy, and things break more often than we'd like. Additionally, there have been rumblings for years about getting rid of Bugzilla. I think it's high time we discuss what we envision our ideal development environment to be and figure out what it would take to get us there. Are there any tools (hint hint: Phabricator) that can help get us most of the way there?
A continuation of our past architecture meetings, where we discuss the future of MediaWiki's internal design. Specifically, in Sunday's discussion we will go over the Performance guidelines, answer the last questions, polish the last rough edges, and hopefully remove the draft tag.
A workshop (preferably in the early evening) to invite local users, beginners but also power users, who want to learn more about MediaWiki, our infrastructure and more. Kind of an "Ask the Developers" session, but more informal.
Proposals that seem to need clarification. Feel free to move them into any of the categories above.
Local Wikipedia forks
"Semi-autonomous instances of localized Wikipedia". Expand on the work of SOS Children and the RACHEL project to create packaged snapshots of Wikipedia content that can be deployed on local storage, e.g. SD cards in a tablet computer, a 3G WiFi router, a Raspberry Pi, etc. Ideally there will be bi-directional updates from the local instance to Wikipedia and from Wikipedia to the local instance - asynchronous synchronization. People using one of these snapshot instances would be able to create new content and revise existing content. One of the main goals is to help extend the reach, locality and relevance of Wikipedia for people throughout the world.
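The asynchronous synchronization idea can be sketched very roughly as follows. All structures here are hypothetical: each side keeps a (timestamp, text) pair per page, and a sync pass merges both directions by taking the newer revision. A real design would need proper conflict handling and revision history, not just last-write-wins.

```python
# Toy sketch of bi-directional snapshot sync (last-write-wins).
# Each side is a {page: (timestamp, text)} map; a real system would
# track full revision histories and resolve conflicts more carefully.

def merge(local, remote):
    """Merge two {page: (timestamp, text)} maps; newest revision wins."""
    merged = dict(remote)
    for page, rev in local.items():
        if page not in merged or rev[0] > merged[page][0]:
            merged[page] = rev
    return merged

wikipedia = {"Zürich": (1399600000, "city article"),
             "Limmat": (1399500000, "river article")}
snapshot = {"Zürich": (1399400000, "older city article"),
            "Local_School": (1399650000, "new local article")}

synced = merge(snapshot, wikipedia)   # both sides converge to this
print(sorted(synced))
```

Note how the locally created page survives the merge while the stale local copy of an upstream article is replaced, which is the behavior the bi-directional updates above describe.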
I've been doing some work in this area already and will bring hardware, etc. to demo and test our work.
There are lots of known unknowns for this project. With your help I hope we'll be able to
There's an initiative to rework MediaWiki.org. We should discuss the plans and outcomes in order to make MediaWiki.org a site that is easy to use and valuable to all of us, be it users, site maintainers, core and extension developers or any other interested parties. Let's go over the new design and make this the state-of-the-art site that software as widespread as MediaWiki deserves!