You can now edit the structure of a table, adding or deleting rows and columns, and perform other common tasks like merging cells and adding captions. VisualEditor now supports keyboard shortcuts like entering "* " at the start of a line to make a bullet list; if you didn't mean to trigger the "smart" sequence, pressing undo will restore what you typed. Most wikis now have VisualEditor available as an opt-in tool, whereas previously communities had to ask for it to be switched on.
The toolbar's menus in VisualEditor now open in a collapsed, short form, with uncommon tools shown only when requested. You can now create and edit simple "blockquoted" paragraphs for indenting. You can now use a basic editor for gallery and hieroglyphic blocks on the page. Category editing was enhanced in a few ways: adding a redirect category now adds its target instead, and categories without a description page now show as red links. We improved compatibility with some variations of how wikis use the Flagged Revisions system. Armenian-language users now get customised bold and italic icons in the toolbar; if your language would benefit from customised icons, please contact us.
In November, the Editing team continued their work on the front-end standardisation project and VisualEditor, both of which are reported separately. The team made some improvements to the ResourceLoader library inside MediaWiki core, as part of their wider work to bring the OOjs UI library into MediaWiki. A volunteer contribution adding high-quality (SVG) versions of the toolbar icons to WikiEditor was merged, but later removed because of quality issues; we will re-do this soon. The team led continuous integration work to move the existing unit testing system for MediaWiki from production slaves to virtual machines in Wikimedia Labs, as well as CI improvements for the citoid and MobileFrontend projects. The team also made a number of improvements to the Vector, Monobook and Apex skins.
In November, the Parsoid team continued to work through the big blockers to using Parsoid HTML for read views. We made further progress in customizing the Cite extension via CSS, and started work on supporting templates that Parsoid does not yet handle properly. These templates, used on a subset of Wikipedia pages, generate a table's attributes as well as its content, and so do not fit well within the DOM-based model that Parsoid works with. We expect both these blockers to be lifted by early January, which will significantly further our goal of serving read views via Parsoid's HTML. Besides this, we continued ongoing code cleanup, maintenance, bug fixes, and regular deployments.
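To illustrate the problem (with a hypothetical template, not one of the actual blockers), consider wikitext where a template expands to part of a table rather than to a self-contained element:

```
{| {{fancy table attribs}}   <!-- expands to: class="wikitable" -->
! Name !! Score
{{score row|Alice|42}}       <!-- expands to: |- followed by | Alice || 42 -->
|}
```

Because each template here expands to a table fragment, its output has no well-formed DOM subtree of its own, which is what makes such templates awkward for Parsoid's DOM-based model.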
In November, the Flow team completed our first conversion of LiquidThreads (LQT) pages into Flow pages, on the private Wikimedia Office wiki. The team now has the ability to turn LQT pages into Flow boards, keeping the items in history and user contributions intact, including edits made to LQT posts. OfficeWiki also has existing wiki talk pages, and the team prepared a conversion script to archive the existing pages, with a prominent link to the archives in the new Flow board headers. The conversion of all talk pages on OfficeWiki will happen in early December.
The team created a new Flow dashboard in Limn to chart usage across all projects. This is a helpful baseline for comparison when we make changes and build new features. We're also working on implementing EventLogging in Flow for the first time. Feature work included front-end work on a Table of Contents feature, and back-end work supporting an upcoming Search feature.
This month the team ran tests of the WikiGrok interface (versions A and B) for readers in beta, in preparation for launching a logged-in test on the stable mobile site this quarter. To prepare for testing in production, we also expanded our question set to include simple claims that are easier to generate. To support our continued testing and data-driven decision making, the team also overhauled many of the mobile dashboards, adding more features and functionality to the set that we monitor for improvement.
The team migrated the translation memory service of the Translate extension to ElasticSearch. Thanks to WMF's ElasticSearch cluster, this migration increases the speed and reliability of the service. We have identified one issue with the suggestions, which is being fixed during December. Thanks to Chad and Nik for helping Niklas.
Last, the team also made RTL fixes in MobileFrontend and VisualEditor.
The third version was released; it includes several enhancements and fixes. New features include a simple first version of the translation dashboard for viewing, loading and saving one's own translations, and the ability to save ongoing translations. The deployment of the Content Translation database is currently in progress. On completion, users will be able to use the newly added dashboard to save and resume translations of unfinished articles. Collaboration continues with the Technical Operations team to prepare the tool for deployment on a production Wikipedia as a beta feature.
The Machine Translation service code was refactored to make it more extensible for other languages and translation services. As an experiment, the Yandex machine translation service was tested. Several fixes related to template adaptation were done. The language selector and the top-bar in the editing interface have been redesigned.
The fourth release is currently underway with a specific goal to prepare the tool for deployment as a beta feature in January.
Rollout is ongoing. The plan is to deploy HHVM on 100% of app servers by Christmas, barring unforeseen issues. Tim Starling and Giuseppe Lavagetto isolated and resolved an issue with OAuth: the Authorization header was not available to HHVM via apache_request_headers() due to a mod_proxy_fcgi bug. It was resolved by re-injecting the header via a SetEnvIf directive in the Apache config.
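The workaround can be sketched as follows (an illustrative Apache snippet, not the exact production configuration): mod_proxy_fcgi strips the Authorization header before it reaches the FastCGI backend, so SetEnvIf copies it into an environment variable that HHVM can then surface through apache_request_headers().

```apache
# Re-inject the Authorization header that mod_proxy_fcgi drops,
# so the FastCGI backend (HHVM) still sees it.
SetEnvIf Authorization "(.*)" HTTP_AUTHORIZATION=$1
```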
Tim Starling isolated and fixed an API throughput issue (phab:T758). Shelling out to tidy didn’t perform well on HHVM. It was solved by making the tidy extension for PHP compatible with HHVM (at least for essential functionality) and using that instead. This needs to be tested and fully deployed.
Special:GlobalRenameRequest and Special:GlobalRenameQueue, special pages intended to help users, stewards and global renamers process SUL finalization name changes, are just about ready to move into production on Meta. Special:GlobalUserMerge required some more work this month due to database conflicts found on beta labs. Once properly tested, GlobalUserMerge will join the other two tools on Meta to complete the kit needed to process the anticipated large number of rename requests.
Aaron Schulz and Chad Horohoe joined the team. Both are helping update the Profiler classes used to measure the performance of MediaWiki, as part of the Better PHP profiling RFC. Profiling was identified early on as a common entanglement for many generally useful utility libraries found in the MediaWiki codebase. The work in progress here should enable us to remove many explicit wfProfileIn() and wfProfileOut() calls from the MediaWiki PHP code while still getting the benefit of performance measurements via the XHProf profiling library. Profiling via the XHProf functionality built into HHVM is currently in use in both the beta and production clusters, and is helping drive some low-hanging-fruit code improvements.
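As a rough sketch of what sampling a code section with XHProf looks like (illustrative only; MediaWiki's actual Profiler wiring is more involved), the extension brackets a region with xhprof_enable()/xhprof_disable() instead of requiring explicit wfProfileIn()/wfProfileOut() calls inside the measured code:

```php
<?php
// Minimal XHProf sketch. Assumes the xhprof extension (built into
// HHVM) may or may not be loaded, so it degrades gracefully.
$haveXhprof = function_exists('xhprof_enable');

if ($haveXhprof) {
    // Collect CPU and memory counters in addition to wall time.
    xhprof_enable(XHPROF_FLAGS_CPU | XHPROF_FLAGS_MEMORY);
}

// The code being measured needs no profiling calls of its own.
$total = 0;
for ($i = 0; $i < 1000; $i++) {
    $total += $i;
}

if ($haveXhprof) {
    // Returns per-function counters keyed like "main()==>funcName".
    $profileData = xhprof_disable();
}

echo $total, "\n"; // 499500
```

The appeal of this model is exactly what the paragraph describes: the instrumented code stays free of profiling boilerplate.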
Bryan is continuing to work on structured logging changes and is testing a Monolog-based logging pipeline in Beta to replace the current system. MWFunction::newObj has been deprecated and all usage in MediaWiki core replaced with the new ObjectFactory class, which was introduced by the PSR-3 logging changes.
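The pattern behind this change can be sketched as follows (the class and helper names here are illustrative, not MediaWiki's actual ObjectFactory API): an object is constructed from an array "spec" naming the class and its constructor arguments, rather than through a bespoke helper like MWFunction::newObj.

```php
<?php
// Illustrative spec-based construction, the pattern ObjectFactory
// provides. Names below are hypothetical, not MediaWiki's API.

class Greeter {
    private $greeting;
    public function __construct(string $greeting) {
        $this->greeting = $greeting;
    }
    public function greet(string $who): string {
        return "{$this->greeting}, {$who}!";
    }
}

function createFromSpec(array $spec) {
    $class = $spec['class'];
    $args  = $spec['args'] ?? [];
    return new $class(...$args); // splat spec args into the constructor
}

$obj = createFromSpec([ 'class' => Greeter::class, 'args' => [ 'Hello' ] ]);
echo $obj->greet('world'), "\n"; // Hello, world!
```

Keeping construction behind a spec like this makes call sites declarative and easy to replace in tests.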
The cssjanus library has been removed from MediaWiki's core repository and replaced with a Composer managed import from the official upstream. The lessphp CSS pre-processor which was historically manually copied into MediaWiki's git repository is now imported via Composer.
The CDB library originally written by Tim Starling has been extracted to its own git repository and published on Packagist. Both MediaWiki itself and the "multiversion" scripts that are used to manage the WMF wiki family are now importing CDB via Composer instead of the old practice of keeping two copies of the code updated manually in the respective repositories.
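With the library published on Packagist, a consuming project can pull it in with a short Composer requirement instead of copying code by hand (the package name and version constraint below are illustrative):

```json
{
    "require": {
        "wikimedia/cdb": "~1.0"
    }
}
```

Running `composer update` then fetches the library into vendor/ and wires it into the autoloader, so both copies no longer need to be kept in sync manually.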
The simplei18n PHP library, developed for the IEG grant review application and based on code from the Wikimania Scholarships application, was transferred from Bryan's personal GitHub account to the official Wikimedia account.
External dependencies for the BounceHandler and Elastica extensions have been removed from the extension git repositories and replaced with Composer managed imports. For the WMF cluster, these dependent libraries have been added to the mediawiki/vendor.git repository. ExtensionDistributor has been updated to package composer managed dependencies in the tarballs it generates for installing extensions. The php-composer-validate test is now applied to all extensions and skins to validate the syntax of composer.json when changes are uploaded to gerrit.
Several classes have been moved from includes/utils to includes/libs (ArrayUtils, MapCacheLRU, Cookie/CookieJar), which makes them easy candidates for publication as stand-alone libraries in the future. Aaron is working on a list of possible libraries that could be created from the MediaWiki codebase by grouping several useful classes together. This should produce more sustainable projects than literally dozens of single-class libraries.
In November we made significant improvements to the Vagrant development environments, and also conducted a poll of Vagrant users to guide future development. We sorted out issues with HHVM on beta labs, and we made improvements to Jenkins and Zuul for speed and efficiency. We introduced the Ruby linter RuboCop to all of the Jenkins builds, one more step in managing technical debt and improving the quality of our browser test code.
In November the CentralNotice repo acquired its first browser tests. In addition to adding new test coverage, we continue to refactor existing tests across all of our repositories. While the primary purpose of this refactoring is to update all of the tests' assertions from RSpec 2 to RSpec 3 syntax, we are also taking the time to address technical debt and sort out other issues in the tests, such as removing inefficient code and replacing explicit wait statements with dynamic wait-for statements. This not only improves the speed at which the tests run, but also helps immensely with maintenance and usability in the future.
In November 2014, after releasing requested improvements, the multimedia team started shifting its focus away from Media Viewer and onto bugs in the upload pipeline and UploadWizard.
The team attended the Amsterdam Hackathon, with a focus on GLAMs and structured data. Work there included a working prototype of what entering structured data on Commons might be like; research and groundwork on tracking per-file views (a long-standing request from GLAMs), in preparation for Erik Zachte and Christian Aistleitner's RfC; and parsing image annotations so that they may be displayed in Media Viewer in the future.
The team's focus on Media Viewer has significantly reduced after releasing the last round of improvements that came out of the community consultation. The project has moved to maintenance mode, taking care of major bugs that need immediate attention. The team has provided support for the file metadata cleanup drive and will continue to do so, in order to improve the accuracy of the metadata displayed in Media Viewer.
UploadWizard has seen numerous code cleanup improvements, as well as the trimming down of a few obscure legacy features. This refactoring effort is already making UploadWizard easier to maintain, which supports the team's goal of fixing bugs and improving the efficiency of the upload pipeline.
In November, we supported the Fundraising team with the preparation and kickoff of the English fundraising campaign.
We started applying our new page view definition towards a number of reports and presentations, including an update for the Wikimedia Foundation board on readership trends.
We continued supporting the Mobile team with data on mobile traffic and prepared the launch of two controlled tests of microcontributions on the beta version of the mobile site. We performed preliminary analysis and QA of the data in preparation for a larger test to be launched on the stable site in January.
We concluded data analysis for the test of HHVM and found no conclusive evidence that HHVM substantially affects newcomer engagement in Wikipedia, but hypothesized that HHVM would have effects elsewhere.
We hosted a research showcase with Yan Chen (University of Michigan) as a guest speaker. We finalized a formal collaboration with a team of researchers at Stanford University, to be launched in December. A workshop we submitted to CSCW '15 on creating industry/academic partnerships for open collaboration research was accepted and will be held at the conference in Vancouver on March 14, 2015.
We have released, for the first time, a complete offline version of Project Gutenberg, a 50,000-book public-domain online library. This new software solution can easily create a complete offline snapshot offering all the books in HTML and EPUB formats. The books are accessible via a custom, easy-to-use interface, making it trivial to have this large library available everywhere: on your PC, on a local network, or even on a smartphone. This was the first step of a broader effort to broaden the reach of public-domain literature; further development will take place in 2015. If you want to know more, read the release announcement.
The engineering management team continues to update the Deployments page weekly, providing up-to-date information on the upcoming deployments to Wikimedia sites, as well as the annual goals, listing ongoing and future Wikimedia engineering efforts.