Content translation/Development Plan/Roadmap

User Experience Research

 * Translation view
 * Translation entry points
 * Prototype
 * UX Feedback
 * UX Design

State of the server backend

 * Research on technology choice done; proof of concept completed
 * Node.js server prepared with bidirectional communication via socket.io
 * Server architecture documented
 * Parsoid/MediaWiki interface to fetch articles
 * ROT13 dummy translation tool for development and debugging (see the sketch after this list)
 * Functional data model manager developed
 * Redis-based data store and pub/sub mechanism developed
 * Research on article segmentation done
 * Segmentation implementation based on SAX
 * Publishing the article to the User: namespace as a verbatim copy
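
The pieces above can be exercised together without any real translation backend. Below is a minimal sketch, in Node.js, of a socket.io handler that answers translation requests with ROT13; the event names and payload shape are illustrative assumptions, not the actual CX server protocol.

    // Minimal sketch: a socket.io server that "translates" with ROT13.
    // Event names ('translate', 'translation') are assumptions.
    var io = require('socket.io')(8000);

    function rot13(text) {
        return text.replace(/[a-zA-Z]/g, function (c) {
            var base = c <= 'Z' ? 65 : 97;
            return String.fromCharCode((c.charCodeAt(0) - base + 13) % 26 + base);
        });
    }

    io.on('connection', function (socket) {
        socket.on('translate', function (segment) {
            // Bidirectional communication: reply on the same socket.
            socket.emit('translation', rot13(segment));
        });
    });

ROT13 output is visibly "translated" yet trivial to verify by eye, which is what makes it useful for development and debugging.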

State of the frontend design and development

 * Basic UI developed: three-column layout
 * Allows loading a page by title and language

Analytics

 * Initial plan prepared: Analytics
 * Work is being done on preparing EventLogging schemas (draft in Google Docs); a hypothetical sketch follows below
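
Since the real schemas are still being drafted, the following is only a hypothetical sketch of what a ContentTranslation EventLogging schema might record; every field name here is an assumption, not the draft's content.

    // Hypothetical EventLogging schema sketch; all fields are assumptions.
    var cxTranslationEvents = {
        description: 'Logs lifecycle events of a translation session',
        properties: {
            action: {
                type: 'string',
                'enum': [ 'start', 'save', 'publish' ],
                required: true
            },
            sourceLanguage: { type: 'string', required: true },
            targetLanguage: { type: 'string', required: true },
            sourceTitle: { type: 'string', required: true }
        }
    };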

Development plan for FY2014 Q2

 * 1) Improve the server as per the POC to handle requests using socket.io, with all the boilerplate and architecture code (to be evolved in the future) https://wikimedia.mingle.thoughtworks.com/projects/language_engineering/cards/4023
 * 2) "Load the given article to the editor" should be refactored into a server task https://wikimedia.mingle.thoughtworks.com/projects/language_engineering/cards/4024
 * 3) Send the article title and language pair to the server
 * 4) Show the article using the response from point 5. No UI change: currently it is loaded by the frontend directly from MediaWiki via the API; change it to load from the CX server after processing.
 * 5) Perform initial processing of the article content on the server side
 * 6) The server responds to the UI request by creating an empty "project" in the data store
 * 7) Sentence segmentation (minimal, matching [a-z]\.\s+[A-Z]; see the sketch after this list)
 * 8) Save the processed article into the server-side data store in the form of the data model
 * 9) Serve the data model from the store as version 0.
 * 10) Research on sentence segmentation
 * 11) Segmentation may use Unicode TR29 and other pluggable/extensible methods
 * 12) May also be language-specific
 * 13) Basic segmentation implementation
 * 14) Block alignment: align paragraphs, tables, images, divs, lists and headers in Special:CX
 * 15) ROT13 integration for debugging and development; in progress, planned to be completed by pairing with Kartik
 * 16) CX Server tests
 * 17) CI integration
 * 18) Write segmentation tests, then go to step 7 and iterate :)
 * 19) Redis data store modeling
 * 20) UX adaptation for the design update, e.g. make the progress bar stick to the tools pane as per the latest design from Pau
 * 21) Design the service provider interface
 * 22) MT interface
 * 23) Dictionary interface
 * 24) TM interface
 * 25) Wikidata interface
 * 26) Wikidata interface implementation
 * 27) MT interface implementation
 * 28) TM interface implementation
 * 29) Dictionary interface implementation
 * 30) Start with some existing dictionaries
 * 31) Data model diff design and implementation: do not send the whole data model every time
 * 32) Preprocessing: cleanup
 * 33) Remove the table of contents; preprocess templates
 * 34) Think about "transcoding templates" in an analogous way to link handling
 * 35) Entry points
 * 36) Capture edits for machine learning
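
For step 7, the minimal segmentation rule can be a one-line splitter. The sketch below, in Node.js, is illustrative (the function name is an assumption); its failure on abbreviations is exactly why steps 10-13 plan proper, language-aware segmentation.

    // Minimal sentence segmentation from step 7: split where a lowercase
    // letter and a period are followed by whitespace and an uppercase letter.
    function segment(text) {
        // Mark each boundary, then split; the lookahead keeps the
        // uppercase letter with the following sentence.
        return text.replace(/([a-z]\.)\s+(?=[A-Z])/g, '$1\u0000').split('\u0000');
    }

    console.log(segment('This is one. This is two. Dr. No is tricky.'));
    // -> [ 'This is one.', 'This is two.', 'Dr.', 'No is tricky.' ]
    // The abbreviation "Dr." is split incorrectly; hence steps 10-13.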

Special:ContentTranslation

 * Loading source article
 * Segmentation
 * Editing target article section-by-section
 * Publishing the translated article to user space in the target language

Translation tools

 * Translation memory
 * Automatic links insertion

Backend

 * Loading source article
 * Cleanup of unneeded markup from the source article
 * Segmentation
 * Initial storage of the source article segments in the backend (Redis)
 * Saving the segment translations
 * Publishing
 * Generic interface to language services (TM, MT, dictionaries, Wikidata, etc.); see the sketch after this list
 * Interface to translation memory
 * Interface to Wikidata
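
The generic language-service interface could take many shapes; the sketch below is one hypothetical Node.js design in which concrete providers (MT, TM, dictionary, Wikidata) override a common base. All names and signatures are assumptions, not the CX server's actual API.

    // Hypothetical sketch of a generic language-service interface.
    function LanguageService(name) {
        this.name = name;
    }

    // Concrete providers (MT, TM, dictionary, Wikidata) override this.
    LanguageService.prototype.lookup = function (text, from, to, callback) {
        callback(new Error(this.name + ': not implemented'));
    };

    // Example provider: echoes the input, much as the ROT13 tool stands
    // in for a real MT engine during development.
    function EchoService() {
        LanguageService.call(this, 'echo');
    }
    EchoService.prototype = Object.create(LanguageService.prototype);
    EchoService.prototype.lookup = function (text, from, to, callback) {
        callback(null, text);
    };

    new EchoService().lookup('Hello', 'en', 'es', function (err, result) {
        console.log(result); // -> 'Hello'
    });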

QA plan

 * Manual Testing
 * Browser tests for Special:ContentTranslation
 * Unit tests for the backend

Analytics

 * Prepare UI event logging, translation statistics, and translation tools usage tracking according to the analytics plan