MinT

MinT (Machine in Translation) is a machine translation service based on open source neural machine translation models. The service is hosted on Wikimedia Foundation infrastructure, and it runs translation models that have been released by other organizations under an open source license. An open machine translation service can be a key piece of the essential infrastructure of the ecosystem of free knowledge. This page captures the initiatives to scale the service and make this infrastructure more widely available.

You can try MinT as part of projects such as Content Translation and Translatewiki.net, or directly in a test instance.

About MinT
MinT is designed to provide translations from multiple machine translation models. Initially it uses the following models:


 * NLLB-200. The latest model from the [ https://ai.facebook.com/research/no-language-left-behind/ No Language Left Behind project ] by a research team at Meta. This model supports translation across [ https://github.com/facebookresearch/flores/blob/main/flores200/README.md#languages-in-flores-200 200 languages], including many that are not supported by other vendors.
 * OpusMT. The [ https://opus.nlpl.eu/ OPUS (Open Parallel Corpus) project ] from the University of Helsinki compiles multilingual content with a free license to train [ https://github.com/Helsinki-NLP/Opus-MT the OpusMT translation models]. Anyone can easily help improve the translation quality by participating in the different projects that contribute data to OPUS. For example, when using Content Translation to create translations of Wikipedia articles, the data on published translations will be incorporated as a new resource to improve the translation quality for the next version of the model. Another quick way to contribute is to provide sentence translations with [ https://tatoeba.org/ Tatoeba ].
 * IndicTrans2. [ https://ai4bharat.iitm.ac.in/indic-trans2 The IndicTrans2 project] provides translation models to support [ https://models.ai4bharat.org/#/nmt/v2 over 20 Indic languages]. These models were developed by AI4Bharat@IIT Madras, a research group at the Indian Institute of Technology Madras.
 * Softcatalà. Softcatalà is a non-profit organization that works to improve the use of Catalan in digital products. As part of the [ https://github.com/Softcatala/nmt-softcatala Softcatalà Translation project], it has released the translation models used in [ https://www.softcatala.org/traductor/ their translator service], which translates 10 languages to and from Catalan.

MinT supports over 200 languages, with more than 50 languages not supported by other services (including 27 languages for which there is no Wikipedia yet). You can read more about the initial release of MinT and check some frequently asked questions in the summary page for the service.

Technical details
The translation models have been optimized for performance using the [ https://github.com/OpenNMT/CTranslate2 CTranslate2 library] from OpenNMT in order to [ https://techblog.wikimedia.org/2020/04/06/saying-no-to-proprietary-code-in-production-is-hard-work-the-gpu-chapter/ avoid the need for GPU acceleration]. This makes it easier for organizations and individuals to build and run their own instances. For more details you can check the source code, the [ https://translate.wmcloud.org/api/spec API spec], and [ https://translate.wmcloud.org/ a test instance].
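As a quick illustration, a translation request to the test instance could be built as follows. The endpoint path and payload shape used here are assumptions for illustration only; consult the [ https://translate.wmcloud.org/api/spec API spec] for the actual interface.

```python
import json
from urllib import request

API_BASE = "https://translate.wmcloud.org/api"

def build_translation_request(source, target, text):
    """Build a POST request against a hypothetical /translate/{source}/{target}
    endpoint. The path and JSON payload are assumptions; check the API spec."""
    url = f"{API_BASE}/translate/{source}/{target}"
    payload = json.dumps({"text": text}).encode("utf-8")
    return request.Request(url, data=payload,
                           headers={"Content-Type": "application/json"})

# To actually send the request (requires network access):
# req = build_translation_request("en", "es", "Hello, world")
# with request.urlopen(req) as resp:
#     print(json.load(resp))
```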

MinT provides a platform to run multiple translation models. In order to support different initiatives, aspects such as [ https://github.com/santhoshtr/sentencex sentence segmentation], language detection, pre/post-processing of contents, and rich format support have been developed on top of the plain-text based models.

Get involved
Feel free to share any feedback in the discussion page. Planned improvements are captured in Phabricator, where you can [ https://phabricator.wikimedia.org/maniphest/task/edit/form/1/?projects=MinT propose improvements or report any issue], track the progress of any task, and share your perspective on it. For completed work you can also check the status updates below.

MinT for translators
Translation is a common way for multilingual users to contribute in the Wikimedia ecosystem. Machine translation can provide a useful initial translation for users to review and improve. The Language team has developed tools that integrate different machine translation services into translators' workflows to speed up their work. Once MinT was available, integrating it with these tools was a logical next step to amplify their impact. MinT is available in the following projects:

* Content Translation. Content Translation provides guidance to create a translation of a Wikipedia article into another language. It integrates several translation services to provide an initial translation.
* Localization infrastructure. The Translate extension provides the infrastructure used to translate our software and multilingual pages. Communities of translators use it on Translatewiki.net, Wikimedia Meta-Wiki, MediaWiki.org, and more.

MinT for Wikipedia readers
The number of topics and the amount of information readers can learn from Wikipedia depend on the languages they speak. Machine translation can help people learn more about their topics of interest when the content is not available in their language.

This initiative explores how to surface the machine translation support from MinT in Wikipedia articles in a way that:


 * Allows readers to learn more about topics of interest from other languages.
 * Clearly differentiates automatically generated content from community-created content.
 * Encourages contributing to the community-created content when possible.

At the moment the Language team is working on the design and research aspects of the project to identify the best ways to surface MinT on Wikipedia, as well as on the technical explorations needed for the service to work in this context.

MinT more widely available
Working on the previous initiatives will help polish and solidify the system. For now, the MinT API is only available to Wikimedia products. As the system matures, we'll consider wider exposure. A service that communities can use in innovative ways can be a very powerful tool. New initiatives to make MinT more widely available will be captured here in the future. Meanwhile, feel free to configure your own MinT instance to experiment with it.


 * Completed initial design exploration to illustrate 5 concepts on how to surface machine translated contents from other languages for Wikipedia articles
 * Completed enablements of MinT in Content Translation for Ligurian, where the community requested further clarifications about MinT, and for the last set of 14 languages that could be supported with the NLLB-200 model.
 * Enabled MinT for translatable pages on [ https://test.wikipedia.org/ test wiki]


 * Expanded exposure of MinT by enabling it as the default in the Content Translation mobile and desktop experiences in 7 Wikipedias supported by MinT (Cherokee, Tongan, Hungarian, Kazakh, Kyrgyz, Minangkabau, and Sardinian).
 * Completed the validation for all languages supported by the translation models used by MinT as part of the final QA for enabling the new translation service.
 * Santhosh presented at [ https://lotus.kuee.kyoto-u.ac.jp/WAT/WAT2023/ the 10th Workshop on Asian Translation] emphasizing the need for machine translation to be universal, free, and available in more languages. A message [ https://twitter.com/prajdabre1/status/1698595253417095514 well received by the attendees].


 * Research planning started with an initial draft of a research brief for MinT on Wikipedia
 * Continuing technical explorations for applying machine translation beyond the plain text that the underlying models provide, to support the Wikipedia context: a [ https://github.com/santhoshtr/sentencex new improved approach] for sentence segmentation (with [ https://santhoshtr.github.io/sentencex/ a demo page to try]) that identifies sentence boundaries more accurately across languages and prefers not to split in case of doubt. This preference matters for machine translation because it avoids fragmenting the context of a translation, for example by misinterpreting the dot of an abbreviation as a full stop.
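The avoid-splitting-in-doubt heuristic can be sketched with a simplified, illustrative segmenter. This is not the actual sentencex implementation (which ships per-language abbreviation data and many more rules); the abbreviation list here is a made-up sample.

```python
import re

# Sample abbreviations whose trailing dot should NOT end a sentence.
# Illustrative only; a real segmenter needs per-language data.
ABBREVIATIONS = {"dr.", "mr.", "e.g.", "i.e.", "etc."}

def segment(text):
    """Split text into sentences, preferring NOT to split in case of doubt."""
    sentences = []
    start = 0
    for match in re.finditer(r"[.!?]\s+", text):
        end = match.end()
        last_word = text[start:match.start() + 1].split()[-1].lower()
        if last_word in ABBREVIATIONS:
            continue  # the dot belongs to an abbreviation: keep going
        sentences.append(text[start:end].strip())
        start = end
    if start < len(text):
        sentences.append(text[start:].strip())
    return sentences
```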


 * Successful exploration for the use of MinT to translate structured formats such as HTML, SVG, and Markdown.
 * Completed the deprecation of Youdao, an external translation service that had been failing for a long time.
 * Continued design exploration for MinT on Wikipedia with new and updated workflows based on feedback.
 * Identified languages which can benefit the most from new OpusMT models
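The structured-format exploration mentioned above can be illustrated by a wrapper that passes only text nodes through a translation function while leaving the markup intact. This is a minimal sketch built on Python's standard HTML parser, not the actual MinT implementation:

```python
from html.parser import HTMLParser

class TranslatingRebuilder(HTMLParser):
    """Rebuild HTML, passing only text nodes through a translate function."""
    def __init__(self, translate):
        super().__init__(convert_charrefs=True)
        self.translate = translate
        self.out = []

    def handle_starttag(self, tag, attrs):
        attr_str = "".join(f' {k}="{v}"' for k, v in attrs)
        self.out.append(f"<{tag}{attr_str}>")  # markup is copied, not translated

    def handle_endtag(self, tag):
        self.out.append(f"</{tag}>")

    def handle_data(self, data):
        # Only non-whitespace text nodes go through the translator.
        self.out.append(self.translate(data) if data.strip() else data)

def translate_html(html, translate):
    parser = TranslatingRebuilder(translate)
    parser.feed(html)
    return "".join(parser.out)
```

With a stand-in "translator" such as `str.upper`, `translate_html("<p>Hello <b>world</b></p>", str.upper)` leaves the tags untouched while transforming the text.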


 * Made MinT the default translation service for Zulu in Content Translation


 * Enabled machine translation with MinT (and communicated with the communities) for 75 new languages: 62 languages where the mobile translation experience is available, and 13 languages where translation quality from other services may not be ideal, based on data from [ https://nbviewer.org/github/wikimedia-research/machine-translation-service-analysis-2022/blob/main/mt_service_comparison_Sept2022_update.ipynb the MT usage report] and/or community feedback.
 * Validation of previous enablements: identified issues with Bhojpuri and Latvian, where MinT was not available due to mismatches between the language codes used by Wikipedias, MinT, and the underlying translation models.
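Code mismatches like these are typically resolved with an explicit mapping layer between wiki language codes and model language codes. A minimal sketch, assuming FLORES-200 style codes for NLLB-200 (the specific entries and the fallback rule are illustrative):

```python
# Illustrative mapping from Wikipedia language codes to the FLORES-200
# style codes used by NLLB-200. Entries here are examples, not the
# actual table used by MinT.
WIKI_TO_MODEL = {
    "bh": "bho_Deva",  # Bhojpuri Wikipedia uses the legacy code "bh"
    "lv": "lvs_Latn",  # Latvian vs. the model's "Standard Latvian" code
}

def to_model_code(wiki_code):
    """Map a wiki language code to a model code.

    The fallback (appending a Latin-script tag) is a naive guess for
    illustration; a real mapping must handle scripts explicitly."""
    return WIKI_TO_MODEL.get(wiki_code, f"{wiki_code}_Latn")
```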


 * Initial design explorations and prototypes on ways we could integrate MinT in Wikipedia
 * Improved MinT translation post-processing to better support languages using the Arabic script by avoiding extra spaces after full stops.
 * Completed the integration of the IndicTrans2 models by verifying the enablement of all 23 supported languages.


 * Initial analysis of activity in Wikipedia communities supported by MinT for the first time, to identify potential pilot wikis for future research and early adopters.
 * Enablement of MinT on translatewiki.net for use in the localization of Wikimedia and other open projects.