Help:Extension:Translate/Translation memories

TTMServer is a translation memory server that comes with the Translate extension. It needs no external dependencies, is enabled by default, and replaces the support for tmserver from the Translate Toolkit, which was hard to set up. TTMServer is a simple translation memory that does not use any advanced algorithms; it does, however, take advantage of MediaWiki's excellent language support and database abstraction features.

There are three different ways to use TTMServer:

                                 Local database   Remote API   Solr backend
Enabled by default               Yes              No           No
Can have multiple sources        No               Yes          Yes
Updated with local translations  Yes              No           Yes
Accesses database directly       Yes              No           No
Access to source                 Editor           Link         Editor if local, otherwise link
Can be shared as an API service  Yes              Yes          Yes

Configuration

All translation aids including translation memories are configured with the $wgTranslateTranslationServices configuration setting. Example configuration of TTMServers:

Default configuration
$wgTranslateTranslationServices['TTMServer'] = array(
	'database' => false, // Passed to wfGetDB
	'cutoff' => 0.75,
	'type' => 'ttmserver',
	'public' => false,
);
Remote API configuration
$wgTranslateTranslationServices['example'] = array(
        'url' => 'http://example.com/w/api.php',
        'displayname' => 'example.com',
        'cutoff' => 0.75,
        'timeout' => 3,
        'type' => 'ttmserver',
        'class' => 'RemoteTTMServer',
);
Solr backend configuration
$wgTranslateTranslationServices['TTMServer'] = array(
	'type' => 'ttmserver',
	'class' => 'SolrTTMServer',
	'cutoff' => 0.75,
	/* The value of 'config' is passed to Solarium_Client.
	 * See http://wiki.solarium-project.org/index.php/V2:Basic_usage
	 */
);
See installation notes at the bottom of this page.
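For a Solr instance running locally with default settings, the config key might look roughly like the sketch below. The host, port and path values are assumptions about your setup, and the exact option names depend on the Solarium version (see the Solarium documentation linked above).

$wgTranslateTranslationServices['TTMServer'] = array(
	'type' => 'ttmserver',
	'class' => 'SolrTTMServer',
	'cutoff' => 0.75,
	// Passed as-is to Solarium_Client; adjust to match your Solr instance.
	'config' => array(
		'adapteroptions' => array(
			'host' => '127.0.0.1',
			'port' => 8983,
			'path' => '/solr/',
		),
	),
);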

Possible keys and values are:

Key          Applies to  Description
config       Solr        Solr instance configuration for Solarium, see below.
cutoff       All         Minimum threshold for a matching suggestion. Only a few of the best suggestions are shown, even if more are above the threshold.
database     Local       If you want to store the translation memory in a different location, you can specify the database name here. You also have to configure MediaWiki's load balancer to know how to connect to that database.
displayname  Remote      The text shown in the tooltip when hovering over the suggestion source link (the bullets).
public       All         Whether this TTMServer can be queried through the api.php of this wiki.
symbol       All         The suggestion source link text. Defaults to ‣ for remote and to • otherwise.
timeout      Remote      How long to wait for an answer from the remote service.
type         All         Type of the TTMServer, in terms of results format.
url          Remote      URL to the api.php of the remote TTMServer.
You must use the key TTMServer as the array index to $wgTranslateTranslationServices if you want the translation memory to be updated with new translations. Remote TTMServers cannot be used for that, because they cannot be updated.

Currently only MySQL is supported as the database backend.
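For example, to let other wikis use this wiki as a remote TTMServer, the default local configuration can simply be made public (a sketch using only the keys documented above):

$wgTranslateTranslationServices['TTMServer'] = array(
	'database' => false, // Passed to wfGetDB
	'cutoff' => 0.75,
	'type' => 'ttmserver',
	'public' => true, // allow action=ttmserver queries through this wiki's api.php
);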

TTMServer API

If you would like to implement your own TTMServer service, here are the specifications.

Query parameters:

Your service must accept the following parameters:

Key             Value
format          json
action          ttmserver
service         Optional service identifier if there are multiple shared translation memories. If not provided, the default service is assumed.
sourcelanguage  Language code as used in MediaWiki; see IETF language tags and ISO 639.
targetlanguage  Language code as used in MediaWiki; see IETF language tags and ISO 639.
text            Source text in the source language.
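For instance, a client could query such a service with plain PHP like this (a minimal sketch; the endpoint URL is a placeholder and error handling is omitted):

$params = array(
	'format' => 'json',
	'action' => 'ttmserver',
	'sourcelanguage' => 'en',
	'targetlanguage' => 'fi',
	'text' => 'January',
);
$url = 'https://example.com/w/api.php?' . http_build_query( $params );
$response = json_decode( file_get_contents( $url ), true );
foreach ( $response['ttmserver'] as $suggestion ) {
	printf( "%s (%.2f) %s\n", $suggestion['target'], $suggestion['quality'], $suggestion['location'] );
}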

Your service must return a JSON object with the key ttmserver whose value is an array of objects. Those objects must contain the following data:

Key       Value
source    Original source text.
target    Translation suggestion.
context   Local identifier for the source, optional.
location  URL to the page where the suggestion can be seen in use.
quality   Decimal number in range [0..1] describing the suggestion quality. 1 means perfect match.

Example:

{
	"ttmserver": [
		{
			"source": "January",
			"target": "tammikuu",
			"context": "Wikimedia:Messages\\x5b'January'\\x5d\/en",
			"location": "https:\/\/translatewiki.net\/wiki\/Wikimedia:Messages%5Cx5b%27January%27%5Cx5d\/fi",
			"quality": 0.85714285714286
		},
		{
			"source": "January",
			"target": "tammikuu",
			"context": "Mantis:S month january\/en",
			"location": "https:\/\/translatewiki.net\/wiki\/Mantis:S_month_january\/fi",
			"quality": 0.85714285714286
		},
		{
			"source": "January",
			"target": "Tammikuu",
			"context": "FUDforum:Month 1\/en",
			"location": "https:\/\/translatewiki.net\/wiki\/FUDforum:Month_1\/fi",
			"quality": 0.85714285714286
		},
		{
			"source": "January",
			"target": "tammikuun",
			"context": "MediaWiki:January-gen\/en",
			"location": "https:\/\/translatewiki.net\/wiki\/MediaWiki:January-gen\/fi",
			"quality": 0.85714285714286
		},
		{
			"source": "January",
			"target": "tammikuu",
			"context": "MediaWiki:January\/en",
			"location": "https:\/\/translatewiki.net\/wiki\/MediaWiki:January\/fi",
			"quality": 0.85714285714286
		}
	]
}
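On the service side, producing this response format can be as simple as the following sketch (how the suggestions are looked up is left out; the values are illustrative):

// Emit the TTMServer response format described above.
$suggestions = array(
	array(
		'source'   => 'January',
		'target'   => 'tammikuu',
		'context'  => 'MediaWiki:January/en',
		'location' => 'https://translatewiki.net/wiki/MediaWiki:January/fi',
		'quality'  => 0.86,
	),
);
header( 'Content-Type: application/json' );
echo json_encode( array( 'ttmserver' => $suggestions ) );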

TTMServer architecture

The backend contains three tables: translate_tms, translate_tmt and translate_tmf. They correspond to sources, targets and fulltext. You can find the table definitions in sql/translate_tm.sql. The sources table contains all the message definitions. Even though they are usually all in the same language, say English, the language of the text is also stored for the rare cases where this is not true.

Each entry has a unique id and two extra fields, length and context. The length is used as a first-pass filter, so that when querying we do not need to compare the text we are searching for with every entry in the database. The context stores the title of the page where the text comes from, for example "MediaWiki:Jan/en". From this information we can link a suggestion back to "MediaWiki:Jan/de", which makes it possible for translators to quickly fix things, or just to see where that kind of translation was used.
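Conceptually, the first-pass filter only considers source entries whose stored length is close enough to the query text to still reach the cutoff. A simplified sketch using MediaWiki's database abstraction (not the extension's actual query; the column names follow sql/translate_tm.sql and the length bound is illustrative):

$dbr = wfGetDB( DB_SLAVE );
$len = strlen( $text );
// Entries whose length differs too much can never reach the cutoff.
$delta = (int)( $len * ( 1 - $cutoff ) ); // illustrative bound
$res = $dbr->select(
	'translate_tms',
	array( 'tms_sid', 'tms_text', 'tms_context' ),
	array(
		'tms_lang' => $sourceLanguage,
		'tms_len BETWEEN ' . ( $len - $delta ) . ' AND ' . ( $len + $delta ),
	),
	__METHOD__
);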

The second pass of filtering comes from the fulltext search. The definitions are processed with an ad hoc algorithm. First the text is segmented into words with MediaWiki's Language::segmentByWord. If there are enough segments, we strip basically everything that is not a word character and normalize the case. Then we take the first ten unique words that are at least 5 bytes long (five letters in English, but fewer letters suffice for languages with multibyte code points). Those words are then stored in the fulltext index, providing further filtering for longer strings.
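As a rough illustration of that indexing step (the real code uses MediaWiki's Language::segmentByWord; here a plain regular expression stands in for it):

// Pick the words that go into the fulltext index for one definition.
function extractFulltextWords( $definition ) {
	$text = mb_strtolower( $definition );
	// Split on anything that is not a letter or digit.
	$words = preg_split( '/[^\p{L}\p{N}]+/u', $text, -1, PREG_SPLIT_NO_EMPTY );
	$selected = array();
	foreach ( $words as $word ) {
		if ( strlen( $word ) < 5 ) { // at least 5 bytes, as described above
			continue;
		}
		$selected[$word] = true;
		if ( count( $selected ) >= 10 ) { // first ten unique words
			break;
		}
	}
	return array_keys( $selected );
}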

When we have filtered the list of candidates, we fetch the matching targets from the targets table. Then we apply the Levenshtein edit distance algorithm to do the final filtering and ranking. Let's define:

E
  the edit distance between the text we are searching suggestions for and To
Tc
  the suggestion text
To
  the original text of which Tc is a translation

The quality of suggestion Tc is calculated as E/min(length(Tc), length(To)). Depending on the length of the strings, we use either PHP's native levenshtein function or, if either of the strings is longer than 255 bytes, a PHP implementation of the Levenshtein algorithm.[1] It has not been tested whether the native levenshtein implementation handles multibyte characters correctly. This might be another weak point when the source language is not English (the others being the fulltext search and segmentation).
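Written out as code, the scoring step might look like the following sketch. It is a literal transcription of the formula above (so a smaller value means a closer match); the function name and variables are illustrative.

// Score a candidate: $query is the text we are searching suggestions for,
// $tc the stored suggestion text and $to the original text it translates.
function suggestionScore( $query, $tc, $to ) {
	// PHP's native levenshtein() only handles strings up to 255 bytes;
	// for longer strings the extension falls back to a PHP implementation.
	$e = levenshtein( $query, $to );
	return $e / min( strlen( $tc ), strlen( $to ) );
}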

There is a script, ttmserver-export.php, which fills the translation memory with translations from the active message groups. Even big sites should be able to bootstrap the memory in half an hour when using multiple threads with the --threads parameter. The time depends heavily on how complete the message group statistics are (incomplete ones will be calculated during the bootstrap). New translations are automatically added by a hook. New sources (definitions) are added when the first translation is added.

Old translations that are no longer used and do not belong to any message group are not purged automatically unless you rerun the bootstrap script. When the translation of a message is updated, the previous translation is removed from the memory. When the definition is updated, nothing happens immediately; a new entry is added only when a translation is saved against the new definition. The old definition and its old translations remain in the database until purged by rerunning the bootstrap script. Fuzzy translations are not added to the translation memory, but existing translations are not removed from the memory when they are fuzzied either.

Solr backend

Much of the above also applies to the TTMServer that uses the Solr search platform as a backend, except for the details of the database layout and queries. By default the results are ranked with the Levenshtein algorithm on the Solr side, but other available string matching algorithms, such as n-gram matching, can be used as well.

In Solr there are no tables. Instead we have documents with fields. Here is an example document:

  <doc>
    <str name="wiki">sandwiki-bw_</str>
    <str name="uri">http://localhost/wiki/MediaWiki:Action-read/bn</str>
    <str name="messageid">MediaWiki:Action-read</str>
    <str name="globalid">sandwiki-bw_-MediaWiki:Action-read-813862/bn</str>
    <str name="language">bn</str>
    <str name="content">এই পাতাটি পড়ুন</str>
    <arr name="group">
      <str>core</str>
      <str>core-1.20</str>
      <str>core-1.19</str>
      <str>mediawiki</str>
    </arr>
    <long name="_version_">1421795636117766144</long>
  </doc>

Each translation has its own document, and so does the message documentation. To get suggestions we first perform a search, sorted by the string similarity algorithm, over all documents in the source language. Then we do another query to fetch the translations, if any, for those messages.

We use lots of hooks to keep the translation memory updated in almost real time. If a user translates similar messages one after another, the previous translation can (in the best case) be displayed as a suggestion for the next message.

Initial import

  1. Execute the ttmserver-export.php command line script for each wiki using the shared translation memory.

New translation (if not fuzzy)

  1. Create document

Updated translation (if not fuzzy)

  1. Delete wiki:X language:Y message:Z
  2. Create document

Updated message definition

  1. Create new document

All existing documents for the message stay around because globalid is different.

Translation is fuzzied

  1. Delete wiki:X language:Y message:Z

Message changes group membership

  1. Delete wiki:X message:Z
  2. Create document (for all languages)

Message goes out of use

  1. Delete wiki:X message:Z
  2. Create document (for all languages)

Any further changes to the definition or translations are no longer propagated to the translation memory.
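The delete steps above map onto Solr delete-by-query requests. With the Solarium v2 API referenced in the configuration section, such a request might look roughly like this (the field values are placeholders and the exact method names depend on the Solarium version):

// Illustrative sketch: remove all documents for one translation.
$client = new Solarium_Client( $config );
$update = $client->createUpdate();
$update->addDeleteQuery( 'wiki:examplewiki-bw_ AND language:fi AND messageid:"MediaWiki:January"' );
$update->addCommit();
$client->update( $update );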

Translation memory query

  1. Collect similar messages with strdist("message definition",content)
  2. Collect translations with globalid:[A,B,C]

Search query

  1. Find all matches with text:"search query"

The search can be narrowed further with facets on the language or group fields.
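Put together with Solarium, a free-text search narrowed to one language could look roughly like the following sketch (again v2-style calls, not the extension's actual code):

// Illustrative sketch: full text search narrowed by language, faceted by group.
$client = new Solarium_Client( $config );
$query = $client->createSelect();
$query->setQuery( 'text:"search query"' );
$query->createFilterQuery( 'lang' )->setQuery( 'language:fi' );
$query->getFacetSet()->createFacetField( 'group' )->setField( 'group' );
$resultset = $client->select( $query );
foreach ( $resultset as $document ) {
	echo $document->messageid . ': ' . $document->content . "\n";
}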

Identifier fields

The field globalid uniquely identifies the translation or message definition by combining the following fields:

  • wiki identifier (MediaWiki database id)
  • message identifier (Title of the base page)
  • message version identifier (Revision id of the message definition page)
  • message language

The format used is wiki-message-version/language.

In addition there are separate fields for the wiki id, message id and language, to make the delete queries listed above possible.
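For illustration, the globalid from the example document above decomposes like this (the variable names are made up):

// "sandwiki-bw_-MediaWiki:Action-read-813862/bn" broken into its parts.
$wikiId    = 'sandwiki-bw_';          // MediaWiki database id
$messageId = 'MediaWiki:Action-read'; // title of the base page
$revision  = 813862;                  // revision id of the definition page
$language  = 'bn';                    // message language
$globalid  = "$wikiId-$messageId-$revision/$language";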

Installation

Here are the general quick steps for installing and configuring Solr for TTMServer. You should adapt them to your situation.

# Solr needs java
sudo apt-get install openjdk-6-jre-headless
# Download and extract Solr from http://lucene.apache.org/solr/downloads.html
wget http://www.nic.funet.fi/pub/mirrors/apache.org/lucene/solr/3.6.0/apache-solr-3.6.0.tgz
tar xzf apache-solr-*.tgz
cd apache-solr-*/example
# Copy the config from the extension directory
cp .../Translate/ttmserver/schema.xml solr/conf
# Start the server
java -jar start.jar

To use the Solr backend you also need the Solarium library. The easiest way to get it is to install the Solarium MediaWiki extension. See the example configuration for the Solr backend in the configuration section of this page. You can pass extra configuration to Solarium via the config key, as is done for example in the Wikimedia configuration.

Finally, populate the translation memory with content:

php Translate/scripts/ttmserver-export.php --threads 2