Review of Translation Quality Assurance Software
Translation Quality Assurance software (hereinafter referred to as TQA tools) compares the source and target segments of bilingual texts (saved in .doc, .rtf, and .ttx files) in order to detect translation errors. Such errors include: inconsistencies; terms not translated in accordance with a project glossary; omissions; target segments identical to their source segments; punctuation, capitalization, and number value/formatting errors; and incorrect untranslatables and tags.
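The kinds of checks described above can be illustrated with a minimal sketch of a segment-pair checker (a hypothetical illustration, not the code of any of the tools reviewed; all names are invented):

```python
import re

def check_segment(source: str, target: str) -> list:
    """Run a few typical TQA checks on one bilingual segment pair."""
    issues = []
    # An untranslated segment: target text identical to the source text.
    if source.strip() == target.strip():
        issues.append("target identical to source (possible untranslated segment)")
    # A simple formal error: a double space in the target.
    if re.search(r"  ", target):
        issues.append("double space in target")
    # Number check: the numbers in source and target should match as sets.
    src_nums = sorted(re.findall(r"\d+(?:[.,]\d+)?", source))
    tgt_nums = sorted(re.findall(r"\d+(?:[.,]\d+)?", target))
    if src_nums != tgt_nums:
        issues.append("number mismatch between source and target")
    return issues
```

A real TQA tool runs dozens of such checks over every segment pair in the bilingual file and reports the findings for a human to review.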
The aim of this study is to compare three of the most popular TQA tools in order to find out their strengths and weaknesses, and therefore help translators, project managers and proofreaders to select the optimal TQA tool for any particular job.
Intrinsic Limitations of TQA Tools
There are a number of intrinsic limitations with TQA tools, some of which are listed below.
General Description and Features of Three TQA Tools
The three TQA tools tested in this study were: SDL Trados Terminology Verifier and QA Checker; Wordfast Quality Check feature; and QA Distiller (hereinafter referred to as Trados, WF and QAD, respectively). General information about these three tools is contained in Table 1 below and a comparative list of their main features is given in Table 2.
In Table 2, X means that a feature is provided and 0 means that it is not.
Detection of formal errors
In order to test these TQA tools, I created a test .doc file (1,373 words) containing a sample source text from a real client (Volvo Cars), and translated it with Trados in both MS Word and TagEditor. As a result, I had two identical bilingual target files (1,071 words) saved in .rtf and .ttx formats.
At the first stage (check of formal errors only) I added seven typical formal errors to both files:
1. One sentence was kept in English (identical source and target segments).
2. A double space.
3. End-of-sentence punctuation different from that in the source sentence.
4. Repeated phrases translated inconsistently.
5. Incorrect untranslatable (Volvo S60 in the source segment changed to Volvo S60R in the target segment).
6. Incorrect number (350 in the source segment changed to 360 in the target segment).
7. One closing round bracket ")" missing in the target segment.
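Several of the error types in this list can be caught mechanically. As an illustration (a simplified sketch of the general approach, not how any of the three tools is implemented), inconsistency detection across a whole file (error 4) and a bracket check (error 7) might look like:

```python
from collections import defaultdict

def find_inconsistencies(pairs):
    """Flag identical source segments that received different translations.

    `pairs` is a list of (source, target) segment tuples for the whole file.
    """
    translations = defaultdict(set)
    for source, target in pairs:
        translations[source.strip()].add(target.strip())
    # Keep only source segments with more than one distinct translation.
    return {src: tgts for src, tgts in translations.items() if len(tgts) > 1}

def bracket_mismatch(source, target):
    """Flag a segment pair whose bracket counts differ."""
    return any(source.count(b) != target.count(b) for b in "()[]{}")
```

A checker of this kind makes a single pass over the file, which is why even large bilingual documents can be verified in seconds.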
All special terminology in the target file was translated in accordance with my Volvo glossary (although I did not perform a terminology check at this stage of the study).
The settings in the three TQA tools were optimized experimentally to ensure detection of the maximum number of real errors and the minimum number of false errors (maximum ‘signal to noise’ ratio).
The results of the TQA formal error check are given in Table 3 below.
The conclusions that can be drawn from this formal error check are listed in Table 4 below.
Detection of Terminology Errors
In order to test the terminology check features, I added four terminology errors to the test translation. First, I translated ‘simulator’ as ‘имитатор’ rather than ‘симулятор’; then I created glossaries containing one record only (simulator > симулятор) in the formats required by each TQA tool.
Note: Russian is an inflected language and my test translation contained various forms of the word ‘имитатор’.
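Because of this inflection, an exact-match terminology check would miss most occurrences. A stem-based check (a simplified sketch with invented names; each real tool applies its own matching rules) could look like:

```python
def term_check(source, target, glossary):
    """Check that glossary terms found in the source appear in the target.

    `glossary` maps a source term to a target-term stem, e.g.
    'simulator' -> 'симулят'. Using the stem rather than the full form
    'симулятор' catches inflected forms such as 'симулятора' or 'симулятором'.
    """
    issues = []
    for src_term, tgt_stem in glossary.items():
        if src_term.lower() in source.lower() and tgt_stem.lower() not in target.lower():
            issues.append(f"'{src_term}' not translated with glossary term")
    return issues
```

Stem matching is a blunt instrument: too short a stem produces false positives (much as Trados's fuzzy matching did in the test below), while too long a stem misses inflected forms, so the user must tune the trade-off.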
The results of the terminology check were as follows:
Comments on the data received:
Trados - The false errors detected by Trados were caused by fuzzy matches. On both occasions, Trados suggested the use of the glossary term ‘simulator/бШЬгЫпвЮа’ for the verb ‘simulate’. The user has no control over such situations. The only option is to ignore such false errors.
WF - This proved to be the simplest, most accurate, most user-friendly and most controllable terminology checker. The user can set the level of fuzziness by using wildcards.
QAD - The copy of QAD installed on my notebook failed to perform the terminology check. During the Analyze step, the application returned the following error message: “A program exception occurred”.
Are TQA Tools Necessary for an Experienced and Diligent Translator?
As a freelance English-Russian translator with 27 years of experience, I always take pride in my human quality assurance methods. I proofread all my translations at least twice before delivery and frequently hire a proofreader or a technical expert to check my translations. Further information about my human quality assurance methods can be found on my website at www.erussiantranslations.com/Article9.htm.
Since 2000, I have translated about 700,000 words per year, and in the ten years before that I translated 56 novels. My sample translations were checked and approved by ATA, ITI and UTR. My clients are always happy with the quality of my translations.
However, are experience and human quality assurance methods enough to avoid formal and terminology mistakes? To find the answer I checked a 10,000-word translation I did in 2005, before I started to use TQA tools. I found two terminology and eight formal errors, which is enough to suggest that TQA tools may be as useful for experienced translators as they are for beginners.
Conclusions
1. TQA tools do not replace human editors/proofreaders, but only help them. First and foremost they help translators.
2. Each of the three TQA tools has its own strengths and weaknesses, as well as its preferable area of use.
3. No matter how experienced the translator is and what human quality assurance methods s/he uses, TQA tools are able to decrease the number of mistakes and improve the overall quality of translation.
The results given above were achieved on my two PCs, a desktop and a notebook, both running the Russian version of Windows XP with SP and updates. Were the tests to be run on computers using a different operating system, there might be a slight variance in the results.
I would like to record my special thanks to Nathalie De Sutter for her invaluable contribution to this study.
* Original publication: Multilingual Magazine, January-February 2007, p. 22.