Minority Report on Localization 2003
It is always a risky business to try to predict the future, particularly in view of erratic human behavior and rapid technological change. Then again, we are not totally clueless; the future is built on the present and affected by the actions we take today. Looking back at the language industry in 2002, one may conclude that there were no dramatic developments. And yet, though perhaps not recognized as such, there are usually signs in the present that point to the future. In this article, I focus on a number of elements that I personally believe to be significant pointers to the directions the language industry will take.

Micro view: Translation Memory - A distant memory?
Throughout 2002, Translation Memory more or less dominated the priority list for an increasing number of translators, less perhaps because of its usefulness as a regular translation tool, than as a means to keep their options open when bidding for certain jobs. Clients and agencies now often insist on the use of TM (whether or not the technology is actually economical for the translation process).
To put it bluntly, the increasing interest in current TM technology by a wider translation community than those engaged in updating repetitive technical documentation is surprising, given its clunkiness. The basic principle of TM is to enable recycling of prior translations by a search and replace function between two sets of source and target texts. This principle works well with certain types of jobs, such as the updating of texts that involve a relatively small amount of change, and where the previous version has been captured in the system's memory. This is what was originally intended with TM systems.
However, when one attempts to fit this technology into day-to-day, general translation work, the benefit of TM becomes questionable. This is chiefly because the majority of translation jobs undertaken are not minor updates of previous translations of the same text. Some translation suppliers are trying to apply TM systems to large-volume technical documentation where certain terminology recurs. But one wonders if TM provides the economy the client is seeking in these cases - wouldn't a simple setup of macro keys do the job just as well?
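To make the recycling principle concrete, here is a minimal sketch of how a TM lookup works: each new source segment is compared against stored source segments, and a sufficiently close ("fuzzy") match returns the stored translation for the translator to reuse or edit. The example sentences, the similarity threshold, and the scoring method (Python's standard-library difflib) are all illustrative assumptions, not the algorithm of any particular TM product.

```python
from difflib import SequenceMatcher

# A toy translation memory: (source, target) segment pairs.
# The sentences are invented for illustration.
memory = [
    ("Press the power button to start the device.",
     "Appuyez sur le bouton d'alimentation pour demarrer l'appareil."),
    ("Remove the battery before cleaning.",
     "Retirez la batterie avant le nettoyage."),
]

def tm_lookup(segment, threshold=0.75):
    """Return the best (score, source, target) match at or above the
    threshold, or None. A score of 1.0 is an exact (100%) match."""
    best = None
    for src, tgt in memory:
        score = SequenceMatcher(None, segment, src).ratio()
        if score >= threshold and (best is None or score > best[0]):
            best = (score, src, tgt)
    return best

# A lightly updated sentence still scores as a high "fuzzy match",
# which is exactly the minor-update scenario TM was designed for.
hit = tm_lookup("Press the power button to restart the device.")
```

The sketch also makes the limitation visible: a segment with no close relative in `memory` simply returns nothing, which is the everyday situation for general translation work.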
While the concept of recycling prior translations may be immediately appealing, today's TM systems are, in my opinion, not designed to be a generically useful tool for translators at large. The technology won't win the same ubiquitous status as word processing unless it is re-designed.
In order to reassess the benefits of TM technology, it may be useful to compare it with example-based Machine Translation (EBMT) systems. EBMT produces translations on the basis of a bilingual corpus rather than pure linguistic analysis, so, as with TM, an EBMT system can produce a perfect translation of a sentence if a 100% match is found. The difference is that EBMT is primarily an automatic translation system, and its internal corpus is not prepared specifically for the input text. By comparison, TM is not an automatic translation system and assumes the input text to be related to the contents of the memory (e.g., an earlier version of a technical manual). The problem with TM, however, lies in that it can draw only on its internal memory, leaving it unable to access the wider pool of bilingual texts that may be available elsewhere.
What is needed by a typical translator, who takes on a wide range of jobs, is a much more open system with a search and replace functionality capable of identifying similar translations from a much wider source of texts, well beyond a personal or enterprise-wide TM system. Even if a TM's internal memory drawer is empty or does not contain anything relevant, the system should still work by drawing on external memory sources according to the source text. For example, indexed and searchable "questions and answers" from translators' forums are invaluable resources for translators - often much more so than the contents of individual TMs. So, why can these not also be linked to TM? This type of search and replace functionality, not limited by an organization's boundaries, can also help alleviate the common problem of losing the expert knowledge-base when an experienced translator is no longer available.
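As a sketch of such an open system, the lookup below falls back on external bilingual pools (for instance, an indexed forum archive of translators' questions and answers) whenever the translator's own memory is empty or has no relevant entry. The `open_tm_lookup` interface, the data structures, and the example phrases are hypothetical; they illustrate the layered-search idea described above, not an existing product.

```python
from difflib import SequenceMatcher

def best_match(segment, corpus, threshold=0.75):
    """Best (score, source, target) pair in a bilingual corpus, or None."""
    best = None
    for src, tgt in corpus:
        score = SequenceMatcher(None, segment, src).ratio()
        if score >= threshold and (best is None or score > best[0]):
            best = (score, src, tgt)
    return best

def open_tm_lookup(segment, local_memory, external_sources):
    """Try the translator's own memory first; if it is empty or holds
    nothing relevant, fall through to external pools in order."""
    hit = best_match(segment, local_memory)
    if hit:
        return hit
    for pool in external_sources:  # e.g. forum archives, shared TMs
        hit = best_match(segment, pool)
        if hit:
            return hit
    return None

# The local memory is empty, yet the forum archive still yields a match,
# so the system keeps working beyond one organization's boundaries.
forum_qa = [("out of the box", "pret a l'emploi")]
hit = open_tm_lookup("out of the box", [], [forum_qa])
```

Ordering the pools by trust (personal memory, then enterprise memory, then public archives) is one plausible design choice; the point is simply that an empty internal drawer no longer means an empty result.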
The whole rationale of TM is to avoid re-inventing the wheel; if a brilliant rendition of a word, phrase or expression is available, it may as well be reused by another translator to avoid repeating the same effort. To my mind, however, the current TM technology is not taking full advantage of the true strength of powerful computers and their networking capabilities to augment the human translation process by allowing access to existing translation solutions.
Whether or not 2003 will witness the start of a new approach to TM is not clear, but one hopes that developers will respond to the considerable dissatisfaction among translators with today's TM and finally deliver what the wider translator community really needs.

Macro view: foreign language policy
Turning now to a more macro picture of the factors likely to affect the language industry, my recent participation in the AILA 2002 (International Association for Applied Linguistics) Conference in Singapore provides a couple of pointers. Firstly, the large number of papers devoted to foreign language education (including Computer-Assisted Language Learning) can be taken as a sign of continuing difficulties faced in language teaching (and learning) and of the ongoing efforts to find the best way to teach languages. Foreign language teaching/learning concerns the language industry in a most fundamental way as our sector relies on human resources for its source and target language processing capabilities. So, for example, if a breakthrough methodology or technological tool is developed to make the human language learning process more effective and rapid, the industry will benefit in a most obvious way. Having been involved in the teaching of both translation technology and language, as well as working as a translation practitioner, I firmly believe that skills in the use of technology alone do not make a great translator, although such skills are becoming increasingly essential.
Contrary to AILA's emphasis on language education, however, it is rather disconcerting to see that the British government has decided to abolish foreign languages as a compulsory subject in secondary schools, as in the U.S. This policy will certainly have a knock-on impact on our industry. While it in part reflects the clear dominance of English as an international communication tool, it also ignores the increasing demand for finely tuned communication in the local context, which the localization industry epitomizes.
The very success of the Internet shows the two sides of the coin: in one sense, the Internet may be seen as cementing the perception of English as the center of the linguistic universe, furthering globalization based on English. On the flip side, however, the Internet has enabled many different language communities to create global connectivity on the basis of their own cultural contexts and local languages. The declining number of students learning foreign languages (other than English) will inevitably put pressure on the human resource pool available to our industry.

Culture as key
Another pointer I sensed from the above conference was the revival of culture in linguistics. The Sapir-Whorf hypothesis, otherwise known as linguistic relativity, has never won a central place in linguistics because it cannot be scientifically proven. However, with current advances in cognitive science, interest in such concepts is growing. Simply put, linguistic relativity holds that one's world view is colored by the language one speaks.
Extending this hypothesis, one may say that language is in turn situated in culture and that, therefore, contextualization of communication relies on the use of the local language and its cultural conventions. Relating this to our industry, the fact that culture matters is increasingly demonstrated by such tasks as Web localization. The latter is often not a matter of simply rendering a word in one language into another, but almost always involves a wider scope of adjustments, including the use of appropriate icons, images, page design, etc., according to the cultural conventions of the target audience.
The role of cultural knowledge in translation has tended to be mainly implicit in conventional, offline text-based environments. However, within the new online multimedia environments, it seems to be assuming a much more explicit role. While translators have always known that culture matters and is something that cannot be separated from the process of translation or localization, cultural knowledge seems to be increasing in importance with the advent of Internet-based, inter-lingual interactions.
In fact, the more that people whose first language is not English use English as a tool for global communication, the sharper the distinction will become between the language they use in a global context to navigate cyberspace and the language they use in local interactions, online and offline. This is what the late Professor Michael Dertouzos, former Director of the MIT Laboratory for Computer Science, described as a "veneer" on which people carry out practical tasks in the virtual world. This veneer covers another layer underneath that is based on the speaker's mother tongue and culture, a layer much more deeply seated than the "veneer" on top. This explains why e-commerce sites need to be localized into local language(s) according to cultural conventions that appeal on an emotional level to the targeted consumers.

Conclusions
To summarize my view on the future course of the language industry: first, in the short-to-medium term, we may see the development of more widely applicable, generic TM tools, based on flexible but systematic access to relevant external resources rather than limited to specific internal memory contents. Such developments may also incorporate the approach used in EBMT.
Secondly, the changes in foreign language policies, teaching methodology and tools may impact the industry in the medium- to longer-term.
Thirdly, the importance of the role of culture will increasingly be recognized in global communication. Explicit culturalization processes may become integral parts of localization, particularly in online environments. This will mean that translators and localizers need to be thoroughly familiar with the cultural dimensions of the target language. Furthermore, more appropriate cultural models may be sought to facilitate globalization and localization processes.
Reprinted by permission from the Globalization Insider,
15 January 2003, Volume XII, Issue 1.1.
Copyright the Localization Industry Standards Association
(Globalization Insider: www.localization.org, LISA: www.lisa.org)
and S.M.P. Marketing Sarl (SMP) 2004