Language Quality-Assurance (QA) Software: Optimizing Your Documentation for a Global Audience
Language quality-assurance (QA) software is a technology that helps technical communicators ensure the quality and consistency of communications. In recent years, this software has seen a surge in popularity and usage, and for good reason. The cost has come down, making return on investment fairly easy to quantify and achieve within a short time. The implementation process, while not trivial, is becoming easier. And because the software can be used to help authors communicate more effectively with a global audience, more and more companies view it as a business imperative.
Ironically, language QA software, which is intended partly to help companies standardize their terminology, is known by many names, including controlled-authoring software, controlled-language checker, and automated editing software. The term language QA software better reflects the way companies use this technology today and helps potential users avoid the common misperception that these tools are being used only to impose severe restrictions on language.
In the 1970s, companies such as Caterpillar, Kodak, and Xerox invested substantial resources in the development of their own proprietary controlled-language checkers. As the term controlled-language checker implies, the goal was to help authors conform to a controlled language—a subset of the English language.
In many early versions of controlled language, terminology and sentence structures were severely restricted. For example, Kodak International Service Language was limited to only the most basic sentence structures, tenses, and terminology. By simplifying the language in their service manuals and then giving their service technicians around the world a little bit of training in English, the company eliminated the need to translate those manuals into 140 different languages.
Other companies developed controlled languages and controlled-language checkers in order to make machine translation (the use of translation software) feasible. Even in the 1970s and 1980s, machine-translation software could translate simple texts from one language to another without requiring much post-editing (error corrections and stylistic improvements). As long as technical terms and other noun phrases are pretranslated and added to the software's dictionaries, machine translation was, and still is, cost-effective.
Controlled-language checkers such as Boeing's Simplified English Checker have long been used to ensure compliance with ASD Simplified Technical English (ASD-STE100). The ASD-STE100 standard was developed for the aerospace industry and is now being used in other industries as well.
By the late 1990s, many individuals realized that the English language did not need to be so tightly controlled in order for the language-checking software to produce benefits. For a manufacturing company, even the need to ensure consistency in the names of parts and components might be enough of a business justification for implementing the software. The emphasis began to shift from controlling language to ensuring the quality and consistency of language.
Commercial software vendors such as acrolinx, Smart Communications, and Tedopres have emerged, eliminating the need for companies to develop their own language QA software and thereby greatly reducing the cost and time required for implementation.
Nowadays, companies use language QA software for a variety of reasons.
Companies that use this type of software include Bosch, Drager Medical, Motorola, Philips, SAS Institute, Schneider Electric, Siemens, Symantec, and many others. The combined client lists of the three vendors mentioned above easily exceed 100 customers, and that number is growing rapidly.
Better than Off-the-Shelf
In some respects, language QA software is little more than a combination of a grammar checker, style checker, and spelling checker. However, there are some important differences between language QA software and off-the-shelf products:
Flexibility. All of the language QA software products that I am familiar with can be used with multiple authoring tools.
Customization. Language QA products are highly customizable. Users choose which style issues and grammar rules they want the software to check for, and they typically work with the vendor to eliminate as many “false alarms” (often caused by idiosyncrasies in technical documentation) as possible. Vendors can also make the software flag certain grammatical language patterns as errors—patterns that are inherently ambiguous, or those that are not likely to be understood by non-native speakers, for example.
Term harvesting. Language QA products typically include a term-harvesting component, or else the vendor offers term harvesting as an additional service. Term harvesting (identifying all the unique terms in a document) helps identify variant spellings and even variant phrases. The software then flags the deprecated variants as errors and suggests the approved forms, which users can substitute with a mouse-click. Term harvesting also makes it possible to identify approved technical terms that would otherwise be flagged as spelling errors. Adding approved terms to the software’s dictionary also makes the software’s style-checking and grammar-checking components more accurate.
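To make the idea of term harvesting more concrete, here is a minimal sketch in Python. It is purely illustrative, not any vendor's actual implementation: it extracts candidate terms from a document and groups surface forms that differ only in hyphenation or capitalization, which is exactly the kind of variation a real term-harvesting component would surface for a terminologist to review.

```python
import re
from collections import defaultdict

def harvest_terms(text):
    """Collect candidate terms and group variant spellings that
    differ only in case or hyphenation (e.g., 'e-mail' vs. 'Email')."""
    words = re.findall(r"[A-Za-z][A-Za-z-]*", text)
    variants = defaultdict(set)
    for word in words:
        key = word.lower().replace("-", "")  # normalization key
        variants[key].add(word.lower())
    # Keep only terms that occur in more than one surface form
    return {k: sorted(v) for k, v in variants.items() if len(v) > 1}

sample = "Email the log file. Then e-mail the setup file. The set-up takes a minute."
print(harvest_terms(sample))
# -> {'email': ['e-mail', 'email'], 'setup': ['set-up', 'setup']}
```

A production tool would of course also handle multiword phrases, inflection, and frequency counts; the point here is only that variant detection reduces to normalizing each term and grouping the forms that collide.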
Groundwork for Success
If you believe that language QA software is right for your company or organization, start by doing a few things that will help ensure your long-term success.
First, make sure that you and your management have a clear idea of the benefits that you hope to achieve. Also make sure your management is committed to requiring all authors and editors in your division or company to use the software. Invite your prospective vendor to educate your staff about how important the software is to attaining your objectives.
The experiences of a few companies whose implementations of language QA software failed over the long term show that use of the software must not be allowed to become optional under any circumstances. Quality assurance is a misnomer when only a subset of your workforce is using QA tools or following QA processes. That statement is as true of language QA as of any other type of QA program.
Imposing too many stylistic or terminology restrictions on users at once is another possible reason for failure. However, unless you try to implement a highly restrictive controlled language, the too-much-too-fast syndrome is unlikely to occur. You will almost certainly choose to get the software into production with a starter set of rules and terminology restrictions. That approach enables you to get some immediate value from the software. There will be plenty of time for gradual enhancements later.
The Implementation Process
The following tasks are typical of any implementation of language QA software, regardless of which vendor you choose. Many of the tasks overlap, but this list will give you a general idea of what is involved.
Test the software. Assemble two collections of documents that are representative of what your company produces. You and the vendor need these in order to test the software’s style rules, grammar rules, and spell checking against your documents. One collection should be quite large, for comprehensive testing. (Ask the vendor for more specifics.) The other collection should be smaller—perhaps 1,000,000 to 2,000,000 words—for preliminary testing, so that you won’t have as much output to evaluate during your initial customization process.
Choose rules. Decide which of the vendor’s standard set of style rules and grammar rules you want to implement. If business requirements mandate that you follow an already defined standard such as ASD-STE100, then of course you will be using the rules that support that standard. The vendor typically runs the selected rules in batch mode against your collection of documents. Then you review the output in order to identify the “false alarms” that the software reports. The vendor customizes the rules in order to eliminate as many of the false alarms as possible or practical.
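To illustrate what running rules in batch mode involves, the following toy sketch in Python applies a few regex-based style rules to a document and reports every match for human review. The rules shown are simplistic placeholders that I invented for illustration; a vendor's linguistically informed rules are far more sophisticated, and reviewing output like this is precisely how you identify the false alarms to be tuned away.

```python
import re

# Hypothetical starter rules: each maps a rule name to a pattern
# that flags a possible style issue (illustrative only).
RULES = {
    "passive voice (simple)": re.compile(r"\b(is|are|was|were)\s+\w+ed\b"),
    "future tense": re.compile(r"\bwill\s+\w+\b"),
}

def check_document(text):
    """Run each rule against the text and collect (rule, snippet)
    findings so that false alarms can be identified and rules tuned."""
    findings = []
    for name, pattern in RULES.items():
        for match in pattern.finditer(text):
            findings.append((name, match.group(0)))
    return findings

doc = "The valve is opened by the controller. The pump will start."
for rule, snippet in check_document(doc):
    print(f"{rule}: {snippet!r}")
```

In a real batch run, the checker would process thousands of files and write its findings to a report; the review-and-tune cycle is the same either way.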
Consider customization. Determine whether there are other style issues that you want the software to detect. Ask the vendor whether they can customize the software to detect those issues, and discuss the time frame and cost of doing so.
Plan pilot projects. Line up participants for some pilot projects.
Consolidate lists of terms. If you already have lists of terms that you want the software to flag as errors, assemble those terms into a single file. A spreadsheet application such as Microsoft Excel is suitable until you can get the terms into your language QA software’s repository.
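As a rough illustration of that consolidation step, the sketch below merges several CSV term lists into one deduplicated, sorted file suitable for later import. The file names and two-column layout (deprecated term, approved replacement) are my own assumptions, not a prescribed format; check with your vendor for the import format their repository expects.

```python
import csv
import glob

def consolidate_term_lists(pattern, out_path):
    """Merge CSV files of (deprecated term, approved term) pairs
    into one deduplicated, sorted master list."""
    pairs = set()
    for path in glob.glob(pattern):
        with open(path, newline="") as f:
            for row in csv.reader(f):
                if len(row) >= 2 and row[0].strip():
                    pairs.add((row[0].strip(), row[1].strip()))
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["deprecated", "approved"])
        writer.writerows(sorted(pairs))

# Example: combine every team's list into a single master file
# consolidate_term_lists("term_lists/*.csv", "master_terms.csv")
```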
Consider term harvesting. Ask the vendor what their policy and process are for term harvesting. Even if the vendor does the term harvesting for you, you will have to review the output and determine which terms and phrases to approve and which to deprecate.
Conduct pilot projects. Collect feedback after the pilot projects are over.
Finalize details with the vendor. Give the vendor a list of any requirements or further customizations that you want, and agree on an approximate timetable and cost.
Roll out the product. Plan and conduct your rollout, with initial training provided either by the vendor or by someone on your staff.
Three to six months is a typical amount of time for getting an initial set of rules and terminology restrictions in place and for putting the software into production. The process of refining existing rules and adding new rules and terminology restrictions can continue indefinitely. Vendors will certainly have some parameters for how much customization is included in your initial implementation and for how much further customization or refinement will cost. Minor changes might be covered in a maintenance agreement.
During the initial implementation, you will need at least one person to devote half of their time to project management. Of course, that person will require collaboration and input from many colleagues in order to progress through the implementation steps.
After the rollout, support requirements depend on how much you want to extend the functionality of the software and on how rapidly you want to make those extensions. During the initial implementation, you were probably eager to get the software into production quickly so that you could begin to get some value from it. But in order to get the most value from the software, you will need to devote much more time and attention to term harvesting and perhaps to implementing additional style rules as well.
A few companies have full-time linguistic engineers and terminologists working to identify and implement new style rules and to classify terms that are collected during term harvesting. However, most companies have one or two people with aptitude and interest in those areas who support their users and who work to extend the capabilities of the software as time permits.
A support person typically performs a variety of non-linguistic tasks as well.
After the Rollout
Most writers recognize that language QA software helps them improve their writing skills, which makes them more competitive in the job market and less vulnerable to having their jobs outsourced. In the August 2007 issue of ClientSideNews Magazine, I explained how writers benefit from language QA software:
In any implementation, there are usually a few employees who must be convinced that attending to the details of language quality and consistency is part of their job. However, as one executive at SAS Institute pointed out, "Most people want to do a good job. If you make it easy for them to do a good job, they are usually happy to oblige." The employees who resist using language QA software are usually the ones who have not tried it yet. They need to be given a gentle nudge so that they can discover how easy it is and how much they learn by using it.

The SAS implementation is aimed in part at helping writers and editors conform to a detailed set of Global English guidelines (Kohl 2008). Because some SAS documentation is not translated, SAS has long been interested in making that documentation easier for non-native speakers of English to read and comprehend. Language QA software puts that goal within reach.
In the January 2008 Intercom, JoAnn Hackos writes, "[i]n a highly competitive global environment, we must look for ways to reduce costs, gain efficiencies, and prove that the work we do adds value for our employers." Language QA software is one tool that helps many companies meet those objectives. It can be adapted to help companies meet different criteria for language quality and to support different business objectives, including communicating more effectively with a global audience.
More and more companies are recognizing that, in contrast to their preconceived notions, language QA software ultimately makes their writers and editors more productive. Even non-professional communicators can use the software to improve their writing skills and to communicate more effectively with colleagues and customers around the world.
As vendors of language QA software and their customers collaborate to exploit the software’s potential more fully, it seems likely that the software will become an important part of many technical communicators’ tool sets.
Akis, Jennifer Wells, and William R. Sisson. 2002. "Improving Translatability: A Case Study at Sun Microsystems, Inc." The LISA Newsletter: Globalization Insider 4.5. Available at www.lisa.org/globalizationinsider/2002/12/improving_trans.html.
Dillinger, Mike, and Arle Lommel. 2004. LISA Best Practices Guide: Implementing Machine Translation. Geneva: Localization Industry Standards Association. Available at www.lisa.org/Best-Practice-Guides.467.0.html.
Hackos, JoAnn T. 2008. “Information Development in a Flat World.” Intercom 55.1 (January): 22-25.
Kohl, John R. 1999. “Improving Translatability and Readability with Syntactic Cues.” Technical Communication 46.2 (May): 149-166.
Kohl, John R. 2007. "Assisted Writing and Editing at SAS." ClientSideNews Magazine (May). Available at www.clientsidenews.com/downloads/CSNV7I8.pdf.
Kohl, John R. 2008. The Global English Style Guide: Writing Clear, Translatable Documentation for a Global Market. Cary, NC: SAS Press.
O'Brien, Sharon, and Johann Roturier. 2007. "How Portable Are Controlled Language Rules? A Comparison of Two Empirical MT Studies." In Machine Translation Summit XI, ed. Bente Maegaard, September 10-14, 2007, Copenhagen. Available at www.mt-archive.info/MTS-2007-OBrien.pdf.
Strong, Kathy L. 1983. “Kodak International Service Language.” Technical Communication 30.2 (May): 20-22.
Wojcik, Richard H., and Heather Holmback. 1996. “Getting a Controlled Language Off the Ground at Boeing.” Proceedings of the First International Workshop on Controlled Language Applications (CLAW96). Leuven, Belgium: Katholieke Universiteit Leuven Centre for Computational Linguistics, March 26-27, 1996.
John R. Kohl (John.Kohl@sas.com) is a linguistic engineer at SAS Institute in Cary, North Carolina. He is the author of The Global English Style Guide: Writing Clear, Translatable Documentation for a Global Market.
Published - May 2008
ClientSide News Magazine - www.clientsidenews.com