Presented by Kent Taylor at Documentation and Training West, May 6-9, 2008 Vancouver, BC
Maintaining a reasonable level of quality and consistency across all of the content that gets into your customers’ hands has always been difficult. It was manageable when the majority of the content was written by groups of professional writers, edited by professional editors, and generally distributed in only one language: English.
Today, however, your customers get content from all manner of sources that were once intended only for internal consumption, where quality and consistency were less important. And in today’s global economy, chances are that much of that customer-facing content is translated and distributed in more than one language, or at least delivered to a large population of non-native speakers. This is where quality and consistency really pay off.
Using meaning-based natural language processing software, we have analyzed translation memories, software UI strings from very large systems, and large corpora of assorted customer-facing content. Nearly every content set we examined contains 15% to 25% redundancy, or more: clusters of sentences that say the same thing in slightly different ways.
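The software described above is meaning-based and proprietary, but even simple surface-level similarity can flag many redundant variants. The sketch below is a hypothetical, minimal illustration using Python's standard-library difflib; the function name, corpus, and threshold are all invented for this example and are not the talk's actual tooling.

```python
from difflib import SequenceMatcher


def near_duplicates(sentences, threshold=0.8):
    """Return sentence pairs whose surface similarity meets the threshold.

    SequenceMatcher.ratio() ranges from 0.0 (nothing in common) to 1.0
    (identical); pairs at or above the threshold are candidate redundancies
    a human editor should review and consolidate.
    """
    pairs = []
    for i in range(len(sentences)):
        for j in range(i + 1, len(sentences)):
            ratio = SequenceMatcher(
                None, sentences[i].lower(), sentences[j].lower()
            ).ratio()
            if ratio >= threshold:
                pairs.append((sentences[i], sentences[j], round(ratio, 2)))
    return pairs


# Hypothetical mini-corpus: the first two sentences are variants of each other.
corpus = [
    "Click the Save button to store your changes.",
    "Click Save to store your changes.",
    "Restart the device before continuing.",
]
print(near_duplicates(corpus))
```

Real redundancy analysis must go beyond string overlap (synonyms, reordered clauses), which is why the talk emphasizes meaning-based processing; this sketch only shows why even crude matching already surfaces consolidation candidates.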
This redundancy is a minor irritant to a native speaker, a bigger irritant to a non-native speaker or a poor reader, and a major irritant to your CFO: every one of those variants was translated, in some cases into 30+ target languages. On average, this kind of linguistic redundancy adds 20% to the cost of translation. Put another way, if you currently translate into five languages and could eliminate this redundancy in your source, your savings would be great enough to translate into a sixth language, and open up a new market.
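The savings arithmetic above can be checked directly. Assuming, as the talk does, that redundancy adds roughly 20% to the per-language translation cost (the base cost figure below is hypothetical):

```python
def translation_cost(base_cost_per_language, num_languages, redundancy_overhead=0.20):
    """Total cost of translating one source document into num_languages targets.

    redundancy_overhead models the talk's claim that linguistic redundancy
    adds ~20% to translation cost; 0.0 represents a cleaned-up source.
    """
    return base_cost_per_language * num_languages * (1 + redundancy_overhead)


base = 10_000  # hypothetical cost of translating the clean source into one language

with_redundancy = translation_cost(base, 5)          # 5 languages, redundant source
without_redundancy = translation_cost(base, 5, 0.0)  # 5 languages, cleaned source
savings = with_redundancy - without_redundancy

# 5 languages at 20% overhead cost the same as 6 languages with a clean
# source, so the savings exactly fund one additional language.
assert savings == base
```

The identity holds for any base cost: five languages at a 20% markup cost 6× the clean per-language price, so removing the redundancy frees exactly one language's worth of budget.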
This presentation will discuss these and other relevant content quality issues in depth, suggest ways to deal with them effectively, and present real-world examples of companies that have ‘been there, done that.’