Good Applications of Bad Machine Translation

Presented at the 6th Language Technology Conference, Cordoba, Argentina, April 2009

Transcript

  • 1. Good Applications of Bad Machine Translation (Bob Donaldson, VP Strategy)
  • 2. Promise of MT or eMpTy Promises? Anything worth doing is worth doing poorly.
  • 3. A Few Well-known Facts
    • Web content is growing 70% or more year over year
    • Almost 65% of all Internet users (more than 800 million) do not speak English!
    • Today, English speakers account for less than 36% of Internet users, a share that may decline to 15% by 2010
    • More than 70% of digital messages and documents are “born” in languages other than English
    • Non-English speaking Internet users are increasing by over 140 million per year
    • Emerging shortage of human translators relative to the vast and growing volume of digital content
  • 4. Tipping Point for Machine Translation?
    • Before:
    • Machine Translation success is always “a few years away”
    • Actual MT output requires substantial rework with very little overall cost/time savings
    • Incremental improvements through technology
      • Translation Memory
      • Translation Workflow Management
    • After (Are we there yet?):
    • Statistical MT is showing a 4:1 productivity improvement in some narrow domains of application
    • Integration of MT into high-volume workflows
    • Expansion of addressable market for translation services
  • 5. The Famous Triangle (diagram: content currently translated by humans, with MT + post-edit improvements expanding that segment and MT extensions adding coverage at reduced quality)
  • 6. The Famous Triangle (diagram: the same triangle, showing eventual MT success in a narrow domain alongside MT + post-edit improvements and the content currently translated by humans)
  • 7. Intel Experience (chart: LAR Spanish MT visits growth since January ’08)
    • Using SMT for “raw” translation of customer support site
    • MT increased Spanish content from 12% to 100%.
    • Eliminated 95% of human translation
    • Project cycle shortened from 2-3 weeks to every 24 hours
    • Survey shows MT resolves 40% of customers’ questions/problems vs. 41% for human translation
    • Developing MT systems for more languages
    * Source: Will Burgett, Global Support Summit, 2008
  • 8. Microsoft Experience (Source: Rich Kaplan, Localization World, Seattle, 2007)
    • Users see human-edited MT translations
    • Increased efficiency (and cost savings)
    • Used when accuracy is critical
      • E.g., MT output is post-edited by localizers during translation of documentation and software strings
    • Users see unedited MT translations
    • Enables translation of material that otherwise goes un-translated
    • Used when some errors can be tolerated
      • E.g., when the alternative is nothing at all: The CSS knowledge base
      • Users are motivated!
  • 9. Microsoft Experience *
    • Savings range from 5% to 25% in translation time/cost
    • Depends on division of labor, post-editing quality guidelines, translator training, and vendor
    * Source: Rich Kaplan, Localization World, Seattle, 2007
  • 10. What about small volumes?
    • Google?
    • MT Vendors?
    • SaaS?
    • But what about quality?
    • Cost/Quality Correlation
    • Assumes Fully Trained SMT
    • Quality of “Raw” MT is Suspect (at best)
    • * Source: Kirti Vashee, TAUS Workshop, Beijing, 2007
    (chart: quality of translation rising from raw SMT, to TM + SMT, to TM + SMT + post-edit, to TM + SMT + HT review)
  • 11. So … What good is Bad Machine Translation?
    • Part of an integrated use-case in research and discovery applications …
    • Searching is as much art as science
    • Primary goal is to establish relevance
    • May also include identifying absence of a particular topic or term
    • Examples:
      • Patent Search
      • Litigation Support
  • 12. Translation Services Marketplace Taxonomy
    • Customer as content consumer (e.g. individual researchers)
      • Traditional “Translation Agency” target
      • Multiple sources translated individually
      • Little motivation for adopting TM or MT
    • Customer as content creator (e.g. Ford, Oracle, etc.)
      • “Localization Company” target
      • Large body of source material, often under version control
      • Opportunities for controlled authorship, terminology management, TM, etc. to control cost
      • Emerging opportunity for MT
    • Customer as content aggregator (e.g. Google, LexisNexis)
      • Value is in centralized search/selection support (information retrieval)
      • Economies of scale to meet needs of content consumer
      • Requires MT at some level to be viable
  • 13. MT + Search + HT = Cost-Effective Solution (diagram contrasting two approaches: the entire corpus translated for human readers, with total cost proportionate to quality and volume and assuming fully trained SMT, versus rough MT of the entire corpus translated for index/search plus minimally trained, on-demand HT, giving lower overall cost plus highest quality where it is needed)
  • 14. Data Aggregation Perspective (diagram: rough MT that may not be “human ready”, analytics to support ‘triage’, and “on-demand” human translation; a minimal sketch of this workflow follows)
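The workflow of slides 13-14 can be illustrated with a short example. This is a minimal sketch only, not anything shown in the presentation: `machine_translate`, `human_translate`, and the keyword-count relevance score are hypothetical placeholders for a trained SMT engine, an on-demand human translation service, and whatever analytics actually drive the triage.

```python
# Minimal sketch of the "rough MT for search, human translation on demand" workflow.
# machine_translate, human_translate, and relevance_score are hypothetical placeholders,
# not APIs from any product mentioned in the slides.

def machine_translate(text: str) -> str:
    """Cheap, rough MT used only to make the corpus searchable."""
    return text  # a real system would call a trained SMT engine here

def human_translate(text: str) -> str:
    """Expensive, high-quality translation ordered on demand."""
    return text  # a real system would route the source text to a translator

def relevance_score(rough_translation: str, query_terms: list[str]) -> int:
    """Basic triage: count query-term hits in the rough translation."""
    lowered = rough_translation.lower()
    return sum(lowered.count(term.lower()) for term in query_terms)

def triage(corpus: dict[str, str], query_terms: list[str], budget: int) -> dict[str, str]:
    """Rough-translate everything for indexing, rank by relevance, and send
    only the top `budget` documents to human translation."""
    rough = {doc_id: machine_translate(text) for doc_id, text in corpus.items()}
    ranked = sorted(rough, key=lambda d: relevance_score(rough[d], query_terms), reverse=True)
    return {doc_id: human_translate(corpus[doc_id]) for doc_id in ranked[:budget]}
```

The point of the sketch is only the shape of the pipeline: every document gets a cheap rough translation so it can be indexed and searched, and the expensive human step is bought one document at a time, only for the material the triage identifies as relevant.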
  • 15. Just in Time Translation
    • Manufacturing Model
      • Product Design & Prototyping
      • Publish Product Catalog
      • Build to Order
      • No inventory
    • Translation Analog (a minimal translate-to-order sketch follows this slide)
      • Train/Configure MT System
      • Integrate with Retrieval System
      • Translate to Order
      • Free inventory!
    • Iterative Improvement Loop
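One way to read the “translate to order, free inventory” analogy in code: nothing is translated ahead of demand, but every translation that is ordered is kept, so the inventory of finished translations grows at no extra cost. A minimal sketch, with `translate_with_mt` standing in for a trained/configured MT system (an assumption, not a reference to any specific engine):

```python
# "Just in time" translation: translate only when a document is requested,
# and keep the result so the inventory of finished translations grows for free.
# translate_with_mt is a hypothetical stand-in for a trained/configured MT system.

_inventory: dict[tuple[str, str], str] = {}

def translate_with_mt(text: str, target_lang: str) -> str:
    # Placeholder for a call into the trained MT system.
    return f"[{target_lang}] {text}"

def translate_to_order(doc_id: str, text: str, target_lang: str) -> str:
    """Return a stored translation if one exists, otherwise produce it now
    and add it to the inventory (the slide's "free inventory")."""
    key = (doc_id, target_lang)
    if key not in _inventory:
        _inventory[key] = translate_with_mt(text, target_lang)
    return _inventory[key]
```

The slide's iterative improvement loop would then feed corrections gathered on ordered translations back into MT training and configuration.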
  • 16. E.g.: Unified Legal Analysis Environment (diagram: English and French document sets, IPX document profiling and IPX NLP processes, document-specific concept profiles, a translation process of ~1% HT and ~99% MT, keyword translation, a basic priority scoring process, and a unified document set with unified correlation matrices feeding paralegal analysis)
  • 17. E.g.: Chinese Patent Search Pilot
    • Matrixware Information Services
      • Information Retrieval Specialists
      • Targeting Individual Knowledge Workers
    • Asia Online
      • Statistical MT Experts
      • Custom Domain Development
      • Real-time Feedback for MT Improvement
    • McElroy Translation
      • Training & Tuning Set Development
      • MT Quality Assessment Services
      • Ongoing MT Quality Improvement Services
      • Quick-turn Human Translation “On Demand”
  • 18. Project Goals & Status
    • Goal: Proof of Concept
      • Validate “searchability” of patent database (recall/precision sketch after this slide)
        • High recall (finding what is there)
        • Acceptable precision (eliminating “noise”)
      • Validate rate of quality improvement
        • Utilizing Asia Online interface
        • Filling technical vocabulary gaps
      • Validate customer acceptance
    • Status
      • SMT training to be complete this month
      • 90-day quality improvement cycle to follow
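Recall and precision carry their usual information-retrieval meanings here. The helper below is only an illustration of how the pilot's two measures could be computed against a set of documents judged relevant; the document IDs are made up.

```python
# Recall:    relevant documents retrieved / all relevant documents ("finding what is there")
# Precision: relevant documents retrieved / all retrieved documents ("eliminating noise")

def recall_and_precision(retrieved: set[str], relevant: set[str]) -> tuple[float, float]:
    hits = len(retrieved & relevant)
    recall = hits / len(relevant) if relevant else 0.0
    precision = hits / len(retrieved) if retrieved else 0.0
    return recall, precision

# Hypothetical example: 4 documents retrieved, 3 judged relevant, 2 of them found.
retrieved = {"CN101", "CN102", "CN103", "CN104"}
relevant = {"CN102", "CN104", "CN105"}
print(recall_and_precision(retrieved, relevant))  # roughly (0.67, 0.5)
```

The slide's wording reflects the priority: recall must be high, while precision only needs to be acceptable, since a relevant document the search never surfaces cannot be recovered by reading more of what was retrieved.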
  • 19. Contact Details
    • Bob Donaldson
    • VP Strategy
    • McElroy Translation Company
    • 910 West Avenue
    • Austin, TX 78701
    • +1 (512) 472-6753
    • [email_address]
    • www.mcelroytranslation.com