
"Human Capacity Cognitive Computing" by Patrick Ehlen, PhD and Chief Scientist at Loop AI Labs (loop.ai)


Many organizations today have a supermassive black hole at the center of their Digital Transformation strategy: around 90% of the data they hold about their customers, employees, and business processes persists as "dark data" -- data locked away as ordinary human language in reports, e-mails, internal documents, and customer service tickets, none of which a machine can easily understand. People can read and make sense of small batches of such data, but the volume most organizations now hold has become too massive, and in-depth analysis of it exceeds their resources. Fortunately, the dawn of a new era is upon us: a branch of artificial intelligence known as "Human Capacity Cognitive Computing" can provide human-level understanding of this dark data, giving organizations valuable insight into its contents. Moreover, it can connect disparate pieces of the big data puzzle, providing a unified "big data big picture" of an organization's past, present, and future role in the marketplace.

About Patrick

Patrick Ehlen is Chief Scientist at Loop AI Labs. He has pursued artificial intelligence since 3rd grade, after reading Arthur C. Clarke's novel 2001: A Space Odyssey. In 1977, before most people had computers, his econometrician father bought a DEC PDP-11 to use in their home, and Patrick became obsessed with figuring out how to make the machine talk.

Fifteen years later, in 1992, inspired by the Parallel Distributed Processing volumes, he trained his first neural network to learn distributed representations of concepts.

Patrick earned a PhD in Cognitive Psychology from the New School for Social Research in 2005. While completing his PhD, he interned at speech recognition pioneer Dragon Systems (acquired by Nuance) and then at AT&T Labs-Research, trailblazing innovations in speech and multimodal interface technology.

He then went to the Center for the Study of Language and Information (CSLI), an AI mecca at Stanford University. As a research scientist in Stanford's Computational Semantics Lab, he worked on the DARPA CALO AI project and devised machine learning methods to extract concepts and topics from ordinary, spontaneous conversations.

When his old CALO colleague Bart Peintner approached him about the work underway at Loop AI Labs, Patrick saw an opportunity to advance the state of the art using techniques from Deep Learning that were just beginning to emerge.

Patrick has been awarded four U.S. patents and produced 45 research publications in the areas of computational semantics, cognitive linguistics, psycholinguistics, word sense disambiguation, human concept learning, and artificial intelligence. His work is cited in over 430 scientific papers.

LinkedIn: https://www.linkedin.com/in/patrickehlen
Twitter: @patrick_ehlen


  1. Human Capacity Cognitive Computing. Patrick Ehlen, Chief Scientist, Loop AI Labs. March 16, 2017
  2. Cognitive Computing Platforms • IDC forecast: $12.5B market in 2019 (CAGR of 35%) • By 2018, half of consumers will regularly interact with cognitive computing services • How will it work? • What is “cognitive computing,” anyhow?
  3. Which is more difficult?
  4. Which is more difficult?
  5. Communicate in multiple modalities
  6. Embed and recurse over long sequences: This is the rat that ate the malt that lay in the house that Jack built.
  7. “Discrete Infinity”: infinite use of finite means. This is the farmer sowing the corn, that kept the cock that crowed in the morn, that waked the priest all shaven and shorn, that married the man all tattered and torn, that kissed the maiden all forlorn, that milked the cow with the crumpled horn, that tossed the dog, that worried the cat, that killed the rat, that ate the malt that lay in the house that Jack built.
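The "infinite use of finite means" on slides 6–7 can be made concrete: a handful of rewrite rules generates arbitrarily deep right-embedded clauses. A minimal sketch (the rule list and function name are illustrative, not from the talk):

```python
# A finite set of noun/relative-clause rules can generate unboundedly
# long sentences -- "discrete infinity" from finite means.
LINKS = [
    ("the malt", "that lay in"),
    ("the rat", "that ate"),
    ("the cat", "that killed"),
    ("the dog", "that worried"),
]

def jack(depth):
    """Right-embed `depth` relative clauses around a finite core."""
    s = "the house that Jack built"
    for noun, verb in LINKS[:depth]:
        s = f"{noun} {verb} {s}"
    return "This is " + s + "."

# jack(2) reproduces the sentence from slide 6; larger depths keep
# embedding with no new rules needed.
```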
  8. • Multimodal Communication • Recursion • Discrete Infinity
  9. • Multimodal Communication • Recursion • Discrete Infinity: “The Human Capacity”
  10. • Multimodal Communication • Recursion • Discrete Infinity: “The Human Capacity”
  11. “The Human Capacity” — 3 possible sources: • Interface • Learning Algorithm • Architecture
  12. “The Human Capacity” — 3 possible sources: • Interface • Learning Algorithm • Architecture • Deep Learning
  13. Hinton (’81, ’86) • Assemblies for different semantic roles (hack). Hinton, G.E. (1981) Implementing semantic networks in parallel hardware. Hinton, G.E. (1986) Learning distributed representations of concepts.
  14. Frankland & Greene (2015) • Assemblies for different semantic roles (brain). Frankland, S.M. & Greene, J.D. (2015) An architecture for encoding sentence meanings in left mid-superior temporal cortex. PNAS 112:37
  15. Recap • (Even though we can all do it…) Language is Hard • The “Human Capacity” probably arises from a special architecture
  16. Semantics • What does anything mean?
  17. Dictionary approach • Define a thing by its necessary & sufficient features: Bachelor #1, Bachelor #2
  18. Bachelor #1, Bachelor #2
  19. Bachelor #1, Bachelor #2. Katz, J.J. & Fodor, J.A. (1963) The structure of a semantic theory. Language 39:2
  20. “Units Representation”: Bachelor #1, Bachelor #2 as features: noun, human, animal, male, never-married, young, unmated-seal, academic-degree
  21. “Matrix Representation”
  22. “Vector Space Representation”: Bachelor #1, Bachelor #2, human
  23. Problems with Dictionary Approach: • “Necessary and sufficient” features: neither necessary nor sufficient • Prototypical examples (E. Rosch): “robin” -> more representative of bird than “finch” or “penguin” • Things don’t categorize so easily • Metaphorical & analogic nature of language (G. Lakoff & M. Johnson, D. Hofstadter)
  24. Problems with Dictionary Approach: • “Edge cases” (C. Fillmore): “Widow”: woman who murdered her husband? “Max went too far today and teapotted a policeman” (H. Clark)
  25. Distributional approach: • Determine what words mean solely by their lexical context (surrounding words): the quick brown fox jumped over the lazy dog
  26. Distributional approach: • Determine what words mean solely by their lexical context (surrounding words): the quick brown fox jumped over the lazy dog
  27. Distributional approach: • Determine what words mean solely by their lexical context (surrounding words): the quick brown fox jumped over the lazy dog
  28. Distributional approach: • Determine what words mean solely by their lexical context (surrounding words): Context Features
  29. Distributional approach: • Determine what words mean solely by their lexical context (surrounding words) • Use dimensionality reduction to collapse into latent factors (or “microfeatures”)
  30. Distributional approach: Latent Context Features
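Slides 25–30 describe the distributional pipeline: count which words surround which, then reduce dimensionality to collapse raw context counts into latent "microfeatures". A minimal sketch with a toy corpus; the corpus, window size of 1, and all names are assumptions for illustration:

```python
import numpy as np

# Toy corpus: words are characterized purely by the words around them.
corpus = [
    "the quick brown fox jumped over the lazy dog".split(),
    "the quick brown cat jumped over the lazy rat".split(),
]
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Word-by-word co-occurrence matrix (window of 1 on each side).
C = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in (i - 1, i + 1):
            if 0 <= j < len(sent):
                C[idx[w], idx[sent[j]]] += 1

# Dimensionality reduction: truncated SVD collapses the raw counts
# into k latent "microfeatures" per word.
U, S, Vt = np.linalg.svd(C)
k = 3
vectors = U[:, :k] * S[:k]          # one dense k-dim vector per word

def similarity(a, b):
    va, vb = vectors[idx[a]], vectors[idx[b]]
    return va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb))

# "fox" and "cat" occur in identical contexts here, so their latent
# vectors end up aligned.
```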
  31. Local Representation: noun, human, animal, male, never-married, young, unmated-seal, academic-degree
  32. Distributed Representation: x0 x1 x2 x3 x4 x5 x6 x7
  33. Distributed Representation: x0 x1 x2 x3 x4 x5 x6 x7 (young)
  34. bachelor X1
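The contrast between slides 31 and 33 can be sketched directly: a local representation switches on one unit per feature, while a distributed representation spreads the same concept across all units x0..x7, so no single unit "means" anything on its own. The random projection below stands in for a mapping that would be learned in practice; all names are illustrative:

```python
import numpy as np

# Local ("units") representation: one unit per feature, as on slide 31.
features = ["noun", "human", "animal", "male", "never-married",
            "young", "unmated-seal", "academic-degree"]
f = {name: i for i, name in enumerate(features)}

def local_rep(active):
    v = np.zeros(len(features))
    for name in active:
        v[f[name]] = 1.0                # exactly one unit per feature
    return v

bachelor1 = local_rep(["noun", "human", "male", "never-married"])

# Distributed representation: the same concept becomes a dense pattern
# of graded activity over units x0..x7. A random projection stands in
# for a learned one here.
rng = np.random.default_rng(0)
W = rng.normal(size=(len(features), 8))
bachelor1_dist = bachelor1 @ W          # meaning lives in the whole pattern
```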
  35. Deep Learning • Learn distributed representations using neural networks • Learn from data as it comes in • Learn from sequences (e.g., sentences)
  36. Deep Learning • Learn from lots of additional context features (not just other words) • Visual features (CNNs) • Parse structure (Recursive NNs) • Higher-level abstractions from earlier sequences (RNNs)
  37. Deep Learning • Learn from lots of additional context features (not just other words): “Max went too far today and teapotted a policeman” (H. Clark)
  38. Deep Learning • Learn from lots of additional context features (not just other words) • For Human Capacity Cognitive Computing: HUGE potential “context feature” input space • Very sparse
  39. Deep Learning • Large, sparse input fully connected to many layers • Complex memory assemblies • RNN • LSTM or GRU to retain relevant context features from further upstream
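Slide 39's idea -- a recurrent state that carries sparse context features downstream so later words are interpreted in light of earlier ones -- can be sketched with a plain (ungated) recurrent step; a real system would use LSTM or GRU gates to retain context over longer spans, as the slide notes. Sizes, token ids, and weights here are illustrative:

```python
import numpy as np

# Minimal recurrent step over a sequence of very sparse (one-hot) inputs.
rng = np.random.default_rng(1)
n_in, n_hid = 16, 8                      # sparse input dim, hidden dim
Wx = rng.normal(scale=0.5, size=(n_in, n_hid))
Wh = rng.normal(scale=0.5, size=(n_hid, n_hid))

def rnn(sequence):
    """Fold a sequence of token ids into a single hidden state."""
    h = np.zeros(n_hid)
    for token_id in sequence:
        x = np.zeros(n_in)
        x[token_id] = 1.0                # large, sparse input, as on slide 38
        h = np.tanh(x @ Wx + h @ Wh)     # new state mixes input with memory
    return h

# Same last two tokens, different first token: the final states differ,
# because upstream context is retained rather than discarded.
h1 = rnn([3, 7, 2])
h2 = rnn([5, 7, 2])
```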
  40. Human Capacity Cognitive Computing Platform • Handle context-feature input from multiple modalities and project it into a single representation space • Support architectures with specialized assemblies permitting recursion / embedding • “Discrete Infinity” • “Fluid” interpretation and understanding
  41. Loop Cognitive Computing Platform • GPU-based appliance • Human Capacity understanding • Learns from unstructured and structured data • Produces a structured representation • Understands concepts in the context of their domain
  42. Your use of the word “teapot” does not match any of my dictionary entries.
