David Israel (SRI Intl) on "Natural Language Processing"

David Israel (SRI Intl) on "Natural Language Processing" at a LASER event: http://www.scaruffi.com/leonardo/aug2013.html

  1. David Israel, Artificial Intelligence Center, SRI International. August 8, 2013
  2. • AI as Cognitive Science by Other Means: how do people do what they do?
     • AI as focused on emulating intelligence artificially… by whatever means necessary
     • AI as a design and engineering discipline, not an empirical science
     • AI as Applied Logic (in weird disguise?). Central focus: Representation & Reasoning
     • AI as Applied Probability Theory/Statistics. Central focus: (Machine) Learning from data
  3. • A distinctively human (?) cognitive achievement; in any case, a central human cognitive achievement
     • We (not I!) know something about how we do it – about the underlying processes and mechanisms of language use
     • And we know something about how we (our infant selves) come to be able to do it – how we learn our first language(s)
     • BUT I DON’T CARE!!* And, more important, neither does DARPA
       * Beyond finding “inspiration” in theories of actual cognitive mechanisms/processes
  4. • Goal: to make the knowledge expressed in (English) texts accessible to formal (artificial) reasoning systems (see the sketch after this slide)
     • Translation(?): to make the (information) content expressed, e.g., in news stories available as input to “downstream” AI systems
     • For, e.g., intelligence analysts trying to put together an analytic picture of what was going on in some region during some time period
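
One way to picture the goal on this slide: the output of a machine-reading system as formal assertions, with provenance, that a downstream reasoner or analyst tool could consume. The sketch below is purely illustrative; the `Assertion` class, the predicate names, and the example sentence are invented here, not taken from the talk or from any SRI system.

```python
# Hypothetical sketch: text content re-expressed as formal assertions
# that a downstream reasoning system could consume. Everything here
# (class, predicates, sentence) is invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class Assertion:
    predicate: str
    args: tuple
    source: str  # provenance, so an analyst can trace a claim back to its text

sentence = "Rebel forces seized the town of X on Tuesday."
# What an idealized reading system might emit for that sentence:
facts = [
    Assertion("seize", ("rebel_forces", "town_X"), sentence),
    Assertion("occur_on", ("seize_event_1", "tuesday"), sentence),
]
for f in facts:
    print(f"{f.predicate}({', '.join(f.args)})")
```
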
  5. • Applied Linguistics & Logic vs. (versus???) Machine Learning: Applied Probability Theory and Statistics
     • What does this really come to, in our case (Machine Reading)?
  6. • Hand-built grammars: sets of rules governing the ways in which sentences could be constructed out of sub-sentential elements (ultimately, of words/morphemes); often quite directly inspired by work in linguistics (see the sketch after this slide)
     • Rules linking syntactic elements and structures with structures of symbols from formal languages; often directly inspired by the languages developed and studied by logicians, typically for representing mathematical structures
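
A minimal sketch of the hand-built-grammar approach this slide describes, using NLTK’s toy context-free-grammar machinery (assuming `nltk` is installed; the grammar and example sentence are mine, not the speaker’s):

```python
# A hand-built grammar in miniature: explicit rules for constructing
# sentences out of sub-sentential elements, plus a chart parser that
# applies them. Grammar and sentence are illustrative toys.
import nltk

grammar = nltk.CFG.fromstring("""
    S  -> NP VP
    NP -> Det N
    VP -> V NP
    Det -> 'the' | 'a'
    N  -> 'analyst' | 'report'
    V  -> 'reads' | 'writes'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("the analyst reads a report".split()):
    print(tree)
# (S (NP (Det the) (N analyst)) (VP (V reads) (NP (Det a) (N report))))
```

The slide’s second bullet would, in the same spirit, attach a logical form to each grammar rule, so that the parse above compiles to something like reads(the_analyst, a_report) in a logician’s formal language.
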
  7. • Availability of large annotated data sets and of huge quantities of “raw” (unlabeled) text data
     • Growth of the practice of community-wide open evaluations, and of a metrics-focused research community
     • Moore’s Law; huge advances in processing speeds, memory capacity, etc., etc.
     • Resulted in a move toward a-theoretical, statistically trained, ML-induced NLP modules (e.g., POS taggers, named-entity extractors, semantic-role labelers, parsers); a sketch follows this slide
     • Until recently: sentence-/clause-level semantics was ignored
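
The statistically trained modules this slide lists are exactly what modern off-the-shelf NLP libraries now package up. As one concrete (anachronistic but illustrative) example, spaCy’s pretrained English pipeline bundles a POS tagger, a dependency parser, and a named-entity extractor:

```python
# Statistically trained pipeline modules (POS tagger, parser, NER) via spaCy.
# Assumes: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("SRI International demonstrated a machine-reading system in August 2013.")

for token in doc:
    print(token.text, token.pos_, token.dep_)  # POS tag + dependency label per token

for ent in doc.ents:
    print(ent.text, ent.label_)  # e.g. "SRI International" ORG, "August 2013" DATE
```
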
  8. A New Synthesis:
     • Probabilistic representation of non-linguistic information + state-of-the-art, statistically based, ML-induced NL-processing modules
     • Analogous developments in Computer Vision
     • How to operationalize that “+”?? (One hypothetical sketch follows this slide.)
     • Many different possibilities to be explored
     • So little time … and nowhere near enough $$
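
The slide leaves operationalizing the “+” open. One of the many possibilities, sketched hypothetically below, is to treat each extractor output as noisy evidence and fold it into a prior drawn from the probabilistic representation of non-linguistic background knowledge via Bayes’ rule. All numbers and names are invented for illustration:

```python
# Hypothetical operationalization of the "+": combine a background
# (non-linguistic) prior with a noisy text extraction via Bayes' rule.
# The probabilities below are invented for illustration.

def posterior(prior: float, p_pos_if_true: float, p_pos_if_false: float) -> float:
    """Belief that an event occurred, given that the extractor asserted it."""
    num = p_pos_if_true * prior
    return num / (num + p_pos_if_false * (1.0 - prior))

prior = 0.10  # background belief that the event occurred in the region
# The extractor fires on a news story; suppose it is right 85% of the
# time on true events and fires spuriously 5% of the time:
belief = posterior(prior, p_pos_if_true=0.85, p_pos_if_false=0.05)
print(f"belief after one extraction: {belief:.2f}")  # ~0.65
```
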
