Why Watson Won: A cognitive perspective

In this talk, we present how the Watson program, IBM's famous Jeopardy!-playing computer, works (based on papers published by IBM); we look at some aspects of potential scoring approaches; and we examine how Watson compares to several well-known systems, with some preliminary thoughts on using it in future artificial intelligence and cognitive science approaches.

  1. Why Watson Won: A cognitive perspective
     Jim Hendler and Simon Ellis
     Tetherless World Constellation, Rensselaer Polytechnic Institute (RPI)
     Tetherless World Professor of Computer, Web and Cognitive Sciences; Director, Rensselaer Institute for Data Exploration and Applications
     http://www.cs.rpi.edu/~hendler
     @jahendler (twitter)
  2. IBM Watson
  3. How’d I get into it? Watson and Semantic Web (figure: IBM)
  4. Watson and Semantic Web (figure: IBM)
  5. Is Watson cognitive?
     "The computer's techniques for unraveling Jeopardy! clues sounded just like mine. That machine zeroes in on key words in a clue, then combs its memory (in Watson's case, a 15-terabyte data bank of human knowledge) for clusters of associations with those words. It rigorously checks the top hits against all the contextual information it can muster: the category name; the kind of answer being sought; the time, place, and gender hinted at in the clue; and so on. And when it feels 'sure' enough, it decides to buzz. This is all an instant, intuitive process for a human Jeopardy! player, but I felt convinced that under the hood my brain was doing more or less the same thing." (Ken Jennings)
  6. Outline
     • Is Ken right?
       – How Watson Works
       – Watson as a cognitive architecture??
       – Beyond Watson
  7. Inside Watson
     [Watson pipeline diagram as published by IBM; see IBM J. Res. & Dev. 56(3/4), May/July 2012]
  8. Question Analysis [figure]
  9. Question analysis
     • What is the question asking for?
     • Which terms in the question refer to the answer?
     • Given any natural language question, how can Watson accurately discover this information?
     • Example: "Who is the president of Rensselaer Polytechnic Institute?"
       – Focus terms: "Who", "president of Rensselaer Polytechnic Institute"
       – Answer types: Person, President
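
To make this concrete, here is a minimal, toy sketch of question analysis in Python. The regular expression, the crude head-noun heuristic, and the type labels are simplifying assumptions made for illustration; they are not IBM's analyzers.

```python
# Toy question analysis: guess the focus, the terms, and likely answer types.
# The rules below are illustrative assumptions, not IBM's actual components.
import re

def analyze_question(question: str) -> dict:
    """Return a focus word, focus terms, and guessed answer types."""
    analysis = {"focus": None, "terms": [], "answer_types": []}
    text = question.strip().rstrip("?")

    # Very rough focus detection from the leading wh-word.
    match = re.match(r"(?i)^(who|what|where|when|which|why|how)\b\s*(is|was|are|were)?\s*(.*)$", text)
    if not match:
        return analysis
    wh, _, rest = match.groups()
    analysis["focus"] = wh.capitalize()
    analysis["terms"] = [wh.capitalize()] + ([rest] if rest else [])

    # Crude lexical answer types from the wh-word and the head noun.
    wh_types = {"who": "Person", "where": "Location", "when": "Date"}
    if wh.lower() in wh_types:
        analysis["answer_types"].append(wh_types[wh.lower()])
    head = rest.split(" of ")[0].split()[-1] if rest else ""
    if head:
        analysis["answer_types"].append(head.capitalize())
    return analysis

print(analyze_question("Who is the president of Rensselaer Polytechnic Institute?"))
# -> {'focus': 'Who',
#     'terms': ['Who', 'the president of Rensselaer Polytechnic Institute'],
#     'answer_types': ['Person', 'President']}
```
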
  10. Parsing and semantic analysis
      • What information about a previously unseen piece of English text can Watson determine?
      • How is this information useful?
      • Natural Language Parsing: grammatical structure, parts of speech, relationships between words, ...etc.
      • Semantic Analysis: meanings of words, phrases, etc.; synonyms, entailment; hypernyms, hyponyms; ...etc.
  11. Question analysis pipeline [diagram]
      Unstructured question text -> Parsing & Semantic Analysis -> Structured annotations of the question: focus, answer types, useful search queries (with machine learning classifiers)
  12. Search Result Processing and Candidate Generation [figure]
  13. Primary Search
      • Primary Search is used to generate the corpus of information from which to take candidate answers, passages, supporting evidence, and essentially all textual input to the system
      • It formulates queries based on the results of Question Analysis
      • These queries are passed into a (cached) search engine which returns a set number of highly relevant documents and their ranks
      • On the open Web this could be a regular search engine (our ...
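
As a rough illustration of this step, the sketch below formulates keyword queries from the toy question analysis above and pools the ranked documents returned by a search backend. `cached_search` is a hypothetical stand-in for whatever (cached) search engine is available; its signature and the document fields are assumptions, not a real API.

```python
# Sketch of a "primary search" step over some pluggable search backend.
from typing import Callable, Dict, List

def formulate_queries(analysis: Dict) -> List[str]:
    """Turn structured question annotations into a few keyword queries."""
    terms = [t for t in analysis.get("terms", [])
             if t.lower() not in {"who", "what", "where", "when", "which", "why", "how"}]
    queries = [" ".join(terms)] if terms else []
    # Also pair the terms with each expected answer type, as a broader net.
    for answer_type in analysis.get("answer_types", []):
        queries.append(" ".join(terms + [answer_type]))
    return queries

def primary_search(analysis: Dict,
                   cached_search: Callable[[str, int], List[Dict]],
                   hits_per_query: int = 10) -> List[Dict]:
    """Run every formulated query and pool the ranked documents.

    cached_search(query, k) is assumed to return k documents as dicts with at
    least "title", "text", and "rank" keys (a hypothetical interface).
    """
    documents: List[Dict] = []
    for query in formulate_queries(analysis):
        documents.extend(cached_search(query, hits_per_query))
    return documents
```
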
  14. Candidate Generation
      • Candidate Generation casts a wide net of possible answers for the question from each document.
      • Using each document, and the passages created by Search Result Processing, we generate candidates using three techniques:
        – Title of Document (T.O.D.): adds the title of the document as a candidate.
        – Wikipedia Title Candidate Generation: adds any noun phrases within the document's passage texts that are also the titles of Wikipedia articles.
        – Anchor Text Candidate Generation: adds candidates based on the hyperlinks and metadata within the document.
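
The sketch below runs the three generators named above over a toy document representation. The document fields ("title", "passages", "anchors"), the short-word-run check standing in for noun-phrase chunking, and the tiny set of Wikipedia titles are all illustrative assumptions.

```python
# Toy candidate generation using the three techniques from the slide above.
from typing import Dict, List, Set

def generate_candidates(doc: Dict, wikipedia_titles: Set[str]) -> List[str]:
    candidates: List[str] = []

    # 1. Title of Document (T.O.D.): the document title itself is a candidate.
    candidates.append(doc["title"])

    # 2. Wikipedia Title Candidate Generation: any phrase in the passages that
    #    is also a Wikipedia article title. Real systems use noun-phrase
    #    chunking; here we simply test every run of up to five words.
    for passage in doc["passages"]:
        words = [w.strip(",.?!\"") for w in passage.split()]
        for i in range(len(words)):
            for j in range(i + 1, min(i + 5, len(words)) + 1):
                phrase = " ".join(words[i:j])
                if phrase in wikipedia_titles:
                    candidates.append(phrase)

    # 3. Anchor Text Candidate Generation: link text / metadata in the document.
    candidates.extend(doc.get("anchors", []))

    # Deduplicate while preserving order.
    seen: Set[str] = set()
    return [c for c in candidates if not (c in seen or seen.add(c))]

doc = {
    "title": "Rensselaer Polytechnic Institute",
    "passages": ["Shirley Ann Jackson is the president of Rensselaer Polytechnic Institute."],
    "anchors": ["Shirley Ann Jackson", "Troy, New York"],
}
print(generate_candidates(doc, {"Shirley Ann Jackson", "Rensselaer Polytechnic Institute"}))
# -> ['Rensselaer Polytechnic Institute', 'Shirley Ann Jackson', 'Troy, New York']
```
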
  15. Search Result Processing and Candidate Generation [figure]
  16. Scoring & Ranking [figure]
  17. Scoring
      • Analyzes how well a candidate answer relates to the question
      • Two basic types of scoring algorithm:
        – Context-independent scoring
        – Context-dependent scoring
  18. Types of scorers
      • Context-independent
        – Question analysis
        – Ontologies (DBpedia, YAGO, etc.)
        – Type hierarchy reasoning
      • Context-dependent
        – Analyzes features of the natural language environment where candidates were found
        – Relies on "passages" found during search
        – Many special-purpose ones used in Jeopardy
  19. Scorers
      • Passage Term Match
      • Textual Alignment
      • Skip-Bigram
      • Each of these scores supportive evidence
      • These scores are then merged to produce a single candidate score
  20. Example: Textual Alignment
      • Finds an optimal alignment of a question and a passage
      • Assigns "partial credit" for close matches
      • Example:
        – Question: "Who is the President of RPI?"  (terms to align: Who ... President of RPI)
        – Passage: "Shirley Ann Jackson is the President of RPI."
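
To show the "partial credit" idea, here is a toy alignment-style scorer: it rewards question terms that show up in the passage and adds a small bonus when the matches appear in the same order. The stopword list and the bonus weight are invented for illustration; this is not IBM's alignment algorithm.

```python
# Toy textual-alignment scorer: partial credit for matched question terms,
# with a small bonus if the matches keep the question's term order.
STOPWORDS = {"is", "was", "the", "a", "an", "of", "who", "what", "where", "when"}

def alignment_score(question: str, passage: str) -> float:
    q_terms = [w.strip("?.,!").lower() for w in question.split()]
    q_terms = [w for w in q_terms if w and w not in STOPWORDS]
    p_terms = [w.strip("?.,!").lower() for w in passage.split()]
    if not q_terms:
        return 0.0

    positions = []
    for term in q_terms:
        if term in p_terms:
            positions.append(p_terms.index(term))

    coverage = len(positions) / len(q_terms)          # partial credit
    in_order = len(positions) > 1 and positions == sorted(positions)
    return coverage + (0.25 if in_order else 0.0)     # order bonus

question = "Who is the President of RPI?"
print(alignment_score(question, "Shirley Ann Jackson is the President of RPI."))  # 1.25
print(alignment_score(question, "RPI is a university in Troy, New York."))        # 0.5
```
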
  21. Skip-Bigram
      • Constructs a graph
        – Nodes represent terms (syntactic objects)
        – Edges represent relations
      • Extracts skip-bigrams
        – A skip-bigram is a pair of nodes either directly connected or which have only one intermediate node
        – Skip-bigrams represent close relationships between terms
      • Scores based on number of common skip-bigrams
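
A small sketch of the skip-bigram idea follows: build a term graph from relation edges, collect every pair of nodes that is directly connected or one node apart, and score a passage by how many of the question's skip-bigrams it shares. The example edges stand in for a dependency parse (with the question focus already replaced by the candidate answer and terms lemmatized); they are hand-made assumptions, not parser output.

```python
# Skip-bigram scoring sketch: pairs of graph nodes at distance one or two.
from itertools import combinations

def skip_bigrams(edges):
    """edges: iterable of (term, term) relations. Returns node pairs that are
    directly connected or connected through a single intermediate node."""
    neighbors = {}
    for a, b in edges:
        neighbors.setdefault(a, set()).add(b)
        neighbors.setdefault(b, set()).add(a)
    pairs = set()
    for a, b in combinations(neighbors, 2):
        if b in neighbors[a] or (neighbors[a] & neighbors[b]):
            pairs.add(frozenset((a, b)))
    return pairs

def skip_bigram_score(question_edges, passage_edges) -> float:
    q, p = skip_bigrams(question_edges), skip_bigrams(passage_edges)
    return len(q & p) / len(q) if q else 0.0

# 'Who authored "The Good Earth"?' with the focus replaced by the candidate
# "Pearl Buck", versus "Pearl Buck, author of The Good Earth ..."
question_edges = [("pearl buck", "author"), ("author", "good earth")]
passage_edges = [("pearl buck", "author"), ("author", "good earth"), ("good earth", "novel")]
print(skip_bigram_score(question_edges, passage_edges))  # 1.0
```
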
  22. Example
      • Question: Who authored "The Good Earth"?
      • Passage: "Pearl Buck, author of the good earth…"
  23. Watson Summary
      • Watson works by
        – Analyzing the question
          • natural language parsing
          • text extraction
        – Generating a large number of candidates
          • mostly search heuristics
        – Scoring each
          • through multiple scorers
          • with weights adjusted by learning algorithm
        – Returning top candidate
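
Putting the pieces together, the feed-forward shape summarized above can be written as one pass: analyze the question, search, generate candidates from each document, score every candidate with several weighted scorers, and return the top one. The callable signatures and the fixed weights below are assumptions for illustration; in Watson the scorer weights are tuned by a learning algorithm.

```python
# One-pass sketch of a DeepQA-style feed-forward pipeline.
from typing import Callable, Dict, List, Tuple

Scorer = Callable[[str, str, Dict], float]   # (candidate, passage text, analysis) -> score

def answer(question: str,
           analyze: Callable[[str], Dict],
           search: Callable[[Dict], List[Dict]],
           generate: Callable[[Dict], List[str]],
           weighted_scorers: List[Tuple[Scorer, float]]) -> str:
    """Return the highest-scoring candidate answer for a question."""
    analysis = analyze(question)
    best: Dict[str, float] = {}
    for doc in search(analysis):
        for candidate in generate(doc):
            # Merge the scorers' evidence with fixed weights (learned in Watson).
            score = sum(w * s(candidate, doc["text"], analysis)
                        for s, w in weighted_scorers)
            best[candidate] = max(best.get(candidate, float("-inf")), score)
    return max(best, key=best.get) if best else ""
```
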
  24. MiniDeepQA (Not Watson!)
      • RPI students implementing a DeepQA pipeline to explore the principles underlying this kind of Q/A system (THIS IS NOT WATSON!)
        – Pipeline development
        – Data caching
        – Graphical and command line interfaces
        – Parsing
        – Scoring
  25. Examples [screenshot]: Right answer
  26. Examples [screenshot]: Right answer?
  27. Examples [screenshot]: Right answer
  28. Examples [screenshot]: Right answer??
  29. Examples [screenshot]: had to get this one right!
  30. Scoring
      • One of DeepQA's main strengths is aggregating a number of different scoring algorithms capable of running in parallel.
      • RPI scorers are primitive compared to IBM's, but
        – allow us to explore the principles
        – allow us to explore different algorithms for computing scores
        – allow us to create new ones not tried by IBM
  31. Scoring Principles: combine evidence
      • Clue: He was the Prime Minister of Canada in 1993.
        – candidates could include Trudeau, Harper, Campbell, Chretien, Mulroney…
      • Try (re-search):
        – Trudeau was Prime Minister of Canada in 1993 (doesn't match)
        – Campbell was Prime Minister of Canada in 1993 (MATCH)
        – Chretien was Prime Minister of Canada in 1993 (MATCH)
      • Scoring: re-search & type match
        – Trudeau: re-search NO; type: Yes
        – Campbell: re-search YES; type: No
        – Chretien: re-search YES; type: Yes -> WHO WAS CHRETIEN?
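
A toy numeric version of this evidence-combination step, mirroring the Trudeau/Campbell/Chretien example above: each candidate gets a re-search score and a type-match score ("He ..." implies a male person), and a weighted sum picks the winner. The 0/1 evidence values and the weights are invented for illustration.

```python
# Combine two independent pieces of evidence per candidate with fixed weights.
def combine_evidence(candidates, research_match, type_match,
                     w_research=2.0, w_type=1.0):
    scores = {
        name: w_research * research_match[name] + w_type * type_match[name]
        for name in candidates
    }
    return max(scores, key=scores.get), scores

candidates = ["Trudeau", "Campbell", "Chretien"]
research_match = {"Trudeau": 0.0, "Campbell": 1.0, "Chretien": 1.0}  # re-search support?
type_match = {"Trudeau": 1.0, "Campbell": 0.0, "Chretien": 1.0}      # matches "He ..."?
best, scores = combine_evidence(candidates, research_match, type_match)
print(best, scores)
# -> Chretien {'Trudeau': 1.0, 'Campbell': 2.0, 'Chretien': 3.0}
```
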
  32. New Scoring types
      • We can explore how new kinds of information can be added to the Watson scoring pipeline
        – Example: new NLP extraction techniques
          • adding an ML-based extractor built by Heng Ji
        – Example: specialized Web sources
          • database advisor project
        – Example: more complex inferencing
          • Jeopardy questions are unambiguous; real-world questions aren't
            – Where is Montreal?
            – Who is Jim Hendler?
        – Example: special purpose reasoning…
  33. Special purpose reasoning
      • Can we match (or steer) large-scale simulations to help answer NL questions?
        – e.g. answer questions such as "Why" and "How" integrated with large-scale simulations
  34. Alternate Universe Reasoning (Contexts)
      • How can a Watson reasoner appropriately use Q/A contexts?
        – Where was Yoda born?
          • Very little is known about Yoda's early life. He was from a remote planet, but which one remains a mystery.
        – Where was Yoda made?
          • The Yoda puppet was originally designed and built by Stuart Freeborn for LucasFilm and Industrial Light & Magic.
        – Where did Yoda live?
          • Jedi Master Yoda went into voluntary exile on Dagobah
        – Where did Yoda live in The Phantom Menace?
  35. But back to the original question
      • Q: How does Watson fare as a cognitive model?
      • A: Poorly
        – no conversational ability
        – no concept of self
        – no deeper reasoning …
      • Q: How does Watson fare as a model of question answering?
  36. Watson and Q/A
      • Watson's feed-forward pipeline has the following properties
        – lots of candidates generated
          • the more the better
        – "ad hoc" filtering pipelines
          • domain-independent scorers usually score lower than domain-dependent ones
        – no "counter-reasoning" between answers
          • separately scored; the only comparison is the numbers
  37. Production rules, modules, etc.
      • Production-rule style architectures, cf. ACT-R (Anderson 1974; … 2012)
        – modularization, but not Watson style
        – parallelization, but in rule productions (procedural memory)
        – declarative memory is fact based
      • Watson is not well correlated, except for using search for declarative memory
  38. Network based
      • Network-based architectures (cf. spreading activation (Collins 75), marker-passing (Hendler 86), … Microsaint 2006)
        – positive activations
        – inhibitory nodes (or other negative enforcers)
      • Watson has no negative inhibition, but does use network-based scorers
  39. MAC/FAC
      • MAC/FAC (Gentner & Forbus, 1991)
        – "many are called, few are chosen" model of analogical reasoning
        – strong correspondence in performance, not in mechanism
        – new work by Forbus (SME) uses a more feed-forward mechanism (discussions in progress)
  40. Cognitive Architecture? Watson as "component"
      [diagram: Decision Making, Memory, Reasoning; Watson, Cogito, and Clarion; Office of Research]
  41. Summary
      • Watson won by a combination of
        – natural language processing
        – search technologies
        – semantic typing (minimal reasoning)
        – scoring heuristics
        – machine learning (scorer tuning)
      • Watson Q/A has some interesting analogies to cognitive architectures of the past
        – but mainly at a "level of abstraction"
      • Watson as a memory component in a more complex cognitive system is a very intriguing possibility
  42. Questions?