Dependency Parsing-based QA System using RDF and SPARQL
Fariz Darari (fadirra@gmail.com)
Speaker notes:
• (Motivation slides) Siri image: apple.com; Watson image: gizmodo.com. Example questions (Querying Contacts): What's Michael's address? What is Susan Park's phone number? When is my wife's birthday? Show Jennifer's home email address.
• (Motivation slides) A question-answering system, for example: "When is the celebration of the independence day of the USA?" → "4 July".
• (Dependency Parsing slide) The Stanford dependencies provide a representation of grammatical relations between words in a sentence. They have been designed to be easily understood and effectively used by people who want to extract textual relations. Stanford dependencies (SD) are triplets: name of the relation, governor, and dependent. The dependencies are produced using hand-written tregex patterns over phrase-structure trees, as described in: Marie-Catherine de Marneffe, Bill MacCartney and Christopher D. Manning. 2006. Generating Typed Dependency Parses from Phrase Structure Parses. In LREC 2006.
• (Background slide) SW: an extension of the Web with machine-interpretable content.
• (WordNet slide) Synsets: sets of cognitive synonyms, each expressing a distinct concept. Examples: synonymy: buy = purchase; antonymy: bad = good; hyponymy: hyponymy = relation; meronymy: tires = car; troponymy: run = move.

1. Dependency Parsing-based QA System using RDF and SPARQL
   Fariz Darari
   fadirra@gmail.com
2.–4. Motivation (image slides; see speaker notes)
5. BACKGROUND
6. QA in General
• Finding an answer to natural language questions based on documents/facts
• Instead of returning documents, give direct answers
• Factoid questions:
  – Who can dance Tango?
  – What did I eat this morning?
  – When was Mahatma Gandhi born?
7. Dependency Parsing (example parse shown as an image)
    8. 8. RDF & SPARQL• RDF Data: :book1 :title "SPARQL Tutorial" .• SPARQL Query: SELECT ?title WHERE { :book1 :title ?title . } 8
9. WordNet
• Large lexical database of English words
• Words are grouped into synsets
• Relations among synsets: synonymy, antonymy, hyponymy, meronymy, troponymy
10. DBpedia
• Knowledge base of Wikipedia in RDF
• Data at dbpedia.org/resource/Italy (shown as a screenshot)
11. DEVELOPMENT
12. [Architecture diagram]
    NL Text (Facts) → Dependency Parser → RDFizer → RDF
    NL Text (Questions) → Dependency Parser → SPARQLizer → SPARQL
    An OWL Ontology sits between the two pipelines; the SPARQL query is executed over the RDF data.
13. Facts Population
1. We parse the natural language facts using the Stanford dependency parser. The result will be typed dependencies.
2. The typed dependencies are then translated into RDF format using the RDFizer. The RDFizer is built in Java with Apache Jena as the Semantic Web library.
3. The resulting RDF is combined with an OWL ontology that contains some WordNet and DBpedia axioms to infer new facts.
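
   As an illustration of steps 1 and 2, here is a minimal sketch (an assumption, not the author's actual RDFizer) that parses a fact with the Stanford parser and turns each typed dependency reln(gov, dep) into a triple :gov :reln :dep under the http://example.org/sentence/ namespace used throughout the slides:

    import java.util.Collection;
    import java.util.List;
    import edu.stanford.nlp.ling.CoreLabel;
    import edu.stanford.nlp.ling.Sentence;
    import edu.stanford.nlp.parser.lexparser.LexicalizedParser;
    import edu.stanford.nlp.trees.*;
    import org.apache.jena.rdf.model.*;

    public class RdfizerSketch {
        static final String NS = "http://example.org/sentence/";

        public static void main(String[] args) {
            // Parse the fact into a phrase-structure tree, then extract
            // typed dependencies from it (cf. slide 22).
            LexicalizedParser parser = LexicalizedParser.loadModel(
                    "edu/stanford/nlp/models/lexparser/englishPCFG.ser.gz");
            List<CoreLabel> words =
                    Sentence.toCoreLabelList("Aliana", "bought", "a", "car");
            Tree tree = parser.apply(words);
            GrammaticalStructure gs = new PennTreebankLanguagePack()
                    .grammaticalStructureFactory()
                    .newGrammaticalStructure(tree);
            Collection<TypedDependency> deps = gs.typedDependenciesCCprocessed();

            // Turn each dependency reln(gov, dep) into the triple
            // :gov :reln :dep, mirroring the RDF shown on slide 23.
            Model model = ModelFactory.createDefaultModel();
            for (TypedDependency td : deps) {
                String gov = td.gov().value().toLowerCase();
                String dep = td.dep().value().toLowerCase();
                model.add(model.createResource(NS + gov),
                          model.createProperty(NS + td.reln().getShortName()),
                          model.createResource(NS + dep));
            }
            model.write(System.out, "N-TRIPLES");
        }
    }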
14. Query Execution
1. We parse the natural language questions using the Stanford dependency parser. The result will be typed dependencies.
2. We then translate the typed dependencies into SPARQL query form.
3. The SPARQL query is then executed over the RDF data populated during Facts Population.
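
   A sketch of step 2, inferred from slides 26–27 (the actual SPARQLizer is not shown): dependencies map to triple patterns exactly as in the RDFizer, except that a wh-word becomes the answer variable ?x. The helper names here are hypothetical:

    import java.util.Arrays;
    import java.util.List;

    public class SparqlizerSketch {
        static final String NS = "http://example.org/sentence/";

        // Hypothetical translation scheme: each dependency reln(gov, dep)
        // becomes the pattern :gov :reln :dep, and a wh-word ("who"/"what")
        // becomes the answer variable ?x.
        static String toSparql(List<String[]> deps) {
            StringBuilder where = new StringBuilder();
            for (String[] d : deps) {                 // d = {reln, gov, dep}
                where.append(term(d[1])).append(" :").append(d[0])
                     .append(" ").append(term(d[2])).append(" . ");
            }
            return "PREFIX : <" + NS + "> SELECT ?x WHERE { " + where + "}";
        }

        static String term(String word) {
            String w = word.toLowerCase();
            return (w.equals("who") || w.equals("what")) ? "?x" : ":" + w;
        }

        public static void main(String[] args) {
            // Typed dependencies of "Who purchased a vehicle?" (slide 26).
            List<String[]> deps = Arrays.asList(
                    new String[]{"nsubj", "purchased", "Who"},
                    new String[]{"root", "ROOT", "purchased"},
                    new String[]{"det", "vehicle", "a"},
                    new String[]{"dobj", "purchased", "vehicle"});
            // Prints the query shown on slide 27.
            System.out.println(toSparql(deps));
        }
    }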
15. Background Knowledge: WordNet
Synonymy:
  <http://example.org/sentence/buy>
    owl:sameAs
  <http://example.org/sentence/purchase> .
16. Background Knowledge: WordNet
Hyponymy:
• PREFIX : <http://example.org/sentence/>
  CONSTRUCT {:vehicle ?y ?z} WHERE {:car ?y ?z}
• PREFIX : <http://example.org/sentence/>
  CONSTRUCT {?x ?y :vehicle} WHERE {?x ?y :car}
17. Background Knowledge: WordNet
Troponymy:
• PREFIX : <http://example.org/sentence/>
  CONSTRUCT {:move ?y ?z} WHERE {:run ?y ?z}
• PREFIX : <http://example.org/sentence/>
  CONSTRUCT {?x ?y :move} WHERE {?x ?y :run}
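
   One way to make such CONSTRUCT rules effective (an assumption; the slides do not show how the system applies them) is to run them with Jena and add the constructed triples back into the data:

    import org.apache.jena.query.*;
    import org.apache.jena.rdf.model.*;

    public class WordNetRuleSketch {
        public static void main(String[] args) {
            String ns = "http://example.org/sentence/";
            Model model = ModelFactory.createDefaultModel();
            // One parsed fact: :bought :dobj :car (from "Aliana bought a car").
            model.add(model.createResource(ns + "bought"),
                      model.createProperty(ns + "dobj"),
                      model.createResource(ns + "car"));

            // The hyponymy rule from slide 16: whatever is said about :car
            // as an object also holds for its hypernym :vehicle.
            String rule = "PREFIX : <" + ns + "> "
                        + "CONSTRUCT {?x ?y :vehicle} WHERE {?x ?y :car}";
            try (QueryExecution qe = QueryExecutionFactory.create(rule, model)) {
                Model inferred = qe.execConstruct();
                model.add(inferred);   // materialize the inferred triples
            }
            // Now contains both :bought :dobj :car and :bought :dobj :vehicle.
            model.write(System.out, "N-TRIPLES");
        }
    }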
18. Background Knowledge: DBpedia
  <http://example.org/sentence/italy>
    owl:sameAs
  <http://dbpedia.org/resource/Italy> .
19. IMPLEMENTATION AND RESULT
20. Program
• Java-based
• Reuses Apache Jena: Semantic Web library
• Reuses the Stanford Parser: typed dependencies
• MorphAdorner: lemmatization and verb conjugation
21. A Detailed Example (1)
• Fact: Aliana bought a car
• Question: Who purchased a vehicle?
22. A Detailed Example (2)
• Typed dependencies:
  [nsubj(bought-2, Aliana-1), root(ROOT-0, bought-2), det(car-4, a-3), dobj(bought-2, car-4)]
23. A Detailed Example (3)
RDF:
  <http://example.org/sentence/car>
    <http://example.org/sentence/det> <http://example.org/sentence/a> .
  <http://example.org/sentence/root>
    <http://example.org/sentence/root> <http://example.org/sentence/bought> .
  <http://example.org/sentence/bought>
    <http://example.org/sentence/dobj> <http://example.org/sentence/car> ;
    <http://example.org/sentence/nsubj> <http://example.org/sentence/aliana> .
24. A Detailed Example (4)
We have the following triple in the knowledge base already:
  :bought owl:sameAs :purchased .
and also the following rules:
  PREFIX : <http://example.org/sentence/>
  CONSTRUCT {:vehicle ?y ?z} WHERE {:car ?y ?z}
  PREFIX : <http://example.org/sentence/>
  CONSTRUCT {?x ?y :vehicle} WHERE {?x ?y :car}
25. A Detailed Example (5)
Inferred facts:
  <http://example.org/sentence/purchased>
    <http://example.org/sentence/dobj> <http://example.org/sentence/vehicle> ;
    <http://example.org/sentence/nsubj> <http://example.org/sentence/aliana> .
26. A Detailed Example (6)
Typed dependencies of the question:
  [nsubj(purchased-2, Who-1), root(ROOT-0, purchased-2), det(vehicle-4, a-3), dobj(purchased-2, vehicle-4)]
27. A Detailed Example (7)
SPARQL form of the question:
  SELECT ?x WHERE {
    :vehicle :det :a .
    :purchased :nsubj ?x .
    :purchased :dobj :vehicle .
    :root :root :purchased }
Answer: "aliana"
28. DBpedia Integration
• By adding some background knowledge from DBpedia, one can ask more questions.
• Example of Italy data:
  :italy owl:sameAs dbpedia:Italy .
  dbpedia:Italy dbpprop:capital "Rome" .
  dbpedia:Enzo_Ferrari dbpedia-owl:nationality dbpedia:Italy ;
    dbpprop:deathPlace dbpedia:Maranello .
  dbpedia:Enzo_Ferrari dbpedia-owl:child dbpedia:Piero_Ferrari , dbpedia:Alfredo_Ferrari .
29. Example Case
• Fact = "Fariz loves Italy."
• Question = "Does Fariz love a country whose capital is Rome, which was the nationality of a person who passed away in Maranello and whose sons are Piero Ferrari and Alfredo Ferrari?"
• Thus the answer will be YES, even though we only have the single fact "Fariz loves Italy."
30. Example Case (cont.)
• Note that in the previous example the fact is translated automatically by the system, but the question is translated manually into the following SPARQL query:
  ASK WHERE {
    :love :nsubj :fariz .
    :root :root :love .
    :love :dobj ?x .
    ?x dbpprop:capital "Rome" .
    ?y dbpedia-owl:nationality ?x ;
       dbpprop:deathPlace dbpedia:Maranello .
    ?y dbpedia-owl:child dbpedia:Piero_Ferrari , dbpedia:Alfredo_Ferrari }
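
   Executing an ASK query with Jena returns the YES/NO answer directly via execAsk(). A self-contained sketch (not from the slides), with the owl:sameAs link between :italy and dbpedia:Italy pre-resolved and only the capital triple from slide 28 included for brevity:

    import org.apache.jena.query.*;
    import org.apache.jena.rdf.model.*;

    public class AskDemo {
        public static void main(String[] args) {
            String s = "http://example.org/sentence/";
            String dbp = "http://dbpedia.org/resource/";
            String prop = "http://dbpedia.org/property/";

            Model model = ModelFactory.createDefaultModel();
            // Parsed fact "Fariz loves Italy"; the owl:sameAs hop from
            // :italy to dbpedia:Italy is pre-resolved here for brevity.
            model.add(model.createResource(s + "love"),
                      model.createProperty(s + "nsubj"),
                      model.createResource(s + "fariz"));
            model.add(model.createResource(s + "love"),
                      model.createProperty(s + "dobj"),
                      model.createResource(dbp + "Italy"));
            // One DBpedia triple from slide 28.
            model.add(model.createResource(dbp + "Italy"),
                      model.createProperty(prop + "capital"), "Rome");

            String ask = "PREFIX : <" + s + "> "
                       + "PREFIX dbpprop: <" + prop + "> "
                       + "ASK WHERE { :love :nsubj :fariz . :love :dobj ?x . "
                       + "?x dbpprop:capital \"Rome\" }";
            try (QueryExecution qe = QueryExecutionFactory.create(ask, model)) {
                System.out.println(qe.execAsk() ? "YES" : "NO");  // YES
            }
        }
    }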
31. How to handle negation? (1)
• Fact: I did not buy it.
• RDF:
  <http://example.org/sentence/root>
    <http://example.org/sentence/root> <http://example.org/sentence/bought> .
  <http://example.org/sentence/bought>
    <http://example.org/sentence/dobj> <http://example.org/sentence/it> ;
    <http://example.org/sentence/neg> <http://example.org/sentence/not> ;
    <http://example.org/sentence/nsubj> <http://example.org/sentence/i> .
32. How to handle negation? (2)
• Question: Who bought it?
• SPARQL:
  SELECT ?x WHERE {
    :bought :nsubj ?x .
    :bought :dobj :it .
    :root :root :bought .
    FILTER NOT EXISTS { [] :neg ?z . } }
33. How to handle negation? (3)
• Question: Who did not buy it? Answer: I.
• Query:
  SELECT ?x WHERE {
    :bought :dobj :it .
    :bought :neg :not .
    :bought :nsubj ?x .
    :root :root :bought }
34. How to handle tenses? (1)
• Fact (I will buy it):
  <http://example.org/sentence/buy>
    <http://example.org/sentence/aux> <http://example.org/sentence/will> ;
    <http://example.org/sentence/dobj> <http://example.org/sentence/it> ;
    <http://example.org/sentence/nsubj> <http://example.org/sentence/i> .
35. How to handle tenses? (2)
• Question: Who buys it?
• SELECT ?x WHERE {
    :root :root :buys .
    :buys :nsubj ?x .
    :buys :dobj :it .
    FILTER NOT EXISTS { [] :aux :will . } }
36. How to handle passive sentences?
Fact: Juliet was killed by Romeo.
  <http://example.org/sentence/root>
    <http://example.org/sentence/root> <http://example.org/sentence/killed> .
  <http://example.org/sentence/killed>
    <http://example.org/sentence/agent> <http://example.org/sentence/romeo> ;
    <http://example.org/sentence/nsubjpass> <http://example.org/sentence/juliet> .
37. How to handle passive sentences?
• Ontology:
  :nsubjpass owl:equivalentProperty :dobj .
  :agent owl:equivalentProperty :nsubj .
38. How to handle passive sentences?
• Question: Who killed Juliet?
  SELECT ?x WHERE {
    :killed :nsubj ?x .
    :killed :dobj :juliet .
    :root :root :killed .
    FILTER NOT EXISTS { [] :neg ?z . } }
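
   One way to let the owl:equivalentProperty axioms from slide 37 take effect (an assumption; the slides do not say which inference mechanism the system uses) is Jena's built-in OWL reasoner, which then answers the active-voice patterns over the passive-voice data:

    import org.apache.jena.rdf.model.*;
    import org.apache.jena.reasoner.ReasonerRegistry;
    import org.apache.jena.vocabulary.OWL;

    public class PassiveSketch {
        public static void main(String[] args) {
            String ns = "http://example.org/sentence/";
            Model data = ModelFactory.createDefaultModel();

            // Parsed fact "Juliet was killed by Romeo" (slide 36).
            data.add(data.createResource(ns + "killed"),
                     data.createProperty(ns + "nsubjpass"),
                     data.createResource(ns + "juliet"));
            data.add(data.createResource(ns + "killed"),
                     data.createProperty(ns + "agent"),
                     data.createResource(ns + "romeo"));

            // Ontology axioms from slide 37.
            data.add(data.createProperty(ns + "nsubjpass"),
                     OWL.equivalentProperty,
                     data.createProperty(ns + "dobj"));
            data.add(data.createProperty(ns + "agent"),
                     OWL.equivalentProperty,
                     data.createProperty(ns + "nsubj"));

            // The reasoner propagates the equivalences, so the active-voice
            // pattern :killed :nsubj :romeo now holds.
            InfModel inf = ModelFactory.createInfModel(
                    ReasonerRegistry.getOWLMicroReasoner(), data);
            System.out.println(inf.contains(
                    inf.createResource(ns + "killed"),
                    inf.createProperty(ns + "nsubj"),
                    inf.createResource(ns + "romeo")));   // true
        }
    }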
39. DEMO - A Story about Antonio
Antonio is a famous and cool doctor. Antonio has been working for 10 years. Antonio is in Italy. Antonio can dance Salsa well. Antonio loves Maria and Karina. Antonio is also loved by Karina. Antonio never cooks. But Maria always cooks. Antonio just bought a car. Antonio must fly to Indonesia tomorrow.
40. Conclusions
• Dependency parsing-based QA system with RDF and SPARQL
• The system is also aware of negations, tenses and passive sentences
• Improvements: a more advanced parsing method, a more efficient inference system, richer background knowledge
41. APPENDIX: EXAMPLES
42. Working Examples
• "The Japanese girl sang the song beautifully."
• "Who sang the song beautifully?"
• "Who sang the song?"
• "Who sang?"
43. Working Examples
• "The Beatles do sing the song perfectly."
• "How do The Beatles sing the song?"
44. Working Examples
• "They should sing the song well."
• "How should they sing the song?"
45. Working Examples
• "I did buy the book, the pencil and the ruler yesterday."
• "What did I buy?"
46. Working Examples
• "Microsoft is located in Redmond."
• "What is located in Redmond?"
47. Working Examples
• "The man was killed by the police."
• "Who killed the man?"
48. Working Examples
• "Sam is genius, honest and big."
• "Who is big?"
49. Working Examples
• "John is rude."
• "Is John good?"
• "Is John rude?"
50. Working Examples
• "John, that is the founder of Microsoft and the initiator of Greenpeace movement, is genius, honest and cool."
• "Who is honest?"
51. Working Examples
• "Farid wants to go to Rome."
• "Who wants to go to Rome?"
• "Who wants to go?"
• "Who wants?"
52. Working Examples
• "Jorge ate 10 delicious apples."
• "Who ate 10 delicious apples?"
• "Who ate 10 apples?"
• "Who ate apples?"
53. Working Examples
• "John is a doctor."
• "Is John a doctor?"
• "Is John a teacher?"
54. Working Examples
• "John is a good doctor."
• "Who is John?"
• "What is John?"
55. Working Examples
• "John is in Alaska."
• "John is at home."
• "John is on the street."
• "Where is John?"
56. Working Examples
• "Apples are good for health."
• "What are good for health?"
