Dependency Parsing-based QA System
using RDF and SPARQL

Fariz Darari
fadirra@gmail.com
Motivation

[Motivating figures: existing QA systems such as Apple Siri and IBM Watson, with example questions they can answer.]
BACKGROUND


QA in General
• Finding answers to natural language questions based on documents/facts
• Returns answers directly, instead of returning documents
• Factoid questions:
  – Who can dance Tango?
  – What did I eat this morning?
  – When was Mahatma Gandhi born?


Dependency Parsing

[Figure: example of a Stanford typed-dependency parse; each dependency is a triplet of relation name, governor, and dependent.]
RDF & SPARQL
• RDF Data:
  :book1 :title "SPARQL Tutorial" .

• SPARQL Query:
  SELECT ?title
  WHERE { :book1 :title ?title . }



WordNet
• Large lexical database of English words
• Words are grouped into synsets
• Relations among synsets: Synonymy,
  antonymy, hyponymy, meronymy, troponymy




DBpedia
• Knowledge base of Wikipedia in RDF
• Data at dbpedia.org/resource/Italy:

[Figure: excerpt of the RDF data available at dbpedia.org/resource/Italy.]
DEVELOPMENT


[Architecture diagram: NL text (facts) → Dependency Parser → RDFizer → RDF;
 NL text (questions) → Dependency Parser → SPARQLizer → SPARQL query;
 the SPARQL query is evaluated over the RDF data, which is enriched with an OWL ontology.]
Facts Population
1. We parse the natural language facts with the Stanford
   dependency parser; the result is a set of typed dependencies.
2. The typed dependencies are then translated into RDF by the
   RDFizer, a component built in Java with Apache Jena as the
   Semantic Web library (see the sketch below).
3. The resulting RDF is combined with an OWL ontology containing
   WordNet and DBpedia axioms in order to infer new facts.
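A minimal sketch of step 2 (an assumption for illustration, not the author's
actual RDFizer code): each typed dependency (relation, governor, dependent)
becomes one RDF triple in the http://example.org/sentence/ namespace, built
with Apache Jena. The class and method names below are hypothetical.

import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;

public class RDFizerSketch {
    static final String NS = "http://example.org/sentence/";

    // Each entry of typedDeps is {relation, governor, dependent},
    // already lemmatized and lowercased.
    public static Model rdfize(String[][] typedDeps) {
        Model model = ModelFactory.createDefaultModel();
        for (String[] dep : typedDeps) {
            model.add(model.createResource(NS + dep[1]),   // governor as subject
                      model.createProperty(NS + dep[0]),   // relation as predicate
                      model.createResource(NS + dep[2]));  // dependent as object
        }
        return model;
    }

    public static void main(String[] args) {
        String[][] deps = {
            {"nsubj", "bought", "aliana"},
            {"dobj",  "bought", "car"},
            {"det",   "car",    "a"},
            {"root",  "root",   "bought"}
        };
        rdfize(deps).write(System.out, "TURTLE");  // prints RDF like the detailed example below
    }
}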
Query Execution
1. We parse the natural language questions with the Stanford
   dependency parser; the result is a set of typed dependencies.
2. We then translate the typed dependencies into a SPARQL query.
3. The SPARQL query is executed over the RDF data produced by
   Facts Population (see the sketch below).
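Executing the generated query with Apache Jena takes only a few lines; the
following is an illustrative sketch (assuming Jena 3.x, where QueryExecution
is AutoCloseable; the class name is hypothetical).

import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.QuerySolution;
import org.apache.jena.query.ResultSet;
import org.apache.jena.rdf.model.Model;

public class QueryRunnerSketch {
    // Runs a SELECT query over the populated RDF model and prints the ?x bindings.
    public static void run(Model populatedModel, String sparqlSelect) {
        try (QueryExecution qe = QueryExecutionFactory.create(sparqlSelect, populatedModel)) {
            ResultSet results = qe.execSelect();
            while (results.hasNext()) {
                QuerySolution row = results.next();
                System.out.println(row.get("x"));   // the answer binding, e.g. :aliana
            }
        }
    }
}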

Background Knowledge
               WordNet
Synonymy:
<http://example.org/sentence/buy>
  owl:sameAs
  <http://example.org/sentence/purchase> .
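One way to make such OWL axioms take effect (an assumption; the slides do not
show the exact inference setup) is to wrap the fact model in a Jena inference
model backed by the built-in OWL reasoner:

import org.apache.jena.rdf.model.InfModel;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.reasoner.Reasoner;
import org.apache.jena.reasoner.ReasonerRegistry;

public class InferenceSketch {
    // Attaches the WordNet/DBpedia OWL axioms to the fact model, so SPARQL
    // queries over the returned model can also see inferred triples.
    public static InfModel withBackgroundKnowledge(Model facts, Model owlAxioms) {
        Reasoner reasoner = ReasonerRegistry.getOWLReasoner().bindSchema(owlAxioms);
        return ModelFactory.createInfModel(reasoner, facts);
    }
}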




Background Knowledge
              WordNet
Hyponymy:
• PREFIX : <http://example.org/sentence/>
  CONSTRUCT {:vehicle ?y ?z} WHERE {:car ?y ?z}
• PREFIX : <http://example.org/sentence/>
  CONSTRUCT {?x ?y :vehicle} WHERE {?x ?y :car}
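These CONSTRUCT queries act as forward rules: facts stated about :car are
copied to :vehicle. A possible way to apply such a rule with Jena (a sketch,
not necessarily the author's implementation):

import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.rdf.model.Model;

public class ConstructRuleSketch {
    // Runs one CONSTRUCT-based rule and adds the produced triples back
    // into the data model so later queries can match them.
    public static void applyRule(Model data, String constructQuery) {
        try (QueryExecution qe = QueryExecutionFactory.create(constructQuery, data)) {
            Model inferred = qe.execConstruct();
            data.add(inferred);
        }
    }
}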


Background Knowledge
              WordNet
Troponymy:
• PREFIX : <http://example.org/sentence/>
  CONSTRUCT {:move ?y ?z} WHERE {:run ?y ?z}
• PREFIX : <http://example.org/sentence/>
  CONSTRUCT {?x ?y :move} WHERE {?x ?y :run}


Background Knowledge
                DBpedia


<http://example.org/sentence/italy>
  owl:sameAs
  <http://dbpedia.org/resource/Italy> .




IMPLEMENTATION AND RESULT


Program
•   Java-based
•   Reuse Apache Jena: Semantic Web (SW) library
•   Reuse Stanford Parser: Typed dependencies
•   MorphAdorner: Lemmatization and verb
    conjugation




A Detailed Example (1)
• Fact: Aliana bought a car
• Question: Who purchased a vehicle?




A Detailed Example (2)
• Typed Dependencies
[nsubj(bought-2, Aliana-1), root(ROOT-0, bought-2),
 det(car-4, a-3), dobj(bought-2, car-4)]




A Detailed Example (3)
RDF:
<http://example.org/sentence/car>
   <http://example.org/sentence/det>
        <http://example.org/sentence/a> .

<http://example.org/sentence/root>
   <http://example.org/sentence/root>
        <http://example.org/sentence/bought> .

<http://example.org/sentence/bought>
   <http://example.org/sentence/dobj>
        <http://example.org/sentence/car> ;
   <http://example.org/sentence/nsubj>
        <http://example.org/sentence/aliana> .
A Detailed Example (4)
The knowledge base already contains the triple:
         :bought owl:sameAs :purchased .
and also the following rules:
PREFIX : <http://example.org/sentence/>
CONSTRUCT {:vehicle ?y ?z} WHERE {:car ?y ?z}
PREFIX : <http://example.org/sentence/>
CONSTRUCT {?x ?y :vehicle} WHERE {?x ?y :car}
A Detailed Example (5)
Inferred facts:
<http://example.org/sentence/purchased>
    <http://example.org/sentence/dobj>
        <http://example.org/sentence/vehicle> ;
    <http://example.org/sentence/nsubj>
        <http://example.org/sentence/aliana> .



A Detailed Example (6)
Typed dependencies of question:
[nsubj(purchased-2, Who-1), root(ROOT-0, purchased-2),
 det(vehicle-4, a-3), dobj(purchased-2, vehicle-4)]
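The mapping from these typed dependencies to the SPARQL query on the next
slide can be sketched as follows. This is a hypothetical illustration of the
SPARQLizer idea only (the slides do not show its actual code): every
dependency (relation, governor, dependent) becomes a triple pattern, and
wh-words such as "who"/"what" become the answer variable ?x.

import java.util.List;

public class SparqlizerSketch {
    static final String PREFIX = "PREFIX : <http://example.org/sentence/>\n";

    public static String toSelect(List<String[]> deps) {
        StringBuilder where = new StringBuilder();
        for (String[] d : deps) {                 // d = {relation, governor, dependent}
            where.append("  ")
                 .append(term(d[1])).append(" :").append(d[0]).append(" ")
                 .append(term(d[2])).append(" .\n");
        }
        return PREFIX + "SELECT ?x WHERE {\n" + where + "}";
    }

    // Wh-words turn into the answer variable; everything else becomes a :sentence resource.
    private static String term(String word) {
        return (word.equals("who") || word.equals("what")) ? "?x" : ":" + word;
    }
}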




A Detailed Example (7)
SPARQL form of question:
SELECT ?x WHERE {
  :vehicle :det :a .
  :purchased :nsubj ?x .
  :purchased :dobj :vehicle .
  :root :root :purchased }

Answer: “aliana”

DBpedia Integration
• By adding some background knowledge from
  DBpedia, one can ask more questions.
• Example of Italy data:
:italy owl:sameAs dbpedia:Italy .
dbpedia:Italy dbpprop:capital "Rome" .
dbpedia:Enzo_Ferrari dbpedia-owl:nationality dbpedia:Italy ;
    dbpprop:deathPlace dbpedia:Maranello .
dbpedia:Enzo_Ferrari dbpedia-owl:child dbpedia:Piero_Ferrari ,
    dbpedia:Alfredo_Ferrari .
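How the DBpedia triples are obtained is not spelled out in the slides; one
possible way (an assumption) is to dereference the resource URI with Jena,
which retrieves the published RDF description via Linked Data content
negotiation:

import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;

public class DBpediaLoaderSketch {
    // Fetches the RDF triples published for dbpedia.org/resource/Italy.
    public static Model loadItaly() {
        Model m = ModelFactory.createDefaultModel();
        m.read("http://dbpedia.org/resource/Italy");
        return m;
    }
}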

Example Case
• Fact = “Fariz loves Italy.”
• Question = “Does Fariz love a country, whose
  capital is Rome, which was the nationality of a
  person who passed away in Maranello and
  whose sons are Piero Ferrari and Alfredo
  Ferrari?”

• Thus the answer will be YES, even though the only stated fact is
  "Fariz loves Italy."
Example Case (cont.)
• Note that in the previous example the fact is translated
  automatically by the system, but the question is translated
  manually into the following SPARQL query:
  ASK WHERE {
    :love :nsubj :fariz .
    :root :root :love .
    :love :dobj ?x .
    ?x dbpprop:capital "Rome" .
    ?y dbpedia-owl:nationality ?x ;
       dbpprop:deathPlace dbpedia:Maranello .
    ?y dbpedia-owl:child dbpedia:Piero_Ferrari ,
       dbpedia:Alfredo_Ferrari }

How to handle negation? (1)
• Fact: I did not buy it.
• RDF:
<http://example.org/sentence/root>
   <http://example.org/sentence/root>
        <http://example.org/sentence/bought> .

<http://example.org/sentence/bought>
   <http://example.org/sentence/dobj>
        <http://example.org/sentence/it> ;
   <http://example.org/sentence/neg>
        <http://example.org/sentence/not> ;
   <http://example.org/sentence/nsubj>
        <http://example.org/sentence/i> .

How to handle negation? (2)
• Question: Who bought it?
• SPARQL:
SELECT ?x WHERE {
  :bought :nsubj ?x .
  :bought :dobj :it .
  :root :root :bought .
  FILTER NOT EXISTS { [] :neg ?z . } }




How to handle negation? (3)
• Question: Who did not buy it?  Answer: I.
QUERY: SELECT ?x WHERE {
  :bought :dobj :it .
  :bought :neg :not .
  :bought :nsubj ?x .
  :root :root :bought }




How to handle tenses? (1)
• Fact (I will buy it):
<http://example.org/sentence/buy>
   <http://example.org/sentence/aux>
         <http://example.org/sentence/will> ;
   <http://example.org/sentence/dobj>
         <http://example.org/sentence/it> ;
   <http://example.org/sentence/nsubj>
         <http://example.org/sentence/i> .

How to handle tenses? (2)
• Who buys it?
• SELECT ?x WHERE {
    :root :root :buys .
    :buys :nsubj ?x .
    :buys :dobj :it .
    FILTER NOT EXISTS { [] :aux :will . } }




How to handle passive sentences?
Fact: Juliet was killed by Romeo.
<http://example.org/sentence/root>
   <http://example.org/sentence/root>
         <http://example.org/sentence/killed> .

<http://example.org/sentence/killed>
   <http://example.org/sentence/agent>
        <http://example.org/sentence/romeo> ;
   <http://example.org/sentence/nsubjpass>
        <http://example.org/sentence/juliet> .

How to handle passive sentences?
• Ontology:
:nsubjpass owl:equivalentProperty :dobj .
  :agent owl:equivalentProperty :nsubj .




How to handle passive sentences?
• Who killed Juliet?
SELECT ?x WHERE {
  :killed :nsubj ?x .
  :killed :dobj :juliet .
  :root :root :killed .
  FILTER NOT EXISTS { [] :neg ?z . } }




DEMO - A Story about Antonio
Antonio is a famous and cool doctor. Antonio
  has been working for 10 years. Antonio is in
  Italy. Antonio can dance Salsa well. Antonio
  loves Maria and Karina. Antonio is also loved
  by Karina. Antonio never cooks. But Maria
  always cooks. Antonio just bought a car.
  Antonio must fly to Indonesia tomorrow.



Conclusions
• Dependency parsing-based QA system with
  RDF and SPARQL
• The system is also aware of negations, tenses
  and passive sentences
• Future improvements: a more advanced parsing method, a more
  efficient inference system, richer background knowledge


APPENDIX: EXAMPLES


Working Examples
• "The Japanese girl sang the song beautifully."

• "Who sang the song beautifully?"
• "Who sang the song?"
• "Who sang?"




Working Examples
• "The Beatles do sing the song perfectly.“

• "How do The Beatles sing the song?"




Working Examples
• "They should sing the song well.“

• "How should they sing the song?"




Working Examples
• "I did buy the book, the pencil and the ruler
  yesterday.“

• "What did I buy?"




Working Examples
• "Microsoft is located in Redmond.“

• "What is located in Redmond?"




Working Examples
• "The man was killed by the police."

• "Who killed the man?"




Working Examples
• "Sam is genius, honest and big."

• "Who is big?"




Working Examples
• "John is rude.";

• "Is John good?"
• "Is John rude?"




Working Examples
• "John, that is the founder of Microsoft and the
  initiator of Greenpeace movement, is genius,
  honest and cool."

• "Who is honest?"




Working Examples
• "Farid wants to go to Rome.“

• "Who wants to go to Rome?"
• "Who wants to go?"
• "Who wants?"




Working Examples
• "Jorge ate 10 delicious apples."

• "Who ate 10 delicious apples?"
• "Who ate 10 apples?"
• "Who ate apples?"




Working Examples
• "John is a doctor.“

• "Is John a doctor?"
• "Is John a teacher?"




Working Examples
• "John is a good doctor."

• "Who is John?"
• "What is John?"




Working Examples
• "John is in Alaska."
• "John is at home."
• "John is on the street."

• "Where is John?"




Working Examples
• "Apples are good for health.“

• "What are good for health?"




Editor's Notes

  • #3 Siri image: apple.com; Watson image: gizmodo.com. Example questions (querying contacts): What's Michael's address? What is Susan Park's phone number? When is my wife's birthday? Show Jennifer's home email address.
  • #4 Siri image: apple.com; Watson image: gizmodo.com. Answer/question example: 4 July ("When is the celebration of the independence day of the USA?").
  • #8 The Stanford dependencies provide a representation of grammatical relations between words in a sentence. They have been designed to be easily understood and effectively used by people who want to extract textual relations. Stanford dependencies (SD) are triplets: name of the relation, governor and dependent. The dependencies are produced using hand-written tregex patterns over phrase-structure trees as described in: Marie-Catherine de Marneffe, Bill MacCartney and Christopher D. Manning. 2006. Generating Typed Dependency Parses from Phrase Structure Parses. In LREC 2006.
  • #9 SW (Semantic Web): an extension of the Web with machine-interpretable content.
  • #10 Synsets: sets of cognitive synonyms, each expressing a distinct concept. Examples: synonymy: buy/purchase; antonymy: bad/good; hyponymy: hyponymy/relation; meronymy: tires/car; troponymy: run/move.