L-R Feature Structure Unification Syntactic Parser
Richard Caneba
RPI Cognitive Science Department
Human-Level Intelligence Laboratory
Intuitions
• An interpretive grammar views syntax as finding the most
  appropriate sequence of head and dependency relationships
  between phrases and words.
• Language understanding occurs (roughly) left to right
• Syntactic trees have a flat structure that gives no syntactic
  preference to sequences of adjunctive modifiers of the same
  category (adjectives, adverbs, modifying prepositional
  phrases)
• We can infer a number of things immediately from the
  perception of a word, although by no means all things
Intuitions cont’d
• Many patterns exist in natural language; some can be
  deterministic, while others must be defeasible/probabilistic.
• Reliably deterministic:
  • [Det N] => NP[Det N]
  • [Adj N] => NP[Adj N]
• Defeasible:
  • [V NP NP…] (<1.0)> VP[V NP NP…]
  • [V NP NP…] (<1.0)> VP[V NP[NP…]…]
• Attempt search ONLY if there is a genuine ambiguity as to
  what the next step in an L-R parse should be
  • Second object/Relative clause modifier in ditransitive context
  • Prepositional phrase attachment
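The deterministic/defeasible split above can be sketched procedurally. This is a minimal illustration of the policy, not the system's implementation; the rule tables, function names, and probabilities are hypothetical:

```python
# Deterministic rules: a tag sequence always forms the same phrase,
# so it can be applied immediately with no search.
DETERMINISTIC = {
    ("Det", "N"): "NP",
    ("Adj", "N"): "NP",
}

# Defeasible rules: several candidate analyses, each with a prior < 1.0.
# The probabilities here are illustrative only.
DEFEASIBLE = {
    ("V", "NP", "NP"): [
        ("VP[V NP NP]", 0.7),       # ditransitive: second NP is an object
        ("VP[V NP[NP ...]]", 0.3),  # second NP begins a modifier of the first
    ],
}

def next_step(tags):
    """Return a forced analysis, or the candidates that require search."""
    if tags in DETERMINISTIC:
        return ("apply", DETERMINISTIC[tags])
    if tags in DEFEASIBLE:
        return ("search", DEFEASIBLE[tags])
    return ("shift", None)  # no rule matches yet: keep reading left-to-right

print(next_step(("Det", "N")))       # deterministic: applied immediately
print(next_step(("V", "NP", "NP")))  # genuine ambiguity: triggers search
```

The point of the sketch: search machinery is only invoked on the defeasible table; the deterministic table short-circuits it.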
Feature Structure Unification
• A traditional challenge with the HPSG theory of grammar is
  that, in order to preserve the recursiveness of its grammar
  rules, it requires a “right-branching” structure that posits an
  additional feature structure node for each dependency-head
  relationship the theory recognizes
• This is to some extent cognitively unrealistic:
  • Posits an unnecessary amount of structure for a syntactic parse
  • Intuitively there is no syntactic distinction that should be made
    between sequences of adjuncts (it’s hard to tell the difference
    between “the angry green dog” and “the green angry dog”)
Lexical Representation of Syntax
• Each word posits a sequence of head-dependency
  relationships that form a “phrasal chain.”
• These chains are based on the notion that we can infer
  immediately some head-dependency relationships based on
  the syntactic category of the word.
• Roughly, each node in a chain is one of three types (not explicitly
  defined in the lexicon, but nonetheless present):
  • Word Level (WordUtteranceEvent)
  • Dependency Level (PhraseUtteranceEvent)
  • Head Level (PhraseUtteranceEvent)
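The three node levels of a phrasal chain can be sketched as a small data structure. This is an illustrative reconstruction; the class and field names are my assumptions, not the system's actual representation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    level: str                        # "word", "dependency", or "head"
    category: str                     # "?" marks an underspecified category
    part_of: Optional["Node"] = None  # PartOf link up the chain
    phon: Optional[str] = None        # phonological form (word level only)

def chain_for_common_noun(phon):
    """A common noun like 'dog' posits a word -> dependency -> head chain."""
    head = Node("head", "?")                # head level: category still open
    dep = Node("dependency", "Noun", head)  # the noun phrase the word projects
    return Node("word", "CommonNoun", dep, phon)

dog = chain_for_common_noun("dog")
```

Hearing the word thus posits the whole chain at once, with the head-level category left open for later unification.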
Lexical Representation of Syntax
• Let’s do a quick example to show the lexical syntactic
  representation:
• “the angry dog”
• With part-of-speech tags, that is:
• [Det the][Adj angry][N dog].
• The representation in di-graph form:
Lexical Representation of Syntax
[Figure: di-graph of the syntactic entry for the common noun “dog” — a WordUtteranceEvent (Phon “dog”, IsA CommonNoun) with a Specifier link to a Determiner, PartOf a PhraseUtteranceEvent (IsA Noun), which is in turn PartOf a head-level PhraseUtteranceEvent whose CandType ranges over Noun, Verb, and Preposition.]
Lexical Representation of Syntax
[Figure: di-graph of the syntactic entry for the adjective “angry” — a WordUtteranceEvent (Phon “angry”, IsA Adjective) PartOf a PhraseUtteranceEvent whose CandType/IsA is Noun.]
NOTE: will need to posit a dependency layer to account for adverbs that modify the adjective, e.g. “really big”.
Lexical Representation of Syntax
[Figure: di-graph of the syntactic entry for the determiner “the” — a WordUtteranceEvent (Phon “the”, IsA Determiner) PartOf a PhraseUtteranceEvent (CandType Noun, IsA Noun), which is in turn PartOf a head-level PhraseUtteranceEvent whose CandType ranges over Noun, Verb, and Preposition.]
Grammar Rules
• In our example, we will need to have at least two rules:
  • One that unifies the structures posited by the determiner with the
    structures posited by the common noun
  • One that unifies the structures posited by the adjective with either
    the determiner or the noun
  • Let’s consider this from L-R:
     • First, unify the Det-NP-XP structure chain to the Adj-NP structure
       chain
     • Next, unify that resulting structure chain to the N-NP-XP structure
       chain
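The core unification step behind both rules can be illustrated with a toy category-unification function, assuming “?” marks an underspecified value (my notation, not necessarily the system's):

```python
def unify_cat(a, b):
    """Unify two category values; return the result or None on failure."""
    if a == "?":
        return b          # an open slot takes on the other value
    if b == "?" or a == b:
        return a          # either b is open, or the two already agree
    return None           # clash: unification fails

# The Det chain posits an NP whose head category is still open ("?");
# the noun's chain supplies "Noun". Unifying left-to-right resolves it:
print(unify_cat("?", "Noun"))     # resolves the open slot
print(unify_cat("Noun", "Verb"))  # clash: unification fails
```

Unifying two whole chains then amounts to unifying the category (and other feature) values of their corresponding nodes.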
Grammar Rules
• Determiner-Adjective Rule
[Figure: the Det chain for “the” and the Adj chain for “angry” side by side, before unification.]
Grammar Rules
• Determiner-Adjective Rule
[Figure: the same two chains, with Same links identifying the corresponding Noun nodes.]
Grammar Rules
• Determiner-Adjective Rule
[Figure: the unified Det-Adj structure — a single chain covering “the angry”.]
Grammar Rules
• We would like to allow for anywhere from 0-infinite number
  of adjectives to stand between the determiner and the noun
  that selects the determiner as its specifier.
• We can achieve this by explicitly stating that whenever a Det
  chain and an Adj chain are unified, it’s exposed as a
  determiner on the right wall of the growing parse, as opposed
  to an adjective.
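The “expose as determiner” policy can be sketched as a greedy left-to-right reduction. This toy version handles only Det Adj* N sequences and is not the actual parser:

```python
def parse_np(tags):
    """Greedy L-R reduction over a Det Adj* N tag sequence (illustrative)."""
    stack = []
    for t in tags:
        if stack and stack[-1] == "Det" and t == "Adj":
            stack[-1] = "Det"  # Det+Adj unified, re-exposed as Det:
                               # the same rule can fire for the next Adj
        elif stack and stack[-1] == "Det" and t == "N":
            stack[-1] = "NP"   # specifier rule closes the NP
        else:
            stack.append(t)
    return stack

print(parse_np(["Det", "Adj", "Adj", "N"]))  # two adjectives absorbed
print(parse_np(["Det", "N"]))                # zero adjectives also works
```

Because the Det-Adj result is relabeled Det rather than Adj, the same two rules cover zero or arbitrarily many adjectives without any recursion in the rule set.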
Grammar Rules
• Determiner-Adjective Resulting Structure
[Figure: the unified Det-Adj structure, exposed as a Det chain on the right wall of the growing parse.]
Grammar Rules
• Determiner-Adjective Resulting Structure + NP
[Figure: the Det-Adj structure unified with the chain for the common noun “dog”, completing the NP “the angry dog”.]
Grammar Rules
• Expose the resulting structure from the Det-Adj unification as
  just the Det structure:
[Figure, three steps: (1) the Det and Adj chains (NP under XP) beside the N chain with its Spr link; (2) the Border marked on the left chain and the Frontier on the right; (3) Same links identifying the NP and XP nodes across the Border/Frontier.]
Grammar Rules
<!-- Pre-head Adjective Modifier w/ Det: Shift Border -->
<constraint shouldFalsify="false">
    Border(?ba, ?t0, ?w)^
    Border(?bb, ?t0, ?w)^
    Frontier(?fa, ?t1, ?w)^
    Frontier(?fb, ?t1, ?w)^
    Meets(?t0, ?t1, E, ?w)^
    PartOf(?ba, ?bb, E, ?w)^
    PartOf(?fa, ?fb, E, ?w)^
    IsA(?ba, Determiner, E, ?w)^
    IsA(?bb, Noun, E, ?w)^
    IsA(?fa, Adjective, E, ?w)^
    IsA(?fb, Noun, E, ?w)
    ==>
    Same(?bb, ?fb, E, ?w)^
    Border(?ba, ?t1, ?w)
</constraint>

<!-- Subcategorization Rules: NP Specifier -->
<constraint shouldFalsify="false">
    Border(?ba, ?t0, ?w)^
    Border(?bb, ?t0, ?w)^
    Frontier(?fa, ?t1, ?w)^
    Frontier(?fb, ?t1, ?w)^
    Meets(?t0, ?t1, E, ?w)^
    PartOf(?ba, ?bb, E, ?w)^
    PartOf(?fa, ?fb, E, ?w)^
    IsA(?ba, Determiner, E, ?w)^
    IsA(?bb, Noun, E, ?w)^
    Specifier(?fa, ?spr, E, ?w)^
    IsA(?spr, Determiner, E, ?w)^
    IsA(?fb, Noun, E, ?w)^
    Heard(?wue, E, ?w)^
    IsA(?wue, WordUtteranceEvent, ?t1, ?w)
    ==>
    Same(?ba, ?spr, E, ?w)^
    Same(?bb, ?fb, E, ?w)^
    Border(?wue, ?t1, ?w)^
    _NPSPR(?ba, ?bb, ?fa, ?fb, E, ?w)
</constraint>
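A rough procedural reading of the Shift Border rule, under my simplifying assumption (not the source's) that the border and frontier can each be summarized as a (dependent, head) category pair:

```python
def shift_border(border, frontier):
    """Each argument is a (dependent_category, head_category) pair at
    adjacent time intervals; return the new border pair, or None."""
    if border == ("Determiner", "Noun") and frontier == ("Adjective", "Noun"):
        # Same(?bb, ?fb): the two Noun projections are identified as one
        # node; Border(?ba, ?t1): the determiner advances to the frontier's
        # interval, re-exposed as the border of the grown structure.
        return ("Determiner", "Noun")
    return None  # rule does not apply; other rules (or search) take over

print(shift_border(("Determiner", "Noun"), ("Adjective", "Noun")))
```

The returned pair is again Det→Noun, which is exactly the “expose as determiner” behavior: the rule can fire once per adjective.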
Grammar Rules

[send] [john] [a] [message] [that] [says] [“hi”].
Grammar Rules

[V send] [N john] [Det a] [N message] [RelP that] [V says] [Q “hi”].
Grammar Rules
[Figure: the lexically posited chains above each word — VP, NP, NP, NP, VP, VP, NP at the dependency level, with head-level XP/NP nodes above them.]
[V send] [N john] [Det a] [N message] [RelP that] [V says] [Q “hi”].
Grammar Rules
[Figure: the completed parse in tree form — the VP headed by “send” dominating NP “john” and NP “a message”, with the relative-clause VP “that says” attached under “message” and dominating the quoted NP “hi”.]
[V send] [N john] [Det a] [N message] [RelP that] [V says] [Q “hi”].
Grammar Rules
• Benefits of this feature-structure unification parse:
    •   Captures the intuition that when we hear a word and posit its feature structure, we can infer not
        only the word’s direct feature structure (usually generated by lexical rules) but also the existence
        of additional structures, their head/dependency relationships, and some of the values in those
        structures.
    •   Ambiguities (e.g. the head of an NP) are resolved from L-R through lazy definitions and unification
        of under-defined structures with well-defined structures in terms of particular features.
    •   Posits no more structure in the parse tree than is necessary to reflect a parse, whereas theories
        like HPSG posit a large number of structures in a branching tree in order to preserve the
        recursivity of their grammar rules.
    •   However, we have shown that with feature structure unification, at least in theory, we can preserve
        the recursivity of many of the rules without requiring a left- or right-branching structure.
    •   All of the structure necessary to build a parse is known from the beginning.
Grammar Rules!
• The future:
   •   Ungrammaticality: when objects aren’t where they are supposed to be, search for a likely
       head-dependency relationship
         •   Missing arguments: “Car is big.”
         •   Extra words (full content words are rarely extra, but it occurs in natural language: “I saw the, um,
             car.”)
         •   Dependents out of order: “Give the car me.”
         •   Dangling dependent: “
         •   Will require a good branch-and-bound system that only performs search when a reasonable
             expectation/prediction is violated.
   •   Give a feature-structure unification account of garden-path sentences
         •   Should be fairly natural given the L-R predictive nature of the parser
   •   Attach a semantic representation that generates word senses based on head-dependency
       relationships.
         •   Syntax should be closely tied to semantics, in that each helps compute the other to varying degrees.
   •   Examine discourse from a syntactic perspective and syntax from a discourse perspective, and use
       both to disambiguate simultaneously.
Notes on Theory (boring)
• By having a lexical representation that is closely tied to the syntax, a number of advantages
  fall out:
   • Parsimony: by allowing much information to be loosely defined or undefined at the lexical
     level, we need neither additional lexical entries to cover every possible configuration of a
     phrase’s arguments nor an excessive number of lexical rules to generate these
     representations.
   • Generativity: a word’s sense is at least in part generated by its relationships to its dependents
     and head, and the semantic/syntactic types of these dependents/heads can in theory
     compute a word’s sense on the fly (inspired by Pustejovsky’s Generative Lexicon theory).
   • Context embedding: by tying the theory of the lexicon closely to syntactic theory, we move
     towards embedding the lexical representation in a cognitive system that is closely tied to the
     way words are ACTUALLY used.
Lexical Mosaics
• Thus, we can see that the senses of words come from a number of
  different sources:
  • Memory
  • Syntactic context
  • Pragmatic/discourse factors
• The hope for future research is to tie these together in an
  organized way, giving a theory of lexical representation that is tied
  closely to these factors in a computable and tractable manner.
• Early goals:
  • Compute word senses from syntactic context + memory (very
    difficult)
  • Use syntactic context to disambiguate lexical ambiguity
  • Use generative word sense to disambiguate syntactic ambiguity
  • Simultaneously attempt a computational account of lexical
    memory, syntactic parsing, and pragmatics/discourse.

 
GraphRAG is All You need? LLM & Knowledge Graph
GraphRAG is All You need? LLM & Knowledge GraphGraphRAG is All You need? LLM & Knowledge Graph
GraphRAG is All You need? LLM & Knowledge Graph
 
When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...When stars align: studies in data quality, knowledge graphs, and machine lear...
When stars align: studies in data quality, knowledge graphs, and machine lear...
 
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
 
Transcript: Selling digital books in 2024: Insights from industry leaders - T...
Transcript: Selling digital books in 2024: Insights from industry leaders - T...Transcript: Selling digital books in 2024: Insights from industry leaders - T...
Transcript: Selling digital books in 2024: Insights from industry leaders - T...
 
How world-class product teams are winning in the AI era by CEO and Founder, P...
How world-class product teams are winning in the AI era by CEO and Founder, P...How world-class product teams are winning in the AI era by CEO and Founder, P...
How world-class product teams are winning in the AI era by CEO and Founder, P...
 
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024
 
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...
 
Assuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyesAssuring Contact Center Experiences for Your Customers With ThousandEyes
Assuring Contact Center Experiences for Your Customers With ThousandEyes
 
The Art of the Pitch: WordPress Relationships and Sales
The Art of the Pitch: WordPress Relationships and SalesThe Art of the Pitch: WordPress Relationships and Sales
The Art of the Pitch: WordPress Relationships and Sales
 
Mission to Decommission: Importance of Decommissioning Products to Increase E...
Mission to Decommission: Importance of Decommissioning Products to Increase E...Mission to Decommission: Importance of Decommissioning Products to Increase E...
Mission to Decommission: Importance of Decommissioning Products to Increase E...
 
To Graph or Not to Graph Knowledge Graph Architectures and LLMs
To Graph or Not to Graph Knowledge Graph Architectures and LLMsTo Graph or Not to Graph Knowledge Graph Architectures and LLMs
To Graph or Not to Graph Knowledge Graph Architectures and LLMs
 
FIDO Alliance Osaka Seminar: Passkeys at Amazon.pdf
FIDO Alliance Osaka Seminar: Passkeys at Amazon.pdfFIDO Alliance Osaka Seminar: Passkeys at Amazon.pdf
FIDO Alliance Osaka Seminar: Passkeys at Amazon.pdf
 

Feature Structure Unification Syntactic Parser 2.0

  • 1. L-R Feature Structure Unification Syntactic Parser Richard Caneba RPI Cognitive Science Department Human-Level Intelligence Laboratory
  • 2. Intuitions • An interpretive grammar views syntax as finding the most appropriate sequence of head and dependency relationships between phrases and words. • Language understanding occurs (roughly) left to right. • Syntactic trees have a flat structure that gives no syntactic preference to sequences of adjunctive modifiers of the same category (adjectives, adverbs, modifying prepositional phrases). • We can infer a number of things immediately from the perception of a word, although by no means all things.
  • 3. Intuitions cont’d • Many patterns exist in natural language that can be deterministic in some cases and must be defeasible/probabilistic in others. • Reliably deterministic: • [Det N] => NP[Det N] • [Adj N] => NP[Adj N] • Defeasible: • [V NP NP…] (<1.0)> VP[V NP NP…] • [V NP NP…] (<1.0)> VP[V NP[NP…]…] • Attempt search ONLY if there is a genuine ambiguity as to what the next step in an L-R parse should be: • Second object/relative clause modifier in a ditransitive context • Prepositional phrase attachment
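The deterministic/defeasible split above can be sketched as follows. This is a toy illustration only: the rule table, the 0.7/0.3 weights, and the function names are invented for the sketch, not taken from the parser.

```python
# Deterministic rules always fire with a single result; defeasible rules
# carry weights (< 1.0) over several candidates, and search is attempted
# only when more than one candidate applies.

DETERMINISTIC_RULES = {
    ("Det", "N"): "NP",   # [Det N] => NP[Det N]
    ("Adj", "N"): "NP",   # [Adj N] => NP[Adj N]
}

DEFEASIBLE_RULES = {
    ("V", "NP", "NP"): [
        ("VP[V NP NP]", 0.7),       # both NPs are objects (weight is invented)
        ("VP[V NP[NP...]]", 0.3),   # second NP modifies the first
    ],
}

def candidates(pattern):
    """Return (analysis, weight) candidates for a tag pattern."""
    if pattern in DETERMINISTIC_RULES:
        return [(DETERMINISTIC_RULES[pattern], 1.0)]
    return DEFEASIBLE_RULES.get(pattern, [])

def needs_search(pattern):
    """Search only when there is genuine ambiguity."""
    return len(candidates(pattern)) > 1
```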
  • 4. Feature Structure Unification • A traditional challenge with the HPSG theory of grammar is that, in order to preserve the recursiveness of its grammar rules, it was required to have a “right-branching” structure that posits additional feature structure nodes for each dependency-head relationship the theory posits. • This is to some extent cognitively unrealistic: • It posits an unnecessary amount of structure for a syntactic parse. • Intuitively there is no syntactic distinction that should be made between sequences of adjuncts (it’s hard to tell the difference between “the angry green dog” and “the green angry dog”).
  • 5. Lexical Representation of Syntax • Each word posits a sequence of head-dependency relationships that form a “phrasal chain.” • These chains are based on the notion that we can immediately infer some head-dependency relationships from the syntactic category of the word. • Roughly, each node in a chain is one of three types (not explicitly defined in the lexicon, but nonetheless present): • Word Level (WordUtteranceEvent) • Dependency Level (PhraseUtteranceEvent) • Head Level (PhraseUtteranceEvent)
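The phrasal-chain idea can be sketched as follows. The node levels and candidate-type sets are simplified readings of the entry diagrams in this deck, not the parser's actual representation.

```python
# A lexical entry posits a word-level node plus the dependency- and
# head-level phrase nodes it can immediately infer from its category.
# Candidate types stay underspecified until unification narrows them.

from dataclasses import dataclass, field

@dataclass
class Node:
    level: str                                     # "word", "dependency", "head"
    cand_types: set = field(default_factory=set)   # possible categories

def phrasal_chain(pos):
    """Build the chain a word of category `pos` posits (simplified)."""
    if pos in ("CommonNoun", "Determiner"):
        return [
            Node("word", {pos}),
            Node("dependency", {"Noun"}),              # the NP being built
            Node("head", {"Verb", "Preposition"}),     # what may select the NP
        ]
    if pos == "Adjective":
        return [
            Node("word", {"Adjective"}),
            Node("dependency", {"Noun"}),              # the NP it modifies
        ]
    raise ValueError(f"no entry sketched for {pos}")
```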
  • 6. Lexical Representation of Syntax • Let’s do a quick example to show the lexical syntactic representation: • “the angry dog” • With part-of-speech tags, that is: • [Det the] [Adj angry] [N dog]. • The representation in digraph form:
  • 7. Lexical Representation of Syntax (diagram: the syntactic entry for a common noun — the WordUtteranceEvent with Phon “dog”, IsA CommonNoun, is PartOf a Noun PhraseUtteranceEvent whose Specifier has CandType Determiner; that node is in turn PartOf a node with CandType Verb/Preposition)
  • 8. Lexical Representation of Syntax (diagram: the syntactic entry for an adjective — the WordUtteranceEvent with Phon “angry”, IsA Adjective, is PartOf a PhraseUtteranceEvent with CandType Noun) NOTE: will need to posit a dependency layer to account for adverbs that modify the adjective, i.e. “really big”.
  • 9. Lexical Representation of Syntax (diagram: the syntactic entry for a determiner — the WordUtteranceEvent with Phon “the”, IsA Determiner, is PartOf a PhraseUtteranceEvent with CandType Noun, itself PartOf a node with CandType Verb/Preposition)
  • 10. Grammar Rules • For our example, we need at least two rules: • One that unifies the structures posited by the determiner with the structures posited by the common noun • One that unifies the structures posited by the adjective with either the determiner or the noun • Let’s consider this from L-R: • First, unify the Det-NP-XP structure chain with the Adj-NP structure chain • Next, unify that resulting structure chain with the N-NP-XP structure chain
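A minimal sketch of the unification operation these rules rely on, using flat attribute-value structures where None marks an underspecified slot. The feature names (“cat”, “spr”, “head_word”) are illustrative, not the parser's actual features.

```python
def unify(fs1, fs2):
    """Merge two flat feature structures; return None on a value clash."""
    out = dict(fs1)
    for feat, val in fs2.items():
        if out.get(feat) is None:
            out[feat] = val            # fill an underspecified slot
        elif val is not None and out[feat] != val:
            return None                # clash: unification fails
    return out

# The determiner's posited NP leaves the head noun open; the common noun's
# own NP node supplies it (the L-R unification step, much simplified):
det_np = {"cat": "Noun", "spr": "Determiner", "head_word": None}
n_np = {"cat": "Noun", "spr": None, "head_word": "dog"}
np_node = unify(det_np, n_np)
```

Underspecification is what lets the determiner's chain commit early without guessing the noun: unification fills the open slot when the noun arrives, and only an actual value clash blocks the parse step.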
  • 11. Grammar Rules • Determiner-Adjective Rule (diagram: the chains for “the” and “angry” side by side, each positing a Noun node beneath a node with CandType Verb/Preposition)
  • 12. Grammar Rules • Determiner-Adjective Rule (diagram: a Same link identifies the Noun node of the determiner’s chain with the Noun node of the adjective’s chain)
  • 13. Grammar Rules • Determiner-Adjective Rule (diagram: the unified structure, with “the” and “angry” now sharing a single Noun node beneath the Verb/Preposition CandType node)
  • 14. Grammar Rules • We would like to allow anywhere from zero to an unbounded number of adjectives to stand between the determiner and the noun that selects the determiner as its specifier. • We can achieve this by explicitly stating that whenever a Det chain and an Adj chain are unified, the result is exposed as a determiner on the right wall of the growing parse, as opposed to an adjective.
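The “expose as Det” trick can be sketched with simplified category strings (illustrative only): because the Det+Adj result is again exposed as Det, the same rule re-applies for any number of adjectives before the noun finally consumes the determiner.

```python
def step(exposed, incoming):
    """One L-R unification step over simplified categories."""
    if exposed == "Det" and incoming == "Adj":
        return "Det"   # Det + Adj unify, result still exposed as Det
    if exposed == "Det" and incoming == "N":
        return "NP"    # the noun takes the determiner as its specifier
    raise ValueError(f"no rule for {exposed} + {incoming}")

def parse_np(tags):
    """Fold the step rule over a tag sequence, left to right."""
    exposed = tags[0]
    for tag in tags[1:]:
        exposed = step(exposed, tag)
    return exposed
```

Because the loop never grows the exposed structure, no extra tree nodes are posited per adjective, which is the flat-structure intuition from the start of the deck.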
  • 15. Grammar Rules • Determiner-Adjective Resulting Structure (diagram: the unified Det-Adj structure for “the angry”)
  • 16. Grammar Rules • Determiner-Adjective Resulting Structure + NP (diagram: the Det-Adj structure unified with the NP chain posited by “dog”, yielding the full structure for “the angry dog”)
  • 17. Grammar Rules • Expose the resulting structure from the Det-Adj unification as just the Det structure (diagram: the Det and Adj chains, each an XP over an NP, with a Spr link from Det toward the N)
  • 18. Grammar Rules • Expose the resulting structure from the Det-Adj unification as just the Det structure (diagram: as before, with the Det chain marked Border and the Adj chain marked Frontier)
  • 19. Grammar Rules • Expose the resulting structure from the Det-Adj unification as just the Det structure (diagram: Same links identify the Border and Frontier XP and NP nodes, collapsing the two chains)
  • 20. Grammar Rules

    <!--Pre-head Adjective Modifier w/ Det: Shift Border-->
    <constraint shouldFalsify="false">
      Border(?ba, ?t0, ?w)^
      Border(?bb, ?t0, ?w)^
      Frontier(?fa, ?t1, ?w)^
      Frontier(?fb, ?t1, ?w)^
      Meets(?t0, ?t1, E, ?w)^
      PartOf(?ba, ?bb, E, ?w)^
      PartOf(?fa, ?fb, E, ?w)^
      IsA(?ba, Determiner, E, ?w)^
      IsA(?bb, Noun, E, ?w)^
      IsA(?fa, Adjective, E, ?w)^
      IsA(?fb, Noun, E, ?w)
      ==>
      Same(?bb, ?fb, E, ?w)^
      Border(?ba, ?t1, ?w)
    </constraint>

    <!--Subcategorization Rules: NP Specifier-->
    <constraint shouldFalsify="false">
      Border(?ba, ?t0, ?w)^
      Border(?bb, ?t0, ?w)^
      Frontier(?fa, ?t1, ?w)^
      Frontier(?fb, ?t1, ?w)^
      Meets(?t0, ?t1, E, ?w)^
      PartOf(?ba, ?bb, E, ?w)^
      PartOf(?fa, ?fb, E, ?w)^
      IsA(?ba, Determiner, E, ?w)^
      IsA(?bb, Noun, E, ?w)^
      Specifier(?fa, ?spr, E, ?w)^
      IsA(?spr, Determiner, E, ?w)^
      IsA(?fb, Noun, E, ?w)^
      Heard(?wue, E, ?w)^
      IsA(?wue, WordUtteranceEvent, ?t1, ?w)
      ==>
      Same(?ba, ?spr, E, ?w)^
      Same(?bb, ?fb, E, ?w)^
      Border(?wue, ?t1, ?w)^
      _NPSPR(?ba, ?bb, ?fa, ?fb, E, ?w)
    </constraint>
  • 21. Grammar Rules [send] [john] [a] [message] [that] [says] [“hi”].
  • 22. Grammar Rules [V send] [N john] [Det a] [N message] [RelP that] [V says] [Q “hi”].
  • 23.–30. Grammar Rules (diagrams: eight successive snapshots of the L-R parse of [V send] [N john] [Det a] [N message] [RelP that] [V says] [Q “hi”], showing the XP/NP/VP chains posited by each word unifying step by step until only the final VP structure remains)
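The search-avoidance intuition behind this walkthrough can be sketched as a toy detector for the one genuine ambiguity point: after a (possibly ditransitive) verb, a second NP may be a second object or the start of a modifier of the first object. The tags and logic are simplified stand-ins, not the parser's real machinery.

```python
def ambiguity_points(tags):
    """Return indices of NPs that are ambiguous in a V NP NP window;
    every other step of the L-R parse proceeds deterministically."""
    points = []
    for i in range(len(tags) - 2):
        if tags[i] == "V" and tags[i + 1] == "NP" and tags[i + 2] == "NP":
            points.append(i + 2)  # the second NP triggers search
    return points

# "send john a message that says 'hi'", with the NPs already chunked:
tags = ["V", "NP", "NP", "RelP", "V", "Q"]
```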
  • 31. Grammar Rules • Benefits of this feature-structure unification parse: • It captures the intuition that when we hear a word and posit its feature structure, we can infer the existence not only of the word’s direct feature structure (usually generated by lexical rules) but also of additional structures, their head/dependency relationships, and some definition of the values in those structures. • Ambiguities (e.g. the head of an NP) are resolved from L-R through lazy definitions and unification of under-defined structures with well-defined structures in terms of particular features. • It posits no more structure in the parse tree than is necessary to reflect a parse, whereas theories like HPSG posit a large number of structures in a branching tree in order to preserve the recursivity of their grammar rules. • We have shown that with feature structure unification, at least in theory, we can preserve the recursivity of many of the rules without requiring a left- or right-branching structure. • All of the structure necessary to build a parse is known from the beginning.
  • 32. Grammar Rules! • The future: • Ungrammaticality: when objects aren’t where they are supposed to be, search for a likely head-dependency relationship • Missing arguments: “Car is big.” • Extra words (it is rare for full content words to be considered extra, but it occurs in natural language: “I saw the, um, car.”) • Dependents out of order: “Give the car me.” • Dangling dependent: “ • This will require a good branch-and-bound system that only performs search when what is reasonably expected/predicted is violated. • Give a feature-structure unification account of garden-path sentences • Should be fairly natural given the L-R predictive nature of the parser • Attach a semantic representation that generates word senses based on head-dependency relationships. • Syntax should be closely tied to semantics, in that each serves to help compute the other to varying degrees. • Examine discourse from a syntactic perspective and syntax from a discourse perspective, and use both to disambiguate simultaneously.
  • 33. Notes on Theory (boring) • By having a lexical representation that is closely tied to the syntax, a number of advantages fall out: • Parsimony: by allowing a lot of information to be loosely defined/undefined at the lexical level, we do not need to posit additional lexical entries to cover all possible configurations of a phrase’s arguments, nor do we need an excessive number of lexical rules to generate these representations. • Generativity: a word’s sense is at least in part generated by its relationship to its dependents and head, and the semantic/syntactic types of these dependents/heads can in theory compute a word’s sense on the fly (inspired by GL theory from Pustejovsky). • Context embedding: by tying your theory of the lexicon closely to syntactic theory, you move towards embedding your lexical representation in a cognitive system that is closely tied to the way words are ACTUALLY used.
  • 34. Lexical Mosaics • Thus, we can see that the senses of words come from a number of different sources: • Memory • Syntactic context • Pragmatic/discourse factors • The hope for future research is to tie these together in an organized way, giving a theory of lexical representation that is closely tied to these factors in a computable and tractable manner. • Early goals: • Compute word senses from syntactic context + memory (very difficult) • Use syntactic context to disambiguate lexical ambiguity • Use generative word sense to disambiguate syntactic ambiguity • Simultaneously attempt to give a computational account of lexical memory, syntactic parsing, and pragmatics/discourse.