RPI Issues in Cognitive Science Speakers Series

Spring 2011


- 1. An Interpretation-Driven Model of Syntax<br />Richard Caneba <br />canebr@rpi.edu<br />RPI Cognitive Science Department<br />Human-Level Intelligence Laboratory<br />5/2/2011<br />
- 2. Introduction<br />We start with a goal: Develop a system that can understand natural language.<br />5/2/2011<br />
- 3. Introduction<br />We start with a goal: Develop a system that can understand natural language.<br />(Roughly) three sub-goals:<br />Syntactic Parsing<br />Semantic Representation<br />Pragmatics/Discourse<br />5/2/2011<br />
- 4. Introduction<br />We start with a goal: Develop a system that can understand natural language.<br />(Roughly) three sub-goals:<br />Syntactic Parsing<br />Semantic Representation<br />Pragmatics/Discourse<br />Stage 1: Syntax Why is syntax important for natural language understanding?<br />5/2/2011<br />
- 5. Introduction: Syntax<br />“The dog bit the man.”<br />5/2/2011<br />
- 6. Introduction: Syntax<br />“The dog bit the man.”<br />5/2/2011<br />
- 7. Introduction: Syntax<br />“The man bit the dog.”<br />5/2/2011<br />
- 8. Introduction: Syntax<br />“The man bit the dog.”<br />5/2/2011<br />
- 9. Introduction: Syntax<br />“I hit the man with my car.”<br />5/2/2011<br />
- 10. Introduction: Syntax<br />5/2/2011<br />“[I] hit [the man] [with my car].”<br />
- 11. Introduction: Syntax<br />“[I] hit [the man] [with my car].”<br />5/2/2011<br />Option 1: “with my car” modifies “hit”: “I hit the man while driving my car.”<br />

- 12. Introduction: Syntax<br />“[I] hit [the man] [with my car].”<br />5/2/2011<br />Option 2: “with my car” modifies “the man”: “I hit the man who had my car.”<br />
- 13. Introduction: Syntax<br />Syntactic interpretation yields very distinct semantic interpretations.<br />5/2/2011<br />
- 14. Introduction: Syntax<br />Syntactic interpretation yields very distinct semantic interpretations.<br />Helps identify the role words play in an utterance.<br />5/2/2011<br />
- 15. Introduction: Syntax<br />Thus, syntax plays a fundamental role in natural language understanding.<br />5/2/2011<br />
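The attachment ambiguity above is easy to make concrete. A toy sketch (hypothetical tuple representation, not the model presented in these slides): encode the two parses of “I hit the man with my car” as nested trees and read off which phrase the PP modifies.

```python
# Two parses of "I hit the man with my car", as nested (label, children) tuples.
# Hypothetical toy encoding for illustration only.

VP_ATTACH = ("S", [("NP", ["I"]),
                   ("VP", [("V", ["hit"]),
                           ("NP", [("DP", ["the"]), "man"]),
                           ("PP", ["with", ("NP", ["my", "car"])])])])

NP_ATTACH = ("S", [("NP", ["I"]),
                   ("VP", [("V", ["hit"]),
                           ("NP", [("DP", ["the"]), "man",
                                   ("PP", ["with", ("NP", ["my", "car"])])])])])

def pp_attachment(tree):
    """Return the label of the phrase that directly contains a PP child."""
    label, children = tree
    for child in children:
        if isinstance(child, tuple):
            if child[0] == "PP":
                return label
            found = pp_attachment(child)
            if found:
                return found
    return None

print(pp_attachment(VP_ATTACH))  # "VP": instrumental reading (hit using the car)
print(pp_attachment(NP_ATTACH))  # "NP": the man who had my car
```

Same words, different attachment site, very different meaning — the point of slides 10–13.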
- 16. Syntax<br />Current grammar formalisms have a number of shortcomings<br />5/2/2011<br />

- 17. Syntax<br />Current grammar formalisms have a number of shortcomings<br />From the perspective of trying to develop a system to understand natural language<br />5/2/2011<br />

- 18. Syntax<br />Current grammar formalisms have a number of shortcomings<br />From the perspective of trying to develop a system to understand natural language<br />Focus on generative grammar (e.g. Chomskyan, HPSG [11], [13])<br />5/2/2011<br />
- 19. Syntax: Theory & Implementation<br />Two classes of shortcomings:<br />Theoretical: a shortcoming in the way a theory of language represents the user’s mental knowledge of that language.<br />Implementation: a shortcoming in the way a theory implies or represents language processing in terms of computability and/or cognitive realism.<br />5/2/2011<br />

- 20. Syntax: Theory & Implementation<br />We will show that these two classes are so closely related that it does not make sense to draw a strong distinction<br />5/2/2011<br />

- 21. Theoretical Shortcomings<br />A shortcoming in the way a theory of language represents the user’s mental knowledge of that language<br />5/2/2011<br />
- 22. Theoretical Shortcomings<br />Interpretability<br />5/2/2011<br />
- 23. Theoretical Shortcomings<br />e.g. “Fido bit dog.”<br />5/2/2011<br />

- 24. Theoretical Shortcomings<br />e.g. “Fido bit dog.”<br />cf. “Fido bit the dog.” <br />5/2/2011<br />
- 25. Why Syntax?<br />Why syntax at all, if we can still interpret ungrammaticality?<br />5/2/2011<br />
- 26. Why Syntax?<br />“Fido bit dog.”<br />cf. “Dog bit Fido.” <br />5/2/2011<br />
- 27. Theoretical Shortcomings<br />Phrasal Nesting<br />5/2/2011<br />
- 28. Theoretical Shortcomings<br />Phrasal Nesting<br />example: “the man in boston with the hat is here.”<br />5/2/2011<br />
- 29-32. 5/2/2011<br />“the man in boston with the hat is here.”<br />[Tree diagrams: two alternative phrasal nestings of the same sentence — the PP “with the hat” nested inside the PP “in boston”, versus both PPs attached directly to “man”]<br />
- 33. Implementation Shortcomings<br />Shortcoming in the way a theory implies or represents language processing in terms of computability and/or cognitive realism.<br />5/2/2011<br />
- 34. Implementation Shortcomings<br />Perceptual Ordering<br />5/2/2011<br />
- 35. Implementation Shortcomings<br />john<br />5/2/2011<br />
- 36. Implementation Shortcomings<br />john<br />saw<br />5/2/2011<br />
- 37. Implementation Shortcomings<br />john<br />saw<br />mary<br />5/2/2011<br />
- 38. Implementation Shortcomings<br />[Diagram: “saw” and “mary” combine first: [VP saw [NP mary]]]<br />5/2/2011<br />john<br />saw<br />mary<br />

- 39. Implementation Shortcomings<br />[Diagram: the full structure completes only at the end: [S [NP john] [VP saw [NP mary]]]]<br />5/2/2011<br />john<br />saw<br />mary<br />
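The perceptual-ordering point above can be sketched in code (an illustrative toy with made-up categories and rules, not the slides' grammar): processing left-to-right, “john” and “saw” sit unattached on the stack until “mary” arrives, because the [VP] and [S] nodes can only be built once their rightward material is present.

```python
# A minimal incremental (left-to-right) combiner for "john saw mary".
# Hypothetical sketch: the lexicon and rules are invented for illustration.

LEXICON = {"john": "NP", "saw": "V", "mary": "NP"}
RULES = {("V", "NP"): "VP", ("NP", "VP"): "S"}

def parse_incrementally(words):
    stack = []  # list of (category, subtree) pairs
    for w in words:
        stack.append((LEXICON[w], w))
        # Reduce greedily whenever the top two items match a rule.
        while len(stack) >= 2:
            (c1, t1), (c2, t2) = stack[-2], stack[-1]
            if (c1, c2) in RULES:
                new_cat = RULES[(c1, c2)]
                stack[-2:] = [(new_cat, (new_cat, t1, t2))]
            else:
                break
    return stack

print(parse_incrementally(["john", "saw", "mary"]))
```

After “john” and “saw” the stack holds two disconnected items; only the final word triggers the VP and then the S reduction.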
- 40. Implementation Shortcomings<br />Excessive Structure<br />5/2/2011<br />
- 41. Implementation Shortcomings<br />5/2/2011<br />the<br />tall<br />strong<br />angry<br />man<br />
- 42-46. Implementation Shortcomings<br />[Diagrams: each added adjective introduces another nested [NP] layer over “the tall strong angry man”]<br />5/2/2011<br />

- 47-52. Implementation Shortcomings<br />[Diagrams: adding the PP “in boston” compounds the nesting still further]<br />5/2/2011<br />
- 53. Syntax: Statistical Approach<br />Most currently successful parsing algorithms rely heavily on statistics.<br />However, inferences that require a notion of semantics are difficult<br />5/2/2011<br />

- 54. Syntax: Statistical Approach<br />e.g. “The couple walked in the park. He held her hand.”<br />5/2/2011<br />

- 55. Syntax: Statistical Approach<br />e.g. “The couple walked in the park. He held her hand.”<br />Statistics alone does not yield the anaphoric binding.<br />5/2/2011<br />
- 56. Chomskyan Approach<br />From [6], consider “Joe has put those raw potatoes in the pot.”<br />5/2/2011<br />
- 57. Chomskyan Approach<br />5/2/2011<br />
- 58. Chomskyan Approach<br />A simpler analysis is possible:<br />5/2/2011<br />
- 59. Chomskyan Approach<br />5/2/2011<br />
- 60. Chomskyan Approach<br />Illustrates that the way a theory is represented has very real computational consequences.<br />5/2/2011<br />
- 61. Other Notable Work<br />Other Cognitive Architecture based Parsers<br />[8] R. Lewis et al. developed a parser based on "immediate reasoning" in Soar<br />[9] R. Lewis et al. developed an activation-based parser model in ACT-R<br />[1][2][3] J. T. Ball et al. developed a parser based on the Double R Grammar model, for "synthetic teammate" development in ACT-R<br />5/2/2011<br />
- 62. Other Notable Work<br />However, each of these theories suffers from the shortcomings we’ve already seen.<br />5/2/2011<br />
- 63. Other Notable Work<br />However, each of these theories suffers from the shortcomings we’ve already seen.<br />Both parsers designed by Lewis rely on a CFG formalism<br />Ball’s ACT-R parser is not deeply integrated with reasoning<br />5/2/2011<br />

- 64. Other Notable Work<br />However, each of these theories suffers from the shortcomings we’ve already seen.<br />Both parsers designed by Lewis rely on a CFG formalism<br />Ball’s ACT-R parser is not deeply integrated with reasoning<br />These approaches are not well integrated with reasoning overall<br />5/2/2011<br />
- 65. Motivating Principles<br />To address these shortcomings from an interpretative perspective, four principles are motivated:<br />5/2/2011<br />
- 66. Motivating Principles<br />To address these shortcomings from an interpretative perspective, four principles are motivated:<br />a) The existence of satellite structures<br />b) Feature structure unification<br />c) Feature structure aggregation<br />d) Incrementality<br />5/2/2011<br />
- 67. Satellite Structures<br />When we hear a word, we infer the existence of other structures related to that word<br />5/2/2011<br />

- 68. Satellite Structures<br />When we hear a word, we infer the existence of other structures related to that word<br />e.g. “bit”<br />5/2/2011<br />

- 69. Satellite Structures<br />When we hear a word, we infer the existence of other structures related to that word<br />e.g. “bit”<br />[Diagram: “bit” posits satellite [NP] structures in its Subj and Obj positions]<br />5/2/2011<br />
- 70. Feature Structure Unification<br />With the existence of satellite structures, we can unify observed structures together<br />5/2/2011<br />
- 71-72. Feature Structure Unification<br />With the existence of satellite structures, we can unify observed structures together<br />[Diagram: the observed [NP] structures for “john” and “fido” unify with the Subj and Obj satellites posited by “bit”]<br />5/2/2011<br />
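A minimal sketch of the idea (hypothetical feature names and a deliberately simplified unifier — no reentrancy or variables, just recursive merging of nested dicts): “bit” posits satellite Subj/Obj [NP] structures, and the observed NPs unify into them.

```python
# Toy feature-structure unification; illustrative only, not the talk's model.

def unify(fs1, fs2):
    """Unify two feature structures; return None on a feature clash."""
    if isinstance(fs1, dict) and isinstance(fs2, dict):
        out = dict(fs1)
        for key, val in fs2.items():
            if key in out:
                merged = unify(out[key], val)
                if merged is None:
                    return None  # clash: incompatible values for the same feature
                out[key] = merged
            else:
                out[key] = val
        return out
    return fs1 if fs1 == fs2 else None

# "bit" posits satellite Subj/Obj NP slots; observed NPs unify in.
verb = {"head": "bit", "subj": {"cat": "NP"}, "obj": {"cat": "NP"}}
subj = {"subj": {"cat": "NP", "head": "john"}}
obj = {"obj": {"cat": "NP", "head": "fido"}}

sentence = unify(unify(verb, subj), obj)
print(sentence["subj"]["head"], sentence["obj"]["head"])  # john fido
```

A clash (e.g. unifying `{"cat": "NP"}` with `{"cat": "VP"}`) returns `None`, which is how an impossible combination is rejected.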
- 73. Feature Structure Aggregation<br />Sequential unification of multiple feature structures yields an “aggregating” feature structure<br />5/2/2011<br />
- 74. Feature Structure Aggregation<br />e.g. “the big dog”<br />5/2/2011<br />

- 75. Feature Structure Aggregation<br />e.g. “the big dog”<br />5/2/2011<br />big<br />the<br />dog<br />
- 76-79. Feature Structure Aggregation<br />e.g. “the big dog”<br />[Diagrams: the [NP] structures contributed by “the”, “big”, and “dog” are pair-wise unified into a single aggregated [NP]]<br />5/2/2011<br />
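Aggregation can be sketched the same way (flat dicts and invented feature names, purely illustrative): sequentially unifying the contributions of “the”, “big”, and “dog” grows one aggregate [NP] structure.

```python
# Toy aggregation sketch: sequentially unify the [NP] contributions of
# "the", "big", and "dog". Feature names are hypothetical.

from functools import reduce

def unify(fs1, fs2):
    """Merge two flat feature structures; return None on a value clash."""
    out = dict(fs1)
    for key, val in fs2.items():
        if key in out and out[key] != val:
            return None
        out[key] = val
    return out

contributions = [
    {"cat": "NP", "det": "the"},   # from "the"
    {"cat": "NP", "mod": "big"},   # from "big"
    {"cat": "NP", "head": "dog"},  # from "dog"
]

aggregate = reduce(unify, contributions)
print(aggregate)
# {'cat': 'NP', 'det': 'the', 'mod': 'big', 'head': 'dog'}
```

Each word contributes a compatible [NP] fragment; the shared `cat` feature unifies, and the distinct features accumulate into the aggregate.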
- 80. Incrementality<br />Human sentence comprehension is left-to-right, and incremental<br />Incrementally generate an interpretation<br />5/2/2011<br />
- 81. Architectural Implementation<br />Using these principles, we attempt to address the aforementioned shortcomings in a model<br />5/2/2011<br />
- 82. Architectural Implementation<br />Parser prototype implemented in the Polyscheme Cognitive Architecture.<br />5/2/2011<br />
- 83. Architectural Implementation<br />Parser prototype implemented in the Polyscheme Cognitive Architecture.<br />Use the model to process and interpret natural language input.<br />5/2/2011<br />
- 84. Architectural Implementation<br />Parsing driven by pair-wise pattern matching<br />5/2/2011<br />
- 85-86. Architectural Implementation<br />Parsing driven by pair-wise pattern matching<br />[Diagram: an underspecified feature structure ([XP]*) is matched pair-wise against [DetP] and [NP] patterns to form a combined [NP]]<br />5/2/2011<br />

- 87. Architectural Implementation<br />[Diagram: the structures for “the” ([DetP]) and “man” ([NP]) are matched and unified into a single [NP]]<br />5/2/2011<br />
- 88. Architectural Implementation<br />5/2/2011<br />IsA(?the, WordUtteranceEvent)<br />IsA(?the, Determiner)<br />PartOf(?d, ?dp)<br />IsA(?dp, PhraseUtteranceEvent)<br />IsA(?dp, Determiner)<br />PartOf(?dp, ?np1)<br />IsA(?np1, PhraseUtteranceEvent)<br />IsA(?np1, Noun)<br />Meets(?the, ?man)<br />IsA(?man, WordUtteranceEvent)<br />IsA(?man, CommonNoun)<br />Specifier(?man, ?detSpr)<br />IsA(?detSpr, Determiner)<br />PartOf(?man, ?np2)<br />IsA(?np2, PhraseUtteranceEvent)<br />IsA(?np2, Noun)<br /><br />Same(?dp, ?detSpr)<br />Same(?np1, ?np2)<br />[Diagram: the “the” and “man” structures from the previous slide, with ?dp/?detSpr and ?np1/?np2 identified]<br />
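The pair-wise match above can be sketched as forward inference over ground facts (a hypothetical, heavily simplified encoding — constants in place of Polyscheme's variables): when a determiner's phrase Meets a common noun that posits a determiner Specifier, the two posited structures are asserted to be the Same.

```python
# Toy sketch of the determiner/noun match; illustrative, not Polyscheme.

facts = {
    ("IsA", "the", "Determiner"),
    ("PartOf", "the", "dp"),
    ("Meets", "the", "man"),
    ("IsA", "man", "CommonNoun"),
    ("Specifier", "man", "detSpr"),
    ("IsA", "detSpr", "Determiner"),
}

def match_det_noun(facts):
    """If a determiner's phrase meets a noun that wants a determiner
    specifier, assert identity (Same) between the two posited structures."""
    same = set()
    for (_, det, dp) in {f for f in facts if f[0] == "PartOf"}:
        if ("IsA", det, "Determiner") not in facts:
            continue
        for (_, d2, noun) in {f for f in facts if f[0] == "Meets"}:
            if d2 != det or ("IsA", noun, "CommonNoun") not in facts:
                continue
            for (_, n2, spr) in {f for f in facts if f[0] == "Specifier"}:
                if n2 == noun and ("IsA", spr, "Determiner") in facts:
                    same.add(("Same", dp, spr))
    return same

print(match_det_noun(facts))  # {('Same', 'dp', 'detSpr')}
```

The derived `Same` fact plays the role of the `Same(?dp, ?detSpr)` conclusion on the slide: two independently posited structures are recognized as one.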
- 89. 5/2/2011<br />[Match rules: [DetP] + [N] → NP[DetP, N]; [AdjP] + [N] → NP[AdjP, N]; [DetP] + [AdjP] → NP[DetP, AdjP]]<br />

- 90. 5/2/2011<br />[Match rules: [P] + [NP] → PP[P, NP]; [V] + [NP] → VP[V, NP]; [NP] + [V] → VP[NP, V]]<br />
- 91. Architectural Implementation<br />"the tall man"<br />5/2/2011<br />

- 92. Architectural Implementation<br />"the tall man"<br />[“the” arrives, positing [NP [DetP the]]]<br />5/2/2011<br />

- 93-95. Architectural Implementation<br />"the tall man"<br />[“tall” arrives as [NP [AdjP tall]]; the two posited [NP] structures unify]<br />5/2/2011<br />

- 96-98. Architectural Implementation<br />"the tall man"<br />[“man” arrives and the aggregate unifies into a single [NP] covering “the tall man”]<br />5/2/2011<br />
- 99. Comparative Results<br />Compare to previous incarnation of syntactic parser in Polyscheme:<br />Loyal to CFG formalism<br />Relies very heavily on search<br />5/2/2011<br />
- 100. Comparative Results<br />New model has significant benefits:<br />Orders of magnitude faster (tens of minutes vs. seconds)<br />Wider coverage of sentences<br />5/2/2011<br />
- 101. Conclusions<br />Introduced a syntactic theory based on the principles of<br />Satellite Structure Positing<br />Feature Structure Unification<br />Feature Structure Aggregation<br />Incrementality<br />5/2/2011<br />
- 102. Conclusions<br />Working implementation in Cognitive Architecture that is:<br />Structurally Efficient<br />Computationally fast<br />Cognitively plausible<br />5/2/2011<br />
- 103. Contribution<br />We have presented a new grammatical formalism<br />Implemented in a cognitive architecture<br />Integrated with reasoning capabilities<br />Computationally efficient, cognitively plausible<br />Will lead towards a system that can understand natural language.<br />5/2/2011<br />
- 104. Future Directions<br />Many linguistic phenomena not mentioned here (not necessarily syntactic)<br />Developing a new lexical representation theory to support interpretive grammar<br />Integration with notions of pragmatics/discourse<br />Integrate theory into working applications<br />5/2/2011<br />
- 105. Questions?<br />5/2/2011<br />
- 106. Special thanks to (in no particular order):<br />Dr. Nick Cassimatis<br />Perrin Bignoli<br />JR Scally<br />Soledad Vedovato<br />John Borland<br />Hiroyuki Uchida<br />5/2/2011<br />
- 107. References<br />5/2/2011<br />[1] Ball, J. T. (2004). A Cognitively Plausible Model of Language Comprehension. Proceedings of the 13th Conference on Behavior Representation in Modeling and Simulation.<br />[2] Ball, J., Rodgers, S., & Gluck, K. (2001). Integrating ACT-R and Cyc in a large-scale model of language comprehension for use in intelligent agents. Artificial Intelligence.<br />[3] Ball, J., Heiberg, A., & Silber, R. (2007). Toward a large-scale model of language comprehension in ACT-R 6. In R. L. Lewis, T. A. Polk, & J. E. Laird (Eds.), Proceedings of the 8th International Conference on Cognitive Modeling (pp. 163-168).<br />[4] Ball, J. T., Heiberg, A., & Silber, R. (2005). Toward a Large-Scale Model of Language Comprehension in ACT-R 6: Construction-Driven Language Processing.<br />[5] Ball, J. T. (2004). A Cognitively Plausible Model of Language Comprehension. Proceedings of the 13th Conference on Behavior Representation in Modeling and Simulation.<br />[6] Culicover, P. W., & Jackendoff, R. (2006). The simpler syntax hypothesis. Trends in Cognitive Sciences, 10(9), 413-418. doi: 10.1016/j.tics.2006.07.007.<br />[7] Lewis, R. L. (1993). An architecturally-based theory of human sentence comprehension. Proceedings of the Fifteenth Annual Conference of the Cognitive Science Society (p. 108). Lawrence Erlbaum.<br />[8] Lewis, R. L., Newell, A., & Polk, T. A. (1989). Toward a Soar theory of taking instructions for immediate reasoning. Proceedings of the Eleventh Annual Conference of the Cognitive Science Society (pp. 514-521). Erlbaum.<br />[9] Lewis, R. L., & Vasishth, S. (2005). An Activation-Based Model of Sentence Processing as Skilled Memory Retrieval. Cognitive Science, 29(3), 375-419. doi: 10.1207/s15516709cog0000_25.<br />[10] Nivre, J. (2005). Dependency grammar and dependency parsing. MSI Report 05133, 1-32.<br />[11] Pollard, C., & Sag, I. (1994). Head-Driven Phrase Structure Grammar. Studies in Contemporary Linguistics. University of Chicago Press.<br />[12] Pulman, S. G. (1991). Basic Parsing Techniques: An Introductory Survey.<br />[13] Sag, I. A., Wasow, T., & Bender, E. (2003). Syntactic Theory: A Formal Introduction (2nd ed.). CSLI Publications.<br />
