
Issues W2011 Final


RPI Issues in Cognitive Science Speakers Series
Spring 2011



  1. An Interpretation-Driven Model of Syntax
     Richard Caneba (canebr@rpi.edu)
     RPI Cognitive Science Department, Human-Level Intelligence Laboratory
     5/2/2011
  2. Introduction: We start with a goal: develop a system that can understand natural language.
  3. Introduction: (Roughly) three sub-goals: syntactic parsing, semantic representation, and pragmatics/discourse.
  4. Introduction: Stage 1 is syntax. Why is syntax important for natural language understanding?
  5-6. Introduction: Syntax: "The dog bit the man."
  7-8. Introduction: Syntax: "The man bit the dog."
  9-10. Introduction: Syntax: "I hit the man with my car." / "[I] hit [the man] [with my car]."
  11. Introduction: Syntax: Option 1: "with my car" modifies "hit": "I hit the man while driving my car."
  12. Introduction: Syntax: Option 2: "with my car" modifies "the man": "I hit the man who had my car."
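The two attachment options can be made concrete as bracketings. A minimal sketch (the tuple encoding here is an illustrative assumption, not the talk's representation): the same word string yields two distinct trees.

```python
# Option 1: the PP attaches to the verb ("hit ... with my car").
verb_attach = ("S", ("NP", "I"),
                    ("VP", ("V", "hit"),
                           ("NP", "the man"),
                           ("PP", "with my car")))

# Option 2: the PP attaches to the object NP ("the man with my car").
noun_attach = ("S", ("NP", "I"),
                    ("VP", ("V", "hit"),
                           ("NP", ("NP", "the man"),
                                  ("PP", "with my car"))))

def leaves(tree):
    """Recover the word string (fringe) of a bracketed tree."""
    if isinstance(tree, str):
        return [tree]
    return [w for child in tree[1:] for w in leaves(child)]

# Identical word strings, distinct structures -> distinct meanings.
same_words = leaves(verb_attach) == leaves(noun_attach)
distinct_trees = verb_attach != noun_attach
```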
  13. Introduction: Syntax: Syntactic interpretation yields very distinct semantic interpretations.
  14. Introduction: Syntax: It helps identify the role words play in an utterance.
  15. Introduction: Syntax: Thus, syntax plays a fundamental role in natural language understanding.
  16. Syntax: Current grammar formalisms have a number of shortcomings.
  17. Syntax: This is from the perspective of trying to develop a system to understand natural language.
  18. Syntax: We focus on generative grammar (e.g. Chomskyan, HPSG [11], [13]).
  19. Syntax: Theory &amp; Implementation: Two classes of shortcomings. Theoretical: a shortcoming in the way a theory of language represents the user's mental knowledge of that language. Implementation: a shortcoming in the way a theory implies or represents language processing, in terms of computability and/or cognitive realism.
  20. Syntax: Theory &amp; Implementation: We will show that these two classes are so closely related that it does not make sense to draw a strong distinction between them.
  21. Theoretical Shortcomings: a shortcoming in the way a theory of language represents the user's mental knowledge of that language.
  22. Theoretical Shortcomings: Interpretability.
  23-24. Theoretical Shortcomings: e.g. "Fido bit dog." (cf. "Fido bit the dog.")
  25. Why Syntax?: Why have syntax at all, if we can still interpret ungrammatical utterances?
  26. Why Syntax?: "Fido bit dog." cf. "Dog bit Fido." [Illustration: word order changes which interpretation involving Fido we get.]
  27. Theoretical Shortcomings: Phrasal Nesting.
  28. Theoretical Shortcomings: Phrasal Nesting, e.g. "the man in boston with the hat is here."
  29-32. [Parse-tree diagrams for "the man in boston with the hat is here.", showing alternative nestings of the PPs "in boston" and "with the hat" within the subject NP.]
  33. Implementation Shortcomings: a shortcoming in the way a theory implies or represents language processing, in terms of computability and/or cognitive realism.
  34. Implementation Shortcomings: Perceptual Ordering.
  35-39. Implementation Shortcomings: [Animation: the words "john", "saw", "mary" arrive one at a time, left to right; the [VP] and [S] nodes of the phrase-structure tree can only be completed after later words have arrived.]
  40. Implementation Shortcomings: Excessive Structure.
  41-52. Implementation Shortcomings: [Animation: parsing "the tall strong angry man in boston"; each added modifier introduces another nested [NP] layer, plus a [PP] for "in boston", stacking up excessive structure.]
  53. Syntax: Statistical Approach: Most currently successful parsing algorithms rely heavily on statistics. However, inferences that require a notion of semantics are difficult.
  54. Syntax: Statistical Approach: e.g. "The couple walked in the park. He held her hand."
  55. Syntax: Statistical Approach: Statistics alone does not yield the anaphoric binding.
  56. Chomskyan Approach: From [6], consider "Joe has put those raw potatoes in the pot."
  57. Chomskyan Approach: [Tree diagram: the mainstream generative analysis.]
  58. Chomskyan Approach: A simpler analysis is possible:
  59. Chomskyan Approach: [Tree diagram: the simpler analysis.]
  60. Chomskyan Approach: This illustrates that the way a theory is represented has very real computational consequences.
  61. Other Notable Work: Other cognitive-architecture-based parsers: [8] R. Lewis et al. developed a parser based on "immediate reasoning" in Soar; [9] R. Lewis et al. developed an activation-based parser model in ACT-R; [1][2][3] J. T. Ball et al. developed a parser based on the Double R Grammar model, for "synthetic teammate" development in ACT-R.
  62. Other Notable Work: However, each of these theories suffers from the shortcomings we have already seen.
  63. Other Notable Work: Both parsers designed by Lewis rely on a CFG formalism, and Ball's ACT-R parser is not deeply integrated with reasoning.
  64. Other Notable Work: These approaches are not well integrated with reasoning overall.
  65-66. Motivating Principles: To address these shortcomings from an interpretive perspective, four principles are motivated: (a) the existence of satellite structures; (b) feature structure unification; (c) feature structure aggregation; (d) incrementality.
  67. Satellite Structures: When we hear a word, we infer the existence of other structures related to that word.
  68-69. Satellite Structures: e.g. "bit" [Diagram: hearing "bit" posits satellite Subj and Obj [NP] structures.]
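A minimal sketch of satellite-structure positing (the lexicon and feature names here are illustrative assumptions, not the model's actual representation):

```python
# Assumed toy lexicon: a transitive verb posits an expected
# subject NP and object NP as satellite structures.
SATELLITES = {
    "bit": ({"cat": "NP", "role": "Subj"},
            {"cat": "NP", "role": "Obj"}),
}

def posit_satellites(word):
    """On hearing a word, return the unfilled structures it leads us to expect."""
    return [dict(slot, filler=None) for slot in SATELLITES.get(word, ())]

slots = posit_satellites("bit")  # two unfilled NP slots, Subj and Obj
```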
  70. Feature Structure Unification: Given satellite structures, we can unify observed structures together.
  71-72. Feature Structure Unification: [Diagram: the observed NPs "john" and "fido" unify with the satellite Subj and Obj structures of "bit".]
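A minimal feature-structure unification sketch, using flat attribute-value dicts (an assumption for illustration, not Polyscheme's actual machinery): two structures unify when they agree on every shared feature, and the result merges their features.

```python
def unify(fs1, fs2):
    """Merge two feature structures; fail (None) on any conflicting feature."""
    merged = dict(fs1)
    for feat, val in fs2.items():
        if feat in merged and merged[feat] != val:
            return None  # feature clash: unification fails
        merged[feat] = val
    return merged

subj_slot = {"cat": "NP", "role": "Subj"}   # satellite posited by "bit"
john = {"cat": "NP", "head": "john"}        # observed structure

filled = unify(subj_slot, john)             # succeeds: categories agree
clash = unify(subj_slot, {"cat": "PP"})     # fails: category mismatch
```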
  73. Feature Structure Aggregation: Sequential unification of multiple feature structures yields an "aggregating" feature structure.
  74. Feature Structure Aggregation: e.g. "the big dog"
  75-79. Feature Structure Aggregation: [Animation: the [NP] structures for "the", "big", and "dog" unify, left to right, into a single aggregated [NP].]
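The aggregation step for "the big dog" can be sketched as sequential unification (feature names are illustrative assumptions): each word's structure unifies, left to right, into one aggregating [NP].

```python
def unify(fs1, fs2):
    """Merge two flat feature structures; fail (None) on a clash."""
    merged = dict(fs1)
    for feat, val in fs2.items():
        if feat in merged and merged[feat] != val:
            return None
        merged[feat] = val
    return merged

word_structures = [
    {"cat": "NP", "det": "the"},
    {"cat": "NP", "mod": "big"},
    {"cat": "NP", "head": "dog"},
]

aggregate = {}
for fs in word_structures:
    aggregate = unify(aggregate, fs)  # all share cat=NP, so each step succeeds
```

One flat structure accumulates the contributions of all three words, rather than one new phrasal node per word.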
  80. Incrementality: Human sentence comprehension is left-to-right and incremental; we incrementally generate an interpretation.
  81. Architectural Implementation: Using these principles, we attempt to address the aforementioned shortcomings in a model.
  82. Architectural Implementation: A parser prototype is implemented in the Polyscheme cognitive architecture.
  83. Architectural Implementation: We use the model to process and interpret natural language input.
  84. Architectural Implementation: Parsing is driven by pair-wise pattern matching.
  85-86. Architectural Implementation: [Diagram: an underspecified feature structure [XP]* is matched pairwise against [DetP] and [NP] structures built from [D] and [N].]
  87. Architectural Implementation: [Diagram: matching "the" (DetP) and "man" (NP) unifies them into a single co-indexed [1] structure.]
  88. Architectural Implementation: [The same match as Polyscheme propositions:]
      IsA(?the, WordUtteranceEvent)
      IsA(?the, Determiner)
      PartOf(?d, ?dp)
      IsA(?dp, PhraseUtteranceEvent)
      IsA(?dp, Determiner)
      PartOf(?dp, ?np1)
      IsA(?np1, PhraseUtteranceEvent)
      IsA(?np1, Noun)
      Meets(?the, ?man)
      IsA(?man, WordUtteranceEvent)
      IsA(?man, CommonNoun)
      Specifier(?man, ?detSpr)
      IsA(?detSpr, Determiner)
      PartOf(?man, ?np2)
      IsA(?np2, PhraseUtteranceEvent)
      IsA(?np2, Noun)
      => Same(?dp, ?detSpr), Same(?np1, ?np2)
  89. [Composition rules: [DetP][N] -> NP[DetP, N]; [AdjP][N] -> NP[AdjP, N]; [DetP][AdjP] -> NP[DetP, AdjP].]
  90. [Composition rules: [P][NP] -> PP[P, NP]; [V][NP] -> VP[V, NP]; [NP][V] -> VP[NP, V].]
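The pair-wise combination of adjacent constituents can be sketched as follows (the greedy leftmost-pair control strategy and the extra NP+N rule are assumptions for illustration, not the model's actual control structure):

```python
# Category-pair rewrite rules, mirroring the composition rules above.
RULES = {
    ("DetP", "N"):    "NP",
    ("AdjP", "N"):    "NP",
    ("DetP", "AdjP"): "NP",
    ("NP",   "N"):    "NP",   # assumed: an aggregating NP absorbs the head noun
    ("P",    "NP"):   "PP",
    ("V",    "NP"):   "VP",
    ("NP",   "V"):    "VP",
}

def combine(cats):
    """Repeatedly rewrite the leftmost adjacent pair that matches a rule."""
    cats = list(cats)
    changed = True
    while changed:
        changed = False
        for i in range(len(cats) - 1):
            pair = (cats[i], cats[i + 1])
            if pair in RULES:
                cats[i:i + 2] = [RULES[pair]]  # replace the pair with its result
                changed = True
                break
    return cats

# "the tall man" -> DetP AdjP N: DetP+AdjP -> NP, then NP+N -> NP
result = combine(["DetP", "AdjP", "N"])
```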
  91-98. Architectural Implementation: "the tall man" [Animation: "the" posits a [DetP] inside an expected [NP]; "tall" adds an [AdjP] that unifies into the same [NP]; "man" completes the aggregated, co-indexed [1] structure.]
  99. Comparative Results: We compare against the previous incarnation of the syntactic parser in Polyscheme, which was loyal to the CFG formalism and relied very heavily on search.
  100. Comparative Results: The new model has significant benefits: it is orders of magnitude faster (seconds, versus tens of minutes) and covers a wider range of sentences.
  101. Conclusions: We introduced a syntactic theory based on the principles of satellite structure positing, feature structure unification, feature structure aggregation, and incrementality.
  102. Conclusions: We have a working implementation in a cognitive architecture that is structurally efficient, computationally fast, and cognitively plausible.
  103. Contribution: We have presented a new grammatical formalism, implemented in a cognitive architecture, integrated with reasoning capabilities, and computationally efficient and cognitively plausible. It will lead towards a system that can understand natural language.
  104. Future Directions: Many linguistic phenomena are not mentioned here (not all necessarily syntactic); developing a new lexical representation theory to support interpretive grammar; integration with notions of pragmatics/discourse; integrating the theory into working applications.
  105. Questions?
  106. Special thanks to (in no particular order): Dr. Nick Cassimatis, Perrin Bignoli, JR Scally, Soledad Vedovato, John Borland, Hiroyuki Uchida.
  107. References:
      [1] Ball, J. T. (2004). A Cognitively Plausible Model of Language Comprehension. Proceedings of the 13th Conference on Behavior Representation in Modeling and Simulation.
      [2] Ball, J., Rodgers, S., & Gluck, K. (2001). Integrating ACT-R and Cyc in a large-scale model of language comprehension for use in intelligent agents. Artificial Intelligence.
      [3] Ball, J., Heiberg, A., & Silber, R. (2007). Toward a large-scale model of language comprehension in ACT-R 6. In R. L. Lewis, T. A. Polk, & J. E. Laird (Eds.), Proceedings of the 8th International Conference on Cognitive Modeling (pp. 163-168).
      [4] Ball, J. T., Heiberg, A., & Silber, R. (2005). Toward a Large-Scale Model of Language Comprehension in ACT-R 6: Construction-Driven Language Processing.
      [5] Ball, J. T. (2004). A Cognitively Plausible Model of Language Comprehension. Proceedings of the 13th Conference on Behavior Representation in Modeling and Simulation.
      [6] Culicover, P. W., & Jackendoff, R. (2006). The simpler syntax hypothesis. Trends in Cognitive Sciences, 10(9), 413-418. doi: 10.1016/j.tics.2006.07.007
      [7] Lewis, R. L. (1993). An architecturally-based theory of human sentence comprehension. Proceedings of the Fifteenth Annual Conference of the Cognitive Science Society (p. 108). Lawrence Erlbaum.
      [8] Lewis, R. L., Newell, A., & Polk, T. A. (1989). Toward a Soar theory of taking instructions for immediate reasoning. Proceedings of the Eleventh Annual Conference of the Cognitive Science Society (pp. 514-521). Erlbaum.
      [9] Lewis, R. L., & Vasishth, S. (2005). An Activation-Based Model of Sentence Processing as Skilled Memory Retrieval. Cognitive Science, 29(3), 375-419. doi: 10.1207/s15516709cog0000_25
      [10] Nivre, J. (2005). Dependency grammar and dependency parsing. MSI Report 5133, 1-32.
      [11] Pollard, C., & Sag, I. (1994). Head-Driven Phrase Structure Grammar. Studies in Contemporary Linguistics. University of Chicago Press.
      [12] Pulman, S. G. (1991). Basic Parsing Techniques: An Introductory Survey.
      [13] Sag, I. A., Wasow, T., & Bender, E. (2003). Syntactic Theory: A Formal Introduction (2nd ed.). CSLI Publications.
