
Debugging Chomsky's Hierarchy


Presentation at CIRCL - Connecting Innovative Research in CUNY Linguistics - an attempt to examine Chomsky’s initial assumptions, dating back to the 1950s, about the adequacy of Context Free Grammar (CFG) to represent natural language. The talk presents preliminary evidence that CFG can appropriately represent various language structures, even those claimed to be incompatible with it.

Published in: Education


  1. Debugging Chomsky's hierarchy - Adequacy of Context Free Grammar to represent language. By: Hussein Ghaly
  2. Chomsky’s Hierarchy: a containment hierarchy (strictly nested sets) of classes of formal grammars. Chomsky’s early work indicated that regular grammar and context-free grammar are inadequate to represent natural language.
  3. Motivation: For both Natural Language Processing and Language Acquisition, it is important to have an adequate formal grammar that: accurately represents the structure of the language, following Chomsky’s criterion of generating only grammatical sentences and no ungrammatical ones; can be represented computationally; is learnable from finite sentence input (the poverty-of-stimulus argument); and fits the spirit of minimalism by reducing the number of entities needed to represent natural languages. A thorough investigation of Chomsky’s hierarchy is therefore needed to identify such an adequate formal grammar, and, as a demonstration, a simple computational model based on the proposed grammar should be developed.
  4. Motivation - NLP Applications: Deciding on the grammaticality (syntactic completion) of a sentence is an important cue for turn-taking in human-robot dialogue (Skantze, 2017). The slide illustrates this with an exchange in which the user says “I would like to ask about ...”, pauses, and the system, mistaking the silence for completion, replies “Sorry I didn’t get that” just as the user finishes with “... the show times”.
  5. Motivation - Computational Challenges: Modern parsers are generally not built with rules to determine the grammaticality of sentences, and hence can accept and parse ungrammatical input (output shown from the Stanford Online Parser).
  6. Motivation - Learnability Challenges: Context sensitivity of a language is also associated with the language being unlearnable from input sentences alone, without an informant, according to Gold’s definition of learnability (Gold, 1967).
  7. Regular languages - Definition: Can be represented by Finite State Automata. “A finite state language is a finite or infinite set of strings (sentences) of symbols (words) generated by a finite set of rules (the grammar), where each rule specifies the state of the system in which it can be applied, the symbol which is generated, and the state of the system after the rule is applied.” (Chomsky, 1958)
  8. Regular languages - Examples: A finite state grammar can generate strings such as: baa, baaaa, baaaaa. It can also generate grammatical sequences, for example the simple sentence: John saw the big red happy dog (N V DET ADJ ADJ ADJ N).
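The finite-state idea behind these examples can be sketched as a small transition table. The states and transitions below are illustrative choices, not taken from the slide's diagram:

```python
# A minimal finite-state recognizer built from a transition table.
# States q0/q1/q2 and the example language (one "b" followed by one
# or more "a"s) are illustrative, not the slide's exact automaton.
def make_fsa(transitions, start, accepting):
    def accepts(symbols):
        state = start
        for sym in symbols:
            if (state, sym) not in transitions:
                return False  # no rule applies in this state
            state = transitions[(state, sym)]
        return state in accepting
    return accepts

# Language of baa, baaaa, baaaaa, ...
fsa = make_fsa({("q0", "b"): "q1", ("q1", "a"): "q2", ("q2", "a"): "q2"},
               start="q0", accepting={"q2"})

print(fsa("baaaa"))  # True
print(fsa("baa"))    # True
print(fsa("ab"))     # False
```

Each rule here is exactly a (current state, emitted symbol, next state) triple, matching Chomsky's definition quoted above.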
  9. Regular languages - Inadequacies: Regular/Finite State Grammar cannot represent embedding, mirror-image, and other phenomena in natural language (Chomsky, 1957).
  10. Context-Free Grammar (CFG): Also referred to as “Phrase Structure Grammar (PSG)”. A grammar made of rewrite rules (production rules) of the form X → Y, where X and Y are symbols. Symbols are terminal or nonterminal. For example, nonterminal rules: S → NP VP, VP → V NP; terminal rules: V → ate, NP → John, NP → chocolate. This can generate the sentence: John ate chocolate.
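The toy grammar above is small enough to enumerate exhaustively. The sketch below is a hypothetical illustration, not code from the talk; note that it also exposes the overgeneration these slides are concerned with, since the same rules derive "chocolate ate John":

```python
import itertools

# The slide's CFG, as a mapping from each nonterminal to its expansions.
rules = {
    "S": [["NP", "VP"]],
    "VP": [["V", "NP"]],
    "V": [["ate"]],
    "NP": [["John"], ["chocolate"]],
}

def generate(symbol):
    """Yield every word sequence derivable from `symbol`."""
    if symbol not in rules:  # terminal symbol
        yield [symbol]
        return
    for expansion in rules[symbol]:
        for parts in itertools.product(*(generate(s) for s in expansion)):
            yield [word for part in parts for word in part]

sentences = [" ".join(words) for words in generate("S")]
print(sentences)  # includes "John ate chocolate" (and 3 overgenerated variants)
```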
  11. Inadequacy of CFG - Constructional Homonymity
  12. Inadequacy of CFG - Auxiliaries: In order to account for the possible ways of forming verbs with auxiliaries (is taking, has been taking, will be taken), the following rules were introduced:
  13. Inadequacy of CFG - Verb Type and Transitivity: CFG is too limited to give a true picture of linguistic structure.
  14. Context-sensitivity - Necessity of transformation: In both Syntactic Structures and “Three Models for the Description of Language”, the discussion of the inadequacy of CFG/phrase structure grammar is accompanied by the presentation of transformational (context-sensitive) rules as the solution to this inadequacy. For example, the auxiliary formulation problem:
  15. Context-sensitivity - Transformational Grammar: Chomsky therefore proceeded with a transformational (context-sensitive) grammar, in which the kernel sentences are simple declarative sentences and surface forms are derived from them by transformations. For example, the kernel of the sentence “the food was eaten by the man” is “the man ate the food”, to which the passive transformation is applied.
  16. Context-sensitivity - Structural Ambiguity: Examples such as: the shooting of the hunters (meaning: the hunters shoot, or are being shot); the growling of lions (meaning: lions growl); the raising of flowers (meaning: flowers are being raised) show that transformations give more explanatory power for the meaning.
  17. CFG Adequacy Claims - Gazdar (1982): “Phrase structure grammar analyses can be at least as elegant and general and no more prone to counterexamples than the alternative transformational accounts of the same phenomena.” Gazdar’s approach uses complex symbols, consisting of a component that distinguishes between the X-bar and X lexical category, and a component which is a feature bundle.
  18. CFG Adequacy Claims - Pullum and Gazdar: “Compiler design for Context Free languages is a fairly well explored problem, but designing compilers for non-CFLs can be grossly more difficult”, in line with the thesis of Fodor (1975) that “language acquisition for a human learner is nothing more or less than the construction of a program to compile the natural language into the human machine code”.
  19. CFG Adequacy Claims - Verb Frames: In Chomsky’s example, the verb “fly” actually has multiple frames depending on its sense, showing that each structure is associated with a different sense of the verb. The same goes for the other structural ambiguities (shoot vs. raise & growl).
  20. New Model - CFG with finite flexible rules and labels: Instead of the set of rules and labels introduced by Chomsky for the auxiliary problem, let us use the following rules, which are also CFG rules, without transformations and without any AUX label.
  21. New Model → New labels. Labels: NP → the man, NP → the book, V0 → take, V → took, V → takes, VG → taking, VN → taken, BE → is, BEEN → been, BEING → being, BE0 → be, HAVE → has, MOD → will. Rules: S → NP VP, VP → V NP, V → BE VG, BE → MOD BE0, V → HAVE VN, V → MOD V0.
  22. New Model - Generating Sentences: Given the rules S → NP VP and VP → V NP, we have the derivation S → NP V NP. Since the label V applies directly to terminal symbols (takes, took), we can formulate: the man takes the book; the man took the book. The label V also expands to other nonterminal sequences (HAVE VN, BE VG, MOD V0): the man has taken the book; the man is taking the book; the man will take the book.
  23. New Model - Generating Sentences: Given the rules S → NP VP and VP → V NP, we have the derivation S → NP V NP. Given the rules V → BE VG and BE → MOD BE0, we can formulate: the man will be taking the book.
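The claim that these auxiliary rules derive the right sentences can be checked mechanically. The following sketch (an illustration, not the talk's own code) enumerates every sentence the rule set from the "New labels" slide derives, using a subset sufficient for the examples:

```python
import itertools

# Labels and rules from the "New Model" slides (subset for the examples).
rules = {
    "S": [["NP", "VP"]],
    "VP": [["V", "NP"]],
    "V": [["took"], ["takes"], ["BE", "VG"], ["HAVE", "VN"], ["MOD", "V0"]],
    "BE": [["is"], ["MOD", "BE0"]],
    "NP": [["the", "man"], ["the", "book"]],
    "VG": [["taking"]], "VN": [["taken"]], "V0": [["take"]],
    "HAVE": [["has"]], "MOD": [["will"]], "BE0": [["be"]],
}

def generate(symbol):
    """Yield every word sequence derivable from `symbol`."""
    if symbol not in rules:  # terminal symbol
        yield [symbol]
        return
    for expansion in rules[symbol]:
        for parts in itertools.product(*(generate(s) for s in expansion)):
            yield [word for part in parts for word in part]

sentences = {" ".join(words) for words in generate("S")}
print("the man will be taking the book" in sentences)  # True
print("the man has taken the book" in sentences)       # True
```

The derivation of the target sentence goes V → BE VG, then BE → MOD BE0, yielding "will be taking", with no AUX label and no transformation.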
  24. New Model - Passive Voice: An additional concern is how to generate sentences in the passive voice using CFG without generating ungrammatical sentences. One way of doing this: introduce an additional sentence-level rule: S → NP VP_passive; introduce verb-phrase-level passive rules: VP_passive → V_passive, VP_passive → V_passive BY NP; introduce a verb-level passive rule: V_passive → BE VN. All other rules remain intact.
  25. New Model - Generating Sentences with Passive: We can generate passive sentences using the rules S → NP VP_passive, VP_passive → V_passive, V_passive → BE VN: the book is taken (BE → is); the book will be taken (BE → MOD BE0); the book has been taken (BE → HAVE BEEN).
  26. New Model - Problem with “is being”: For the sentence “the book is being taken”, it might be tempting to introduce the rule BE → BE BEING. However, this would lead to generating ungrammatical sentences: *the man is being taking the book. Therefore, different rules should apply.
  27. New Model - Problem with “is being”: We simply introduce the rule BEING0 → BE BEING and make sure this rule applies only in the passive: V_passive → BEING0 VN (in addition to V_passive → BE VN). The possible passive constructions generated will then include “the book is being read”, in addition to: the book is read; the book has been read; the book will be read.
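The passive fragment, including the BEING0 fix, can be verified the same way. This hypothetical sketch confirms that "the book is being taken" is derived while the starred active sentence from the previous slide is not, since BEING0 appears only under V_passive:

```python
import itertools

# Passive-voice rules from the slides, plus the BEING0 rule for "is being".
rules = {
    "S": [["NP", "VP_passive"]],
    "VP_passive": [["V_passive"], ["V_passive", "BY", "NP"]],
    "V_passive": [["BE", "VN"], ["BEING0", "VN"]],
    "BEING0": [["BE", "BEING"]],
    "BE": [["is"], ["MOD", "BE0"], ["HAVE", "BEEN"]],
    "NP": [["the", "book"], ["the", "man"]],
    "VN": [["taken"]], "BY": [["by"]], "BEING": [["being"]],
    "MOD": [["will"]], "BE0": [["be"]], "HAVE": [["has"]], "BEEN": [["been"]],
}

def generate(symbol):
    """Yield every word sequence derivable from `symbol`."""
    if symbol not in rules:  # terminal symbol
        yield [symbol]
        return
    for expansion in rules[symbol]:
        for parts in itertools.product(*(generate(s) for s in expansion)):
            yield [word for part in parts for word in part]

sentences = {" ".join(words) for words in generate("S")}
print("the book is being taken" in sentences)            # True
print("the book is taken by the man" in sentences)       # True
print("the man is being taking the book" in sentences)   # False
```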
  28. New Model - Outcome: As demonstrated above, a finite set of context-free rules can represent natural language structures previously thought to be context-sensitive and to require transformation rules. These rules will generate only grammatical sentences and will not generate any ungrammatical ones.
  29. BONUS: Inadequacy of CFG - Swiss German (Shieber, 1985): Cross-serial semantic dependencies used to prove the non-context-freeness of natural language.
  30. New Model - Swiss German: From the data provided, it appears that the critical factor for grammaticality is the case of the noun (dative or accusative), depending on the verb used.
  31. New Model - Swiss German: The word order seems to be quite free.
  32. New Model - Swiss German: We can propose a finite set of rules for this construction (Relative Clause, RC), where NPa is an accusative NP, NPd a dative NP, Vd a verb with a dative object, and Va a verb with an accusative object: RC → NP NPd NPa Vd Va; RC → NP NPa NPd Vd Va; RC → NP NPd Vd NPa Va.
  33. New Model: Alternatively, we can propose certain merge rules for the two verbs, so that the composite structure is equivalent to a ditransitive verb with both a dative and an accusative object, which can occur in either order. An indicative example in High German, also with some freedom in word order: Ich habe ihm das Buch gegeben / Ich habe das Buch ihm gegeben (I have given him-dative the book-accusative). Vcomposite → Vd Va; RC → NP NPa NPd Vcomposite; RC → NP NPd NPa Vcomposite.
  34. Implementation - Hatshepsut Parser: Building on this set of rules and labels, the following parser implementation was developed as a shift-reduce parser, parsing sentences according to a finite set of labels and recursive rules:
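The slides do not reproduce the parser's source, so the following is only a minimal shift-reduce sketch over a subset of the "New Model" rules, with greedy reduction; the actual Hatshepsut implementation may differ:

```python
# A minimal shift-reduce recognizer over a subset of the "New Model" rules.
# Illustrative only: the real Hatshepsut parser is not shown in the slides.
RULES = [  # (left-hand side, right-hand side)
    ("NP", ("the", "man")), ("NP", ("the", "book")),
    ("V", ("takes",)), ("V", ("took",)),
    ("VG", ("taking",)), ("BE", ("is",)),
    ("V", ("BE", "VG")),
    ("VP", ("V", "NP")),
    ("S", ("NP", "VP")),
]

def parse(words):
    """Shift words onto a stack, reducing greedily; accept iff stack == [S]."""
    stack = []
    def reduce():
        changed = True
        while changed:
            changed = False
            for lhs, rhs in RULES:
                n = len(rhs)
                if tuple(stack[-n:]) == rhs:
                    del stack[-n:]       # pop the matched right-hand side
                    stack.append(lhs)    # push its label
                    changed = True
    for w in words:
        stack.append(w)
        reduce()
    return stack == ["S"]

print(parse("the man is taking the book".split()))  # True
print(parse("the man taking is the book".split()))  # False
```

Greedy reduction happens to suffice for this rule subset; a full parser over the complete rule set would need backtracking or a chart to resolve shift/reduce conflicts.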
  35. Parser Testing
  36. Conclusion: Context-Free Grammar can be an appropriate formalism for representing natural language, having addressed some of the concerns regarding its adequacy. This may have implications for learnability, given that we are dealing with a finite number of labels and rules, which can be examined as patterns by the learner. CFG can also be embedded in automatic parsing applications.
  37. Thank You! Questions? Hatshepsut Parser at: Email me:
  38. Inadequacy of CFG - Undecidability
  39. Inadequacy of CFG - Ambiguity