  1. Toward a pattern-based analysis of English resultatives: Presenting a new type of usage-based approach to grammatical constructions
     Masato YOSHIKAWA, Keio University
     ELSJ International Spring Forum 2010, April 24th, 2010
  2. 1. Introduction
  3. 1.1. Outline
     Theme: the Resultative Construction (RC, henceforth; e.g., (1))
       (1) John hammered the metal flat.
     Position: the usage-based view (e.g., Kemmer & Barlow 2000; Langacker 1987), based on the Pattern Lattice Model (Kuroda & Hasebe 2009; Kuroda 2009), a radically memory-based/exemplar-based model of language
     Methodology: quantitative research using the RC database collected by Boas (2003)
     Conclusion: the RC is a "mosaic" of partially similar conventional phrases
  4. 1.2. The aim of this talk
     The aims of this talk:
     - To show the possibility of a new approach to grammatical constructions based on the usage-based view; suggestion: "reductionist" approaches should not work
     - To contribute to a "memory-based" or "exemplar-based" theory of human linguistic knowledge (e.g., Bod 2006; Pierrehumbert 2001; Port 2007)
     What is implied:
     - Constructions of an abstract kind = psychologically unreal!?
     - Grammar = an epiphenomenon derived from analogical applications of conventionalized expressions!?
  5. 1.3. The organization of this talk
     Section 2: provides a brief sketch of the Pattern Lattice Model (PLM)
     Section 3: reports the details of the quantitative research
     Section 4: discusses the results of the research
     Section 5: summarizes the whole discussion; remarks on the remaining problems
     Section 6: acknowledgements and additional references
  6. 2. Background
     Presenting the Pattern Lattice Model (PLM)
  7. 2.1. Pattern Lattice Model (PLM)
     Pattern Lattice Model (PLM; Kuroda & Hasebe 2009; Kuroda 2009)
     Assumption 1: the linguistic knowledge we have in mind = a collection of concrete exemplars of linguistic experiences
     - Exemplars are considered almost equivalent to what we call "episodes" (e.g., Tulving 2002)
     - The underlying idea: the hypothesis of "full memory"
     Assumption 2: those exemplars are connected to a vast number of "indices"
     - Indices = abstract units of any kind (e.g., phonemes, morphemes, lexemes, etc.)
     - As for syntax, the relevant indices = "patterns", whose definition is given below
  8. 2.2. Patterns [1/3]
     Where do patterns come from?
     Segment an exemplar e (e.g., (1a)) into units of arbitrary size to make T(e) (e.g., (1b)):
     (1) a. John hammered the metal flat.
         b. [John, hammered, the metal, flat]
     [Diagram: e = "John hammered the metal flat" is segmented into T(e) = [John, hammered, the metal, flat]]
  9. 2.2. Patterns [2/3]
     Where do patterns come from?
     Replace each segment with a variable X (shown here as "_"); the products of this procedure = patterns:
     {[ _, hammered, the metal, flat], [John, _, the metal, flat], [John, hammered, _, flat], [John, hammered, the metal, _ ]}
     [Diagram: the four one-variable patterns derived from [John, hammered, the metal, flat]]
  10. 2.2. Patterns [3/3]
     Where do patterns come from?
     Perform the replacement recursively until all the segments are replaced with variables.
     The result = the pattern set P for e = P(e)
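The three-step procedure above can be sketched in Python. The talk mentions a self-made Python script, but that script is not shown; the function below is an assumed reconstruction of the described procedure (in modern Python, not the Python 2.6.5 used in the research), generating every pattern by replacing each non-empty subset of segments with the variable "_".

```python
from itertools import combinations

def patterns(segments):
    """Generate the pattern set P(e) for a segmentation T(e) by
    replacing every non-empty subset of segments with the variable
    '_'. The all-variable TOP pattern is included; the unreplaced
    exemplar itself is not."""
    n = len(segments)
    result = []
    for k in range(1, n + 1):                 # k = number of variable slots
        for positions in combinations(range(n), k):
            result.append(['_' if i in positions else s
                           for i, s in enumerate(segments)])
    return result

T_e = ['John', 'hammered', 'the metal', 'flat']
P_e = patterns(T_e)
print(len(P_e))                               # 2**4 - 1 = 15 patterns
```

For a segmentation of length n this yields 2^n − 1 patterns, which is why a procedure for separating useful patterns from useless ones (slide 13) becomes necessary as exemplars accumulate.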
  11. 2.3. Pattern Lattice
     What is a Pattern Lattice (PL)?
     - A hierarchical network of patterns
     - The partially ordered set where "≤" = the "is-a" relation
     The is-a relation here: for pi, pj ∈ P, pi is-a pj when pj matches pi
       e.g., x = [a, b, _, d], y = [a, _, _, d]: y matches x ⇒ x is-a y
     The TOP of the PL = a pattern composed only of variable(s)
     The BOTTOM of the PL = a set of exemplar(s)
     Shown diagrammatically in the next slide
  12. [Figure: the Hasse diagram of the PL, with patterns arranged by RANK; created using Pattern Lattice Builder]
  13. 2.4. Why PLM?
     PLM gives us:
     - A solid foundation for the usage-based view of language
     - A simple but powerful algorithm for pattern generation
     This means: the current Usage-based Model (e.g., Langacker 2000) = insufficient
     A pattern-based analysis = an approach based on PLM
     Note: PLM = only the beginning! We need an additional procedure which tells us which patterns are useful.
  14. 3. Research
  15. 3.1. Data
     The RC database collected by Boas (2003)
     - Contains about 6,000 examples of RCs obtained from the British National Corpus (BNC)
     - Downloadable at
     Manual coding: each sentence is annotated with
     1) the head noun of Argument 1 = "Object" if transitive / "Subject" if intransitive
     2) the head noun of Argument 2 = "Subject" if transitive / NONE if intransitive
     3) the verb
     4) the resultative predicate
  16. 3.1. Data in detail [1/4]
  17. 3.1. Data in detail [2/4]
  18. 3.1. Data in detail [3/4]
  19. 3.1. Data in detail [4/4]
  20. 3.2. Method
     VP extraction
     - Extract the VPs from the manually coded data
     - Tally the number of different VPs
     Pattern generation
     - Input the VPs into a self-made Python script (Python 2.6.5, on Windows) to get patterns
     - The tool employed ≠ what is shown in the ABSTRACT
     Calculate the z-score of each pattern p, i.e., z(p) = (f(p) − f̄(k)) / s(k), where
     - f(p) = the frequency of p; f̄(k) = the average frequency at rank k
     - s(k) = the standard deviation of the frequency at rank k
     The z-score tells us how productive and conventional a pattern is.
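The z-score step can be sketched as follows. This is an assumed reconstruction (the original script is not shown, and the rank k of a pattern is taken here to be its number of non-variable slots); the toy frequencies below are illustrative, not from the Boas data:

```python
from collections import defaultdict
from statistics import mean, pstdev

def rank(p):
    """Rank of a pattern = its number of non-variable slots (assumed)."""
    return sum(1 for slot in p if slot != '_')

def z_scores(freqs):
    """freqs maps each pattern (a tuple of slots) to its frequency
    f(p). Returns z(p) = (f(p) - mean_k) / s(k), where mean_k and
    s(k) are the mean and standard deviation of frequency over all
    patterns of the same rank k."""
    by_rank = defaultdict(list)
    for p, f in freqs.items():
        by_rank[rank(p)].append(f)
    stats = {k: (mean(fs), pstdev(fs)) for k, fs in by_rank.items()}
    return {p: ((f - stats[rank(p)][0]) / stats[rank(p)][1]
                if stats[rank(p)][1] else 0.0)
            for p, f in freqs.items()}

freqs = {('shoot', '_', 'dead'): 50, ('paint', '_', 'red'): 10,
         ('_', '_', 'dead'): 60, ('_', '_', 'flat'): 20}
print(z_scores(freqs)[('shoot', '_', 'dead')])   # 1.0
```

Standardizing within each rank is what lets patterns of different abstraction levels be compared: a raw frequency that is unremarkable among one-slot patterns can be exceptional among two-slot patterns.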
  21. 3.3. Results [1/2]
     Overview
     - 3,376 different VPs
     - 11,392 patterns (note: different from the number shown in the ABSTRACT)
     The "top" pattern: "shoot __ dead" (z = 43.6)
     "Superior" patterns: shown in the table at right (note: different from the table shown in the ABSTRACT)
  22. 3.3. Results [2/2]
  23. 4. Discussion
  24. 4.1. Variety of slot positions
     Inconsistency of slot positions among the top 100 patterns:
     - V = "X _ _": 5 pattern types
     - O = "_ Y _": 6 pattern types
     - R = "_ _ Z": 7 pattern types
     - VO = "X Y _": 8 pattern types
     - OR = "_ Y Z": 13 pattern types
     - VR = "X _ Z": 29 pattern types
     - VOR = "X Y Z": 32 pattern types
     Overall (for the patterns with z ≥ 1): V = 20; O = 10; R = 16; VO = 38; OR = 51; VR = 93; VOR = 106
     This may mean: the resultative construction = an inconsistent set??
  25. 4.2. Remarks
     Ubiquitous super-lexical patterns
     - VO, OR, VR, and VOR are ubiquitous
     - Suggestion: the RC = irreducible to lexical factors!?
     - One possibility: the RC = a mosaic of conventional patterns
     Bonus: additional examples (found in the Corpus of Contemporary American English, COCA; Davies 2008-)
     - "_ door open" → creak door open, buzz door open, etc.
     - RCs with additional verbs: "beat _ _" → beat ~ senseless (a new RP)
     Note: examples with the verb make ≠ RC!?
  26. 5. Concluding Remarks
  27. 5.1. Summary of this research
     This talk presented quantitative research on the Resultative Construction (RC) under the radically usage-based model called the Pattern Lattice Model (PLM).
     Findings:
     - The slot positions of the patterns = highly inconsistent
     - The productive patterns of the RC = highly lexically specific = concrete
     Conclusion: the RC = a mosaic of conventional patterns (e.g., shoot _ dead, _ door open, drive me mad, etc.)
     But unfortunately this is only a suggestion…
  28. 5.2. Remaining problems
     "Semi-"concreteness
     - The inputs employed to generate patterns = abstract arrays (= VOR) ≠ concrete item sequences (e.g., raw sentences)
     - This means: this research = NOT entirely usage-based
     No direct reference to psychological reality
     - Only the results of the corpus research were provided; a psychological experiment (or the like) will be needed
  29. 6. Acknowledgements and references
  30. 6.1. Acknowledgements
     Prof. Ippei INOUE (Keio University)
     Mr. Fuminori NAKAMURA (Keio University)
  31. 6.2. References
     Boas, H. 2003. A constructional approach to resultatives. Stanford: CSLI Publications.
     Bod, R. 2006. Exemplar-based syntax: How to get productivity from examples. The Linguistic Review, 23, 291-320.
     Davies, M. 2008-. The Corpus of Contemporary American English (COCA): 400+ million words, 1990-present. Available online at
     Kemmer, S., & Barlow, M. 2000. Introduction: A usage-based conception of language. In Barlow, M., & Kemmer, S. (eds.), Usage-based models of language (pp. vii-xxii). Stanford: CSLI Publications.
     Kuroda, K. 2009. Pattern lattice as a model of linguistic knowledge and performance. Proceedings of the 23rd Pacific Asia Conference on Language, Information and Computation.
     Kuroda, K., & Hasebe, Y. 2009. Modeling (human) knowledge and processing of natural language using pattern lattice. 15th Annual Meeting of the Japanese Society of Natural Language Processing, 670-673.
     Langacker, R. 1987. Foundations of cognitive grammar, Vol. 1: Theoretical prerequisites. Stanford: Stanford University Press.
     — — . 2000. A dynamic usage-based model. In Barlow, M., & Kemmer, S. (eds.), (pp. 1-63).
     Pierrehumbert, J. 2001. Exemplar dynamics: Word frequency, lenition and contrast. In Bybee, J., & Hopper, P. (eds.), Frequency and the emergence of linguistic structure (pp. 137-157). Amsterdam: John Benjamins.
     Port, R. 2007. How words are stored in memory: Beyond phones and phonemes. New Ideas in Psychology, 25, 143-170.
     Tulving, E. 2002. Episodic memory: From mind to brain. Annual Review of Psychology, 53, 1-25.
  32. Thank you for your attention