Combining Approaches for Identifying Metonymy Classes of Named Locations
Talk given at EPIA 2007, December 4, 2007, Guimarães, Portugal

Transcript

  • 1. Combining Approaches for Identifying Metonymy Classes of Named Locations. Sven Hartrumpf and Johannes Leveling, Intelligent Information and Communication Systems (IICS), University of Hagen (FernUniversität in Hagen), 58084 Hagen, Germany, firstname.lastname@fernuni-hagen.de. EPIA 2007, Dec. 4, Guimarães, Portugal
  • 2. Outline
    1 Introduction
    2 Metonymy Classes for Location Names
    3 Corpus Annotation with Metonymy Information
    4 Metonymy Classifiers
    5 Classifier Combination
    6 Evaluation Results
    7 Conclusion and Outlook
    References
  • 3. Figurative Speech
    Definition: Metonymy is a figure of speech in which a speaker uses one entity to refer to another that is related to it (Lakoff and Johnson, 1980).
    → senses different from the normal reading
    → identifying metonymy can be seen as word sense disambiguation
    → classification task
    Levels of classification: coarse (LITERAL/NON-LITERAL), medium (LIT/MET/MIX), fine
  • 4. Metonymy
    Typically, metonymy recognition experiments are on English texts
    Growing importance in research and applications:
      SemEval-1 task at ACL 2007 (Markert and Nissim, 2007): recognition of metonymic location and organization names
      Question Answering (Stallard, 1993)
      Machine Translation (Kamei and Wakao, 1992)
      Geographic Information Retrieval (Leveling and Hartrumpf, 2006)
  • 5. Metonymy Classes (Markert and Nissim, 2002)
    Medium  Fine                        Description
    LIT     literal                     literal, geographic sense
    MET     place-for-event             →event
            place-for-people:
              place-for-gov(ernment)    →people in government
              place-for-off(icials)     →people in official administration
              place-for-org(anization)  →organization at location
              place-for-pop(ulation)    →population
            place-for-product           →product from place
            othermet                    metonymy not covered by a regular pattern
    MIX     mixed                       literal and metonymic sense
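The three levels of classification collapse into one another in the obvious way. A minimal Python sketch of the fine-to-medium-to-coarse mapping, using only the class inventory from the table above (the helper name coarse_class is ours):

    # Mapping from fine metonymy classes to the medium and coarse levels,
    # following the Markert and Nissim (2002) scheme shown in the table above.
    FINE_TO_MEDIUM = {
        "literal": "LIT",
        "place-for-event": "MET", "place-for-gov": "MET", "place-for-off": "MET",
        "place-for-org": "MET", "place-for-pop": "MET", "place-for-product": "MET",
        "othermet": "MET",
        "mixed": "MIX",
    }
    MEDIUM_TO_COARSE = {"LIT": "LITERAL", "MET": "NON-LITERAL", "MIX": "NON-LITERAL"}

    def coarse_class(fine_label):
        """Collapse a fine class label to the coarse LITERAL/NON-LITERAL level."""
        return MEDIUM_TO_COARSE[FINE_TO_MEDIUM[fine_label]]

    print(coarse_class("place-for-off"))   # NON-LITERAL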
  • 6. Metonymy Classes (Markert and Nissim, 2002), continued. Example for literal:
    Seit Beginn des Kosovo-Krieges rekrutiert die UCK in DEUTSCHLAND Kämpfer. – 9951
    (Since the beginning of the Kosovo war, the UCK recruits fighters in GERMANY.)
  • 7. Metonymy Classes, continued. Example for place-for-event:
    Nach dem KOSOVO geht es in Makedonien und Montenegro weiter. – 6336
    (After KOSOVO, it will continue in Macedonia and Montenegro.)
  • 8. Metonymy Classes, continued. Example for place-for-off:
    ... DEUTSCHLAND (wird) mehr Geschick haben als Clinton. – 2435
    (... GERMANY will be more successful than Clinton.)
  • 9. Metonymy Classes, continued. Example for place-for-product:
    Politisch sollte die Unterschrift Belgrads unter RAMBOUILLET erzwungen werden. – 12087
    (Politically, Belgrade's signature under RAMBOUILLET was to be forced.)
  • 10. Metonymy Classes, continued. Example for othermet:
    Dabei ist AFRIKA auch bei dieser Zusammenstellung von Musik eher eine ideelle Klammer. – 8415
    (Yet in this compilation of music, too, AFRICA is more of a notional bracket.)
  • 11. Metonymy Classes, continued. Example for mixed:
    Die Friedensfahrt gewinnt im Osten DEUTSCHLANDS wieder stark an Renommee. – 1498
    (The Peace Race is again strongly gaining prestige in the east of GERMANY.)
  • 12. Data and Annotation (1/2)
    TüBa-D/Z corpus containing articles from the German newspaper taz (27,067 sentences with 500,628 tokens)
    Annotation levels:
      (PoS tags)
      NE tags (LOC, PER, ORG, and MISC)
      NE subclasses (e.g. first names, last names, and other parts of a name)
      Label corresponding to the medium and fine metonymy classification
    Example: token Africa → (NE, LOC, region, MET, othermet)
    → 1,515 (18.5%) of all toponyms are used in a nonliteral sense
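A minimal sketch of how one annotated token such as the Africa example above could be represented as a record; the field names are ours, the values are taken from the slide:

    # One annotated token, mirroring the example (NE, LOC, region, MET, othermet);
    # field names are ours, not the corpus format.
    from typing import NamedTuple

    class TokenAnnotation(NamedTuple):
        token: str
        pos_tag: str       # e.g. NE (proper noun)
        ne_tag: str        # LOC, PER, ORG, or MISC
        ne_subclass: str   # e.g. region, first name, last name
        medium: str        # medium metonymy class: LIT, MET, or MIX
        fine: str          # fine metonymy class

    africa = TokenAnnotation("Africa", "NE", "LOC", "region", "MET", "othermet")
    print(africa.fine)     # othermet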
  • 13. Data and Annotation (2/2)
    Annotation checking:
      Applied the variation (or inconsistency) detection tool DECCA (http://decca.osu.edu/)
      Used corrections supplied by the TüBa-D/Z corpus publishers
      Identified additional spelling errors by frequency analysis
    → Errors in the text and on the levels of PoS tags, NE tags, NE subclasses, and medium and fine metonymy classes
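DECCA is not shown here; the following is only a rough, hypothetical sketch of the kind of frequency analysis mentioned on the slide, flagging rare tokens that closely resemble frequent ones as spelling-error candidates (thresholds and the toy token list are invented):

    # Rough sketch of frequency-based spelling-error candidate detection
    # (not DECCA; thresholds and the toy data are made up for illustration).
    from collections import Counter
    from difflib import get_close_matches

    tokens = ["Deutschland"] * 50 + ["Deutschalnd"] + ["Kosovo"] * 20
    freq = Counter(tokens)
    frequent = [t for t, c in freq.items() if c >= 10]

    for tok, count in freq.items():
        if count < 3:  # rare token: compare against the frequent vocabulary
            near = get_close_matches(tok, frequent, n=1, cutoff=0.85)
            if near:
                print(f"possible misspelling: {tok!r} -> {near[0]!r}")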
  • 14. Frequency of Metonymy Classes
    Coarse       Medium      Fine               Frequency
    LITERAL      LIT         literal            6672
    NON-LITERAL  MET (1433)  place-for-event      55
                             place-for-gov        51
                             place-for-off       512
                             place-for-org       148
                             place-for-pop       340
                             place-for-product    10
                             othermet            317
                 MIX         mixed                82
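As a quick sanity check of our own, the class frequencies above reproduce the 1,515 non-literal toponyms (18.5%) quoted on slide 12:

    # Our own arithmetic: the fine-class frequencies sum to the non-literal
    # counts quoted on slide 12.
    met = 55 + 51 + 512 + 148 + 340 + 10 + 317   # = 1433 (MET)
    non_literal = met + 82                       # + MIX  = 1515
    total = 6672 + non_literal                   # + LIT  = 8187
    print(non_literal, round(100 * non_literal / total, 1))   # 1515 18.5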
  • 15. Metonymy Classifiers
    All classifiers are based on a memory-based learner, TiMBL (supervised learning)
    All classifiers implemented by different people
    Shallow classifier 1 (SC1): relies largely on features obtained from gazetteer lookup
    Shallow classifier 2 (SC2): includes features encoding ontological sorts from the context
    Deep classifier (DC): employs features from parse results (syntactico-semantic parsing with a semantically oriented computer lexicon)
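TiMBL itself is not shown; the sketch below uses scikit-learn's k-nearest-neighbour classifier as a stand-in to illustrate memory-based classification over symbolic features (the feature dicts and labels are invented):

    # Minimal memory-based (k-NN) classification sketch in the spirit of TiMBL,
    # using scikit-learn as a stand-in; feature dicts and labels are hypothetical.
    from sklearn.feature_extraction import DictVectorizer
    from sklearn.neighbors import KNeighborsClassifier

    train_feats = [
        {"token": "Deutschland", "pos": "NE", "gaz_country": True,  "prev_class": "LIT"},
        {"token": "Kosovo",      "pos": "NE", "gaz_country": False, "prev_class": "MET"},
    ]
    train_labels = ["LIT", "MET"]

    vec = DictVectorizer(sparse=False)          # one-hot encodes symbolic features
    X = vec.fit_transform(train_feats)

    knn = KNeighborsClassifier(n_neighbors=1)   # nearest-neighbour classification
    knn.fit(X, train_labels)

    test = vec.transform([{"token": "Deutschland", "pos": "NE",
                           "gaz_country": True, "prev_class": "MET"}])
    print(knn.predict(test))   # prints ['LIT']: nearest to the first training instance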
  • 16. Metonymy Classifier SC1
    Main features for training instances:
      109 features
      Character features (e.g. token starts with a capital letter?)
      Semantic entities (entity classes for the token, obtained from morpholexical analysis)
      PoS tags
      Gazetteer lookups (for cities, countries, etc.)
      Metonymy context (metonymy class of the token to the left)
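The 109 SC1 features are not listed individually on the slide; the sketch below extracts a handful of illustrative ones (the gazetteer sets and the function name are ours):

    # Hedged sketch of SC1-style feature extraction: a few illustrative features
    # only. The gazetteers are tiny hypothetical sets.
    CITY_GAZ = {"Hagen", "Guimarães"}
    COUNTRY_GAZ = {"Deutschland", "Portugal"}

    def sc1_features(token, pos_tag, prev_metonymy_class):
        return {
            "starts_upper": token[:1].isupper(),          # character feature
            "pos": pos_tag,                               # PoS tag
            "in_city_gazetteer": token in CITY_GAZ,       # gazetteer lookup
            "in_country_gazetteer": token in COUNTRY_GAZ,
            "prev_metonymy": prev_metonymy_class,         # class of the token to the left
        }

    print(sc1_features("Deutschland", "NE", "LIT"))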
  • 17. Metonymy Classifier SC2
    Main features for training instances:
      269 features
      Sentence context (lemma and distance to the location token)
      Word context (the first three and the last three characters of the token, PoS tag, position in the sentence, upper/lower case information, and word length)
      Metonymy context (metonymy class of the two preceding tokens)
      Ontological sorts (for words in the context, using a bit vector representation of a sort hierarchy)
      Sentence length (number of tokens)
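A minimal sketch of the bit-vector encoding of ontological sorts; the real sort inventory comes from HaGenLex, the one below is invented for illustration:

    # Hedged sketch: encode the ontological sorts of a word as a 0/1 vector.
    # The sort inventory below is hypothetical, not the HaGenLex hierarchy.
    SORTS = ["object", "location", "institution", "human", "event"]

    def sort_bitvector(sorts_of_word):
        """Return a 0/1 vector marking which ontological sorts apply to a word."""
        return [1 if s in sorts_of_word else 0 for s in SORTS]

    # e.g. a country name can be read as a location or an institution
    print(sort_bitvector({"location", "institution"}))   # [0, 1, 1, 0, 0]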
  • 18. Metonymy Classifier DC (1/2)
    Background:
      Syntactico-semantic parser (WOCADI) delivers features for the deep classifier
      Semantic result: MultiNet (multilayered extended semantic networks, Helbig (2006)); MultiNet nodes are disambiguated word readings (concepts)
      Syntactic result: dependency graph
      Important resource for the parser: the semantically oriented lexicon HaGenLex
  • 19. Metonymy Classifier DC (2/2)
    13 features:
      p-quality: quality of the parser result as a numerical value between 500 and 1000
      token: name token; type: name type (i.e. lemma)
      dep-rel: dependency relation leading to the governor (mother constituent)
      role: semantic role filled by the name
      appos-molec: name accompanied by a molecular apposition?
      adjective: lemma of a modifying adjective
      csister-ctype: lemma of the coordinated sister node, with compound reduction
      csister-entity: semantic entity value of the coordinated sister node
      mother-entity: semantic entity value of the mother constituent
      mother-sort: ontological sort of the mother constituent
      mother-type: type (i.e. lemma) of the mother
      mother-ctype: type (i.e. lemma) of the mother, with compound reduction
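A sketch of a deep-classifier training instance as a plain data record with the 13 fields named on the slide; the example values are invented placeholders:

    # Deep-classifier instance as a data record; field names follow the slide,
    # the example values are invented placeholders.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class DCInstance:
        p_quality: int                 # parser result quality, 500-1000
        token: str                     # name token
        type: str                      # name type (lemma)
        dep_rel: str                   # dependency relation to the governor
        role: str                      # semantic role filled by the name
        appos_molec: bool              # accompanied by a molecular apposition?
        adjective: Optional[str]       # lemma of a modifying adjective
        csister_ctype: Optional[str]   # coordinated sister lemma, compound-reduced
        csister_entity: Optional[str]  # semantic entity of the coordinated sister
        mother_entity: Optional[str]   # semantic entity of the mother constituent
        mother_sort: Optional[str]     # ontological sort of the mother constituent
        mother_type: Optional[str]     # lemma of the mother
        mother_ctype: Optional[str]    # mother lemma, compound-reduced

    example = DCInstance(950, "Deutschland", "deutschland", "SUBJ", "agent",
                         False, None, None, None, None, None,
                         "regierung", "regierung")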
  • 20. Classifier Combination
    Features for training instances:
      15 features
      Results for the location token (from SC1, SC2, and DC)
      Results for tokens in the context (from SC1, SC2, and DC)
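The combination can be read as stacking: the base classifiers' predicted classes for the location token and for context tokens become the features of a second memory-based learner. A hypothetical sketch (the predictions and the one-token context window are invented):

    # Hedged stacking sketch: base classifier outputs feed a second k-NN learner.
    # All predictions below are invented placeholders.
    from sklearn.feature_extraction import DictVectorizer
    from sklearn.neighbors import KNeighborsClassifier

    def combination_features(sc1, sc2, dc, sc1_ctx, sc2_ctx, dc_ctx):
        """Build the meta-level feature dict from base classifier outputs."""
        feats = {"sc1": sc1, "sc2": sc2, "dc": dc}
        for i, (a, b, c) in enumerate(zip(sc1_ctx, sc2_ctx, dc_ctx)):
            feats.update({f"sc1_ctx{i}": a, f"sc2_ctx{i}": b, f"dc_ctx{i}": c})
        return feats

    train = [combination_features("LIT", "LIT", "MET", ["LIT"], ["LIT"], ["LIT"]),
             combination_features("MET", "MET", "MET", ["LIT"], ["MET"], ["MET"])]
    gold = ["LIT", "MET"]

    vec = DictVectorizer(sparse=False)
    meta = KNeighborsClassifier(n_neighbors=1).fit(vec.fit_transform(train), gold)
    test = combination_features("MET", "MET", "MET", ["MET"], ["MET"], ["MET"])
    print(meta.predict(vec.transform([test])))   # prints ['MET']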
  • 21.–22. Results on Coarse Level (precision P and recall R in %)
    Class         SC1             SC2             DC              Combined
                  P      R        P      R        P      R        P      R
    LITERAL       89.16  93.74    93.36  93.71    94.01  36.71    95.13  94.83
    NON-LITERAL   64.36  49.83    71.81  70.63    82.31  32.87    77.54  78.61
  • 23.–24. Results on Medium Level (precision P and recall R in %)
    Class   SC1             SC2             DC              Combined
            P      R        P      R        P      R        P      R
    LIT     88.97  94.18    93.35  93.68    93.80  36.75    94.75  95.23
    MET     63.27  48.08    70.08  68.81    81.76  33.15    76.11  77.60
    MIX     54.29  23.17    22.35  23.17    26.67   4.88    75.00  18.29
  • 25.–26. Results on Fine Level (precision P and recall R in %)
    Class               SC1             SC2             DC              Combined
                        P      R        P      R        P      R        P      R
    literal             87.71  96.36    92.55  94.08    93.35  36.84    89.78  97.80
    mixed               55.88  23.17    17.72  17.07    22.22   4.88    85.71  14.63
    othermet            41.03  20.19    34.75  30.91    34.62   8.52    52.55  22.71
    place-for-event     37.50  10.91    12.50  12.73    46.67  12.73    30.00   5.45
    place-for-gov       42.11  15.69    20.00  13.73    63.64  13.73    87.50  13.73
    place-for-off       50.58  42.77    55.35  52.54    67.90  35.94    62.25  60.55
    place-for-org       42.86  14.19    42.31  37.16    51.52  11.49    55.79  35.81
    place-for-pop       30.87  13.53    45.29  43.82    52.35  22.94    61.15  28.24
    place-for-product    0.00   0.00     0.00   0.00     0.00   0.00     0.00   0.00
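Per-class precision and recall as reported in these tables can be computed as in the following sketch (the gold and predicted labels are toy data):

    # Per-class precision and recall, as used in the result tables above;
    # gold and predicted labels here are invented toy data.
    def precision_recall(gold, pred, cls):
        tp = sum(1 for g, p in zip(gold, pred) if g == cls and p == cls)
        fp = sum(1 for g, p in zip(gold, pred) if g != cls and p == cls)
        fn = sum(1 for g, p in zip(gold, pred) if g == cls and p != cls)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        return prec, rec

    gold = ["LIT", "LIT", "MET", "MIX", "MET"]
    pred = ["LIT", "MET", "MET", "LIT", "MET"]
    print(precision_recall(gold, pred, "MET"))   # precision 2/3, recall 1.0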
  • 27. Effect of Metonymy Support in the Lexicon
    Metonymy support   Sentence constraint   #Sentences   Parse results (%)
                                                          Full    Chunks  Failed
    no                 NON-LITERAL            1,124       47.15   37.46   15.39
    no                 no constraint         27,067       54.08   31.09   14.83
    yes                NON-LITERAL            1,124       52.40   32.21   15.39
    yes                no constraint         27,067       53.60   31.19   15.21
  • 28. Conclusion and Outlook
    Classifiers differ in their strengths and weaknesses (for example, the deep method shows the highest precision values, but recall values are low because they are limited by the parser coverage)
    → The combined classifier outperforms each single classifier significantly
    Created a new resource about metonymy in German
    Metonymy support in the lexicon improves the results of the syntactico-semantic parser
    Future work: investigate the semantic representation of metonymic names; application to QA and GIR
  • 29. Selected References
    Helbig, Hermann (2006). Knowledge Representation and the Semantics of Natural Language. Berlin: Springer. URL http://www.springer.com/sgw/cda/frontpage/0,11855,1-40109-22-72041224-0,00.html
    Kamei, Shin-ichiro and Takahiro Wakao (1992). Metonymy: Reassessment, survey of acceptability, and its treatment in machine translation systems. In Proceedings of the 30th Annual Meeting of the Association for Computational Linguistics (ACL'92), pp. 309–311. Newark, Delaware.
    Lakoff, George and Mark Johnson (1980). Metaphors We Live By. Chicago University Press.
    Leveling, Johannes and Sven Hartrumpf (2006). On metonymy recognition for GIR. In Proceedings of GIR-2006, the 3rd Workshop on Geographical Information Retrieval (hosted by SIGIR 2006). Seattle, Washington. URL http://www.geo.unizh.ch/~rsp/gir06/papers/individual/leveling.pdf
    Markert, Katja and Malvina Nissim (2002). Towards a corpus annotated for metonymies: The case of location names. In Proceedings of the 3rd International Conference on Language Resources and Evaluation (LREC 2002). Las Palmas, Spain.
    Markert, Katja and Malvina Nissim (2007). Task 08: Metonymy resolution at SemEval-07. In Proceedings of SemEval 2007.
    Stallard, David (1993). Two kinds of metonymy. In Proceedings of the 31st Annual Meeting of the Association for Computational Linguistics (ACL'93), pp. 87–94. Columbus, Ohio. URL http://www.aclweb.org/anthology/P93-1012
