NS-CUK Seminar: J.H.Lee, Review on "Abstract Meaning Representation for Sembanking", ACL 2013

Mar. 31, 2023

  1. Jooho Lee, School of Computer Science and Information Engineering, The Catholic University of Korea. E-mail: jooho414@gmail.com. 2023-03-31
  2. Outline • Introduction (Problem statement, Contributions) • Related Works • Methodology
  3. Introduction: Problem Statement • Lack of context in natural language processing: it is sometimes difficult to capture the sentence structure and context of a text effectively. • Data sparsity in natural language processing: text data becomes increasingly sparse as its length grows. • Polysemy in natural language processing: the same words or phrases can carry different meanings, which makes it hard for NLP models to resolve the intended sense from context. • Unstructured data in natural language processing: text data is unstructured and therefore difficult to process.
  4. Introduction: Contributions • Language integration: AMR is not tied to a specific language and provides a foundation for integrating and processing different languages. • Preservation of semantic connectivity: AMR converts sentences into semantic structures while preserving the semantic relations between words, which allows the meaning of a sentence to be understood more accurately.
  5. Related Works • Public Wisdom Matters! Discourse-Aware Hyperbolic Fourier Co-Attention for Social-Text Classification, Conference on Neural Information Processing Systems, 2021
  6. Related Works
  7. Methodology: How to construct an AMR graph? • Nodes: an AMR graph consists of a set of nodes; each node represents a concept or entity and carries the word or phrase describing it. • Edges: relationships between nodes are represented by edges. An edge connects two nodes and expresses the semantic relation between them; for example, an :ARG0 edge typically links a predicate (verb) to its agent, such as the subject of the sentence. • Root: an AMR graph has a single root node, and every other node is reachable from it. • Values: a node can carry a constant value, such as a string, a number, or a time expression. • Repeated concepts: when the same concept is referred to more than once, its variable is reused (a re-entrancy) instead of creating a new node, so the graph records that it is the same concept. • Negated concepts: negation is marked with a :polarity edge (value -). • Date and time: dates and times are normalized to a standard format (ISO 8601 style). These pieces are illustrated in the short sketch below.
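
To make these components concrete, here is a minimal sketch, assuming the third-party Python package `penman` (not part of the slides or the paper) is installed. It decodes a small AMR for "The boy does not want to go." and lists its triples, which exhibit a root, concept nodes, role edges, a constant value (:polarity -), and a re-entrant variable.

    # Minimal sketch; assumes the third-party `penman` package is installed
    # (pip install penman). It is not mentioned in the slides.
    import penman

    amr = """
    (w / want-01
       :polarity -
       :ARG0 (b / boy)
       :ARG1 (g / go-02
                :ARG0 b))
    """

    graph = penman.decode(amr)

    print("root:", graph.top)                 # the single root node, w
    for source, role, target in graph.triples:
        print(source, role, target)
    # :instance triples name each node's concept (want-01, boy, go-02);
    # :ARG0 / :ARG1 are edges between nodes; :polarity carries the
    # constant value "-"; the variable b appears as a target twice,
    # i.e. a re-entrancy for the repeated concept.

Running it prints the root variable w followed by one triple per line.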
  8. Methodology: How to construct an AMR graph? 1. Tokenize the sentence. 2. Allocate a concept node to each word token. 3. Add entity nodes. 4. Connect the nodes with role edges.
  9. Methodology: How to construct an AMR graph? Example: "John has a dog." 1. Tokens: ['John', 'has', 'a', 'dog', '.'] 2. Node allocation: John -> [n1], has -> [v1], a -> [d1], dog -> [n2], . -> [p1] 3. Concept/entity assignment: [n1 : person, name "John"], [v1 : have-rel], [d1 : op], [n2 : animal, name "dog"], [p1 : op] (step 4, connecting the nodes, is sketched below)
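
The slide stops after step 3, so the following is only a sketch of what step 4 (connecting the nodes) could look like for "John has a dog." The frame have-03, the role labels :ARG0/:ARG1, and the variable names are illustrative assumptions rather than the slide's own labels (which use have-rel), and the `penman` package is used only to hold and print the graph.

    # Hedged sketch of step 4; assumes the `penman` package and an
    # illustrative frame/role inventory (have-03, :ARG0, :ARG1).
    import penman

    triples = [
        ("h", ":instance", "have-03"),   # predicate node for "has"
        ("p", ":instance", "person"),    # concept node for "John"
        ("n", ":instance", "name"),
        ("n", ":op1", '"John"'),         # constant value: the name string
        ("p", ":name", "n"),             # entity node attached to the person
        ("d", ":instance", "dog"),       # concept node for "dog"
        ("h", ":ARG0", "p"),             # step 4: who has (the owner)
        ("h", ":ARG1", "d"),             # step 4: what is had (the dog)
    ]

    graph = penman.Graph(triples, top="h")   # "have" is the root / focus
    print(penman.encode(graph, indent=3))    # prints the PENMAN serialization

The printed form is roughly (h / have-03 :ARG0 (p / person :name (n / name :op1 "John")) :ARG1 (d / dog)).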

Editor's Notes

  1. I had already gone through all of the earlier papers that use propagation for rumor detection.