This document contains definitions of terminology-related terms from various sources:
- It defines terms such as term, preferred term, synonym, deprecated term, and nested term.
- It describes terminology work practices such as term extraction, terminology planning, and standardization.
- It explains terminology resources such as terminology databases and terminological entries that contain terms and their attributes for a given subject field.
Modular Ontologies: Package-based Description Logics Approach (Jie Bao)
This document provides an outline for a Ph.D. preliminary dissertation proposal on modular ontologies using package-based description logics. The outline discusses motivations for modular ontologies such as collaborative ontology building and selective reuse. It then covers key aspects of the proposed package-based description logics language including packages, package hierarchies, scope limitations, and semantics of importing between packages. The outline concludes by discussing reasoning approaches such as a federated tableau algorithm that allows distributed reasoning across ontology modules without requiring integration.
Introduction to Ontology Concepts and Terminology (Steven Miller)
The document introduces an ontology tutorial that will cover basic concepts of the Semantic Web, Linked Data, and the Resource Description Framework data model as well as the ontology languages RDFS and OWL. The tutorial is intended for information professionals who want to gain an introductory understanding of ontologies, ontology concepts, and terminology. The tutorial will explain how to model and structure data as RDF triples and create basic RDFS ontologies.
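The RDF triple model the tutorial covers can be sketched with plain tuples; this is an illustrative sketch (the resource names and the toy `types_of` inference are assumptions, not part of the tutorial), showing how a `(subject, predicate, object)` store supports basic RDFS subclass reasoning.

```python
# Minimal sketch of the RDF data model: each fact is a
# (subject, predicate, object) triple. Names are illustrative.
triples = {
    ("ex:Dog", "rdfs:subClassOf", "ex:Animal"),
    ("ex:rex", "rdf:type", "ex:Dog"),
    ("ex:rex", "ex:name", "Rex"),
}

def match(s=None, p=None, o=None):
    """Return all triples matching a pattern; None is a wildcard."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

def types_of(resource):
    """Basic RDFS entailment: an instance of a subclass is also an
    instance of every superclass reachable via rdfs:subClassOf."""
    found = {o for _, _, o in match(resource, "rdf:type")}
    changed = True
    while changed:
        changed = False
        for cls in list(found):
            for _, _, sup in match(cls, "rdfs:subClassOf"):
                if sup not in found:
                    found.add(sup)
                    changed = True
    return found
```

For example, `types_of("ex:rex")` yields both `ex:Dog` and the inferred `ex:Animal`.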
Ontology and Ontology Libraries: a Critical Study (Debashisnaskar)
The concept of the digital library gained popularity with the development of networking technology. A digital library stores various kinds of documents in digitized format, giving users smooth access to them at reduced cost. In the recent past, a similar concept, the ontology library, has gained popularity among communities such as the semantic web, artificial intelligence, information science, philosophy, and linguistics.
Ontology is the study of, or concern with, what kinds of things exist: what entities there are in the universe.
The word derives from the Greek onto (being) and logia (written or spoken discourse). It names a branch of metaphysics, the study of first principles or the root of things.
The document discusses different information retrieval models including boolean, vector, and probabilistic models. The boolean model uses set theory and boolean algebra to represent documents and queries. The vector model assigns weights to terms and ranks documents based on similarity to the query. The probabilistic model calculates the probability of a document being relevant given a query. It also covers structured models for different tasks like filtering and browsing.
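The vector model described above can be made concrete with a small sketch; the toy documents and query are assumptions for illustration. Terms are weighted by TF-IDF and documents are ranked by cosine similarity to the query vector.

```python
import math
from collections import Counter

def tf_idf_vectors(docs):
    """Weight each term by term frequency times inverse document frequency."""
    n = len(docs)
    df = Counter(t for d in docs for t in set(d))  # document frequency
    return [{t: tf * math.log(n / df[t]) for t, tf in Counter(d).items()}
            for d in docs]

def cosine(a, b):
    """Cosine similarity between two sparse term-weight vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Illustrative toy collection and query.
docs = [["ontology", "library", "semantic"],
        ["digital", "library", "access"],
        ["semantic", "web", "ontology"]]
vecs = tf_idf_vectors(docs)
query = {"ontology": 1.0, "semantic": 1.0}
ranking = sorted(range(len(docs)),
                 key=lambda i: cosine(query, vecs[i]), reverse=True)
```

Here the first document ranks highest because both query terms carry weight in it, while the second document shares no weighted term with the query.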
Automatically finding domain-specific key terms in a given set of research papers is a challenging task, and matching research papers to a particular area of research is a concern for many people, including students, professors, and researchers. A domain classification of papers facilitates that search: given a list of domains in a research field, we try to find out to which domain(s) a given paper is most related. Besides, reading and processing a whole paper takes a long time, and using domain knowledge requires much human effort, e.g., manually labeling a large corpus. In this paper, we use the abstract and keywords of a research paper as seed terms to identify similar terms from a domain corpus, which are then filtered by checking their appearance in the research papers. Experiments show that the TF-IDF measure and the classification step make this method more precise with respect to domains. The results show that our approach can extract terms effectively while being domain independent.
Lect6 - An introduction to ontologies and ontology development (Antonio Moreno)
The document provides an overview of ontologies and ontology development:
1. It defines ontologies as explicit specifications of conceptualizations in a domain that define concepts, properties, attributes, and relationships to enable knowledge sharing.
2. Ontology components include concepts, properties, restrictions, and individuals. Ontologies can range from single large ontologies to several specialized smaller ones.
3. OWL is introduced as the standard language for representing ontologies, with features like classes, properties, restrictions, and logical operators.
4. A general methodology for ontology development is outlined, including determining scope, reusing existing ontologies, enumerating terms, and defining classes, properties, and other components in an iterative process.
For efficient and innovative use of big data, it is important to integrate multiple databases across domains. For example, various public databases have been developed in life science, and finding novel scientific results using them is an essential technique. In social and business areas, open data strategies in many countries promote the diversity of public data, and combining big data with open data is a big challenge. That is, the diversity of datasets is a problem that big data must solve.
Ontology provides systematized knowledge for integrating multiple datasets across domains together with their semantics. Linked Data also provides techniques for interlinking datasets based on semantic web technologies. We consider that combining ontology and Linked Data on the basis of ontological engineering can contribute to solving the diversity problem in big data.
In this talk, I discuss how ontological engineering could be applied to big data with some trial examples.
Anne Schumann (USAAR): Terminology and Ontologies 1 (RIILP)
This document provides an overview of terminology and ontologies. It discusses why terminology is important, including for expert communication, knowledge transfer, and management. Terms are defined as linguistic symbols that represent concepts, with the relationship between terms and concepts being one-to-one in terminology. Conceptual relations between concepts are also discussed, including hierarchical relations like "is-a" that define a concept's location within a concept system. The document emphasizes that terminology work should be concept-oriented, structuring concepts into organized concept systems.
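The hierarchical "is-a" relations described above locate each concept within a concept system; a minimal sketch (the tool concepts below are illustrative assumptions, not from the slides) walks up the broader-concept chain to answer subsumption questions.

```python
# A minimal concept system: each concept maps to its broader
# ("is-a") concept. Concept names are illustrative.
is_a = {
    "screwdriver": "hand tool",
    "hammer": "hand tool",
    "hand tool": "tool",
    "power drill": "power tool",
    "power tool": "tool",
}

def broader_chain(concept):
    """Locate a concept in the system by walking up its is-a chain."""
    chain = []
    while concept in is_a:
        concept = is_a[concept]
        chain.append(concept)
    return chain

def is_kind_of(concept, ancestor):
    """True if ancestor appears anywhere above concept in the hierarchy."""
    return ancestor in broader_chain(concept)
```

So "hammer" is a kind of "tool" via "hand tool", but not a kind of "power tool": the chain of broader concepts fixes each term's place in the system.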
Anne Schumann (USAAR): Terminology and Ontologies 2 (RIILP)
This document discusses current research topics in terminology and ontologies. It covers trends like term variation, culture-specific semantic differences, definitions, contexts, and knowledge-rich contexts. It also discusses term extraction and mapping. Key areas of research include improving techniques for specialised domains, identifying term variants, providing richer semantic descriptions, and supporting terminological workflows and users.
Application of Ontology in Semantic Information Retrieval by Prof Shahrul Azman (Khirulnizam Abd Rahman)
Application of Ontology in Semantic Information Retrieval
by Prof Shahrul Azman from FSTM, UKM
Presentation for MyREN Seminar 2014
Berjaya Hotel, Kuala Lumpur
27 November 2014
Ontology and Ontology Libraries: a critical study (Debashisnaskar)
This document provides an overview of ontology and ontology libraries. It discusses what ontologies are, languages for expressing ontologies like OWL, and tools for building ontologies such as Protégé. It also examines several ontology libraries including BioPortal for biomedical ontologies, OBO Foundry, oeGov for e-government, and TONES for general ontologies. Evaluation criteria for comparing ontology libraries and future challenges and opportunities are also reviewed.
Translating Ontologies in Real-World Settings (Mauro Dragoni)
To enable knowledge access across languages, ontologies, which are often represented only in English, need to be translated into different languages. The main challenge in translating ontologies is to find the right term with respect to the domain modeled by the ontology itself. Machine translation services may help in this task; however, a crucial requirement is to have translations validated by experts before the ontologies are deployed. Real-world applications must implement a support system for this task to relieve the experts of validating every translation. In this paper, we present ESSOT, an Expert Supporting System for Ontology Translation. The peculiarity of this system is that it exploits semantic information from each concept's context to improve the quality of label translations. The system has been tested both within the Organic.Lingua project, by translating the modeled ontology into three languages, and on other multilingual ontologies in order to evaluate its effectiveness in other contexts. The results have been compared with the translations provided by the Microsoft Translator API, and the improvements demonstrate the viability of the proposed approach.
The document describes Lydia, a system for named entity recognition and text analysis that was adapted for question answering at TREC 2005. It summarizes Lydia's pipeline for entity recognition and relationship analysis. It then describes the question answering system, which takes questions as input, extracts targets, collects candidate answers from Lydia's database, scores and ranks candidates, and produces a single answer or list of answers. The system handles factoid, list, and other questions by analyzing the question type and scoring candidates based on features like target juxtaposition and question term matching.
The MSR-NLP Chinese word segmentation system is part of a full sentence analyzer. It uses a dictionary and rules for basic segmentation, morphology, and named entity recognition to build a word lattice. The system proposes new words, prunes the lattice, and uses a parser to produce the final segmentation. It participated in four segmentation bakeoff tracks, ranking highly in each. An analysis found that parameter tuning, morphology/NER, and lattice pruning contributed most to performance, while the parser helped less. Problems included inconsistent annotations and differences in defining new words.
Ontology development in Protégé (sadegh salehi)
This document describes an agenda for an ontology development presentation in Protégé. It discusses the syntactic web and its limitations, as well as the promise of the semantic web to address these issues by adding meaning to web content that is understandable to machines. It outlines two sessions on ontology and OWL basics, Protégé, and developing a pizza ontology in Protégé.
The document discusses different approaches to defining translation and translating metaphors. It describes linguistics-based approaches that view translation as substituting equivalent signs between languages. Textlinguistic approaches see translation as producing a target text based on the source text. Functional approaches define translation as a purposeful, transcultural activity. The document also discusses challenges in translating metaphors, such as retaining imagery across cultures, and proposes procedures like translating metaphor to simile or sense.
This document provides an overview of relevance theory as it relates to communication and translation. It discusses key concepts in relevance theory including the inferential nature of communication, semantic representation, context and the principle of relevance. It also provides examples of how discourse markers are translated from English to Persian to convey contextual implications and assumptions. The examples show how relevance theory treats translation as a form of communication and emphasizes deriving implicatures based on contextual factors.
1. Translation studies developed as an academic discipline since the 1970s, drawing from fields like linguistics, comparative literature, and cultural studies.
2. Early work focused on contrastive analysis and equivalence, while recent approaches examine translation as a communicative act within sociocultural contexts.
3. Current theories address issues like text types, skopos, descriptive approaches, the literary polysystem, cultural studies perspectives, and the relationship between theory and practice in the field.
This document summarizes the key concepts in Western translation theory, focusing on the central concept of equivalence. It discusses how equivalence has been conceptualized at different linguistic levels and in different theoretical frameworks, from formal correspondence to functional and Skopos-based approaches. Descriptive translation studies are also introduced as moving away from prescriptive notions of equivalence towards describing translation based on norms, conventions, and the polysystem of socio-cultural factors that influence translation.
This document discusses translation as both a process and a product. It defines translation as the act of transferring meaning from one language to another. As a process, translation refers to the role of the translator in taking a source text and producing a target text in another language. As a product, translation refers to the concrete translated text produced. The document emphasizes that translation encompasses both the process conducted by translators as well as the written work that results from that process.
This document discusses various philosophical approaches to translation that emerged in the late 20th century. It outlines George Steiner's hermeneutic approach to translation as consisting of four "movements": initiative trust, aggression, incorporation, and compensation. It also discusses Ezra Pound's view of translation as a tool for cultural struggle and Walter Benjamin's concept of a "pure language" released through the vital link between an original text and its translation. Finally, it examines the implications of deconstruction theory for understanding language and meaning in translation.
Translation studies has gone through four phases of development: from Cicero and Horace translating Greek works into Latin between 46 BC and 1792; the rise of translation as a profession between 1769 and 1946; the development of machine translation and hermeneutic approaches between 1940 and 1960; and a metaphysical approach to translation theory from 1960 onwards. The Romans were early translators who helped spread Greek literature, but focused more on practical translation than creative works, lacking the imagination of later eras. Key Roman translators included Cicero, Horace, Longinus, and Quintilian.
Lexicography involves processes for determining word meanings and constructing dictionaries. It includes designing guidelines, researching words and definitions, and formatting entries for publication. There are two related disciplines: practical lexicography focuses on compiling dictionaries, while theoretical lexicography analyzes lexicons and develops theories to improve dictionaries. Corpora are important resources used in both disciplines to study word usage and inform dictionary content. Word sketches were developed to efficiently analyze large corpora and streamline the process of identifying relevant collocates for dictionary entries.
This document discusses the history and theories of translation. It summarizes several key theorists and models of translation. Jakobson categorized translation into three types: intralingual translation (within a language), interlingual translation (between languages), and intersemiotic translation (across sign systems). The document also outlines the stages in the development of translation theory from the linguistic stage to the current ethical/aesthetic stage. Finally, it discusses various approaches to translation based on prioritizing the source language or target language, such as word-for-word translation or communicative translation.
Extending models for controlled vocabularies to classification systems: model... (Marcia Zeng)
Mitchell, Joan S., Marcia Lei Zeng, and Maja Zumer. Presented at the International UDC Seminar 2011, Classification & Ontology, The Hague, The Netherlands, Sept. 19-20, 2011.
This document provides a summary of a framework for managing vocabularies as presented at the TDWG Vocabulary Management Task Group meeting. It discusses the status of TDWG ontologies, requirements for a vocabulary management framework, and Semantic MediaWiki as a potential platform for collaborative vocabulary development. Key points include:
- Vocabularies are a core component of the TDWG technical architecture and provide shared understanding of terms, but development and governance has been challenging.
- A framework is needed to standardize the process for minting terms, releasing finalized concept vocabularies, and reusing terms in other schemas and ontologies to promote interoperability.
- Semantic MediaWiki is proposed as a platform for collaborative vocabulary development.
The document discusses the key aspects of thesauri including their purpose, structure, types of relationships displayed, and evaluation criteria. Specifically, it notes that a thesaurus provides a standardized vocabulary for information retrieval by displaying hierarchical (e.g. broader and narrower terms) and equivalence (e.g. synonyms) relationships between terms. It also discusses how terms are organized in a thesaurus and criteria for evaluating the effectiveness of a thesaurus.
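The equivalence and hierarchy relationships described above are what make a thesaurus useful for retrieval; a minimal sketch (the entries and the `expand` helper are illustrative assumptions, loosely following the conventional USE/UF, BT/NT labels) shows query expansion: map a term to its preferred form, then add narrower terms.

```python
# Sketch of thesaurus entries with equivalence (USE/UF) and
# hierarchy (BT/NT) relationships. Terms are illustrative.
thesaurus = {
    "automobile": {"USE": "car"},
    "car": {"UF": ["automobile"], "NT": ["sports car"], "BT": ["vehicle"]},
    "sports car": {"BT": ["car"]},
    "vehicle": {"NT": ["car"]},
}

def expand(term):
    """Replace a term with its preferred form, then add narrower terms,
    so a search retrieves the concept rather than one word for it."""
    entry = thesaurus.get(term, {})
    preferred = entry.get("USE", term)
    entry = thesaurus.get(preferred, {})
    return [preferred] + entry.get("NT", [])
```

A query for "automobile" is thus routed to the preferred term "car" and broadened to its narrower term "sports car", which is exactly the retrieval benefit the standardized vocabulary provides.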
The document discusses different types of taxonomies and controlled vocabularies including their definitions, purposes, and benefits. It describes term lists, synonym rings, authority files, hierarchical taxonomies, thesauri, and ontologies. The key purposes are to help people find information using different terms, retrieve relevant concepts rather than just words, and organize information into logical hierarchies or relationships to aid searching and browsing. Solo information professionals are common accidental taxonomists.
This document summarizes a transcript from the PEMT '06 conference discussing challenges with terminology across disciplines and proposes approaches to address ambiguities. It notes how knowledge evolution has led to specialized terminology that may only be understood by experts, hindering cross-disciplinary communication. Defining terms unambiguously is important for knowledge management. The document provides examples of ambiguous terms like homonyms and synonyms and proposes establishing a transparent, inter-disciplinary lexicon using fundamental disciplines like physics and mathematics to prioritize terms. It emphasizes the need to review scientific terminology to remove ambiguity and proposes criteria to clearly define terms.
Mitchell, Joan S., Marcia Lei Zeng, and Maja Zumer. Presented at the International UDC Seminar 2011, Classification & Ontology, The Hague, The Netherlands, Sept. 19-20, 2011.
This document provides a summary of a framework for managing vocabularies as presented at the TDWG Vocabulary Management Task Group meeting. It discusses the status of TDWG ontologies, requirements for a vocabulary management framework, and Semantic MediaWiki as a potential platform for collaborative vocabulary development. Key points include:
- Vocabularies are a core component of the TDWG technical architecture and provide shared understanding of terms, but development and governance has been challenging.
- A framework is needed to standardize the process for minting terms, releasing finalized concept vocabularies, and reusing terms in other schemas and ontologies to promote interoperability.
- Semantic MediaWiki is proposed as a platform for collaborative vocabulary development.
The document discusses the key aspects of thesauri including their purpose, structure, types of relationships displayed, and evaluation criteria. Specifically, it notes that a thesaurus provides a standardized vocabulary for information retrieval by displaying hierarchical (e.g. broader and narrower terms) and equivalence (e.g. synonyms) relationships between terms. It also discusses how terms are organized in a thesaurus and criteria for evaluating the effectiveness of a thesaurus.
The document discusses different types of taxonomies and controlled vocabularies including their definitions, purposes, and benefits. It describes term lists, synonym rings, authority files, hierarchical taxonomies, thesauri, and ontologies. The key purposes are to help people find information using different terms, retrieve relevant concepts rather than just words, and organize information into logical hierarchies or relationships to aid searching and browsing. Solo information professionals commonly become "accidental taxonomists."
This document summarizes a transcript from the PEMT '06 conference discussing challenges with terminology across disciplines and proposes approaches to address ambiguities. It notes how knowledge evolution has led to specialized terminology that may only be understood by experts, hindering cross-disciplinary communication. Defining terms unambiguously is important for knowledge management. The document provides examples of ambiguous terms like homonyms and synonyms and proposes establishing a transparent, inter-disciplinary lexicon using fundamental disciplines like physics and mathematics to prioritize terms. It emphasizes the need to review scientific terminology to remove ambiguity and proposes criteria to clearly define terms.
A Corpus-based Analysis of the Terminology of the Social Sciences and Humanit...Sarah Morrow
This document presents the results of a corpus-based analysis of terminology in the social sciences and humanities fields of legal science and administrative science in Indonesia. Keywords and word clusters were identified from specialized corpora in these fields to extract term candidates based on their linguistic components. Collocation analysis was then used to further examine two term candidates, "LINGKUNGAN HIDUP" and "BUDAYA PERUSAHAAN", to recognize their specific meanings based on habitually co-occurring words, fulfilling the cognitive component for terminology. This analysis demonstrated how corpus linguistics can be integrated with the communicative theory of terminology to study terminology from its linguistic and cognitive perspectives.
The document discusses context based indexing in search engines using ontology. It describes how current search engines use term based indexing which has problems with polysemy and synonymy. It proposes using an ontology to determine the context of documents in order to build a context based index. This involves extracting concepts and relationships from documents, a thesaurus, and ontology repository to determine the context. The context based index would improve search relevance by allowing queries based on context rather than just keywords.
The document discusses stepwise methodologies for building ontologies. It outlines common steps such as identifying the purpose and scope, capturing concepts and relationships, coding the ontology formally, integrating existing ontologies, evaluation, and documentation. It emphasizes starting with a middle-out approach to capture definitions and discusses reaching consensus among those involved in building the ontology. Modularization of ontologies into reusable components is also presented as an important aspect of the methodology.
ISO 25964: Thesauri and Interoperability with Other VocabulariesMarcia Zeng
ISO 25964: Thesauri and Interoperability with Other Vocabularies + IFLA Guidelines for Multilingual Thesauri + A KOS Resource Application Profile. Presented at 2009 NKOS Workshop.
The JTHES as Part of the Intelligence Layer for the Sustainability Collection...Access Innovations, Inc.
Presented at the 2015 Data Harmony User Group Meeting in Albuquerque, New Mexico on February 17, 2015 by Ron Snyder and Sharon Garewal of ITHAKA Labs.
The JSTOR Sustainability Collection, which will launch in 2015, is composed of journals, reports, and working papers selected in consultation with scholars, policy researchers, and subject librarians. The collection features journal titles from academic publishers, scholarly societies, and industry groups, as well as a substantial library of indexed reports and working papers from leading research institutes and university centers. It addresses the emerging interdisciplinary discussion about how the environment and human activities and economic gains can be made durable over the long term. Along with this broad set of content, the collection will feature specialized functionality to support research in this emerging field, including a semantic indexing feature that helps researchers locate related terms and concepts that may have varying names across disciplines.
Ron Snyder, ITHAKA Labs Director of Research and Development, and Sharon Garewal, Senior Metadata Librarian, will discuss how the JSTOR Thesaurus (JTHES) was applied as part of the intelligence layer for the Sustainability collection prototype. This includes adding a facet for sustainability within the JTHES to tag terms as part of the collection, working with SME's across disciplines, and applying the curated terms into a live data portal.
This document summarizes JSTOR's efforts to develop a sustainability collection and utilize their controlled vocabulary thesaurus (JTHES) to semantically index and organize content in the collection. Subject matter experts were enlisted to review and provide feedback on key sustainability terms. A prototype sustainability portal was created that uses the thesaurus and semantic indexing to power auto-suggest, discovery, and refinement of results. Refinements are ongoing, including automating the calculation of documents' sustainability scores and topic labeling.
The document discusses ontologies, including:
1) It defines ontologies as formal specifications of concepts and relationships that can exist for an agent or community. Ontologies allow knowledge to be shared and reused.
2) Ontologies can be used to facilitate knowledge management, enable learning about a domain, and enable intelligent search and query expansion.
3) The document provides guidance on developing ontologies, including researching the domain, using existing resources, defining classes and properties, and choosing an ontology language.
Metadata for Terminology / KOS ResourcesMarcia Zeng
1. Why do we need metadata for terminology resources? 2. What do we need to know about a terminology resource? 3. Is there a standardized set of metadata elements for terminology resources?-- a presentation at the "New Dimensions in Knowledge Organization Systems", a Joint NKOS/ CENDI Workshop, World Bank, Washington, DC. September 11, 2008 http://nkos.slis.kent.edu/2008workshop/NKOS-CENDI2008.htm
Presentation made in the context of the FAO AIMS Webinar titled “Knowledge Organization Systems (KOS): Management of Classification Systems in the case of Organic.Edunet” (http://aims.fao.org/community/blogs/new-webinaraims-knowledge-organization-systems-kos-management-classification-systems)
21/2/2014
The objective of this webinar is to provide a brief overview of the Knowledge Organization Systems (KOS) and the tools used for managing them. The presentation will focus on the management of the multilingual Organic.Edunet ontology as a case study. In this context it will present aspects such as the collaborative work, multilinguality needs and update of the concepts using an online KOS management tool (MoKi).
The document discusses semantic interoperability within a company. It describes several tools that can be used to describe and structure semantics, including ontologies, tagging, classifications, and taxonomies. It provides examples of how these tools can be applied at an enterprise level, including enterprise ontologies, tag clouds, the Zachman framework, and IBM's Information Framework.
This document discusses controlled vocabularies and thesaurus construction. It defines a controlled vocabulary as a standardized set of terms used within a specific domain. Thesaurus construction involves multiple steps, including planning scope, conceptualizing hierarchies, collecting words, identifying relationships, validation, and ongoing maintenance. Different types of thesauri serve different purposes, such as general language thesauri, specialized thesauri for domains or languages, and digital thesauri with advanced features.
Developing an architecture for translation engine using ontologyAlexander Decker
This document proposes an architecture for machine translation that utilizes WordNet ontology and transition network grammars. It aims to improve translation accuracy by using WordNet as a syntactic guide during parsing and determining the grammatical structure of text. It then maps between the source and target languages. The architecture is described at a high level to provide a framework for future integration with other techniques to enhance machine translation.
The document discusses taxonomy development and digital projects. It provides definitions of key terms like controlled vocabularies, taxonomies, thesauri and ontologies. It explains purposes like translation, consistency, navigation, search and retrieval. Challenges like ambiguity, synonymy and polyhierarchies are covered. Guidelines and standards for building taxonomies are also summarized. The value proposition of taxonomies for improving search, productivity and information sharing is outlined.
SKOS - 2007 Open Forum on Metadata Registries - NYCjonphipps
A brief introduction to SKOS (Simple Knowledge Organization Systems) and its usage in the NSDL Metadata Registry, with some discussion of current challenges.
1. Most terms were extracted from TermTerm.org, the freely accessible multilingual terminology
database containing the terms of terminology, and from ISO's Online Browsing Platform. For links
to the other sources, visit my blog and use the search tool.
2. A word (simple term), multiword expression
(complex term), symbol or formula that
designates a particular concept within a given
subject field. Also Terminology unit.
(Pavel tutorial)
A designation consisting of one or more words
representing a general concept in a special
language in a specific subject field
(ISO 704:2009: 34)
3. Term rated according to the scale of a term
acceptability rating as a synonym for a preferred
term (ISOcat.org)
Admitted term(s) (set in normal type in the
printed publication) or symbol(s) shall each be
placed on a new line, after the preferred term.
(ISO 10241-1 “Layout criteria”)
4. Designation which could be used as a terminological
entry (also called “term candidate”)
(Termterm.org)
5. A complex term can be either a one-word term
or a multi-word term. (ISOcat.org)
Examples of complex terms are: book-maker,
know-how, fault recognition circuit.
(ISOcat.org)
6. Term which is no longer in common use.
(ISOcat.org). Deprecated terms include obsolete,
superseded, and archaic terms.
Deprecated term(s) (set in normal type) or symbol(s)
shall each be placed on a new line and shall be
identified by an appropriate text, e.g. “DEPRECATED:”
The definition shall be placed on a new line, starting
with a lower case letter, except for any capital letters
required by the normal written form in running text,
and shall not be followed by a full-stop.
(ISO 10241-1 “Layout criteria”)
7. Term used in a text field such as the /definition/
or /context/ that designates a concept that is
defined in another terminological entry.
(ISO 26162:2012)
9. A nested term is a valid term on its own, but also
forms part of another, longer term. For example,
"floating point" is a nested term of "floating point
arithmetic". (Used in term-extraction lingo)
(The National Center for Text Mining:
http://www.nactem.ac.uk/faq_termine.php?faq=5)
Nested terms appear as substrings of longer terms
(whether or not they appear as a standalone term
as well).
(“Automatic Term Extraction” by K. Heylen and D. De Hertog in
Handbook of Terminology, Vol. 1, edited by H. Kockaert and F.
Steurs, p. 212)
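The substring test described above is easy to automate. Below is a minimal sketch (not NaCTeM's TerMine tool itself): given a flat list of candidate terms, it reports which candidates are nested inside longer ones. The candidate list is illustrative.

```python
# A minimal sketch of nested-term detection: a term is "nested" if it
# appears as a substring of another, longer term in the candidate list.
# The candidates below are illustrative, not from any real extraction run.

def find_nested_terms(terms):
    """Return {nested_term: sorted list of longer terms that contain it}."""
    nested = {}
    for t in terms:
        containers = [u for u in terms if u != t and t in u]
        if containers:
            nested[t] = sorted(containers)
    return nested

candidates = [
    "floating point",
    "floating point arithmetic",
    "floating point arithmetic unit",
    "arithmetic unit",
]

# "floating point" is nested in two longer candidates;
# "floating point arithmetic unit" is nested in none.
print(find_nested_terms(candidates))
```

A production extractor would match at word boundaries and count nested occurrences for ranking (as in C-value/NC-value approaches), but the core relation is this substring check.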
10. Term rated according to the scale of the term
acceptability rating as the primary term for a
given concept.
(ISOcat.org).
The preferred term(s) (set in bold type in the printed
publication) or symbol(s) shall be placed on a new line,
after the entry number, starting with a lower case
letter except for any capital letters required by the
normal written form in running text. For complex terms
(e.g. compounds and multiword terms), the natural
word order shall be retained.
(ISO 10241-1 “Layout criteria”)
12. Principle of documenting every synonymous
term with the entire set of corresponding data categories.
(Wright, S. E., Budin, G.: Handbook of Terminology
Management: Application-oriented Terminology Management.
John Benjamins Publishing Company, Amsterdam, 2001.)
All terms are created equal and can be described
with the same degree of detail (that is, using all
the same fields in the system). All terms that
denote a concept are managed as autonomous
and repeatable blocks of data categories within a
terminological entry. (TerminOrgs Starter Guide.)
Principle whereby all terms in a terminological
entry can be described by using the same set of
data categories (ISO 26162:2012)
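The autonomy principle above — every term in a concept-oriented entry is a repeatable block described with the same set of data categories — can be sketched as a simple data model. The field names here are illustrative, not taken verbatim from ISO 26162 or any termbase format.

```python
from dataclasses import dataclass, field

# A sketch of term autonomy: every term block in an entry carries the
# same data categories, whatever the term's status. Field names are
# illustrative assumptions, not ISO 26162 category names.

@dataclass
class TermBlock:
    term: str
    term_type: str          # e.g. "preferred term", "synonym", "deprecated term"
    source: str = ""
    usage_note: str = ""

@dataclass
class TerminologicalEntry:
    concept_id: str
    subject_field: str
    definition: str
    terms: list = field(default_factory=list)  # repeatable TermBlock instances

entry = TerminologicalEntry(
    concept_id="C042",
    subject_field="terminology management",
    definition="database comprising a terminological resource",
    terms=[
        TermBlock("termbase", "preferred term", source="TerminOrgs Starter Guide"),
        TermBlock("terminological database", "synonym", source="ISO 26162:2012"),
    ],
)

# Both the preferred term and its synonym are described with the
# same block structure:
print([t.term_type for t in entry.terms])
```

The point of the design is that adding a synonym never requires a different, poorer record shape: every term gets the full block.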
13. Collection of terminological databases including
the organizational framework for recording,
processing and disseminating data. (ISOcat.org)
Data bank containing terminological data.
(ISO 1087-1:2000)
14. Database comprising a terminological resource
(terminological data collection) (ISO 26162:2012)
Also Terminological database or Term base.
(TerminOrgs Starter Guide.)
16. Part of term excerption (extraction) involving
recognition and selection of designations.
(ISOcat.org)
17. Any logically significant portion of a larger term.
(ISO 12620)
Used when breaking down a compound term into
components or when documenting morphemes from
an etymological viewpoint. (ISOcat.org)
18. Part of a terminological resource (terminological
data collection) that contains the terminological
data related to one concept. (ISO 26162:2012)
Also Terminological entry (ISOcat.org)
19. The careful reading of a corpus and selection of
terms, normally with contexts, for recording on
terminology records. Also scanning for terms and
term excerption.
(Pavel tutorial).
21. Attribute assigned to a term.
(ISOcat.org).
Term types can include:
main entry term; synonym, quasi-synonym, international
scientific term, common name, internationalism, full form,
abbreviated form of term, abbreviation, short form of term,
initialism, acronym, clipped term, variant, transliterated form,
transcribed form, romanized form, symbol, formula, equation,
logical expression, materials management categories (like
stockkeeping unit, part number), phraseological unit,
collocation, set phrase, synonymous phrase, standard text
(ISO 12620:1999)
22. Part of terminology work concerned with the
recording and presentation of terminological data.
(ISOcat.org).
Practices, activities, methods and know-how
related to collecting and describing terms,
compiling terminological lexicons, establishing
concept systems or ontologies, making thesauri,
etc. which constitute an important aspect of
terminology.
(“Terminology and lexicography”, K. Kageura, in Handbook of
Terminology, Vol. 1, edited by H. Kockaert and F. Steurs, p. 56).
23. Process by which a general-language word or
expression is transformed into a term designating
a concept in a language for special purposes.
(ISO 704:2000).
Also related: De-terminologization (introduced by Ingrid Meyer and Kristen
Mackintosh): a technical term is incorporated into general language as a
widely known word; that is, the “technicality sense” is drained out of the
term and it is transformed into an ordinary word. Meyer and Mackintosh gave
the example of the word “virtual”: today its use often has little to do with
virtual reality per se: virtual sex, virtual office, virtual money, even
virtual corpus and virtual dictionary.
Re-terminologization: the transition of a term from one terminological
system into another, preserving or changing its meaning. For example, the
term “introspection”, used in physics and psychology, receives in a didactic
context the meaning of self-analysis and self-knowledge. Another easy
example is the word “virus”.
Source: see my blog post.
24. Terminologists carry out the research required to index terms
specific to a certain area of activity or organization. They analyze
concepts, define terms, find their equivalents in another language
and select the most appropriate equivalents. The results of the
research are used to compile glossaries, feed terminology
databases and standardize the terminology used in a certain field
or organization. Public service administrators and communicators
(including translators, interpreters and writers) turn to terminologists
when they require specialized terminology. (Pavel tutorial).
Terminologists are experts in formulating, describing, managing
and distributing mono- and multi-lingual terminologies. They
work in all areas that are concerned with data, information,
knowledge and communication. Terminology work is an
interdisciplinary activity; therefore terminologists very often work
with professionals from different subject fields.
Dr. Klaus-Dirk Schmitz “The terminologist”
25. 1. The set of practices and methods used for the
collection, description and presentation of terms.
(Sager 1990: 3)
2. A theory, i.e. the set of premises, arguments and
conclusions required for explaining the
relationships between concepts and terms which
are fundamental for coherent activity under 1.
(Sager 1990: 3)
3. A vocabulary of a special subject field.
(Sager 1990: 3)
(Definitions taken from “Terminology and
lexicography”, K. Kageura, in Handbook of
Terminology, Vol. 1, edited by H. Kockaert and F.
Steurs, p. 45.)
26. Approach for managing terminology that
documents the way that terms are used in
contexts without indicating preferred usage
(ISO 26162:2012)
Descriptive terminology work reflects the actual state of the
terminology in a special field without assessing or confining it.
Every terminological work first starts with a descriptive analysis
by which the existing terminology of a subject field is identified
and recorded.
(“Corporate Terminology Management: An Approach in Theory
and Practice”, A. Großjean, p. 44)
27. An agreement by users to adopt a term for
common and repeated use in given circumstances.
(Termterm.org)
Approach for managing terminology that indicates
preferred usage.
(ISO 26162:2012)
28. The process by which an official-approval
committee in a company, department or other
administrative unit approves a set of terms (and, in
some cases, their definitions) for the purpose of
establishing preferred usage for a particular user
community. Also validation.
(Pavel tutorial)
29. Collection of terms, concepts and phrases
(terminology) of a particular subject field or topic
in one (or several) language(s).
Also terminology stock.
(Termterm.org)
30. Collection of terminological databases including
the organizational framework for recording,
processing and disseminating data
(ISOcat.org)
31. Collection of terms, concepts and phrases
(terminology) of a particular subject field or topic
in one (or several) language(s)
(Termterm.org)
32. Any deliberate manipulation of terminological
information.
(ISOcat.org)
It is primarily concerned with manipulating
terminological resources for specific purposes, e.g.
establishing repertories of terminological
resources for publishing dictionaries, maintaining
terminology databases, or ad hoc problem solving
in finding multilingual equivalences in translation
work or creating new terms in technical writing.
(C. Galinski and G. Budin, 1996)
33. Part of terminology work which involves extracting
terminological data by searching through a text or
a corpus.
(ISOcat.org)
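A minimal sketch of what such extraction through a text can look like: collecting recurring word n-grams as candidate terms. Real term-extraction tools add linguistic filtering (part-of-speech patterns) and statistical ranking; the sample text and the frequency threshold here are illustrative assumptions.

```python
import re
from collections import Counter

# A minimal sketch of monolingual term extraction: treat recurring word
# n-grams in a text as term candidates. The corpus and the min_freq
# threshold below are illustrative assumptions.

def extract_candidates(text, n_max=3, min_freq=2):
    """Return (candidate, frequency) pairs for recurring n-grams."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter()
    for n in range(1, n_max + 1):
        for i in range(len(words) - n + 1):
            counts[" ".join(words[i:i + n])] += 1
    return [(c, f) for c, f in counts.most_common() if f >= min_freq]

corpus = ("Terminology work requires terminology extraction. "
          "Terminology extraction searches a corpus for candidate terms, "
          "and candidate terms feed the terminology database.")

for candidate, freq in extract_candidates(corpus)[:5]:
    print(candidate, freq)
```

In practice this raw frequency list is only a first pass; the recognition-and-selection step of term excerption (slide 16) would then filter it against stoplists and validate candidates with a subject-field expert.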
34. Activities aimed at developing, improving,
implementing and disseminating the terminology
of a subject field
(ISOcat.org)
Terminology planning is most prevalent in language
communities where there is a need to develop specialized
terms in languages that may have fallen behind, in one way
or another, in keeping up with the evolution of scientific
and technical terminology, or where socio-political
circumstances create a need to generate a range of
terminology for the political arena.
(ISOcat.org)
35. Policy formulated at the level of decision-making in
a language, domain or professional community,
with the aim of developing or regulating emerging
or existing terminologies for various purposes.
(ISO 29383:2010)
36. Part of terminography concerned with computer
aspects of database creation, maintenance and
extraction of terminology from texts.
(ISOcat.org)
37. Science studying the structure, formation,
development, usage and management of
terminologies in various subject fields.
(ISO 1087-1:2000)
38. Standard that is concerned with terms
accompanied by their definitions, and sometimes
by explanatory notes, illustrations, examples, etc.
(ISO 10241:2011)
39. Establishment of terminology standards or of
terminology sections in technical standards, and
their approval by an authoritative body.
(ISO TR 22134:2007)
40. Work concerned with the systematic collection,
description, processing, and presentation of
concepts and their designations, for the purpose of
documenting and promoting correct usage.
(Pavel tutorial)
(ISO 1087-1:2000)