
Amanda Vizedom: Ontology Evaluation across the Life Cycle (Lightning Talk)


Currently, there is no agreed-upon methodology for the development of ontologies, and no consensus on how ontologies should be evaluated. Consequently, evaluation techniques and tools are not widely used in the development or use of ontologies. This can lead to the development or selection of ontologies that are a poor fit for their intended use, and it is an obstacle to the successful deployment of ontologies and of the technologies that depend on them.
This talk provides some key pointers to ontology evaluation activities that can give an ontology development or use project the direction and structure it needs to succeed. In particular, these pointers involve the identification of ontology and semantic-technology requirements, and the integration of appropriate evaluation against those requirements throughout the life of the project.

For more reliable success in ontology development and use, ontology evaluation should be incorporated across all phases of the ontology life cycle. Evaluation should be conducted against carefully identified requirements; these requirements depend on the intended use of the ontology and its operational environment.

Key points of this presentation are drawn from the Ontology Summit 2013 Communiqué, "Towards Ontology Evaluation across the Life Cycle." That document focuses on the evaluation of five aspects of ontology quality: intelligibility, fidelity, craftsmanship, fitness, and deployability. A model of the ontology life cycle is presented, and evaluation criteria are discussed in the context of its phases. The full Communiqué also discusses the availability of tools and offers observations and recommendations. Given the current maturity of ontology as an engineering discipline, any results on how best to build and evaluate ontologies must be considered preliminary. However, these results achieved broad consensus across the range of backgrounds, application foci, specialties, and experience found in the Ontology Summit community.
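As a rough illustration of evaluating an ontology against phase-specific requirements, the sketch below models lifecycle phases, requirements tagged with the five quality characteristics, and a check runner. All names here (`Phase`, `Requirement`, `evaluate`) and the toy ontology are hypothetical, invented for this sketch; they are not drawn from the Communiqué or from any existing tool.

```python
# Hypothetical sketch: phase-scoped evaluation against explicit requirements,
# in the spirit of the Communiqué's five quality characteristics.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

CHARACTERISTICS = ["intelligibility", "fidelity", "craftsmanship",
                   "fitness", "deployability"]

@dataclass
class Requirement:
    characteristic: str              # one of CHARACTERISTICS
    description: str
    check: Callable[[dict], bool]    # takes an ontology artifact, returns pass/fail

@dataclass
class Phase:
    name: str
    requirements: List[Requirement] = field(default_factory=list)

def evaluate(phase: Phase, ontology: dict) -> Dict[str, bool]:
    """Run every requirement check registered for this phase."""
    return {r.description: r.check(ontology) for r in phase.requirements}

# Toy ontology artifact: one class is missing its label.
toy_ontology = {"classes": [{"id": "Ship", "label": "Ship"},
                            {"id": "Cargo", "label": ""}]}

# A craftsmanship requirement attached to a design phase.
design = Phase("design", [
    Requirement("craftsmanship", "every class has a label",
                lambda o: all(c["label"] for c in o["classes"])),
])

results = evaluate(design, toy_ontology)
print(results)  # {'every class has a label': False}
```

Because each requirement carries its own check, the same general requirement ("classes are intelligible") can be translated into different concrete checks in different phases, which is the pattern of phase-appropriate requirements the Communiqué describes.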



  1. Ontology Evaluation across the Life Cycle
     Key findings from a four-month collaborative exploration
     Amanda Vizedom, Ph.D.
     June 3, 2013
  2. Problem & Approach
     • General state of the field:
       – No consensus on how to evaluate ontologies, and
       – not enough knowledge to ground such consensus
       – Pockets of knowledge & experience; not enough sharing, and not enough understanding of bounds of applicability
     • Practical upshot:
       – very little ontology evaluation
       – even less effective ontology evaluation
     • Serious consequences:
       – Poor-quality ontologies
       – Poor fit between ontologies used and intended usages
       – No effective test of semantic solutions & the ontologies they incorporate until much too far into the effort
       – No way to accurately trace successes and failures to ontologies vs. other system components
       – Additional consequences for distinguishing effective and ineffective ontology techniques, personnel, management, etc.
     • Approach taken over Ontology Summit 2013*
       – Some people report good results from applying Software Engineering and Systems Engineering approaches to ontologies. How do they do so? How well does it work? What can we learn by looking at ontology evaluation through these lenses?
     *Co-organized by Ontolog, NIST, NCOR, NCBO, IAOA & NCO_NITRD
  3. Progress and Recommendations
     • Ontology Life Cycle Model
       – An ontology lifecycle model that identifies recognizable, recurring phases in the lives of ontologies
       – Phases = clusters of activities around recognizable inputs, outputs, and goals
       – No single sequence, timing, etc., that fits or should fit all ontologies,
       – BUT patterns of dependencies between phases that enable transformation of general requirements into phase-appropriate requirements!
       – Evaluation throughout the life cycle, and against those requirements, can and should be practiced.
  4. Progress and Recommendations (2)
     • Requirements Identification is key
       – Ontologies are not one size, or one set of capabilities, fits all
       – Ontology requirements depend on a range of usage factors
       – To skip careful & informed requirements development is to dramatically reduce the odds of successful ontology development and/or deployment
     • High-Level Characteristics
       – The huge number of potential requirements can be approached by considering a smaller number of high-level characteristics that almost always matter, and seeing how they apply to the specific case and/or phase
       – We identified & focused on these: intelligibility, fidelity, craftsmanship, fitness, and deployability
     • Integration into Systems Quality Management discipline
       – Too often, whole semantic systems are managed with disciplined requirements specification, test development, frequent and phase-appropriate testing and evaluation, issue tracking, and unit testing, covering every aspect of the semantic system *except the ontology*
     • Tools
       – Recent rapid acceleration in tools & techniques for ontology evaluation.
       – Need more, and need whole-lifecycle tools that incorporate evaluation, requirements, test development, etc., into native ontology management environments (IDEs, repositories)
  5. Action Item and More Details
     • Much more detail, and references to supporting materials, are in the Ontology Summit 2013 Communiqué, Towards Ontology Evaluation across the Life Cycle, available at:
     • The call for endorsements is open until 15 June 2013
     • The number of endorsements has already surpassed those of past years' Communiqués, but don't stop! Numbers, names, and varied spheres of influence strengthen the message and its impact.
     • If you want to see more and better evaluation, results focus, and engineering discipline in ontologies and semantic technologies, please read the full Communiqué and consider adding your voice by endorsing.
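As one concrete illustration of the kind of lifecycle-integrated tooling the slides call for, here is a minimal, standard-library-only sketch of an automated check that could run inside an ontology management environment or CI pipeline: it flags owl:Class declarations in an RDF/XML document that lack an rdfs:label, a simple intelligibility concern. The sample document and function name are invented for illustration, and a real pipeline would use a full RDF parser rather than raw XML handling.

```python
# Illustrative only: a tiny automated ontology check of the kind the
# Communiqué recommends wiring into native ontology tooling. It scans an
# RDF/XML document for owl:Class elements with no rdfs:label child.
import xml.etree.ElementTree as ET

# Clark-notation prefixes for the standard RDF, RDFS, and OWL namespaces.
RDF = "{http://www.w3.org/1999/02/22-rdf-syntax-ns#}"
RDFS = "{http://www.w3.org/2000/01/rdf-schema#}"
OWL = "{http://www.w3.org/2002/07/owl#}"

# Hypothetical two-class sample; one class is missing its label.
SAMPLE = """<rdf:RDF
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#"
    xmlns:owl="http://www.w3.org/2002/07/owl#">
  <owl:Class rdf:about="#Vessel">
    <rdfs:label>Vessel</rdfs:label>
  </owl:Class>
  <owl:Class rdf:about="#CargoManifest"/>
</rdf:RDF>"""

def unlabeled_classes(rdf_xml: str) -> list:
    """Return the IRIs of owl:Class elements that have no rdfs:label."""
    root = ET.fromstring(rdf_xml)
    missing = []
    for cls in root.iter(OWL + "Class"):
        if cls.find(RDFS + "label") is None:
            missing.append(cls.get(RDF + "about"))
    return missing

print(unlabeled_classes(SAMPLE))  # ['#CargoManifest']
```

A check like this is cheap to run on every commit, which is how it becomes evaluation "across the life cycle" rather than a one-time audit.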