Dileo Presentation (in English)

Evaluation is a very vital research interest in the digital library domain. This has been exhibited by the growth of the literature in the main conferences and journal papers. However it is very difficult for one to navigate in this extended corpus. For these reasons the DiLEO ontology has been developed in order to assist the exploration of important concepts and the discovery of trends in the evaluation of digital libraries. DiLEO is a domain ontology, which aims to conceptualize the DL evaluation domain by correlating its key entities and provide reasoning paths that support the design of evaluation experiments.

Published in: Education

  1. 1. DiLEO: a domain ontology for the evaluation of digital libraries (dbis/dlib/ionio)
  2. 2. structure
- Introduction
- On the evaluation of digital libraries
  - Modeling the evaluation of digital libraries
- An ontological representation of the digital library evaluation domain
  - Ontologies
  - DiLEO presentation
  3. 3. in short
- an ontology
  - for comparing instances
  - for the support of digital library evaluation planning
  4. 4. motive
- The development of a schema for the description and the comparison of evaluation instances.
- The utmost aim is to cover the disagreement among evaluation models through a structured and formal meta-model. “the lack of globally accepted abstract evaluation models and methodologies can be counterbalanced by collecting, publishing and analyzing current research activities” [Fuhr et al., 2007]
- At the same time, to develop a digital library evaluation planning tool. “Every evaluation should begin with a well-crafted plan. Writing an evaluation plan is not just common sense. It is an essential roadmap to successful evaluation!” [Reeves et al., 2003]
  5. 5. part a: on the evaluation of digital libraries
  6. 6. modeling evaluation
- We do not refer to digital library evaluation models, but to the modeling of the process itself.
- Five main works:
  - Tefko Saracevic’s classification [2004]
  - the Evaluation Computer [Kovacs & Micsik, 2004]
  - the PRET A Rapporter framework [Blandford et al., 2007]
  - the 5SQual model [Gonçalves et al., 2007]
  - the Zachman Framework
  7. 7. Saracevic’s Classification
- A classification of evaluation studies according to:
  - what elements have been evaluated (Constructs)
  - which were the goals, the perspectives and so on (Context)
  - which were the perspectives that interested us (Criteria)
  - how the evaluation was conducted (Methodology)
  8. 8. Saracevic’s Classification
- Divided the evaluation studies into those proposing evaluation models and those reporting results of evaluation initiatives.
- Saracevic formed the concept of Context to encapsulate all high-level questions, such as why one evaluates, what his/her target is, etc.
- At the same time he developed a category to classify studies according to what was evaluated (Constructs) and two categories (Criteria, Methodology) to classify studies according to how they are conducted.
  9. 9. the evaluation computer
- A faceted classification of different views, which synthesize an instance of an evaluation or an ‘evaluation atom’.
- A calculation of the distance between two ‘atoms’ in a space.
  10. 10. the evaluation computer
- Five facets:
  - System
  - Content
  - Organization
  - User
  - Evaluation
  11. 11. PRET A Rapporter
- An evaluation framework that emphasizes the context of work.
- According to the authors, the framework holds features that assist the planning of an evaluation.
- The framework structures evaluations according to:
  - the purpose of evaluation
  - the resources and the constraints
  - the ethical considerations
  - the data gathering
  - the analysis of data
  - the reporting of findings
  12. 12. PRET A Rapporter
- PRET A Rapporter proceeds case study by case study. In practice this means it focuses on particular dimensions in each of the three indicative studies that it presents:
  - a formative evaluation of a system
  - a comparative evaluation of two interfaces of the same database
  - a qualitative study of a system in actual use
  13. 13. 5SQual
- The model is based on 5S (Streams, Structures, Spaces, Scenarios, & Societies), the well-known framework for the description of digital libraries.
- The model defines some dimensions (criteria) that correspond to constituting elements of digital libraries.
- The authors refer to a series of studies where these criteria are applied to digital libraries, such as ACM DL, CITIDEL and NDLTD.
  14. 14. 5SQual quality dimensions (Digital library concept / Quality dimension / 5S concepts)
- Digital object
  - Accessibility: Societies (actor), Structures (metadata specification), Streams + Structures (structured streams)
  - Pertinence: Societies (actor), Scenarios (task)
  - Preservability: Streams, Structures (structural metadata), Scenarios (process, e.g. migration)
  - Relevance: Streams + Structures (structured streams), Structures (query), Spaces (Metric, Probabilistic, Vector)
  - Similarity: same as in Relevance, Structures (citation/link patterns)
  - Significance: Structures (citation/link patterns)
  - Timeliness: Streams (time), Structures (citation/link patterns)
- Metadata specification
  - Accuracy: Structure (properties, values)
  - Completeness: Structure (properties, schema)
  - Conformance: Structure (properties, schema)
- Collection
  - Completeness: Structure (collection)
- Catalog
  - Completeness: Structure (collection)
  - Consistency: Structure (collection)
- Repository
  - Completeness: Structure (collection)
  - Consistency: Structure (catalog, collection)
- Services
  - Composability: see Extensibility, Reusability
  - Efficiency: Streams (time), Spaces (operations, constraints)
  - Effectiveness: see Pertinence, Relevance
  - Extensibility: Societies + Scenarios (extends, inherits_from, redefines)
  - Reusability: Societies + Scenarios (includes, reuses)
  - Reliability: Societies + Scenarios (uses, executes, invokes)
  15. 15. the Zachman framework
- The Zachman Framework is a framework for enterprise architecture, developed by John Zachman at IBM in the early 1980s.
- The framework reflects a formal and high-level structured view of an organization: a taxonomy for organizing the structural elements of the organization under the lens of specific perspectives.
- It classifies and organizes in a two-dimensional space all the concepts needed to be homogeneous and to express different planning perspectives:
  - according to the participants (alternative perspectives)
  - according to processes (questions)
  16. 16. the Zachman framework
[Slide shows the Zachman grid: columns for the questions What (Data), How (Process), Where (Location), Who (Worker), When (Timing) and Why (Motivation), crossed with the rows Scope [Planner], Business Model [Owner], System Model, Technology Model, Detail Representation and Functioning Business; the lower rows are labeled [Evaluator] in this adaptation.]
  17. 17. part b: an ontological representation of the digital library evaluation domain
  18. 18. why an ontology?
- Formal models that help us:
  - understand a domain of knowledge; in this case the domain of digital library evaluation
  - structure a knowledge base to collate different instances; in this case instances portraying evaluations of digital libraries
  - infer a logical development; in this case to assist digital library evaluation planning
  19. 19. why an ontology?
- The previous schemas sit vertically within specific research areas. For example, the PRET A Rapporter framework takes an HCI view of things, while 5SQual examines the dimension of quality.
- They define concepts (constituents), either of digital libraries or of the evaluation, but not the relationships between them.
  - The purpose is to use the ontology relationships to highlight the links between the concepts and to semantically strengthen them.
  - It has the potential to express paths, which will reveal alternative or complementary concepts and threads.
  20. 20. ontologies
- We use elements such as:
  - classes (representing concepts, entities, etc.)
  - relationships (linking the concepts together)
  - functions (constraining the relationships in particular ways)
  - axioms (stating true facts)
  - instances (reflecting examples of reality)
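These building blocks can be sketched without any ontology tooling: the short Python fragment below models classes, a relationship with a domain and range, and instances as plain subject-predicate-object triples. All names (the dileo: and ex: prefixes, the performance_measurement and measure individuals) are illustrative stand-ins, not DiLEO's actual URIs.

```python
# Library-free sketch of the ontology building blocks as triples.
# Prefixes and individual names are invented for illustration.
triples = {
    # classes (representing concepts)
    ("dileo:Dimensions", "rdf:type", "owl:Class"),
    ("dileo:Activities", "rdf:type", "owl:Class"),
    # a relationship linking the concepts, with domain and range
    ("dileo:hasConstituent", "rdf:type", "owl:ObjectProperty"),
    ("dileo:hasConstituent", "rdfs:domain", "dileo:Dimensions"),
    ("dileo:hasConstituent", "rdfs:range", "dileo:Activities"),
    # instances reflecting an example of reality
    ("ex:performance_measurement", "rdf:type", "dileo:Dimensions"),
    ("ex:measure", "rdf:type", "dileo:Activities"),
    ("ex:performance_measurement", "dileo:hasConstituent", "ex:measure"),
}

def instances_of(cls):
    """Return every subject typed as the given class."""
    return sorted(s for s, p, o in triples if p == "rdf:type" and o == cls)

print(instances_of("dileo:Dimensions"))  # ['ex:performance_measurement']
```

A real implementation would hold such triples in an OWL file edited with Protégé, as the deck does; the dict-of-tuples form only makes the class/relationship/instance distinction visible.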
  21. 21. engineering process
- DiLEO is the result of an iterative process:
  - literature review and study
    - selecting the proper concepts
    - continuously exploring the proper relationships
  - expression in OWL
  - validation
    - through discussion and practice in the “Exploring perspectives on the evaluation of digital libraries” tutorial at ECDL 2010
    - through a focus group with field researchers
  22. 22. a typical presentation of an evaluation
- Development in OWL with the Protégé Ontology Editor
  - http://protege.stanford.edu/
  23. 23. the higher levels
[Diagram of the upper part of DiLEO:]
- Levels: content level, processing level, engineering level, interface level, individual level, institutional level, social level
- Dimensions: effectiveness, performance measurement, service quality, technical excellence, outcomes assessment
- Dimensions Type: formative, summative, iterative
- Goals: describe, document, design
- plus the classes Research Questions, Subjects, Objects and Characteristics
- Relationships shown: isAffecting/isAffectedBy (Dimensions and Levels), isDecomposedTo (Research Questions), isCharacterizing/isCharacterizedBy, isOperatedBy/isOperating, isFocusingOn, isAimingAt, hasDimensionsType
  24. 24. the lower levels
[Diagram of the lower part of DiLEO:]
- Instruments: devices, scales, software, statistics, narrative items, research artifacts
- Activity: record, measure, analyze, compare, interpret, report, recommend
- Means: comparison studies, expert studies, laboratory studies, field studies, logging studies, surveys
- Means Types: qualitative, quantitative
- Criteria Categories: specific aims, standards, toolkits
- Metrics: content initiated, system initiated, user initiated
- Factors: cost, infrastructure, personnel, time
- plus the classes Findings and Criteria
- Relationships shown: isUsedIn/isUsing, isSupporting/isSupportedBy, isReportedIn/isReporting, hasPerformed/isPerformedIn, hasMeansType, hasSelected/isSelectedIn, isDependingOn, isGrouped/isGrouping, isMeasuredBy/isMeasuring, isSubjectTo
  25. 25. connection of the levels
[Diagram connecting the two levels:]
- Levels: content level, processing level, engineering level, interface level, individual level, institutional level, social level
- Dimensions: effectiveness, performance measurement, service quality, technical excellence, outcomes assessment
- Activity: record, measure, analyze, compare, interpret, report, recommend
- Means: comparison studies, expert studies, laboratory studies, field studies, logging studies, surveys
- Metrics: content initiated, system initiated, user initiated
- plus the classes Subjects, Objects, Research Questions and Findings
- Connecting relationships shown: hasConstituent/isConstituting (Dimensions and Activity), isAppliedTo, isAddressing, hasInitiatedFrom
  26. 26. relationships
- Some of the forty (40) relationships (Relation / Domain / Range):
  - isCitedIn, inverse isCiting: Appellations/study identifier (AP/stid) → Appellations/study reference (AP/strf). Constraints: max cardinality = 1
  - hasDimensionsType: Dimensions (D) → Dimensions Type (DT). Constraints: min cardinality = 1, ∃ (formative ∪ summative ∪ iterative)
  - isAffecting, inverse isAffectedBy: Dimensions (D) → Level (L). Constraints: min cardinality = 1, ∃ (content level ∪ engineering level ∪ processing level ∪ interface level ∪ individual level ∪ institutional level ∪ social level)
  - hasConstituent, inverse isConstituting: Dimensions (D) → Activities (A). Constraints: min cardinality = 1, ∃ (record ∪ measure ∪ analyze ∪ compare ∪ interpret ∪ report ∪ recommend)
  - isSupporting, inverse isSupportedBy: Instruments (I) → Activities (A). Constraints: min cardinality = 1, ∃ (record ∪ measure ∪ analyze ∪ compare ∪ interpret ∪ report ∪ recommend)
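To make the constraint column concrete, here is a minimal, hypothetical validator for one row above: hasDimensionsType must have at least one value, and every value must fall in the union formative ∪ summative ∪ iterative. The function name and the toy inputs are invented for illustration; DiLEO itself expresses these constraints as OWL restrictions, not Python.

```python
# Illustrative check of the hasDimensionsType constraint:
# min cardinality = 1, values drawn from {formative, summative, iterative}.
ALLOWED_TYPES = {"formative", "summative", "iterative"}

def check_dimensions_type(dimension, values):
    """Report whether a Dimensions individual satisfies the constraint."""
    if len(values) < 1:
        return f"{dimension}: missing hasDimensionsType (min cardinality 1)"
    bad = [v for v in values if v not in ALLOWED_TYPES]
    if bad:
        return f"{dimension}: invalid types {bad}"
    return f"{dimension}: ok"

print(check_dimensions_type("performance_measurement", ["formative"]))
# performance_measurement: ok
print(check_dimensions_type("service_quality", []))
# service_quality: missing hasDimensionsType (min cardinality 1)
```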
  27. 27. use of ontology
- We use threads of the ontology, paths, to express explicitly a process or a requirement. For example:
  - Activities/analyze - isPerformedIn - Means/logging studies - hasMeansType - Means Type/quantitative
[Diagram: Activity (record, measure, analyze, compare, interpret, report, recommend) - isPerformedIn - Means (comparison studies, expert studies, laboratory studies, field studies, logging studies, surveys) - hasMeansType - Means Types (qualitative, quantitative)]
  28. 28. use of ontology
- Level/individual level - isAffectedBy - Dimensions/performance measurement - isFocusingOn - Objects/usage of content/usage of data - isOperatedBy - Subjects/human agents - isCharacterizedBy - Characteristics/experience
- ... isCharacterizedBy - Characteristics/discipline
- ... isCharacterizedBy - Characteristics/age
[Diagram: Levels - isAffectedBy - Dimensions - isFocusingOn - Objects (usage of content: usage of data, usage of metadata) - isOperatedBy - Subjects (system agents, human agents) - isCharacterizedBy - Characteristics (age, count, discipline, experience, profession)]
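A thread like the one above is just a chain of relationships followed left to right. The sketch below walks such a chain over a toy graph; the relationship names are DiLEO's, but the graph contents are invented examples, not the real knowledge base.

```python
# Toy graph keyed by (node, relationship); values are the reachable nodes.
graph = {
    ("individual_level", "isAffectedBy"): ["performance_measurement"],
    ("performance_measurement", "isFocusingOn"): ["usage_of_data"],
    ("usage_of_data", "isOperatedBy"): ["human_agents"],
    ("human_agents", "isCharacterizedBy"): ["experience", "discipline", "age"],
}

def follow(start, relations):
    """Walk a chain of relationships, fanning out at each step."""
    frontier = [start]
    for rel in relations:
        frontier = [t for node in frontier for t in graph.get((node, rel), [])]
    return frontier

path = ["isAffectedBy", "isFocusingOn", "isOperatedBy", "isCharacterizedBy"]
print(follow("individual_level", path))
# ['experience', 'discipline', 'age']
```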
  29. 29. instances
- Entry of instances in Protégé.
  30. 30. query examples
- We ask the knowledge base by issuing SPARQL queries.
  - Assume that we want to plan an evaluation with log files.
  - During the evaluation planning we are interested in knowing the research questions of relevant studies.
  - To mine this information from the knowledge base we submit a SPARQL query.
  31. 31. query examples
- The query and the answers will have this form: the answers list the research questions (in the first column) from two studies (wm2008c and nzdl2000) that used log files (in the second column).
- SPARQL query:

      SELECT DISTINCT ?Research_QuestionsInst ?Means
      WHERE {
        ?Research_QuestionsInst a <Research_Questions> .
        ?Dimensions a <Technical_Excellence> .
        ?Activity a <Record> .
        ?Means a <Logs> .
        ?Research_QuestionsInst <isBelongingTo> ?Dimensions .
        ?Dimensions <hasConstituent> ?Activity .
        ?Activity <isPerformedIn> ?Means
      }
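The SPARQL pattern above is a conjunctive query: type constraints plus a three-step join from research questions through dimensions and activities to means. For readers without a triple store at hand, the following library-free Python sketch evaluates the same join over toy triples; the study identifiers wm2008c and nzdl2000 come from the slide, but the triples themselves are invented.

```python
# Invented triples mimicking two studies whose research questions
# were answered through log files.
triples = [
    ("wm2008c:rq1", "isBelongingTo", "wm2008c:technical_excellence"),
    ("wm2008c:technical_excellence", "hasConstituent", "wm2008c:record"),
    ("wm2008c:record", "isPerformedIn", "wm2008c:logs"),
    ("nzdl2000:rq1", "isBelongingTo", "nzdl2000:technical_excellence"),
    ("nzdl2000:technical_excellence", "hasConstituent", "nzdl2000:record"),
    ("nzdl2000:record", "isPerformedIn", "nzdl2000:logs"),
]

def objects(s, p):
    """All objects o such that (s, p, o) is in the graph."""
    return [o for s2, p2, o in triples if s2 == s and p2 == p]

# Join: research question -> dimension -> activity -> means.
results = []
for rq, pred, dim in triples:
    if pred != "isBelongingTo":
        continue
    for act in objects(dim, "hasConstituent"):
        for means in objects(act, "isPerformedIn"):
            results.append((rq, means))

print(results)
# [('wm2008c:rq1', 'wm2008c:logs'), ('nzdl2000:rq1', 'nzdl2000:logs')]
```

A SPARQL engine performs exactly this kind of pattern join, with the type constraints (a <Research_Questions>, a <Logs>, etc.) filtering which individuals may bind to each variable.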
  32. 32. sources
- more on DiLEO:
  - G. Tsakonas & C. Papatheodorou (2011). “An ontological representation of the digital library evaluation domain”. Journal of the American Society for Information Science and Technology 62(8), 1577–1593.
- related readings are located at:
  - http://www.mendeley.com/groups/731821/dileo/
