DiLEO
a domain ontology for the evaluation of digital libraries

dbis/dlib/ionio
structure
- Introduction
- On the evaluation of digital libraries
  - Modeling the evaluation of digital libraries
- An ontological representation of the digital library evaluation
  domain
  - Ontologies
  - DiLEO presentation




in short

- an ontology
  - for comparing instances
  - for the support of digital library evaluation planning




motive
- The development of a schema for the description and comparison of
  evaluation instances.
- The ultimate aim is to counterbalance the disagreement among evaluation
  models through a structured and formal meta-model.
    "the lack of globally accepted abstract evaluation models and
    methodologies can be counterbalanced by collecting, publishing and
    analyzing current research activities" [Fuhr et al., 2007]
- At the same time, to develop a digital library evaluation planning
  tool.
    "Every evaluation should begin with a well-crafted plan. Writing an
    evaluation plan is not just common sense. It is an essential roadmap
    to successful evaluation!" [Reeves et al., 2003]
part a
on the evaluation of digital libraries
modeling evaluation

- We do not refer to digital library evaluation models, but to the
  modeling of the evaluation process itself.
- Five main works:
  - Tefko Saracevic's classification [2004]
  - the Evaluation Computer [Kovacs & Micsik, 2004]
  - the PRET A Rapporter framework [Blandford et al., 2007]
  - the 5SQual model [Gonçalves et al., 2007]
  - the Zachman Framework
Saracevic’s Classification
- A classification of evaluation studies according to:
  - what elements were evaluated (Constructs)
  - why the evaluation was done: its goals, perspectives and so on (Context)
  - which criteria were applied (Criteria)
  - how the evaluation was conducted (Methodology)
Saracevic’s Classification
- Divided evaluation studies into those proposing evaluation models
  and those reporting the results of evaluation initiatives.
- Saracevic formed the concept of Context to encapsulate all the
  high-level questions, such as why one evaluates, what the target is,
  etc.
- At the same time he developed one category to classify studies
  according to what was evaluated (Constructs) and two categories
  (Criteria, Methodology) to classify studies according to how they
  were conducted.
the evaluation computer
- A faceted classification of different views, which together
  synthesize an instance of an evaluation, an 'evaluation atom'.
- A calculation of the distance between two 'atoms' in that space.
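
The distance idea can be made concrete with a small sketch; the facet
encoding and the unweighted metric below are illustrative assumptions,
not the formula published by Kovacs & Micsik:

  # Illustrative sketch: an 'evaluation atom' as an assignment of values
  # to the five facets, with distance counted as the number of facets on
  # which two atoms disagree (a simple Hamming-style metric; the original
  # work may weight or scale facets differently).
  FACETS = ("system", "content", "organization", "user", "evaluation")

  def atom_distance(atom_a: dict, atom_b: dict) -> int:
      """Count the facets on which two evaluation atoms differ."""
      return sum(1 for f in FACETS if atom_a.get(f) != atom_b.get(f))

  a = {"system": "OPAC", "user": "students", "evaluation": "usability"}
  b = {"system": "OPAC", "user": "faculty", "evaluation": "usability"}
  print(atom_distance(a, b))  # -> 1 (they differ only on the user facet)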
the evaluation computer
- Five facets:
  - System
  - Content
  - Organization
  - User
  - Evaluation




PRET A Rapporter
- An evaluation framework that emphasizes the context of work.
- According to the authors, the framework offers features that assist
  the planning of an evaluation.
- The framework structures evaluations according to:
  - the purpose of the evaluation
  - the resources and the constraints
  - the ethical considerations
  - the data gathering
  - the analysis of the data
  - the reporting of findings
PRET A Rapporter
- PRET A Rapporter proceeds case-study-wise. In practice this means it
  focuses on particular dimensions in each of the three indicative
  studies it presents:
  - a formative evaluation of a system
  - a comparative evaluation of two interfaces of the same database
  - a qualitative study of a system in actual use
5SQual
- The model is based on 5S, the well-known framework for the
  description of digital libraries (Streams, Structures, Spaces,
  Scenarios, & Societies).
- The model defines a set of dimensions (criteria) that correspond to
  the constituent elements of digital libraries.
- The authors refer to a series of studies in which these criteria are
  applied to digital libraries such as ACM DL, CITIDEL and NDLTD.
Digital library concept   Quality dimension   5S Concepts
Digital object            Accessibility       Societies (actor), Structures (metadata specification), Streams + Structures (structured streams)
                          Pertinence          Societies (actor), Scenarios (task)
                          Preservability      Streams, Structures (structural metadata), Scenarios (process, e.g., migration)
                          Relevance           Streams + Structures (structured streams), Structures (query), Spaces (Metric, Probabilistic, Vector)
                          Similarity          Same as in relevance, Structures (citation/link patterns)
                          Significance        Structures (citation/link patterns)
                          Timeliness          Streams (time), Structures (citation/link patterns)
Metadata specification    Accuracy            Structure (properties, values)
                          Completeness        Structure (properties, schema)
                          Conformance         Structure (properties, schema)
Collection                Completeness        Structure (collection)
Catalog                   Completeness        Structure (collection)
                          Consistency         Structure (collection)
Repository                Completeness        Structure (collection)
                          Consistency         Structure (catalog, collection)
Services                  Composability       See Extensibility, Reusability
                          Efficiency          Streams (time), Spaces (operations, constraints)
                          Effectiveness       See Pertinence, Relevance
                          Extensibility       Societies + Scenarios (extends, inherits_from, redefines)
                          Reusability         Societies + Scenarios (includes, reuses)
                          Reliability         Societies + Scenarios (uses, executes, invokes)
the Zachman framework

- The Zachman Framework is a framework for enterprise architecture,
  developed by John Zachman at IBM in the early 1980s.
- The framework reflects a formal, high-level structured view of an
  organization: a taxonomy that organizes the structural elements of
  the organization under the lens of specific perspectives.
- It classifies and organizes on a two-dimensional grid all the
  concepts needed to express the different planning perspectives in a
  homogeneous way:
  - by participant (alternative perspectives)
  - by process (questions)
the Zachman framework
Columns (questions): What (Data), How (Process), Where (Location),
Who (Worker), When (Timing), Why (Motivation)

- Scope [Planner]: Core Business Concepts; Major Business
  Transformations; Business Locations; Principal Actors; Business
  Events; Mission & Goals
- Business Model [Owner]: Fact Model; Tasks; Business Connectivity
  Map; Workflow Models; Business Milestones; Policy Charter
- System Model [Evaluator]: Data Model; Behavior Allocation; Platform
  & Communications Map; BRScripts; State Transition Diagrams; Rule Book
- Technology Model [Evaluator]: Relational Database Design; Program
  Specifications; Technical Platform & Communications Design;
  Procedure & Interface Specifications; Work Queue & Scheduling
  Designs; Rule Specifications
- Detail Representation [Evaluator]: Database Schema; Source Code;
  Network; Procedures & Interfaces; Work Queues & Schedules; Rule Base
- Functioning Business [Evaluator]: Operational Database; Operational
  Object Code; Operational Network; Operational Procedures &
  Interfaces; Operational Work Queues & Schedules; Operational Rules
part b
an ontological representation of the digital
library evaluation domain
why an ontology?
- Formal models that help us:
  - understand a domain of knowledge; in this case, the domain of
    digital library evaluation
  - structure a knowledge base to collate different instances; in this
    case, instances portraying evaluations of digital libraries
  - infer a logical development; in this case, to assist digital
    library evaluation planning
why an ontology?
- The previous schemas sit vertically within specific research areas.
  For example, the PRET A Rapporter framework takes an HCI view of
  things, while 5SQual examines the dimension of quality.
- They define concepts (constituents) of either digital libraries or
  evaluation, but not the relationships between them.
  - The purpose is to use the ontology's relationships to highlight
    the links between the concepts and to strengthen them semantically.
  - It has the potential to express paths, which reveal alternative or
    complementary concepts and threads.
ontologies

- We use elements such as:
  - classes (representing concepts, entities, etc.)
  - relationships (linking the concepts together)
  - functions (constraining the relationships in particular ways)
  - axioms (stating true facts)
  - instances (reflecting examples of reality)
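
As a concrete illustration of these elements, a minimal sketch in
Python with rdflib; the namespace URI and the term names are
assumptions for the example, not DiLEO's actual vocabulary:

  # Minimal sketch of ontology building blocks using rdflib.
  # The namespace and names below are illustrative only.
  from rdflib import Graph, Namespace, RDF, RDFS, OWL

  EX = Namespace("http://example.org/eval#")
  g = Graph()

  g.add((EX.Dimensions, RDF.type, OWL.Class))                  # a class (a concept)
  g.add((EX.hasDimensionsType, RDF.type, OWL.ObjectProperty))  # a relationship
  g.add((EX.hasDimensionsType, RDFS.domain, EX.Dimensions))    # linking concepts
  g.add((EX.effectiveness, RDF.type, EX.Dimensions))           # an instance

  print(g.serialize(format="turtle"))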
engineering process
- DiLEO is the result of a multi-step process:
  - literature review and study
    - selecting the proper concepts
    - continuously exploring the proper relationships
  - expression in OWL
  - validation
    - through discussion and practice in the "Exploring perspectives
      on the evaluation of digital libraries" tutorial at ECDL 2010
    - through a focus group with field researchers
a typical presentation of an evaluation
- Development in OWL with Protégé Ontology Editor
  - http://protege.stanford.edu/




the higher levels

Classes (with indicative subclasses/instances):
- Levels: content level, processing level, engineering level,
  interface level, individual level, institutional level, social level
- Dimensions: effectiveness, performance measurement, service quality,
  technical excellence, outcomes assessment
- Dimensions Type: formative, summative, iterative
- Goals: describe, document, design
- Research Questions, Subjects, Objects, Characteristics

Relationships (as drawn in the diagram):
- Dimensions isAffecting / isAffectedBy Levels
- Dimensions isDecomposedTo Research Questions
- Dimensions isAimingAt Goals
- Dimensions isFocusingOn Objects
- Dimensions hasDimensionsType Dimensions Type
- Subjects isOperating Objects / Objects isOperatedBy Subjects
- Characteristics isCharacterizing / isCharacterizedBy both Subjects
  and Objects
the lower levels

Classes (with indicative subclasses/instances):
- Instruments: devices, scales, software, statistics, narrative items,
  research artifacts
- Findings
- Activity: record, measure, analyze, compare, interpret, report,
  recommend
- Means: comparison studies, expert studies, laboratory studies, field
  studies, logging studies, surveys
- Means Types: qualitative, quantitative
- Criteria: specific aims, standards, toolkits
- Criteria Categories
- Metrics: content initiated, system initiated, user initiated
- Factors: cost, infrastructure, personnel, time

Relationships (as drawn in the diagram):
- Instruments isSupporting / isSupportedBy Activity
- Instruments isUsedIn / isUsing Means
- Findings isReportedIn / isReporting Activity
- Activity hasPerformed / isPerformedIn Means
- Means hasMeansType Means Types
- Activity hasSelected / isSelectedIn Criteria
- Criteria Categories isGrouping / isGrouped Criteria
- Criteria isMeasuredBy / isMeasuring Metrics
- Means isDependingOn Factors
- Metrics isSubjectTo Factors
connection of the levels

Classes shown: Levels, Dimensions, Subjects, Objects, Research
Questions, Activity, Means, Findings, Metrics (with the same
indicative instances as before).

Relationships connecting the two levels (as drawn in the diagram):
- Dimensions hasConstituent / isConstituting Activity
- Means isAppliedTo Subjects and Objects
- Findings isAddressing Research Questions
- Findings hasInitiatedFrom Metrics
relationships

- Some of the forty (40) relationships:

Relation                                   Domain                                     Range
isCitedIn / inverse: isCiting              Appellations/study identifier (AP/stid)    Appellations/study reference (AP/strf)
  Constraints: max cardinality = 1
hasDimensionsType                          Dimensions (D)                             Dimensions Type (DT)
  Constraints: min cardinality = 1, ∃ (formative ∪ summative ∪ iterative)
isAffecting / inverse: isAffectedBy        Dimensions (D)                             Levels (L)
  Constraints: min cardinality = 1, ∃ (content level ∪ engineering level ∪ processing level ∪ interface level ∪
  individual level ∪ institutional level ∪ social level)
hasConstituent / inverse: isConstituting   Dimensions (D)                             Activities (A)
  Constraints: min cardinality = 1, ∃ (record ∪ measure ∪ analyze ∪ compare ∪ interpret ∪ report ∪ recommend)
isSupporting / inverse: isSupportedBy      Instruments (I)                            Activities (A)
  Constraints: min cardinality = 1, ∃ (record ∪ measure ∪ analyze ∪ compare ∪ interpret ∪ report ∪ recommend)
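
A sketch of how one constrained relationship, hasDimensionsType with
min cardinality 1, could be encoded as OWL triples with rdflib; the
namespace URI is an assumption, and DiLEO's published OWL file may
encode the constraint differently:

  # Sketch: Dimensions as a subclass of a restriction requiring at least
  # one hasDimensionsType value (the 'min cardinality = 1' constraint).
  from rdflib import Graph, Namespace, BNode, Literal, RDF, RDFS, OWL, XSD

  DILEO = Namespace("http://example.org/dileo#")  # assumed URI
  g = Graph()

  g.add((DILEO.hasDimensionsType, RDF.type, OWL.ObjectProperty))
  g.add((DILEO.hasDimensionsType, RDFS.domain, DILEO.Dimensions))
  g.add((DILEO.hasDimensionsType, RDFS.range, DILEO.DimensionsType))

  restriction = BNode()
  g.add((restriction, RDF.type, OWL.Restriction))
  g.add((restriction, OWL.onProperty, DILEO.hasDimensionsType))
  g.add((restriction, OWL.minCardinality,
         Literal(1, datatype=XSD.nonNegativeInteger)))
  g.add((DILEO.Dimensions, RDFS.subClassOf, restriction))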
use of ontology

- We use threads of the ontology, called paths, to express explicitly
  a process or a requirement. For example:
- Activities/analyze - isPerformedIn - Means/logging studies -
  hasMeansType - Means Types/quantitative

  Activity (record, measure, analyze, compare, interpret, report, recommend)
    -> isPerformedIn -> Means (comparison studies, expert studies,
       laboratory studies, field studies, logging studies, surveys)
    -> hasMeansType -> Means Types (qualitative, quantitative)
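
Such a path can also be read back from the knowledge base as a SPARQL
property path; a sketch with rdflib, where the instance file name and
the URIs are assumptions:

  # Sketch: follow analyze -> isPerformedIn -> hasMeansType in one query
  # using a SPARQL property path. URIs and file name are illustrative.
  from rdflib import Graph

  g = Graph()
  # g.parse("dileo_instances.owl")  # hypothetical knowledge base file

  q = """
  PREFIX dileo: <http://example.org/dileo#>
  SELECT ?meansType WHERE {
    dileo:analyze dileo:isPerformedIn/dileo:hasMeansType ?meansType .
  }
  """
  for row in g.query(q):
      print(row.meansType)  # expected: dileo:quantitative for logging studies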
use of ontology
- Levels/individual level - isAffectedBy - Dimensions/performance
  measurement - isFocusingOn - Objects/usage of content/usage of data -
  isOperatedBy - Subjects/human agents - isCharacterizedBy -
  Characteristics/experience
- ... isCharacterizedBy - Characteristics/discipline
- ... isCharacterizedBy - Characteristics/age

  Levels (content level, processing level, engineering level, interface
  level, individual level, institutional level, social level)
    -> isAffectedBy -> Dimensions (effectiveness, performance measurement,
       service quality, technical excellence, outcomes assessment)
    -> isFocusingOn -> Objects (usage of content: usage of data, usage of
       metadata)
    -> isOperatedBy -> Subjects (system agents, human agents)
    -> isCharacterizedBy -> Characteristics (age, count, discipline,
       experience, profession, ...)
instances

- Entry of instances in Protégé.




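
Instance entry can also be scripted outside Protégé; a minimal sketch
with rdflib, using the study identifier nzdl2000 from the next slide
and assumed class and property URIs:

  # Sketch: asserting one evaluation instance and its links in code.
  # The class and property URIs are illustrative assumptions.
  from rdflib import Graph, Namespace, RDF

  DILEO = Namespace("http://example.org/dileo#")
  g = Graph()

  g.add((DILEO.nzdl2000_rq1, RDF.type, DILEO.Research_Questions))
  g.add((DILEO.nzdl2000_rq1, DILEO.isBelongingTo, DILEO.technical_excellence))
  g.add((DILEO.technical_excellence, DILEO.hasConstituent, DILEO.record))
  g.add((DILEO.record, DILEO.isPerformedIn, DILEO.logs))

  g.serialize("instances.ttl", format="turtle")  # persist for later querying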
query examples
- We query the knowledge base by issuing SPARQL queries.
  - Suppose we want to plan an evaluation that uses log files.
  - During evaluation planning we are interested in knowing what the
    research questions of relevant studies were.
  - To mine this information from the knowledge base we submit a
    SPARQL query.
query examples

- the query and the answers will have this form:

  SPARQL query:

    SELECT DISTINCT ?Research_QuestionsInst ?Means
    WHERE {
      ?Research_QuestionsInst a <Research_Questions> .
      ?Dimensions a <Technical_Excellence> .
      ?Activity a <Record> .
      ?Means a <Logs> .
      ?Research_QuestionsInst <isBelongingTo> ?Dimensions .
      ?Dimensions <hasConstituent> ?Activity .
      ?Activity <isPerformedIn> ?Means
    }

  Answers: the research questions (in the first column) from two
  studies (wm2008c and nzdl2000) that used log files (in the second
  column).
sources
- more on DiLEO:
   -   G. Tsakonas & C. Papatheodorou (2011). "An ontological representation of
       the digital library evaluation domain". Journal of the American Society
       for Information Science and Technology 62(8), 1577-1593.
- related readings are located at:
   -   http://www.mendeley.com/groups/731821/dileo/
