Metadata quality in digital repositories

Presentation of a part of my PhD work so far, in Alcala de Henares

Transcript of "Metadata quality in digital repositories"

  1. Metadata Quality Issues in Learning Object Repositories
     PhD Candidate: Nikos Palavitsinis
     PhD Supervisors: Ass. Prof. Salvador Sanchez-Alonso, Dr. Nikos Manouselis
  2. Structure
     - Introduction
     - Digital Repositories & Federations
     - Metadata & Education
     - Quality & Metadata
     - Metadata Quality Assessment Certification Process
     - PhD Work/Research
     - Timetable
     - Next Steps
  3. Introduction
  4. Problem
     - Generic Problem: low quality metadata in digital repositories that affects resource discovery
     - Specific Problem: how might we insert quality assurance mechanisms into the digital repository lifecycle, to enhance metadata quality?
  5. Background
     - Relevant studies that look into quality issues:
       - Study based on the Open Language Archives Community (Hughes, 2004)
       - Studies based on the National Science Digital Repository (Zeng et al., 2005; Bui & Ran Park, 2006)
       - Studies based on ARIADNE Federation repositories (Najjar et al., 2004; Ochoa et al., 2011)
  6. Aim of Digital Repositories
     - Databases used for storing and/or enabling the interoperability of Learning Objects (McGreal, 2007)
     - Enable the efficient search & discovery of objects (Richards et al., 2002)
     - How can digital repositories fulfill their goals if the quality of the metadata provided is poor? Is it that poor?
  7. ARIADNE case: 21 elements <50%, 21 elements >50% [chart]
  8. ARIADNE case: 14 elements <50%, 12 elements >50% [chart]
  9. Metadata
     - Metadata is structured information that describes, explains, locates, or otherwise makes it easier to retrieve, use, or manage an information resource
     - ...vital component of the learning object economy (Currier et al., 2004)
  10. Metadata in Education
     - In the field of Technology-Enhanced Learning, the need for describing resources with information that extends the scope of regular metadata was identified early (Recker & Wiley, 2001)
     - The most commonly used metadata schemas in education are IEEE LOM & Dublin Core
     - For users of Educational Repositories, problems in metadata result in poor recall of resources and inconsistent search results (Currier et al., 2004)
  11. Quality
     - Level of excellence; a property or attribute that differentiates a thing or person
     - Quality is the suitability of procedures, processes and systems in relation to the strategic objectives
     - Metadata are of high importance to the success of Learning Object Repositories (LORs) (Heery & Anderson, 2005; Guy et al., 2004; Robertson, 2005)
  12. Quality in Metadata
     - Poor quality metadata can mean that a resource is essentially invisible within a repository or archive and remains unused (Barton et al., 2003)
     - Different settings and purposes require different approaches to what represents quality in metadata (Robertson, 2005)
     - Quality cannot be discussed in a vacuum (Bruce & Hillman, 2004)
  13. Metadata Creators
     - In some cases, subject matter experts have proven to be better at metadata creation than information specialists (Greenberg et al., 2001; Park, 2009)
     - Neither resource creators nor information specialists handle pedagogic aspects of metadata well (Barton et al., 2003)
     - Importance of having only trained professionals provide metadata (Holden, 2003)
  14. Metadata experts VS Domain experts [cartoon with speech bubbles]
     - Metadata expert: "I have studied information management", "I know how to create & manage data sources", "I have been involved in EU projects for digital libraries"
     - Domain expert: "I have a PhD in education", "I know how to create educational resources", "I have worked with teachers for over 20 years"
     - In the middle: "I think I can use the expertise of both..."
  15. Metadata Creation
     - Metadata today is likely to be created by people without metadata training, working largely in isolation and without adequate documentation
     - Metadata records are also created automatically, often with poorly documented methodology and little or no indication of provenance
     - Unsurprisingly, the metadata resulting from these processes varies strikingly in quality and often does not play well together (Hillman et al., 2004)
  16. Metadata Quality Metrics (1/2)
     - Completeness: number of element values provided by the annotator, compared to the total possible number of values
     - Accuracy: metadata descriptions correspond to the actual resource they describe
     - Consistency: degree of conformance of the metadata provided to the rules of the metadata application profile used
  17. Metadata Quality Metrics (2/2)
     - Objectiveness: degree to which the metadata provided describe the resource in an unbiased way
     - Appropriateness: fitness of use of the metadata provided when considered in terms of the envisaged services of the environment/tool deployed
     - Correctness: usage of the language in the metadata, syntactically and/or grammatically
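Of these metrics, completeness is the one the later slides measure directly from usage data. A minimal sketch of how it can be computed for a single record, assuming a hypothetical dictionary-based record layout and element list (neither is from the slides):

```python
# Illustrative sketch of the completeness metric: the share of
# application-profile elements for which the annotator provided a value.
# Record structure and element names are hypothetical examples.

def completeness(record: dict, profile_elements: list) -> float:
    """Fraction of profile elements that have a non-empty value."""
    filled = sum(1 for e in profile_elements
                 if record.get(e) not in (None, "", []))
    return filled / len(profile_elements)

profile = ["general.title", "general.description",
           "general.language", "rights.cost"]
record = {"general.title": "Organic farming basics",
          "general.language": "en"}

print(completeness(record, profile))  # 2 of 4 elements filled -> 0.5
```

The other metrics (accuracy, objectiveness, correctness) are judgment-based and are collected from human reviewers in the later phases rather than computed.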
  18. Back to the problem
     - How might we insert quality assurance mechanisms into the digital repository lifecycle, to enhance metadata quality?
     - A solution that capitalizes more on the human factor but also on automated methods of examining metadata quality
  19. Proposed Method
  20. Metadata Quality Assessment Certification Process
  21. Structure
     - Phases: different "periods" in the repository lifecycle
     - Steps: specific metadata processes taking place in each phase
     - Quality Assurance Methods: "control points" inserted in the repository lifecycle, to enhance metadata quality
     - Quality Tools/Instruments: tools that are used to deploy the Quality Assurance Methods
     - Actors: people that are involved in the repository lifecycle with various roles
     - Outcomes: results of each Quality Assurance Method used in different Steps
  22. Metadata Design Phase
     - Description: metadata specification / application profiling of an existing metadata schema that will be used in a specific context
     - Quality Assurance Methods: Metadata Understanding Session; Preliminary Metadata Hands-on Annotation
     - Actors: subject-matter experts & metadata experts
     - Outcomes: initial input for metadata specification; paper-based metadata records
  23. Testing Phase
     - Description: the envisaged system/tool is implemented & the users are working with the first implementation of the metadata standard
     - Quality Assurance Methods: test implementation of the tool; hands-on annotation experiment; Metadata Quality Review of a test sample of resources
     - Actors: subject-matter experts & metadata experts
     - Outcomes: Good & Bad Metadata Practices Guide; feedback for the development of the system/tool
  24. Calibration Phase
     - Description: the envisaged system/tool is deployed in a controlled environment and the subject matter experts continuously upload resources to it
     - Quality Assurance Methods: Metadata Quality Peer Review Exercise
     - Actors: subject-matter experts & metadata experts
     - Outcomes: Good & Bad Metadata Practices Guide updated; recommendations for metadata improvement; peer review results related to the quality of metadata for the resources examined
  25. Building Critical Mass Phase
     - Description: tools have reached a high-maturity phase and the metadata application profile has been finalized; the repository accepts a large number of resources
     - Quality Assurance Methods: analysis of usage data coming from the tool(s); Metadata Quality Certification Mark
     - Actors: metadata experts
     - Outcomes: minor changes to the application profile; recommendations for metadata improvement
  26. Regular Operation Phase
     - Description: metadata used in the tool(s) are finalized and content providers upload resources regularly; this period lasts for as long as the deployed services are online
     - Quality Assurance Methods: regular analysis of usage data coming from the tool(s); Online Peer Review Mechanism; quality prizes/awards for selected resources
     - Actors: metadata experts & content users/consumers
     - Outcomes: recommendations for metadata improvement
  27. Case Study
  28. Case Study
     - Metadata Quality Assessment Certification Process applied to the Organic.Edunet Federation of Learning Repositories
     - Each respective Phase is presented focusing on its application in the Organic.Edunet case
  29. Metadata Design Phase
     - Metadata Understanding Session: a form that assesses each element's ease of understanding, usefulness and appropriateness for the application domain; also asks whether each element should be mandatory, recommended or optional
     - Duration: 2 hours | Annotated Objects: 0 | Actors involved: 20 metadata & subject-matter experts
  30. Metadata Design Phase [figure]
  31. Metadata Design Phase
     - Preliminary Hands-on Annotation: subject matter experts annotate a sample of their resources using the suggested metadata application profile
     - Session organized with the participation of all content providers, with supervised annotation of resources
  32. Metadata Design Phase [figure]
  33. Results
     Agreement per question (Totally Disagree / Disagree / Neutral / Agree / Totally Agree):
     - Is the element easy for you to understand? 0% / 4% / 21% / 42% / 33%
     - Is this element useful for describing Organic.Edunet content resources? 0% / 12% / 33% / 41% / 14%
     - Is the selection of the element's possible values clear and appropriate? 0% / 4% / 37% / 50% / 9%
     Best rated elements per question:
     - Easy to understand (9.2 / 10): General.Keyword, Technical.Format, Technical.Size
     - Useful for describing Organic.Edunet content (8.8 / 10): General.Identifier, General.Description, Technical.Format
     - Values clear and appropriate (8.1 / 10): General.Description, Rights.Cost, Format.Size
  34. Results
     Worst rated elements per question:
     - Easy to understand (3.1 to 4.8 / 10): Classification.Taxon, Relation.Resource, Educational.SemanticDensity
     - Useful for describing Organic.Edunet content (2.3 to 3.1 / 10): Classification.Taxon, Annotation.Entity, Annotation.Date
     - Values clear and appropriate (2.9 to 4 / 10): Classification.Taxon, Classification.Purpose, General.Identifier
     Should this element be mandatory, recommended or optional? (before / after):
     - Mandatory: 19 / 25 (+31%)
     - Recommended: 26 / 21 (-19%)
     - Optional: 12 / 11 (-8.3%)
  35. Testing Phase
     - Hands-on annotation experiment: core metadata quality criteria; related more to information management practices and less to the content itself; issues that are not connected to the domain of use of the resources
     - Duration: 1 week | Annotated Objects: 500 (5%) | Actors involved: 4 metadata experts | Resources Reviewed: 15 per metadata expert (60)
  36. Results [figure]
  37. Results (sample reviewer feedback)
     - Title: "Please use a more comprehensive title. For example the CRC acronym can be refined as Cooperative Research Centre, just to provide the user with a way to understand what this learning resource is about."
     - Keyword: "More keywords needed. Just one keyword is not enough, and even so, the keyword text here is misleading. These keywords should be provided separately as 'turkey' and 'poultry' along with some others, and not as one 'turkey poultry'."
     - Typical Age Range: "...why is it that simple pictures of pigs in the snow with no scientific details on them cannot be used for children that are less than 10 years old? Couldn't these pictures be used in the context of a primary class?"
     - Context: "Since the age range is from 15 years old to undefined, it only makes sense that the Educational context cannot be limited to higher education but should also consider high school. Be very careful, because in this sense these two elements should not conflict."
  38. Calibration Phase
     - Metadata Quality Peer Review Exercise: peer reviewing metadata records using a pre-defined quality grid assessing metadata quality metrics (completeness, accuracy, correctness of language, etc.), based on Bruce & Hillman's model
     - Duration: 3 weeks | Annotated Objects: 1,000 (10%) | Actors involved: 20 subject matter experts | Resources Reviewed: 105 (5 per expert)
  39. Calibration Phase [figure]
  40. Results: number of records per score for each review question:
     Review questions: (1) In which degree is this metadata record completed? (2) Overall accuracy of the metadata provided; (3) Values provided in an objective way? (4) Describes the resource appropriately for use in the Portal? (5) Values provided consistent with the metadata standard; (6) Degree of correctness of the language used; (7) Overall score for the metadata of this resource
     Score     (1)  (2)  (3)  (4)  (5)  (6)  (7)
     5         42   54   53   72   43   72   42
     4         47   34   29   22   35   22   39
     3         5    10   16   6    19   9    20
     2         9    3    1    2    5    0    0
     1         1    1    0    0    2    1    1
     no score  1    3    6    3    1    1    3
  41. Building Critical Mass Phase
     - Analysis of usage data coming from the tool(s): expecting to verify findings from the experiment in the "Metadata Design" Phase (necessary elements being used more; elements with easy-to-understand values being used correctly, etc.)
     - Beginning of the intensive content population
     - Duration: 1 week | Annotated Objects: 6,600 (60%) | Actors involved: 2 metadata experts | Resources Analyzed: 6,600
  42. Building Critical Mass Phase
     - "1" shows that an element is completed, whereas "0" shows the opposite
     - In the case of elements with multiplicity >1, values can be "2", "3", etc.
     - Interesting to look at the case of keywords, classification terms and/or educational elements
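The counting scheme above (0 = empty, 1 = filled, >1 for repeatable elements) can be sketched as follows; the record layout and element names are illustrative, not the actual annotation-tool export format:

```python
# Sketch of the per-element counting scheme described on the slide:
# each element gets the number of values present, so repeatable
# elements such as keywords can score 2, 3, ...

def element_counts(record: dict) -> dict:
    """Map each element to its number of values (0 if empty)."""
    counts = {}
    for element, value in record.items():
        if value in (None, ""):
            counts[element] = 0
        elif isinstance(value, list):
            counts[element] = len(value)  # multiplicity > 1 possible
        else:
            counts[element] = 1
    return counts

record = {
    "general.title": "Pigs in the snow",
    "general.keyword": ["pigs", "livestock", "winter"],
    "educational.context": "",
}
print(element_counts(record))
# {'general.title': 1, 'general.keyword': 3, 'educational.context': 0}
```

Summing the non-zero entries across all records gives the "Records / % filled" figures in the next slide.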
  43. Results: top 10 elements by number of records filled
     No | Element name                                      | Records | % filled
     1  | General Title – 1.2                               | 6639    | 99.8%
     2  | General Description – 1.4                         | 6307    | 94.8%
     3  | General Language – 1.3                            | 6248    | 93.9%
     4  | Rights Cost Copyright & Other Restrictions – 6.2  | 1066    | 16.0%
     5  | Rights Cost – 6.1                                 | 1043    | 15.7%
     6  | Educational Learning Resource Type – 5.2          | 895     | 13.5%
     7  | Educational Intended End User Role – 5.5          | 853     | 12.8%
     8  | General.Keyword – 1.5                             | 850     | 12.8%
     9  | Classification.Taxon Path.TaxonID – 9.2.2.1       | 785     | 11.8%
     10 | Lifecycle.Contribute.Role – 2.3.1                 | 763     | 11.5%
  44. Results [bar charts of % filled per element: one chart for the top elements (General Title 99.8%, General Description 94.8%, General Language 93.9%, Rights Cost Copyright And Other Restrictions 16.0%, Rights Cost 15.7%) and one for the remaining elements (Educational Learning Resource Type, Educational Intended End User Role, General Keyword, Classification TaxonPathTaxonId, LifeCycle Contribute Role/Entity/Date, Educational Context, Educational Typical Age Range, Rights Description, General Structure), ranging from roughly 13.5% down to 3.8%]
  45. Compare & Contrast
     Best rated in the Metadata Design questionnaire:
     - Easy to understand (9.2 / 10): General.Keyword, Technical.Format, Technical.Size
     - Values clear and appropriate (8.1 / 10): General.Description, Rights.Cost, Format.Size
     Usage data for related elements:
     - Rights Cost – 6.1: 1043 records (15.7%)
     - Educational Learning Resource Type – 5.2: 895 (13.5%)
     - Educational Intended End User Role – 5.5: 853 (12.8%)
     - General.Keyword – 1.5: 850 (12.8%)
     - Classification.Taxon Path.TaxonID – 9.2.2.1: 785 (11.8%)
     - Lifecycle.Contribute.Role – 2.3.1: 763 (11.5%)
  46. Building Critical Mass Phase
     - Metadata Quality Certification Mark: introduced the concept of a "Quality Seal" for each metadata record that a content provider uploads to the Organic.Edunet Federation
     - Recorded in the meta.metadata element
  47. Regular Operation Phase
     - Regular analysis of usage data coming from the tool(s): any improvement to the quality of the metadata? Measuring completeness only; analysis conducted in October 2010
     - Duration: 1 week | Annotated Objects: 11,000 (100%) | Actors involved: 2 metadata experts | Resources Analyzed: 11,000
  48. Results
     Mandatory elements                               | Critical Mass   | Regular Operation | Diff.
     1.2 General Title                                | 6,639 (99.8%)   | 10,741 (98.7%)    | -1.1%
     1.3 General Language                             | 6,248 (93.9%)   | 10,188 (93.6%)    | -0.3%
     1.4 General Description                          | 6,307 (94.8%)   | 10,745 (98.6%)    | +3.8%
     6.1 Rights Cost                                  | 1,043 (15.7%)   | 8,681 (79.7%)     | +64.0%
     6.2 Rights Cost Copyright & Other Restrictions   | 1,066 (16.0%)   | 10,720 (98.4%)    | +82.4%
  49. Results
     Recommended elements                             | Critical Mass   | Regular Operation | Diff.
     1.5 General Keyword                              | 850 (12.8%)     | 9,314 (90.9%)     | +78.1%
     1.7 General Structure                            | 523 (7.9%)      | 8,722 (80.1%)     | +72.2%
     2.3.1 LifeCycle Contribute Role                  | 763 (11.5%)     | 8,167 (75%)       | +63.5%
     2.3.2 LifeCycle Contribute Entity                | 578 (8.7%)      | 8,244 (75.8%)     | +67.1%
     2.3.3 LifeCycle Contribute Date                  | 687 (10.3%)     | 6,842 (62.8%)     | +52.5%
     5.5 Educational Intended End User Role           | 853 (12.8%)     | 8,589 (78.9%)     | +66.1%
     5.6 Educational Context                          | 678 (10.2%)     | 6,278 (57.6%)     | +47.4%
     5.7 Educational Typical Age Range                | 252 (3.8%)      | 6,700 (61.5%)     | +57.7%
     6.3 Rights Description                           | 511 (7.7%)      | 9,865 (90.6%)     | +82.9%
  50. Results
     Optional elements                                | Critical Mass   | Regular Operation | Diff.
     1.6 General Coverage                             | 10 (0.2%)       | 8,730 (80.1%)     | +79.9%
     2.2 LifeCycle Status                             | 22 (0.3%)       | 4,284 (39.3%)     | +39%
     5.1 Educational Interactivity Type               | 22 (0.3%)       | 3,907 (35.9%)     | +35.6%
     5.3 Educational Interactivity Level              | 22 (0.3%)       | 3,931 (36.1%)     | +35.8%
     5.4 Educational Semantic Density                 | 14 (0.2%)       | 3,931 (36.1%)     | +35.9%
     5.8 Educational Difficulty                       | 9 (0.1%)        | 3,947 (36.2%)     | +36.1%
     5.10 Educational Description                     | 102 (1.5%)      | 1,603 (14.7%)     | +13.2%
     5.11 Educational Language                        | 22 (0.3%)       | 5,577 (51.2%)     | +50.9%
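The "Diff." column in these tables is a change in percentage points between the two analyses. A minimal sketch of the computation, using illustrative record counts rather than the slides' underlying data:

```python
# Sketch of how the phase-to-phase "Diff." column can be derived:
# completeness percentage per element and phase, compared in
# percentage points. Counts below are hypothetical examples.

def pct_filled(filled: int, total: int) -> float:
    """Share of records (in %) in which an element has a value."""
    return round(100.0 * filled / total, 1)

def diff_points(before_pct: float, after_pct: float) -> float:
    """Change between phases, in percentage points."""
    return round(after_pct - before_pct, 1)

# Hypothetical element: filled in 1043 of 6600 records at the
# Building Critical Mass phase, and in 8681 of 11000 records at
# the Regular Operation phase.
before = pct_filled(1043, 6600)    # 15.8
after = pct_filled(8681, 11000)    # 78.9
print(diff_points(before, after))  # 63.1
```

Note that a small negative diff (as for General Title) can simply reflect the larger, more heterogeneous record base at Regular Operation, not a drop in annotation practice.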
  51. Regular Operation Phase
     - Online Peer Review Mechanism: deployed on the Organic.Edunet Federation Portal; collecting ratings on metadata quality for all resources available
  52. Overview
     Experiment                                               | Participants / records | Phase                  | Date
     Application Profile Questionnaire & Hands-on annotation  | 20                     | Metadata Design        | 1/2009
     Metadata record review by metadata experts               | 4 / 60 records         | Testing                | 4/2009
     Metadata record review by subject matter experts         | 20 / 105 records       | Calibration            | 6/2009
     Log file analysis from the Annotation Tool               | 6,600 records          | Building Critical Mass | 9/2009
     Log file analysis from the Annotation Tool               | 11,000 records         | Regular Operation      | 10/2010
  53. PhD Progress
  54. Progress VS Publications (1/2)
     - Application Profile Questionnaire & Hands-on annotation (Metadata Design, 1/2009): JIAC 2009, Palavitsinis et al., "Interoperable metadata for a federation of learning repositories on organic agriculture and agroecology"
     - Metadata record review by metadata experts (Testing, 4/2009): MTSR 2009, Palavitsinis et al., "Evaluation of a Metadata Application Profile for Learning Resources on Organic Agriculture"
     - Metadata record review by subject matter experts (Calibration, 6/2009): ED-MEDIA 2011, Palavitsinis et al., "Metadata quality in learning repositories: Issues and considerations"
  55. Progress VS Publications (2/2)
     - Log file analysis from the Annotation Tool (Metadata Design, 9/2009): ICSD 2009, Palavitsinis et al., "Evaluating Metadata Application Profiles based on Usage Data"
     - Log file analysis from the Annotation Tool (Testing, 10/2010): ED-MEDIA 2011, Palavitsinis et al., "Metadata quality in learning repositories: Issues and considerations"
  56. Early Publications
     - Knowledge Organization Systems: online study of Knowledge Organization Systems on agricultural and environmental sciences (Palavitsinis & Manouselis, ITEE 2009)
     - Metadata Lifecycle: "Towards a Digital Curation Framework for Learning Repositories: Issues & Considerations" (Palavitsinis et al., SE@M 2010)
  57. Real Users
     - Organized a series of workshops involving users annotating resources: Organic.Edunet Summer School 2009; Joint Technology Enhanced Learning Summer School 2010; American Farm School & Ellinogermaniki Agogi workshops; HSci Conference in Crete
     - Working with users (i.e. subject-matter experts, educators and metadata experts)
  58. Stakeholder Consultation
     - e-Conference held during October 2010 (6/10-30/10)
     - Experts on quality for e-learning
     - Two phases, four topics
     - Provided input for a separate PhD chapter
  59. Topics
     - Phase I (6-30/10): Learning resources creation: what constitutes a quality learning resource? Providing quality metadata: is the gain worth the effort?
     - Phase II (14-30/10): Populating a repository with resources and metadata: the quality versus quantity dilemma. Managing a portal with thousands of resources and users: are communities "attracted" to quality, like bees to honey?
     - Each main topic had 4 refining questions and 1 or 2 moderators
     - The e-Conference had 2 administrators
     - 1 keynote was recorded from Mrs. Amee Evans Godwin of the Institute for Knowledge Management in Education (IKSME)
  60. What's next
  61. Next Experiments
     - Pilot Experiment in Agricultural Learning Resources' Repository completed: Organic.Edunet (Confolio)
     - Validation Experiment in Scientific/Scholarly Content Repository ongoing: VOA3R case (in Calibration Phase)
     - Validation Experiment in Cultural Content Repository ongoing: Natural Europe case (in Testing Phase)
  62. Timeline [chart spanning 5/09 to 9/12: Introductory Research, Literature Review (A), Adapted MeQuACeP, Literature Review (B), Pilot Experiment, Validation Experiments, Writing; milestones at 5/09, 5/10, 10/10, 2/11, 12/11, 6/12, 9/12]
  63. Next Steps
     - 11/2011: journal paper on the Metadata Quality Assessment Certification Process ready
     - 4/2012: journal paper on MeQuACeP applied in other contexts pending
     - 6-9/2012: writing of the thesis
  64. Metadata Quality Issues in Learning Object Repositories. Thank you for your attention!