
Quality Metrics for Learning Object Metadata


Presentation at EDMedia 2006 about how to measure the quality of learning object metadata.

  1. Quality Metrics for Learning Object Metadata. Xavier Ochoa, ESPOL, Ecuador; Erik Duval, KULeuven, Belgium.
  2. Standardization and interoperability of Learning Object Repositories have solved the "chicken and egg" problem... but have created new ones.
  3. The production, management and consumption of learning object metadata are vastly outpacing the human capacity to review or process these metadata.
  4. Why Measure Quality?
     • The quality of the metadata record that describes a learning object directly affects the chances that the object will be found, reviewed or reused.
     • An object titled "Lesson 1 – Course 201" with no description cannot be found by an "Introduction to Java" query, even if it is about that subject.
  5. How to Measure Metadata Quality?
     • How it has been done (in the librarian community):
       • A group of metadata experts reviews a statistically significant sample of the records, grading them against a common set of quality characteristics.
  6. But we cannot measure quality manually anymore...
  7. ...but it is a good idea to follow the same quality characteristics.
  8. Our Proposal: Use Metrics
     • A metric is a small calculation performed over the values of the different fields of a metadata record, in order to gain insight into a quality characteristic.
     • For example, we can count the number of fields that have been filled with information (the metric) to assess the completeness of the metadata record (the quality characteristic).
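The completeness example above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation; the field names are made up, and a real LOM record would have the fields defined by the IEEE LOM standard.

```python
# Illustrative sketch of a simple completeness metric: the fraction of
# metadata fields that contain a value. Field names are hypothetical.
def simple_completeness(record: dict) -> float:
    """Return the fraction of fields that are non-empty, in [0, 1]."""
    if not record:
        return 0.0
    filled = sum(1 for value in record.values() if value not in (None, "", []))
    return filled / len(record)

record = {"title": "Introduction to Java", "description": "", "language": "en"}
print(simple_completeness(record))  # 2 of 3 fields are filled
```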
  9. Quality Characteristics
     • We used the framework proposed by Bruce and Hillmann:
       • Completeness
       • Accuracy
       • Provenance
       • Conformance to expectations
       • Consistency & logical coherence
       • Timeliness
       • Accessibility
  10. Quality Metrics
     • We propose several quality metrics based on those quality characteristics.
  11. Proposed metrics (the slide paired each metric name with its formula):
     • Simple Completeness
     • Weighted Completeness
     • Nominal Information Content
     • Textual Information Content
     • Readability
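The formulas themselves are not recoverable from this transcript (they appeared as images on the slide). As a hedged illustration of the general idea behind one of them, a weighted completeness metric can be sketched as a simple extension of simple completeness, where each field counts proportionally to an assumed importance weight. The weights below are invented for the example and are not the ones used in the paper.

```python
# Hypothetical sketch of a weighted completeness metric: each field
# contributes a weight reflecting its assumed importance for retrieval.
# The weight values here are illustrative only.
def weighted_completeness(record: dict, weights: dict) -> float:
    """Weighted fraction of filled fields, in [0, 1]."""
    total = sum(weights.values())
    filled = sum(w for field, w in weights.items()
                 if record.get(field) not in (None, "", []))
    return filled / total

weights = {"title": 3.0, "description": 2.0, "keyword": 2.0, "format": 1.0}
record = {"title": "Introduction to Java", "format": "text/html"}
print(weighted_completeness(record, weights))  # (3.0 + 1.0) / 8.0
```

A record missing its title is penalized more than one missing a low-weight field such as `format`, which matches the intuition that some fields matter more for findability.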
  12. Evaluation of the Metrics
     • 22 human reviewers
     • 20 learning objects
     • 7 characteristics used for evaluation
     • 5 quality metrics
     • Result: the Textual Information Content metric was highly correlated (0.842) with the human review.
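The 0.842 figure is a correlation between metric scores and human ratings. As a sketch of how such a comparison can be computed, the standard Pearson correlation coefficient is shown below; the sample data is made up for illustration and is not the study's data.

```python
from statistics import mean

# Pearson correlation between a metric's scores and human quality ratings.
# The input lists below are invented example data.
def pearson(xs: list, ys: list) -> float:
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

metric_scores = [0.2, 0.5, 0.7, 0.9]   # e.g. a metric's score per object
human_ratings = [1.0, 2.5, 3.0, 4.5]   # e.g. mean reviewer grade per object
print(pearson(metric_scores, human_ratings))
```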
  13. Evaluation Results
  14. Applications
     • Repository evaluation
     • Quality visualization
     • Automatic correction of metadata
     • Automatic evaluation of metadata quality
  15. Repository Evaluation
  16. Quality Visualization
  17. Automated Evaluation of Quality
  18. Further Work
     • Measure "real" quality.
     • Quality as fitness to fulfill a given purpose:
       • Quality for retrieval
       • Quality for evaluation
       • Accessibility quality
       • Re-use quality
  19. Thank you, Gracias. Comments, suggestions, critiques... are welcome! More Information: