Quality Metrics for Learning Object Metadata
Presentation at EDMedia 2006 about how to measure the quality of learning object metadata.

Published in: Education, Business
Transcript

  • 1. Quality Metrics for Learning Object Metadata. Xavier Ochoa (ESPOL, Ecuador) and Erik Duval (KULeuven, Belgium)
  • 2. Standardization and interoperability of Learning Object Repositories have solved the “chicken and egg” problem… …but have created new ones.
  • 3. The production, management, and consumption of Learning Object Metadata are vastly surpassing the human capacity to review or process these metadata.
  • 4. Why Measure Quality?
    • The quality of the metadata record that describes a learning object directly affects the chances that the object will be found, reviewed, or reused.
    • An object with the title “Lesson 1 – Course 201” and no description cannot be found by an “Introduction to Java” query, even if it is about that subject.
  • 5. How to Measure Metadata Quality?
    • How it has traditionally been done (in the library community):
      • Assign a group of metadata experts to review a statistically significant sample of the records, grading them against a common set of quality characteristics.
  • 6. But we cannot measure quality manually anymore…
  • 7. … but it is a good idea to follow the same quality characteristics.
  • 8. Our Proposal: Use Metrics
    • A small calculation performed over the values of the different fields of a metadata record in order to gain insight into a quality characteristic.
    • For example, we can count the number of fields that have been filled with information (the metric) to assess the completeness of the metadata record (the quality characteristic).
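The "count the filled fields" idea above can be sketched in a few lines. This is a minimal, hypothetical illustration: the field names below are made up for the example and are not the actual LOM schema.

```python
def simple_completeness(record, all_fields):
    """Fraction of metadata fields that carry a non-empty value."""
    filled = sum(1 for f in all_fields if record.get(f) not in (None, "", []))
    return filled / len(all_fields)

# Illustrative field set and record (not the real LOM element list).
fields = ["title", "description", "language", "keywords", "author"]
record = {"title": "Lesson 1 - Course 201", "language": "en"}

print(simple_completeness(record, fields))  # 2 of 5 fields filled -> 0.4
```

A record like the slide's “Lesson 1 – Course 201” example would score low here, matching the intuition that it is hard to retrieve.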
  • 9. Quality Characteristics
    • We used the framework proposed by Bruce and Hillmann:
      • Completeness
      • Accuracy
      • Provenance
      • Conformance to expectations
      • Consistency & logical coherence
      • Timeliness
      • Accessibility
  • 10. Quality Metrics
    • We propose several quality metrics based on those quality characteristics
  • 11. Quality Metrics (metric names; the formulas were shown on the slide):
    • Simple Completeness
    • Weighted Completeness
    • Nominal Information Content
    • Textual Information Content
    • Readability
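The slide's formulas are not reproduced in this transcript. As one plausible instantiation of a textual-information-content-style metric (an assumption, not the paper's exact formula), the score of a text field can be the summed surprisal, -log2 p(term), of its terms, with term probabilities estimated from a reference corpus, so that rare, specific terms contribute more information than generic ones:

```python
import math
from collections import Counter

# Tiny illustrative corpus; in practice this would be the repository's
# full set of metadata text fields.
corpus = [
    "introduction to java programming",
    "java classes and objects",
    "lesson 1 course 201",
]
term_counts = Counter(t for doc in corpus for t in doc.split())
total = sum(term_counts.values())

def textual_information_content(text):
    """Summed per-term surprisal (-log2 p) under the corpus term model."""
    return sum(
        -math.log2(term_counts[t] / total)
        for t in text.split()
        if t in term_counts
    )

# "java" occurs twice in the corpus, so it is less surprising (less
# informative) than the rarer term "introduction".
print(textual_information_content("introduction to java"))
```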
  • 12. Evaluation of the Metrics
    • 22 Human Reviewers
    • 20 Learning objects
    • 7 characteristics used for evaluation
    • 5 quality metrics
    • Result: the Textual Information Content metric correlated highly (r = 0.842) with human review
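The evaluation step boils down to computing Pearson's r between each metric's scores and the averaged human ratings over the 20 objects. The sketch below uses made-up numbers (the study's per-object data is not in this transcript; only the reported r = 0.842 for Textual Information Content is):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: one metric score and one mean reviewer rating per object.
metric_scores = [0.2, 0.5, 0.9, 0.4, 0.7]
human_ratings = [1.5, 3.0, 4.5, 2.5, 4.0]

print(round(pearson_r(metric_scores, human_ratings), 3))  # -> 0.992
```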
  • 13. Evaluation Results
  • 14. Applications
    • Repository Evaluation
    • Quality Visualization
    • Automatic correction of metadata
    • Automatic evaluation of metadata quality
  • 15. Repository Evaluation
  • 16. Quality Visualization
  • 17. Automated Evaluation of Quality
  • 18. Further Work
    • Measure “real” quality.
    • Quality as fitness to fulfill a given purpose:
      • Quality for Retrieval
      • Quality for Evaluation
      • Accessibility Quality
      • Re-use Quality
  • 19. Thank you, Gracias. Comments, suggestions, critiques… are welcome! More information: http://ariadne.cti.espol.edu.ec/M4M
