ALMS Rubric Open Ed 2012 Presentation

Presentation at the Open Education Conference 2012. The ALMS rubric measures the technical difficulty of reusing OER.

Published in: Education
Slide notes
  • Considered adding "programming" and "miscellaneous" categories, but both were dropped
  • Samples rated in order
  • Not an indication of quality.
Transcript

1. Establishing inter-rater reliability with the ALMS rubric

2. Credit
   • David Wiley
   • Utah Valley University
   • Brigham Young University

3. Purpose of study
   Three barriers to reuse:
   • Content
   • Pedagogical
   • Technical

4. Study Design
   • Delphi study (n=5)
   • Inter-rater reliability (n=4)

5. Technical barriers
   ALMS rubric:
   • Access to editing tools
   • Level of expertise
   • Meaningfully Editable
   • Source Files

6. Access to Editing Tools
   • Most difficult to measure
   • Context-dependent
   • Delphi study participants were least confident
7. Solution: Access to Editing Tools

   Question                                    MP3    WAV    OGG    AIFF   WMA    RM
   Is an appropriate software application
   pre-installed with the operating system?    2      2      2      1      1      1
   Mean                                        3.00   2.88   2.88   2.00   1.61   1.39
8. Findings
   • Operating system didn't matter as much
   • Why?
     • Web services
     • Ubiquity of editing services
     • Popularity of file types

9. Level of Expertise
   • Set to three levels of editing (e.g., beginner, intermediate, advanced)
   • What degree of _____ editing expertise is required?
     • Text
     • Image
     • Audio
     • Video

10. Meaningfully Editable
    • What portion of the _____ in the OER is editable?
      • Text
      • Image
      • Audio
      • Video

11. Source Files
    • What portion of _____ are editable?
      • Text
      • Image
      • Audio
      • Video
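To make the four ALMS dimensions concrete, here is a minimal sketch of how one rater's scores for a single OER might be recorded and averaged. The dimension names follow the slides; the 1-4 scale and the simple mean are illustrative assumptions, not part of the published rubric.

    # Hypothetical encoding of one rater's ALMS scores for a single OER.
    # The 1-4 scale is an assumption inferred from the means reported later
    # in the slides, not a documented part of the rubric.
    alms_scores = {
        "access_to_editing_tools": 3,
        "level_of_expertise": 2,
        "meaningfully_editable": 3,
        "source_files": 4,
    }
    overall = sum(alms_scores.values()) / len(alms_scores)
    print(f"Overall ALMS score: {overall:.2f}")  # -> 3.00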
12. Establishing Reliability: Raters
    • 4 raters
    • Drawn from a graduate program at BYU
13. Inter-rater sample

    Type                                           NROC       MIT OCW    WikiEducator
    Humanities (e.g., Art, Literature)             3 samples  3 samples  3 samples
    Social Sciences (e.g., Education, Psychology)  3 samples  3 samples  3 samples
    Sciences (e.g., Biology, Math)                 3 samples  3 samples  3 samples
14. Results
    • ICC(2,1) = .655, df = 376, 95% CI [.609, .699]
    • Could be better
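As context for the ICC(2,1) figure above: this is the two-way random-effects, absolute-agreement, single-rater intraclass correlation of Shrout and Fleiss (1979). Below is a minimal computation sketch in Python; the toy ratings matrix is invented for illustration and is not the study's data.

    import numpy as np

    def icc_2_1(ratings):
        # ICC(2,1): two-way random effects, absolute agreement, single rater
        # (Shrout & Fleiss, 1979). `ratings` is an (n subjects x k raters) array.
        x = np.asarray(ratings, dtype=float)
        n, k = x.shape
        grand = x.mean()
        ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()    # between subjects
        ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()    # between raters
        ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols  # residual
        ms_r = ss_rows / (n - 1)
        ms_c = ss_cols / (k - 1)
        ms_e = ss_err / ((n - 1) * (k - 1))
        return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

    # Toy example: 5 hypothetical OER items scored by 4 raters.
    demo = [[3, 3, 2, 3],
            [4, 4, 4, 3],
            [2, 2, 3, 2],
            [4, 3, 4, 4],
            [1, 2, 1, 1]]
    print(round(icc_2_1(demo), 3))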
15. Results: Order

    Order  Mean  Std. Dev.
    1      2.91  .17
    2      2.81  .25
    3      2.80  .27
    Mean   2.84  .23
16. Results: Disciplines

    Discipline       Mean  Std. Dev.
    Humanities       3.02  .16
    Social Sciences  2.83  .19
    Sciences         2.67  .32
    Mean             2.84  .22
17. Results: Repository

    Repository    Mean  Std. Dev.
    NROC          2.36  .33
    MIT OCW       2.99  .20
    WikiEducator  3.16  .15
    Mean          2.84  .23
18. Conclusions
    • The ALMS rubric shows some inter-rater agreement, but could be refined
    • Repositories differ with respect to technical reusability
    • There may be differences among disciplines
