Open Education 2011: Openness and Learning Analytics

Notes for slides
  • Increase access, improve outcomes; educate more, better.
  • Disclaimer (we were ambitious): Out of curiosity, how many of you read our abstract? Did it strike you as awfully ambitious for a short presentation? Yeah, us too… Quick review of abstract. Outcomes. Point to discussion.
  • Making assumptions about data is difficult because of variety.
  • Each OER is collecting and capturing different things or not collecting data at all.
  • And many more on OER Commons. What drives this…
  • Challenge: building alignment around common learning outcomes
  • Transcript

    • 1. Open Education 2011: Openness and Learning Analytics. John Rinderle @johnrinderle, Norman Bier @normanbier
    • 2. Open Learning Initiative
      • Produce and improve scientifically-based courses and course materials which enact instruction and support instructors
      • Provide open access to these courses and materials
      • Develop communities of use, research and development that enable evaluation and continuous improvement
    • 3. Introduction: Outcomes
      Shared understanding of challenges, tensions and possibilities in learning analytics, around the dimensions of:
      • Potential of well-used OER in a use-driven design context
      • Adaptability (Variety) ← → Analytics (Coherence)
      • Analytics Tools and Approach
      • Data: needs and challenges
      Describe community-based analytics plans:
      • Flexible, long-range planning
      • Useful, short-term steps
      Commit to action:
      • Identify best existing efforts
    • 4. Driving Feedback Loops
    • 5. Infinite Points of Light
    • 6. Infinite Points of Light
    • 7. Infinite Points of Light
    • 8. Infinite Points of Light
    • 9. Infinite Proliferation. The 4 R’s: Reuse, Redistribute, Revise, Remix
    • 10. Infinite Proliferation. The 4 R’s: Reuse, Redistribute, Revise, Remix. NOT: Recreate. Add: Evaluate
    • 11. Proliferation isn’t just OER… Intro to CS @ CMU; Statistics @ everywhere: Core Statistics, Business Statistics, Research Statistics, Medical Statistics
    • 12. What drives change in these scenarios?
      • Data
      • Intuition
      • Market demand
      • Instructor preferences
    • 13. The problems of variety
      • Quality is highly variable
      • Much duplication of effort
      • Difficult to choose appropriately
      • Hard to evaluate
      • Impossible to improve
      • Hard to scale success up
    • 14. Effectiveness is hit or miss
    • 15. Effectiveness. What is working in open education? Why? And how do you know?
    • 16. Effectiveness Demonstrably support students in meeting articulated, measurable learning outcomes in a given set of contexts
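
To make "articulated, measurable learning outcomes" concrete, here is a minimal sketch (not part of the presentation; the field names and data are hypothetical) that aggregates assessment attempts by outcome and context:

from collections import defaultdict

def outcome_success_rates(attempts):
    """Aggregate assessment attempts into a per-(outcome, context) success rate.

    attempts: iterable of dicts like
      {"outcome": "identify-sampling-bias", "context": "intro-stats", "correct": True}
    """
    totals = defaultdict(int)
    correct = defaultdict(int)
    for a in attempts:
        key = (a["outcome"], a["context"])
        totals[key] += 1
        correct[key] += 1 if a["correct"] else 0
    return {key: correct[key] / totals[key] for key in totals}

# Example: the same outcome measured in two different course contexts.
sample = [
    {"outcome": "identify-sampling-bias", "context": "intro-stats", "correct": True},
    {"outcome": "identify-sampling-bias", "context": "intro-stats", "correct": False},
    {"outcome": "identify-sampling-bias", "context": "business-stats", "correct": True},
]
print(outcome_success_rates(sample))
# {('identify-sampling-bias', 'intro-stats'): 0.5, ('identify-sampling-bias', 'business-stats'): 1.0}
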
    • 17. So why don’t we do this now?
      • It’s hard
      • It’s expensive
      • Individual faculty can’t do it alone
      • It can be threatening to educators
      • Disparate systems
      • How do we measure it?
      We need enabling processes and systems
    • 18. Driving Feedback Loops
    • 19. Great, but: What does it mean when we get out of the realm of discussion and into the realm of practice? Learning Analytics: What are they? How do we create and use them?
    • 20. What do we mean by learning analytics? Proxies vs. authentic assessment and evaluation
    • 21. Analytics Definition: Data Collection → Reporting → Decision Making → Intervention → Action. Collecting the data is not enough. We also need to make sense of it in ways that are actionable.
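
A hedged illustration of that chain; the event shape, threshold, and intervention below are assumptions, not the presenters' design:

def report(events):
    """Reporting: summarize raw collected events into per-student error rates."""
    summary = {}
    for e in events:  # e.g. {"student": "s1", "correct": False}
        s = summary.setdefault(e["student"], {"attempts": 0, "errors": 0})
        s["attempts"] += 1
        s["errors"] += 0 if e["correct"] else 1
    return summary

def decide(summary, error_threshold=0.5):
    """Decision making: choose which students an intervention should target."""
    return [s for s, v in summary.items()
            if v["errors"] / v["attempts"] > error_threshold]

def intervene(students):
    """Intervention / action: here just a message; in practice it might be
    instructor feedback, extra practice, or a revision of the material."""
    for s in students:
        print(f"Flag {s} for follow-up practice")

events = [{"student": "s1", "correct": False},
          {"student": "s1", "correct": False},
          {"student": "s2", "correct": True}]
intervene(decide(report(events)))  # collection -> reporting -> decision -> action
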
    • 22. Types of analytics
      • Educational/Academic Management analytics
      • Classroom Management analytics
      • Learning Outcomes analytics
    • 23. The problem of data collection
      1. Agreed-upon standards
      2. Core collection
      3. Space for exploration
    • 24. The problem of data collection
      1. Agreed-upon standards
      2. Core collection
      3. Space for exploration
      • Ownership
      • Privacy
      • Policy
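
One hypothetical way to reconcile "agreed-upon standards / core collection / space for exploration" with the ownership and privacy concerns is a small core event record plus a free-form extension block. Nothing below comes from an actual standard; field names are invented for illustration:

import hashlib
import json
import time

def make_event(resource_id, learner_id, event_type, extensions=None,
               salt="per-institution-salt"):
    """Build a core learning event: a few agreed-upon fields, a hashed learner id
    (one partial answer to the privacy concern), and an extensions block left
    open for local exploration."""
    return {
        "resource_id": resource_id,
        "learner_id": hashlib.sha256((salt + learner_id).encode()).hexdigest(),
        "event_type": event_type,          # e.g. "viewed", "attempted", "completed"
        "timestamp": time.time(),
        "extensions": extensions or {},    # space for exploration, not standardized
    }

print(json.dumps(make_event("oer:stats-unit-3", "student-42", "attempted",
                            {"hint_count": 2}), indent=2))
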
    • 25. Ideal world
      • Common data standards
      • Analytics-enabled OER
      • Commonly accepted ownership and privacy approaches
      • Commitment to measuring effectiveness through assessment
    • 26. Bring Together What Already Works
      1) Data Collection Systems, Data Schemas
      2) Communities of Evidence
      3) Analysis Tools
    • 27. Learning Dashboard
    • 28. DataShop
    • 29. Evidence Hub
    • 30. Learning Registry
    • 31. Communities of Evidence
    • 32. And build new things
      1) Data Collection Systems, Data Schemas
      2) Communities of Evidence
      3) Analysis Tools
    • 33. Driven by different types of data: Metadata, Paradata, Synthetic Data, Contextual, Behavioral, Interaction, Semantic, Raw
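
An illustrative (made-up) sketch of how those data types might attach to a single OER item; only the category names come from the slide, with "synthetic" data derived from raw interaction events rather than collected directly:

oer_item = {
    "metadata": {"title": "Sampling Distributions", "license": "CC BY"},   # descriptive
    "paradata": {"times_reused": 14, "avg_rating": 4.2},                   # usage/contextual
    "raw": [                                                               # behavioral/interaction
        {"learner": "a1b2", "action": "attempted", "correct": False},
        {"learner": "c3d4", "action": "attempted", "correct": True},
    ],
}

# Synthetic (derived) data is computed from the raw events:
attempts = oer_item["raw"]
oer_item["synthetic"] = {
    "error_rate": sum(1 for e in attempts if not e["correct"]) / len(attempts)
}
print(oer_item["synthetic"])  # {'error_rate': 0.5}
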
    • 34. Share Alike and Share Data
    • 35. Community Based Approach
    • 36. A middle ground? Infinite Variety ← → The One True Course; Communities Coalesce
    • 37. Can we put these together? "Full spectrum" analytics to drive different types of decision making, address different feedback loops
    • 38. Learning Intelligence Systems
    • 39. What would we be giving up? This approach forces us to allow our minds to be changed by evidence.
    • 40. Conclusion: next steps
      • Innovate
      • Standardize
      • Scale
    • 41. Conclusion: next steps
      • Innovate, Standardize, Scale
      • Commitment to Assessment and Evaluation
      • Community Definition of Analytics-enabled OER
      • Common approach to data
      • Shared and private analytics platforms
    • 42. "Improvement in Post Secondary Education will require converting teaching from a 'solo sport' to a community based research activity." – Herbert Simon
    • 43. Questions
      • Do you believe in this approach to analytics-enabled OER?
      • Can this better address the pedagogy vs. reuse value curve?
    • 44. A Virtuous Cycle linking educational practice, technology, data and theory