Open Education 2011: Openness and Learning Analytics

  1. Open Education 2011: Openness and Learning Analytics John Rinderle @johnrinderle Norman Bier @normanbier
  2. Open Learning Initiative Produce and improve scientifically based courses and course materials that enact instruction and support instructors. Provide open access to these courses and materials. Develop communities of use, research, and development that enable evaluation and continuous improvement.
  3. Introduction: Outcomes Shared understanding of challenges, tensions and possibilities in learning analytics, around the dimensions of: • Potential of well-used OER in a use-driven design context • Adaptability (Variety) ←→ Analytics (Coherence) • Analytics tools and approach • Data: needs and challenges Describe community-based analytics plans: • Flexible, long-range planning • Useful, short-term steps Commit to action: • Identify best existing efforts
  4. Driving Feedback Loops
  5. Infinite Points of Light
  6. Infinite Points of Light
  7. Infinite Points of Light
  8. Infinite Points of Light
  9. Infinite Proliferation The 4 R’s: Reuse, Redistribute, Revise, Remix
  10. Infinite Proliferation The 4 R’s: Reuse, Redistribute, Revise, Remix. NOT: Recreate. Add: Evaluate
  11. Proliferation isn’t just OER… Intro to CS @ CMU; Statistics @ everywhere: Core Statistics, Business Statistics, Research Statistics, Medical Statistics
  12. What drives change in these scenarios? • Data • Intuition • Market demand • Instructor preferences
  13. The problems of variety • Quality is highly variable • Much duplication of effort • Difficult to choose appropriately • Hard to evaluate • Impossible to improve • Hard to scale up successes
  14. Effectiveness is hit or miss
  15. Effectiveness What is working in open education? Why? And how do you know?
  16. Effectiveness Demonstrably support students in meeting articulated, measurable learning outcomes in a given set of contexts
  17. So why don't we do this now? • It's hard • It's expensive • Individual faculty can't do it alone • It can be threatening to educators • Disparate systems • How do we measure it? We need enabling processes and systems
  18. Driving Feedback Loops
  19. Great, but: What does it mean when we get out of the realm of discussion and into the realm of practice? Learning Analytics: What are they? How do we create and use them?
  20. What do we mean by learning analytics? Proxies vs. authentic assessment and evaluation
  21. Analytics Definition Data Collection → Reporting → Decision Making → Intervention → Action Collecting the data is not enough. We also need to make sense of it in ways that are actionable, as in the sketch below.
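To make the slide's pipeline concrete, here is a minimal Python sketch of the collection → reporting → decision → intervention loop. The event fields, the success-rate threshold, and the intervention step are illustrative assumptions, not OLI's actual schema or tooling.

```python
from collections import defaultdict

def report(events):
    """Reporting step: aggregate raw attempt events into per-student success rates."""
    totals = defaultdict(lambda: [0, 0])  # student -> [correct, attempts]
    for e in events:
        totals[e["student"]][1] += 1
        if e["correct"]:
            totals[e["student"]][0] += 1
    return {s: correct / attempts for s, (correct, attempts) in totals.items()}

def decide(rates, threshold=0.5):
    """Decision step: flag students whose success rate falls below a threshold."""
    return [s for s, r in rates.items() if r < threshold]

# Collection step: raw interaction events (hypothetical field names).
events = [
    {"student": "s1", "skill": "t-test", "correct": True},
    {"student": "s1", "skill": "t-test", "correct": False},
    {"student": "s2", "skill": "t-test", "correct": False},
    {"student": "s2", "skill": "t-test", "correct": False},
]

# Intervention/action step: here we just print; in practice this might
# alert an instructor or adapt the course material.
for student in decide(report(events)):
    print(f"intervene: {student}")
```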
  22. Types of analytics • Educational/Academic Management analytics • Classroom Management analytics • Learning Outcomes analytics
  23. The problem of data collection 1. Agreed upon standards 2. Core collection 3. Space for exploration
  24. The problem of data collection 1. Agreed upon standards 2. Core collection 3. Space for exploration • Ownership • Privacy • Policy
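One hedged sketch of what "agreed-upon standards" plus "space for exploration" could look like as a record format: a small required core, a pseudonymous actor ID to acknowledge the privacy concern, and an open extensions field for tool-specific data. All field names here are assumptions for illustration, not an existing standard.

```python
# Hypothetical core event schema: the required fields are the "core
# collection"; the open "extensions" dict is the "space for exploration".
CORE_FIELDS = {"timestamp", "actor", "verb", "resource"}

def validate(event):
    """Reject events missing any agreed-upon core field."""
    missing = CORE_FIELDS - event.keys()
    if missing:
        raise ValueError(f"missing core fields: {sorted(missing)}")
    return event

event = validate({
    "timestamp": "2011-10-25T14:00:00Z",
    "actor": "student-42",               # pseudonymous ID (privacy)
    "verb": "attempted",
    "resource": "stats/module-3/quiz-1",
    "extensions": {"hint_count": 2},     # tool-specific, exploratory data
})
```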
  25. Ideal world • Common data standards • Analytics-enabled OER • Commonly accepted ownership and privacy approaches • Commitment to measuring effectiveness through assessment
  26. Bring Together What Already Works 1) Data Collection Systems & Data Schemas 2) Communities of Evidence 3) Analysis Tools
  27. Learning Dashboard
  28. DataShop
  29. Evidence Hub
  30. Learning Registry
  31. Communities of Evidence
  32. And build new things 1) Data Collection Systems & Data Schemas 2) Communities of Evidence 3) Analysis Tools
  33. Driven by different types of data: Metadata, Paradata, Synthetic Data; Contextual, Behavioral, Interaction, Semantic, Raw
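To distinguish the data types named on this slide, here are illustrative example records; the fields and values are assumptions for exposition, not a standard vocabulary.

```python
# Illustrative records for the slide's data types (field names are assumed).
metadata = {"resource": "stats-module-3", "title": "Hypothesis Testing",
            "license": "CC BY"}                      # describes the OER itself
paradata = {"resource": "stats-module-3", "uses": 1340,
            "avg_rating": 4.2}                       # describes how it is used
raw_interaction = {"actor": "student-42", "verb": "attempted",
                   "resource": "stats-module-3/quiz-1",
                   "correct": False}                 # behavioral/interaction event
synthetic = {"actor": "student-42", "skill": "t-test",
             "est_mastery": 0.37}                    # derived from a model, not observed
```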
  34. Share Alike and Share Data
  35. Community Based Approach
  36. A middle ground? Infinite Variety ←→ The One True Course: Communities Coalesce
  37. Can we put these together? "Full spectrum" analytics to drive different types of decision making and address different feedback loops
  38. Learning Intelligence Systems
  39. What would we be giving up? This approach forces us to allow our minds to be changed by evidence.
  40. Conclusion: next steps • Innovate • Standardize • Scale
  41. Conclusion: next steps • Innovate • Standardize • Scale • Commitment to Assessment and Evaluation • Community Definition of Analytics-enabled OER • Common approach to data • Shared and private analytics platforms
  42. “Improvement in Post Secondary Education will require converting teaching from a ‘solo sport’ to a community-based research activity.” —Herbert Simon
  43. Questions • Do you believe in this approach to analytics-enabled OER? • Can this better address the pedagogy vs. reuse value curve?
  44. A Virtuous Cycle (diagram): Educational Technology & Practice, Data, Theory

Editor's Notes

  • Increase access, improve outcomes; educate more, better.
  • Disclaimer (we were ambitious): Out of curiosity, how many of you read our abstract? Did it strike you as awfully ambitious for a short presentation? Yeah, us too… Quick review of abstract. Outcomes. Point to discussion.
  • Making assumptions about data is difficult because of variety.
  • Each OER is collecting and capturing different things or not collecting data at all.
  • And many more on OER Commons. What drives this…
  • Challenge: building alignment around common learning outcomes