Learning analytics overview: Building evidence-based practice



  1. Learning analytics: Building evidence-based practice. Assoc. Prof. Shane Dawson, Deputy Director, LTU, University of South Australia
  2. What about today? • Current state of play • What analytics are in place? • What questions and data? • Patterns of data – importance of context • Analysis tools - SNA • Curriculum analytics • Privacy/ethics • Questions, concerns or issues
  3. Where is LA? [Gartner hype cycle: technology trigger, peak of inflated expectations, trough of disillusionment, slope of enlightenment, plateau of productivity]
  4. Learning Analytics: What are learning analytics?
  5. Learning analytics is the collection, collation, analysis and reporting of data about learners and their contexts, for the purposes of understanding and optimizing learning
  6. Learning analytics draws on ed theory, ed practice, SNA, data mining, machine learning, semantics, data visualisation, sense-making, psychology (social, cognitive, organisational) and the learning sciences
  7. We are creatures of habit (study, communication, search patterns, networks, credit card security, movies). What do patterns indicate, and what do changes in habit indicate?
  8. Big data: • 5 billion mobile phones in use (2010) • 30 billion pieces of content shared on Facebook every month • $600 buys a disk drive that can store all of the world's music • 60% potential increase in operating margin for retailers using big data. It's accessible, cheap and critical. Manyika, J., et al. (2011). Big Data: The Next Frontier for Innovation, Competition, and Productivity: McKinsey Global Institute
  9. Examples – Kaggle (connect with data scientists): develop a predictive algorithm to identify who will be admitted to a hospital using historical claims data
  10. Examples: loyalty programs, black-box trading
  11. Examples: Target coupons inform father of daughter's pregnancy
  12. Where is LA? “Data is the new oil.” Higher education: • lots of isolated work targeting attrition, few large enterprise examples • commercial – IBM, D2L S3, Blackboard Analytics. Why?
  13. Potential is there: • “High potential but low mindset” • Target rapid returns – students at risk • Predictive Analytics Research Framework • What data? Let's define terms. Manyika, J., et al. (2011). Big Data: The Next Frontier for Innovation, Competition, and Productivity: McKinsey Global Institute
  14. Education Examples
  15. Education Examples
  16. Education - Purdue
  17. Education - UMBC
  18. Education – Fort Hays State Uni
  19. What questions? What questions are learning analytics attempting to address? What analytics work is being undertaken at your institution? How far has this progressed?
  20. Questions explored: pass/fail and retention; concept understanding; learning motivation/engagement; learning dispositions; graduate qualities; curriculum pathways; learning experience; satisfaction and community
  21. Future: rapidly moving beyond simple reports, attrition and student learning support measures to predictive, adaptive and recommender states – akin to iTunes Genius, Amazon, Gmail
  22. Future
  23. Future – Example: Knewton. 2012: 500,000 students; 2013: 5 million students; 2014: 15 million students. 1 million points of data per student. Curriculum and activities modified based on the individual student.
  24. Future – Emotions/face tracking: confusion, engaged, frustrated; activity modified to maintain a continuous state of challenge
  25. Potential is there. In Australia, largely focused on retention and early intervention. Examples: • UniSA – ESAP • QUT – student success and retention • CQU – indicators project • UTS – data-intensive university
  26. What questions? What questions do you want LA to answer? What is the scope – e.g. institution, program or course?
  27. What questions? Attrition; student failure; teaching quality; student learning; graduate qualities; quality assurance
  28. What data? What data do you have access to?
  29. What data? LMS activity; student grades and assessment activity; student demographics; prior learning experiences; other technologies (clickers, lecture recordings); course experience surveys/student evaluations; course progression; institutional surveys; library; wireless, security cameras, occupancy; social media; sentiment analysis
  30. Who are the key stakeholders? Who has access to this data? Governance/buy-in? Additional data sets required?
  31. Analysis: How will you analyse this data? Who can analyse this data? What tools?
  32. It's all about interpretation: What is the interpretation of the data? Example patterns
  33. It's all about interpretation. [Chart: student login frequency, average per week, weeks 1-13]
  34. [Chart: class average login frequency, weeks 1-13]
  35. [Chart: high-performing student login frequency, weeks 1-13]
  36. [Chart: typical login engagement pattern, weeks 1-13]
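The weekly login patterns contrasted above come from simple counts of LMS access events. A minimal stdlib sketch, assuming raw (student, date) login records; the record layout is illustrative, not from any real LMS export:

```python
# Sketch: weekly login counts per student from raw login records.
# The (student, date) layout is invented for illustration.
from collections import defaultdict
from datetime import date

def weekly_login_counts(logins, course_start):
    """Map student -> {week number: login count}; week 1 starts at course_start."""
    counts = defaultdict(lambda: defaultdict(int))
    for student, day in logins:
        week = (day - course_start).days // 7 + 1
        counts[student][week] += 1
    return counts

logins = [
    ("s1", date(2014, 3, 3)), ("s1", date(2014, 3, 4)),
    ("s1", date(2014, 3, 11)), ("s2", date(2014, 3, 20)),
]
counts = weekly_login_counts(logins, course_start=date(2014, 3, 3))
```

Per-week series like this, plotted against the class average, give the high-performing versus typical patterns the slides contrast.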
  37. [Chart: grades vs time of first login (0 = course start)]
  38. [Chart: grades vs time of first login, continued]
  39. [Chart continued]
  40. [Chart continued]
  41. Risk assessment – What are the predictors of failure and retention? • Prior grades • Low SE • First in family • Study load • Engagement online (time of login/ discussion activity)
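The predictor list above can be made concrete as a toy risk flag. This is a sketch only: the weights and thresholds are invented for illustration and are not the model used by any of the projects cited in this deck.

```python
# Illustrative only: a hand-weighted risk score combining the predictors
# on the slide. Weights and thresholds are invented, not published values.
def risk_score(prior_gpa, low_se, first_in_family, study_load, weekly_logins):
    score = 0.0
    if prior_gpa < 4.0:      # weak prior grades (GPA on a 7-point scale)
        score += 2.0
    if low_se:               # the "Low SE" flag from the slide
        score += 1.0
    if first_in_family:      # first in family at university
        score += 1.0
    if study_load > 4:       # heavy concurrent course load
        score += 1.5
    if weekly_logins < 1:    # little or no online engagement
        score += 2.0
    return score

at_risk = risk_score(3.5, True, True, 5, 0) >= 4.0
```

In practice these weights would be fitted, e.g. by logistic regression over historical cohorts, rather than set by hand.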
  42. What patterns? What patterns do you expect? LMS activity; class interaction; assessment; qualitative; survey; networks
  43. Be critical: context matters!
  44. Analytic techniques – SNA. Student networks: the “single most potent source of influence”. Astin, A. (1993). What matters in college: Four critical years revisited. San Francisco: Jossey-Bass.
  45. Analytic techniques – SNA. Social network building blocks: actors (nodes), relations (lines), network (graph)
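The building blocks map directly onto a tiny data structure. A stdlib-only sketch (actor names and ties are invented; a real analysis would more likely use a library such as networkx or Gephi):

```python
# Minimal sketch of the SNA building blocks: actors as nodes, relations
# as edges, the network as an adjacency mapping.
from collections import defaultdict

class Network:
    def __init__(self):
        self.adj = defaultdict(set)   # node -> set of neighbouring nodes

    def add_relation(self, a, b):
        # An undirected tie between two actors.
        self.adj[a].add(b)
        self.adj[b].add(a)

    def degree(self, node):
        # Simplest centrality measure: how many ties an actor has.
        return len(self.adj[node])

net = Network()
for a, b in [("Ana", "Ben"), ("Ana", "Cal"), ("Ben", "Cal"), ("Cal", "Dee")]:
    net.add_relation(a, b)
```

Degree is only the starting point; the visualisation slides that follow rely on layout and connectivity, not any single number.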
  46. Analytic techniques – SNA. Interpreting visualisations
  47. SNAPP • Social Networks Adapting Pedagogical Practice • Focus on student relationships (learning networks) • Simple visualizations to assist with interpretation and evaluate impact of activities • Lightweight analytics tool • Bookmarklet • Rapid and easy dissemination. Bakharia, A., & Dawson, S. (2011). SNAPP: a bird's-eye view of temporal participant interaction. Paper presented at the 1st International Conference on Learning Analytics and Knowledge, Banff, Alberta, Canada
  48. SNAPP
  49. SNAPP • No need to access the database • No need for admin rights (installation of a bookmarklet) • Broad accessibility and compatibility • Fast delivery mechanisms – focus on simplicity
  50. Measuring Interaction – Forum A and Forum B: 14 messages posted by 4 participants
  51. Measuring Interaction – Forum A and Forum B
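Two forums with identical message counts can yield very different networks, because what SNAPP-style tools count is who replies to whom. A hedged sketch of that extraction step; the thread data is invented and real forum exports differ in shape:

```python
# Sketch: derive a who-replies-to-whom network from a forum thread.
# Each reply creates a directed tie from the replier to the author of
# the parent post. Post data below is invented for illustration.
from collections import Counter

def reply_edges(posts):
    """posts: list of (post_id, author, parent_id or None)."""
    author_of = {pid: author for pid, author, _ in posts}
    edges = Counter()
    for pid, author, parent in posts:
        if parent is not None:
            edges[(author, author_of[parent])] += 1
    return edges

thread = [
    (1, "tutor", None),
    (2, "amy", 1), (3, "bob", 1),   # both reply to the tutor
    (4, "amy", 3),                   # amy replies to bob
]
edges = reply_edges(thread)
```

The same message count spread across a hub-and-spoke thread versus a peer-to-peer thread produces the contrasting Forum A / Forum B pictures.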
  52. Seeing networks in action: Forum 1, Forum 2
  53. Seeing networks in action: Forum 1, Forum 2 – visualisations aid interpretation
  54. SNAPP • http://www.snappvis.org/ • http://w3.unisa.edu.au/ltu/snapp.html • http://www.moodle.org/ > Community tab > Forums
  55. Other tools: Jigsaw – visual analytics for documents (http://www.cc.gatech.edu/gvu/ii/jigsaw/); Netlytic – text & SNA for Twitter, YouTube, blogs, etc.; Gephi.org
  56. Other tools
  57. Network examples: monitoring online networks – informed decisions for improving learning design; evaluating the impact of implemented activities
  58. Facilitator-centric: instructor at the centre, disconnected students. Is this a learning community?
  59. Facilitator-centric: instructor
  60. Ego-networks • Top 10% • Bottom 10%
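An ego-network is just the focal student, their direct ties, and the ties among those neighbours. A small sketch over a plain adjacency dict (the data is illustrative):

```python
# Sketch: extract an ego-network (ego + neighbours + ties among them)
# from a full network stored as an adjacency dict of sets.
def ego_network(adj, ego):
    members = {ego} | adj.get(ego, set())
    # Keep only ties where both endpoints are inside the ego-network.
    return {n: adj.get(n, set()) & members for n in members}

adj = {
    "ego": {"a", "b"},
    "a": {"ego", "b", "x"},
    "b": {"ego", "a"},
    "x": {"a"},      # two steps out: excluded from the ego-network
}
sub = ego_network(adj, "ego")
```

Comparing the ego-networks of top-10% and bottom-10% students, as the next slides do, is then a comparison of these subgraphs.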
  61. Low performer: top 10% student located in network; student with a passing grade
  62. High performer: bottom 10% student located in network; students with a grade >75% and <90%
  63. Teaching presence • Staff intervention • High – 70% of networks • Low – 10% of networks • Why? • The pursuit of community. Dawson, S. (2006). Online forum discussion interactions as an indicator of student community. Australasian Journal of Educational Technology, 22(4), 495-510. Dawson, S. (2010). 'Seeing' the learning community: An exploration of the development of a resource for monitoring online student networking. British Journal of Educational Technology, 41(5), 736-752.
  64. Curriculum analytics: need context in order to move from predictions to recommender systems. Lecture, seminar, group work, community, online, hybrid, blended. What teaching model? What outcomes?
  65. Curriculum analytics: curriculum networks – dominant pathways
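Dominant pathways can be read off enrolment histories by counting course-to-course transitions. A sketch, assuming each student's history is an ordered list of course codes (the data below is invented):

```python
# Sketch: dominant curriculum pathways as the most frequent
# course-to-course transitions across student enrolment histories.
from collections import Counter

def dominant_pathways(histories, top=3):
    transitions = Counter()
    for seq in histories:
        transitions.update(zip(seq, seq[1:]))  # consecutive course pairs
    return transitions.most_common(top)

histories = [
    ["A", "B", "C"],
    ["A", "B", "D"],
    ["A", "C"],
]
top = dominant_pathways(histories)
```

The heaviest transitions become the thick edges in a curriculum-network diagram, which is what makes the dominant pathway visible.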
  66. Curriculum analytics: What outcomes, what experiences? [Diagram: course nodes A-F]
  67. Curriculum analytics: What outcomes, what experiences? Assessment, learning outcomes, learning experiences, graduate attributes – automated portfolio/learning relationship
  68. Privacy and ethics • Who “owns” the data? • Are analytics an intrusion of privacy? • If we can identify students at risk, is there an obligation to intervene?
  69. Thank you. Questions? Shane.dawson@unisa.edu.au Events: A-LASI (http://www.solaresearch.org/events/a-lasi/), LAK14 (http://lak14indy.wordpress.com/)
