
The power of learning analytics to unpack learning and teaching: a critical perspective


Across the globe, many educational institutions are collecting vast amounts of small and big data about students and their learning behaviour, such as class attendance, online activities, or assessment scores. As a result, the emerging field of Learning Analytics (LA) is exploring how these data can be used to empower teachers and institutions to support learners effectively. In the recent Innovative Pedagogy Report, Ferguson et al. (2017) encourage researchers and practitioners to move towards a new form of learning analytics, called student-led learning analytics, which enables learners to specify their own goals and ambitions and supports them in reaching those goals. This is particularly helpful for individuals who have little time to spare for study. In this ESRC session, based upon six years of experience with LA data and large-scale implementations amongst 450,000+ students in a range of contexts, I will use an interactive format to discuss and debate three major questions: 1) To what extent is learning analytics the new holy grail of learning and teaching? 2) How can instructional design be optimised using the principles of learning analytics? 3) With the introduction of student-led analytics, to what extent can learning analytics promote ‘personalisation’ or ‘generalisation’ for diverse populations of students?

Published in: Education


  1. 1. Workshop 1: The power of Learning Analytics (Dalhousie 1S05) @DrBartRienties Professor of Learning Analytics
  2. 2. A special thanks to Avinash Boroowa, Shi-Min Chua, Simon Cross, Doug Clow, Chris Edwards, Rebecca Ferguson, Mark Gaved, Christothea Herodotou, Martin Hlosta, Wayne Holmes, Garron Hillaire, Simon Knight, Nai Li, Vicky Marsh, Kevin Mayles, Jenna Mittelmeier, Vicky Murphy, Quan Nguygen, Tom Olney, Lynda Prescott, John Richardson, Jekaterina Rogaten, Matt Schencks, Mike Sharples, Dirk Tempelaar, Belinda Tynan, Lisette Toetenel, Thomas Ullmann, Denise Whitelock, Zdenek Zdrahal, and others…
  3. 3. Who are you? Why did you choose this workshop? In the meantime: go with your browser to https://pollev.com/bartrienties552
  4. 4. Objectives of today’s workshop 1. What is learning analytics? 2. To what extent is learning analytics the new holy grail of learning and teaching? 3. How can instructional design be optimised using the principles of learning analytics? 4. With the introduction of student-led analytics, to what extent can learning analytics promote ‘personalisation’ or ‘generalisation’ for diverse populations of students? 5. Your preferred topic here 
  5. 5. Workshop activity • Discuss in pairs or groups • What is learning analytics? • What is the main benefit? • What is the main risk? • 10 minutes
  6. 6. What is learning analytics? http://bcomposes.wordpress.com/
  7. 7. (Social) Learning Analytics “LA is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (LAK 2011) Social LA “focuses on how learners build knowledge together in their cultural and social settings” (Ferguson & Buckingham Shum, 2012)
  8. 8. Dyckhoff, A. L., Zielke, D., Bültmann, M., Chatti, M. A., & Schroeder, U. (2012). Design and Implementation of a Learning Analytics Toolkit for Teachers. Journal of Educational Technology & Society, 15(3), 58-76.
  9. 9. Prof Paul Kirschner (OU NL) “Learning analytics: Utopia or dystopia”, LAK 2016 conference
  10. 10. SAFE FOR INDIRECT EDUCATIONAL VIEWING
  11. 11. 1. Increased availability of learning data 2. Increased availability of learner data 3. Increased ubiquitous presence of technology 4. Formal and informal learning increasingly blurred 5. Increased interest of non-educationalists in understanding learning (Educational Data Mining, for-profit companies) 6. Personalisation and flexibility as standard
  12. 12. Big Data is messy!!!
  13. 13. Objectives of today’s workshop 1. What is learning analytics? 2. To what extent is learning analytics the new holy grail of learning and teaching? 3. How can instructional design be optimised using the principles of learning analytics? 4. With the introduction of student-led analytics, to what extent can learning analytics promote ‘personalisation’ or ‘generalisation’ for diverse populations of students? 5. Your preferred topic here 
  14. 14. Workshop activity • Discuss in pairs or groups • To what extent is learning analytics the new holy grail of learning and teaching? • 10 minutes
  15. 15. Introduction math/stats • Business • 1st year students • Blended • 0-12 weeks after start studying • Adaptive learning/Problem-Based Learning • N=990
  16. 16. [Course timeline diagram: diagnostic entry tests in Week 0; weekly MyMathLab and MyStatLab mastery scores, practice time, and number of attempts across Weeks 1–7; quizzes in Weeks 3, 5, and 7; Blackboard LMS behaviour throughout; learning dispositions (learning styles, motivation, engagement, learning emotions) and demographic data; and the final maths/stats exam in Week 8.] Tempelaar, D. T., Rienties, B., & Giesbers, B. (2015). In search for the most informative data for feedback generation: Learning Analytics in a data-rich context. Computers in Human Behavior, 47, 157-167.
  17. 17. LMS prediction Not great 
  18. 18. E-tutorials prediction Substantial improvement!
  19. 19. Entry test and quizzes Even better!
  20. 20. All elements combined:
  21. 21. Using tracking data we can follow: who is struggling? Where? When? Why?
  22. 22. Who is struggling in week 3? What can be done about this? • (Personalised) feedback • (Personalised) examples • Peer support • Emotional/learning support
  23. 23. How to provide actionable feedback? Tempelaar, D. T., Rienties, B., Mittelmeier, J., & Nguyen, Q. (2018). Student profiling in a dispositional learning analytics application using formative assessment. Computers in Human Behavior, 78, 408-420.
  24. 24. Tempelaar, D. T., Rienties, B., Nguyen, Q. (2017). Towards actionable learning analytics using dispositions. IEEE Transactions on Learning Technologies, 1 (Jan-March 2017), 6-16.
  25. 25. Tempelaar, D. T., Rienties, B., Nguyen, Q. (2017). Towards actionable learning analytics using dispositions. IEEE Transactions on Learning Technologies, 1 (Jan-March 2017), 6-16.
  26. 26. School of Business and Economics Next question: can we increase our knowledge of why, how, and when students engage with e-tutorials? Given the crucial role of engagement with e-tutorials, this research tries to uncover the prior links in the chain: 1. The timing decisions of students: when do students practice? Three subsequent moments: – Tutorial session preparation – Quiz session preparation – Exam preparation 2. The learning mode decisions of students; preference for: – Worked examples – Tutored problem-solving – Untutored problem-solving 3. How do both decisions relate to learning dispositions?
  27. 27. School of Business and Economics Learning design & feedback in SOWISO. The SOWISO system provides different instructional designs (McLaren) with different types of learning feedback (Narciss). • The Ask Hint learning design represents tutored problem-solving: students receive feedback in the form of hints and an evaluation of provided answers, both during and at the end of the problem-solving steps, helping them solve the problem step by step. This is Knowledge of Result/response (KR) type feedback. • The Ask Solution learning design represents the worked-example approach. Many lab-based educational studies find it the most efficient design: ‘examples relieve learners of problem-solving that – in initial cognitive skill acquisition when learners still lack understanding – is typically slow, error-prone, and driven by superficial strategies’ (Renkl). It corresponds with Knowledge of the Correct Response (KCR) type feedback. • The Check Answer learning design represents untutored problem-solving: feedback is restricted to the evaluation of provided answers at the end of the problem-solving steps. In feedback terms: Multiple-Try type feedback.
  28. 28. School of Business and Economics Learning aids • Learning aids in SOWISO: – Check: untutored problem-solving – Theory: view a theory page explaining the mathematical principle – Solution: worked example – Hint: tutored problem-solving
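Analyses like the timing and learning-mode statistics on the following slides start from a log of which aid each student used on which exercise. As a sketch, aid usage events could be recorded and aggregated like this; the schema and field names are hypothetical, since the slides do not describe SOWISO's actual logging format.

```python
# Hypothetical event-log schema for SOWISO-style learning aids.
from dataclasses import dataclass
from collections import Counter

@dataclass
class AidEvent:
    student_id: str
    exercise_id: str
    aid: str        # "check" | "theory" | "solution" | "hint"
    week: int

# A few invented events for illustration
events = [
    AidEvent("s1", "ex1", "solution", 1),
    AidEvent("s1", "ex1", "check", 1),
    AidEvent("s2", "ex1", "hint", 3),
]

# Aggregate usage per aid type: the first step toward learning-mode statistics
usage = Counter(e.aid for e in events)
print(usage["solution"], usage["hint"])  # → 1 1
```

From such records one can derive per-exercise attempt counts and the share of attempts involving solutions or hints, the quantities reported on the next slides.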
  29. 29. School of Business and Economics Solution in SOWISO • The solution page provides a full worked example. After studying this, the next step is to ‘redo’: another version of the exercise loads (through parameterization), students can prove mastery in that new version.
  30. 30. School of Business and Economics RQ1: Descriptive statistics: timing decisions. How do students plan their learning, given the three crucial moments: • Weekly tutorial sessions, Week 1 … Week 7 • Biweekly quiz sessions, Week 3, Week 5, Week 7 • Final exam, end of course. [Bar chart: cumulative mastery% at tutorial session, quiz session, and exam.] Example: Week 1 topics are discussed in the tutorial session of Week 1, quizzed in Week 3, and tested in Week 8. Conclusion: the quiz is the most important learning target. Tutorial mastery: 27%; quiz mastery: 63%; exam mastery: 71% on average (mastery% is cumulative).
  31. 31. School of Business and Economics RQ2: Descriptive statistics: learning mode decisions. How do students plan their learning in terms of learning mode decisions: • What is the total number of attempts, averaged per exercise? • What share of these attempts are of the worked-example type: Solutions? • What share of these attempts are of the tutored problem-solving type: Hints? Averages: • Attempts = 1.78, so on average, every exercise is loaded 1.78 times • Hints = 0.07, so on average, 7 hints are called for every 100 exercise attempts • Solutions = 0.76, so on average, 0.76 solutions are called for every exercise attempt. Conclusion: students’ use of tutored problem-solving is quite low, but use of worked examples is frequent (43% of attempts involve a call for the solution). [Bar chart: average hints, solutions, and attempts per exercise.]
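The 43% figure follows from reading the reported averages as per-exercise rates (0.76 solution calls per exercise against 1.78 attempts per exercise); the slide's units are slightly ambiguous, so this is just the arithmetic, not a claim about the data.

```python
# Checking the solution-share arithmetic from the slide's averages.
attempts_per_exercise = 1.78    # average times each exercise is loaded
solutions_per_exercise = 0.76   # average solution calls per exercise

solution_share = solutions_per_exercise / attempts_per_exercise
print(f"{solution_share:.0%}")  # → 43%
```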
  32. 32. School of Business and Economics RQ2: Course performance, timing, and learning mode. To what extent do timing and learning mode decisions matter? Relationships with three course performance measures, beyond the control variables: • Mastery positively predicts performance, but only timely mastery: tutorial & quiz • Attempts positively predict performance, but only timely attempts: tutorial & quiz • Solutions called for negatively predict performance, but only timely calls: tutorial & quiz • Hints called for are unrelated to performance • Views called for negatively predict performance
  33. 33. Objectives of today’s workshop 1. What is learning analytics? 2. To what extent is learning analytics the new holy grail of learning and teaching? 3. How can instructional design be optimised using the principles of learning analytics? 4. With the introduction of student-led analytics, to what extent can learning analytics promote ‘personalisation’ or ‘generalisation’ for diverse populations of students? 5. Your preferred topic here 
  34. 34. Workshop activity • Discuss in pairs or groups • How can instructional design be optimised using the principles of learning analytics? • 10 minutes
  35. 35. Learning Design is described as “a methodology for enabling teachers/designers to make more informed decisions in how they go about designing learning activities and interventions, which is pedagogically informed and makes effective use of appropriate resources and technologies” (Conole, 2012).
  36. 36. Open University Learning Design Initiative (OULDI) activity types: • Assimilative – attending to information. Examples: read, watch, listen, think about, access, observe, review, study. • Finding and handling information – searching for and processing information. Examples: list, analyse, collate, plot, find, discover, access, use, gather, order, classify, select, assess, manipulate. • Communication – discussing module-related content with at least one other person (student or tutor). Examples: communicate, debate, discuss, argue, share, report, collaborate, present, describe, question. • Productive – actively constructing an artefact. Examples: create, build, make, design, construct, contribute, complete, produce, write, draw, refine, compose, synthesise, remix. • Experiential – applying learning in a real-world setting. Examples: practice, apply, mimic, experience, explore, investigate, perform, engage. • Interactive/Adaptive – applying learning in a simulated setting. Examples: explore, experiment, trial, improve, model, simulate. • Assessment – all forms of assessment, whether continuous, end of module, or formative (assessment for learning). Examples: write, present, report, demonstrate, critique. Conole, G. (2012). Designing for Learning in an Open World. Dordrecht: Springer. Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333-341.
  37. 37. So what is the design of a “typical” Dundee course?
  38. 38. Toetenel, L., Rienties, B. (2016). Analysing 157 Learning Designs using Learning Analytic approaches as a means to evaluate the impact of pedagogical decision-making. British Journal of Educational Technology, 47(5), 981–992.
  39. 39. Merging big data sets • Learning design data (>300 modules mapped) • VLE data • >140 modules with aggregated individual data weekly • >37 modules with individual fine-grained data daily • Student feedback data (>140 modules) • Academic performance (>140 modules) • Predictive analytics data (>40 modules) • Data sets merged and cleaned • 111,256 students undertook these modules
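Merging data sets like these comes down to joining per-module records on a shared module identifier. A minimal sketch follows; the field names, module IDs, and values are hypothetical, since the slides do not describe the actual OU data schema or tooling.

```python
# Hedged sketch: outer-join style merge of per-module data sets on module ID.
learning_design = {"M101": {"assimilative_pct": 40, "assessment_pct": 25}}
vle_activity = {"M101": {"avg_weekly_clicks": 1200},
                "M102": {"avg_weekly_clicks": 800}}
satisfaction = {"M101": {"satisfaction": 4.1}}

def merge_by_module(*sources):
    """Keep every module seen in any source; later sources add fields."""
    merged = {}
    for source in sources:
        for module_id, fields in source.items():
            merged.setdefault(module_id, {}).update(fields)
    return merged

modules = merge_by_module(learning_design, vle_activity, satisfaction)
print(sorted(modules))                  # → ['M101', 'M102']
print(modules["M101"]["satisfaction"])  # → 4.1
```

An outer-join style merge is the natural choice here, since not every module appears in every data source (e.g. only >40 modules have predictive analytics data).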
  40. 40. Nguyen, Q., Rienties, B., Toetenel, L., Ferguson, R., Whitelock, D. (2017). Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates. Computers in Human Behavior. DOI: 10.1016/j.chb.2017.03.028.
  41. 41. Nguyen, Q., Rienties, B., Toetenel, L., Ferguson, R., Whitelock, D. (2017). Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates. Computers in Human Behavior. DOI: 10.1016/j.chb.2017.03.028.
  42. 42. So what is the design of a “typical” Dundee course?
  43. 43. [Path model across 150+ modules, Week 1 to Week 30+: learning design types (constructivist, assessment, productive, socio-constructivist, communication) predicting VLE engagement, student satisfaction, and student retention.] Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333-341. Nguyen, Q., Rienties, B., Toetenel, L., Ferguson, R., & Whitelock, D. (2017). Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates. Computers in Human Behavior. DOI: 10.1016/j.chb.2017.03.028.
  44. 44. So what happens when you give learning design visualisations to teachers? Toetenel, L., Rienties, B. (2016) Learning Design – creative design to visualise learning activities. Open Learning: The Journal of Open and Distance Learning, 31(3), 233-244.
  45. 45. Toetenel, L., Rienties, B. (2016) Learning Design – creative design to visualise learning activities. Open Learning: The Journal of Open and Distance Learning, 31(3), 233-244.
  46. 46. Objectives of today’s workshop 1. What is learning analytics? 2. To what extent is learning analytics the new holy grail of learning and teaching? 3. How can instructional design be optimised using the principles of learning analytics? 4. With the introduction of student-led analytics, to what extent can learning analytics promote ‘personalisation’ or ‘generalisation’ for diverse populations of students? 5. Your preferred topic here 
  47. 47. Workshop activity • Discuss in pairs or groups • How can instructional design be optimised using the principles of learning analytics? • 10 minutes
  48. 48. “Excellent” students
  49. 49. “Failing” students
  50. 50. Hlosta, M., Herrmannova, D., Zdrahal, Z., & Wolff, A. (2015). OU Analyse: analysing at-risk students at The Open University. Learning Analytics Review, 1-16.
  51. 51. Hlosta, M., Herrmannova, D., Zdrahal, Z., & Wolff, A. (2015). OU Analyse: analysing at-risk students at The Open University. Learning Analytics Review, 1-16.
  52. 52. Hlosta, M., Herrmannova, D., Zdrahal, Z., & Wolff, A. (2015). OU Analyse: analysing at-risk students at The Open University. Learning Analytics Review, 1-16.
  53. 53. Hlosta, M., Herrmannova, D., Zdrahal, Z., & Wolff, A. (2015). OU Analyse: analysing at-risk students at The Open University. Learning Analytics Review, 1-16.
  54. 54. So what happens when you give learning analytics data about students to teachers? 1. How did 240 teachers within the 10 modules make use of PLA data (OUA predictions) and visualisations to help students at risk? 2. To what extent was there a positive impact on students' performance and retention when using OUA predictions? 3. Which factors explain teachers' uses of OUA?
  55. 55. Usage of OUA dashboard by participating teachers. Herodotou, C., Rienties, B., Boroowa, A., Zdrahal, Z., Hlosta, M., & Naydenova, G. (2017). Implementing predictive learning analytics on a large scale: the teacher's perspective. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (LAK '17), Vancouver, British Columbia, Canada.
  56. 56. Which factors better predict pass and completion rates? Regression analysis. Student characteristics: age, gender, new/continuing, disability, ethnicity, education, IMD band, best previous score, sum of previous credits. Teacher characteristics: module presentations per teacher, students per module presentation, OUA usage, module design. Herodotou, C., Rienties, B., Boroowa, A., Zdrahal, Z., & Hlosta, M. (submitted 01-08-2017). Using Predictive Learning Analytics to Support Just-in-time Interventions: The Teachers' Perspective Across a Large-scale Implementation.
  57. 57. Logistic regression results (pass rates): significant model (χ² = 76.391, p < .001, df = 24). • Nagelkerke’s R² = .185 (the model explains 18.5% of the variance in pass rates) • Correctly classified over 70% of cases (overall prediction success 70.2%: 33.5% for not passing a module and 88.7% for passing) • Significant predictors of both pass and completion rates: OUA usage (p = .006) and best previous module score achieved (p = .005) • All other predictors were not significant. Herodotou, C., Rienties, B., Boroowa, A., Zdrahal, Z., & Hlosta, M. (submitted 01-08-2017). Using Predictive Learning Analytics to Support Just-in-time Interventions: The Teachers' Perspective Across a Large-scale Implementation.
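Nagelkerke's pseudo-R² is computed from the log-likelihoods of the fitted and intercept-only models: it rescales the Cox-Snell statistic so its maximum is 1. The log-likelihood values below are made up for illustration; the slide does not report them.

```python
# How a Nagelkerke pseudo-R-squared is computed from model log-likelihoods.
import math

n = 1000           # number of cases (hypothetical)
ll_null = -650.0   # intercept-only model log-likelihood (hypothetical)
ll_model = -560.0  # fitted model log-likelihood (hypothetical)

# Cox-Snell pseudo-R-squared, then its upper bound for rescaling
cox_snell = 1 - math.exp(2 * (ll_null - ll_model) / n)
max_cox_snell = 1 - math.exp(2 * ll_null / n)
nagelkerke = cox_snell / max_cox_snell   # rescaled so the maximum is 1

print(round(nagelkerke, 3))  # → 0.226
```

Because the rescaling divides by a number below 1, Nagelkerke's R² is always at least as large as the Cox-Snell value it is derived from.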
  58. 58. Innovative Pedagogy 2017 Report: student-led analytics. • In the last decade, learning analytics have helped institutions, teachers, and policy makers to understand student learning and outcomes. These analytics make use of the data generated during study activity in order to enhance learning and teaching. They often focus on how teachers and institutions can help learners to pass a test, a module, or a degree. • Student-led analytics, on the other hand, not only invite students to reflect on the feedback they receive but also start them on the path of setting their own learning goals. • These analytics put learners in the driving seat. Learners can decide which goals and ambitions they want to achieve, and which types and forms of learning analytics they want to use to achieve those targets. • The analytics then support learners in reaching their goals.
  59. 59. Workshop 1: The power of Learning Analytics (Dalhousie 1S05) @DrBartRienties Professor of Learning Analytics
