
The Moodle quiz at the Open University


My talk to the 5th MoodleMoot Croatia


  1. The Moodle Quiz at the Open University: how we use it & how that helps students. Tim Hunt, Senior Developer. 5th MoodleMoot Croatia.
  2. The Moodle Quiz at the Open University: how we use it & how that helps students. Tim Hunt, Senior Developer, and many dedicated OU teaching staff. 5th MoodleMoot Croatia.
  3. The Open University
  4. About the Open University: 45 years old; all distance education; ~200 000 students (part time); used Moodle since 2005.
  5. Where we are. Contains Ordnance Survey data © Crown copyright and database right. Licensed under CC BY-SA 3.0 via Wikimedia Commons - http://commons.wikimedia.org/wiki/File:Buckinghamshire_UK_location_map.svg
  6. Where we are
  7. Where we are
  8. Moodle at the OU
  9. A typical module website
  10. Daily load
  11. Lots of servers
  12. The quiz at the OU
  13. A typical quiz – Maths
  14. A typical quiz – Languages
  15. Overall use
  16. It matters
  17. End of module survey results (chart, scale 0%–100%): KPI 01 Overall, I am satisfied with the quality of this module; Q33 Clear understanding of what was required to complete the assessed work; Q34 The assessment activities supported my learning.
  18. It makes a difference
  19. Level 3 physics (SM358)
  20. SM358 assessment tasks: a grid of tutor-marked assignments (0 TMAs to 4 TMAs) against interactive computer-marked assignments (0 iCMAs to 6 iCMAs).
  21. Students completing tasks (percentage of students; rows: number of iCMAs completed; columns: 0 TMAs to 4 TMAs)
      0 iCMAs: 11.6% 3.4% 1.5% 0.5% 0.5%
      1 iCMA: 1.5% 1.0%
      2 iCMAs: 1.5% 2.4% 1.5%
      3 iCMAs: 1.5%
      4 iCMAs: 5.3% 2.4%
      5 iCMAs: 0.5% 3.9% 5.8% 8.2%
      6 iCMAs: 0.5% 0.5% 5.8% 5.8% 34.3%
  22. Average exam scores (rows: number of iCMAs completed; columns: 0 TMAs to 4 TMAs)
      0 iCMAs: 6.0
      1 iCMA:
      2 iCMAs: 17.0 24.0
      3 iCMAs: 60.0
      4 iCMAs: 43.7 62.0
      5 iCMAs: 23.0 46.0 62.6 69.5
      6 iCMAs: 35.3 60.8 77.5
  23. Exam scores vs prediction (rows: number of iCMAs completed; columns: 0 TMAs to 4 TMAs)
      0 iCMAs: −20.8
      1 iCMA:
      2 iCMAs: −43.9 −27.5
      3 iCMAs: −9.0
      4 iCMAs: −15.6 +1.8
      5 iCMAs: −3.8 −11.1 +1.4 +2.4
      6 iCMAs: −17.1 +3.4 +4.6
  24. Changing assessment type
  25. T184 robotics & meaning of life
      Before: 10% mid-course iCMA; 90% final written EMA (part 1 short-answer, part 2 programming & essays).
      After: 10% mid-course iCMA; 30% final iCMA; 60% final written EMA.
  26. Module completion rates (chart: T184 completion rates, 60%–100%, May and Oct presentations, 2004–2011, annotated "Introduction of CME")
  27. T184 completion rates (chart repeated)
  28. T184 completion rates (chart repeated)
  29. Deadlines
  30. SM358 iCMA51 submit date (2010, advisory deadline)
  31. SM358 iCMA51 submit date (2010 advisory deadline vs 2012 hard deadline)
  32. Grades
  33. Optional quizzes
  34. Compulsory quizzes
  35. Can computers grade sentences?
  36. Spoiler: Yes!
  37. How good are humans? Percentage of responses where the markers agreed with the question author, per question:
      Question | Responses analysed | Human markers, range over 6 markers (%) | Human markers, mean (%) | Computer marking (%)
      A | 189 | 97.4–100.0 | 98.9 | 99.5
      B | 248 | 83.9–97.2 | 91.9 | 97.6
      C | 150 | 80.7–94.0 | 86.9 | 94.7
      D | 129 | 91.5–98.4 | 96.7 | 97.6
      E | 92 | 92.4–97.8 | 95.1 | 98.9
      F | 129 | 86.0–97.7 | 90.8 | 97.7
      G | 132 | 66.7–90.2 | 83.2 | 89.4
  38. Comparing three algorithms. Percentage of responses where computer marking agreed with the question author (a rough marking sketch follows the transcript, after the key points):
      Question | Responses analysed | Computational linguistics (IAT) | Algorithmic manipulation of keywords (Pattern-match) | Regular expressions
      A | 189 | 99.5 | 99.5 | 98.9
      B | 248 | 97.6 | 98.8 | 98.0
      C | 150 | 94.7 | 94.7 | 90.7
      D | 129 | 97.6 | 96.1 | 97.7
      E | 92 | 98.9 | 96.7 | 96.7
      F | 129 | 97.7 | 88.4 | 89.2
      G | 132 | 89.4 | 87.9 | 88.6
  39. Summary
  40. References
      • Overall iCMA usage numbers collated by Phil Butcher.
      • End of module survey results from Student Analytics in the Institute of Educational Technology, via Linda Price.
      • SM358 data from John Bolton.
      • T184 data from Jon Rosewell.
      • SDK125, S141, S151 & S240 data from Sally Jordan.
      • Jordan, Sally (2014). Using e-assessment to learn about students and learning. International Journal of e-Assessment, 4(1).
      • Jordan, Sally (2014). Adult science learners' mathematical mistakes: an analysis of responses to computer-marked questions. European Journal of Science and Mathematics Education, 2(2), pp. 63–86.
      • Jordan, Sally (2012). Short-answer e-assessment questions: five years on. In: 2012 International Computer Assisted Assessment Conference, 10–11 July 2012, Southampton.
      • Pattern-match question type: https://moodle.org/plugins/view/qtype_pmatch
      • STACK question type (maths): https://moodle.org/plugins/view/qtype_stack
      • Sangwin, Chris (2013). Computer Aided Assessment of Mathematics.
  41. Key points
      Getting the assessment right is important.
      Online quizzes can be powerful learning tools.
      Computers can grade much more than multi-choice, but only on behalf of a teacher.
      Analyse the data and you can learn (a quiz-analysis sketch appears below):
      • Is this quiz working?
      • What are my students' misconceptions?
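
Note on computer marking of short answers (slides 35–38). All three algorithms compared above come down to rules, written by the question author, that accept or reject a free-text response. As a rough illustration only, here is a minimal Python sketch of the "regular expressions" style of marking. The question, patterns and feedback are invented for this example; it is not the code of the OU's pattern-match (qtype_pmatch) question type, which uses its own richer pattern syntax rather than raw regular expressions.

import re

# Hypothetical marking rules for one short-answer question:
# "Why do you see the lightning before you hear the thunder?"
# A response is accepted if any accept pattern matches.
ACCEPT_PATTERNS = [
    r"\blight\b.*\btravels?\b.*\bfaster\b.*\bthan\b.*\bsound\b",
    r"\bsound\b.*\btravels?\b.*\bslower\b.*\bthan\b.*\blight\b",
]

# Patterns that catch a common misconception, so targeted feedback can be given.
REJECT_PATTERNS = {
    r"\bsound\b.*\bfaster\b.*\bthan\b.*\blight\b":
        "Check again which of the two travels faster.",
}

def mark_response(response):
    """Return (correct, feedback) for one free-text response."""
    text = " ".join(response.lower().split())  # normalise case and whitespace
    for pattern in ACCEPT_PATTERNS:
        if re.search(pattern, text):
            return True, "Correct."
    for pattern, feedback in REJECT_PATTERNS.items():
        if re.search(pattern, text):
            return False, feedback
    return False, "Incorrect. Have another look at the course text."

if __name__ == "__main__":
    for answer in ["Light travels much faster than sound does",
                   "sound is faster than light"]:
        print(answer, "->", mark_response(answer))

The point of the agreement tables in slides 37 and 38 is that rules of this kind, once tuned against a few hundred real student responses, can agree with the question author about as often as trained human markers do.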

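On the last key point, "analyse the data and you can learn": Moodle's quiz statistics report already provides per-question statistics, but the underlying idea is simple enough to sketch. The following is a minimal sketch assuming a hypothetical export with one row per attempt and each question score rescaled to 0–1; the data and field names are invented, and the two measures shown (a facility index and a correlation-based discrimination measure) are standard item statistics, not necessarily computed exactly as Moodle computes them.

from statistics import mean
from math import sqrt

# Hypothetical export: one row per quiz attempt, one entry per question,
# each score already rescaled to the range 0..1.
attempts = [
    {"q1": 1.0, "q2": 0.5, "q3": 0.0},
    {"q1": 1.0, "q2": 1.0, "q3": 0.5},
    {"q1": 0.5, "q2": 0.0, "q3": 0.0},
    {"q1": 1.0, "q2": 1.0, "q3": 1.0},
]

def facility(question):
    """Average score on the question: low values flag questions most students get wrong."""
    return mean(a[question] for a in attempts)

def discrimination(question):
    """Correlation between the question score and the rest-of-quiz score.
    Values near zero (or negative) suggest the question does not separate
    stronger students from weaker ones."""
    xs = [a[question] for a in attempts]
    ys = [sum(v for q, v in a.items() if q != question) for a in attempts]
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

for q in ("q1", "q2", "q3"):
    print(q, "facility:", round(facility(q), 2),
          "discrimination:", round(discrimination(q), 2))

A question with very low facility, or with discrimination near zero, is the one to look at first: either the question is not working, or it has uncovered a widespread misconception worth addressing in the teaching materials.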