
James Bond, Monorail Cat and partying penguins: what happens when you let students design their own assessment content


  1. James Bond, internet memes and partying penguins (or, what happens when students write their own assessment content). Simon Bates. Pearson Strategies for Success Workshop, Toronto, May 2013
  2. [image slide]
  3. [image slide]
  4. Overview: I. Motivation; II. Technology enabler: PeerWise; III. Use cases; IV. Engagement, learning, question quality?
  5. Overview (section divider, repeating slide 4)
  6. Overview (section divider, repeating slide 4)
  7. Paul Denny, "PeerWise: bridging the gap between online learning and social media", Department of Computer Science, The University of Auckland, New Zealand. Talk given at The University of Edinburgh, Scotland, 5th July 2010
  8. PeerWise (peerwise.cs.auckland.ac.nz): a web-based multiple-choice question repository built by students. Students develop new questions with associated explanations, answer existing questions and rate them for quality and difficulty, take part in discussions, and can follow other authors. (A data-model sketch follows below.)
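To make that concrete, here is a minimal sketch of the kind of record such a repository might store. PeerWise's internal schema is not public, so every class, field, and method name below is an assumption; only the ingredients (question, explanation, peer ratings for quality and difficulty, comments) come from the slide.

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    """Hypothetical record for one student-authored question."""
    author: str
    stem: str                        # the question text
    options: list[str]               # the multiple-choice alternatives
    correct_index: int               # position of the intended answer
    explanation: str                 # the author's accompanying explanation
    quality_ratings: list[int] = field(default_factory=list)     # peer ratings
    difficulty_ratings: list[int] = field(default_factory=list)  # peer ratings
    comments: list[str] = field(default_factory=list)            # discussion

    def mean_quality(self) -> float:
        """Average peer quality rating, or 0.0 if not yet rated."""
        if not self.quality_ratings:
            return 0.0
        return sum(self.quality_ratings) / len(self.quality_ratings)
```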
  9. >100,000 student contributors; >500,000 unique questions; >10,000,000 answers
  10. As a question author…
  11. [image slide]
  12. [image slide]
  13. As a question answerer…
  14. [image slide]
  15. [image slide]
  16. [image slide]
  17. [image slide]
  18. Overview (section divider, repeating slide 4)
  19. Timeline: 2010-11, UoE pilot study; 2011-12, multi-institution, multi-course; 2012-13, UBC PHYS 101 and a Coursera MOOC
  20. Pilot year (2010-11), replacing a single hand-in exercise. PeerWise was introduced in workshop sessions in Week 5; students worked through a structured example task and devised their own questions in groups. All these resources are available online (see final slide).
  21. An assessment was set for the end of Week 6. Minimum requirements: write one question; answer 5; comment on and rate 3. Contributed ~3% to course assessment. (A sketch of this check follows below.)
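Since the minimum requirements reduce to three activity counts, the compliance check is a one-line conjunction. A minimal sketch, assuming per-student counts are available; the function name is hypothetical, and only the thresholds (1 written, 5 answered, 3 commented on and rated) come from the slide.

```python
def met_minimum_requirements(written: int, answered: int, rated: int) -> bool:
    """True if a student met the Week-6 PeerWise minimums (1 / 5 / 3)."""
    return written >= 1 and answered >= 5 and rated >= 3

# Example: wrote 2 questions, answered 6, commented on and rated 3 -> passes.
assert met_minimum_requirements(written=2, answered=6, rated=3)
# Answering only 4 questions fails the check.
assert not met_minimum_requirements(written=1, answered=4, rated=3)
```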
  22. [image slide]
  23. We were deliberately hands-off: no moderation, no corrections, no interventions at all. But we did observe…
  24. JISC project – SGC4L
  25. N (students) ~800; N (staff) ~10
  26. Overview (section divider, repeating slide 4)
  27. Generally, students did: participate beyond the minimum requirements; engage in community learning, correcting errors; create problems, not exercises; provide positive feedback.
  28. [image slide]
  29. Generally, students did not: contribute trivial or irrelevant questions; obviously plagiarise; participate much beyond assessment periods; or all leave it to the last minute.
  30. [Figure: PHYS 101 uptake graph showing the midterm]
  31. [image slide]
  32. [image slide]
  33. Overview (section divider, repeating slide 4)
  34. Correlation with end-of-course outcomes
  35. Quartiles: Q4 – top 25%; Q3 – upper middle; Q2 – lower middle; Q1 – bottom 25%. 22 students did not take the FCI (Force Concept Inventory). (A quartile-split sketch follows below.)
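As a sketch of how such a quartile split can be computed: the slide does not say which score the ranking uses, so the numeric "score" below (and the tie-breaking at the boundaries) is an assumption.

```python
import numpy as np

def assign_quartiles(scores: list[float]) -> list[int]:
    """Label each student 1 (bottom 25%) through 4 (top 25%) by score."""
    q1, q2, q3 = np.percentile(scores, [25, 50, 75])  # quartile boundaries
    return [1 if s <= q1 else 2 if s <= q2 else 3 if s <= q3 else 4
            for s in scores]

# Example: eight scores split into the four bands described on the slide.
print(assign_quartiles([12, 18, 22, 25, 31, 34, 40, 44]))  # [1, 1, 2, 2, 3, 3, 4, 4]
```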
  36. [Figure] 1st-year Physics, N = 172, University of Edinburgh
  37. [Figure] 1st-year Physics, N = 172, University of Edinburgh
  38. Overview (section divider, repeating slide 4)
  39. Comprehensive categorisation of >50% of the repository for two successive academic years. Principal measures to define a 'high-quality question': cognitive level of the question; explanation quality; other binary criteria.
  40. Cognitive level of question (category – description): 6 – Create (synthesise ideas); 5 – Assess; 4 – Analyse (multi-step); 3 – Apply (1-step calcs.); 2 – Understand; 1 – Remember
  41. Explanation quality: 0 – Missing; 1 – Inadequate (e.g. wrong reasoning or answer, trivial, flippant, unhelpful); 2 – Minimal (e.g. correct answer, but with insufficient explanation or justification; aspects may be unclear); 3 – Good/Detailed (e.g. clear and sufficiently detailed exposition of the correct method and answer); 4 – Excellent (e.g. describes the physics thoroughly, remarks on the plausibility of the answer, uses appropriate diagrams, perhaps explains the reasoning behind distractors). (Both scales are encoded in the sketch below.)
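The two scales on slides 40 and 41 translate directly into enumerations, which makes the 'high-quality' criteria on slide 44 easy to state in code. The level numbers and descriptions are the slides' own; only the Python names are mine.

```python
from enum import IntEnum

class CognitiveLevel(IntEnum):
    """Slide 40: cognitive level of a question."""
    REMEMBER = 1
    UNDERSTAND = 2
    APPLY = 3        # 1-step calculations
    ANALYSE = 4      # multi-step
    ASSESS = 5
    CREATE = 6       # synthesise ideas

class ExplanationQuality(IntEnum):
    """Slide 41: quality of the accompanying explanation."""
    MISSING = 0
    INADEQUATE = 1   # wrong reasoning/answer, trivial, flippant, unhelpful
    MINIMAL = 2      # correct answer but insufficient justification
    GOOD = 3         # clear, sufficiently detailed exposition
    EXCELLENT = 4    # thorough, checks plausibility, diagrams, distractors
```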
  42. [Chart] Results: question level, Physics 1A/1B 2011. Percentage of submitted questions by taxonomic category (1–6); first semester N = 350, second semester N = 252.
  43. [Chart] Results: explanation quality, Physics 1A, 2010 and 2011
  44. A 'high-quality' question: 1. at least 2/6 on cognitive level ("understand" and above); 2. at least 2/4 on explanation ("minimal" and above); 3. clearly worded (binary); 4. feasible distractors (binary); 5. 'most likely' correct (binary); 6. 'not obviously' plagiarised (binary). (Expressed as a predicate below.)
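Because the definition is simply the conjunction of six criteria, it reduces to a single predicate. A sketch with hypothetical parameter names; only the thresholds and the integer scales (from slides 40 and 41) come from the slides.

```python
def is_high_quality(cognitive_level: int,      # 1-6 scale from slide 40
                    explanation: int,          # 0-4 scale from slide 41
                    clearly_worded: bool,
                    feasible_distractors: bool,
                    likely_correct: bool,
                    not_plagiarised: bool) -> bool:
    """True when all six 'high-quality' criteria from slide 44 hold."""
    return (cognitive_level >= 2       # "understand" and above
            and explanation >= 2       # "minimal" and above
            and clearly_worded
            and feasible_distractors
            and likely_correct
            and not_plagiarised)
```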
  45. Results: Physics 1A, 2010 and 2011, two successive years of the same course (N = 150, 350). 'High-quality' questions: 78%, 79%. Over 90% (most likely) correct, and 3/5 of those that were wrong were identified by students. 69% (2010) and 55% (2011) rated 3 or 4 for explanations. Only 2% (2010) and 4% (2011) rated 1/6 for taxonomic level.
  46. Literature. Bottomley & Denny, Biochem. Mol. Biol. Educ. 39(5), 352–361 (2011): 107 Year-2 biochemistry students; 56/35/9% of questions in the lowest three levels. Momsen et al., CBE–Life Sci. Educ. 9, 436–440 (2010): a study of "9,713 assessment items submitted by 50 instructors in the United States reported that 93% of the questions asked on examinations in introductory biology courses were at the lowest two levels of the revised Bloom's taxonomy".
  47. Summary: high general standard of engagement and student-generated questions; relatively few basic-knowledge questions; transferable across disciplines and institutions; we hypothesise that scaffolding activities are critical for high-level cognitive engagement.
  48. Acknowledgements. Physics 101 course team: Georg Rieger, Firas Moosvi, Emily Altiere; UBC CWSEI. Ross Galloway, Judy Hardy, Karon McBride, Alison Kay, Keith Brunton, Jonathan Riise, Danny Homer. Chemistry – Peter Kirsop; Biology – Heather McQueen; Physics – Morag Casey; Comp Sci – Paul Denny. Contact: simon.bates@ubc.ca, @simonpbates
  49. Resources. Community: http://www.PeerWise-Community.org. JISC-funded multi-institution study: https://www.wiki.ed.ac.uk/display/SGC4L/Home. UoE Physics pilot study: AIP Conf. Proc. 1413, 359, http://dx.doi.org/10.1063/1.3680069. RSC overview article: http://www.rsc.org/Education/EiC/issues/2013January/student-generated-assessment.asp. UoE Physics scaffolding resources: http://www2.ph.ed.ac.uk/elearning/projects/peerwise/
  50. Publications in preparation / review / press. Question-quality analysis (1st-year Physics, University of Edinburgh): "Assessing the quality of a student-generated question repository", submitted to Phys. Rev. ST Phys. Educ. Res. Multi-institution, multi-course study: "Student-generated content: enhancing learning through sharing multiple-choice questions", submitted to the International Journal of Science Education; "Scaffolding Student Learning via Online Peer Learning", submitted to the International Journal of Science Education.
  51. [Cartoon] Copyright 2013 Graham Fowell / The Hitman, reproduced with permission, Education in Chemistry, Vol. 50 No. 1 (2013)
  52. Photo credits. Community: http://www.flickr.com/photos/kubina/471164507/. Screen grab from Michael Wesch's "A Vision of Students Today": http://www.youtube.com/watch?v=dGCJ46vyR9o
