Achieving feedback at scale with Electronic Voting Systems



A lecture prepared for students on the MA Education, Technologies for Learning unit. Part of the University of Bath Department of Education summer school 2014.

Published in: Education

Achieving feedback at scale with Electronic Voting Systems

  2. Achieving feedback at scale with Electronic Voting Systems
  3. Schedule
     • Determine motivations for the use of EVS to enhance the student learning experience.
     • Examine case studies of use and evaluate the effectiveness of the technology.
     • Explore briefly the different types of EVS, acknowledging their affordances and limitations.
  4. Which team won the World Cup Final 2014?
     a. Argentina
     b. Germany
     c. England
     (Live results: a = 0, b = 7, c = 0)
  5. Process (diagram): Student → Handset → VOTE! → Feedback
  6. A different way
     • Go to
     • Log in with the Session ID 720370
  7. How did you get to campus this morning?
     a. Car – “I drove”
     b. Car – “I was a passenger”
     c. Walk
     d. Run
     e. Cycle
     f. Bus
     g. Living on campus
     h. None of the above
  8. Process (diagram): Student → Handset or data-connected device → VOTE! → Feedback
  9. In summary
     • Users participate actively in a variety of contexts
     • Immediate submission of responses
       1. With a range of devices
       2. Anonymously or identified
     • Responses viewed as a range of graph types
     • Reporting on collected data is possible
  10. Caveat
     Bell (1998) argues that, “The MCQ format holds world records in the categories of most popular, most unpopular, most used, most misused, most loved, and most hated”
  11. Why use EVS? (Bruff, 2009)
     • Increases student attendance, participation and enjoyment
     • Provides both teachers and learners useful feedback on student learning
     “Teaching methods that use active learning, such as small-group and classwide discussion methods, typically result in improved learning over methods in which students play more passive roles.” (p. 5)
  12. Why use EVS?
     RATIONALE: Gives prompt feedback (Chickering & Gamson, 1987)
     Example: Diagnostic testing to support formative assessment
  13. Why use EVS?
     RATIONALE: Helps clarify what good performance is (goals, criteria, expected standards) (Nicol & Macfarlane-Dick, 2006)
     Example: End of semester revision quiz with an included competitive element
  14. Why use EVS?
     RATIONALE: Provides information to teachers that can be used to help shape teaching. (Nicol & Macfarlane-Dick, 2006)
  15. Some considerations (Bruff, 2009)
     • What student learning goals do I have for the question?
     • What do I hope to learn about my students by asking this question?
     • What will my students learn about each other when they see the results of this question?
     • How might this question be used to engage students with course content in small-group or classwide discussions?
     • What distribution of responses do I expect to see from my students?
     • What might I do if the actual distribution turns out very differently?
  16. Assertion reason questions
     • Assertion reason questions (ARQs) are a developed form of MCQs.
     • The aim is to develop a question set that tests reasoning (procedural knowledge) rather than recall (declarative knowledge).
     • ARQs test two statements per question (the assertion and the reason) as well as the validity of the ‘because’ link between them.
  17. Is grass green?
     a. Yes, because it contains a pigment known as chlorophyll
     b. Yes, because it excretes a gas known as ammonia
     c. No, because it is red in colour and then dyed green
     d. No, because the effects of oxidation mean that the colour is blue
     (Live results: a = 7, b = 0, c = 0, d = 0)
  18. Mazur’s Peer Instruction Sequence
     1. Concept question posed
     2. Individual thinking: students given time to think individually
     3. Students provide individual responses
     4. Students receive feedback (as a histogram)
     5. Small group discussion
     6. Retesting of same concept
     7. Students provide individual responses
     8. Students receive feedback (as a histogram)
     9. Lecturer summarises and explains "correct" response(s)
  19. Enhancing feedback (Davenport, Hayes & Parmar, 2009)
  20. Technologies
     • Ombea
     • TurningPoint
     • Poll Everywhere (free for up to 40 users)
     • Kahoot! (game-based response system)
  21. Questions?
  22. Banff, AB, Canada
     Nitin Parmar SFHEA CMALT MBCS
     Learning Technologist
     @nrparmar |
  23. Images
     [1] © #32210678 | © alengo | 25-12-2013
     [5,8,21] © OMBEA AB
     [8] © XOO.ME
     [21] © TurningTechnologies LLC.
     [22] © Nitin Parmar
  24. References
     Bell, R. A. (1998). A humorous account of 10 multiple-choice test-item flaws that clue testwise students. Electronic Journal on Excellence in College Teaching, 9(2).
     Bruff, D. (2009). Teaching with Classroom Response Systems. San Francisco, CA: Jossey-Bass.
     Chickering, A. W., & Gamson, Z. F. (1987). Seven Principles for Good Practice in Undergraduate Education. AAHE Bulletin.
     Davenport, J. H., Hayes, A., & Parmar, N. R. (2009). The use of an Electronic Voting System to enhance student feedback. In: Proceedings of the 4th Plymouth e-Learning Conference: Boundary Changes: Redefining Learning Spaces, 23–24 April 2009, University of Plymouth.
     Mazur, E. (1997). Peer Instruction: A User’s Manual. Upper Saddle River, NJ: Prentice Hall.
     Narduzzo, A., & Parmar, N. R. (2011). Chalk, Talk, Digital Pens and Audience Response Systems – Combining tradition and technology to improve maths learning. In: Innovations Day, 12 May 2011, University of Bath.
     Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199–218.
     Williams, J. B. (2006). Assertion-reason multiple choice testing as a tool for deep learning: a qualitative analysis. Assessment and Evaluation in Higher Education, 31, 287–301.