Matthews personal response systems

Ian Matthews' presentation on teaching with technology at the Study Day on 5 March 11

  • Background – I teach mainly numeracy at a Category C training prison. There are lots of barriers to learning: security limits IT access (no internet – at least the learners aren't texting throughout the lesson); lessons are long (three hours, due to the prison regime); and there is constant churn with wide differentiation (Entry Level 1 to Open University level), so a lot of individualised learning, inevitably with a big element of worksheets.
  • These devices go by lots of names but are generally called clickers. We also had another system, Qwizdom, used with Promethean software.
  • The pedagogical approach I decided to use was based on a teaching model, Technology-Enhanced Formative Assessment (TEFA), developed at the University of Massachusetts (Beatty & Gerace 2009). It draws on pedagogical theory and on empirical research (including some of the studies above) indicating which approaches are successful in improving learning outcomes, and it is designed to promote interactive learning in large science lectures in a Higher Education setting. 1. Pose a question or problem to the students: do not teach first and then ask questions about what was taught; ask questions first, and use them as a context for sense-making and direct instruction. 2. Have students wrestle with the question alone and decide upon a response. 3. Use Activote to collect responses (even from students who are uncertain) and display a chart of the aggregated responses.
  • 4. Elicit from students as many different reasons and justifications for the chosen responses as possible, without revealing which is correct. In the process, draw out students' reasoning and vocabulary, expose them to each other's ideas, and make implicit assumptions explicit. 5. Develop a student-dominated discussion of the assumptions, perceptions, ideas, and arguments involved. Help students formulate their ideas and practise talking mathematics, find out why they think what they do, and guide their understanding. 6. Provide a summary, micro-lecture, meta-level comments, segue to another question, or whatever other closure seems warranted, informed by the detailed data just obtained on students' thinking.
  • These questions are taken from a Standards Unit lesson on misconceptions in probability. Representativeness: this is caused by subjects estimating the likelihood of an event based on how well an outcome represents some aspect of the population (Shaughnessy 2003; Hirsch & O'Donnell 2001). A good example is when people believe that the selection 1, 2, 3, 4, 5 and 6 is less likely to come up in the lottery than a more random-looking selection of numbers. The logic is that the lottery officials draw the numbers at random, so your selected combination should look random (Khazanov 2006).
  • The equiprobability bias: this involves attributing the same probability to different events in a random experiment, regardless of the chances for or against them. For example, when subjects are asked to compare the chances of different outcomes of three dice rolls, they tend to judge the chance of rolling three fives and the chance of obtaining exactly one five as equally likely (Lecoutre 1992).
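Both misconceptions above can be checked by direct calculation. The sketch below (not part of the original lesson materials) enumerates all 216 equally likely outcomes of three dice rolls to show that the two events in the equiprobability example are far from equally likely, and computes the odds of a 6-from-49 lottery draw to show that 1, 2, 3, 4, 5, 6 has exactly the same chance as any other selection:

```python
from itertools import product
from math import comb

# Equiprobability bias: enumerate all 6^3 = 216 equally likely
# outcomes of three dice rolls and count the two events that
# learners often judge to be equally likely.
outcomes = list(product(range(1, 7), repeat=3))

three_fives = sum(1 for o in outcomes if o.count(5) == 3)
exactly_one_five = sum(1 for o in outcomes if o.count(5) == 1)

print(three_fives, "/", len(outcomes))       # 1 / 216
print(exactly_one_five, "/", len(outcomes))  # 75 / 216

# Representativeness: in a 6-from-49 lottery every combination is a
# single, equally likely draw, so 1,2,3,4,5,6 has the same chance as
# any "random-looking" selection of six numbers.
p_any_combination = 1 / comb(49, 6)
print(p_any_combination)  # identical for every selection
```

Rolling three fives occurs in only 1 of the 216 outcomes, while exactly one five occurs in 75 of them, so the second event is 75 times more likely; and every lottery combination has the same probability of 1 in 13,983,816.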

    1. Using Personal Response Systems in an Interactive Way – Ian Matthews, 5th March 2011
    2. This is a case study of the use of this technology in teaching numeracy skills in a prison environment, where net-based resources are not available for security reasons.
    3. What is a Personal Response System?
       • Called "clickers" or "voting machines".
       • Our system is called "Activote".
    4. Problems with the original use of Activote in numeracy lessons
       • Tends to be used in a summative rather than a formative way; does not generate learning.
       • The activity generates a large element of competition, motivating for the majority, but:
         • it detracts from learning;
         • it can put off a minority of learners.
       • Currently tends to be used just for general-knowledge quizzes.
       • Zero take-up of the technology in other subjects.
    5. What did I think I could do about it?
       • There is extensive research into using PRS systems as an active learning tool (assessment for learning) in Higher Education.
       • Research indicates that the technology improves learner motivation and learning outcomes in HE environments.
       • To use it effectively, the key is pedagogy, not technology.
    6. What did I do about it?
       • Reintroduced Activote using the Technology-Enhanced Formative Assessment (TEFA) cycle (Beatty & Gerace 2009):
         1. Pose a question or problem to the students.
         2. Have students wrestle with the question alone and decide upon a response.
         3. Use Activote to collect responses (even from students who are uncertain) and display a chart of the aggregated responses.
    7. What will I do about it?
       4. Elicit from students as many different reasons and justifications for the chosen responses as possible, without revealing which is correct.
       5. Develop a student-dominated discussion.
       6. Provide a summary, micro-lecture, segue to another question, or whatever other closure seems warranted.
    8. Question design is crucial
       • The questions should become progressively more challenging and build on each other.
       • Questions should produce a wide set of responses.
       • "Identify student misconceptions and include them as answers, plausibly phrased" (Caldwell 2007: 18).
    9. The following examples are screen-shots of questions and response profiles.
    11. Response profile generated by the package.
    14. Findings
       • Observers noted that it encouraged peer discussion and peer teaching.
       • An effective diagnostic tool for formative assessment at the beginning of a topic.
       • It exposed and resolved misconceptions.
       • Learner feedback was positive relative to other learning activities.
    15. Findings
       • Some evidence of deeper conceptual learning being retained (no control group).
       • My teaching improved:
         • contingent teaching;
         • facilitating discussions rather than teaching.
       • Anonymity was not seen as a benefit.
       • Competition was not a negative factor.
    16. References
       • Beatty, I.D. & Gerace, W.J. (2009). "Technology-Enhanced Formative Assessment: A Research-Based Pedagogy for Teaching Science with Classroom Response Technology." Journal of Science Education and Technology, 18, 146–162.
       • Caldwell, J.E. (2007). "Clickers in the Large Classroom: Current Research and Best-Practice Tips." CBE—Life Sciences Education, 6 (Spring 2007), 9–20.
