REAP Assessment/Feedback Principles and Examples

  1. Martin Hawksey, eLearning Advisor (Higher Education)
     Technology supported assessment
     Increasing learner success with technology supported assessment
     April 28, 2011
  2. Plan
     Defining assessment and feedback
     Outline the REAP project and its philosophy towards a 'principled' approach to course redesign
     Some examples of REAP pilots
     Questions and discussion
  3. Defining assessment and feedback
     Assessment – in the narrow sense, an exam for external accreditation
     Feedback – in the narrow sense, what the tutor writes about/on a finished piece of student work
     REAP was also interested in:
     Self-assessment and reflection – a form of internally generated feedback
     Peer dialogue as feedback, and peer assessment
  4. Definitions
     Who is involved in formative assessment and feedback?
     Tutor
     Peers
     External (e.g. placement supervisor)
     Computer generated
     Self
  5. Why take formative assessment and feedback seriously?
     Assessment is a key driver of student learning
     Assessment is a major cost in HE (staff time)
     Widely reported that students don't read the feedback
     Dropout/retention – linked to academic experience
     First-year experience – students need regular and structured feedback opportunities
     National Student Survey (NSS) – students are dissatisfied with feedback
     QAA reports – main area of criticism in England
  6. NSS 2007 – Assessment and Feedback Results
     Nationally only 55% of students think feedback is prompt and has helped to clarify things they did not understand [Scotland: 48%]
     Nationally only 63% of students agree that they have received detailed comments on their work [Scotland: 49%]
  7. Key messages
     Formative assessment and feedback by others can only have an impact on learning when it influences a student's own self-regulatory processes (adapted from Boud, 1995)
     Students are already self-assessing and generating feedback, so we should build on this capacity (Nicol and Macfarlane-Dick, 2004)
  8. REAP
     3 HEIs (Strathclyde, Glasgow Caledonian Business School, Glasgow University)
     Targeting large first-year classes
     Multi-disciplinary as well as faculty-wide (19 pilots, ~6,000 students)
     Range of technologies: online tests, simulations, discussion boards, e-voting, e-portfolios, peer/feedback software, admin systems, VLEs, offline-online
     A 'principled' approach to designing and embedding assessment practices
  9. Scaffolding self-regulation: 7 principles of good feedback (assessment design)
     Clarify what good performance is (goals, criteria, standards)
     Facilitate reflection and self-assessment in learning
     Deliver high-quality feedback to students: feedback that enables students to monitor and self-correct
     Encourage peer and tutor dialogue around learning
     Encourage positive motivational beliefs and self-esteem through assessment
     Provide opportunities to close the feedback loop
     Use feedback information to shape teaching
     Source: Nicol and Macfarlane-Dick (2006)
  10. Two super-principles
      Super-principle 1: developing learner self-regulation (empowerment), i.e. steers to encourage ownership of learning – the seven principles above
      Super-principle 2: time on task and effort (engagement), i.e. steers on how much work to do and when – Gibbs and Simpson's four conditions
      Case examples from REAP – applying these principles/conditions
  11. REAP Pilots (1)
      Department of Mechanical Engineering
      Students: 250
      Technology: Commercial online homework packages, electronic voting system (EVS)
      Assessment Activities: Weekly tests provide on-demand feedback on student performance for both students and tutors. Just-in-time teaching supported by interactive classes using EVS.
      Efficiencies: 60% reduction in assessment workload, saving 102 staff hours. Licence cost of commercial packages £12.95 and £7 per student per annum.
      Learning Gains: Results from the 2006/07 diet indicate strong class attainment was maintained (90% pass rate, 65% average).
      "I think it's managed to save a lot of time for ourselves and the tutors and given them more time to develop what they are going to talk about and give more time for them to speak to people individually if they need it." (Student comment)
  12. REAP Pilots (2)
      Department of Psychology
      Students: 560
      Technology: Online collaborative group tasks supported by VLE message board
      Assessment Activities: Regular collaborative tasks support peer feedback processes and student engagement.
      Efficiencies: 50% of lectures replaced with online tasks. Staff time re-directed to support online tasks.
      Learning Gains: Significant overall improvement in average exam pass mark (51.1% in the 2005/06 diet rising to 57.4% in 2006/07). Exam failure rate reduced from 13% to 5%. Course failure rate reduced from 12.1% to 2.8%.
  13. REAP Pilots (3)
      Department of Hospitality & Tourism Management
      Students: 200
      Technology: Podcasts, electronic voting system (EVS)
      Assessment Activities: Regular podcast releases support changes to lectures to include interactive discussion using EVS.
      Efficiencies: 50% reduction in lectures (however, lectures delivered twice to smaller groups). Licensing and development costs associated with podcasts (c. £1k per 30-minute podcast).
      Learning Gains: Significant gain in overall exam mark in the 2006/07 diet (+12.2%) compared with 2005/06. Significant reduction (-25%) in students receiving a non-qualification.
  14. REAP Pilots (4)
      Department of Modern Languages
      Students: 200
      Technology: Online homework and tests supported by VLE, electronic voting system (EVS)
      Assessment Activities: Diagnostic tests of student knowledge at the start of the year to inform teaching. Formative feedback from regular online testing and EVS classroom use.
      Efficiencies: Effective delivery of the course made possible despite significant cuts in funding and staffing. Tutorials reduced by 50% and replaced with online tasks. Listening classes reduced from 360 to 160 hours. Saving of 200 staff hours.
      Learning Gains: Failure rate in the final exam reduced from 24% to 4.6% compared with the 2005/06 diet.
      "Having almost immediate feedback on marks was very useful as I was aware at every point as to how well I was coping." (Student comment)
  15. REAP Pilots (5)
      School of Pharmacy – Pharmacy Practice 3
      Students: 240
      Technology: Online simulation tutorial
      Assessment Activities: Regular online tasks to improve student engagement, multiple opportunities for self-testing and regular feedback.
      Efficiencies: Savings in staff time due to reduced need for remedial revision work with individual students.
      Learning Gains: Significant gain in overall exam mark in 2006/07 (+16%) compared to the 2005/06 diet.
      "The tutorial was an excellent resource and learning tool to supplement our class." "Simulated exactly what it would be like to carry out a check on a prescription, allowing us to experience the difficulties involved and discover where we needed improvement." (Student comments)
  16. REAP Pilots (6)
      School of Pharmacy – Foundation Pharmacy
      Students: 120
      Technology: e-portfolio, electronic feedback form
      Assessment Activities: Improved tutor feedback to students supported by the feedback form, enhanced opportunities for reflection activities.
      Efficiencies: Staff workload increased during the test implementation, but reductions are anticipated in future years.
      Learning Gains: No significant improvements in academic performance reported. 89% of students receiving feedback via the feedback form agreed the feedback was 'useful' or 'very useful'.
      "The feedback form was especially useful; I found it easier to work from as it was segmented into the different aspects of the report I had written and had comments on both the strong and weak elements of my report." (Student comment)
  17. Slight detour
      A more detailed example: Audio/Video Feedback
  18. Question for you
      Question: What are the opportunities for change at UHI?
  19. Change
      Challenge to Change
      JISC Typology of technology use in assessment and feedback document
  20. Do you have the policies in place to back change?
      Strathclyde policies and the 'Feedback is a Dialogue' site
      University of Dundee Online Assessment Policy and Procedures
  21. Relevant papers
      Nicol, D. (in press). Laying the foundation for lifelong learning: case studies of technology supported assessment processes in large first year classes. British Journal of Educational Technology (to be published July 2007).
      Nicol, D. (2007). E-assessment by design: using multiple-choice tests to good effect. Journal of Further and Higher Education.
      Nicol, D. & Milligan, C. (2006). Rethinking technology-supported assessment in relation to the seven principles of good feedback practice. In C. Bryan and K. Clegg (Eds.), Innovative Assessment in Higher Education. Routledge.
      Nicol, D. (2006). Increasing success in first year courses: assessment redesign, self-regulation and learning technologies. Paper prepared for the ASCILITE conference, Sydney, Australia, 3-6 December.
      Nicol, D. J. & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.
      Nicol, D. & Boyle, J. (2003). Peer instruction versus class-wide discussion in large classes. Studies in Higher Education, 28(4), 457-473.
      Boyle, J. T. & Nicol, D. J. (2003). Using classroom communication systems to support interaction and discussion in large class settings. Association for Learning Technology Journal, 11(3), 43-57.