REAP Assessment/Feedback Principles and Examples



  • 1. Martin Hawksey, e-Learning Advisor (Higher Education)
    Technology supported assessment
    Increasing learner success with technology supported assessment
    April 28, 2011| slide 1
  • 2. Plan
    Defining assessment and feedback
    An outline of the REAP Project and its philosophy of a ‘principled’ approach to course redesign
    Some examples of REAP pilots
    Questions and discussion
  • 3. Defining assessment and feedback
    Assessment – in the narrow sense, an exam for external accreditation
    Feedback – in the narrow sense, what the tutor writes about/on a finished piece of student work
    REAP was also interested in:
    Self-assessment and reflection - a form of internally generated feedback
    Peer dialogue as feedback and peer assessment
  • 4. Definitions
    Who is involved in formative assessment and feedback?
    External (e.g. placement supervisor)
    Computer generated
  • 5. Why take formative assessment and feedback seriously?
    Assessment is a key driver of student learning
    Assessment is a major cost in HE (staff time)
    Widely reported that students don’t read the feedback
    Dropout/retention – linked to academic experience
    First year experience – students need regular and structured feedback opportunities.
    National student survey (NSS) – students are dissatisfied with feedback.
    QAA reports – main area of criticism in England
  • 6. NSS 2007 - Assessment and Feedback Results
    Nationally, only 55% of students think feedback is prompt and has helped to clarify things they did not understand [Scotland: 48%]
    Nationally, only 63% of students agree that they have received detailed comments on their work [Scotland: 49%]
  • 8. Key messages
    Formative assessment and feedback by others can only have an impact on learning when it influences a student’s own self-regulatory processes (adapted from Boud, 1995)
    Students are already self-assessing and generating feedback so we should build on this capacity (Nicol and Macfarlane-Dick, 2004)
  • 9. REAP
    3 HEIs (Strathclyde, Glasgow Caledonian Business School, Glasgow University)
    Targeting large 1st year classes
    Multi-disciplinary as well as faculty-wide (19 pilots, ~6000 students)
    Range of technologies: online tests, simulations, discussion boards, e-voting, e-portfolios, peer/feedback software, admin systems, VLEs, offline-online
    A ‘principled’ approach to designing and embedding assessment practices
  • 10. Scaffolding self regulation: 7 principles of good feedback (assessment design)
    Clarify what good performance is (goals, criteria, standards).
    Facilitate reflection and self-assessment in learning
    Deliver high quality feedback to students: feedback that enables students to monitor and self-correct
    Encourage peer and tutor dialogue around learning
    Encourage positive motivational beliefs & self esteem through assessment
    Provide opportunities to close the feedback loop
    Use feedback information to shape teaching
    Source: Nicol and Macfarlane-Dick (2006)
  • 11. Two super principles
    Super-principle 1: developing learner self-regulation (empowerment), i.e. steers to encourage ownership of learning – the seven principles discussed above.
    Super-principle 2: time on task and effort (engagement), i.e. steers on how much work to do and when – Gibbs and Simpson's four conditions.
    Case examples from REAP – applying these principles/conditions
  • 12. REAP Pilots (1)
    Department of Mechanical Engineering
    Students: 250
    Technology: Commercial online homework packages, electronic voting system (EVS)
    Assessment Activities: Weekly tests provide on-demand feedback on student performance for both students and tutors. Just-in-time teaching supported by interactive classes using EVS.
    Efficiencies: 60% reduction in assessment workload saving 102 staff hours. License cost of commercial packages £12.95 and £7 per student per annum.
    Learning Gains: Results from the 2006/7 diet indicate strong class attainment was maintained (90% pass rate, 65% average). "I think it’s managed to save a lot of time for ourselves and the tutors and given them more time to develop what they are going to talk about and give more time for them to speak to people individually if they need it." Student comment
  • 13. REAP Pilots (2)
    Department of Psychology
    Students: 560
    Technology: Online collaborative group tasks supported by VLE message-board
    Assessment Activities: Regular collaborative tasks support peer feedback processes and student engagement.
    Efficiencies: 50% of lectures replaced with online tasks. Staff time re-directed to support online tasks.
    Learning Gains: Significant overall improvement in average exam pass mark (51.1% in 2005/06 diet rising to 57.4% in 2006/07). Exam failure rate reduced from 13% to 5%. Course failure rate reduced from 12.1% to 2.8%.
  • 14. September 2007
    JISC CETIS Assessment SIG
    REAP Pilots (3)
    Department of Hospitality & Tourism Management
    Students: 200
    Technology: Podcasts, electronic voting system
    Assessment Activities: Regular podcast releases support changes to lectures to include interactive discussion using EVS.
    Efficiencies: 50% reduction in lectures (however, lectures delivered twice to smaller groups). Licensing and development costs associated with podcasts (c. £1k per 30-minute podcast)
    Learning Gains: Significant gain in overall exam mark in 2006/07 diet (+12.2%) compared with 2005/06. Significant reduction (-25%) in students receiving non-qualification.
  • 15. REAP Pilots (4)
    Department of Modern Languages
    Students: 200
    Technology: Online homework and tests supported by VLE, electronic voting system (EVS)
    Assessment Activities: Diagnostic tests of student knowledge at the start of the year to inform teaching. Formative feedback from regular online testing and EVS classroom use.
    Efficiencies: Effective delivery of course made possible despite significant cuts in funding and staffing. Tutorials reduced by 50% and replaced with online tasks. Listening classes reduced from 360 to 160 hours. Saving of 200 staff hours.
    Learning Gains: Failure rate in final exam reduced from 24% to 4.6% compared with 2005/06 diet. "Having almost immediate feedback on marks was very useful as I was aware at every point as to how well I was coping" Student comment
  • 16. REAP Pilots (5)
    School of Pharmacy - Pharmacy Practice 3
    Students: 240
    Technology: Online simulation tutorial
    Assessment Activities: Regular online tasks to improve student engagement, multiple opportunities for self-testing and regular feedback.
    Efficiencies: Savings in staff time due to reduced need for remedial revision work with individual students.
    Learning Gains: Significant gain in overall exam mark in 2006/07 (+16%) compared to the 2005/06 diet. “The tutorial was an excellent resource and learning tool to supplement our class” “Simulated exactly what it would be like to carry out a check on a prescription, allowing us to experience the difficulties involved and discover where we needed improvement” Student comments
  • 17. REAP Pilot (6)
    School of Pharmacy - Foundation Pharmacy
    Students: 120
    Technology: e-portfolio, electronic feedback form
    Assessment Activities: Improved tutor feedback to students supported by feedback form, enhanced opportunities for reflection activities.
    Efficiencies: Staff workload increased during test implementation but reductions anticipated in future years.
    Learning Gains: No significant improvements in academic performance reported. 89% of students receiving feedback via the feedback form agreed the feedback was ‘useful’ or ‘very useful’. “The feedback form was especially useful, I found it easier to work from as it was segmented into the different aspects of the report I had written and had comments on both the strong and weak elements of my report” Student comment
  • 18. Slight detour
    A more detailed example: Audio/Video Feedback
  • 19. Question for you
    Question: What are the opportunities for change at UHI?
  • 20. Change
    Challenge to Change
    JISC ‘Typology of technology use in assessment and feedback’ document
  • 21. Do you have the policies in place to back change?
    Strathclyde Policies and Feedback is a Dialogue site
    University of Dundee Online Assessment Policy and Procedures
  • 22. Relevant papers
    Nicol, D. (in press), Laying the foundation for lifelong learning: case studies of technology supported assessment processes in large first year classes, British Journal of Educational Technology (to be published July 2007).
    Nicol, D. (2007), E-assessment by design: using multiple-choice tests to good effect, Journal of Further and Higher Education.
    Nicol, D. & Milligan, C. (2006), Rethinking technology-supported assessment in relation to the seven principles of good feedback practice. In C. Bryan and K. Clegg, Innovations in Assessment, Routledge.
    Nicol, D. (2006), Increasing success in first year courses: assessment redesign, self-regulation and learning technologies, Paper prepared for the ASCILITE conference, Sydney, Australia, Dec 3-6.
    Nicol, D. J. & Macfarlane-Dick, D. (2006), Formative assessment and self-regulated learning: a model and seven principles of good feedback practice, Studies in Higher Education, 31(2), 199-218.
    Nicol, D. & Boyle, J. (2003), Peer instruction versus class-wide discussion in large classes, Studies in Higher Education, 28(4), 457-473.
    Boyle, J. T. and Nicol, D. J. (2003), Using classroom communication systems to support interaction and discussion in large class settings, Association for Learning Technology Journal, 11(3), 43-57.