Glasgow feedback v1
Usage Rights: CC Attribution-ShareAlike License

  • Speaker notes: Intro – feedback as continuous correction and the feedback loop. Examples: audio – reflect on personal concept map; video – ALT/Epigeum winner from 2010; clickers – Exeter Business School; adaptive; access – CAA at Bradford; mobile – CampusM/txttools? Are these the way forward? To go next – take examples from the PASS and TESTA projects. Bring out the implications of RH work on student identity and mindset – reactions to feedback may be a symptom and not a cause. PASS.
  • 03/02/11

Glasgow feedback v1 Presentation Transcript

  • 1. Assessment & feedback: rediscovering that the feedback loop works! Peter Hartley, University of Bradford [email_address] http://www.brad.ac.uk/educational-development/aboutus/team/Full_details_27414_en.php
  • 2. This input
    • Exploring the meaning of ‘feedback’
    • Different examples of innovation
      • Audio feedback.
      • Video and audio combinations.
      • Clickers and response systems.
      • Adaptive systems.
      • Integrating systems and mobile applications.
    • Where to go next?
      • Feedback-embedded curriculum design.
      • Programme-based assessment (PASS project)
      • Defining/comparing assessment environments (TESTA project)
      • Exploring the assessment/identity link (following Higgins et al).
  • 3. Assessment is a problem: feedback is just part of it.
    • See the PASS Project Issues Paper
      • Please comment/feedback and use.
        • http://www.pebblepad.co.uk/bradford/viewasset.aspx?oid=260486&type=file
    • Would highlight:
      • Assessment ‘drives and channels’.
      • What/why are we measuring: the ‘slowly learnt’ problem.
      • Limitations of grading (e.g. marks are not numbers).
      • Implications for course structures/regulations.
  • 4. An example to start …
    • 59% Excellent.
        • This was the only tutor comment on a student assignment. How do you think the student reacted and felt?
  • 5. The meaning of feedback
      • Can we not ‘recapture’ the ‘original’ meaning of feedback: enabling self-correcting behaviour towards a known goal?
      • This means rediscovering the ‘feedback loop’ whereby information must be ‘fed back’ so that it:
        • relates to the goal.
        • is received.
        • is correctly interpreted.
        • enables corrective action.
      • cf. the work of Royce Sadler in Higher Education, e.g.
        • http://www.northumbria.ac.uk/sd/central/ar/academy/cetl_afl/earli2010/themes/rsadler/
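The loop described on this slide can be sketched in code. This is a minimal, illustrative Python sketch, not something from the talk itself: the function name, the toy scores, and the correction rule are all assumptions chosen purely to show an attempt being compared against a known goal, the gap being fed back, and corrective action closing the loop.

```python
# Illustrative sketch (not from the talk) of a self-correcting
# feedback loop: each cycle, the gap between the current work and a
# known goal is fed back and used to take corrective action.
# All names and numbers here are assumptions for illustration only.

def feedback_loop(goal, first_attempt, correct, max_cycles=20):
    """Iterate attempt -> feedback -> correction until the goal is met."""
    work = first_attempt
    for cycle in range(max_cycles):
        gap = goal - work           # feedback: relates to the goal
        if gap == 0:                # goal reached, the loop closes
            return work, cycle
        work = correct(work, gap)   # corrective action from interpreted feedback
    return work, max_cycles

# Toy 'learner': a score starting at 40 that closes half the remaining
# gap (at least 1 mark) each cycle, aiming for a goal of 100.
final_score, cycles_used = feedback_loop(
    goal=100,
    first_attempt=40,
    correct=lambda score, gap: score + max(1, gap // 2),
)
```

The point of the sketch mirrors the slide: feedback only works when it relates to the goal, is received, is correctly interpreted, and actually changes the next attempt; remove the `correct` step and the loop never converges.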
  • 6. Assessment: multi-purpose & multi-audience
  • 7. Example 1: audio
    • The ASEL project
      • led by Bradford with Kingston as partner.
      • various uses of audio, including feedback, in different disciplines.
    • Noted:
      • Technology is now easy and accessible.
      • Positive student reactions.
      • Different tutor styles and approaches.
      • A different form of communication?
      • Serendipity – e.g. feedback stimulated podcasts.
  • 8. ASEL main conclusion
      • … audio is a powerful tool, providing opportunities for personalising learning, promoting greater student engagement, and encouraging creativity. In introducing audio into their practice, lecturers were required to rethink their pedagogical approaches and learning design, adopting new and innovative ways to enable students to be more actively involved in the learning process. It allowed lecturers to provide more personal and richer feedback to students, and increased the level of interaction and dialogue amongst students and between students and lecturers. (Stewart and Dearnley)
  • 9.  
  • 10. Example 2: audio and video
    • Growing number of examples.
      • ALT/Epigeum Awards 2010: see the ALT Open Access Repository
    • See the winning entry by Read and Brown from Southampton:
      • Organic Chemistry.
      • Use of tablets to show solutions and working. See at: http://www.soton.ac.uk/chemistry/media/ALT/
      • Focus on self-assessment.
  • 11. Example 3: clickers are coming
    • Student Response Systems at the moment?
      • They work … they can change staff and student behaviour and performance.
    • But
      • can be cumbersome and fiddly.
      • require setup time.
      • need strong commitment and support (e.g. see experience at Exeter Business School).
  • 12. Example 3: clickers are coming
    • Student Response Systems in the future?
      • They will radically change staff and student behaviour.
      • They will be flexible and easy to use.
      • They will be on the student’s own device!
  • 13. Example 4: adaptive systems
    • PBL with consequences – you get immediate feedback on the consequences of your decisions.
      • e.g. The G4 project at St George’s
        • http://www.generation4.co.uk/
    • Adaptive assessment
      • e.g. the work of Trevor Barker
        • https://uhra.herts.ac.uk/dspace/bitstream/2299/1726/1/901864.pdf
  • 14. Example 5a: integrating systems: CAA
    • IT4SEA project at Bradford
      • New 100-seater facility.
      • Thin client technology.
      • QMP as University standard for summative assessment.
      • Procedures agreed with Exam Office.
      • Design of room (available as cluster outside assessment times)
      • Teaching potential.
  • 15. The main CAA room at Bradford
  • 16. And the growth …
  • 17. Example 5b: integrating applications
    • Use of mobile technology
      • e.g. CampusM at Bradford:
        • http://www.campusm.com/
        • http://www.techrepublic.com/software/university-of-bradford-about-uob-10-mobile/2194295?tag=content;selector-1
    • Integrating different technologies
      • e.g. Clive Barker et al:
        • http://caa.ecs.soton.ac.uk/Papers/Barker-CAA2010.pdf
  • 18. Where to go next?
    • Feedback-embedded curriculum design (see recent/ongoing JISC Programmes).
    • Programme-based assessment (PASS).
    • Defining/comparing assessment environments (TESTA project).
    • Exploring the assessment/identity link (following Higgins et al).
  • 19. Programme-based assessment: PASS
    • NTFS group project over 3 years
      • development and investigation leading to pilots and implementation
    • Consortium
      • Led by Bradford
      • 2 CETLs – ASKE and AfL
      • Plus Exeter, Plymouth and Leeds Met.
  • 20. TESTA project
    • NTFS group project with 4 partners:
      • ‘aims to improve the quality of student learning through addressing programme-level assessment.’
    • Starting from an audit of current practice on nine programmes:
      • surveyed students using focus groups and the AEQ (Assessment Experience Questionnaire – Graham Gibbs et al).
      • also using a tool to identify programme-level ‘assessment environments’ (Gibbs).
  • 21. Assessment environment and impact
    • Interim findings from TESTA
      • Variety of assessments can cause problems.
      • Issues over understanding assessment criteria, marker variation, and feedback.
      • Variation across programmes.
      • QA ‘myths and traditions’ can get in the way.
  • 22. The need for strategy
    • An example finding from Gibbs
      • ‘greater explicitness of goals and standards was not associated with students experiencing the goals and standards to be clearer’
    • And what did make a difference?
  • 23. The need for strategy
    • An example finding from Gibbs
      • ‘greater explicitness of goals and standards was not associated with students experiencing the goals and standards to be clearer’
    • And what did make a difference?
      • Formative-only assessment
      • More oral feedback
      • Students ‘came to understand standards through many cycles of practice and feedback’
  • 24. Typical student concerns (based on PASS)
    • Perceptions of ‘the course’ are variable.
    • Assessment is experienced as ‘fragmented’.
    • Anxieties about the move to more integrated assessment – perceived risk in terms of performance.
    • Concerns about feedback and timing.
  • 25. An example from PASS: Peninsula Medical School
    • Includes:
    • four assessment modules that run through the five-year undergraduate medical programme and are not linked directly to specific areas of teaching.
    • focus on high-quality learning (Mattick and Knight, 2007)
  • 26. Will PBA (programme-based assessment) be the ‘effective assessment strategy’?
  • 27. And finally … the assessment/identity interface
    • Students as ‘conscientious consumers’ (Higgins et al, 2002).
    • But:
    • personal identity as ‘mediator’.
      • e.g. apprentice (‘feedback is a useful tool’) cf. victim (‘feedback is another burden’).
    • So need to change the mindsets of some students.
  • 28. And finally finally … some other contacts
    • PASS
      • Project Manager: Ruth Whitfield r.whitfield@bradford.ac.uk
    • ASEL
      • Project Manager: Will Stewart w.stewart@bradford.ac.uk
    • CAA (building on IT4SEA)
      • Project Manager: John Dermo [email_address]