Assessment Futures: The Role for e-Assessment?


Presentation delivered by Peter Hartley, University of Bradford, at the 2011 e-Assessment Scotland conference.



  1. Assessment Futures: role for e-Assessment? Peter Hartley, University of Bradford. [email_address] http:// /educational-development/aboutus/team/Full_details_27414_en.php
  2. This session
     - Why bother? Why worry?
     - Assessment trends and futures
     - What can e-assessment contribute?
  3. And the argument
     - Assessment is a problem which will ‘get worse’.
     - Assessment futures: trends in strategy and environment.
     - Significant potential for ‘e’: some examples.
     - So the role for e-assessment must be part of a broader strategic initiative.
  4. Assessment is a problem, #1
     - Ask a student of X: what makes a ‘First Class X’?
  5. Assessment is a problem, #1
     - Ask a student of X: what makes a ‘First Class X’?
       - How would they respond?
       - Can they give you a coherent summary of the main programme outcomes (or just the algorithm that determines the marks)?
  6. Assessment is a problem, #1
     - See useful summaries of major issues:
       - the PASS Project Issues Paper. Please comment, feed back, and use.
       - Recent article: Price, M., Carroll, J., O'Donovan, B. and Rust, C. (2011) ‘If I was going there I wouldn't start from here: a critical commentary on current assessment practices’, Assessment & Evaluation in Higher Education, 36(4), 479-492.
  7. Assessment is a problem, #2
     - From PASS, I would highlight:
       - Assessment ‘drives and channels’.
       - What/why are we measuring: the ‘slowly learnt’ problem.
       - Limitations of grading (e.g. marks are not numbers).
       - Implications for course structures/regulations.
  8. Problem #3: multi-purpose and multi-audience
  9. Problem #4: the meaning of feedback
     - Can we not ‘recapture’ the ‘original’ meaning of feedback: enabling self-correcting behaviour towards a known goal?
     - This means rediscovering the ‘feedback loop’, whereby information must be ‘fed back’ so that it:
       - relates to the goal;
       - is received;
       - is correctly interpreted;
       - enables corrective action.
     - cf. the work of Royce Sadler in Higher Education.
  10. Problem #5
      - More demanding ‘consumers’: e.g. the NUS Charter.
  11. Assessment futures: strategy and environment
  12. Environment: the TESTA project
      - NTFS group project with four partners:
        - ‘aims to improve the quality of student learning through addressing programme-level assessment’.
      - Starting from an audit of current practice on nine programmes:
        - surveyed students using focus groups and the AEQ (Assessment Experience Questionnaire; Graham Gibbs et al.);
        - also using a tool to identify programme-level ‘assessment environments’ (Gibbs).
  13. Consistent practice?
      - Gibbs, G. and Dunbar-Goddet, H. (2009) ‘Characterising programme-level assessment environments that support learning’, Assessment & Evaluation in Higher Education, 34(4), 481-489.
  14. Data from TESTA
  15. Assessment environment and impact
      - Interim findings from TESTA:
        - variety of assessments can cause problems;
        - issues over understanding assessment criteria, marker variation, and feedback;
        - variation across programmes;
        - QA ‘myths and traditions’ can get in the way.
  16. The need for strategy
      - An example finding from Gibbs:
        - ‘greater explicitness of goals and standards was not associated with students experiencing the goals and standards to be clearer’.
      - And what did make a difference?
  17. The need for strategy
      - An example finding from Gibbs:
        - ‘greater explicitness of goals and standards was not associated with students experiencing the goals and standards to be clearer’.
      - And what did make a difference?
        - Formative-only assessment.
        - More oral feedback.
        - Students ‘came to understand standards through many cycles of practice and feedback’.
  18. Programme-based assessment: PASS
      - NTFS group project over three years:
        - development and investigation leading to pilots and implementation.
      - Consortium:
        - led by Bradford;
        - two CETLs (ASKE and AfL);
        - plus Exeter, Plymouth and Leeds Met.
  19. What are we investigating?
      - How to design an effective, efficient, inclusive and sustainable assessment strategy that delivers the key course/programme outcomes.
  20. Also look out for:
  21. And also …
      - New JISC Programme on Assessment and Feedback:
        - see the Strand A projects.
  22. Typical student concerns (based on PASS)
      - Perceptions of ‘the course’ are variable.
      - Assessment is experienced as ‘fragmented’.
      - Anxieties about the move to more integrated assessment: perceived risk in terms of performance.
      - Concerns about feedback and timing.
  23. Searching for types
  24. Searching for types
  25. An example from PASS: Peninsula Medical School
      - Includes:
        - four assessment modules that run through the five-year undergraduate medical programme and are not linked directly to specific areas of teaching;
        - a focus on high-quality learning (Mattick and Knight, 2007).
  26. Further case studies being explored
      - Brunel:
        - new regulations which separate study and assessment blocks;
        - visit last week.
      - Liverpool Hope:
        - new regulations which ‘abandon modules’ in all undergraduate programmes;
        - ‘Key Honours Assessment’;
        - visit now arranged.
  27. Brunel: the regulations
      - 120 credits per year of study.
      - A course/programme can include a mix of study, assessment and modular blocks.
      - Option blocks must be modular.
      - Blocks must be in multiples of 5 credits.
      - The maximum assessment block is 40 credits.
  28. Examples from Brunel
      - Biomedical Sciences:
        - study and assessment blocks in all years;
        - cut assessment load by two-thirds; generated more time for class contact;
        - synoptic exam in all three years.
      - Mathematics:
        - conventional modules in final year only;
        - improved understanding and ‘carry-over’ of ‘the basics’ into year 2.
  29. Do you PASS?
  30. What can e-assessment achieve?
  31. 1: Processes and systems
  32. Examples from the Curriculum Design Programme
      - eBioLabs (Bristol):
        - By combining interactive media with formative self-evaluation assessments, students learn the methods and techniques they will use in the lab without risking valuable time, equipment or materials. Because students first experiment online, there is a reduced chance of cognitive overload during the practical and they are more able to concentrate on the wider aims of the experiment, rather than blindly following the lab instructions.
        - Because eBioLabs includes tools that automatically mark student assignments, and tools that allow academics to easily track student attendance and achievement, the marking and administrative burden associated with running practicals is very significantly reduced.
  33. Curriculum Design example
      - CASCADE project (Oxford): online assessment submission.
        - ‘Students can now submit assignments much more easily at any time from anywhere in the world. It is also possible to predict significant efficiencies in assignment handling time for the Registry staff who deal with student submissions for approximately 260 course assignments across 48 course cohorts a year: a saving of 30 minutes per assignment or more soon accumulates into savings in the order of half a day per week. Other advantages of the new online system are the reduction in paper handling and photocopying, as well as better auditing and control. Reduction in paper storage is a further advantage, both in terms of less physical space being required and in terms of less staff time being required to retrieve data from the archive.’
  34. Curriculum Design example
      - ESCAPE project (Hertfordshire):
        - effectiveness vs efficiency (watch the video);
        - next two slides from Mark Russell, Deputy Director of the Blended Learning Unit, University of Hertfordshire.
  35. ESCAPE themes …
      - Good assessment for learning:
        - engages students with assessment criteria;
        - supports personalised learning;
        - focuses on student development;
        - ensures feedback leads to improvement;
        - stimulates dialogue;
        - considers staff and student effort.
  36. Numerous legacy resources
  37. 2: Feedback and self-evaluation
  38. Example 2.1: audio
      - The ASEL project:
        - led by Bradford with Kingston as partner;
        - various uses of audio, including feedback, in different disciplines;
        - see the introduction by Will Stewart, part of the recent Leeds University Building Capacity project (Jorum resource; direct weblink).
  39. ASEL noted:
      - Technology now easy and accessible.
      - Positive student reactions.
      - Different tutor styles and approaches.
      - Serendipity, e.g. feedback-stimulated podcasts.
      - And: a different form of communication?
  40. ASEL main conclusions
      - ‘… audio is a powerful tool, providing opportunities for personalising learning, promoting greater student engagement, and encouraging creativity. In introducing audio into their practice, lecturers were required to rethink their pedagogical approaches and learning design, adopting new and innovative ways to enable students to be more actively involved in the learning process.’
  41. ASEL main conclusions
      - ‘… (audio) allowed lecturers to provide more personal and richer feedback to students, and increased the level of interaction and dialogue amongst students and between students and lecturers.’ (Will Stewart and Chris Dearnley)
  42. Example 2.2: audio and video
      - Growing number of examples:
        - ALT/Epigeum Awards 2010: see the ALT Open Access Repository.
      - See the winning entry by Read and Brown from Southampton:
        - Organic Chemistry;
        - use of tablets to show solutions and working;
        - focus on self-assessment.
  43. Example 2.3: clickers are coming
      - Student Response Systems at the moment?
        - They work: they can change staff and student behaviour and performance.
      - But:
        - can be cumbersome and fiddly;
        - setup time;
        - need strong commitment and support (e.g. see experience at Exeter Business School).
  44. Example 2.3: clickers are coming
      - Student Response Systems in the future?
        - They will radically change staff and student behaviour.
        - They will be flexible and easy to use.
        - They will be on the student's own device!
          - e.g. experimentation at the University of Bradford arising from our JISC Building Capacity project (work with TxtTools by John Fairhall et al.).
  45. Example 2.4: adaptive systems
      - PBL with consequences: you get immediate feedback on the consequences of your decisions.
        - e.g. the G4 project at St George's (iEthics case online).
      - Adaptive assessment:
        - e.g. the work of Trevor Barker.
  46. 3: Integrating systems
  47. Example 3.1: integrating systems: CAA
      - ITS4SEA project at Bradford:
        - 100-seater facility, now plus a 30-seat break-out room;
        - thin-client technology;
        - QMP as the University standard for summative assessment;
        - procedures agreed with the Exam Office;
        - design of the room (available as a cluster outside assessment times);
        - teaching potential.
  48. The main CAA room at Bradford
  49. And the growth …
  50. And recent changes …
      - Growth in ‘hybrid exams’:
        - mix of automatic marking (QMP) and open-ended response items (e.g. short-answer questions);
        - short answers exported to a spreadsheet and marked by a human;
        - likely use of word processing in future;
        - impact on teaching: new flexibility.
      - Example of impact:
        - ‘reduced my marking load for this module from 5 days to one day, whilst still enabling assessment of higher-order cognitive skills.’
  51. Example 3.2: integrating applications
      - Use of mobile technology:
        - e.g. CampusM at Bradford.
  52. And finally … back to the role
      - E-assessment can play an important role in:
        - assessment authenticity and diversity;
        - feedback quantity and quality;
        - profiling and ‘mindset’.
      - BUT ONLY IF it is embedded in a meaningful course/programme strategy and environment.
  53. And what is an ‘effective assessment strategy’?
      - Will it explain to staff, students and external agencies:
        - how the course/programme assesses the main learning outcomes?
        - how assessment and teaching are linked?
        - how assessment both supports ‘high-quality learning’ and develops it over the course?
  54. And finally … the assessment/identity interface
      - Students as ‘conscientious consumers’ (Higgins et al., 2002).
      - But: personal identity acts as a ‘mediator’.
        - e.g. the apprentice (‘feedback is a useful tool’) cf. the victim (‘feedback is another burden’).
      - So we need to change the mindsets of some students.
  55. And finally finally … some other contacts
      - PASS: Project Manager Ruth Whitfield, r.whitfield@
      - ASEL: Project Manager Will Stewart, w.stewart@
      - CAA (building on ITS4SEA): Project Manager John Dermo, [email_address]
  56. And maybe students will never again see …
      - ‘59%. Excellent.’
        - This was the only tutor comment on a student assignment. How do you think the student reacted and felt?
  57. And some other interesting stuff
      - Challenging students to search Wikipedia creatively: C-Link
        - http://www.conceptlinkage/clink/
      - Helping students review and evaluate their interview performance.
      - Helping research students to prepare for their viva: Interviewer Viva. Contact me for further info/demo.
  58. Thank you for listening
      - Peter Hartley
      - [email_address]