
Review of eAssessment Quality


Presentation on the work of the REAQ project by Lester Gilbert.


  1. REAQ: Lester Gilbert, Gary Wills, Bill Warburton, Veronica Gale. Report on E-Assessment Quality, a JISC project in eLearning, October 2009
  2. THE PITCH
  3. The system
  4. Assessment
  5. Quality management
  6. Report questions
  7. REAQ process
  8. The team
     - Management group
       - Lester Gilbert, ECS, University of Southampton, PI
       - Dr Gary Wills, ECS, University of Southampton
     - Expert consultants group
       - Cliff Beevers, Heriot-Watt
       - Paul Booth, Question Tools
       - John Kleeman & Greg Pope, Questionmark
       - Harvey Mellor, IoE
       - Chris Ricketts, Plymouth
       - Denise Whitelock, OU
     - The workers
       - Veronica Gale, Consultant researcher
       - Bill Warburton, iSolutions, University of Southampton
  9. Interviewees
     - HEIs
       - Heriot-Watt
       - University of Southampton
       - Newcastle University
       - University of Plymouth
       - The Open University
       - Edinburgh University
       - Institute of Education
     - Cambridge Assessment
     - SQA
     - Question Tools (Network Rail)
     - Vrije Universiteit Amsterdam
  10. Questions asked (1)
     - What denotes ‘high quality’ in summative e-assessment?
     - What steps do you follow to create and use summative e-assessment?
     - How do you ensure e-assessment:
       - reliability,
       - validity,
       - security, and
       - accessibility?
     - How does the process of creating good-quality e-assessment differ from the process of creating traditional assessment?
  11. Questions asked (2)
     - Please give us examples of good e-assessment; why are these ‘good’?
     - When you have heard of poor e-assessment, what has made it ‘poor’?
     - What feedback have you received from students who have taken e-assessments?
     - What advice would you give to others using summative e-assessment?
     - What research or other work has informed your thinking about summative e-assessment?
     - What further research would you like to see conducted?
  13. We expected to hear about… …delivery issues…
  14. And hear about… …psychometric measures… …intended learning outcomes…
  15. As well as hearing about… …appropriate standards… …and, if we were lucky, capability maturity…
  16. WHAT WE HEARD
  17. We did hear an (awful) lot about… Delivery issues: infrastructure, support, and how things go wrong…
  18. But not much about… Point biserials, Cronbach alphas, Kuder-Richardsons; content validity, conformance to ILOs
  19. And hardly anything about… Metrics, capability maturity, practice standards
  20. And when we did hear… Difficulty coefficients / facility values were often inappropriately used…
  21. SO…
  22. Conclusions
     - Essentially, little support for e-assessment in the areas of:
       - Tools & toolkits
       - Guidance & advice focused upon quality
       - Exemplars of good practice
     - Little evidence for:
       - Maturity of good practice
       - Expectations (‘demand characteristics’) of quality
  23. Recommendations
     - Tools & toolkits
     - Exemplars of good practice
     - Project to develop item bank quality statistics
     - Suppliers to make quality reports more accessible
     - Workshops, guidance, & advice focused upon quality:
       - Quality management
       - Standards
       - Metrics & psychometrics
       - Capability maturity
     - JISC bids & project outputs to include, as relevant:
       - Psychometric measures
       - Standards
       - Capability maturity modelling
  24. THANKS! Comments, questions, …
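Slide 18 names several classical test-theory statistics (point biserials, Cronbach's alpha, Kuder-Richardson) that interviewees rarely mentioned. As an illustration only, not part of the REAQ deck, here is a minimal sketch of two of them computed from a dichotomously scored item-response matrix; the function names and toy data are invented for this example.

```python
import numpy as np

def point_biserial(item_scores, total_scores):
    """Discrimination: correlation between one 0/1 item and the test total."""
    return np.corrcoef(item_scores, total_scores)[0, 1]

def cronbach_alpha(responses):
    """Internal-consistency reliability; responses is candidates x items."""
    k = responses.shape[1]
    sum_item_vars = responses.var(axis=0, ddof=1).sum()
    total_var = responses.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - sum_item_vars / total_var)

# Toy data: 6 candidates, 4 dichotomously scored items (invented).
r = np.array([[1, 1, 1, 0],
              [1, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 0]])
totals = r.sum(axis=1)
print(point_biserial(r[:, 0], totals))  # discrimination of item 1
print(cronbach_alpha(r))                # reliability of the 4-item test
```

Note that including the item's own score in the total slightly inflates the point biserial; corrected variants exclude the item from the total before correlating.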
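Slide 20's "difficulty coefficients / facility values" are conventionally the proportion of candidates who answer an item correctly, so a higher value in fact indicates an easier item. A minimal illustrative sketch, with invented data:

```python
def facility_value(item_scores):
    """Proportion of candidates scoring an item correct (0/1 scores)."""
    return sum(item_scores) / len(item_scores)

scores = [1, 1, 1, 0, 1, 0]    # one item, 6 candidates (invented)
print(facility_value(scores))  # about 0.67: a fairly easy item
```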