RAeAT SIG Presentation 270109 V0.3

Myles Danson of JISC presents a report on the Review of Advanced eAssessment Techniques.


  1. Review of Advanced e-Assessment Techniques
     Project by Martin Ripley Ltd (the team: Martin Ripley; Jeremy Tafler; Hakan Redif; Robert Harding; Mike Peppiatt; Jim Ridgway)
     Email contact: Martin Ripley on martin.ripley1@btinternet.com
  2. Review of Advanced e-Assessment Techniques
     Project purpose:
     • Build a significant body of information about who is using different techniques, and the associated issues and benefits.
     The project will:
     • Identify a range of projects/techniques useful to HE and FE;
     • Select and study 5-10 case studies to obtain information on:
       ◦ Test and assessment systems;
       ◦ Task/test design;
       ◦ Administration;
       ◦ Marking, scoring and making judgements.
  3. Review of Advanced e-Assessment Techniques
     Current state of play:
     • Over 90 projects identified.
     • Working with the University of Durham to create the online 'Advanced e-Assessment Catalogue'.
     • URL: www.dur.ac.uk/smart.centre1/aeac
     • Pages likely to be active by the end of February 2009.
     • Final project report to be delivered by the end of March 2009.
     An example entry in the catalogue:
       Application:   LISC
       Site:          Kent University
       Contact name:  Alison Fowler
       Publications:  CAA 2008
       Email:         [email_address]
       Description:   CALL system allowing input of whole phrases: free-text scoring for MFL, marked by evaluating language strings; a significantly different approach to NLP.
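The catalogue entry fields shown on this slide could be modelled as a simple record. The following is a minimal sketch only: the field names mirror the slide's labels, but the actual schema used by the Durham catalogue is not described in the presentation, so the class and its fields are assumptions for illustration.

```python
# Hypothetical sketch of one catalogue entry; field names are taken from the
# slide's example, not from the real Advanced e-Assessment Catalogue schema.
from dataclasses import dataclass, asdict


@dataclass
class CatalogueEntry:
    application: str
    site: str
    contact_name: str
    publications: str
    email: str          # withheld in the source slide, left blank here
    description: str


entry = CatalogueEntry(
    application="LISC",
    site="Kent University",
    contact_name="Alison Fowler",
    publications="CAA 2008",
    email="",
    description=(
        "CALL system allowing input of whole phrases: free-text scoring "
        "for MFL, marked by evaluating language strings"
    ),
)

# asdict() gives a plain dict, convenient for serialising entries to a
# web page or database row.
print(asdict(entry)["application"])  # LISC
```

A structure like this would make the 90+ identified projects easy to filter and update, which speaks to the "something more dynamic" question raised on the next slide.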
  4. Review of Advanced e-Assessment Techniques
     Question:
     • What to do with the list of 90+ projects?
     Issues:
     • Potential for something more dynamic
     • Usefulness for the JISC community
     • Tool/resource for further use
     • Interactive/management/updating
     • [email_address]
  5. Review of Advanced e-Assessment Techniques
     The case studies chosen from the list:
       Broad area                        Candidate
       Adaptive testing                  AsTTle, University of Auckland
       Combining human/computer marking  Assessment 21, University of Manchester
       Higher order skills               Premium, Federation of State Medical Boards of the United States
       Language testing                  LISC, Kent University
                                         Versant (with Ordinate technology), PsychCorp, USA
       Short text*                       Automark, Dundee University
                                         IAT, Open University
       Storage/databanks*                A range of practices (no specific focus on one candidate)
  6. Review of Advanced e-Assessment Techniques
     Emerging findings (not conclusions):
     • The isolated nature of innovations makes it difficult to 'grow' a concept for wider use.
     • Because some developments are isolated, technology to support cross-institutional applications is not being exploited.
     • Applications crucially need to demonstrate business efficiencies (e.g. saving tutors' time) to win wider adoption.
     • There is a lack of consistency in the adoption of key standards.
     • Nervousness about the take-up of new technologies contrasts with the willingness of institutions' governing bodies to support e-Assessment.
