This desktop research, commissioned by the Higher Education Academy, set out to consult the academic community about which references on technology-enhanced assessment and feedback were most useful to practitioners. While all the recommended publications may be characterised as reputable and the majority were peer-reviewed (67.7%), only a minority provided quantitative data (28.2%), and relatively few of those employed appropriate experimental designs or statistical analysis (18.5%). The majority of publications were practitioner-led case studies. The references recommended to us are clearly having an impact on current practice and are valued by practitioners. The key messages from these sources are consistent and often give detailed, practical guidance for other academics. We found that most of the recommended literature focused on the goals that technology enhancement can enable assessment and feedback to meet, and on how assessment and feedback can be designed to make best use of the technology.
Technology-Enhanced Assessment and Feedback: How is evidence-based literature informing practice?
1. Technology-Enhanced Assessment and Feedback: How is evidence-based literature informing practice? Denise Whitelock (1), Lester Gilbert (2), and Veronica Gale (2). (1) Institute of Educational Technology, The Open University, UK; (2) Learning Societies Lab in the School of Electronics and Computer Science, University of Southampton. [email_address] DMW - CAA 2011 - July 2011
3. I hate marking but want the tasks and feedback to assist student learning
6. Categories of evidence used

Category 1a: Peer-reviewed generalizable study providing effect size estimates and which includes (i) some form of control group or treatment (participants may act as their own control, such as before-and-after), and/or (ii) a blind or preferably double-blind protocol.
Category 1b: Peer-reviewed generalizable study providing effect size estimates, or sufficient information to allow estimates of effect size.
Category 2: Peer-reviewed 'generalizable' study providing quantified evidence (counts, percentages, etc.) short of allowing estimates of effect sizes.
Category 3: Peer-reviewed study.
Category 4: Other reputable study providing guidance.
7. Number of references recommended in each evidence category

Evidence category | Number of references recommended | Cumulative %
1a    | 15  | 12.1%
1b    | 8   | 18.5%
2     | 12  | 28.2%
3     | 49  | 67.7%
4     | 40  | 100.0%
Total | 124 |
10. CAP peer assessment system, BSc Network Management & Security (Intl.), Glamorgan, Phil Davies
12. Authentic assessments: e-portfolios. Electronic NVQ portfolio cover contents page, OCR IT Practitioner, EAIHFE, Robert Wilsdon
14. Building e-portfolios on a chef's course. Food preparation for e-portfolio, and evidence of food preparation skill for e-portfolio, Modern Apprenticeship in Hospitality and Catering, West Suffolk College, Mike Mulvihill
16. MCQs: Variation on a theme (1). The question is an example of a COLA assessment used at Reid Kerr College, Paisley: a multiple-response question used in one of their modules. The question was developed using Questionmark Perception at the University of Dundee; it is part of a set of formative assessments for medical students.
17. MCQs: Variation on a theme (2). Example of LAPT certainty-based marking: UK cabinet ministers demo exercise showing feedback, University College London, Tony Gardner-Medwin. Drug chart errors and omissions, Medicines Administration Assessment, Chesterfield Royal Hospital.
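The certainty-based marking (CBM) demo mentioned above rewards students for knowing how sure they are. As a minimal sketch, assuming the mark scheme published for Gardner-Medwin's LAPT system (certainty levels 1, 2, 3 score 1, 2, 3 when correct and 0, -2, -6 when wrong; all names in the code are illustrative, not from the slides):

```python
# Sketch of certainty-based marking (CBM) scoring. The mark scheme below
# is an assumption based on the published LAPT scheme: at certainty
# C = 1, 2, 3 a correct answer scores 1, 2, 3 and a wrong answer
# scores 0, -2, -6.

CBM_MARKS = {
    1: (1, 0),    # low certainty:  correct -> 1, wrong ->  0
    2: (2, -2),   # mid certainty:  correct -> 2, wrong -> -2
    3: (3, -6),   # high certainty: correct -> 3, wrong -> -6
}

def cbm_score(correct: bool, certainty: int) -> int:
    """Return the CBM mark for one answer at the given certainty (1-3)."""
    if certainty not in CBM_MARKS:
        raise ValueError("certainty must be 1, 2 or 3")
    right, wrong = CBM_MARKS[certainty]
    return right if correct else wrong

# A confident wrong answer is penalised more than a hesitant one.
print(cbm_score(True, 3))   # 3
print(cbm_score(False, 3))  # -6
print(cbm_score(False, 1))  # 0
```

The asymmetry of the penalties is the point of the design: reporting high certainty only pays off if the student is usually right, so honest self-assessment is the best strategy.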
28. Elliott's characteristics of Assessment 2.0 activities (Advice for Action)

Authentic: involving real-world knowledge and skills
Personalised: tailored to the knowledge, skills and interests of each student
Negotiated: agreed between the learner and the teacher
Engaging: involving the personal interests of the students
Recognise existing skills: willing to accredit the student's existing work
Deep: assessing deep knowledge, not memorisation
Problem-oriented: original tasks requiring genuine problem-solving skills
Collaboratively produced: produced in partnership with fellow students
Peer and self assessed: involving self-reflection and peer review
Tool supported: encouraging the use of ICT