Looking backwards to move forwards: Seminal research that has influenced key researchers in the field of Computer Assisted Assessment
 

Findings from a survey of key researchers in the computer assisted assessment field to identify the seminal research in this field to date.


  • When it came to identifying trends, this graph compares the distribution of the four main categories of comments across the four levels of pass. There is, more or less, a pattern for each standard of pass with regard to the types of comments given by tutors. [talk through graph] (The comments had been coded by two people, with an inter-rater reliability of approximately 89%.) Category A shows ‘praise and agreement’, B shows ‘direction and evaluation’, C shows questions, and D shows ‘disagreement’. So, looking at the four levels of pass… [talk through graph] The main objective of this phase of the analysis was to identify a set of trends in the tutor interactions that matched the grade awarded.
  • The results can be seen here once again. This chart illustrates the distribution of each category of comment across all levels of pass. This doesn’t, however, enable us to identify trends; it gets interesting when we look at the breakdown once again… [next slide]
  • Are you comparing another study to SRAFTE, or SRAFTE to REAQ?
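The roughly 89% inter-rater agreement mentioned in the notes above can be illustrated with a short sketch. The codings below are hypothetical, and Cohen’s kappa is included only as a common chance-corrected alternative; the notes do not say which statistic was actually used.

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Share of items that both coders labelled identically."""
    assert len(coder_a) == len(coder_b)
    matches = sum(1 for x, y in zip(coder_a, coder_b) if x == y)
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Agreement corrected for the level expected by chance."""
    n = len(coder_a)
    p_o = percent_agreement(coder_a, coder_b)
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Chance agreement: product of each coder's marginal label proportions
    p_e = sum((freq_a[l] / n) * (freq_b[l] / n) for l in set(freq_a) | set(freq_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codings of nine tutor comments into categories A-D
a = ["A", "A", "B", "B", "C", "D", "A", "B", "C"]
b = ["A", "A", "B", "B", "C", "D", "A", "C", "C"]
print(percent_agreement(a, b))  # 8/9, i.e. roughly 0.89
```

With two coders and categorical labels, raw percent agreement is the simplest figure to report, but kappa guards against agreement that arises purely from an uneven label distribution.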

Presentation Transcript

  • Looking backwards to move forwards: Seminal research that has influenced key researchers in the field of Computer Assisted Assessment. Denise Whitelock, Institute of Educational Technology, The Open University. d.m.whitelock@open.ac.uk
  • Outline: overview; seminal literature; quick survey; discussion; analysis; response. DMW July 12 CAA conference
  • Objectives: to understand what is meant by seminal work in the field; to investigate key researchers’ understanding of seminal work; to analyse the findings; to identify the gaps that need to be filled in the current field.
  • Seminal literature for CAA: What are the classics? What should all your students read? What’s in the CAA archive?
  • Ask the experts: 12 subjects; epistolary interviews; facilitated dialogue; seminal literature redefined; what is seminal in our time?; context changes focus.
  • Subjects: 5 women and 7 men; age approx. 48 years; majority of professorial status; publications that have had impact.
  • The experts’ response: a difficult question. Why do you want to know? What influenced me in my research; what I think is a big influence for now. A paper is often before its time. Ignore the citation index; look more to a 4* paper?
  • Top articles (journal article and number of responses):
    • Bennett, R.E. (2002) ‘Inexorable and Inevitable: The Continuing Story of Technology and Assessment’. Journal of Technology, Learning and Assessment, Vol. 1, No. 1 (6 responses)
    • Collins, Hawkins and Frederiksen (1994) ‘Three Different Views of Students: The Role of Technology in Assessing Student Performance’. Journal of the Learning Sciences, 3(2), 205-217 (5 responses)
    • Sleeman and Brown (1982) Intelligent Tutoring Systems. Academic Press; Science, Vol. 228, Issue 4698, 456-462 (7 responses)
    • Nicol and Macfarlane-Dick (2006) ‘Formative assessment and self-regulated learning: A model and seven principles of good feedback practice’. Studies in Higher Education, 31(2), 199-218 (9 responses)
    • Ashton, H.S., Beevers, C.E. et al. (2006) ‘Automatic measurement of Mathematical Ability in Secondary Education’. BJET, 37(1), 93-119 (4 responses)
    • Landauer, Laham and Foltz (2003) ‘Automated Essay Assessment’. http://www.tandfonline.com/doi/abs/10.1080/0969594032000148154 (7 responses)
    • Whitelock, D., Watt, S., Raw, Y. and Moreale, E. (2003) ‘Analysing Tutor Feedback to Students: First steps towards constructing an Electronic Monitoring System’. Association for Learning Technology Journal (ALT-J), Vol. 11, No. 3 (3 responses)
  • Main categories: automatic essay marking; modelling; history recaps; automatic assessment in anger; feedback, both automatic and non-automatic.
  • Automatic marking of free-text entry: Open Comment; Mitchell; Science at the OU; Jordan; SafeSea, a new EPSRC project (Whitelock and Pulman).
  • Open Mentor: Feedback to tutors. What is Open Mentor? A learning support tool for tutors that provides reflective comments on the feedback they add to students’ assignments. How does it work? Flanders’ categories proved inappropriate, so Bales’ categories were used instead; Open Mentor provides tutors with guidance by analysing their comments and grouping them into four major categories.
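The grouping step described above, sorting tutor comments into four conflated Bales categories, could in principle be sketched as a simple keyword matcher. This is purely an illustrative assumption, not the actual Open Mentor classifier; the cue words below are invented for the example.

```python
# Illustrative cue words per conflated Bales category; these are
# assumptions for the sketch, not Open Mentor's real rules.
CATEGORY_CUES = {
    "A (positive reactions)": ["well done", "good", "excellent", "agree"],
    "B (responses)": ["you should", "try", "consider"],
    "C (questions)": ["?", "why", "how", "what"],
    "D (negative reactions)": ["disagree", "wrong", "however"],
}

def classify_comment(comment):
    """Assign a tutor comment to the first category whose cue word appears."""
    text = comment.lower()
    for category, cues in CATEGORY_CUES.items():
        if any(cue in text for cue in cues):
            return category
    return "uncategorised"

print(classify_comment("Well done, this section is excellent."))  # A (positive reactions)
print(classify_comment("Why did you choose this method?"))        # C (questions)
```

A real system would need far more robust linguistic analysis than substring matching, but the sketch shows the overall shape: comment in, one of four categories out, which is what the trend analysis on the following slides consumes.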
  • Identifying trends: H801. [Graph: Bales interactional categories A-D at each pass level (1-4) against number of incidences, 0-25.] Graph shows conflated Bales’ categories against mean number of incidences in H801 scripts. 
  • Identifying trends: H801. [Pie chart with segments 1.61, 5.96, 5.73 and 17.13. Key: A = positive reactions, B = responses, C = questions, D = negative reactions.] Pie chart shows the mean number of incidences per pass per conflated Bales interactional category for all four levels of pass in H801 scripts.
  • DMW, CALRG, May 2009
  • HEA-funded Synthesis Report on Assessment and Feedback: consult the academic community on useful references (seminar series, survey, advisors, invited contributors); prioritise evidence-based references; synthesise the main points. For readers: academics using technology enhancement for assessment and feedback, learning technologists, and managers of academic departments.
  • Evidence-based literature: 142 references; technology-enhanced methods; use for assessment and feedback; type of evidence; ease of access (18 could not be retrieved).
  • Categories of evidence used:
    • 1a: Peer-reviewed generalizable study providing effect size estimates and which includes (i) some form of control group or treatment (may involve participants acting as their own control, such as before and after), and/or (ii) a blind or preferably double-blind protocol.
    • 1b: Peer-reviewed generalizable study providing effect size estimates, or sufficient information to allow estimates of effect size.
    • 2: Peer-reviewed ‘generalizable’ study providing quantified evidence (counts, percentages, etc.) short of allowing estimates of effect sizes.
    • 3: Peer-reviewed study.
    • 4: Other reputable study providing guidance.
  • Number of references recommended in each evidence category (with cumulative %):
    • 1a: 15 references (12.1%)
    • 1b: 8 references (18.5%)
    • 2: 12 references (28.2%)
    • 3: 49 references (67.7%)
    • 4: 40 references (100.0%)
    • Total: 124
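As a quick arithmetic check, the cumulative percentages quoted for the evidence categories can be recomputed from the raw reference counts:

```python
# Raw counts per evidence category, as reported on the slide
counts = {"1a": 15, "1b": 8, "2": 12, "3": 49, "4": 40}
total = sum(counts.values())  # 124

running = 0
for category, n in counts.items():
    running += n
    # Cumulative share of all recommended references so far
    print(f"{category}: {n} refs, cumulative {100 * running / total:.1f}%")
```

Running this reproduces the slide’s figures (12.1%, 18.5%, 28.2%, 67.7%, 100.0%), confirming the table is internally consistent.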
  • How do the findings compare with the Gilbert, Whitelock and Gale HEA study? All journal articles; no practice guides; more technical papers; a history of deep questions showing the early struggles in the field; David Nicol’s work common to both; Whitelock’s work common to both.
  • Advice for Action: Elliott’s characteristics of Assessment 2.0 activities:
    • Authentic: involving real-world knowledge and skills
    • Personalised: tailored to the knowledge, skills and interests of each student
    • Negotiated: agreed between the learner and the teacher
    • Engaging: involving the personal interests of the students
    • Recognises existing skills: willing to accredit the student’s existing work
    • Deep: assessing deep knowledge, not memorization
    • Problem oriented: original tasks requiring genuine problem-solving skills
    • Collaboratively produced: produced in partnership with fellow students
    • Peer and self assessed: involving self-reflection and peer review
    • Tool supported: encouraging the use of ICT
  • What’s under the bonnet: algorithms or heuristics? Well, it’s all code. Pros and cons; formalising models; pedagogical theory operationalised, open to test.
  • e-Assessment futures: free-text entry; adaptive testing; automatic marking; Advice for Action; learning analytics and data mining; motivation, badges and Dweck.
  • Four assessment special issues:
    • Brna, P. & Whitelock, D. (Eds.) (2010) Special Issue of the International Journal of Continuing Engineering Education and Life-long Learning, focusing on electronic feedback: Feasible progress or just unfulfilled promises? Vol. 2, No. 2
    • Whitelock, D. (Ed.) (2009) Special Issue on e-Assessment: Developing new dialogues for the digital age. British Journal of Educational Technology, Vol. 40, No. 2
    • Whitelock, D. and Watt, S. (Eds.) (2008) Reframing e-assessment: Adopting new media and adapting old frameworks. Learning, Media and Technology, Vol. 33, No. 3
    • Whitelock, D. and Warburton, W. (2010) Special Issue of the International Journal of e-Assessment (IJEA): ‘Computer Assisted Assessment: Supporting Student Learning’