Analysis of feedback webinar

Slides from Jisc Assessment and Feedback webinar on analysis of feedback, 24 June 2013

Transcript

  • 1. JISC Assessment and Feedback Programme Webinar: Analysis of Feedback, 24 June 2013, 12.00 - 13.00
  • 2. Speakers: Holly Smith, IoE Assessment Careers, H.Smith@ioe.ac.uk; Anne Jones, QUB e-AFFECT, a.m.jones@qub.ac.uk; Maria Fernandez-Toro, Open University eFEP, Maria.Fernandez-Toro@open.ac.uk; Peter Chatterton, Daedalus e-World, Critical Friend, peter.chatterton@daedalus-e-world.com. Project links: Institute of Education www.ioe.ac.uk/assessmentcareers; Queen's University Belfast http://www.qub.ac.uk/directorates/AcademicStudentAffairs/CentreforEducationalDevelopment/e-AFFECTproject/; Open University http://www.open.ac.uk/blogs/efep/
  • 3. Webinar objective. To explore the following: why analyse feedback?; approaches and tools for analysing feedback; institutional experiences of using the tools - resulting feedback profiles and audits; benefits, impact and challenges from using the tools.
  • 4. Why analyse feedback?
  • 5. Why analyse feedback? Widely inconsistent practice in feedback (quality & quantity); lack of learner understanding of, engagement with, and dialogue/action on feedback; high teacher effort, low efficiency; not utilising self/peer feedback; transmitted feedback creates dependency on the teacher; reduced staff satisfaction as evidence of feed-forward is not seen; NSS scores. The aim is to move along a continuum from dependent learners (feedback "done to them") towards independent learners capable of self-review.
  • 6. Approaches and tools for analysing feedback
  • 7. IOE Coding framework: the score is the number of times a classification appears in the feedback; the default unit of analysis was the sentence; where a sentence contained clauses that make distinct points, it was split into separate clauses, each of which was classified separately; neutral comments that, for example, describe the piece of work but do not make any judgement are unclassified.
  • 8. IOE Feedback tool. P1 Giving praise; P2 Recognising progress or ipsative feedback. Criticisms: C1 Correction of errors; C2 Factual criticisms; C3 Criticism of approach. Giving advice: A1 Specific to content of current assignment; A2 General skills in current assignment; A3 For future assignments. Q Clarifications and questions; O Other unclassified statements. Adapted from Orsmond & Merry (2011), including Hughes (2011).
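As a minimal illustrative sketch (not part of the original slides), the scoring rule on slide 7 applied to the categories on slide 8 amounts to counting manually assigned codes per sentence or clause. The function and variable names below are assumptions for illustration, not the projects' actual tooling.

```python
# Hypothetical sketch: tallying manually coded clauses with the IOE
# categories (slide 8). The scoring rule (score = number of times a
# classification appears) follows slide 7.
from collections import Counter

IOE_CATEGORIES = {
    "P1": "Giving praise",
    "P2": "Recognising progress (ipsative feedback)",
    "C1": "Correction of errors",
    "C2": "Factual criticisms",
    "C3": "Criticism of approach",
    "A1": "Advice specific to content of the current assignment",
    "A2": "Advice on general skills in the current assignment",
    "A3": "Advice for future assignments",
    "Q": "Clarifications and questions",
    "O": "Other unclassified statements",
}

def score_feedback(coded_clauses):
    """Count how many coded sentences/clauses fall into each IOE category.

    `coded_clauses` is a list of category codes assigned by the coder,
    one per sentence or clause, e.g. ["P1", "C1", "A3"].
    """
    counts = Counter(code for code in coded_clauses if code in IOE_CATEGORIES)
    return {code: counts[code] for code in IOE_CATEGORIES}

# Example: feedback on one script, coded clause by clause.
print(score_feedback(["P1", "P1", "C1", "A1", "Q", "O"]))
```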
  • 9. e-AFFECT - What the students say: "Last year their feedback pointed out spelling mistakes or referencing mistakes, but we were not told how to do things right"; "What do they mean by clear and concise?"; "Sometimes they say something to encourage that is not really true – am I excelling at it or are you being nice?"; "I liked feedback which helped me improve my work the next time. Despite this, I felt that my marks never really changed much and they tended to stay at the same level"; "What do '?' and 'What?' in the margin mean?"
  • 10. e-AFFECT - Analysis of coursework. Content analysis of depth of feedback using adapted categories developed by Glover and Brown (2006): Level 1 - indication that there is a strength, error, weakness or omission; Level 2 - provides a correction or appropriate response, or an indication of why something is a strength; Level 3 - provides an explanation as to why the student's response was incorrect or inappropriate, why a suggestion was preferable, or how a strength can be built upon. Sample analysed:
                Essays   Lab reports   Total
    Year 1      40       26            66
    Year 2      61       24            85
    Total       101      50            151
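The percentage distributions shown later (slides 20-23) follow from this coding. As a rough sketch only (not from the slides), counts of comments coded by focus (strength or weakness) and depth level could be turned into percentages as below; the function and all names are assumed for illustration.

```python
# Hypothetical sketch: percentage distribution of comments by focus and
# depth level, following the three levels on slide 10 (Glover & Brown 2006).
from collections import Counter

def depth_distribution(coded_comments):
    """Percentage of comments at each (focus, level) combination.

    `coded_comments` is a list of (focus, level) pairs assigned by the
    coder, e.g. [("weakness", 1), ("strength", 2)].
    """
    counts = Counter(coded_comments)
    total = sum(counts.values())
    return {key: 100 * n / total for key, n in counts.items()}

# Example with made-up codings for a handful of comments on one module.
coded = [("weakness", 1), ("weakness", 1), ("weakness", 2),
         ("strength", 1), ("strength", 2), ("weakness", 3)]
for (focus, level), pct in sorted(depth_distribution(coded).items()):
    print(f"{focus} level {level}: {pct:.1f}%")
```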
  • 11. Analysing assignment feedback. Analysis criteria based around two dimensions: whether feedback focuses on strengths or weaknesses; and how much information the feedback provides, cf. 'depth' of feedback (Brown & Glover 2006) / layers of scaffolding. Analysis tool: Feedback Analysis Chart for Tutors (FACT), which provides a visual 'profile' of a tutor's feedback. Screencast description: http://www.open.ac.uk/blogs/efep/?page_id=114
  • 12. Layers of scaffolding in assignment feedback (examples from Spanish language assignments; 'manos blancos' means 'white hands' and contains a gender-agreement error):
    Layer 1 - Weaknesses: error identified only (e.g. 'Manos blancos' marked, or '????'). Strengths: strength identified only (e.g. 'Good work').
    Layer 2 - Weaknesses: error categorised, but not corrected (e.g. 'Manos blancos - Agreement'). Strengths: strength categorised or described as per marking criteria (e.g. 'You use a wide range of language structures').
    Layer 3 - Weaknesses: error corrected (e.g. 'Manos blancos blancas'). Strengths: illustrated with a specific example from the student's performance (e.g. 'No digo que quieran... Good use of the subjunctive').
    Layer 4 - Weaknesses: explanation given (e.g. 'Although it ends in O, "mano" is a feminine noun.'). Strengths: explanation given (e.g. 'This connector makes it very clear that a new section is starting here.').
    Layer 5 - Weaknesses: advice given on how to prevent errors in future performance (e.g. 'Revise section 6.1 of your grammar book'). Strengths: advice given on how to develop existing strengths in future (e.g. 'Good, you could also look up...').
  • 13. Feedback Analysis Chart for Tutors (FACT). Possible uses - enables us to compare: feedback relating to different criteria; feedback given by different tutors; feedback given to more/less proficient students; feedback related to different types of assignment (e.g. spoken presentations vs. written essays); feedback delivered through different media (e.g. written vs. audio-recorded feedback).
  • 14. Example of FACT analysis grid (tutor A: beginner assignment). [Slide shows four FACT grids - Content, Language, Feedback form and Notes on script - each tallying the tutor's comments on weaknesses and strengths across the layers identified only, categorised/described, corrected/modelled/exemplified, explained and future-oriented. The individual cell counts are not reliably recoverable from this transcript.]
  • 15. Example of FACT analysis grid (tutor B: advanced assignment). [Same four-grid layout as slide 14; the individual cell counts are not reliably recoverable from this transcript.]
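As an aside (not from the slides), a FACT-style grid is essentially a table of counts keyed by feedback channel, scaffolding layer and focus. A minimal sketch follows, assuming simplified layer labels that merge the weakness/strength wording from slides 12 and 14; all names are illustrative.

```python
# Hypothetical sketch of a FACT-style grid: counts of comments keyed by
# (layer, focus) within each feedback channel (slides 14-15). Labels are
# simplified/merged versions of those on the slides.
from collections import defaultdict

LAYERS = ["Identified only", "Categorised / Described",
          "Corrected / Modelled / Exemplified", "Explained", "Future-oriented"]
FOCI = ["Weaknesses", "Strengths"]
CHANNELS = ["Content", "Language", "Feedback form", "Notes on script"]

def new_fact_grid():
    """An empty FACT grid: channel -> (layer, focus) -> count."""
    return {channel: defaultdict(int) for channel in CHANNELS}

def record_comment(grid, channel, layer, focus):
    """Add one coded feedback comment to the grid."""
    assert channel in CHANNELS and layer in LAYERS and focus in FOCI
    grid[channel][(layer, focus)] += 1

def print_channel(grid, channel):
    """Print one channel of the grid as a small plain-text table."""
    print(channel)
    print(f"{'Layer':<36}{'Weaknesses':>12}{'Strengths':>12}")
    for layer in LAYERS:
        w = grid[channel][(layer, "Weaknesses")]
        s = grid[channel][(layer, "Strengths")]
        print(f"{layer:<36}{w:>12}{s:>12}")

# Example: a few coded comments from one marked assignment.
grid = new_fact_grid()
record_comment(grid, "Language", "Corrected / Modelled / Exemplified", "Weaknesses")
record_comment(grid, "Language", "Explained", "Weaknesses")
record_comment(grid, "Content", "Identified only", "Strengths")
print_channel(grid, "Language")
```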
  • 16. Questions?
  • 17. Institutional experiences of using the tools
  • 18. IOE Data: analysed formative and summative assessment feedback for modules on 5 postgraduate programmes at the IOE (228 pieces in total); recorded the total number of comments in each category and the average per script; ranked the categories to obtain a feedback profile at programme level as well as an aggregate profile of the 5 programmes.
  • 19. IOE Profile for Summative Assessment. Total comments for 5 programmes, summative assessment (N = 165):
    Category of feedback                               Average per script                      Rank
    P1 Praise                                          4.4                                     1
    P2 Ipsative (progress)                             0 (negligible)                          5
    C1-C3 Critique                                     2.7                                     2
    A1-A3 Advice for current or future assignments     1.9 (mostly for current assignment)     3
    Q Questions and clarification requests             0.1                                     4
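A minimal sketch (not from the slides) of the averaging and ranking step described on slides 18-19: per-script category counts are summed, divided by the number of scripts, and the categories are then ranked by their average. The function and example data below are assumptions for illustration only.

```python
# Hypothetical sketch: average comments per script for each category,
# ranked to give a feedback profile (slides 18-19).
from collections import Counter

def feedback_profile(scripts):
    """Return (rank, category, average per script), most frequent first.

    `scripts` is a list of per-script category counts, e.g.
    [{"P1": 4, "C1": 2}, {"P1": 5, "A1": 3}].
    """
    totals = Counter()
    for script in scripts:
        totals.update(script)  # Counter.update adds the counts together
    n = len(scripts)
    averages = {cat: count / n for cat, count in totals.items()}
    ranked = sorted(averages.items(), key=lambda kv: kv[1], reverse=True)
    return [(rank, cat, round(avg, 1))
            for rank, (cat, avg) in enumerate(ranked, start=1)]

# Example with made-up counts for three scripts.
example = [{"P1": 4, "C1": 3, "A1": 2}, {"P1": 5, "C1": 2}, {"P1": 4, "A1": 2, "Q": 1}]
for rank, cat, avg in feedback_profile(example):
    print(rank, cat, avg)
```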
  • 20. e-AFFECT - Comments on essays. [Two bar charts: percentage distribution of comments at Year 1 and Year 2, by category (student response, student skills, achievement), split into strengths and weaknesses.]
  • 21. e-AFFECT - Further learning comments. [Bar chart: percentage of comments relating to source materials, dialogue and reflection, and future work, for Year 1 and Year 2 work.]
  • 22. e-AFFECT - Depth of feedback. Depth categories as on slide 10: Level 1 - indication that there is a strength, error, weakness or omission; Level 2 - provides a correction or appropriate response, or an indication of why something is a strength; Level 3 - provides an explanation as to why the student's response was incorrect or inappropriate, why a suggestion was preferable, or how a strength can be built upon. [Two diverging bar charts showing the percentage of comments at each depth level, weaknesses vs. strengths, for Year 1 and Year 2.]
  • 23. e-AFFECT - Module depth of feedback. [Four diverging bar charts showing the percentage of comments at each depth level, weaknesses vs. strengths, for Year 1 Module A, Year 1 Module B, Year 2 Module C and Year 2 Module D.]
  • 24. e-AFFECT - Feedback workshop
  • 25. Analysing what? Feedback on language assignments at the OU. Our sample: 100 writing assignments; 100 speaking assignments; 4 levels (9 tutors per level); 108 students (3 per tutor). The feedback consists of: 200 e-feedback forms; 100 annotated scripts; 100 audio files. [Diagram of a tutor returning feedback ('Well done!') to a student.]
  • 26. Summary of FACT analysis results: use of the four media.
    E-feedback forms (writing/speaking) contain the highest proportion of comments on strengths, comments on content, and categorised strengths/weaknesses (all of these occur even more frequently for the written assignment); they contain the lowest proportion of comments on weaknesses, comments on language, corrected errors/exemplified strengths, and explanations (all of these occur a bit more frequently for the speaking assignment).
    Script annotations and audio feedback contain the highest proportion of comments on weaknesses (often addressed at more than one depth), comments on language, corrected errors, and explanations (with an especially high occurrence in the audio feedback); they contain the lowest proportion of comments on strengths, comments on content, categorised strengths/weaknesses, and advice for the future (proportion on script = 0%; quite low everywhere else, both in the audio feedback and on the e-feedback forms).
  • 27. Questions?
  • 28. Benefits, impact and challenges from using the tools
  • 29. IOE Benefits, Impact and Challenges: enables initiation of discussion of feedback at the programme team level; facilitates reflection on the purpose of feedback within programme teams; feedback practices are very entrenched and resistant to change.
  • 30. e-AFFECT - Benefits, impacts and challenges from using the tools. Benefits: staff seeing the real issue rather than an imagined issue; engendering dialogue; raising awareness of feedback messages. Impacts: work in progress. Challenges: reaching consensus on the level and quantity of feedback.
  • 31. Benefits, impact and challenges from using the FACT analysis tool. A useful research tool: reveals overall patterns of use of different media for giving feedback on language assignments at the OU. Results need to be interpreted with caution (e.g. 'deeper' feedback is not necessarily the most appropriate in all contexts). Not suitable for quantitative evaluation by practitioners: coding requires complex guidelines in order to be reliable. Suitable for awareness-raising purposes in staff training events; materials include a sample of marked assignments, coding grids and student webcasts giving their 'feedback on feedback' (online training event: 67% 'very useful', 33% 'possibly useful'). FACT criteria are now also presented as a simplified checklist for reflection. Informs a new research strand focusing on feedback alignment.
  • 32. Questions?
  • 33. Further info on projects
  • 34. More info on projects. Assessment Careers - Institute of Education: www.ioe.ac.uk/assessmentcareers; http://youtu.be/VSaGbPoXPh0. Contact: Holly Smith, H.Smith@ioe.ac.uk.
    References:
    1. Brown, E. & Glover, C. (2006) Evaluating written feedback. In: C. Bryan & K. Clegg (Eds) Innovative Assessment in Higher Education. London: Routledge, 81-91.
    2. Hattie, J. & Timperley, H. (2007) The Power of Feedback. Review of Educational Research, 77, 81-112.
    3. Hughes, G. (2011) Aiming for Personal Best: a Case for Introducing Ipsative Assessment in Higher Education. Studies in Higher Education, 36(3), 353-367.
    4. Orsmond, P. & Merry, S. (2011) Feedback alignment: effective and ineffective links between tutors' and students' understanding of coursework feedback. Assessment & Evaluation in Higher Education, 36(2), 125-126.
  • 35. More info on projects. Queen's University Belfast, e-AFFECT: http://www.qub.ac.uk/directorates/AcademicStudentAffairs/CentreforEducationalDevelopment/e-AFFECTproject/; http://blogs.qub.ac.uk/e-affect/. Contact: Anne Jones, a.m.jones@qub.ac.uk.
  • 36. More info on projects. Open University, The eFeedback Evaluation Project (eFEP): http://www.open.ac.uk/blogs/efep/. Screencast description: http://www.open.ac.uk/blogs/efep/?page_id=114. Contact: Maria Fernandez-Toro, Maria.Fernandez-Toro@open.ac.uk.