
Collecting meaningful feedback on Information literacy training: results of a project to evaluate feedback methods - Coles & Perris


Presented at LILAC 2018


  1. Collecting meaningful feedback on information literacy training: results of a project to evaluate feedback methods
     LILAC 2018, Liverpool, 3rd–6th April
     Kim Coles, University of Reading
     Kate Perris, London School of Hygiene and Tropical Medicine
  2. CC0. Pixabay.
  3. Review of teaching and evaluation
     • In 2016/17, a review of information skills teaching included a review of teaching evaluation.
     The aim of this project was to trial alternative methods of feedback collection, to evaluate their effectiveness, and to recommend feedback methods.
     CC0. Pixabay.
  4. What is being evaluated?
     • 1,160 London-based students, 3,274 distance learning students
     • Teaching delivered face-to-face as part of modules, and online via Moodle (VLE) for distance learners
     • Feedback on information skills teaching has been collected since 2013/14
     • Print A4 survey, completed by hand at the end of each session
     • Input into an MS Access database; a report is produced each year
     CC0. Pixabay.
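The workflow on this slide (print surveys keyed into an MS Access database, with a report produced each year) can be sketched as a small tallying step. This is a minimal illustration only, not the project's actual database schema: the record fields, confidence scores, and session names below are hypothetical.

```python
# Minimal sketch of the annual-report step: tally responses per session.
# Fields and values are invented examples, standing in for rows keyed
# into the MS Access database from the print A4 forms.
responses = [
    {"session": "Foundation 1", "confidence": 4},
    {"session": "Foundation 1", "confidence": 5},
    {"session": "Travel Medicine", "confidence": 3},
]

def annual_report(rows):
    """Summarise response count and mean confidence score per session."""
    by_session = {}
    for row in rows:
        by_session.setdefault(row["session"], []).append(row["confidence"])
    return {
        session: {
            "responses": len(scores),
            "mean_confidence": sum(scores) / len(scores),
        }
        for session, scores in by_session.items()
    }

report = annual_report(responses)
print(report["Foundation 1"])  # {'responses': 2, 'mean_confidence': 4.5}
```

A spreadsheet or database query would do the same grouping; the point is only that the annual report is an aggregation over per-session response records.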
  5. Why evaluate our teaching?
     1. Have attendees acquired the skills to meet the learning objectives?
     2. Do attendees feel more confident in their skills as a result of training?
     3. Do attendees use the skills acquired in the training in their literature searching?
     4. Have attendees’ skills improved as a result of training?
     CC0. Pixabay.
  6. Feedback methods
     [Diagram: methods mapped on two axes — short-term vs long-term reflection, and simple vs more complex administration/collection.]
     *These methods may be used as proxies for evaluation aims, and may be affected by other factors.
  7. When should we ask for feedback?
     Cellphone seesaw by Tilemahos Efthimiadis. CC BY-SA. Flickr.
  8. Which methods did we test?
     Method | Delay | Classes
     Online survey using student bookings system | 24 hours | Foundation 1
     Online survey using Bristol Online Surveys | In session | IID, IDAC, DH
     Online survey using Bristol Online Surveys | 1 week | PHEC, HPPF
     Online polling using Mentimeter/PollEverywhere | In session | FRH
     Print confidence rating question | In session | Travel Medicine
     Print One Minute Paper | In session | Foundation 2
     Print survey | In session | GMH, DrPH
     Online survey using Bristol Online Surveys | In students’ own time* | Distance Learning Group 1
     Comments and votes using Tricider (snowball-ish) | In students’ own time* | Distance Learning Group 2
     *Feedback was open throughout the online course; students were notified on Moodle and at the end of the course that they could submit feedback.
  9. Response rates
     *Online feedback using the student bookings system was only requested with a 24-hour delay.
     **Online feedback using the Bristol Online Surveys tool includes feedback collected with a 1-week delay; if only in-session Bristol Online Surveys are averaged, the response rate is 68%.
     All other feedback methods were administered in the session.
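The pooled averaging described in the footnotes above can be sketched as follows. The attendance and response counts here are invented for illustration (chosen so the filtered example reproduces a 68% rate); the slide's real underlying figures are not in this text.

```python
# Hypothetical per-class attendance and response counts; the real
# figures behind the slide's 68% in-session average are not given here.
trials = [
    {"method": "Bristol Online Surveys", "delay": "in session", "attended": 25, "responded": 17},
    {"method": "Bristol Online Surveys", "delay": "1 week", "attended": 30, "responded": 9},
    {"method": "Print survey", "delay": "in session", "attended": 20, "responded": 18},
]

def response_rate(rows, method=None, delay=None):
    """Pooled response rate (responses / attendance), optionally
    filtered by feedback method and/or collection delay."""
    subset = [
        r for r in rows
        if (method is None or r["method"] == method)
        and (delay is None or r["delay"] == delay)
    ]
    attended = sum(r["attended"] for r in subset)
    return sum(r["responded"] for r in subset) / attended if attended else 0.0

rate = response_rate(trials, method="Bristol Online Surveys", delay="in session")
print(f"{rate:.0%}")  # 68%
```

Pooling totals before dividing weights each class by its attendance, which is usually what a "response rate for these sessions" claim means; averaging per-class percentages instead would weight small and large classes equally.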
  10. Ease of analysis
      • How easy was it to collect responses?
      • How easy was it to input the data?
      • How easy was it to analyse the data?
      Online surveys and short surveys (print and online) scored highly in ease of collection and input. Online surveys scored highly on data analysis.
      Deckchairs at the beach. Sharlon Garland. CC BY. Flickr.
  11. CC0. Pixabay.
  12. What is feedback for?
      Feedback can serve a number of purposes:
      • Development: to improve teaching and learning
      • Appraisal: to collect evidence of teacher competence
      • Accountability: to collect evidence of course/programme effectiveness
      • Innovation: to initiate, test/experiment, and develop
      Light, Greg, and Roy Cox. Learning & Teaching in Higher Education. SAGE Publications Ltd, 2001. Accessed 7 June 2017.
      Handwriting by Anntimony. CC BY. Flickr.
  13. What questions had we been asking?
      [Table: question types used in each method.]
      Methods reviewed: student bookings system online survey; Bristol Online Surveys online survey; print survey; print confidence rating; print one minute paper; online poll (PollEverywhere/Mentimeter).
      Question categories: developing teaching; appraising teaching/teacher; general questions (e.g. cohort); accountability and effectiveness; innovation.
  14. Selecting questions
  15. Implementing the new system of evaluation
      Three versions of a feedback questionnaire:
      • Online evaluation form
      • Print survey
      • PollEverywhere feedback slides
      A new document includes recommendations for the use of each survey.
  16. Online evaluation form
  17. Print evaluation form
  18. PollEverywhere feedback slides
  19. Results and next steps
      • Which of the three worked best?
      • What kind of information can we get?
      • How can this be used to improve teaching?
      • What other method of evaluation could be useful?
  20. Thank you by Rachel Patterson. CC BY-NC-ND. Flickr.
  21. Appendix 1: methods of feedback collection
      Short-term/immediate feedback:
      • Written feedback at the end of the class
      • Online feedback at the end of a class
      • One minute paper: What is the most important thing you learned during this session? What is uppermost in your mind now at the end of the session?
      • Plus/delta feedback form: What do you now understand as a result of the session (+)? What do you still have questions about (Δ)? These are passed around, and students can comment on each other’s comments.
      • Reflective triads at the end of sessions
      • ‘Snowball’ evaluation: ask students to make one positive and one negative statement about the class, place these on the board or pass them round, and ask students to vote on the ones that they agree with
      • Muddiest point: ask students what is still unclear after the session
      • Action plan: ask students to state what activity they will complete as a result of the class, or what they will do differently in their research now
      also by temptationize. CC BY-ND. Flickr.
  22. Bibliography
      • Frutchey, Jim. "Utilizing Google Docs as an Assessment Tool for Academic Reference Librarians." Journal of Library Innovation 3.1 (2012): 148-54. Print.
      • Gerwitz, Sarah. "Evaluating an Instruction Program with Various Assessment Measures." 42 (2014): 16-33. Print.
      • Light, Greg, and Roy Cox. Learning & Teaching in Higher Education. United Kingdom: Sage Publications Ltd, 2001. Print.
      • Meredith, William, and Jessica Mussell. "Amazed, Appreciative, or Ambivalent? Student and Faculty Perceptions of Librarians Embedded in Online Courses." Internet Reference Services Quarterly 19.2 (2014): 89-112. Print.
      • Nichols, James, Barbara Shaffer, and Karen Shockey. "Changing the Face of Instruction: Is Online or In-Class More Effective?" American Library Association, 2003. Vol. 64: 378. Print.
      • van Helvoort, A. A. J. "How Adult Students in Information Studies Use a Scoring Rubric for the Development of Their Information Literacy Skills." Journal of Academic Librarianship 38.3 (2012): 165-71. Print.
      • Willson, Rebekah. "Independent Searching During One-Shot Information Literacy Instruction Sessions: Is It an Effective Use of Time?" Evidence Based Library & Information Practice 7.4 (2012): 52-67. Print.