The Information Literacy Impact Factor: How to Measure Value - Author: Lorna Dodd

Presentation given by Lorna Dodd (UCD Library Liaison Librarian) at the LILAC 2013 conference in Manchester, England, March 25-27, 2013.

Presentation given by Lorna Dodd (UCD Library Liaison Librarian) at LILAC 2013 conference in Manchester, England - March 25-27, 2013

Published in: Education
0 Comments
0 Likes
Statistics
Notes
  • Be the first to comment

  • Be the first to like this

No Downloads
Views
Total Views
383
On Slideshare
0
From Embeds
0
Number of Embeds
0
Actions
Shares
0
Downloads
7
Comments
0
Likes
0
Embeds 0
No embeds

Report content
Flagged as inappropriate Flag as inappropriate
Flag as inappropriate

Select your reason for flagging this presentation as inappropriate.

Cancel
Speaker notes
  • 34 schools in seven colleges
  • Online tools – instructional videos, libguides, tutorial
  • Spoke to colleagues in teaching and learning for advice on the best tool to collect this information. We currently don’t offer a wide range of online options. Academics are more likely to say they would prefer the option they are familiar with.
  • 314 responses – a 45% response rate; 268 completed the survey (a quick arithmetic check follows these notes).
  • Focus on those who have had library instruction.
  • Focus on those who have had library instruction.
  • Tours are good for using a better & broader range of resources and for improved ability at finding information. A significant proportion feel tours are not good for citing and plagiarism awareness. Disagreement regarding students being better at everything.
  • Similar to what tours are good at: lectures are good for a better & broader range of resources and improved ability at finding information. Fewer disagree that students are more aware of citing & plagiarism. Disagreement regarding students being better at everything.
  • Significantly better in all aspects. Most noticeable is that very few disagree with any. Stage 2 significantly more positive.
  • Feel taught postgrads get more out of a tour in all aspects. The significant difference to undergraduates is that there is no disagreement in any aspect except plagiarism.
  • Similar pattern to tours
  • Similar pattern. Overall most effective at improved ability at finding information.
  • Bespoke sessions, specifically tailored – much better at achieving all categories.
  • Of the 268 people who completed the survey.
  • Nearly everyone who had library instruction believes that ALL students should have it. Whether they have had library instruction has a slight impact on whether they think all students should have it.
  • Of the 268 people who completed the survey.
  • Subject-specific workshop; even split between the others.
  • Much more even split; face-to-face with a library professional still preferred.
  • Similar pattern to undergraduates
  • Similar pattern to undergraduates
  • Not everyone rated from 1–6, so the results are a little unreliable. That probably explains why subject-specific workshops got more last choices than any other option. Same pattern for those who had tours, lectures, workshops or no instruction.
  • Fear of losing these services.
  • Fear of losing these services.
  • There is a definite benefit to library instruction, so it should continue in some form. Active learning is preferred – which presents a challenge. More emphasis on the impact we can have on learning how to cite and avoid plagiarism. Measuring impact is still very subjective.
  • Agreement that all students should have the opportunity. If resources are limited, we need to focus on specific groups – stages 1 & 2.
  • Does this have implications for the lower impact on students’ understanding of how to cite and avoid plagiarism?
  • Fear of losing these services.
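
A quick check of those figures, assuming the roughly 700 module coordinators mentioned in the transcript (slide 16) as the denominator: 314 / 700 ≈ 0.45, consistent with the stated 45% response rate, and 268 / 314 ≈ 0.85, so about 85% of respondents went on to complete the full survey.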
  • Transcript

    • 1. The Information Literacy Impact Factor: How to Measure Value. Lorna Dodd, College Liaison Librarian, UCD Library | Dublin 4. Email: Lorna.dodd@ucd.ie Tel: +353 (0)1 716 7074 www.ucd.ie/library
    • 2. Outline • Why conduct a survey of module coordinators? • Review of the Literature • Rationale • How the survey was conducted • Survey results • Implications for the future • Conclusions
    • 3. Why?
    • 4. Why? Looking Back & Looking Forward
    • 5. Looking Back • Large University • No single strategy for the development of Information Literacy • Heavy load of information skills delivery • Driven by individuals • Sometimes evaluated but rarely measured • No clear indication of meeting learning outcomes (Dodd, L. & Kendlin, V., 2010)
    • 6. Looking Forward… New Library Strategic Plan 2010-2014: “monitor, measure and evaluate the Library’s teaching and learning strategy and activities.” (UCD Library, 2010)
    • 7. Looking Forward • Realignment of staffing structure • From approx. 17 Liaison Librarians to 6 College Liaison Librarians • Move from a module to a programme approach • Move from responsive to consultation • Measure impact on student learning
    • 8. Looking Forward…
    • 9. Literature Review (1) • A lot available discussing why it is important to measure impact • Much of what has been written focuses on Library use rather than instruction (Stone 2011; Poll & Payne 2006; Schilling & Applegate 2012) • Limitations of traditional organisational approaches at measuring success and meeting targets • Most libraries focus on process and output indicators rather than measuring impact • Measuring impact on student learning requires systematic evaluation of training (Markless & Streatfield 2006; Schilling & Applegate 2012)
    • 10. Literature Review (2) • Several problems arise when measuring impact: data protection rules; differing value of services for specific groups; difficulty in measuring long-term impact; the time-consuming nature of measuring the impact of Library instruction • Important to use the results of any study conducted to measure the impact of Library services: benchmarking activities; improving services; justifying resources used for services; campaigning for increased funding • Common/successful ways to measure the impact of Library instruction: surveys; pre- and post-tests; self-assessments; behavioural observations (Poll & Payne 2006)
    • 11. Literature Review (3) • 2006 LIRG/SCONUL measuring impact initiative • Developed an impact process • Identified key performance indicators • Used to evaluate the effectiveness of interventions supporting teaching and learning • Research provided information regarding emerging issues throughout the process • Results included: improved relationships with the academic community; a raised profile for Library staff; better understanding of how the Library can support academic programmes (Markless & Streatfield 2006; Blagden, 2005)
    • 12. Literature Review (4) • 2012 CONUL Information Literacy Survey gathered feedback on the impact of information literacy instruction • Found library instruction had a positive impact on students’ skill development • Information Literacy often assessed • Targeted respondents: academics already fully engaged; first-year students only; small sample from each institution (CONUL Advisory Committee on Information Literacy, 2013)
    • 13. Rationale • We needed to: make academics aware of new approach; create a strategy to get academics to embrace new approach; identify what has been successful to date; collect feedback from those not actively engaged with the Library; identify what methods would be useful going forward • Problems: measuring impact retrospectively; perceived impact is subjective; external factors influence development of information skills; more likely to opt for options that are familiar
    • 14. Survey • Seeking information on modules • Two parts: 1. Evaluating information skills delivery to date; 2. Identifying what would be useful in the future • Survey Monkey • 11 questions • Mainly quantitative • Multiple choice questions • Measurement/ranking questions
    • 15. Survey • Introduction: level of module – undergraduate/taught postgraduate; if there has been Library instruction • Part 1: kind of instruction; impact of instruction on students’ abilities • Part 2: should all students have library instruction; when should library instruction happen; which services should be developed • Free question
    • 16. Promotion • Approx. 700 module coordinators • Only possible to target academic staff • Not all module coordinators are academic staff and not all academic staff are module coordinators • Targeting email – approx. 1,900 • College Liaison Librarian contacts • Three weeks
    • 17. Survey Results • Type of instruction • Impact of instruction • When and who? • Preferred services for the future • Qualitative feedback
    • 18. Type of Library Instruction
    • 19. Impact of Library Instruction
    • 20.–24. [charts: impact of library instruction]
    • 25. Mostly Taught Postgraduate: • EndNote • Special Collections • PBL • One-on-One Consultation
    • 26. “Students have always received instruction to date and so difficult to answer the comparison questions below”
    • 27. Should ALL students have Library Instruction?
    • 28. Should ALL students have Library Instruction?
    • 29. Should Library Instruction happen in every year?
    • 30. Should Library Instruction happen in every year?
    • 31. Which Stage?
    • 32. “Instruction in Year one should be built on in subsequent years to enhance skills by year”
    • 33. Preferred Services for the Future
    • 34.–35. [charts: preferred services for the future]
    • 36. Influence of instruction had on preference?
      Instruction had | First choice               | Second choice                              | Last choice
      Tour            | Subject specific workshops | Generic workshop / Tutorial / Online video | Tour
      Lecture         | Subject specific workshops | Generic workshop / Tutorial                | Tour / Online video
      Workshop        | Subject specific workshops | Generic workshop / Tutorial                | Tour
      No instruction  | Subject specific workshops | Subject specific workshops                 | Tour
      (A short tallying sketch for this kind of ranking data follows the transcript.)
    • 37.–39. [charts]
    • 40. “One of the best approaches would be to create materials that are reusable and for any student, regardless of level. Just because a student moves up a level doesn’t mean they’ve acquired all of the learning they should have.” “…really good online video materials, from complete intro to specific and advanced, and available on demand is the way to go.” “Tutorials (tutors would be trained by Library staff)”
    • 41. Qualitative Feedback
    • 42. Comments – Themes • 20% (53) of those who completed the survey made comments • In line with our new strategy • Online options • Constructive comments & suggestions on current and future services • Complimenting services to date • Comments on the survey
    • 43. Comments • 5 respondents made comments in line with our new strategy: programme approach; reusable learning objects and online tools • 6 respondents made comments about proposed future instruction
    • 44. Comments • 12 respondents made comments about current services and changes • Problems with the current service: “…same introductory talks by librarian using lecture slots of various modules for at least three times (the third time he skipped as he found it ridiculous)”; “Students just skip the guest lecture.”
    • 45. Comments • Benefits of the current service: “…requires personal contact with a friendly face early on.”; “The current reduction in library services is a real retrograde step especially with the explosion in available information sources”
    • 46. Comments • Over half (28) complimented the current strategy: “we really love our librarians; thank you” • ‘Campaigning’ for the traditional service • Acknowledging individuals • Recognising the value of information literacy to students
    • 47. “…it makes a noticeable difference in the quality of assignment students produce. The Library Tutorial is typically rated as one of the most important lectures they have in that Stage 1 module.”
    • 48. Implications for the Future
    • 49. Impact of Instruction • Definite benefit in Library instruction • Active learning environment is preferred • Stronger evidence in the discovery and use of resources • Remains difficult to identify the ‘real’ impact
    • 50. Timing of Instruction • Most module coordinators feel all students should have Library instruction • Instruction for students at every stage is desirable • Taught Postgraduates: beginning and middle of Semester 1 • Undergraduates: Stage 1; Stage 3
    • 51. Preferred Services • Active learning sessions tailored to subject needs most preferred • Tours perceived as ineffective • Questions: Module coordinators were asked to identify how effective instruction was at improving students’ information literacy skills. When they request instruction, do they usually consider this? Or do they feel students need to know about the library rather than what they need in order to transition from second level, develop research skills, etc.?
    • 52. Questions • Did respondents choose the workshop environment over the online version because they are more familiar with it? • If so, then why did the majority of those who had no instruction choose workshops? • If delivering workshops on this scale is unsustainable, what can we do to ensure effectiveness? • What work do we need to do in terms of marketing?
    • 53. Conclusion • Shared understanding between Library & academics on: the benefit of instruction; taking a programme approach; ensuring all students get an equal opportunity (core modules) • Difference in: the most appropriate method of delivery
    • 54. References • Blagden, P. (2005) ‘The LIRG/SCONUL Measuring Impact Initiative: Overview of phase 1 impact projects’, Library & Information Research (LIR), vol. 29 (91). • Dodd, L. & Kendlin, V. (2010) ‘Damned if we do and Damned if we don’t: How to address sustainability in the delivery of information literacy components in UCD’, Librarians’ Information Literacy Annual Conference (LILAC), March 29-31, Limerick, Ireland. • Markless, S. & Streatfield, D. (2006) ‘Gathering and applying evidence of the impact of UK university libraries on student learning and research: A facilitated action research approach’, Journal of Information Management, vol. 26, pp. 3-15. • Poll, R. & Payne, P. (2006) ‘Impact measures for libraries and information services’, Library Hi Tech, vol. 24 (4), pp. 547-562. • Schilling, K. & Applegate, R. (2012) ‘Best methods for evaluating educational impact: a comparison of the efficacy of commonly used measures in library instruction’, Journal of the Medical Library Association, vol. 100 (4), pp. 258-269. • Stone, G. et al. (2011) ‘Does library use affect student attainment? A preliminary report on the Library Impact Data Project’, Liber Quarterly, vol. 21 (1), pp. 5-22. • UCD Library (2010) ‘UCD Library Strategic Plan 2010-2014’, http://www.ucd.ie/library/news_publicity/showcase_strategy/
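
The preference table on slide 36 summarises responses to the survey’s 1–6 ranking question. As a rough illustration of how first-, second- and last-choice tallies can be produced from that kind of ranking data, here is a minimal Python sketch; the service names and responses in it are illustrative assumptions, not the survey’s actual option list or data, and it is not the authors’ own analysis (which used SurveyMonkey exports).

from collections import Counter, defaultdict

# Assumed set of six service options, matching the 1-6 ranking scale in the notes.
SERVICES = ["Tour", "Lecture", "Generic workshop",
            "Subject specific workshop", "Online tutorial", "Online video"]

# Hypothetical responses: the instruction a module's students already had,
# plus a (possibly incomplete) ranking of preferred future services, 1 = first choice.
responses = [
    {"had": "Tour", "ranking": {"Subject specific workshop": 1, "Generic workshop": 2,
                                "Online tutorial": 3, "Online video": 4,
                                "Lecture": 5, "Tour": 6}},
    {"had": "Lecture", "ranking": {"Subject specific workshop": 1, "Online video": 2,
                                   "Generic workshop": 3, "Online tutorial": 4,
                                   "Lecture": 5, "Tour": 6}},
    {"had": "No instruction", "ranking": {"Subject specific workshop": 1,
                                          "Generic workshop": 2}},  # incomplete 1-6 ranking
]

def tally(responses):
    """Count first and last choices per prior-instruction group."""
    counts = defaultdict(lambda: {"first": Counter(), "last": Counter()})
    for r in responses:
        ranking = r["ranking"]
        if not ranking:
            continue  # respondent ranked nothing at all
        group = counts[r["had"]]
        group["first"][min(ranking, key=ranking.get)] += 1
        # Count a 'last choice' only when the full ranking was given;
        # the speaker notes flag partial 1-6 rankings as a source of noise.
        if len(ranking) == len(SERVICES):
            group["last"][max(ranking, key=ranking.get)] += 1
    return counts

for had, c in tally(responses).items():
    print(had, "| first:", c["first"].most_common(1), "| last:", c["last"].most_common(1))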
