
Evidence Based Librarianship in Practice



From a workshop given at the CLRC in Syracuse on March 24, 2014.


  1. EVIDENCE BASED LIBRARIANSHIP IN PRACTICE: Using Evidence in Health Sciences Libraries. Lorie Kloda, MLIS, PhD, AHIP, McGill University. Central New York Library Resources Council, Syracuse, March 2014
  2. Introductions: Lorie Kloda – Assessment Librarian since 2012; Health Sciences Librarian for 12 years; Montreal, McGill University; Associate Editor, EBLIP journal
  3. Introductions: 1. Your name 2. Your title/position 3. Your city, institution 4. What is your interest in evidence based practice? Why are you here today?
  5. Course objectives • Identify the steps in evidence based practice • Formulate answerable questions relevant to your own work setting • Define what constitutes evidence in your own work setting • Identify strategies for locating local or external evidence to answer your questions • Make use of tools for critically appraising published research • Provide examples of how evidence can be applied by health librarians in the real world
  6. ACTIVITY 1: What are your "burning" questions?
  8. What is EBLIP? “an approach to information science that promotes the collection, interpretation and integration of valid, important and applicable user-reported, librarian observed, and research-derived evidence. The best available evidence, moderated by user needs and preferences, is applied to improve the quality of professional judgements.” (Booth, 2000)
  9. Why should you care? “Wisdom means acting with knowledge while doubting what you know.” Jeffrey Pfeffer & Robert I. Sutton
  10. A brief history: 1997 – Hypothesis article by Jon Eldredge; 2000 – MLA Research Section created an Evidence-Based Librarianship Implementation Committee; 2000 – Eldredge publishes papers that provide the framework for EBL; 2001 – 1st Evidence Based Librarianship conference held in Sheffield, UK; 2004 – Booth and Brice book on EBIP; 2006 – EBLIP journal launches
  11. The 5 As of EBLIP: 1) Formulate a focused question (Ask) 2) Find the best evidence to help answer that question (Acquire) 3) Critically appraise what you have found to ensure the quality of the evidence (Appraise) 4) Apply what you have learned to your practice (Apply) 5) Evaluate your performance (Assess)
  12. 5 As process (Hayward, 2007)
  13. Is the EBLIP model used? • The ideal vs reality • Criticisms of EBLIP • Barriers to practicing in an evidence based manner
  14. Barriers to evidence use • Organizational dynamics • Lack of time/competing demands on time • Personal outlook / lack of confidence • Education and training gaps • Information needs not being met • Financial limits
  15. Determinants by level of control
  16. Other considerations • Individual vs group decision making • Influences / biases • Impact of work environment • Types of evidence • Enablers
  17. Widening the model. A revised process: Articulate – come to an understanding of the problem and articulate it. Assemble – assemble evidence from multiple sources that are most appropriate to the problem at hand. Assess – place the evidence against all components of the wider overarching problem; assess the evidence for its quantity and quality. Agree – determine the best way forward and, if working with a group, try to achieve consensus based on the evidence and organisational goals. Adapt – revisit goals and needs; reflect on the success of the implementation.
  18. Bringing the components together: Research evidence; Professional knowledge; Local evidence
  19. Questions to ask yourself: What do I already know? What local evidence is available? What does the literature say? What other information do I need to gather? How does the information I have apply to my context? Make a decision. What worked? What didn’t? What did I learn?
  20. Case examples: An academic librarian wants to know what professors think of information literacy instruction for students. A librarian at a pediatric hospital wonders if residents’ searches are improved with librarian assistance.
  21. BREAK
  23. “Questions drive the entire EBL process. […] The wording and content of the questions will determine what kinds of research designs are needed to secure answers.” (J. Eldredge, 2000)
  24. SPICE question structure: Setting – the context (e.g., hospital library, academic health center); Perspective – the stakeholder(s) (e.g., graduate students, managers, reference librarians); Intervention – the service being offered (e.g., chat reference, RefWorks workshops); Comparison – the service to which it is being compared (optional); Evaluation – the measure used to determine change/success/impact (e.g., usage statistics, course grade)
  25. Librarianship domains: Reference/Enquiries – providing service and access to information that meets the needs of library users. Education – incorporating teaching methods and strategies to educate users about library resources and how to improve research skills. LIS Education (subset) – specifically pertaining to the professional education of librarians. Collections – building a high-quality collection of print and electronic materials that is useful, cost-effective, and meets users’ needs. Management – managing people and resources within an organization; this includes marketing and promotion as well as human resources. Information access and retrieval – creating better systems and methods for information retrieval and access. Professional issues – exploring issues that affect librarians as a profession. (Koufogiannakis, Crumley, and Slater, 2004)
  26. Librarianship domains • Information access & retrieval • Collections • Management • Education • Reference • Professional issues • [Scholarly communications]
  27. Burning question example 1: What are university faculty members’ perceptions of information literacy?
  28. SPICE example 1 – Setting: research university; Perspective: librarians, professors; Intervention: survey questionnaire to determine attitudes, perceptions, experiences; Comparison: not applicable; Evaluation: ratings of information literacy competencies, inclusion of IL in courses, disciplinary differences
  29. Burning question example 2: Are pediatric residents’ search results improved with help from a librarian?
  30. SPICE example 2 – Setting: pediatric teaching hospital; Perspective: librarians; Intervention: help from a medical librarian for a literature search; Comparison: literature search without assistance; Evaluation: relevance of retrieved results; quality of search strategy
  31. ACTIVITY 2: Formulate your burning question using SPICE
  33. Definition of evidence: “the available body of facts or information indicating whether a belief or proposition is true or valid” (Oxford English Dictionary, 2011)
  34. ACTIVITY 3: What are some possible evidence sources we use to make decisions in libraries?
  35. Evidence sources. Hard evidence: published literature; statistics; local research and evaluation; other documents; facts. Soft evidence: input from colleagues; tacit knowledge; feedback from users; anecdotal evidence.
  36. LUNCH 12:15 – 1:00
  38. Locating published research • Databases • Books, bibliographies • Mail lists, blogs, word of mouth • Conferences • Systematic reviews, evidence summaries
  39. Creating local evidence • Usage data • Transaction data • Evaluation results • Survey, interview, focus group findings • Inputs, outputs, outcomes, impact
  40. Locating published evidence: databases • Library and information studies • Management • Education • Social sciences • Health sciences, psychology
  42. Locating published evidence: conferences • EBLIP (1-7) • Health librarianship, e.g., MLA, CHLA, EAHIL, ICML • Subject librarianship (music, law) • Assessment, e.g., Northumbria Conference, Library Assessment Conference • Academic, e.g., ACRL • Information literacy, e.g., LOEX, WILU, LILAC • LIS research conferences, e.g., ISIC, ASIS&T, CAIS, ALISE, IIiX, AMIA
  43. Locating published evidence: systematic reviews
  44. Locating published evidence: evidence summaries – Evidence Based Library and Information Practice journal, 2006–; over 250 evidence summaries
  45. Creating evidence: data and findings • Usage data • Transaction data • Evaluation results • Survey, interview, focus group findings
  46. Creating evidence: sources for local evidence already available • Library assessment department • University planning and institutional analysis • Annual reports • Internal reports • "Stats"
  47. Creating evidence
  48. Evidence for example 1. Locating evidence • Databases: LISA • Systematic Review Wiki • Journals: Communications in IL, J of IL, J of Academic Librarianship • Conferences: LILAC, LOEX, WILU • EBLIP evidence summary. Creating evidence • Survey questionnaire
  49. Evidence for example 2. Locating evidence • Databases: LibValue, LISA • Systematic review wiki • Journals: JMLA, HILJ, etc. • Conferences: MLA • EBLIP evidence summary. Creating evidence • ???
  50. ACTIVITY 4: 1. Identify 2-3 sources for locating evidence to answer your question 2. Consider 1 potential source of local evidence to look into
  51. CRITICAL APPRAISAL (Appraise)
  52. Critical appraisal: weigh up the evidence • Reliable • Valid • Applicable. Checklists help with the critical appraisal process. Language is different for interpretive (qualitative) research.
  53. Reliability 1. Results clearly explained 2. Response rate 3. Useful analysis 4. Appropriate analysis 5. Results address research question(s) 6. Limitations 7. Conclusions based on actual results
  54. Validity 1. Focused issue/question 2. Conflict of interest 3. Appropriate and replicable method 4. Population and representative sample 5. Validated instrument
  55. Applicability 1. Implications reported in original study 2. Applicability to other populations 3. More information required
  56. ReLIANT – for appraising research on information skills instruction. Focuses on: • Study design • Educational context • Results • Relevance. Koufogiannakis, D., Booth, A., & Brettle, A. (2006). ReLIANT: Reader's Guide to the Literature on Interventions Addressing the Need for Education and Training. Library & Information Research, 30(94), 44-51.
  57. CRiSTAL checklist – for appraising research on user studies. Focuses on: • Study design • Results • Relevance. Developed by Andrew Booth and Anne Brice. Available from:
  58. ACTIVITY 5: Critically appraise a study using the appropriate checklist
  59. Critical appraisal: the shortcut
  61. Ways to apply evidence: 1) The evidence is directly applicable 2) The evidence needs to be locally validated 3) The evidence improves understanding. Reflection.
  63. Enablers of evidence use • Positive organizational dynamics • Ongoing education • Positive personal outlook • Time
  64. ACTIVITY 6: 3 things you will take home and act upon
  65. CONCLUSION (Assess)