Evidence-based Librarianship for All

Slides from a workshop held at the Ontario Library Association Super Conference in Toronto, January 29, 2014

Transcript

  • 1. EBLIP for all: Using an evidence based approach in your library. Lorie Kloda, McGill University
  • 2. Introductions
    1. Your name
    2. Your city, institution
  • 3. Course objectives
    • introduce the process of EBLIP
    • demonstrate tools and strategies for applying evidence in practice in the real world
    • participants will:
      o learn strategies to incorporate different types of evidence into their decision making
      o have opportunities to work through their own practice questions and determine an approach to take back to their workplace
  • 4. What we will cover today
    9:00  Introductions and overview
    9:15  The EBLIP process
    9:30  Formulating questions
    10:15 Break
    10:30 Sources of evidence
    11:00 Critical appraisal
    11:40 Applying evidence in practice
    11:50 Wrap up
  • 5. Activity 1: what are your "burning" questions?
  • 6. The EBLIP Process
  • 7. What is EBLIP? "an approach to information science that promotes the collection, interpretation and integration of valid, important and applicable user-reported, librarian observed, and research-derived evidence. The best available evidence, moderated by user needs and preferences, is applied to improve the quality of professional judgements" (Booth, 2000)
  • 8. Why should you care? Wisdom means acting with knowledge while doubting what you know. (Jeffrey Pfeffer and Robert I. Sutton)
  • 9. A brief history
    1997: Hypothesis article by Jon Eldredge
    2000: MLA Research Section created an Evidence-Based Librarianship Implementation Committee
    2000: Eldredge publishes papers that provide the framework for EBL
    2001: First Evidence Based Librarianship conference held in Sheffield, UK
    2004: Booth and Brice book on EBIP
    2006: EBLIP journal launches
  • 10. The 5 A's of EBLIP
    1) Formulate a focused question (Ask)
    2) Find the best evidence to help answer that question (Acquire)
    3) Critically appraise what you have found to ensure the quality of the evidence (Appraise)
    4) Apply what you have learned to your practice (Apply)
    5) Evaluate your performance (Assess)
  • 11. 5 A's process (Hayward, 2007, http://www.cche.net/info.asp)
  • 12. Is the EBLIP model used?
    • The ideal vs reality
    • Criticisms of EBLIP
    • Barriers to practicing in an evidence based manner
  • 13. Barriers to evidence use
    • Organizational dynamics
    • Lack of time/competing demands on time
    • Personal outlook / lack of confidence
    • Education and training gaps
    • Information needs not being met
    • Financial limits
  • 14. Determinants by level of control
  • 15. Formulating an answerable question (Ask)
  • 16. Questions drive the entire EBL process. […] The wording and content of the questions will determine what kinds of research designs are needed to secure answers. (J. Eldredge, 2000)
  • 17. Burning question example: What do university professors think of information literacy instruction?
  • 18. SPICE question structure
    Setting: the context (e.g., university library, academic health center, K-12 school)
    Perspective: the stakeholder(s) (e.g., graduate students, managers, reference librarians, parents, teachers)
    Intervention: the service being offered (e.g., chat reference, RefWorks workshops, discovery layer)
    Comparison: the service to which it is being compared (optional)
    Evaluation: the measure used to determine change/success/impact (e.g., usage statistics, course grade)
  • 19. SPICE example
    Setting: Research university
    Perspective: Librarians; Professors
    Intervention: Survey questionnaire to determine attitudes, perceptions, experiences
    Comparison: Not applicable
    Evaluation: Ratings of information literacy competencies; Inclusion of IL in courses; Disciplinary differences
  • 20. Librarianship domains
    Reference/Enquiries: providing service and access to information that meets the needs of library users.
    Education: incorporating teaching methods and strategies to educate users about library resources and how to improve research skills.
    LIS Education subset: specifically pertaining to the professional education of librarians.
    Collections: building a high-quality collection of print and electronic materials that is useful, cost-effective and meets the users' needs.
    Management: managing people and resources within an organization. This includes marketing and promotion as well as human resources.
    Information access and retrieval: creating better systems and methods for information retrieval and access.
    Professional issues: exploring issues that affect librarians as a profession.
    (Koufogiannakis, Crumley, and Slater, 2004)
  • 21. Librarianship domains
    • Information access & retrieval
    • Collections
    • Management
    • Education
    • Reference
    • Professional issues
    • [Scholarly communications]
  • 22. Activity 3: formulate your burning question using SPICE
  • 23. What counts as evidence?
  • 24. Definition: "the available body of facts or information indicating whether a belief or proposition is true or valid" (Oxford English Dictionary, 2011).
  • 25. Activity 4: What are some possible evidence sources we use to make decisions in libraries?
  • 26. Evidence sources
    Hard evidence: published literature, statistics, local research and evaluation, other documents, facts
    Soft evidence: input from colleagues, tacit knowledge, feedback from users, anecdotal evidence
  • 27. Sources for locating and creating evidence (Acquire)
  • 28. Locating: published research
    • Databases
    • Books, bibliographies
    • Mail lists, blogs, word of mouth
    • Conferences
    • Systematic reviews, evidence summaries
  • 29. Creating: local evidence
    • Usage data
    • Transaction data
    • Evaluation results
    • Survey, interview, focus group findings
    • Inputs, outputs, outcomes, impact
  • 30. Locating published evidence: databases
    • Library and information studies
    • Management
    • Education
    • Social sciences
    • Health sciences, psychology
  • 31. http://libvalue.cci.utk.edu/
    http://www.informedlibrarian.com/
    http://eprints.rclis.org/
  • 32. Locating published evidence: conferences
    • EBLIP (1-7)
    • Assessment, e.g., Northumbria Conference, Library Assessment Conference
    • Academic, e.g., ACRL
    • Information literacy, e.g., LOEX, WILU, LILAC
    • LIS research, e.g., ASIS&T, ALISE, CAIS, IIiX, ISIC
    • Subject librarianship (health, music)
  • 33. Locating published evidence: systematic reviews
    http://lis-systematic-reviews.wikispaces.com
  • 34. Locating published evidence: evidence summaries
    http://ejournals.library.ualberta.ca/index.php/EBLIP
    Evidence Based Library and Information Practice journal, 2006- ; >250 evidence summaries
  • 35. Creating evidence: data and findings
    • Usage data
    • Transaction data
    • Evaluation results
    • Survey, interview, focus group findings
  • 36. Creating evidence: sources for local evidence already available
    • Library assessment department
    • University planning and institutional analysis
    • Annual reports
    • Internal reports
    • "Stats"
  • 37. Creating evidence: Dudden, R. F. (2007). Using benchmarking, needs assessment, quality improvement, outcome measurement, and library standards. New York: Neal-Schuman.
  • 38. Evidence for example
    Locating evidence:
    • Databases: LISA
    • Systematic Review Wiki
    • Journals: Communications in IL, J of IL, J of Academic Librarianship
    • Conferences: LILAC, LOEX, WILU
    • EBLIP Evidence Summary
    Creating evidence:
    • survey questionnaire
  • 39. Activity 6
    1. identify 2-3 sources for locating evidence to answer your question
    2. consider 1 potential source of local evidence to look into
  • 40. Critical appraisal (Appraise)
  • 41. Critical appraisal: weigh up the evidence
    • Reliable
    • Valid
    • Applicable
    Checklists help with the critical appraisal process. Language is different for interpretive (qualitative) research.
  • 42. Reliability
    1. Results clearly explained
    2. Response rate
    3. Useful analysis
    4. Appropriate analysis
    5. Results address research question(s)
    6. Limitations
    7. Conclusions based on actual results
  • 43. Validity
    1. Focused issue/question
    2. Conflict of interest
    3. Appropriate and replicable method
    4. Population and representative sample
    5. Validated instrument
  • 44. Applicability
    1. Implications reported in original study
    2. Applicability to other populations
    3. More information required
  • 45. CRiSTAL Checklist: for appraising research on user studies. Focuses on:
    • Study design
    • Results
    • Relevance
    Developed by Andrew Booth and Anne Brice. Available from: http://nettingtheevidence.pbworks.com/w/page/11403006/Critical%20Appraisal%20Checklists
  • 46. Activity 7: critically appraise the Bury study using the CRiSTAL checklist
  • 47. Critical appraisal: the shortcut
  • 48. Other elements to consider when following the EBLIP process
  • 49. Other considerations
    • individual vs group decision making
    • influences / biases
    • impact of work environment
  • 50. Widening the model. A revised process:
    1. Articulate: come to an understanding of the problem and articulate it.
    2. Assemble: assemble evidence from multiple sources that are most appropriate to the problem at hand.
    3. Assess: place the evidence against all components of the wider overarching problem. Assess the evidence for its quantity and quality.
    4. Agree: determine the best way forward and if working with a group, try to achieve consensus based on the evidence and organisational goals.
    5. Adapt: revisit goals and needs. Reflect on the success of the implementation.
  • 51. Bringing the components together
  • 52. Questions to ask yourself
  • 53. Applying evidence in practice (Apply)
  • 54. Ways to apply evidence
    1) The evidence is directly applicable
    2) The evidence needs to be locally validated
    3) The evidence improves understanding
    Reflection
  • 55. Activity 8: Reflection on applying evidence to your practice question
  • 56. Wrap up (Assess)
  • 57. Activity 9: 3 things you will take home and act upon