Assessing the impact of evidence summaries in library and information studies
 

Canadian Library Association Annual Conference, June 2, 2012, Ottawa


Comments
  • After attending ABQLA Conference #81 and listening to closing keynote speaker Lorie Kloda, I came to realize that libraries in every institution need to be assessed. My high school library will definitely be assessed in the near future, using some of Lorie's ideas as incentive. She was right when she said that this is the way of the future. Very good presentation. (K. Lukian-Byrnes, library technician and acting librarian at John Rennie High School)

Speaker Notes
  • Overview: study background and purpose; methods and findings; discussion of findings; questions for the audience.
  • Question to the crowd: ask them to think about this throughout the talk; we will raise the question again at the end with some time for discussion. (Some ideas of our own: we publish a journal and spend time getting evidence summaries into it. We hope they are "used", or at least read, and if so, we think they are helpful. We are curious to understand why they are used, how they are used, and what differences they make to readers.)
  • Ask if they are familiar already; we can flash through this and the next slide quickly. Evidence summaries are modeled after synopses found in the medical literature, such as those in ACP Journal Club or InfoPOEMs. There, clinicians can read structured abstracts of clinical research with commentaries, and sometimes a "bottom line" that answers the question: what does this change (or not) in my practice?
  • Develop a tool to assess impact and validate its use with librarians for evidence summaries; determine how and why evidence summaries are used; understand how evidence summaries impact knowledge (cognition), practice, and users.
  • Phase 1: conducted with the editorial team, librarians, and the authors of the original IAM in medicine, late fall and winter 2010-2011. Phase 2: readers of EBLIP and others, invited spring 2011. Phase 3: subset of survey respondents, fall and winter 2011-2012, just completed; this is the first time the findings from the full study are being presented. Critical Incident Technique: the survey asks the respondent to identify one evidence summary to assess impact, and the interview follows up on the same incident, so the impact of only ONE evidence summary is assessed per respondent. Interviews were designed to gather data to corroborate findings from survey respondents, by matching answers to determine whether the tool actually captured "impacts" accurately, as well as to uncover impacts or uses we had not considered.
  • Proposed impacts: we began with a tool from the health sciences, used to assess the impact of summaries delivered on a handheld device for GPs, and modified it in consultation with librarians.
  • McGill researchers in Family Medicine developed the original IAM tool, refined for different populations; this version is for Canadian physicians to rate a resource known as e-Therapeutics+. (Also mention the JASIST article.) We came up with three areas of impact, with multiple items (specific impacts) in each: the librarian's knowledge (cognition), the librarian's practice, and the user community.
  • Using the tool developed in Phase 1, we invited readers of the EBLIP journal to participate. Those who expressed interest were invited to complete the survey in early spring 2011 (i.e., a little over a year ago).
  • Recruitment ran March-April 2011 (and continued after that, with 7 more recruits). [Survey invitations sent April 19, 2011 (EST); first reminder sent May 9, 2011; second reminder sent May 31, 2011; data collection ended June 3, 2011; data collection took place over 1.5 months.] Response rate: 56% (86 completed usable surveys out of 153 potential participants, self-selected). Of the 86, 62 had read an evidence summary. The response-rate arithmetic is shown below.
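The reported response rate follows directly from the counts above; as a worked check, using only numbers stated in these notes:

$$\text{response rate} = \frac{86\ \text{completed usable surveys}}{153\ \text{potential participants}} \approx 0.562 \approx 56\%$$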
  • Quick overview of the countries represented. Closely parallels the readership of the journal.
  • The majority of respondents read between 1 and 10 or so a year. Some read many more; some of these are our copyeditors.
  • 25 evidence summaries were identified. [9 respondents did not remember or name an article, but still completed the survey; 2 named articles other than evidence summaries.] While not all are about health, all are relevant to health sciences librarianship and have broad appeal across settings.
  • Other: research, teaching, peer review for EBLIP, new position responsibilities
  • The majority had read the ES within the last 3 months
  • Other: share with others/colleagues; confirmed no/need to read original study; no impact because not relevant (title misleading); confirmatory
  • Other: impacted research method; impacted readers' advisory service; general knowledge. So the "other" responses mostly fell into the existing categories, with one new possible practice change: research approach (which isn't exactly "practice", since it is a separate role for a librarian).
  • Out of 62 respondents, 41 added comments. Anecdotal changes: sensed that the institution saved money; a professor reported students did better on a course assignment as a result of the librarian's intervention. No formal assessment was reported.
  • We then invited a subset of respondents to talk more about the evidence summary. Thus far, we have interviewed 13 participants and thematically analyzed the interview transcripts. We discussed the same evidence summary as the one used for answering the survey. In some cases there was a large time lapse since they had read it, which had advantages and disadvantages: on the one hand, difficulty with recall; on the other, more time for impacts to be perceived.
  • Interview participants (n=13):
    Setting: academic 9; health/hospital (not academic) 1; teaching faculty (LIS or health) 3
    Education: post-graduate diploma (after college) 1; MLIS or MA (or more than one) 9; PhD 3
    Country: Canada 6; United States 4; United Kingdom 1; Australia 1; Hong Kong 1
    Position: support staff 1; librarian (non-management) 6; manager 3; professor 3
    2 participants were previously evidence summary writers; 1 was a copyeditor (and others were peer reviewers). 13 of the 62 survey respondents who had read evidence summaries were interviewed (21%); 13 of 86 if you count all survey respondents, including those who did not read evidence summaries (15%).
  • Not sure what to do with this table… use the final column for our "revised" impacts (not including newly suggested possible impacts/uses)? Or just include a check mark in the 2nd and 3rd columns to indicate these are confirmed? We could have a check mark; an ✗ for a badly defined impact; a 0 for a never-recorded impact that we still want to keep because it is possible; and a ? for those we are still unsure of.
  • In the interviews, participants also discussed what they liked/disliked about the evidence summaries in general, and what other potential impacts or uses they perceived these could have on knowledge, practice, and users. Discovery was a theme that emerged, describing specific areas of learning. There was also the concept of using information to influence one's own or others' decision-making; not sure where this falls. Two uses: sharing, and decision-making.
  • Other advantages or themes that emerged from analysis of the interviews: participants liked the format compared to the original articles.
  • Similar to Grad, Pluye, et al.'s findings with respect to positive cognitive impacts. The low "community impact" could be due to a misunderstanding of the question (the word "community" could be interpreted as a change in workplace/colleagues), but likely also due to: a lack of measures to assess impact in library and information settings; environmental barriers to assessing (an EBP culture is not always present); and the delay in time to assess impact (unlike patient care, where it is one-on-one and more immediate). LIS is new to the concept of assessing end users; it will take time for the culture to change and for these results to become observable/measurable.
  • Elicit more detailed answers (to corroborate survey findings) and determine why some impacts were not selected.
  • (Also thank McGill for the support to conduct this research while on sabbatical).

Presentation Transcript

  • Lorie Kloda, MLIS, PhD(c), McGill University; Denise Koufogiannakis, MA, MLIS, PhD(c), University of Alberta; Alison Brettle, MLIS, PhD, University of Salford. Canadian Library Association Annual Conference, Ottawa, 2012.
  • Structured abstract: objective, design, setting, subjects, method, main results, conclusion. Commentary: 300-400 words; appraisal of validity, reliability, applicability, significance, and implications for practice.
  • To investigate the impact of evidence summaries on library and information professionals and their practice: knowledge, practice, users.
  • Phase 1: development and face-validation of the tool. Phase 2: survey questionnaire to readers (QUANT). Phase 3: interviews (QUAL).
  • Development of a tool to assess impact
  • Findings (Phase 1): development of the Impact Assessment Method. Grad, R., Pluye, P., & Beauchamp, M.-E. (2007). Validation of a method to assess the clinical impact of electronic knowledge resources. e-Service Journal, 5(2), 113-135. http://iam2009.pbworks.com/
  • Survey questionnaire to readers of evidence summaries
  • Findings (Phase 2), survey recruitment flow (arithmetic check below):
    175 respondents recruited; 21 duplicates; 1 unusable → 153 remaining
    153 emailed the IAM survey; 3 bounced emails; 49 no reply → 101 total respondents
    101 total respondents; 15 incomplete → 86 completed the IAM survey, of whom n=62 had read an evidence summary
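As a consistency check on the flow above: the grouping of exclusions at each step is our reading of the flattened diagram, but every figure appears on the slide, and they reconcile at each stage:

$$175 - 21 - 1 = 153, \qquad 153 - 3 - 49 = 101, \qquad 101 - 15 = 86$$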
  • Findings (Phase 2): respondents by country. [Chart: Canada, USA, UK, Australia, Ireland, Finland, Malaysia, Saudi Arabia, Iran, Puerto Rico, Spain, Brazil, Hong Kong.]
  • Findings (Phase 2): [Bar chart: number of evidence summaries read in the past year (0 to 50+) against number of respondents (0 to 10).]
  • Evidence summaries most frequently identified by respondents:
    Decline in Reference Transactions with Few Questions Referred to Librarian when the Reference Desk is Staffed by a Paraprofessional (8)
    The Presence of Web 2.0 Applications Is Associated with the Overall Service Quality of Library Websites (6)
    Google Scholar Out-Performs Many Subscription Databases when Keyword Searching (4)
    Statistical Measures Alone Cannot Determine Which Database (BNI, CINAHL, MEDLINE, or EMBASE) Is the Most Useful for Searching Undergraduate Nursing Topic (4)
    A Graduate Degree in Library or Information Science Is Required, but not Sufficient, to Enter the Profession (3)
  • Findings (Phase 2): reason for reading the evidence summary (n=62):
    For general interest or curiosity: 15 (24%)
    For personal continuing professional education: 18 (29%)
    To answer a specific question or address a specific issue in my practice: 21 (34%)
    Other: 8 (13%)
  • Findings (Phase 2): [Bar chart: when the evidence summary was read (today; 1-7 days; 1 week-1 month; 1-3 months; 3-6 months; 6-12 months; more than 1 year) against number of respondents (0 to 25).]
  • Findings (Phase 2): cognitive impact (percentage calculation sketched below):
    My practice was (will be) improved: 11 (13%)
    I learned something new: 36 (42%)
    I recalled something I already knew: 14 (16%)
    It prompted me to investigate more: 23 (27%)
    It confirmed I did (I am doing) the right thing: 17 (20%)
    I was reassured: 13 (15%)
    I was dissatisfied: there is a problem with the presentation of this evidence summary: 1 (1%)
    I was dissatisfied: I disagree with the content of this evidence summary: 0 (0%)
    It is potentially harmful: 0 (0%)
    Other: 9 (10%)
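The percentage column here (and in the practice impact table that follows) appears to be computed against all 86 completed surveys rather than the 62 evidence summary readers: 36/86 rounds to the reported 42%, while 36/62 does not. A minimal sketch of that tally in Python; the frequencies come from the slide, but the variable names and the choice of denominator are our assumptions:

    # Hypothetical reconstruction of the percentage column in the
    # Phase 2 cognitive impact table. Respondents could select more
    # than one item, so frequencies need not sum to the sample size.
    cognitive_impact = {
        "My practice was (will be) improved": 11,
        "I learned something new": 36,
        "I recalled something I already knew": 14,
        "It prompted me to investigate more": 23,
        "It confirmed I did (am doing) the right thing": 17,
        "I was reassured": 13,
        "Dissatisfied: presentation": 1,
        "Dissatisfied: content": 0,
        "Potentially harmful": 0,
        "Other": 9,
    }

    N_COMPLETED = 86  # completed usable surveys (assumed denominator)

    for item, freq in cognitive_impact.items():
        # .0% rounds to the nearest whole percent, matching the slide
        print(f"{item}: {freq} ({freq / N_COMPLETED:.0%})")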
  • Findings (Phase 2): practice impact. "You reported: My practice was (will be) improved. What did you (will you) do differently after reading the Evidence Summary?"
    Change my service approach: 5 (6%)
    Change my approach to collections: 1 (1%)
    Change my management approach: 4 (5%)
    Change my approach to teaching: 4 (5%)
    Change my professional approach: 4 (5%)
    Other: 3 (3%)
  • Findings (Phase 2): community impact. "If reading this Evidence Summary resulted in some change to your individual practice, do you think it led to an impact on anyone within the community you serve or environment in which you work? Please explain in the comment box."
    None
    Hypothesized future/potential impacts on users
    Reinforced cognitive or practice impacts, not user outcomes
    5 reported actual impact at this level: a change in teaching LIS students; observed (anecdotal) changes
  • Interviews with subsample of survey respondents
  • Findings (Phase 3): 13 participants, a subsample of survey respondents, from various settings, education levels, countries, and professional positions.
  • Proposed impacts: suggested (Phases 1 and 2) vs. confirmed (Phase 3). Marks as in the speaker notes: a check for a supported impact, ✗ for a badly defined one, 0 for an impact never recorded, ? for unsure:
    C1 Practice was improved: ? / ✗
    C2 Learned something new: ✔ / ✔
    C3 Recalled something: ✔ / ✓
    C4 Prompted to investigate: ✔ / ✔
    C5 Confirmed: ✔ / ✓
    C6 Reassured: ✔ / ?
    C7 Dissatisfied (presentation): ? / 0
    C8 Dissatisfied (content): 0 / 0
    C9 Potentially harmful: 0 / 0
    P1 Service: ✔ / ✔
    P2 Collections: ✔ / ✓
    P3 Management: ✔ / ✓
    P4 Teaching: ✔ / ✓
    P5 Professional practice: ✔ / ✓
    U User community: ? / ?
  • Findings (Phase 3): potential impacts uncovered:
    Discovery: new research; interesting topics; methods; keeping current
    Sharing: with colleagues, managers; report writing; recommended reading
    Assistance: research; writing, presentations; teaching (for professors)
  • Findings (Phase 3): "likes":
    Format of evidence summaries: descriptive; accuracy of abstract; concise
    Time saving
    Database of evidence summaries
  • One evidence summary assessed per respondent. Cognitive impact comparable to findings in Grad, Pluye, et al. (2006). Practice impact: two-tiered. Low community impact.
  • How best to assess the impact of reading about research? Do the items in the tool resonate with you? How important is documenting the impact of research on practice?
  • Revise the Impact Assessment Tool (include new or revised impact items; improve the wording of items); conduct a survey of a larger sample of evidence summary readers; adapt the tool for other forms of research dissemination.
  • Canadian Association of Research Libraries, Research in Librarianship Grant; Roland Grad and Pierre Pluye, McGill University, for their feedback; all of our survey respondents and interview participants.
  • denise.koufogiannakis@ualberta.ca; lorie.kloda@mcgill.ca; a.brettle@salford.ac.uk; @eblip