Assessing the impact of evidence summaries in library and information studies: A mixed methods approach
Medical Library Association Annual Meeting, May 2012, Seattle


Lorie Kloda, McGill University
Denise Koufogiannakis, University of Alberta
Alison Brettle, University of Salford

Usage Rights

CC Attribution-NonCommercial-ShareAlike License

  • Evidence Based Library and Information Practice, or EBLIP for short, is a peer-reviewed, open access journal publishing original research, reviews, commentaries, and evidence summaries, among other publication types. It covers all aspects of LIS, with a significant proportion of content related to health librarianship.
  • Ask if the audience is familiar already; can move through this and the next slide quickly. Evidence summaries are modeled after synopses found in the medical literature, such as those in ACP Journal Club or InfoPOEMs. There, clinicians can read structured abstracts of clinical research with commentaries, and sometimes a “bottom line” that answers the question: what does this change (or not) in my practice?
  • Objectives: develop a tool to assess impact and validate its use with librarians for evidence summaries; determine how and why evidence summaries are used; and understand how evidence summaries impact knowledge (cognition), practice, and users.
  • Phase 1 was conducted with the editorial team, librarians, and authors of the original IAM in medicine in late fall and winter 2010-2011. Phase 2 invited readers of EBLIP and others in spring 2011. Phase 3 involved a subset of survey respondents in fall and winter 2011-2012 and was just completed; this is the first time the findings from the full study are being presented. Critical Incident Technique: the survey asks the respondent to identify one evidence summary (ES) to assess impact, and the interview follows up on the same incident, so the impact of only ONE ES is assessed per respondent. Interviews were designed to gather data to corroborate findings from survey respondents, by matching answers to determine whether the tool actually captured “impacts” accurately, as well as to uncover impacts or uses we had not considered.
  • Proposed impacts: we began with a tool from the health sciences, used to assess the impact of summaries delivered on a handheld device for GPs, and modified it in consultation with librarians.
  • The original IAM tool was developed by McGill researchers in Family Medicine and has been refined for different populations; this version is for Canadian physicians to rate a resource known as e-Therapeutics+. Also mention the JASIST article. We came up with three areas of impact, with multiple items (specific impacts) in each: the librarian's knowledge (cognition), the librarian's practice, and the user community.
  • Using the tool developed in Phase 1, we invited readers of the EBLIP journal to participate. Those who expressed interest were invited to complete the survey in early spring of 2011 (i.e., a little over a year ago).
  • Recruitment ran from March to April 2011 (and continued after that, with 7 more recruits). [Survey invitations were sent April 19, 2011 (EST); the first reminder was sent May 9, 2011; the second reminder was sent May 31, 2011; data collection ended June 3, 2011; data collection took place over 1.5 months.] Response rate: 56% (86 completed, usable surveys out of 153 potential, self-selected participants; the arithmetic is sketched at the end of these notes). Of the 86, 62 had read an ES.
  • Quick overview of the countries represented; this closely parallels the readership of the journal.
  • The majority of respondents read between 1 and 10 or so a year. Some read many more; these include some of our copyeditors.
  • 25 evidence summaries were identified. [9 respondents did not remember or name an article but still completed the survey; 2 named articles other than ESs.] While not all are about health, all are relevant to health sciences librarianship and have broad appeal across settings.
  • Other: research, teaching, peer review for EBLIP, new position responsibilities
  • The majority had read the ES within the last 3 months
  • Other: shared with others/colleagues; confirmed no need (or a need) to read the original study; no impact because not relevant (title misleading); confirmatory.
  • Other: impacted research method; impacted readers' advisory service; general knowledge. So the “other” responses mostly fell into the existing categories, with one new possible practice change: research approach (which isn't exactly “practice,” since it is a separate role for a librarian).
  • Out of 62 respondents, 41 added comments. Anecdotal changes: a sense that the institution saved money; a professor reported students did better on a course assignment as a result of a librarian's intervention. No formal assessment was reported.
  • We then invited a subset of respondents to talk more about the evidence summary. Thus far, we have interviewed 13 participants and thematically analyzed the interview transcripts. We discussed the same ES as the one used for answering the survey. In some cases there was a large time lapse since they had read the ES, which had advantages and disadvantages: on the one hand, difficulty with recall; on the other hand, more time for impacts to be perceived.
  • Setting: academic, 9; health/hospital (not academic), 1; teaching faculty (LIS or health), 3. Education: post-graduate diploma (after college), 1; MLIS or MA (or more than one), 9; PhD, 3. Country: Canada, 6; United States, 4; United Kingdom, 1; Australia, 1; Hong Kong, 1. Position: support staff, 1; librarian (non-management), 6; manager, 3; professor, 3. Two participants were previously evidence summary writers; one was a copyeditor (and others were peer reviewers). 13 of the 62 survey respondents who had read ESs were interviewed (21%), or 15% (13/86) if you count all survey respondents, including those who did not read ESs.
  • Not sure what to do with this table. Use the final column for our “revised” impacts (not including newly suggested possible impacts/uses)? Or just include a check mark in the 2nd and 3rd columns to indicate these are confirmed? We could use a check mark; an X for a badly defined impact; a 0 for a never-recorded impact that we still want to keep because it is possible; and a ? for those we are still unsure of.
  • In the interviews, participants also discussed what they liked/disliked about the ESs in general, and what other potential impacts or uses they perceived these could have on knowledge, practice, and users. Discovery was a theme that emerged describing specific areas of learning. There was also the concept of using information to influence one's own or others' decision-making; not sure where this falls. Two uses: sharing, and decision-making.
  • Other advantages or themes emerged from analysis of the interviews; participants liked the format compared to the original articles.
  • Similar to Grad, Pluye, et al.'s findings with respect to positive cognitive impacts. The low “community impact” could be due to a misunderstanding of the question (the word “community” could be interpreted as a change in workplace/colleagues), but is also likely due to: a lack of measures to assess impact in library and information settings; environmental barriers to assessing (an EBP culture is not always present); and the delay in time to assess impact (unlike patient care, where it is one-on-one and more immediate). LIS is new to the concept of assessing end users; it will take time for the culture to change and for these results to become observable and measurable.
  • The tool requires more investigation to improve validity, but results are promising, especially for cognitive and practice impact. Limit it to LIS practitioners and exclude LIS faculty from the valid population, for now. Cognitive impacts were overwhelmingly positive (new or reassuring information). Practice impacts occurred in different areas; less frequent than cognitive impacts, but still a perception of positive impact. Also: we are not capturing the impact of ESs on instructors in LIS, i.e., faculty members who are teaching. Not sure if the tool should be expanded to do that, or should be limited to the population of “practitioners” within the traditional librarian/information specialist role. There appear to be two types of practice impact: direct impact on the individual's practice, and impact on colleagues, workplace, or setting. We need to investigate whether this is different from impact on “community,” which we intended to mean users or non-users, e.g., students, faculty, patrons (the equivalent of “patients” in the original IAM tool).
  • Elicit more detailed answers (to corroborate survey findings) and determine why some impacts were not selected.
  • (Also thank McGill for the support to conduct this research while on sabbatical).
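  • As a quick check on the figures quoted in these notes (the counts are taken from the notes above: 86 usable surveys from 153 invitees, and 13 interviewees out of the 62 ES readers and 86 total respondents; nothing new is assumed), the response-rate and subsample percentages work out as follows:

    \[
    \frac{86}{153} \approx 0.562 \approx 56\%, \qquad
    \frac{13}{62} \approx 0.210 \approx 21\%, \qquad
    \frac{13}{86} \approx 0.151 \approx 15\%
    \]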

Presentation Transcript

  • Lorie Kloda, MLIS, PhD(c), AHIP, McGill University
    Denise Koufogiannakis, MA, MLIS, PhD(c), University of Alberta
    Alison Brettle, MLIS, PhD, University of Salford
    Medical Library Association Annual Meeting, May 2012, Seattle
  • Gap between research and practice in library and information studies (LIS) (Booth, 2003; Crowley, 2005; Genoni, Haddow, & Ritchie, 2004; Turner, 2002). The only method likely to improve communication is the “inclusion of research reports in (…) publications frequently read by practitioners” (Haddow & Klobas, 2004). Evidence Based Library and Information Practice journal, 2006-present: more than 200 evidence summaries.
  • Structured abstract: objective – design – setting – subjects – method – main results – conclusion. Commentary (300-400 words): appraisal of validity, reliability, applicability; significance and implications for practice.
  • To investigate the impact of evidence summaries on library and information professionals and their practice: knowledge, practice, users.
  • Phase 1: development and face-validation of the tool. Phase 2: survey questionnaire to readers (QUANT). Phase 3: interviews (QUAL).
  • Development of a tool to assess impact
  • Findings (Phase 1): development of the Impact Assessment Method (IAM). Grad, R., Pluye, P., & Beauchamp, M.-E. (2007). Validation of a method to assess the clinical impact of electronic knowledge resources. e-Service Journal, 5(2), 113-135. http://iam2009.pbworks.com/
  • Survey questionnaire to readers of evidence summaries
  • Findings (Phase 2). Survey flow: 175 respondents recruited; minus 21 duplicates and 1 unusable, 153 remained and were emailed the IAM survey; minus 3 bounced emails and 49 with no reply, 101 total respondents; minus 15 incomplete, 86 completed the IAM survey; of these, n = 62 had read an evidence summary.
  • Findings (Phase 2). [Chart: country of respondents] Canada, USA, UK, Australia, Ireland, Finland, Malaysia, Saudi Arabia, Iran, Puerto Rico, Spain, Brazil, Hong Kong.
  • Findings (Phase 2). [Chart: number of evidence summaries read in the past year (0 to 15, 25+, 50+) against number of respondents (0-10).]
  • Decline in Reference Transactions with Few Questions Referred to Librarian when the Reference Desk is Staffed by a Paraprofessional (8)
    The Presence of Web 2.0 Applications Is Associated with the Overall Service Quality of Library Websites (6)
    Google Scholar Out-Performs Many Subscription Databases when Keyword Searching (4)
    Statistical Measures Alone Cannot Determine Which Database (BNI, CINAHL, MEDLINE, or EMBASE) Is the Most Useful for Searching Undergraduate Nursing Topics (4)
    A Graduate Degree in Library or Information Science Is Required, but not Sufficient, to Enter the Profession (3)
  • Findings (Phase 2). Reason for reading the evidence summary (n=62): for general interest or curiosity, 15 (24%); for personal continuing professional education, 18 (29%); to answer a specific question or address a specific issue in my practice, 21 (34%); other, 8 (13%).
  • Findings (Phase 2). [Chart: when the evidence summary was read: today; 1-7 days; 1 week-1 month; 1-3 months; 3-6 months; 6-12 months; >1 year (number of respondents, 0-25).]
  • Findings (Phase 2). Cognitive impact (freq., %): my practice was (will be) improved, 11 (13%); I learned something new, 36 (42%); I recalled something I already knew, 14 (16%); it prompted me to investigate more, 23 (27%); it confirmed I did (am doing) the right thing, 17 (20%); I was reassured, 13 (15%); I was dissatisfied: there is a problem with the presentation of this evidence summary, 1 (1%); I was dissatisfied: I disagree with the content of this evidence summary, 0 (0%); it is potentially harmful, 0 (0%); other, 9 (10%).
  • Findings (Phase 2). Cognitive impact: presentation problem. “You reported: I was dissatisfied; there was a problem with the presentation of this Evidence Summary. Which of the following problems did you encounter?” Too much information, 0 (0%); not enough information, 1 (1%); information is poorly written, 0 (0%); information is too technical, 0 (0%); other, 0 (0%).
  • Findings (Phase 2). Cognitive impact: disagree with content. “You reported: I was dissatisfied; I disagree with the content of this Evidence Summary. Which of the following content elements did you disagree with?” The structured abstract did not adequately explain the original study, 0 (0%); the writer of the evidence summary presented incorrect information, 0 (0%); the commentary was overly negative, 0 (0%); the commentary was not critical enough, 0 (0%); the writer of the evidence summary did not place this study in context, 0 (0%); other, 0 (0%).
  • Findings (Phase 2). Practice impact. “You reported: my practice was (will be) improved. What did you (will you) do differently after reading the Evidence Summary?” Change my service approach, 5 (6%); change my approach to collections, 1 (1%); change my management approach, 4 (5%); change my approach to teaching, 4 (5%); change my professional approach, 4 (5%); other, 3 (3%).
  • Findings (Phase 2). Community impact. “If reading this Evidence Summary resulted in some change to your individual practice, do you think it led to an impact on anyone within the community you serve or environment in which you work? Please explain in the comment box.” Responses: none; hypothesized future/potential impacts on users; reinforced cognitive or practice impacts, not user outcomes. 5 reported actual impact at this level: a change in teaching LIS students, and observed (anecdotal) changes.
  • Interviews with subsample of survey respondents
  • Findings (Phase 3). 13 participants, a subsample of survey respondents, from various settings, education levels, countries, and professional positions.
  • Proposed impacts (Phase 1)     Suggested (Phase 2)  Confirmed (Phase 3)
    C1 Practice was improved       ?                    ✗
    C2 Learned something new       ✔                    ✔
    C3 Recalled something          ✔                    ✓
    C4 Prompted to investigate     ✔                    ✔
    C5 Confirmed                   ✔                    ✓
    C6 Reassured                   ✔                    ?
    C7 Dissatisfied – presentation ?                    0
    C8 Dissatisfied – content      0                    0
    C9 Potentially harmful         0                    0
    P1 Service                     ✔                    ✔
    P2 Collections                 ✔                    ✓
    P3 Management                  ✔                    ✓
    P4 Teaching                    ✔                    ✓
    P5 Professional practice       ✔                    ✓
    U  User community              ?                    ?
  • Findings (Phase 3). Potential impacts uncovered:
    Discovery: new research; interesting topics; methods; keeping current
    Sharing: with colleagues, managers; report writing; recommended reading
    Assistance: research; writing, presentations; teaching (for professors)
  • Findings (Phase 3). “Likes”: format of evidence summaries (descriptive; accuracy of abstract; concise); time saving; database of evidence summaries.
  • One evidence summary assessed per respondent. Cognitive impact comparable to findings in Grad, Pluye, et al. (2006). Practice impact: two-tiered. Low community impact.
  • Tool validation. Cognitive impact. Practice impact: individual practice; workplace practice. Difficult to assess impact on community/users.
  • Revise the Impact Assessment Tool: include new or revised impact items; improve the wording of items. Conduct a survey of a larger sample of evidence summary readers. Adapt the tool for other forms of research dissemination.
  • Canadian Association of Research Libraries, Research in Librarianship Grant. Roland Grad and Pierre Pluye, McGill University, for their feedback. All of our survey respondents and interview participants.
  • lorie.kloda@mcgill.ca; @loriekloda; @eblip; slideshare.net/lkloda