Evaluating e reference

Description of a systematic review and evidence based librarianship related to virtual reference services.

Published in: Education
  • Criteria are important to determine at the beginning of the study, to give the work an organized structure. Pre-determined criteria help to minimize bias in the inclusion of articles, and they can be verified by readers who want to confirm that the authors adhered to the selected criteria. There is also a phenomenon called “publication bias”: studies with positive results tend to be published more often than studies with negative results, or such studies end up in less important journals that are not properly indexed in major databases.
  • In our example we knew right away that we wanted to steer clear of articles about implementing or establishing electronic reference services. We also wanted to avoid reviews and book reviews, as they are not original studies. Some articles examined only the demographic parameters of their users (for example, how many female vs. male patrons used IM services, or their ages); we felt this was not important to the main idea of our study, user satisfaction. We also knew that we would not be able to read articles in languages other than English, so those were excluded as well. With the inclusion criteria we tried to come up with clear parameters that helped us identify the initial group of articles to examine. This helped a great deal, in fact.
  • For this part of the process a critical appraisal tool was needed. Different researchers approach this in different ways: some look for existing tools, while others come up with their own questions that better suit their topics.
  • Each of the four sections contains five to eight questions. For example, the population section asks whether the study population is representative of all users, actual and eligible; whether the inclusion/exclusion criteria are clearly outlined; whether the sample size is adequate; and whether the choice of population is free of bias. Answering these questions can be difficult: we spent a lot of time doing it first on our own and then together, discussing the articles over and over.
  • Transcript of "Evaluating e reference"

    1. 1. Evaluating E-Reference: An Evidence Based Approach<br />Elaine Lasda Bergman and Irina I. Holden<br />University at Albany<br />Presentation for Reference Renaissance<br />Denver, CO August 10, 2010<br />
    2. 2. Overview<br />What is Evidence Based Librarianship?<br />Methods <br />What constitutes “evidence?”<br />Systematic reviews and analyses<br />Systematic Review Process<br />Research question<br />Database Search<br />Article Review<br />Critical Appraisal<br />Synthesize, analyze, discuss<br />
    3. 3. Overview<br />Results of our review<br />Methods of determining user satisfaction<br />Comparison of variables<br />Range of results<br />Conclusions, lessons learned<br />About evidence based librarianship<br />About research quality<br />About user satisfaction with electronic reference<br />
    4. 4. What is Evidence Based Librarianship?<br /><ul><li>Booth and Brice’s definition of Evidence Based Information Practice:
    5. 5. “The Retrieval of rigorous and reliable evidence to inform… decision making” </li></ul>(Booth and Brice, ix)<br />
    6. 6. What is Evidence Based Librarianship (EBL)?<br />History <br />Gained traction in medical fields in the 1990s and then spread to the social sciences<br />Medical librarians were the first to bring this approach to LIS research<br />Increasingly used in social sciences and information/library science<br />Source: Booth and Brice, ix.<br />
    7. 7. Don’t we ALREADY use “evidence”?<br />Evidence is “out there, somewhere” <br />Disparate locations: many different journals, many different researchers<br />Evidence is not summarized, readily available and synthesized<br />No formal, systematized, concerted effort to quantify and understand if there is a pattern or just our general sense of things<br />
    8. 8. Hierarchy of “Evidence”<br />Source: http://ebp.lib.uic.edu/applied_health/?q=node/12<br />
    9. 9. Systematic Reviews vs. Literature Reviews<br />
    10. 10. Systematic Reviews: When Are They Useful?<br />Too much information in disparate sources<br />Too little information, hard to find all of the research<br />Help achieve consensus on debatable issues<br />Plan for new research<br />Provide teaching/learning materials<br />
    11. 11. Process of Systematic Review<br /><ul><li>Formulate Research Question
    12. 12. Database Search
    13. 13. Review Results
    14. 14. Critical Appraisal
    15. 15. Analysis</li></ul>Research Questions<br />Research question formulation<br />Description of the parties involved in the studies (librarians and patrons, for example)<br />What was being studied (effectiveness of instructional mode, for example)<br />The outcomes and how they can be compared <br />What data should be collected for this purpose (student surveys, pre/post tests, etc.)<br />
    16. 16. Our Research Questions<br />1. What is the level of satisfaction of patrons who utilize digital reference? <br />2. What are the measures researchers use to quantify user satisfaction and how do they compare?<br />
    17. 17. Database Search<br />LISTA (EBSCO platform): 123 articles retrieved<br />LISA (CSA platform): 209 articles retrieved<br />ERIC: no unique studies retrieved<br />
    18. 18. Working with Results<br />279 Results after de-duplication <br />Only format retrieved: journal articles<br />Abstracts were reviewed applying inclusion and exclusion criteria<br />
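The merge-and-de-duplicate step described above (123 articles from LISTA plus 209 from LISA, yielding 279 unique results) can be sketched in a few lines. This is only an illustration of the idea, not the authors' actual workflow: the record format and the `normalize` helper are hypothetical.

```python
def normalize(title: str) -> str:
    """Lowercase and strip punctuation/whitespace so near-identical titles match."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

def deduplicate(records):
    """Keep the first record seen for each normalized title."""
    seen = set()
    unique = []
    for rec in records:
        key = normalize(rec["title"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# Hypothetical exports from the two databases; one title overlaps.
lista = [{"title": "Evaluating Chat Reference"}, {"title": "IM Reference Use"}]
lisa = [{"title": "Evaluating chat reference."}, {"title": "E-mail Reference Quality"}]
print(len(deduplicate(lista + lisa)))  # prints 3
```

Title normalization catches records that differ only in case or punctuation across database exports; real citation de-duplication would also compare authors and years.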
    19. 19. Inclusion/Exclusion Criteria<br />Should be pre-determined at the beginning of the study<br />Minimizes bias <br />Allows outside verification of why studies were included/excluded<br />
    20. 20. Sample Inclusion/Exclusion Criteria<br /> Inclusion<br />Peer reviewed journals<br />Articles comparing e-reference with face-to-face reference<br />Articles on academic, public and special libraries<br />Articles on e-mail, IM, and “chat” reference<br /> Exclusion<br />Articles describing how to implement digital reference programs<br />Articles discussing quantitative or demographic data only<br />Reviews, editorials and commentary<br />Non-English articles<br />
    21. 21. Working with Results<br />93 articles were selected based on inclusion/exclusion criteria<br />Full text was obtained and read by both authors independently to determine if at least one variable pertaining to user satisfaction was present; then the results were compared<br />
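Since both authors read the full text independently and then compared results, a simple disagreement check can flag the articles that need joint discussion. The decision labels and article ids below are hypothetical, purely to illustrate the comparison step.

```python
def compare_decisions(reviewer_a, reviewer_b):
    """Return ids of articles on which the two reviewers disagree."""
    return sorted(aid for aid in reviewer_a
                  if reviewer_a[aid] != reviewer_b.get(aid))

# Hypothetical independent include/exclude decisions by the two authors.
a = {"art01": "include", "art02": "exclude", "art03": "include"}
b = {"art01": "include", "art02": "include", "art03": "include"}
print(compare_decisions(a, b))  # prints ['art02']
```

Articles the reviewers disagree on go back for discussion until consensus is reached, which is the standard practice in dual-reviewer screening.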
    22. 22. Results of Full Text Review<br />
    23. 23. Critical Appraisal Tools<br />QUOROM (The Lancet, 1999, vol. 354, 1896-1900)<br />Downs-Black scale (“Checklist for study quality”)<br />CriSTAL (Critical Skills Training in Appraisal for Librarians; Andrew Booth)<br />
    24. 24. Glynn’s Critical Appraisal Tool<br />Population<br />Data collection<br />Study design<br />Results<br />
    25. 25. Critical Appraisal Process<br />24 articles were subjected to critical appraisal <br />Each question from Glynn’s tool was answered (either yes, no, unclear or N/A) and the results were calculated<br />12 research papers selected and subjected to the systematic review<br />
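Tallying the answers from Glynn's checklist (yes, no, unclear, or N/A per question) can be sketched as the percentage of "yes" answers among the applicable questions. The example answers are invented, and any pass/fail cut-off is illustrative only, not the review's actual scores.

```python
def appraisal_score(answers):
    """Percent of 'yes' answers, excluding questions marked n/a."""
    applicable = [a for a in answers if a != "n/a"]
    if not applicable:
        return 0.0
    return 100.0 * applicable.count("yes") / len(applicable)

# Hypothetical answers for one article's checklist section.
answers = ["yes", "yes", "no", "unclear", "n/a", "yes"]
print(appraisal_score(answers))  # prints 60.0 (3 'yes' of 5 applicable)
```

Excluding N/A questions from the denominator keeps articles from being penalized for questions that do not apply to their study design.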
    26. 26. Analysis (Findings of Review)<br />Settings and general characteristics:<br />Multiple instruments in a single article<br />9 unique journals<br />US based<br />Methods and timing of data collection<br />7 paper surveys<br />3 pop up surveys<br />3 transcript analysis<br />
    27. 27. Similar Variables in Surveys<br />“Willingness to return”<br />11 surveys of all instruments (Nilsen)<br />Staff person vs service<br />“Have you used it before?”<br />Ranged from 30%-69% (email)<br />Positivity of experience<br />7 point, 4 point, 3 point scales<br />65% - 98.2% (email, small group)<br />14-417 respondents<br />Staff quality<br />7 point, 4 point, 3 point scales<br />68% - 92.8% (14 respondents)<br />
    28. 28. Analysis<br />Other questions in obtrusive studies<br />“Were you satisfied?” <br /> “Would you recommend to a colleague?” <br />each asked in only 1 of the studies<br />
    29. 29. Analysis: <br />Reason for variation:<br />Nature of questions asked is contingent on context in which satisfaction was measured <br />[correlate to guidelines, librarian behaviors, reference interviews, etc.]<br />
    30. 30. Unobtrusive studies: Transcript Analysis<br />2 Basic Methods:<br />Transcript analysis by person asking the question (proxy patron) (Schachaf and Horowitz, 2008, Sugimoto, 2008). <br />75% “complete”, 68% “mostly incomplete”<br />Transcripts independently assessed for quality and coded (Marsteller and Mizzy, 2003, Schachaf and Horowitz, 2008)<br />3 point scale, “+ or –” scale<br />2.24 out of 3 (level of quality); 5 negatives/200 transactions<br />Research question: Efficacy of third party assessors vs. user surveys<br />
    31. 31. Lessons Learned<br />Lessons about user satisfaction with electronic reference:<br />Overall pattern of users being satisfied, regardless of methodology or questions asked<br />Measurement of user satisfaction is contingent upon context<br />Researchers most often try to connect user satisfaction to another variable; satisfaction was the sole focus of only one article<br />
    32. 32. Lessons Learned<br />Lessons about library research<br />The extensive amount of qualitative research makes performing systematic reviews challenging<br />Inconsistency of methodologies used in original research makes the systematic review challenging; meta-analysis is more often than not impossible<br />Common pitfalls in LIS research affect the quality of published articles<br />
    33. 33. Lessons Learned<br />Benefits of undertaking a systematic review:<br />Sharpens literature searching skills: benefits for both librarians and their patrons who need this kind of research<br />Researcher gains the ability to critically appraise research<br />The practice of librarianship is strengthened by basing decisions on a methodological assessment of evidence<br />
    34. 34. Systematic Reviews and EBL: Impact on the Profession<br />Formal gathering and synthesis of evidence may:<br />Affirm our intuitive sense of the patterns in current research<br />Refine, clarify, and build a more robust understanding of a current problem in librarianship<br />On occasion, provide surprising results!<br />
    35. 35. Questions?http://www.slideshare.net/librarian68<br />Elaine M. Lasda Bergman<br />ebergman@uamail.albany.edu<br />Irina I. Holden<br />iholden@uamail.albany.edu<br />