
Mining Virtual Reference Data for an Iterative Assessment Cycle


My presentation from ALA Annual 2011 at the 17th Annual Reference Research Forum.



  1. Mining Virtual Reference Data for an Iterative Assessment Cycle
     Amanda Clay Powers (@amandaclay)
     Virtual Reference Project Manager
     Ag & Life Sciences Librarian / Assistant Professor
     Mississippi State University Libraries
     17th Annual Reference Research Forum, June 26, 2011
  2. Chat at the MSU Libraries
     - Managed from within Reference
     - Requires Systems/Reference work
     - Ongoing transcript review in place
  3. Assessment on the Mind
     - ACRL, Value of Academic Libraries: A Comprehensive Research Review and Report (2010)
     - SACS accreditation review
     - MSU Libraries Measurement & Evaluation Committee formed
  4. Chronology
     - March 2010: Web site redesign begins
     - April 2010: Transcript analysis proposed
     - May 2010: Analysis delivered
     - August 2010: Web site and Discovery launched
  5. (image slide)
  6. (image slide)
  7. (image slide)
  8. "The Role of Virtual Reference in Library Web Site Design: A Qualitative Source for Usage Data"
     - Write-up of the April 2010 data analysis and results, published in the Journal of Web Librarianship
     - With Web Services Manager Clay Hill and Digital Projects Web Services Specialist Julie Shedd
     - Includes a Wordle analysis of 1,800+ email and chat questions
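A Wordle image is driven by simple word frequencies. As a minimal sketch of the underlying count, here is one way to tally terms across entry questions; the sample questions and stopword list below are invented, not the study's data:

```python
from collections import Counter
import re

# Hypothetical sample of entry questions; the study analyzed 1,800+ real ones.
questions = [
    "How do I find peer-reviewed articles on soil erosion?",
    "Where is the online catalog?",
    "Can you help me find a known article in a journal?",
]

# Small illustrative stopword list; a real analysis would use a fuller one.
STOPWORDS = {"how", "do", "i", "a", "the", "is", "on", "in", "me", "you", "can", "where"}

words = []
for q in questions:
    words += [w for w in re.findall(r"[a-z]+", q.lower()) if w not in STOPWORDS]

freq = Counter(words)
print(freq.most_common(3))  # the most frequent terms render largest in a Wordle
```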
  9. Faculty Chat Wordle: Entry Questions and Departments
  10. Lingering Questions
      - Do "Topic Search," "Finding Known Article," and "Online Catalog Help" questions decrease in frequency?
      - Will reluctant librarians, burned by federated searching, adopt Discovery?
  11. Taking Your Own Medicine
      - "Integrating the sampling of virtual reference data into an iterative assessment cycle ... can allow for ongoing measures to be taken of the library's effectiveness." (Powers et al., 111)
      - The paper recommends twice-yearly assessments using the methodology.
  12. Cycle of Assessment
      - Goal: to evaluate the Web site redesign and Discovery implementation
      - Based on Flynn, Gilchrist, and Olson. 2004. "Using the Assessment Cycle as a Tool for Collaboration." Resource Sharing & Information Networks 17(1): 187-203.
  13. Methodology
      - Using a modified Grounded Theory (GT) model
      - Evaluate chat transcripts from April 2010
      - Follow up with an evaluation of April 2011 using the same model
  14. Breaking It Down
      - IRB approval
      - Excel spreadsheet: identifier, entry question, "Questions," "Answers"
      - Review transcripts and create short descriptions of transactions
      - Refer to the data to find patterns
      - Use patterns to create codes
      - Code data to quantify
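The last three steps (find patterns, turn them into codes, code to quantify) can be sketched as a small keyword-rule coder. The category names follow the deck; the rules and sample descriptions are invented for illustration, since the actual coding was done by hand from the transcripts:

```python
from collections import Counter

# Hypothetical short descriptions of the kind written during transcript review.
descriptions = [
    "patron wants articles about food safety",
    "patron has a citation, needs the full text",
    "patron asking how to place a hold in the catalog",
    "patron wants sources on civil war history",
]

# Codes derived from patterns in the data (keyword rules are illustrative).
RULES = {
    "Topic Search": ["articles about", "sources on"],
    "Finding Known Articles": ["citation", "full text"],
    "Online Catalog Help": ["catalog", "hold"],
}

def code(desc):
    """Return the first code whose keywords appear in the description."""
    for label, keywords in RULES.items():
        if any(k in desc for k in keywords):
            return label
    return "Other"

counts = Counter(code(d) for d in descriptions)
print(counts)  # quantified codes, ready for year-over-year comparison
```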
  15. Constructs
      - Lower number of chat transactions in April 2011
      - For coding, types of questions asked = "questions"
      - For coding, tools used in answering questions = "answers"
  16. Number of Chats in April (2008-2011) (chart; annotation: -25%?)
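The "-25%?" annotation is a year-over-year change. Assuming the April counts line up with the 120 and 90 questions mentioned on the Challenges slide, the arithmetic is:

```python
chats_2010 = 120  # April 2010 transactions (figure from the Challenges slide)
chats_2011 = 90   # April 2011 transactions

change = (chats_2011 - chats_2010) / chats_2010 * 100
print(f"{change:.0f}%")  # -25%
```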
  17. (image slide)
  18. Photo credit: Ryan Hoke, MSU Meteorology student and storm chaser
  19. Identifying Patterns in Question Types: TOPIC SEARCH
  20. (image slide)
  21. (chart: 38% / 28% / 44%)
  22. Question Analysis Results
      - Decrease in "Topic Search" questions
      - Slight decrease in "Finding Known Articles"
      - Increase in questions related to the Online Catalog/holds
  23. But What About the Chat Librarians?
      - Tools used in chat are the "answers," counted as defined:
      - Databases: from the Libraries' Database Portal list
      - Service Referrals: to specialists (internal or external)
      - Other Resources: anything not above, including ILL, librarian-answered (RR), guides, print sources, etc.
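Counting the "answers" side works the same way as coding the questions: each tool logged in a transcript maps to one of the three buckets above. The tool names and the mapping here are invented examples, not the Libraries' actual portal list:

```python
from collections import Counter

# Hypothetical tools logged across several chat transcripts.
tools_used = ["Academic Search Premier", "Discovery", "ILL",
              "referral to subject librarian", "Discovery", "LibGuide"]

# Illustrative membership sets; a real analysis would use the portal list.
DATABASES = {"Academic Search Premier", "Discovery"}
REFERRALS = {"referral to subject librarian"}

def bucket(tool):
    """Map a tool name to one of the deck's three answer categories."""
    if tool in DATABASES:
        return "Databases"
    if tool in REFERRALS:
        return "Service Referrals"
    return "Other Resources"

counts = Counter(bucket(t) for t in tools_used)
print(counts)
```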
  24. Databases Used to Answer Chat Questions (charts: April 2010 vs. April 2011)
  25. (image slide)
  26. Discussion
      - Overall chat transactions decrease by 25%
      - "Topic Search" questions decrease
      - Discovery supplants use of Academic Search Premier for chat librarians
      - Online catalog/holds questions and references increase
  27. What's Up with the Online Catalog?
      - Discovery is not taking the place of the Online Catalog, despite the catalog being included in Discovery search
      - In February 2011, 17 libraries were added, for a total of 8 systems and 42 libraries in our Online Catalog
  28. Challenges
      - Difficult to compare 120 questions with 90 questions
      - Assessment needs change over time; past data analysis may need to be revisited
      - Limit yourself to what you really need to know
  29. Looking Forward: Modifications for Future Assessment
  30. Conclusions
      - Need more data points to confirm the "Topic Search" and possible "Finding Known Articles" reductions
      - Discovery has been adopted by chat librarians in lieu of Academic Search Premier
      - Something worth investigating is going on with the online catalog
  31. Questions?