How well are we tracking our reference statistics? A usability study on electronic reference statistics

Title: How Well Are We Tracking Our Reference Statistics? A Usability Study on Electronic Reference Statistics
Objective: To determine whether a newly implemented electronic reference statistics tracking system allows more precise and more easily tracked statistics than the previous paper-based method.
Methods: The reference and education department used a paper system to monitor patron interactions and combined it with information from an electronic calendar to report statistics. Unfortunately, each librarian had a different way of self-reporting interactions on paper. To move away from this outdated and incomplete method, we pursued an electronic system for tracking patron interactions. Librarians explored commercially available systems and systems created by other libraries. We consulted with our in-house systems department about creating a database for tracking the transactions. Because of its ease of use and low cost, librarians chose to implement an online survey using SurveyMonkey to track patron interactions. An updated electronic survey was created using questions from the original print version with edits based on library faculty feedback. For consistency, the library staff added the survey link to their web browsers or as icons on their desktops. As a follow-up, librarians implemented a usability study to learn what revisions are needed. The study was conducted with library faculty and staff.
Results: Twenty-five of 30 faculty and staff members completed an anonymous survey. Sixty percent of respondents reported recording their patron interactions more than 75% of the time. Sixty-eight percent of respondents either agreed or strongly agreed that the “new online system helped them keep track of patron interactions.” Usability observations confirmed that power users were quicker to complete the survey and had fewer questions. Those less familiar with the survey were less likely to report that they routinely used it. Ninety-two percent of those surveyed preferred the electronic system over paper-based systems.
Conclusions: After almost a year of use, the new system has met the expectations of the developers and resulted in satisfied users. Due to the ease of development, ease of use, and low cost, SurveyMonkey has proved an effective alternative to paper statistics and many of the specialized statistical tracking systems.
Authors: Vedana Vaidhyanathan, Librarian, Reference and Educational Services; Emily Vardell, Director, Reference, Education, and Community Engagement; Kimberly Loper, Special Projects and Digital Initiatives Librarian, Louis Calder Memorial Library, Miller School of Medicine, University of Miami, Miami, FL; John Reynolds, Reference Librarian, West Boca Branch, Palm Beach County Library System, Boca Raton, FL; Tanya Feddern-Bekcan, AHIP, Head, Education, Louis Calder Memorial Library, Miller School of Medicine, University of Miami, Miami, FL

Published in: Technology, Education

Transcript

  • Results of Satisfaction Survey (n=25)
    • Most (25/30) faculty and staff members completed an anonymous survey.
    • 84% record interactions at least half the time.
    • 56% incorporated the survey into their daily routine.
    • 74% report the survey takes less than one minute to complete.
    • 92% preferred the electronic over the paper-based system.
    • What is your favorite thing about the survey? (selected responses)
      • It is very easy to complete.
      • It helps with my statistics for faculty review.
      • It is green – no wasted paper.
      • It helps me identify my varied duties.
      • It is a very easy way to collect data.
  • How well are we tracking our reference statistics? A usability study on electronic reference statistics
    Vedana Vaidhyanathan, Emily Vardell, Kimberly Loper, John Reynolds*, Tanya Feddern-Bekcan
    Department of Health Informatics, University of Miami Miller School of Medicine; *Palm Beach County Library, West Boca Branch
  • Background
    Previously, the Reference and Education Department used a paper system to monitor patron interactions, combined with an electronic calendar, to report statistics. Unfortunately, each librarian had a different way of self-reporting interactions on paper.
  • Objective
    To adapt a survey instrument to track reference statistics. After implementing an electronic reference statistics tracking system, does the new system allow for more precise and easily tracked statistics?
  • Methods
    To move away from this outdated and incomplete method, Calder librarians pursued an electronic system for tracking patron interactions. Librarians explored commercially available systems and systems created by other libraries. Reference librarians consulted with in-house systems department staff members about creating a database for tracking the transactions. Because of its ease of use and low cost, librarians chose to use an online survey instrument (SurveyMonkey) to track patron interactions. Librarian survey developers used librarian feedback on the existing print version to create a modified electronic version. All staff members were asked to add the survey link to their web browsers or as icons on their desktops. Three months into survey implementation, the survey developers administered a library-wide satisfaction and use survey (n=25). Seven months later, librarians conducted a usability study (n=13) to learn what revisions, if any, were needed. The study was conducted with library faculty and staff.
  • Usability Study Details
    • Resources: The test was performed on the finished product at the primary workstation of each participant.
    • Individual test sessions: 13 participants (n=13), one participant per session; each session lasted approximately 15 minutes.
    • Role of the observer: Watched real users work with the product, timed the activity, and took notes.
    • Pretest questions:
      • What percent of client interactions do you record with SurveyMonkey?
      • Do you have a direct link to the survey?
      • How long do you believe it takes you to complete the survey?
    • Tasks: Participants were asked typical questions and were timed while completing the survey.
    • Post-test question: How can the survey be improved?
  • Results of the Usability Study (n=13)
    • 69% record interactions at least half of the time.
    • 92% have a link to the survey.
    • 46% completed the survey in less than one minute.
    • 30% reported that the survey took less time than was observed.
  • How can the survey be improved?
    • Change formatting so that all options show on the screen.
    • Fewer choices on the resources question.
    • Increase the number of choices for interaction time, to better reflect the amount of time a particular interaction took.
  • Conclusions
    After almost a year of use, the new system has met the expectations of the developers and resulted in satisfied users. Due to the ease of development, ease of use, and low cost, SurveyMonkey has proven an effective alternative to the previous paper statistics system.
  • [Poster charts, not reproduced in the transcript: “How often do you interact with clients?”; “How often do you record your client interactions with the survey?”; “How long do you believe it takes to complete the survey?”; “Actual timed results to complete the survey”; “Survey usage statistics since implementation”]
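The usability study compared each participant's self-reported completion time against the time the observer actually measured, finding that 30% of participants underestimated. A minimal sketch of how such a comparison might be computed; the function name and all example times below are invented for illustration, not the study's actual data:

```python
def share_underestimating(reported, observed):
    """Fraction of participants whose self-reported time was
    less than the time the observer actually measured."""
    under = sum(1 for r, o in zip(reported, observed) if r < o)
    return under / len(reported)

# Hypothetical per-participant times in seconds (made-up data):
reported = [30, 45, 60, 90, 40]
observed = [50, 40, 75, 80, 70]

print(share_underestimating(reported, observed))  # → 0.6
```

Pairing each participant's estimate with the observed value, rather than comparing group averages, is what lets a study report a statistic like "30% reported that the survey took less time than was observed."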