2 Birds, 1 Stone: A Mixed Methods Approach to Measure Service Process and Identify Pain Points in Virtual Reference

Presentation for the 19th Annual Reference Research Forum, ALA Annual Conference, Chicago, IL

  1. 2 Birds, 1 Stone: A Mixed Methods Approach to Measure Service Process and Identify Pain Points in Virtual Reference
     Christine Tobias, User Experience & Reference Librarian, Michigan State University Libraries
     19th Annual Reference Research Forum, June 29, 2013
  2. 2 BIRDS, 1 STONE: A MIXED METHODS APPROACH TO MEASURE SERVICE PROCESS AND IDENTIFY PAIN POINTS IN VIRTUAL REFERENCE
     Christine Tobias, User Experience and Reference Librarian, Michigan State University Libraries
     19th Annual Reference Research Forum, ALA Annual Conference, June 29, 2013
  3. VR Services at MSU Libraries
     • Member of two VR Cooperatives:
       • Research Help Now
       • QuestionPoint 24/7 Academic Cooperative
     • Chat and Instant Messaging (IM) Services with 24/7 Coverage

     Number of Questions Received from MSU Patrons, Chat and IM: 2011-2013
     Year              Web Chat   IM (Qwidget)   Total Questions Received
     2011              1,874      1,321          3,195
     2012              1,691      1,814          3,505
     2013 (Jan.-May)   648        731            1,379
  4. Measuring Service Process in VR
     • What is service process?
       • HOW and WHY is VR used? (Service perspective)
     • How can it be measured?
       • Quantitative: Stats and Numbers
       • Content of VR transcripts
     • What can be measured?
       • Traffic Patterns: How often and which service?
       • Staffing Patterns: Who’s answering the question?
       • Access Points: Where are users accessing the service?
       • Question Types: What types of questions are asked?
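As a rough illustration of the quantitative side listed above (not part of the original presentation), the sketch below shows how transcript metadata could be rolled up into traffic, staffing, and access-point counts; the Transaction fields and function name are invented for the example, and QuestionPoint's actual report exports may use different fields. Question types are handled separately via the descriptive codes described on the next slide.

```python
# Hedged sketch: field names below are assumptions for illustration only.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Transaction:
    month: str          # e.g. "2012-03"
    service: str        # "Web Chat" or "IM (Qwidget)"
    operator: str       # who answered the question (staffing pattern)
    access_point: str   # page or widget the patron started from

def service_process_stats(transactions: list[Transaction]) -> dict[str, Counter]:
    """Roll raw transactions up into the quantitative measures named on the slide."""
    return {
        "traffic_by_month":   Counter(t.month for t in transactions),
        "traffic_by_service": Counter(t.service for t in transactions),
        "staffing":           Counter(t.operator for t in transactions),
        "access_points":      Counter(t.access_point for t in transactions),
    }
```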
  5. Service Process Measurement Plan
     • Which types of questions are asked in VR?
     • Is VR a valid research service point?
     • Customized Descriptive Codes
       • Derived using Grounded Theory Model
       • Created in QuestionPoint (VR software)
       • MSU patrons only
       • Up to 3 codes assigned to each question
       • Based on initial question(s) asked at point of service entry
       • Only one “coder” to maintain consistency
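A minimal sketch of how the assigned codes could be tallied once exported, assuming a hypothetical CSV with columns code1–code3 (up to three codes per question, as on the slide). This is not QuestionPoint's actual report format, only an illustration of the counting step behind the next two slides.

```python
# Hypothetical illustration only: the CSV layout and column names are assumptions.
import csv
from collections import Counter

def tally_codes(csv_path: str) -> Counter:
    """Count how often each customized descriptive code was assigned."""
    counts: Counter = Counter()
    with open(csv_path, newline="", encoding="utf-8") as handle:
        for row in csv.DictReader(handle):
            for column in ("code1", "code2", "code3"):
                code = (row.get(column) or "").strip()
                if code:  # a question may carry fewer than three codes
                    counts[code] += 1
    return counts

if __name__ == "__main__":
    totals = tally_codes("vr_transcripts_2011_2013.csv")  # hypothetical file name
    for code, n in totals.most_common(10):                # e.g., the "top ten" slide
        print(f"{code}: {n}")
```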
  6. Customized Descriptive Codes Assigned by Major Category (n = 7,095)
     Major Category      2011-2012   2012-2013
     Local Resources     40          36
     Tech/Help           221         183
     Library Services    558         526
     Library Resources   2,726       2,805
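The figures above (as transcribed from the chart) can be sanity-checked against the stated n and turned into category shares with a few lines; the percentages below are derived here, not stated in the presentation.

```python
# Counts transcribed from the slide; the total and shares are computed for illustration.
counts = {
    "Local Resources":   {"2011-2012": 40,   "2012-2013": 36},
    "Tech/Help":         {"2011-2012": 221,  "2012-2013": 183},
    "Library Services":  {"2011-2012": 558,  "2012-2013": 526},
    "Library Resources": {"2011-2012": 2726, "2012-2013": 2805},
}

grand_total = sum(sum(years.values()) for years in counts.values())
print(grand_total)  # 7095, matching the n reported on the slide

for category, years in counts.items():
    share = sum(years.values()) / grand_total
    print(f"{category}: {share:.1%}")  # Library Resources dominates (~78%)
```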
  7. Trends in Assignment of Customized Descriptive Codes by Major Category: 2011-2013
     [Bar chart comparing 2011-2012 and 2012-2013 counts for Local Resources, Tech/Help, Library Services, Library Resources, and Total]
  8. Top Ten Assigned Customized Descriptive Codes: 2011-2013 (n = 5,495)
     Descriptive Code           2011-2012   2012-2013
     Citation Help              73          78
     Circulation                108         127
     Public Services            162         127
     Journal Holdings           131         138
     Ready Reference            150         154
     Databases                  208         205
     Book/Document              334         355
     Electronic Resources       337         479
     Article (Known Citation)   554         562
     Research Question          616         597
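One quick way to read the trend out of the table above is to sort the codes by year-over-year change; the numbers are transcribed from the slide, while the sorting and deltas are added here for illustration.

```python
# Year-over-year change for the top ten codes (values from the slide above).
top_ten = {
    "Citation Help":            (73, 78),
    "Circulation":              (108, 127),
    "Public Services":          (162, 127),
    "Journal Holdings":         (131, 138),
    "Ready Reference":          (150, 154),
    "Databases":                (208, 205),
    "Book/Document":            (334, 355),
    "Electronic Resources":     (337, 479),
    "Article (Known Citation)": (554, 562),
    "Research Question":        (616, 597),
}

for code, (y1, y2) in sorted(top_ten.items(), key=lambda kv: kv[1][1] - kv[1][0], reverse=True):
    print(f"{code:26s} {y1:4d} -> {y2:4d}  ({y2 - y1:+d})")
# Electronic Resources shows the largest increase (+142); Public Services the largest drop (-35).
```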
  9. Trends in Top Ten Assigned Customized Descriptive Codes: 2011-2013
     [Bar chart comparing 2011-2012 and 2012-2013 counts for each of the top ten codes]
  10. Mixed Methods Approach: Virtual Reference Assessment
      Service Process (Quantitative)
      • Why is VR service used?
      • What types of questions are asked?
      • Validate use of VR as a research service point
      Pain Points (Qualitative)
      • Why/Where are users frustrated, confused, lost?
      • Observe and understand information-seeking behaviors
      What is the relationship between service process and website usability in virtual reference?
  11. Pain Points in VR
      What are pain points?
      • Expressions of frustration, irritation, and confusion when using the library’s website and online resources
      How can pain points be identified?
      • VR transcripts – evidence of user behavior
      • Text analysis (qualitative)
  12. Pain Points: The Process
      • Systematic sample of VR transcripts
        • May 2011 – April 2012 (n=253)
      • Dedoose software (http://www.dedoose.com)
      • Text analysis of transcript content
      • Create and categorize excerpts based on:
        • Expressions of frustration, confusion, irritation, etc.
        • Patron and Librarian perspectives
        • Wayfinding/Navigation
      • Filter by descriptive code(s) to see specific usability problems
        • Is transparency in presentation of resources lacking?
        • Are users not understanding the functionality of tools presented?
        • Is relevant, pertinent information buried?
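The excerpting described above was done by hand in Dedoose; purely as an illustration of the idea, the sketch below shows a systematic sample of transcripts plus a crude keyword pass that flags candidate pain-point lines for human review. The cue words and function names are assumptions, not part of the study.

```python
# Illustrative sketch only: the actual coding was manual, in Dedoose.
import re

FRUSTRATION_CUES = re.compile(
    r"can't figure out|not finding|confus|frustrat|fiddling|don't know how|where do i",
    re.IGNORECASE,
)

def systematic_sample(transcripts: list[str], target: int) -> list[str]:
    """Take every k-th transcript so the sample spans the whole period evenly."""
    step = max(1, len(transcripts) // target)
    return transcripts[::step][:target]

def flag_candidate_excerpts(transcript: str) -> list[str]:
    """Return lines whose wording suggests frustration, confusion, or wayfinding trouble."""
    return [line for line in transcript.splitlines() if FRUSTRATION_CUES.search(line)]

# Example with made-up transcript text:
sample = systematic_sample(["Patron: I can't figure out how to get to the database."], target=253)
for t in sample:
    print(flag_candidate_excerpts(t))
```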
  13. The Mixed Methods Dashboard (Dedoose)
  14. Dedoose: Filtered by Descriptive Code
  15. Dedoose: Filtered with Excerpts
  16. Dedoose: Excerpts List by Code
  17. Excerpts = Evidence of Pain Points
      “I’m not finding info very quickly” (Librarian)
      “I can’t figure out….I’ve been fiddling around with the website for a while….”
      “…don’t know how to get there from here.” [access to database]
  18. 18. Findings & Implications • Assessment of VR beyond service quality and information literacy! • Service Process: • VR is a valid and valuable research service point. • Pain Points: • Presentation of access to library resources is not transparent. • Information about library services is buried. • Navigation of library’s website is difficult. • Functionality of tools provided is not clear. • Mixed methods approach is creative, effective, efficient, and practical • Service Process + Pain Points = Prelude to Usability Testing • Evidence-based model for study of user behavior in the digital environment
  19. References
      Houlson, Van, Kate McCready, and Carla Steinberg Pfahl. 2006. “A Window into Our Patron’s Needs.” Internet Reference Services Quarterly 11 (4): 19–39. doi:10.1300/J136v1ln04_02.
      Luo, Lili. 2008. “Chat Reference Evaluation: A Framework of Perspectives and Measures.” Reference Services Review 36 (1): 71–85. doi:10.1108/00907320810852041.
      Maximiek, Sarah, Eric Rushton, and Elizabeth Brown. 2010. “Coding into the Great Unknown: Analyzing Instant Messaging Session Transcripts to Identify User Behaviors and Measure Quality of Service.” College & Research Libraries 71 (4): 361–373.
      Powers, Amanda Clay, Julie Shedd, and Clay Hill. 2011. “The Role of Virtual Reference in Library Web Site Design: A Qualitative Source for Usage Data.” Journal of Web Librarianship 5 (2): 96–113. doi:10.1080/19322909.2011.573279.
      “Qualitative Software.” American Evaluation Association. http://www.eval.org/p/cm/ld/fid=81.
  20. Questions?
      Now…Please ask!
      Later…Contact Me!
      Christine Tobias, User Experience and Reference Librarian, Michigan State University Libraries
      tobiasc@msu.edu
      Presentation available at: http://slideshare.net/tobiasc
