Say What?  An Analysis of Virtual Reference at the University Libraries

Presentation during the 2011 RIS Summer Workshops (May 24-25), Blacksburg, VA.

    Presentation Transcript

    • Rebecca K. Miller
      RIS Summer Workshops 2011
      May 25, 2011
      Say what? An analysis/discussion of virtual reference at the University Libraries
    • Why VR?
    • A VR training session for operators
      An evaluation of VR at VT
      A critical review of VR at VT
      A comprehensive, statistically valid analysis of VR at VT
      This is not…
    • A contemplative discussion about virtual reference at Virginia Tech, driven by a few statistics and some literature
      A brainstorming session about the issues that impact us and how we can improve VR (and other) service
      A time of reflection about where we’ve been
      Admission: I don’t really know, despite some digging!
      A time of creative consideration about where we’re going
      This is…
    • Overwhelming amount
      With the phrase “virtual reference,” Summon returns around 500 journal articles published between January 2008 and May 2011. Library Lit & Info (Wilson) returns 255 articles.
      Ranges from the practical to the philosophical:
      Pervasiveness of VR
      Usage (low?)
      User expectations related to speed/readily available materials
      Users and usage statistics
      User perception
      User satisfaction
      Question depth
      Specific tools (Meebo, Second Life, etc.)
      Service enhancements
      Reference interview issues/customer service
      Instruction opportunity/informing instruction
      Core competencies for operators
      Recent Literature
    • Take a few minutes and consider:
      Which of these issues impact us here, at Virginia Tech?
      What other issues not listed here may impact us?
      What do you want to know about virtual reference at Virginia Tech?
      Reflection: Community as context
    • “So the first step toward improving VR is for librarians to stop acting like computers.” (Zino, 2009)
      “You’d think I’d get used to the rush one feels at this point in the transaction…the challenge to get it right quickly.” (Harmeyer, 2008)
      “Taking into account the differences between an in-person transaction and one done over e-mail, chat, or texting, the big thing missing from those in the latter category is the ability to visually demonstrate during the teaching moments of the transaction.” (Steiner, 2010)
      “The extent to which [reference services] adapts to Google, WorldCat, Facebook, and other social networking tools, the iPhone and derivatives of handheld devices will ultimately determine future patterns of service and open up the possibilities.” (Bodner, 2009)
      Thought provoking quotes from recent literature
    • In October 2004, Luke Vilelle, Dave Beagle, and Buddy Litchfield analyzed VT’s virtual reference service:
      1 question per 105 university affiliates
      1.33 questions per hour
      48% of questions asked by undergrads
      30% asked by grads
      12% asked by faculty/staff
      10% asked by non-affiliates
      Live Ref = 13.87% of total reference questions
      Where we’ve been:2004 statistics
    • A review of October 2010 statistics (of taken chats) shows:
      Total: 349 chats started + 86 email tickets received = 435 questions received
      1 question per 87 affiliates
      (6,866 faculty + 31,006 students): 37,872 total affiliates
      0.98 questions per hour (out of 441.5 library open hours)
      Out of the 343 chat transcripts available:
      42% of questions asked by undergrads
      36% asked by grads
      14% asked by faculty/staff
      5% asked by non-affiliates
      3% asked by alumni
      VR = 16.75% of total reference questions (total of 1734 + 435)
      Where we are? 2010 statistics
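      The per-affiliate and per-hour rates above follow directly from the raw figures on this slide. A quick sketch of the arithmetic (Python used purely for illustration):

      ```python
      # Checking the October 2010 rate figures quoted above.
      # All inputs come from the slide; the arithmetic is the only addition.
      total_questions = 349 + 86        # chats started + email tickets = 435
      affiliates = 6866 + 31006         # faculty + students = 37,872
      open_hours = 441.5                # Newman Library open hours, October 2010

      affiliates_per_question = affiliates / total_questions
      questions_per_hour = total_questions / open_hours

      print(round(affiliates_per_question))   # 87 -> "1 question per 87 affiliates"
      print(f"{questions_per_hour:.3f}")      # 0.985; the slide shows 0.98
      ```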
    • 2004/2010 comparison chart
      *per total open hours of Newman Library, October 2010
      Using figure of 435 total questions (email and chat)
    • October 2010: Grouped by skills
      Compare with in-person (desk) reference*:
      BHSS: 701 questions
      Sci/Tech: 754 questions
      Torg/Tower: 270 questions
      *Statistics courtesy of Heather
    • Sticking with October 2010, a few averages:
      Response time
      Operator: 40 seconds
      Visitor: 33 seconds
      Response length
      Operator lines: 10.65
      Operator words: 114.9
      Visitor lines: 8.58
      Visitor words: 87.84
      Other Interesting details
    • Since March 16, 2011, we’ve received 37 texts:
      March 2011: 11 text messages
      April 2011: 22 text messages
      May (1-22) 2011: 4 text messages
      LivePerson doesn’t capture ID statistics
      Message content:
      Directional, general, quick answer: 28
      Subject-specific, in-depth answer: 5
      User rang in, then didn’t respond: 4
      REF-Texting (Rexting?)
    • Enhancement through technology
      Videos & images (Screenr, Jing, tutorials on library site)
      Web annotation (AwesomeHighlighter, SharedCopy)
      Demo of Screenr and SharedCopy
      Personal awareness
      • Log into LivePerson and review your transcripts
      • Review word counts, response times, and other elements
      Mining the data…
      Concepts from the literature
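      The line and word averages above can be reproduced by anyone reviewing their own transcripts. A minimal sketch of the idea, assuming a simple "Speaker: message" export format (a hypothetical format; real LivePerson transcripts will need their own parsing):

      ```python
      # Count lines and words per speaker in a chat transcript.
      # The "Speaker: message" layout is an assumed, illustrative format.
      from collections import defaultdict

      def speaker_stats(transcript):
          lines = defaultdict(int)
          words = defaultdict(int)
          for line in transcript.splitlines():
              if ":" not in line:
                  continue  # skip timestamps, system notices, etc.
              speaker, _, message = line.partition(":")
              speaker = speaker.strip()
              lines[speaker] += 1
              words[speaker] += len(message.split())
          return dict(lines), dict(words)

      sample = """Visitor: Where can I renew a book?
      Operator: You can renew online through your library account.
      Operator: Go to the library homepage and click My Account."""

      line_counts, word_counts = speaker_stats(sample)
      print(line_counts)   # {'Visitor': 1, 'Operator': 2}
      print(word_counts)   # {'Visitor': 6, 'Operator': 17}
      ```

      Averaging these counts over a month of transcripts gives figures directly comparable to the October 2010 numbers above.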
    • Take a few minutes and consider:
      What sort of information can VR transactions tell us about library users?
      What sort of information can VR transactions tell us about library services?
      What else can VR transcripts tell us about our work and planning for the future?
      Reflection:Future Research
    • Bodner, S. (2009). Virtual reference reflections. Journal of Library Administration, 49(7), 675-685. doi:10.1080/01930820903260432
      DeMars, J. M., & Breitbach, W. (2009). Enhancing virtual reference: Techniques and technologies to engage users and enrich interaction. Internet Reference Services Quarterly, 14(3-4), 82-91. doi:10.1080/10875300903256571
      Harmeyer, D. (2008). Virtual reference: Less is more. The Reference Librarian, 48(1), 113-116. doi:10.1300/J120v48n99_11
      Olszewski, L., & Rumbaugh, P. (2010). An international comparison of virtual reference services. Reference & User Services Quarterly, 49(4), 360-368.
      Steiner, H. M. (2010). Livening virtual reference with screencasting and screen sharing. Library Hi Tech News, 27(4/5), 9-11. doi:10.1108/07419051011083172
      Sullivan, D. (2008). Is the virtual reference interview dead? Incite, 29(12), 13-14.
      Walton-Sonda, D. (2009). Virtual reference service: From competencies to assessment. Australian Academic & Research Libraries, 40(1), 67-68.
      Zino, E. (2009). Let's fix virtual reference. Library Journal, 134(2), 94.
      Further reading: A (very) small sample