What Do We Know About IPL Users?

An examination of IPL users, their questions, and the processing of questions by the service. Compares two studies that researched user information needs and IPL service provision, one in 1999 and one in 2007.

Published in: Technology, Education


  1. What do we know about IPL users and the services provided to them? Winter 2009
  2. Overview
     • Information describing visitors to the IPL's website
     • Research about the IPL's question answering service:
       • Findings from two studies of the IPL's email reference service, one in 1999 and the other in 2007, will be highlighted.
       • The studies examined the service's users, the types of questions posed to the service, and characteristics of the service's question handling.
       • Sample questions and answers from three specific user communities:
         • Educational organizations
         • Community organizations
         • Non-profits
  3. Geographic Distribution of IPL Visitors, 2005-2008
  4. IPL Visitors by Country, 2005-2008 (Source: Google Analytics)
  5. Email Reference Service, 1999
     • The Carter & Janes (2000) study analyzed reference interactions of the IPL's question answering service, drawn from questions posed to the service in 1999.
     • The research was conducted to understand the service's users, the questions asked, and how those questions were handled by the service.
       • Examined 3,022 reference questions.
       • Questions received from January to March 1999 were included in the study.
  6. Characteristics of the Service's Users, 1999
     • Twenty-four percent of users identified themselves as business persons; fewer self-identified as teachers or librarians. (n=888)
     • Fifty-two percent of users selected school-related as the purpose of their question. (n=1,073)
     • Users submitted their questions to the service in the following ways:
       • General Adult web form: 68%
       • Direct email: 26%
       • KidSpace web form: 4%
       • An IPL web form on the site intended for another purpose: 1.4%
  7. Users' Questions by Subject, 1999
     • Users designated the subject of their question on the web forms. Questions unanswered by the service did not receive a subject code.
     • The subjects assigned most frequently by users include other, education, humanities, science, and government/law.
       • The subject category blank was assigned by researchers when questions were received through direct email.
  8. Service Characteristics, 1999
     • Approximately 25% of all received questions were rejected (unanswered).
     • Top three reasons for question rejection (n=700):
       1. Quota (too many questions pending answers): 53.4%
       2. Patron's specified need-by date could not be met: 17.9%
       3. Bad email address: 7.3%
     • Questions submitted via the KidSpace web form were rejected most often.
     • Timeliness of response for answered questions (n=2,322):
       • 2.10 days for factual questions; 2.31 days for sources questions
       • About 25% of questions answered within 1 day
  9. Email Reference Service, 2007
     • A follow-up to the Carter & Janes (2000) study was conducted using 2007 reference data to explore the IPL's question answering service.
     • The analysis of reference interactions was undertaken by Rozaklis, MacDonald, & Abels (2008) for a poster session competition at Drexel Research Day.
     • It investigated the service's users, the kinds of questions posed, and how those questions were handled by the service.
       • Examined 128 reference questions, a 1% stratified sample of the total questions received by the Ask an IPL Librarian service in 2007.
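The 1% stratified sample above can be illustrated with a short sketch. The study does not describe its strata or tooling, so everything below is hypothetical: records are grouped by an assumed `subject` field and the same fraction is drawn at random from each group.

```python
import random

def stratified_sample(questions, key, fraction, seed=0):
    """Draw a simple stratified random sample: group records by a
    stratum key, then sample the same fraction from each group."""
    rng = random.Random(seed)
    strata = {}
    for q in questions:
        strata.setdefault(q[key], []).append(q)
    sample = []
    for group in strata.values():
        # Take at least one record per stratum so small groups survive.
        k = max(1, round(len(group) * fraction))
        sample.extend(rng.sample(group, k))
    return sample

# Hypothetical records standing in for a year's question log.
log = [{"id": i, "subject": s}
       for i, s in enumerate(["history", "science", "education", "other"] * 300)]
picked = stratified_sample(log, key="subject", fraction=0.01)
```

With 300 records per subject, a 1% draw keeps 3 from each stratum, so every subject stays represented in proportion, which is the point of stratifying rather than sampling the log as a whole.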
  10. Characteristics of the Service's Users, 2007
     • Twenty-nine percent of users identified themselves as located outside the United States; sixty-two percent identified a location inside the United States. (n=128)
     • Eighty-three percent of users submitted their questions through the General Adult web form. (n=128)
       • 49% of questions posed via the General Adult web form indicated that the user planned to use the response for a school assignment.
     • Users who reported consulting outside sources before contacting the IPL referred to the following sources:

       Source                                                Percentage (n=128)
       Search engine                                         32%
       A specific or general Internet resource               28%
       An academic or public library                         9%
       Another person (teacher, librarian, parent, friend)   1%
  11. Users' Questions by Subject, 2007
     • Researchers analyzed all questions, both answered and rejected, to determine which subject codes users assigned to their questions. (n=128)
     • The subjects users assigned most frequently include history, other, science, and education.
     • Users and IPL librarians agreed on the subject codes assigned to questions 78% of the time.
  12. Users' Questions by Question Type, 2007
     • Users' questions were classified following Schwartz's (2003) typology. Categories and percentages (n=128):
       • Provide (38%): "Provide questions were distinguishable by the phrase: 'Please provide me with all of the information you have about . . .' or similar injunction. Such questions placed the entire burden of research on the librarian, without any request for instruction or interest in participating in the process of gathering information."
       • Factual (33%): "Factual questions asked for a specific piece of information, much like a traditional ready reference question."
       • Need (12%): "Need questions simply stated a research need, for example, 'I need information about stock prices in Argentina,' or 'I need three articles about women in Bangladesh.'"
       • Advise (10%): "Advise questions asked for advice in the research process, specifically about which sources to consult or which databases would be best for a particular project."
       • Instruct (6%): "Instruct questions asked for instruction on solving an information need. These e-mails could include 'How do I find information about the Roman Empire?' The emphasis in this category was a conspicuous request for instruction. Not only did a person need to find something, but they wished to know how to do it, perhaps in order to become more self-sufficient and knowledgeable about conducting research."
       • Statements (1%): "Statements were not questions at all, rather they were general compliments, suggestions, or complaints."
  13. Service Characteristics, 2007
     • Of the 128 questions sampled for the study, 40% were rejected.
       • This figure is slightly higher than the rejection rate for all questions received in 2007, which was 33%.
     • Roughly 50% of questions received through the KidSpace web form were rejected.
     • Fifty percent of questions posed for work-related purposes were rejected.
     • Treatment of questions by how the user stated they would use the response (n=128)
  14. Specific User Groups of the Ask an IPL Librarian Service
     • Additional user communities identified include:
       • Educational organizations
       • Community organizations
       • Non-profits
     • Sample questions and answers from these user groups appear in the remaining slides.
  15. Educational Organizations (1)
  16. Educational Organizations (2)
  17. Community Organizations
  18. Non-profit Organizations
  19. Sources Consulted
     • Carter, D. S., & Janes, J. (2000). Unobtrusive analysis of digital reference questions and service at the Internet Public Library: An exploratory study. Library Trends, 49(2), 251-265.
     • Schwartz, J. (2003). Toward a typology of e-mail reference questions. Internet Reference Services Quarterly, 8(3), 1-15.
