
Point-of-Use Web Surveys for Networked Electronic Resources


Martha Kyrillidou, Terry Plum, Bruce Thompson



  1. Point-of-Use Web Surveys for Networked Electronic Resources: Implementation and Sampling Plans
     Presented by: Martha Kyrillidou, ARL; Terry Plum, Simmons GSLIS; Bruce Thompson, Texas A&M University
     8th Northumbria International Conference on Performance Measurement in Libraries and Information Services, Florence, Italy, August 18, 2009
  2. Measuring Digital Content Use
     - The most popular current method by which libraries measure usage of electronic resources is not web-based usage surveys but vendor-supplied data on patron or transaction usage.
     - Web-based usage surveys are increasingly relevant for collecting usage data to make collection development and service decisions, to document evidence of usage by particular patron populations, and to collect and analyze performance outputs.
     - Brinley Franklin and Terry Plum, "Successful Web Survey Methodologies for Measuring the Impact of Networked Electronic Services (MINES for Libraries®)," IFLA Journal 32 (1), March 2006.
  3. ARL New Measures Toolkit: StatsQUAL®
     - LibQUAL+®: a rigorously tested web-based survey that libraries use to solicit, track, understand, and act upon users' opinions of service quality.
     - DigiQUAL®: an online survey for users of digital libraries that measures the reliability and trustworthiness of web sites; an adaptation of LibQUAL+® to the digital environment.
     - MINES for Libraries®: Measuring the Impact of Networked Electronic Services (MINES), an online transaction-based survey that collects data on the purpose of use of electronic resources and the demographics of users.
     - ARL Statistics™: a series of annual publications describing the collections, expenditures, staffing, and service activities of Association of Research Libraries (ARL) member libraries.
     - ClimateQUAL™: Organizational Climate and Diversity Assessment, an online survey that measures staff perceptions of (a) the library's commitment to the principles of diversity, (b) organizational policies and procedures, and (c) staff attitudes.
  4. What is MINES?
     - Measuring the Impact of Networked Electronic Services (MINES)
     - MINES is a research methodology that measures the usage of networked electronic resources of a library or consortium by specific categories of the patron population.
     - Action research:
       - Historically rooted in indirect cost studies
       - Point-of-use intercept study
       - Random moment sample, supporting valid and reliable inference
       - A set of recommendations for research design
       - A set of recommendations for web survey presentation
       - A set of recommendations for information architecture in libraries
       - A plan for continual assessment of networked electronic resources
       - An opportunity to benchmark across libraries
  5. Library User Survey [screenshot]
  6. Library User Survey: Affiliation [screenshot]
  7. Library User Survey: Purpose [screenshot]
  8. MINES Strategy
     - A representative sampling plan, including sample size, is determined at the outset. Typically there are 48 hours of surveying over 12 months at a medical library and 24 hours a year at a main library (a sketch of such a plan follows this slide).
     - Random-moment, web-based surveys are employed at each site.
     - Local implementation attempts to cover the networked electronic resources completely:
       - E-journals
       - E-books
       - Databases
       - OpenURL link resolvers (e.g., SFX)
       - Catalog (856 links)
       - Interlibrary loan
       - Digital collections (DSpace, CONTENTdm)
       - Electronic course reserves (sometimes)
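The sampling plan itself is easy to automate. Below is a minimal sketch in Python, assuming twenty-four randomly drawn, non-overlapping two-hour blocks (48 hours) per year and service hours of 08:00-22:00; both figures are illustrative assumptions, not part of the MINES specification.

# A sketch of drawing a MINES-style random-moment sampling plan.
import random
from datetime import datetime, timedelta

def draw_sampling_plan(year, blocks=24, seed=None):
    """Pick `blocks` random non-overlapping 2-hour survey windows in `year`."""
    rng = random.Random(seed)
    start = datetime(year, 1, 1)
    days_in_year = (datetime(year + 1, 1, 1) - start).days
    chosen = set()                           # (day_of_year, start_hour) pairs
    while len(chosen) < blocks:
        day = rng.randrange(days_in_year)
        hour = rng.choice(range(8, 22, 2))   # assumed service hours, 2-hour grid
        chosen.add((day, hour))
    return sorted(start + timedelta(days=d, hours=h) for d, h in chosen)

for window in draw_sampling_plan(2009, seed=42):
    end = window + timedelta(hours=2)
    print(window.strftime("%Y-%m-%d %H:%M"), "to", end.strftime("%H:%M"))

Aligning the blocks to a two-hour grid keeps the windows from overlapping without any further bookkeeping.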
  9. Issues with Web Surveys
     - Research design
       - Response rate, and the representativeness of the responses
       - Random sampling and inference
       - Non-respondents: three types (the third is handled with session IDs, sketched after this slide)
         - Those who never see the survey, because bookmarks, IP authentication, or passwords bypass the survey page
         - Those who see the survey but do not respond; mandatory questions address this
         - Those who respond but should not be surveyed again during the two-hour session; session IDs track this
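For the third group, here is a minimal sketch of session-based suppression, assuming the survey runs behind a gateway that can attach a session ID (e.g., a cookie) to each patron; the class and names are hypothetical.

# A sketch of showing the survey at most once per two-hour session.
import time
import uuid

SESSION_TTL = 2 * 60 * 60  # two hours, matching the MINES sampling window

class SurveyGate:
    def __init__(self):
        self._seen = {}  # session_id -> time the survey was shown

    def should_intercept(self, session_id):
        """Return (show_survey, session_id); a missing ID starts a new session."""
        now = time.time()
        # Drop expired sessions so the map does not grow without bound.
        self._seen = {sid: t for sid, t in self._seen.items() if now - t < SESSION_TTL}
        if session_id in self._seen:
            return (False, session_id)       # already surveyed this session
        session_id = session_id or uuid.uuid4().hex
        self._seen[session_id] = now
        return (True, session_id)            # intercept, and set the cookie

gate = SurveyGate()
show, sid = gate.should_intercept(None)     # first request: survey shown
print(show, gate.should_intercept(sid))     # same session: passed through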
  10. Quality Checks
      - The target population matched the population frame: the patrons who were supposed to be surveyed were surveyed, except in libraries with outstanding open digital collections, which attract users from outside the frame.
      - Check usage against IP address. Here, big numbers may not be good: the same addresses may be seeing the survey too often.
      - Alter the order of questions and answers, particularly the "sponsored research" and "instruction" options.
      - Spot-check IP address against self-identified location (a sketch of this check follows this slide).
      - Spot-check undergraduates choosing "sponsored research", a likely measurement error.
      - Check self-identified grant information (PI, sponsoring agency, name of grant) against actual grants.
      - Content validity: discussed with librarians and pre-tested.
      - Count turn-aways, the number who elected not to fill out the survey.
      - Library information architecture matters: an assessment infrastructure that acts as a gateway works well; plain HTML pages do not, and the results differ substantially between the two.
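The IP spot check can be partly automated. A minimal sketch follows, assuming the library knows which network blocks correspond to which locations; the ranges below are made-up examples, and a real check would use the library's own topology (campus subnets, in-library ranges, proxy addresses).

# A sketch of the IP-vs-self-reported-location spot check.
import ipaddress

LOCATION_RANGES = {                          # hypothetical network blocks
    "in the library": ipaddress.ip_network("10.1.0.0/16"),
    "on campus": ipaddress.ip_network("10.0.0.0/8"),
}

def location_consistent(ip, self_reported):
    """True if the respondent's IP is plausible for the location they chose."""
    expected = LOCATION_RANGES.get(self_reported)
    if expected is None:
        return True                          # e.g., "off campus": anything goes
    return ipaddress.ip_address(ip) in expected

print(location_consistent("10.1.5.9", "in the library"))    # True
print(location_consistent("192.0.2.7", "in the library"))   # False: flag for review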
  11. What is a session?
      - A successful request of an online service.
      - "It is one cycle of user activities that typically starts when a user connects to the service or database and ends by terminating activity that is either explicit (by leaving the service through exit or logout) or implicit (timeout due to user inactivity)." (NISO)
      COUNTER Code of Practice: Journals and Databases, Release 3, Glossary of Terms. http://www.projectcounter.org/code_practice.html
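In code, the definition amounts to two termination paths. A minimal sketch, assuming a 30-minute inactivity timeout; the definition itself does not fix that value.

# A sketch of the COUNTER/NISO session definition: explicit or implicit end.
import time

INACTIVITY_TIMEOUT = 30 * 60    # assumed implicit-termination window, in seconds

class Session:
    def __init__(self):
        self.last_activity = time.time()
        self.ended = False

    def touch(self):
        """Record a user action; return False if the session had already ended."""
        if self.ended or time.time() - self.last_activity > INACTIVITY_TIMEOUT:
            self.ended = True   # implicit termination: inactivity timeout
            return False
        self.last_activity = time.time()
        return True

    def logout(self):
        self.ended = True       # explicit termination: exit or logout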
  12. Sample Survey Data File Generated [screenshot]
  13. Federated searches are different
      - Search and session activity generated by federated search engines and other automated search agents should be categorized differently from regular searches (see the sketch after this slide).
        - "Any searches or sessions derived from any federated search engine (or similar automated search agent) should be included in separate 'Searches_federated' and 'Sessions_federated' counts… and are not to be included in the 'Searches_run' and 'Sessions' counts."
      COUNTER Code of Practice: Journals and Databases, Release 3, Glossary of Terms. http://www.projectcounter.org/code_practice.html
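A minimal sketch of keeping the two counts separate, assuming federated engines can be recognized at all; the User-Agent signatures below are hypothetical, and real platforms apply their own detection rules.

# A sketch of routing federated activity into its own COUNTER counts.
from collections import Counter

FEDERATED_AGENTS = ("metalib", "webfeat", "360 search")   # hypothetical signatures

def record_search(counts, user_agent):
    """Increment the federated or the regular search count, never both."""
    ua = user_agent.lower()
    if any(sig in ua for sig in FEDERATED_AGENTS):
        counts["Searches_federated"] += 1    # kept out of Searches_run per COUNTER
    else:
        counts["Searches_run"] += 1

counts = Counter()
record_search(counts, "Mozilla/5.0")
record_search(counts, "MetaLib/4.00 federated agent")
print(dict(counts))   # {'Searches_run': 1, 'Searches_federated': 1}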
  14. Interception strategies, 2008 [screenshot]
  15. Interception strategies, 2009 [screenshot]
  16. Mandatory v. Optional
      - Mandatory
        - Addresses the response rate: reaches patrons who see the survey but would not fill it out voluntarily
        - Two-hour sample period
        - Higher response rate; elective turn-aways can still be counted
      - Optional
        - Yields a different sample
        - Faculty and graduate students drop out
        - Reported sponsored research use is lower
  17. UConn data, Jan-May 2005: Mandatory v. Optional [chart]
  18. UConn data, Jan-May 2005: Mandatory v. Optional [chart]
  19. UConn data, Jan-May 2005: Mandatory v. Optional [chart]
  20. UConn data, Jan-May 2005: Mandatory v. Optional [chart]
  21. UConn data, Jan-May 2005: Mandatory v. Optional [chart]
  22. UConn data, Jan-May 2005: Mandatory v. Optional [chart]
  23. Every n-th instead of session?
      - What if we surveyed every n-th use instead of once per session? (A sketch follows this slide.)
        - Same quality checks
        - No need to set up a session, which is technically difficult
        - Continuity of use is lost, but the usage and the users measured are identical
        - Patrons might be re-surveyed, but perhaps not
          - Could check IPs if the network topology is fully understood
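A minimal sketch of the every-n-th alternative: one counter at the gateway, no session state at all. The value of n and the locking are illustrative assumptions.

# A sketch of every-n-th interception, replacing session tracking.
import threading

class EveryNth:
    def __init__(self, n):
        self.n = n
        self._count = 0
        self._lock = threading.Lock()

    def should_intercept(self):
        """True on every n-th use; with no session state, the same patron
        may occasionally be surveyed twice in quick succession."""
        with self._lock:
            self._count += 1
            return self._count % self.n == 0

sampler = EveryNth(n=50)
hits = [i + 1 for i in range(200) if sampler.should_intercept()]
print(hits)   # [50, 100, 150, 200]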
  24. Every n-th
      - Began an every-n-th study at a major Canadian research library.
      - In 2004-2005 the same university ran a similar session-based survey as part of the Ontario Council of University Libraries.
      - The session data can therefore be compared with the every-n-th data for similar months at the same university, although the periods differ chronologically.
  25. Comparison of session and every-n-th sampling [chart]
  26. Comparison of session and every-n-th sampling [chart]
  27. Comparison of session and every-n-th sampling [chart]
  28. Comparison of session and every-n-th sampling [chart]
  29. Comparison of session and every-n-th sampling [chart]
  30. Comparison of session and every-n-th sampling [chart]
  31. Conclusion
      - Reviewed the limitations of web surveys
      - Described MINES for Libraries®
      - Discussed techniques for point-of-use intercept surveys
      - Compared mandatory v. optional surveys, with data
      - Compared session-ID tracking and every-n-th sampling, with data
      - Questions?
