
Data-Informed Decision Making for Digital Resources


This session will provide three case studies of assessment and evaluation programs in libraries--one past, one current, and one future. The cases use three different modes of data gathering and analysis and show the power of understanding user needs and how well your organization is meeting them.


  1. 1. Data-Informed Decision Making for Digital Resources Christine Madsen + Megan Hurst Athenaeum21 Electronic Resources & Libraries Conference Austin, TX | April 2016
  2. 2. Library Assessment & Evaluation Programs Past, Present, Future ● 3 case studies of assessment and evaluation programs in libraries ● 3 different modes of data gathering and analysis ● show the power of understanding user needs and how well your organization is meeting them.
  3. 3. Agenda 1. Introduction (5 min) 2. Harvard University Library, Open Collections Program (10 min) 3. Resource Discovery @ The University of Oxford (10 min) 4. University of California, Davis, Open Library Assessment Dashboard & Toolkit (10 min) 5. Questions & Discussion (10 min)
  4. 4. Who We Are Megan Hurst MLIS, MFA integrated media @MHzUX Athenaeum21 Christine Madsen MLIS, DPhil (PhD) @mccarthymadsen Athenaeum21 Expertise ● User experience research and design ● Library assessment ● Publishing workflows and models Work Harvard, EBSCO, Athenaeum21 Expertise ● Digital library systems ● Digitization (doing it right) ● Methods for measuring impact Work Harvard, Oxford, Bodleian, Athenaeum21
  5. 5. Setting Goals Is Easy, but Achieving Them Isn't ● 65% of organizations have an agreed-upon strategy. ● 14% of employees understand the organization's strategy. ● Less than 10% of all organizations successfully execute the strategy. (source: Forbes.com)
  6. 6. Timeline, 2002-2017 ● 2004 Harvard University Library Open Collections Program "Outreach and Evaluation" Program initiated ● 2006 Google Analytics released; Facebook opened to the general public (13+ years old); first "Assessment Librarian" position advertised ● 2007 iPhone released; first "UX Librarian" position advertised ● 2008 UK JISC TIDSR Toolkit, developed by Christine Madsen for the Oxford Internet Institute, based on Michalak, Madsen and Hurst's work for the Open Collections Program ● 2015 University of Oxford Resource Discovery User Needs Analysis; Use and Usability Assessment of Digital Libraries Survey, Digital Library Federation ● 2016-2017 Library Assessment Dashboard & Toolkit, UC Davis w/ The Bodleian Libraries, Goettingen State and University Library ● 2017 NISO Link Origin Tracking Best Practices
  7. 7. 2002-2007 Harvard University Library Open Collections Program
  8. 8. Open Collections Program, 2002-2007 ● Digitization program aimed at making a selection of Harvard’s primary sources freely available to high school and community college students
  9. 9. Open Collections Program, 2002-2007 ● Two initial goals of the program: ○ Increase the availability and use of historical resources from Harvard's libraries, archives and museums for teaching, learning, and research ○ Offer a new model for digital collections that will benefit students and teachers around the world
  10. 10. Open Collections Program, 2002-2007 Three main principles of quality: ● Selection Standards - Create comprehensive, topic-based digital collections by carefully selecting topics, and materials; ● Production Standards - Create digital surrogates that are both faithful to the original publications and of such high quality that there will be no need for re-digitization by other institutions; ● Access Standards - Provide easy online access to digital collections within the Harvard community and around the world. ○ A commitment not just to making the materials freely available online, but to making sure people knew these collections existed.
  11. 11. Open Collections Program, 2002-2007 ● Context / Climate ○ No “Assessment Librarian” or “User Experience Librarian” positions in libraries ○ Google Analytics did not yet exist ○ Online marketing and conversion analysis was still very young ○ Libraries dominated by a ‘build it and they will come’ approach to digitization ● Motivation for Assessment ○ To measure and increase usage ○ To demonstrate and report progress and impact to funders
  12. 12. Understanding & Prioritizing User Types [Diagram: target users (students, educators, librarians, researchers, general public) reaching the Open Collections website/s via the open web and library websites]
  13. 13. OCP: Tools & Methods for Assessment 1. Web Analytics (pre-Google Analytics) ○ Early inlink and referral analysis ○ Homegrown “digital circulation” tracking 2. User Research ○ Task-completion studies ○ Ethnographic and heuristic studies ○ Surveys (undergraduates, educators) ○ Focus Groups ○ Interviews
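As a rough illustration of the kind of pre-Google-Analytics inlink and referral analysis mentioned above, the sketch below tallies external referring domains from standard web server logs. It is a hypothetical Python reconstruction, not the OCP's actual tooling; the log path, log format, and the harvard.edu filter are assumptions.

```python
# Hypothetical sketch: count referring domains from web server access logs.
from collections import Counter
from urllib.parse import urlparse
import re

# Apache "combined" log format places the referrer in the second-to-last quoted field.
LOG_LINE = re.compile(r'"(?P<request>[^"]*)" \d{3} \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"')

def referring_domains(log_path: str, own_domain: str = "harvard.edu") -> Counter:
    """Tally external domains whose links brought visitors to the collection site."""
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LOG_LINE.search(line)
            if not match:
                continue
            domain = urlparse(match.group("referrer")).netloc
            if domain and own_domain not in domain:  # keep only external inlinks
                counts[domain] += 1
    return counts

if __name__ == "__main__":
    for domain, hits in referring_domains("access.log").most_common(10):
        print(f"{domain}\t{hits}")
```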
  14. 14. OCP Tools: Highlight of Quantitative Methods 1. Homegrown "Digital Circulation" Tracking a. Time in book b. # of pages viewed per book c. Provided insights into user behaviors inside digital objects (including multi-page items), and across the collection
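The "digital circulation" metrics (time in book, pages viewed per book) could be derived from per-page view events roughly as in the sketch below. The event fields and the aggregation logic are assumptions for illustration, not Harvard's actual implementation.

```python
# Minimal sketch (assumed event format): estimate time in book and distinct pages
# viewed per book from a stream of page-view events.
from dataclasses import dataclass
from datetime import datetime
from collections import defaultdict

@dataclass
class PageView:
    visitor_id: str
    book_id: str
    page: int
    viewed_at: datetime

def circulation_stats(events: list[PageView]) -> dict[str, dict[str, float]]:
    """Per book: distinct pages viewed and rough time spent (first to last view per visitor)."""
    by_visit = defaultdict(list)
    for e in events:
        by_visit[(e.visitor_id, e.book_id)].append(e)

    stats: dict[str, dict[str, float]] = defaultdict(lambda: {"pages": 0, "seconds": 0.0})
    for (_, book_id), views in by_visit.items():
        views.sort(key=lambda e: e.viewed_at)
        stats[book_id]["pages"] += len({e.page for e in views})
        stats[book_id]["seconds"] += (views[-1].viewed_at - views[0].viewed_at).total_seconds()
    return stats
```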
  15. 15. OCP Tools: Highlight of “Outreach” Methods 1. Early Search Engine Optimization (SEO) efforts a. Added keywords to collection websites that reflected open web searches, rather than library subject terms b. Created web content that would attract open web searchers in our target audience (before Google Knowledge Graph existed) i. Historical events ii. Historical figures iii. Historical organizations c. Enhanced Wikipedia articles with OCP content
  16. 16. Lessons Learned Use multiple methods, iterate, and triangulate! ● Match assessment methods to research questions ● Analyze target audience and usage types to set expectations ● Assessment questions should then flow from your understanding of that audience Understand the Data ● Repeat visitors ("Converts") are more important than visits or hits ● Inlinks (links to your site from other sites) are important and worth working for ● Quality of use is important, because it can tell stories of impact
  17. 17. Outreach & Evaluation Program Begins
  18. 18. Email & Listserv Outreach Campaign
  19. 19. Faithful Converts: Year-over-Year
  Unique visitors who visited | 2004   | 2005
  1x                          | 21,292 | 88,643
  2x                          | 2,414  | 8,015
  3x                          | 766    | 2,449
  4x                          | 392    | 1,107
  5x                          | 201    | 631
  >5x                         | 849    | 3,932
  TOTAL                       | 25,914 | 104,777
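A table like the one above can be produced by bucketing unique visitors by how many visits they made in a year. The sketch below shows one way to do that from a simple (visitor_id, year) visit log; the field names and data shape are illustrative only.

```python
# Sketch: bucket unique visitors by visit frequency to build a "converts" table.
from collections import Counter

def convert_buckets(visits: list[tuple[str, int]], year: int) -> dict[str, int]:
    """Count unique visitors by how many times they visited in the given year."""
    per_visitor = Counter(v for v, y in visits if y == year)
    buckets = {"1x": 0, "2x": 0, "3x": 0, "4x": 0, "5x": 0, ">5x": 0, "TOTAL": 0}
    for n in per_visitor.values():
        key = f"{n}x" if n <= 5 else ">5x"
        buckets[key] += 1
        buckets["TOTAL"] += 1
    return buckets
```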
  20. 20. Lessons Learned, part 2 Assessment helped to: ● Document and illustrate a clear connection between outreach efforts and collection usage ● Guide enhancements to digital services ● Tell stories of impact and value
  21. 21. Lessons Learned, part 2 ● Outreach & Evaluation Program ○ Cost 2% of project budget (approx. $30,000 USD) ○ Increased visits by 400% ○ Increased # and % of "Converts" (visitors who visited >5x)
  22. 22. Lessons Learned, part 2 ‘Well-marketed’ collections are still more heavily used, many years later, than those that have not been ‘marketed’
  23. 23. 2008-2010 Toolkit for the Impact of Digitised Scholarly Resources JISC (UK) / Oxford Internet Institute
  24. 24. 2015- Resource Discovery @ The University of Oxford
  25. 25. Resource Discovery @ The University of Oxford ● Context / Climate ○ 92 Libraries (30 Bodleian; 41 College; 21 Department), 7 museums, 2 gardens ○ Some have less than 50% of their collections 'catalogued' with electronic metadata ○ No single discovery solution (nor is there likely to be one in the near future) ● Motivation for Assessment ○ To understand how Oxford should prioritise resources to maximise users' discovery of resources across libraries and museums ○ To scope new approaches to finding information and collections of relevance to research and teaching at Oxford
  26. 26. Resource Discovery @ The University of Oxford: 5 Strands of Research (Users; Peer Institutions; Metadata Experts; Vendors & Publishers; Literature Review), feeding a Synthesis and a Roadmap
  27. 27. Resource Discovery @ The University of Oxford Tools & Methods: ● Interviews (in person) ● Observational / ethnographic research ● Think-aloud protocols Users targeted: ● Undergraduate students ● Graduate students ● Faculty ● Researchers ● Staff ● Alumni
  28. 28. The Future of Finding @ Oxford Outcomes: ● Detailed roadmap for investment in tools and infrastructure Lessons Learned: ● In-depth, ethnographic (qualitative) methods are useful for assessing long-term goals and outcomes ● Pattern recognition (that is, being able to 'code' your data) is an important part of making use of qualitative methods -- you need to be able to see the patterns
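For readers unfamiliar with "coding" qualitative data, the toy sketch below shows the basic mechanic: themes (codes) applied to interviews are tallied so recurring patterns become visible. The codes and interview labels are invented for illustration and are not from the Oxford study.

```python
# Illustrative sketch: tally how often each theme (code) appears across interviews.
from collections import Counter

interviews = {
    "grad_student_01": ["starts_with_google", "uses_citation_chaining"],
    "faculty_03": ["asks_librarian", "starts_with_google"],
    "undergrad_07": ["starts_with_google"],
}

theme_counts = Counter(code for codes in interviews.values() for code in codes)
for theme, n in theme_counts.most_common():
    print(f"{theme}: {n} of {len(interviews)} interviews")
```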
  29. 29. 2016- Open Library Assessment Dashboard & Toolkit University of California, Davis
  30. 30. Setting Goals Is Easy, but Achieving Them Isn't ● 65% of organizations have an agreed-upon strategy. ● 14% of employees understand the organization's strategy. ● Less than 10% of all organizations successfully execute the strategy. (source: Forbes.com)
  31. 31. Open Library Assessment Dashboard & Toolkit Context / Climate ○ Lots of Assessment Librarians using lots of different tools ○ As many metrics and formulae as there are questions ○ Libraries are getting better at using data ○ Standards for digital content have evolved
  32. 32. Open Library Assessment Dashboard & Toolkit Motivation for Assessment: ○ How do you give someone the 'big picture'? ○ How do you deal with data overload? ■ Quality of use and kinds of uses ■ Online and off ■ How much does it really cost? ■ What is the impact? - before you can understand impact, you have to have good data and understand how to use it. ○ How do you make assessment useful throughout the organization?
  33. 33. Open Library Assessment Dashboard & Toolkit The Project 1. Research and scoping the work required to develop an open Library Assessment Dashboard and Toolkit to give library leaders and managers a single-page overview of all of their library data, to better facilitate data-driven decision making 2. Developing resource and cost estimates for a 2017 kickoff
  34. 34. Open Library Assessment Dashboard & Toolkit Tools & Methods: ● Create an open, customizable framework of tools, data sources and standard metrics / key performance indicators (KPIs) Users targeted: ● Senior library managers and directors
  35. 35. Dashboards from Other Industries Sources: http://www.informationbuilders.com/products/intelligence https://www.johndaniel.com/index.php/solutions/industry-segment/healthcare/
  36. 36. Open Library Assessment Dashboard & Toolkit
                 | Strategic (Quarterly to Annual)                                  | Managerial (Daily to Monthly)
                 | Metrics / KPIs                           | Data Sources          | Metrics / KPIs                           | Data Sources
  Service Area 1 | Key Management Questions >> Key formulas | Top 3-5 Data Sources  | Key Management Questions >> Key formulas | Top 3-5 Data Sources
  Service Area 2 | Key Management Questions >> Key formulas | Top 3-5 Data Sources  | Key Management Questions >> Key formulas | Top 3-5 Data Sources
  etc.           | Key Management Questions >> Key formulas | Top 3-5 Data Sources  | Key Management Questions >> Key formulas | Top 3-5 Data Sources
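One way to make the framework above concrete is to treat it as data: each service area carries strategic and managerial KPIs, and each KPI ties a management question to a formula and its top data sources. The sketch below is a hypothetical Python representation; the class names, example KPI, and data sources are assumptions, not part of the UC Davis project.

```python
# Hedged sketch: the dashboard framework expressed as a small data model.
from dataclasses import dataclass, field

@dataclass
class KPI:
    question: str            # key management question
    formula: str             # how the metric is calculated
    data_sources: list[str]  # top 3-5 data sources
    cadence: str             # "strategic" (quarterly-annual) or "managerial" (daily-monthly)

@dataclass
class ServiceArea:
    name: str
    kpis: list[KPI] = field(default_factory=list)

collections_area = ServiceArea(
    name="Collections",
    kpis=[
        KPI(
            question="Are users finding and using what we license?",
            formula="content units downloaded / population served (per capita)",
            data_sources=["COUNTER reports", "link resolver logs", "registrar FTE counts"],
            cadence="managerial",
        ),
    ],
)
```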
  37. 37. ISO 11620: Library Performance Indicators ● Opening hours compared to demand ● Availability of required titles ● Percentage of rejected sessions ● Ratio of requests received to requests sent out in interlibrary lending ● Staff per capita ● User satisfaction ● Library visits per capita ● Seat occupancy rate ● Number of content units downloaded per capita ● Collection use (turnover) ● Percentage of stock not used ● Loans per capita ● Percentage of loans to external users ● Reference questions per capita ● Ratio of acquisitions costs to staff costs ● Acquisition speed ● Lending speed ● Interlibrary loan speed ● Percentage of acquisitions expenditure spent on the electronic collection ● And more... https://www.iso.org/obp/ui/#iso:std:iso:11620:ed-3:v1:en
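Most of these indicators are simple ratios over counts the library already collects. The sketch below computes two of them, loans per capita and collection turnover, with placeholder numbers rather than figures from the standard or any particular library.

```python
# Sketch: two ISO 11620-style indicators computed from basic counts (placeholder data).
def loans_per_capita(total_loans: int, population_served: int) -> float:
    """Loans per member of the population served."""
    return total_loans / population_served

def collection_turnover(total_loans: int, documents_in_collection: int) -> float:
    """Collection use (turnover): loans relative to the size of the loanable collection."""
    return total_loans / documents_in_collection

print(loans_per_capita(182_000, 36_000))        # ~5.1 loans per capita
print(collection_turnover(182_000, 1_400_000))  # ~0.13 turns per year
```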
  38. 38. ● Currently engaged in user needs research with multiple libraries at varying stages of assessment ● Discussion! ○ What are you measuring in your library today? ○ What would you like to be measuring? Open Library Assessment Dashboard & Toolkit
  39. 39. More Information & Assessment Resources ● Resource Discovery @ The University of Oxford Final Report, December 2015 ● University of California, Davis, Open Library Assessment Dashboard & Toolkit Scoping Project ● Toolkit for the Impact of Digitised Scholarly Resources (TIDSR) ● Surveying the Landscape: Use and Usability Assessment of Digital Libraries. Digital Library Federation (DLF) Assessment Interest Group (AIG) User Studies Working Group White Paper, November 2015 ● NISO Link Origin Tracking initiative
  40. 40. Questions? Comments? Please be in touch! Athenaeum21 www.athenaeum21.com Christine Madsen (UK) madsen@athenaeum21.com @mccarthymadsen Megan Hurst (US) hurst@athenaeum21.com @MHzUX
