
Evaluating the Library

MLA presentation that introduces two different examples of how to plan and implement ongoing library evaluation; both start with the library's strategic plan. The first example is from the Carnegie Public Library in Red Lodge and the second is from Montana State University. They are two very different models using different tools and methods, but both libraries rely on an ongoing methodology that provides formative assessment data to facilitate continuous improvement.

Originally presented at the Montana Library Association conference, April 13, 2018, Bozeman, MT

Panel presenters:
Joann Flick, Montana State Library
Jodie Moore, Carnegie Public Library, Red Lodge MT
Stef Johnson, Butte-Silver Bow Public Library, Butte MT
David Swedman, Alyssa Heller & Mary Anne Hansen, Montana State University Renne Library

Published in: Education

Evaluating the Library

  1. EVALUATING THE LIBRARY • MONTANA LIBRARY ASSOCIATION ANNUAL CONFERENCE • April 13, 2018
  2. Today • Your panel: Mary Anne Hansen, Jodie Moore, Jo Flick, Stef Johnson, David Swedman, Alyssa Heller • How do you quantify success? Photo: Jo Flick, MT State Library
  3. Focus on ASSESSMENT • Assessment: ongoing, formative • Evaluation: conclusive, summative • STRATEGIC PLAN. Photo: Pixabay CC0
  4. Measure • Strategic Plan • Objectives/Outcomes/Goals for activities & programs • Assessments: “What does success look like?” Photo: Pixabay CC0
  5. RESULTS • Collect: Data, Observations • Analyze • Continual Improvement • Tell the story. Photo: Pixabay CC0
  6. Start with what you already have… • Checkouts • Registration/attendance • Requests for information • Library card sign-ups • Online stats – Google Analytics • MSU’s data: NVivo software • Published research, articles • Observation • Spontaneous feedback • Suggestion box (see the data roll-up sketch after the slide list). Photo: Pixabay CC0
  7. Let’s get real… • ONGOING ASSESSMENT: Jodie, Carnegie Library, Red Lodge, MT • Board reports to continually monitor progress • Putting the Strategic Plan to work • CRUNCHING THE NUMBERS: accountability – Mary Anne, Alyssa, David, Montana State University, Bozeman, MT • Data collection > continuous improvement • Stakeholders see proof-of-performance • NVivo to analyze text
  8. Red Lodge Carnegie Library • Background on our community and library • Our strategic plan: timeline FY16 – FY19, five service goals, planned maintenance • Annual operational goals for trustees and staff: include measurable outcomes for success; reflect current realities of budget, staff, trustees • Review: monthly review of operational goals, annual review of strategic plan • Assessing surveys using Survey Monkey: Text Analysis turns qualitative data into quantifiable responses; requires a Standard (or higher) plan
  9. Survey Monkey Fun • Plans & Pricing • Analyzing Responses • Question Summaries • Chart Type • Text Analysis: Categorizing (AKA coding or tagging) • Word Clouds (see the tagging sketch after the slide list)
  10. Mission & Strategy Map
      Mission: We support and advance teaching, learning, and research for Montana State University and the people of Montana by providing access to information and knowledge.
      User Perspective – Our users will… • Interact with a welcoming and responsive physical and digital library environment • Experience convenient access to extensive collections • Achieve positive learning outcomes and develop their information literacy abilities
      Internal Processes Perspective – We will… • Create useful, dynamic, and accessible digital and physical spaces • Expand, diversify, and adapt our collections and services • Grow an engaged library community through marketing and outreach • Collaborate with researchers to produce digital research and scholarship
      Learning & Growth Perspective – By working in ways that… • Build and sustain an organizational culture of evidence-based decision-making and assessment • Cultivate a climate of engagement and empowerment in which all employees are valued • Foster an organizational culture that supports ongoing professional growth
      Financial Perspective – While managing our finances in ways that… • Develop financial resources to implement the building Master Plan • Improve classified staff compensation to the representative peer market average
  11. Why Assess? • To demonstrate our value and relevance • To determine our users’ wants and needs • To give us a framework for improvement • To help us realize our mission and values • To challenge assumptions and biases
  12. How to Assess • Quantitative analysis: statistical analysis, survey techniques and questionnaire design, correlational research, experimental research, data visualization, action research • Qualitative analysis: content analysis, ethnography, interviews, focus groups, observation, action research • UX design: information architecture, content strategy, journey mapping, service blueprinting, usability testing, user personas, prototyping, wireframing, heuristic evaluation/expert review
  13. TCLI 2017 Participants
  14. TCLI Agenda and Evaluation Transcriptions
  15. Methods and Tools - NVivo • Qualitative coding via NVivo • “NVivo is software that supports qualitative and mixed methods research. It’s designed to help you organize, analyze and find insights in unstructured, or qualitative data like: interviews, open-ended survey responses, articles, social media and web content. NVivo gives you a place to organize and manage your material so that you can start to find insights in your data. It also provides tools that allow you to ask questions of your data in a more efficient way.” – QSR International’s “What is NVivo?”
  16. [Chart: TCLI themes coded in NVivo, with labels including Cultural Programming, Prayer or Blessing, General Native American E…, Professional Knowledge and Skills, Programming, Arch…, Library…, Poster Sessions, Health Information, Technology (Ebooks, Mukurtu, Electroni…, Digitization, General Technology), General Librarianship, Tribal College Librarianship, Burning…, TC Relevant Skills, General TCL]
  17. [Chart: Program Topic Suggestions. Coded topics include Cultural Programming, Specific Organization, Financial Aid, Self-care, Adult Program, Previous Topic Follow-up, Project Management, Security, Building/Layout Planning, Staffing/Management, Collection Development, Crafts/Hands-on Projects, Field Trip, Government Program, Accreditation, Archives/Special Collections, Library Trends/Future Steps, Youth Program, Community, Digitization, Student Education, Panel/Round Table Discussion, Programming, Grant Writing, Assessment, Information Literacy, Non-Western/Indigenous Methods, Technology, Other]
  18. [Chart: Agenda Compilation - Programming. Themes include Cultural Programming, Professional Knowledge/Skills, Programming]
  19. Coded to Multiple Themes • Cultural Programming • General • Professional Knowledge and Skills • Programming • Technology • Digitization
  20. Same data, different view
  21. [Chart: Evaluation Compilation - Direct Benefits. Categories include New Ways of Thinking, Networking, Rejuvenation, Professional Development, Books/Physical Takeaways]
  22. [Chart: Evaluation Compilation - Most Liked Aspects. Categories include Field Trip, Presentations/Sessions, Networking, Sharing Professional Expertise]
  23. Commons Assessment Group 2.0 • Goals: • Develop a set of recommendations for the MSU Library’s Research Commons based on user feedback • Promote a culture of evidence-based decision making
  24. Methods and Tools - LibQUAL+ • “LibQUAL+ is a web-based survey offered by the Association of Research Libraries that helps libraries assess and improve library services, change organizational culture, and market the library. The survey instrument measures library users' minimum, perceived, and desired levels of service quality across three dimensions: Affect of Service, Information Control, and Library as Place.” – Association of Research Libraries’ “What is LibQUAL+®?” • MSU’s 2016 LibQUAL+ results: 1,049 respondents (roughly a 26% response rate) from a randomly selected sample of ~4,000 students, faculty, and staff; 422 comments
  25. Methods and Tools - Qualtrics • Survey distribution via Qualtrics, a powerful online survey tool for building, distributing, and analyzing surveys • Tutor surveys: 43 respondents, 5 questions, 215 individual responses
  26. Methods and Tools - Coding • A group of 8 faculty, staff, and students read through the 422 LibQUAL+ comments and coded them using Brown University’s Methodology for Coding Qualitative Data (User Comments) • The 173 comments coded as “Suggestion” and/or “Negative” were combined with the 215 tutor responses, then coded within NVivo according to the following parameters (see the coding sketch after the slide list): • “Awareness” – does the comment indicate a lack of awareness of library spaces and/or services? • “Suggestion” – does the comment provide a suggestion related to something that the library can change? • “Space” – does the comment focus on the library’s spaces? • “Service” – does the comment focus on the library’s services? • “Use” – does the comment indicate using a specific library space or service? • “Policy” – does the comment relate to a library policy?
  28. Analysis and Recommendations
  29. 1. Implement a Campus-Wide Communications Campaign to Increase General Awareness of Library Spaces and Services
  30. 2. Enforce Current Quiet Policies, and Possibly Expand Quiet Areas
  31. 3. Increase Group Study Room Capacity
  32. 4. Increase Number of Power Outlets Available to Users
  33. 5. Improve our Users’ Understanding of Our Online Offerings and Resources
  34. Takeaways • Through this process, we created a rich database of information that will help us respond to questions about library usage while also giving us the ability to identify and address some immediate “low-hanging fruit” to improve our library’s user experience • This database, and the process used to create it, have given us the ability to support our stakeholders in making evidence-based decisions on matters both anticipated and unanticipated
  35. Photo: Pixabay CC0
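
The “start with what you already have” idea on slide 6 can be as simple as rolling up counts the library already tracks. The sketch below is a minimal, hypothetical example, assuming a CSV export named monthly_stats.csv with date, checkouts, and program_attendance columns (all assumed names, not from the presentation); it totals each measure by year so the figures can go straight into a board report.

```python
import csv
from collections import defaultdict

def summarize_by_year(path="monthly_stats.csv"):
    """Total checkouts and program attendance per year from a monthly CSV export.

    Assumes columns: date (YYYY-MM), checkouts, program_attendance (hypothetical).
    """
    totals = defaultdict(lambda: {"checkouts": 0, "program_attendance": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            year = row["date"][:4]  # group by the year portion of the date
            totals[year]["checkouts"] += int(row["checkouts"])
            totals[year]["program_attendance"] += int(row["program_attendance"])
    return dict(totals)

if __name__ == "__main__":
    for year, counts in sorted(summarize_by_year().items()):
        print(year, counts)
```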
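Slide 9 mentions Survey Monkey’s Text Analysis feature, which categorizes (codes/tags) open-ended responses and builds word clouds. The same idea can be approximated outside Survey Monkey; the sketch below is an illustrative keyword-based tagger plus a word-frequency count, where the categories, keywords, and sample responses are invented for the example rather than taken from the presentation.

```python
import re
from collections import Counter

# Hypothetical tag rules: a tag applies when any of its keywords appears in a response.
TAG_KEYWORDS = {
    "collections": ["book", "ebook", "dvd", "audiobook"],
    "programs": ["storytime", "class", "workshop", "event"],
    "facility": ["hours", "parking", "quiet", "space"],
}

def tag_response(text):
    """Return the set of tags whose keywords appear in a survey response."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return {tag for tag, keys in TAG_KEYWORDS.items() if words & set(keys)}

def word_frequencies(responses):
    """Count words across all responses, e.g. as raw input for a word cloud."""
    counts = Counter()
    for text in responses:
        counts.update(re.findall(r"[a-z']+", text.lower()))
    return counts

responses = [
    "More evening storytime events please",
    "Love the new ebook collection",
    "The quiet study space upstairs is great",
]
for r in responses:
    print(tag_response(r), "-", r)
print(word_frequencies(responses).most_common(5))
```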
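The six yes/no parameters on slide 26 amount to a codebook: each comment receives a set of binary codes. The actual coding was done by people in NVivo; the sketch below only illustrates the shape of that work, a hypothetical first-pass keyword screen that flags comments against the same six codes so a human coder can review them. The hint phrases and sample comments are invented for the example.

```python
# Illustrative first-pass screen for the six codes listed on slide 26.
# Real coding was done manually in NVivo; hint phrases here are invented.
CODE_HINTS = {
    "Awareness": ["didn't know", "did not know", "wasn't aware", "no idea"],
    "Suggestion": ["should", "could", "please add", "it would be nice"],
    "Space": ["room", "floor", "seating", "noise", "outlet"],
    "Service": ["staff", "interlibrary loan", "checkout", "tutoring"],
    "Use": ["i use", "i used", "i study", "i reserve"],
    "Policy": ["policy", "hours", "fines", "food"],
}

def screen_comment(comment):
    """Return the codes whose hint phrases appear in the comment (for human review)."""
    text = comment.lower()
    return [code for code, hints in CODE_HINTS.items()
            if any(hint in text for hint in hints)]

comments = [
    "I didn't know the library had group study rooms you could reserve.",
    "Please add more outlets on the second floor.",
]
for c in comments:
    print(screen_comment(c), "-", c)
```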
