
“The library has a website?” User-Centered Library Assessment


ALAO 2014 Assessment Special Interest Group Spring Workshop, presented at the OCLC Conference Center, Dublin, OH, April 24, 2014.


  1. “The library has a website?” User-Centered Library Assessment. Lynn Silipigni Connaway, Ph.D., Senior Research Scientist, OCLC Research; Vice-Chair, ACRL Value of Academic Libraries Committee. Academic Library Association of Ohio’s Assessment Special Interest Group Spring Workshop, Thursday, April 24, 2014
  2. The Road Travelled
  3. Value of Academic Libraries Report. Freely available: http://acrl.org/value
  4. Themes from Summits • Accountability • Unified approach • Student learning/success • Evidence-based
  5. Value of Academic Libraries Initiative • Keep up-to-date: Value of Academic Libraries Blog; Valueography • Outreach & collaboration: presentations (e.g., CNI, LAC, & Northumbria); ACRL Liaisons Assembly; assessment management systems • Under discussion: librarian competencies; research agenda; library poster
  6. ACRL Plan for Excellence. Value of Academic Libraries Goal: Academic libraries demonstrate alignment with and impact on institutional outcomes. Objectives: • Leverage existing research to articulate and promote the value of academic and research libraries • Undertake and support new research that builds on the research agenda in The Value of Academic Libraries: A Comprehensive Review and Report • Influence national conversations and activities focused on the value of higher education • Develop and deliver responsive professional development programs that build the skills and capacity for leadership and local data-informed and evidence-based advocacy
  7. Cycle of Assessment [focused on] Library Value: 1. Defining outcome(s); 2. Setting criteria; 3. Performing action(s) & 4. Gathering evidence; 5. Analyzing evidence; 6. Planning change. Timeline: Planning (June-July 2013); Acting (August-December 2013); Reflecting (January-February 2014); Sharing (March-May 2014)
  8. Recommendations  Define outcomes.  Create or adopt systems for assessment management.  Determine what libraries enable students, faculty, student affairs professionals, administrators, and staff to do.  Develop systems to collect data on individual library user behavior while maintaining privacy.  Record and increase library impact on student enrollment.  Link libraries to improved student retention and graduation rates.  Review course content, readings, reserves, and assignments.  Document and augment library advancement of student experiences, attitudes, and perceptions of quality.  Track and increase library contributions to faculty research productivity.  Investigate library impact on faculty grant proposals and funding as a means of generating institutional income.  Demonstrate and improve library support of faculty teaching.  Create library assessment plans.  Promote and participate in professional development.  Mobilize library administrators.  Leverage library professional associations.
  9. Recommendations: 1. Increase the profession’s understanding of library value in relation to various dimensions of student learning and success. 2. Articulate and promote the importance of assessment competencies necessary for documenting and communicating library impact on student learning and success. 3. Create professional development opportunities for librarians to learn how to initiate and design assessment that demonstrates the library’s contributions to advancing institutional mission and strategic goals.
  10. Recommendations, cont.: 4. Expand partnerships for assessment activities with higher education constituent groups and related stakeholders. 5. Integrate the use of existing ACRL resources with library value initiatives.
  11. Assessment in Action Goals: professional competencies; collaborative relationships; approaches, strategies, practices
  12. Team Approach: institutional researcher/assessment officer; faculty member; librarian leader
  13. AiA 2013 Institutional Teams
  14. Library Factors Examined • Instruction: games, single/multiple session, course-embedded, tutorials • Reference • Physical space • Discovery: institutional web, resource guides • Collections • Personnel
  15. Variety of Tools/Methods • Survey • Interviews • Focus group(s) • Observation • Pre/post test • Rubric • Student portfolio • Research paper/project • Other class assignment • Test scores • GPA • Degree completion rate • Retention rate
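
Several of the quantitative measures listed here (pre/post tests, test scores) reduce to simple descriptive statistics. A minimal sketch, not from the presentation, of summarizing paired pre/post test scores from a single instruction session; all score values are hypothetical:

```python
# Summarize paired pre/post test scores (hypothetical data, for illustration only).
from statistics import mean, stdev

pre_scores = [55, 62, 48, 70, 66, 58, 61, 73]   # hypothetical pre-test scores
post_scores = [68, 71, 60, 78, 70, 69, 72, 80]  # same students, after the session

# Per-student gain keeps the pairing, unlike comparing group averages alone.
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]

print(f"Mean pre-test score:  {mean(pre_scores):.1f}")
print(f"Mean post-test score: {mean(post_scores):.1f}")
print(f"Mean gain: {mean(gains):.1f} (SD {stdev(gains):.1f}, n={len(gains)})")
```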
  16. Some Initial Questions • What is your definition of assessment? • What comes to mind when you hear the term “assessment”? • What benefits do you see for assessment? • What are your concerns?
  17. Assessment Defined: the process of defining, selecting, designing, collecting, analyzing, interpreting, and using information to increase service/program effectiveness
  18. Why Assessment? • Answers questions: What do users/stakeholders want & need? How can services/programs better meet needs? Is what we do working? Could we do better? What are problem areas? • Traditional statistics don’t tell the whole story
  19. Importance of Assessment: “Librarians are increasingly called upon to document and articulate the value of academic and research libraries and their contribution to institutional mission and goals.” (ACRL Value of Academic Libraries, 2010, p. 6)
  20. Formal vs. Informal Assessment • Formal: data-driven; evidence-based; accepted methods; recognized as rigorous • Informal: anecdotes & casual observation; once the norm, no longer acceptable
  21. Outcomes Assessment Basics • Outcomes: “The ways in which library users are changed as a result of their contact with the library’s resources and programs” (ALA, 1998) • “Libraries cannot demonstrate institutional value to maximum effect until they define outcomes of institutional relevance and then measure the degree to which they attain them” (Kaufman & Watstein, 2008, p. 227)
  22. Steps in Assessment Process • Why? Identify purpose • Who? Identify team • How? Choose model/approach/method • Commit • Training/planning
  23. Outputs & Inputs • Outputs: quantify the work done; don’t relate factors to overall effectiveness • Inputs: raw materials; measured against standards; insufficient for overall assessment
  24. Principles for Applying Outcomes Assessment • Center on users: assess changes in service/resource use • Relate to inputs: identify “best practices” • Use a variety of methods to corroborate conclusions: choose a small number of outcomes; need not address every aspect of service • Adopt a continuous process
  25. Examples of Outcomes • User matches information need to information resources • User can organize an effective search strategy • User effectively searches online catalog & retrieves relevant resources • User can find appropriate resources
  26. What We Know About Assessment • An ongoing process to understand & improve service • Librarians are busy with day-to-day work, and assessment can become another burden • Can build on what has already been done or is known
  27. “One size fits none!” (Lynn’s Mom)
  28. Survey Research: “…to look at or to see over or beyond…allows one to generalize from a smaller group to a larger group” (Connaway & Powell, 2010, p. 107)
  29. Survey Research: Advantages (Hernon & Altman, 1998) • Explores many aspects of service • Demographic information • Online surveys (e.g., SurveyMonkey) provide statistical analysis • Controlled sampling • High response rates possible • Data reflect characteristics & opinions of respondents • Cost-effective • Can be self-administered • Survey large numbers
  30. Survey Research: Disadvantages (Hernon & Altman, 1998) • Produces a snapshot of the situation • May be time-consuming to analyze & interpret results • Produces self-reported data • Data lack the depth of interviewing • A high return rate can be difficult to achieve
  31. Design Issues • Paper or online (e.g., SurveyMonkey) • Consider the order of questions: demographic questions first • Instructions: be specific; introduce sections • Keep it simple • Pre-test!
  32. Survey Research: Interpreting Results • Objectively analyze all data • Interpret results with an appropriate level of precision • Express a proper degree of caution about conclusions • Use data as input for outcome measures • Consider a longitudinal study; compare results over time • Qualitative data require special attention
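
One concrete way to interpret results with an appropriate level of precision is to report survey proportions with a confidence interval rather than as bare percentages. A minimal sketch, assuming a simple random sample and hypothetical counts (the slides do not prescribe any particular calculation):

```python
# 95% confidence interval for a survey proportion (normal approximation).
# All counts are hypothetical, for illustration only.
import math

respondents = 240   # completed surveys (hypothetical)
satisfied = 168     # respondents rating the service "satisfied" or better

p = satisfied / respondents
se = math.sqrt(p * (1 - p) / respondents)  # standard error of the proportion
margin = 1.96 * se                         # z-value for a 95% interval

print(f"{p:.1%} satisfied, 95% CI {p - margin:.1%} to {p + margin:.1%} "
      f"(n={respondents}; assumes a simple random sample)")
```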
  33. Example: Seeking Synchronicity CIT: VRS Potential User Online Survey Questions (Connaway & Radford, 2011) a. Think about one experience in which you felt you did (or did not) achieve a positive result after seeking library reference services in any format. b. Describe each interaction. c. Identify the factors that made these interactions positive or negative.
  34. Interviews: a conversation involving two or more people, guided by a predetermined purpose (Lederman, 1996)
  35. Types of Interviews • Structured • Semi-structured • Formats: individual (face-to-face, telephone, Skype); focus group
  36. Types of Questions • Open: “What is it like when you visit the library?” • Directive: “What happened when you asked for help at the reference desk?” • Reflective: “It sounds like you had trouble with the mobile app?” • Closed: “Have I covered everything you wanted to say?”
  37. Interviews: Advantages • Face-to-face interaction • In-depth information • Understand experiences & meanings • Highlight the individual’s voice • Preliminary information to “triangulate” • Control sampling: include underrepresented groups • Greater range of topics
  38. Interviews: Disadvantages • Time factors: varies by number & depth; staff-intensive • Cost factors: the higher the number, the higher the cost • Additional factors: self-reported data; errors in note-taking possible
  39. Example: Digital Visitors & Residents Participant Questions (White & Connaway, 2011-2012) 1. Describe the things you enjoy doing with technology and the web each week. 2. Think of the ways you have used technology and the web for your studies. Describe a typical week. 3. Think about the next stage of your education. Tell me what you think this will be like.
  40. Focus Group Interviews: “…interview of a group of 8 to 12 people representing some target group and centered on a single topic.” (Zweizig, Johnson, Robbins, & Besant, 1996)
  41. Conducting Focus Group Interviews • Obtain permission to use the information, and to record if taping, for a report and/or publication • Enlist a note-taker or, if recording, check equipment and bring a back-up • Begin by creating a safe climate
  42. WorldCat.org Study Recruitment (Connaway & Wakeling, 2012) • Difficult: little data on the user base; participants across 3 continents; hard-to-reach populations (historians, antiquarian booksellers) • Non-probabilistic methods: convenience sampling; snowball sampling
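
Snowball sampling recruits outward from a few seed participants, each naming further contacts, which is why it suits hard-to-reach populations with no sampling frame. A toy sketch of that recruitment logic; the referral network and participant names are invented, not study data:

```python
# Snowball sampling as breadth-first recruitment over a referral network.
# The network below is hypothetical, for illustration only.
from collections import deque

referrals = {
    "historian_a": ["historian_b", "bookseller_a"],
    "historian_b": ["historian_c"],
    "bookseller_a": ["bookseller_b", "historian_a"],
    "historian_c": [],
    "bookseller_b": [],
}

def snowball(seeds, max_waves=3):
    """Each wave contacts everyone the previous wave named, skipping repeats."""
    recruited = set(seeds)
    queue = deque((s, 0) for s in seeds)
    while queue:
        person, wave = queue.popleft()
        if wave >= max_waves:
            continue
        for contact in referrals.get(person, []):
            if contact not in recruited:
                recruited.add(contact)
                queue.append((contact, wave + 1))
    return recruited

print(snowball(["historian_a"]))  # the sample grows outward from one seed
```

Because recruitment follows social ties, the resulting sample is non-probabilistic: findings describe the participants reached, and cannot be generalized the way a random sample’s can.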
  43. Example: WorldCat.org Focus Group Interview Questions. “Tell us about your experiences with WorldCat.org.” A broad introductory question to reveal the extent to which users have engaged with WorldCat.org, and the information-seeking contexts within which they use the system. (Connaway & Wakeling, 2012, p. 7)
  44. Structured Observations: systematic description focusing on designated aspects of behavior to test causal hypotheses (Connaway & Powell, 2010)
  45. Structured Observations: A Guide (Connaway & Powell, 2010) • Develop observational categories: define appropriate, measurable acts; establish the time length of observation; anticipate patterns of phenomena; decide on a frame of reference
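
To make those guidelines concrete: a small sketch of tallying predefined, measurable acts within fixed observation intervals. The behavior categories, interval length, and logged events are all hypothetical, not from Connaway & Powell:

```python
# Tally predefined behavior categories per fixed-length observation interval.
# Categories and the observation log are hypothetical, for illustration only.
from collections import Counter

CATEGORIES = {"catalog_search", "asks_staff", "uses_quiet_space", "group_work"}
INTERVAL_MINUTES = 15

# (minute offset, observed behavior) pairs recorded during one session
log = [(2, "catalog_search"), (9, "asks_staff"), (17, "group_work"),
       (21, "catalog_search"), (34, "uses_quiet_space"), (41, "asks_staff")]

tallies = Counter()
for minute, behavior in log:
    if behavior in CATEGORIES:                 # count only the defined, measurable acts
        interval = minute // INTERVAL_MINUTES  # which observation window this falls in
        tallies[(interval, behavior)] += 1

for (interval, behavior), count in sorted(tallies.items()):
    start = interval * INTERVAL_MINUTES
    print(f"{start:02d}-{start + INTERVAL_MINUTES:02d} min: {behavior} x{count}")
```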
  46. Ethnographic Research: rich description (Connaway & Powell, 2010)
  47. Ethnographic Research (Connaway & Powell, 2010, p. 175; Khoo, Rozaklis, & Hall, 2012) • Incredibly detailed data • Time-consuming: establishing rapport; selecting research participants; transcribing observations & conversations; keeping diaries
  48. Analysis: “summary of observations or data in such a manner that they provide answers to the hypothesis or research questions” (Connaway & Powell, 2010)
  49. Analysis • The collection of data affects the analysis of data • An ongoing process • Feeds back into research design • Theory, model, or hypothesis must grow from data analysis
  50. Data Analysis: Digital Visitors & Residents (White & Connaway, 2011-2012). Codebook (coded in NVivo 10):
      I. Place
         A. Internet: 1. Search engine (a. Google, b. Yahoo); 2. Social media (a. Facebook, b. Twitter, c. YouTube, d. Flickr/image sharing, e. Blogging)
         B. Library: 1. Academic; 2. Public; 3. School (K-12)
         C. Home
         D. School, classroom, computer lab
         E. Other
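
NVivo is interactive software, but the mechanical idea of a codebook, mapping passages of transcript onto a hierarchy of codes, can be illustrated in a few lines. A toy keyword matcher, with code labels adapted from the slide’s codebook; the transcript lines and keyword lists are invented:

```python
# Toy codebook-driven tagging: assign hierarchical codes to transcript lines
# by keyword match. Keywords and transcript text are hypothetical; real
# qualitative coding (e.g., in NVivo) is done by human judgment, not keywords.
CODEBOOK = {
    "Place/Internet/Search engine/Google": ["google"],
    "Place/Internet/Social media/Facebook": ["facebook"],
    "Place/Library/Academic": ["university library", "campus library"],
    "Place/Home": ["at home", "from home"],
}

transcript = [
    "I usually just Google it from home before trying anything else.",
    "For the thesis I worked in the university library most evenings.",
]

for line in transcript:
    lowered = line.lower()
    codes = [code for code, keywords in CODEBOOK.items()
             if any(kw in lowered for kw in keywords)]
    print(f"{codes or ['(uncoded)']}: {line}")
```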
  51. Getting the Right Fit! • What do we know? • Where do we go from here? Use tools & research design to customize the project to fit your assessment needs
  52. References
      ALA/ACRL. 1998. Task Force on Academic Library Outcomes Assessment report. Available: http://www.ala.org/Content/NavigationMenu/ACRL/Publications/White_Papers_and_Reports/Task_Force_on_Academic_Library_Outcomes_Assessment_Report.htm
      Brown, Karen & Malenfant, Kara J. 2012. Connect, Collaborate, and Communicate: A Report from the Value of Academic Libraries Summits. Chicago, IL: Association of College & Research Libraries. http://www.acrl.ala.org/value
      Connaway, Lynn S., Johnson, Debra W., & Searing, Susan. 1997. Online catalogs from the users’ perspective: The use of focus group interviews. College and Research Libraries, 58(5), 403-420.
      Connaway, Lynn S. & Powell, Ronald R. 2010. Basic Research Methods for Librarians (5th ed.). Westport, CT: Libraries Unlimited.
      Connaway, Lynn S. & Radford, Marie L. 2011. Seeking Synchronicity: Revelations and Recommendations for Virtual Reference. Dublin, OH: OCLC Research. Retrieved from http://www.oclc.org/reports/synchronicity/full.pdf
      Connaway, Lynn S. & Wakeling, Simon. 2012. To use or not to use WorldCat.org: An international perspective from different user groups. OCLC internal report.
      Dervin, Brenda, Connaway, Lynn S., & Prabha, Chandra. 2003-2006. Sense-making the information confluence: The whys and hows of college and university user satisficing of information needs. Funded by the Institute of Museum and Library Services (IMLS). http://www.oclc.org/research/activities/past/orprojects/imls/default.htm
      Flanagan, John C. 1954. The critical incident technique. Washington, DC: American Psychological Association.
      Geertz, Clifford. 1973. The Interpretation of Cultures: Selected Essays. New York: Basic Books.
  53. References, cont.
      Hernon, Peter & Altman, Ellen. 1998. Assessing Service Quality: Satisfying the Expectations of Library Customers. Chicago, IL: American Library Association.
      Kaufman, Paula & Watstein, Sarah Barbara. 2008. “Library value (return on investment, ROI) and the challenge of placing a value on public services.” Reference Services Review, 36(3), 226-231.
      Khoo, Michael, Rozaklis, Lily, & Hall, Catherine. 2012. A survey of the use of ethnographic methods in the study of libraries and library users. Library and Information Science Research, 34(2), 82-91.
      Lederman, Linda C. 1996. Asking Questions and Listening to Answers: A Guide to Using Individual, Focus Group, and Debriefing Interviews. Dubuque, IA: Kendall/Hunt.
      Oakleaf, Megan J. 2010. The Value of Academic Libraries: A Comprehensive Research Review and Report. Chicago, IL: Association of College and Research Libraries, American Library Association.
      QSR International. 2011. NVivo 9: Getting Started. Retrieved from http://download.qsrinternational.com/Document/NVivo9/NVivo9-Getting-Started-Guide.pdf
      White, David S. & Connaway, Lynn S. 2011-2012. Visitors and Residents: What Motivates Engagement with the Digital Information Environment. Funded by JISC, OCLC, and Oxford University. Retrieved from http://www.oclc.org/research/activities/vandr/
      Zweizig, Douglas, Johnson, Debra W., Robbins, Jane, & Besant, Michele. 1996. The Tell It! Manual. Chicago, IL: ALA.
  54. Thank You! Lynn Silipigni Connaway, Ph.D., Senior Research Scientist, OCLC Research; Vice-Chair, ACRL Value of Academic Libraries Committee. @LynnConnaway • connawal@oclc.org. ©2014 OCLC. This work is licensed under a Creative Commons Attribution 3.0 Unported License. Suggested attribution: “This work uses content from [presentation title] © OCLC, used under a Creative Commons Attribution license: http://creativecommons.org/licenses/by/3.0/”
