43
Example: WorldCat.org
Focus Group Interview Questions
Tell us about your
experiences with
WorldCat.org.
Broad introductory question
Structured Observations
44
Systematic description focusing on
designated aspects of behavior to test
causal hypotheses
(Connaway & Powell, 2010)
45
Structured Observations: A Guide
• Develop observational
categories
– Define appropriate,
measurable acts
– Establish t...
Ethnographic Research
46
Rich description
(Connaway & Powell, 2010)
Ethnographic research
47
• Incredibly detailed data
• Time consuming
– Establishing rapport
– Selecting research
participa...
Analysis
48
“summary of observations or data in such a
manner that they provide answers to the
hypothesis or research questions”
(Connaway & Powell, 2010)
Analysis
49
• Collection of data
affects analysis of data
• Ongoing process
• Feeds back into
research design
• Theory, mo...
Data Analysis:
Digital Visitors & Residents
• Codebook
• NVivo 10
50
I. Place
A. Internet
1. Search engine
a. Google
b. Ya...
Getting the Right Fit!
• What do we know?
• Where do we go from
here?
Use tools & research
design to customize
project to ...
References
ALA/ACRL. (1998). Task force on academic library outcomes assessment report. Available:
http://www.ala.org/Content/NavigationMenu/ACRL/Publications/White_Papers_and_Reports/Task_Force_on_Academic_Library_Outcomes_Assessment_Report.htm
References
Hernon, P., & Altman, E. (1998). Assessing service quality: Satisfying the expectations of
library customers. Chicago, IL: American Library Association.
Thank You!
©2014 OCLC. This work is licensed under a Creative Commons Attribution 3.0 Unported License. Suggested attribut...
“The library has a website?” User-Centered Library Assessment


ALAO 2014 assessment staff interest group workshop presented at OCLC Conference Center, April 24, 2014, Dublin, OH.

  • ACRL has long been interested in accountability and assessment, since the early 1980s really.

    About 4 years ago, the ACRL Board began to take an intense interest and to make a commitment to helping librarians demonstrate the value they bring to their colleges and universities.

    Partly in response to the general environment and also because ACRL members told us,

    in membership surveys and focus groups, that demonstrating the value of academic libraries is a top issue facing the profession.

    Image from http://butlertwp.org/Images/Road.jpg

    ACRL has responded to these calls for greater accountability with the Value of Academic Libraries Initiative

    ACRL responded in many ways: the Board had a briefing paper, held an invitational meeting, and decided to issue an RFP for a comprehensive review of the literature.
  • Image from http://www.acrl.ala.org/value/?page_id=21

    Value of Academic Libraries: A Comprehensive Research Review and Report (September 2010) A review of the quantitative and qualitative literature, methodologies and best practices currently in place for demonstrating the value of academic libraries, developed for ACRL by Megan Oakleaf of the iSchool at Syracuse University. The primary objective of this comprehensive review is to provide academic librarians with a clearer understanding of what research about the performance of academic libraries already exists, where gaps in this research occur, and to identify the most promising best practices and measures correlated to performance.

    While many of you may be familiar with the report, this is but one piece of a much larger, long term initiative which has its genesis with initial ACRL Board conversations in July 2009.

    ACRL’s VAL initiative is now anchored into the organization through the strategic plan and a standing committee.

    It is intended to build capacity for data-driven advocacy by:
    enabling librarians to apply and
    extend the research base on the value of academic and research libraries.

    Align libraries with institutional outcomes.
    Empower libraries to carry out work locally.
    Create shared knowledge and understanding.
    Contribute to higher education assessment.




  • During the presentations, discussions, and collaborative work, the following four broad themes emerged about the dynamic nature of assessment in higher education:
    Accountability drives higher education discussions.
    A unified approach to institutional assessment is essential.
    Student learning and success are the primary focus of higher education assessment.
    Academic administrators and accreditors seek evidence-based reports of measurable impact.


  • There are numerous activities being undertaken by the ACRL Value of Academic Libraries Committee

    Value of Academic Libraries blog

    Valueography
    The aim is to update and enhance the bibliography in The Value of Academic Libraries: A Comprehensive Review and Report

    Assessment management systems
    We’ve been discussing how libraries can better manage and track assessment data, identifying learning outcomes assessment management systems and encouraging those vendors to exhibit at ACRL13

    Outreach and collaboration
    Presentations: LAC & CNI. In October, we gave an update at the Library Assessment Conference and in December at the Coalition for Networked Information. AALHE & EBLIP: In June we’ll present at the Association for the Assessment of Learning in Higher Education and in July at the Evidence Based Library and Information Practice Conference.

    ACRL liaisons assembly
    We are working with ACRL’s liaison training and development committee to help liaisons to other higher education organizations and disciplinary societies be prepared to talk about the value of academic libraries.

    Under discussion:
    Librarian competencies statement
    Will be informed by the work we do with the grant participants

    Research agenda
    We held the invitational meeting at ALA Annual and expect that the “assessment in action” grant will further inform the creation of a research agenda, as we see the nature of the projects people undertake and the kinds of questions they explore.

    Library poster
    We’ve just begun exploring the idea for a template poster on the value of academic libraries that is customizable so libraries could use it locally to promote their services

  • Value of Academic Libraries

    Goal: Academic libraries demonstrate alignment with and impact on institutional outcomes.
    Objectives:
    Leverage existing research to articulate and promote the value of academic and research libraries.
    Undertake and support new research that builds on the research agenda in The Value of Academic Libraries:  A Comprehensive Review and Report.
    Influence national conversations and activities focused on the value of higher education.
    Develop and deliver responsive professional development programs that build the skills and capacity for leadership and local data-informed and evidence-based advocacy.

    After adopting this in April 2011, the ACRL Board created and charged a Value of Academic Libraries Committee in May 2011.




  • Back to the VAL report for a moment. As I said, it is one piece of a larger initiative, but it has been a key piece in helping ACRL chart our course.

    It made many recommendations, and we felt a key opportunity, where we have great strength, was the recommendation that ACRL:

    create a professional development program to build the profession’s capacity to document, demonstrate, and communicate library value in alignment with the mission and goals of their colleges and universities.

    In order to determine how to shape that program we sought advice from a broad range of stakeholders, as Kara will describe next.

    Image from http://desismartgrid.com/wp-content/uploads/2012/07/keyrecoooo.jpg
  • Given this intensified attention to assessment and accountability issues in the higher education sector, five overarching recommendations for the academic library profession emerged…
  • Image from Microsoft Clip Art
  • Image from http://www.calledtoyouthministry.com/sites/default/files/cck/resource_images/outcomes.jpg

    ALA/ACRL. (1998). Task force on academic library outcomes assessment report. Available: http://www.ala.org/Content/NavigationMenu/ACRL/Publications/White_Papers_and_Reports/Task_Force_on_Academic_Library_Outcomes_Assessment_Report.htm

    Kaufman, Paula, and Sarah Barbara Watstein. “Library Value (Return on Investment, ROI) and the Challenge of Placing a Value on Public Services.” Reference Services Review 36, no. 3 (2008): 226-231. Cited in the Values Report (2010)

    Satisfaction is an outcome.
    So is dissatisfaction.
    Satisfaction is complex.
    Specific service goals should be set.
  • Image from Microsoft Clip Art

    Identify purpose & role of assessment
    What do we want to know?
    What are we going to do with results?
    What outcomes do we want to assess?
  • Image from http://kluwermediationblog.com/files/2013/06/simple-inputs-and-outputs.jpg

    Outputs
    Quantify the work done:
    # books circulated
    # reference questions answered
    Library Use Instruction attendance

    Inputs: $, space, collection, equipment, & staff
  • Image from Microsoft Clip Art
  • Connaway, L. S., & Powell, R. R. (2010). Basic research methods for librarians (5th ed.). Westport, CT: Libraries Unlimited.
  • Image from http://www.esds.co.in/blog/wp-content/uploads/2011/01/Advantages-And-Disadvantages.jpg

    Adapted from Hernon & Altman, 1998
    Hernon, P., & Altman, E. (1998). Assessing service quality: Satisfying the expectations of library customers. Chicago, IL: American Library Association.
  • Image from http://gwynteatro.files.wordpress.com/2012/08/observe-look-magnifying-glass.jpg?w=307&h=205

    Adapted from Hernon & Altman, 1998
    Hernon, P., & Altman, E. (1998). Assessing service quality: Satisfying the expectations of library customers. Chicago, IL: American Library Association.
  • Image from http://www.survey-reviews.net/wp-content/uploads/2012/09/Survey1.png
  • Image from http://hangingtogether.org/wp-content/uploads/2011/12/seeking_synchronicity.jpg

    Connaway, L. S. & Radford, M. L. (2011). Seeking Synchronicity: Revelations and recommendations for virtual reference. Dublin, OH: OCLC Research. Retrieved from http://www.oclc.org/reports/synchronicity/full.pdf
  • Adapted from Lederman, L. C. (1996). Asking questions and listening to answers: A guide to using individual, focus group, and debriefing interviews. Dubuque, Iowa: Kendall/Hunt.
  • Image from http://blogs.voices.com/voxdaily/500x292xmicrophones-for-interviews.jpg.pagespeed.ic.Y-aQN_YNmb.jpg
  • Image from Microsoft Clip Art
  • Image from http://literatureset.com/wp-content/uploads/2014/04/interview1.jpg
  • Image from http://www.usnews.com/dims4/USNEWS/2b439a1/2147483647/thumbnail/418x278/quality/85/?url=%2Fcmsmedia%2F63%2Fbd5043afff94d1dcf3b91126fb3bab%2F16893FE_DA_0526brink.money.jpg
  • Image from http://cdn.pearltrees.com/s/pic/th/digital-visitors-residents-40212558

    White, D. S., & Connaway, L. S. (2011-2012). Visitors and Residents: What motivates engagement with the digital information environment. Funded by JISC, OCLC, and Oxford University. Retrieved from http://www.oclc.org/research/activities/vandr/
  • Connaway, L. S., & Wakeling, S. (2012). To use or not to use Worldcat.org: An international perspective from different user groups. OCLC Internal Report.

    1. Tell us about your experiences with WorldCat.org
    A broad introductory question intended to reveal the extent to which users have engaged with WorldCat.org, and the information-seeking contexts within which they use the system.
    2. Describe a time when you used WorldCat.org that you considered a success.
    Explores the features and functions of WorldCat.org that participants view positively. Requiring participants to discuss a particular instance provides richer data about the range of uses of the system.
    3. Describe a time when using WorldCat.org was unsuccessful – i.e., you did not get what you wanted.
    Explores the features and functions (or lack thereof) of WorldCat.org that participants view negatively.
    4. Think of a time when you did not find what you were looking for, but did find something else of interest or useful to your work?
    Intended to encourage discussion about the role of serendipity in information seeking, and the extent to which WorldCat.org facilitates resource discovery.
    5. If you had a magic wand, what would your ideal WorldCat.org provide?
    Encourages participants to discuss potential improvements to WorldCat.org. The use of the phrase “magic wand” ensures that participants are not restricted by what they believe to be practical or realistic.

    *Questions 2-4 employ the Critical Incident Technique


  • Connaway, L. S., & Powell, R. R. (2010). Basic research methods for librarians (5th ed.). Westport, CT: Libraries Unlimited.
  • Image from http://psychology4a.com/Resear6.gif

    Connaway, L. S. & Powell, R. R. (2010). Basic research methods for librarians (5th ed.). Westport, CN: Libraries Unlimited.

    Recording Observations:

    Rating scales
    “All-or-none” categories
    Checklists of categories
    Video recording
    Useful for overall view of behavior
    Analyze closely later

    Increase Observation Reliability:

    Develop definitions of behavior
    Train observers
    Avoid observer bias
    Take behaviors at face value

  • Connaway, L. S., & Powell, R. R. (2010). Basic research methods for librarians (5th ed.). Westport, CT: Libraries Unlimited.
  • Image from http://elnil.org/Ethnographic_Interview.gif

    Connaway, L. S., & Powell, R. R. (2010). Basic research methods for librarians (5th ed.). Westport, CT: Libraries Unlimited.

    Khoo, M., Rozaklis, L., & Hall, C. (2012). A survey of the use of ethnographic methods in the study of libraries and library users. Library and Information Science Research, 34(2), 82-91.

    Participant/Immersive Observations:

    Move into the setting as deeply as possible
    Disturb participants as little as possible
    Participant observation
    Open, direct interaction and observation as part of the group

    Unobtrusive observation
    Disguised
    Field-based
    Indirect
    Reactive
    Obtrusive observation
    Build rapport with participants
    Informal for conversation
    Formal to reinforce nonjudgmental interaction

    Example of participant observation: the researcher approaches reference librarians as a patron.
  • Connaway, L. S., & Powell, R. R. (2010). Basic research methods for librarians (5th ed.). Westport, CT: Libraries Unlimited.
  • Image from Microsoft Clip Art

    Analyzing Data

    Two approaches
    Ethnographic summary
    Qualitative
    Direct quotations
    “Thick description” (Geertz, 1973, p.6)
    Content analysis approach
    Numerical descriptions of data
    Tallying of mentions of specific factors
    Can be combined

    Connaway, L. S., Johnson, D. W., & Searing, S. (1997). Online catalogs from the users’ perspective: The use of focus group interviews. College and Research Libraries, 58(5), 403-420.
    Dervin, B., Connaway, L. S., & Prabha, C. (2003-2006). Sense-making the information confluence: The whys and hows of college and university user satisficing of information needs. Funded by the Institute of Museum and Library Services (IMLS). http://www.oclc.org/research/activities/past/orprojects/imls/default.htm
    Geertz, C. (1973). The interpretation of cultures: Selected essays. New York: Basic Books.
    Powell, R. R., & Connaway, L. S. (2010). Basic research methods for librarians (5th ed.). Westport, CT: Libraries Unlimited.

    Stability
    Same data coded more than once by same coder with same results
    Accuracy
    Extent to which the classification of text corresponds to a standard
    Reproducibility
    Intercoder reliability
    Same data coded with same results by more than one coder
    Powell, R. R., & Connaway, L. S. (2010). Basic research methods for librarians (5th ed.). Westport, CT: Libraries Unlimited.

    Sense-Making the Information Confluence: Dervin, B., Connaway, L. S., & Prabha, C. (2003-2006). Sense-making the information confluence: The whys and hows of college and university user satisficing of information needs. Funded by the Institute of Museum and Library Services (IMLS).
    Ethnographic & content analysis approach
    Collated & ranked by frequency of response
    Themes within each situation & type of user group
    Quotations used

    WorldCat.org, Connaway, L. S., & Wakeling, S. (2012). To use or not to use Worldcat.org: An international perspective from different user groups. OCLC Internal Report.
    Code-book
    Findings divided by
    Use
    Strengths
    Challenges & weaknesses
    Suggestions for improvement
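The content-analysis side of this workflow (numerical description of the data: tallying mentions of specific factors and ranking them by frequency of response) can be sketched in a few lines of Python. The codes and excerpts below are invented for illustration, not taken from the studies:

```python
from collections import Counter

# Hypothetical coded focus-group excerpts: each excerpt was assigned
# one or more codes by the researchers during transcript coding.
coded_excerpts = [
    {"speaker": "P1", "codes": ["search", "convenience"]},
    {"speaker": "P2", "codes": ["search"]},
    {"speaker": "P3", "codes": ["holdings", "convenience"]},
    {"speaker": "P4", "codes": ["convenience"]},
]

# Tally mentions of each code, then rank by frequency of response.
tally = Counter(code for e in coded_excerpts for code in e["codes"])
for code, n in tally.most_common():
    print(f"{code}: {n}")
```

The ranked tally can then be combined with direct quotations from the highest-frequency codes, as in the ethnographic-summary approach described above.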



  • Screenshot of Digital Visitors and Residents codebook in NVivo 10

    White, D. S., & Connaway, L. S. (2011-2012). Visitors and Residents: What motivates engagement with the digital information environment. Funded by JISC, OCLC, and Oxford University. Retrieved from http://www.oclc.org/research/activities/vandr/

    QSR International. (2011). NVivo 9: Getting started. Retrieved from http://download.qsrinternational.com/Document/NVivo9/NVivo9-Getting-Started-Guide.pdf



    How we used the codebook:

    First: the 3 researchers and research assistant coded the same two transcripts, one US and one UK (both secondary school).
    We met and re-met and coded and re-coded until there was agreement.
    THEN we went our separate ways to code. BUT we still communicate with each other in our coding processes, because of the remote nature of our collaboration.
    Coding is done on paper, for the most part, and then the codes are entered into Word, or scanned and sent to research assistants, who enter the coding into NVivo 9. When the research assistants are entering the coding, they notice discrepancies, document them, and generate a discussion about what was intended and what the final coding should be.
    This continuous communication provides more consistency in coding across the researchers.
    When inter-coder reliability was calculated across all five coders, it averaged 0.6380 for kappa with an agreement percentage of 98%.
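A generic sketch of how such figures are computed: Cohen's kappa and simple percent agreement for two coders over the same items. The study's exact procedure for five coders (e.g., averaging pairwise kappas) is not detailed here, and the example codes are invented:

```python
from collections import Counter

def percent_agreement(codes_a, codes_b):
    """Share of items to which both coders assigned the same code."""
    return sum(a == b for a, b in zip(codes_a, codes_b)) / len(codes_a)

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(codes_a)
    p_observed = percent_agreement(codes_a, codes_b)
    count_a, count_b = Counter(codes_a), Counter(codes_b)
    # Chance agreement: probability both coders pick the same code at random,
    # given each coder's own code frequencies.
    p_chance = sum(count_a[c] * count_b[c] for c in count_a) / (n * n)
    return (p_observed - p_chance) / (1 - p_chance)

coder1 = ["place", "place", "tool", "tool", "place", "tool"]
coder2 = ["place", "place", "tool", "place", "place", "tool"]
print(percent_agreement(coder1, coder2))  # 5 of 6 items agree
print(cohens_kappa(coder1, coder2))
```

Kappa is lower than raw agreement because some agreement is expected by chance, which is why the study reports both figures.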

    NVivo 10:

    Qualitative research software
    Upload documents, PDFs, & videos
    Create nodes & code transcripts
    Merge files
    Queries
    Reports
    Models


  • “The library has a website?” User-Centered Library Assessment

    1. 1. Lynn Silipigni Connaway, Ph.D. Senior Research Scientist OCLC Research Vice-chair, ACRL Value of Academic Libraries Committee Thursday, April 24, 2014 Academic Library Association of Ohio’s Assessment Special Interest Group Spring Workshop "The library has a website?” User-Centered Library Assessment
    2. 2. The Road Travelled 2
    3. 3. Value of Academic Libraries Report 3 Freely available http://acrl.org/value
    4. 4. Themes from Summits 4 Accountability Unified approach Student learning/success Evidence-based
    5. 5. Value of Academic Libraries Initiative 5 Keep Up-to-Date • Value of Academic Libraries Blog •Valueography Outreach & Collaboration • Presentations (e.g. CNI, LAC, & Northumbria) • ACRL Liaisons Assembly Assessment Management Systems Under Discussion • Librarian Competencies • Research agenda • Library Poster
    6. 6. ACRL Plan for Excellence 6 Value of Academic Libraries Goal: Academic libraries demonstrate alignment with and impact on institutional outcomes. Objectives: • Leverage existing research to articulate and promote the value of academic and research libraries. • Undertake and support new research that builds on the research agenda in The Value of Academic Libraries: A Comprehensive Review and Report. • Influence national conversations and activities focused on the value of higher education. • Develop and deliver responsive professional development programs that build the skills and capacity for leadership and local data-informed and evidence- based advocacy.
    7. 7. 7 1. Defining Outcome(s) 2. Setting Criteria 3. Performing Action(s) & 4. Gathering Evidence 5. Analyzing Evidence 6. Planning Change Cycle of Assessment [focused on] Library Value ACTING August- December 2013 SHARING March-May 2014 January - February 2014 REFLECTING PLANNING June- July 2013
    8. 8. Recommendations 8 • Define outcomes • Create or adopt systems for assessment management • Determine what libraries enable students, faculty, student affairs professionals, administrators, and staff to do. • Develop systems to collect data on individual library user behavior, while maintaining privacy. • Record and increase library impact on student enrollment. • Link libraries to improved student retention and graduation rates. • Review course content, readings, reserves, and assignments. • Document and augment library advancement of student experiences, attitudes, and perceptions of quality. • Track and increase library contributions to faculty research productivity. • Investigate library impact on faculty grant proposals and funding, a means of generating institutional income. • Demonstrate and improve library support of faculty teaching. • Create library assessment plans. • Promote and participate in professional development. • Mobilize library administrators. • Leverage library professional associations.
    9. 9. Recommendations 9 1. Increase the profession’s understanding of library value in relation to various dimensions of student learning and success 2. Articulate and promote the importance of assessment competencies necessary for documenting and communicating library impact on student learning and success. 3. Create professional development opportunities for librarians to learn how to initiate and design assessment that demonstrates the library’s contributions to advancing institutional mission and strategic goals.
    10. 10. Recommendations cont. 4. Expand partnerships for assessment activities with higher education constituent groups and related stakeholders. 5. Integrate the use of existing ACRL resources with library value initiatives. 10
    11. 11. Assessment in Action Goals 11 Professional Competencies Collaborative Relationships Approaches, Strategies, Practices
    12. 12. 12 Institutional Researcher/ Assessment Officer Faculty Member Librarian Leader TeamApproach
    13. 13. AiA 2013 Institutional Teams 13
    14. 14. Library Factors Examined • instruction: games, single/multiple session, course embedded, tutorials • reference • physical space • discovery: institutional web, resource guides • collections • personnel 14
    15. 15. Variety of Tools/Methods 15 • survey • interviews • focus group(s) • observation • pre/post test • rubric • student portfolio • research paper/project • other class assignment • test scores • GPA • degree completion rate • retention rate
    16. 16. • What is your definition of assessment? • What comes to mind when you hear the term “assessment”? • What benefits do you see for assessment? • What are your concerns? Some Initial Questions 16
    17. 17. Process of… – Defining – Selecting – Designing – Collecting – Analyzing – Interpreting – Using information to increase service/program effectiveness Interpreting Analyzing Collecting Assessment Defined 17
    18. 18. • Answers questions: – What do users/stakeholders want & need? – How can services/programs better meet needs? – Is what we do working? – Could we do better? – What are problem areas? • Traditional stats don’t tell whole story Why Assessment? 18
    19. 19. Importance of Assessment “Librarians are increasingly called upon to document and articulate the value of academic and research libraries and their contribution to institutional mission and goals.” (ACRL Value of Academic Libraries, 2010, p. 6) 19
    20. 20. Formal vs. Informal Assessment • Formal Assessment – Data driven – Evidence-based – Accepted methods – Recognized as rigorous • Informal Assessment – Anecdotes & casual observation – Used to be norm – No longer acceptable 20
    21. 21. Outcomes Assessment Basics • Outcomes: “The ways in which library users are changed as a result of their contact with the library’s resources and programs” (ALA, 1998). • “Libraries cannot demonstrate institutional value to maximum effect until they define outcomes of institutional relevance and then measure the degree to which they attain them” (Kaufman & Watstein, 2008, p. 227). 21
    22. 22. Steps in Assessment Process • Why? Identify purpose • Who? Identify team • How? Choose model/approach/method • Commit • Training/planning 22
    23. 23. Outputs & Inputs • Outputs – Quantify the work done – Don’t relate factors to overall effectiveness • Inputs – Raw materials – Measured against standards – Insufficient for overall assessment 23
    24. 24. Principles for Applying Outcomes Assessment • Center on users – Assess changes in service/resources use • Relate to inputs - identify “best practices” • Use variety of methods to corroborate conclusions – Choose small number of outcomes – Need not address every aspect of service • Adopt continuous process 24
    25. 25. Examples of Outcomes • User matches information need to information resources • User can organize an effective search strategy • User effectively searches online catalog & retrieves relevant resources • User can find appropriate resources 25
    26. 26. What We Know About Assessment • Ongoing process to understand & improve service • Librarians are busy with day-to-day work & assessment can become another burden • Can build on what has already been done or is known 26
    27. 27. “One size fits none!” (Lynn’s Mom) 27
    28. 28. Survey Research “…to look at or to see over or beyond…allows one to generalize from a smaller group to a larger group” 28 (Connaway & Powell, 2010, p. 107)
    29. 29. • Explores many aspects of service • Demographic information • Online surveys (e.g., Survey Monkey) provide statistical analysis • Controlled sampling • High response rates possible • Data reflect characteristics & opinions of respondents • Cost effective • Can be self-administered • Survey large numbers Survey Research: Advantages (Hernon & Altman, 1998) 29
    30. 30. Survey Research: Disadvantages • Produces a snapshot of situation • May be time consuming to analyze & interpret results • Produces self-reported data • Data lack depth of interviewing • High return rate can be difficult (Hernon & Altman, 1998) 30
    31. 31. Design Issues • Paper or Online (e.g., Survey Monkey) • Consider order of questions • Demographic q’s first • Instructions – Be specific – Introduce sections • Keep it simple • Pre-test! 31
    32. 32. 32 Survey Research Interpreting Results • Objectively analyze all data • Interpret results with appropriate level of precision • Express proper degree of caution about conclusions • Use data as input in outcome measures • Consider longitudinal study, compare results over time • Qualitative data requires special attention
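The advice on this slide to interpret results with an appropriate level of precision and a proper degree of caution can be made concrete with a margin-of-error calculation for a survey proportion. A minimal sketch using the normal approximation, with invented numbers:

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and ~95% normal-approximation CI for a proportion."""
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return p, (p - margin, p + margin)

# Hypothetical: 132 of 200 respondents report satisfaction.
p, (low, high) = proportion_ci(132, 200)
print(f"{p:.2f} ({low:.2f}-{high:.2f})")  # prints "0.66 (0.59-0.73)"
```

Reporting the interval rather than the bare percentage expresses the proper caution about conclusions drawn from a sample, and the normal approximation is only reasonable when the sample is not too small.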
    33. 33. 33 Example: Seeking Synchronicity CIT: VRS Potential User Online Survey Questions Think about one experience in which you felt you achieved (or did not achieve) a positive result after seeking library reference services in any format. (Connaway & Radford, 2011) a. Think about one experience in which you felt you did (or did not) achieve a positive result after seeking library reference services in any format. b. Describe each interaction. c. Identify the factors that made these interactions positive or negative.
34. Interviews
Conversation involving two or more people guided by a predetermined purpose (Lederman, 1996)
35. Types of Interviews
• Structured
• Semi-structured
• Formats:
  – Individual
    • Face-to-face
    • Telephone
    • Skype
  – Focus group interviews
36. Types of Questions
• Open: “What is it like when you visit the library?”
• Directive: “What happened when you asked for help at the reference desk?”
• Reflective: “It sounds like you had trouble with the mobile app?”
• Closed: “Have I covered everything you wanted to say?”
37. Interviews: Advantages
• Face-to-face interaction
• In-depth information
• Understand experiences & meanings
• Highlight the individual’s voice
• Preliminary information to “triangulate”
• Control sampling
  – Include underrepresented groups
• Greater range of topics
38. Interviews: Disadvantages
• Time factors
  – Varies by number & depth of interviews
  – Staff intensive
• Cost factors
  – The higher the number, the higher the cost
• Additional factors
  – Self-reported data
  – Errors in note taking possible
39. Example: Digital Visitors & Residents Participant Questions
1. Describe the things you enjoy doing with technology and the web each week.
2. Think of the ways you have used technology and the web for your studies. Describe a typical week.
3. Think about the next stage of your education. Tell me what you think this will be like.
(White & Connaway, 2011-2012)
40. Focus Group Interviews
“…interview of a group of 8 to 12 people representing some target group and centered on a single topic.” (Zweizig, Johnson, Robbins, & Besant, 1996)
41. Conducting Focus Group Interviews
• Obtain permission to use the information & to record
  – For a report and/or publication
• Enlist a note-taker or, if recording, check equipment & bring back-up
• Begin by creating a safe climate
42. WorldCat.org Study Recruitment
• Difficult
  – Little data on the user base
  – Participants across 3 continents
  – Hard-to-reach populations
    • Historians
    • Antiquarian booksellers
• Non-probabilistic methods
  – Convenience sampling
  – Snowball sampling
(Connaway & Wakeling, 2012)
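Snowball sampling, where each recruit nominates further contacts, can be pictured as a breadth-first traversal of a referral network seeded by convenience-sampled participants. A toy sketch (all names and the referral network are invented for illustration):

```python
from collections import deque

# Toy referral network: each participant names contacts who might also take part.
referrals = {
    "seed_1": ["hist_a", "hist_b"],
    "seed_2": ["dealer_x"],
    "hist_a": ["hist_c"],
    "hist_b": [],
    "hist_c": [],
    "dealer_x": ["dealer_y"],
    "dealer_y": [],
}

def snowball_sample(seeds, referrals, max_size):
    """Breadth-first recruitment starting from convenience-sampled seeds."""
    sample, queue, seen = [], deque(seeds), set(seeds)
    while queue and len(sample) < max_size:
        person = queue.popleft()
        sample.append(person)
        for contact in referrals.get(person, []):
            if contact not in seen:
                seen.add(contact)
                queue.append(contact)
    return sample

print(snowball_sample(["seed_1", "seed_2"], referrals, max_size=5))
# → ['seed_1', 'seed_2', 'hist_a', 'hist_b', 'dealer_x']
```

The `seen` set prevents re-recruiting the same person twice, which mirrors the practical rule of screening nominated contacts against participants already enrolled.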
43. Example: WorldCat.org Focus Group Interview Questions
“Tell us about your experiences with WorldCat.org.”
Broad introductory question to reveal the extent to which users have engaged with WorldCat.org, and the information-seeking contexts within which they use the system. (Connaway & Wakeling, 2012, p. 7)
44. Structured Observations
Systematic description focusing on designated aspects of behavior to test causal hypotheses (Connaway & Powell, 2010)
45. Structured Observations: A Guide
• Develop observational categories
  – Define appropriate, measurable acts
  – Establish time length of observation
  – Anticipate patterns of phenomena
  – Decide on frame of reference
(Connaway & Powell, 2010)
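In practice, the guide above amounts to fixing a category scheme before the observation begins and then tallying only acts that fall inside it. A minimal sketch (the category scheme and the observation events are invented for illustration):

```python
from collections import Counter

# Hypothetical category scheme for a 30-minute reference-desk observation.
CATEGORIES = {"directional", "ready_reference", "research", "technology_help"}

# Time-stamped events recorded by the observer: (minute, category) - toy data.
events = [(2, "directional"), (5, "research"), (9, "technology_help"),
          (14, "directional"), (21, "ready_reference"), (27, "research")]

def tally(events, categories):
    """Count only acts that fall inside the predefined, measurable categories."""
    counts = Counter(cat for _, cat in events if cat in categories)
    return {cat: counts.get(cat, 0) for cat in sorted(categories)}

print(tally(events, CATEGORIES))
# → {'directional': 2, 'ready_reference': 1, 'research': 2, 'technology_help': 1}
```

Defining the categories up front (rather than after the fact) is what makes observations from different sessions or observers comparable.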
46. Ethnographic Research
Rich description (Connaway & Powell, 2010)
47. Ethnographic Research
• Incredibly detailed data
• Time consuming
  – Establishing rapport
  – Selecting research participants
  – Transcribing observations & conversations
  – Keeping diaries
(Connaway & Powell, 2010, p. 175; Khoo, Rozaklis, & Hall, 2012)
48. Analysis
“Summary of observations or data in such a manner that they provide answers to the hypothesis or research questions” (Connaway & Powell, 2010)
49. Analysis
• Collection of data affects analysis of data
• Ongoing process
• Feeds back into research design
• Theory, model, or hypothesis must grow from data analysis
50. Data Analysis: Digital Visitors & Residents
• Codebook
• NVivo 10
I. Place
  A. Internet
    1. Search engine
      a. Google
      b. Yahoo
    2. Social media
      a. Facebook
      b. Twitter
      c. YouTube
      d. Flickr/image sharing
      e. Blogging
  B. Library
    1. Academic
    2. Public
    3. School (K-12)
  C. Home
  D. School, classroom, computer lab
  E. Other
(White & Connaway, 2011-2012)
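Outside a package like NVivo, the same codebook logic — assigning hierarchical codes to transcript passages — can be sketched as keyword-to-code matching. A toy illustration (the code paths follow the scheme above, but the keyword lists and the excerpt are invented; real qualitative coding is done by human judgment, not keywords):

```python
# Map hierarchical codes to trigger keywords (toy keyword lists).
CODEBOOK = {
    "Place/Internet/Search engine/Google": ["google"],
    "Place/Internet/Social media/Facebook": ["facebook"],
    "Place/Library/Academic": ["university library", "campus library"],
    "Place/Home": ["at home", "my room"],
}

def code_excerpt(text, codebook):
    """Return every code whose keywords appear in the transcript excerpt."""
    lowered = text.lower()
    return sorted(code for code, keywords in codebook.items()
                  if any(kw in lowered for kw in keywords))

excerpt = "I usually just Google it at home before trying the campus library."
print(code_excerpt(excerpt, CODEBOOK))
# → ['Place/Home', 'Place/Internet/Search engine/Google', 'Place/Library/Academic']
```

The slash-delimited code paths preserve the codebook hierarchy, so tallies can later be rolled up to any level (e.g., everything under "Place/Internet").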
51. Getting the Right Fit!
• What do we know?
• Where do we go from here?
Use tools & research design to customize the project to fit your assessment needs
52. References
ALA/ACRL. 1998. Task force on academic library outcomes assessment report. Available: http://www.ala.org/Content/NavigationMenu/ACRL/Publications/White_Papers_and_Reports/Task_Force_on_Academic_Library_Outcomes_Assessment_Report.htm
Brown, Karen, & Malenfant, Kara J. 2012. Connect, collaborate, and communicate: A report from the Value of Academic Libraries Summits. Chicago, IL: Association of College & Research Libraries. http://www.acrl.ala.org/value
Connaway, Lynn S., Johnson, Debra W., & Searing, Susan. 1997. Online catalogs from the users’ perspective: The use of focus group interviews. College and Research Libraries, 58(5), 403-420.
Connaway, Lynn S., & Powell, Ronald R. 2010. Basic research methods for librarians (5th ed.). Westport, CT: Libraries Unlimited.
Connaway, Lynn S., & Radford, Marie L. 2011. Seeking synchronicity: Revelations and recommendations for virtual reference. Dublin, OH: OCLC Research. Retrieved from http://www.oclc.org/reports/synchronicity/full.pdf
Connaway, Lynn S., & Wakeling, Simon. 2012. To use or not to use WorldCat.org: An international perspective from different user groups. OCLC Internal Report.
Dervin, Brenda, Connaway, Lynn S., & Prabha, Chandra. 2003-2006. Sense-making the information confluence: The whys and hows of college and university user satisficing of information needs. Funded by the Institute of Museum and Library Services (IMLS). http://www.oclc.org/research/activities/past/orprojects/imls/default.htm
Flanagan, John C. 1954. The critical incident technique. Washington, DC: American Psychological Association.
Geertz, Clifford. 1973. The interpretation of cultures: Selected essays. New York: Basic Books.
53. References
Hernon, Peter, & Altman, Ellen. 1998. Assessing service quality: Satisfying the expectations of library customers. Chicago, IL: American Library Association.
Kaufman, Paula, & Watstein, Sarah Barbara. 2008. “Library value (return on investment, ROI) and the challenge of placing a value on public services.” Reference Services Review, 36(3), 226-231.
Khoo, Michael, Rozaklis, Lily, & Hall, Catherine. 2012. A survey of the use of ethnographic methods in the study of libraries and library users. Library and Information Science Research, 34(2), 82-91.
Lederman, Linda C. 1996. Asking questions and listening to answers: A guide to using individual, focus group, and debriefing interviews. Dubuque, IA: Kendall/Hunt.
Oakleaf, Megan J. 2010. The value of academic libraries: A comprehensive research review and report. Chicago, IL: Association of College and Research Libraries, American Library Association.
QSR International. 2011. NVivo 9: Getting started. Retrieved from http://download.qsrinternational.com/Document/NVivo9/NVivo9-Getting-Started-Guide.pdf
White, David S., & Connaway, Lynn S. 2011-2012. Visitors and residents: What motivates engagement with the digital information environment. Funded by JISC, OCLC, and Oxford University. Retrieved from http://www.oclc.org/research/activities/vandr/
Zweizig, Douglas, Johnson, Debra W., Robbins, Jane, & Besant, Michele. 1996. The tell it! manual. Chicago, IL: ALA.
54. Thank You!
Lynn Silipigni Connaway, Ph.D.
Senior Research Scientist
OCLC Research
Vice-chair, ACRL Value of Academic Libraries Committee
@LynnConnaway
connawal@oclc.org
©2014 OCLC. This work is licensed under a Creative Commons Attribution 3.0 Unported License. Suggested attribution: “This work uses content from [presentation title] © OCLC, used under a Creative Commons Attribution license: http://creativecommons.org/licenses/by/3.0/”
