Putting "Value" in Evaluation: Building Relevant, Dynamic Statistical Analysis

Recent history has taught us that we must begin assessing what it is we really do, alter our record-keeping to include an ever-widening group of new services and features, provide evidence that we are actually accomplishing our goals, and find open-ended assessment tools that anticipate future change in library operations. This type of rigorous self-examination makes it more difficult and perhaps unwise to use a one-size-fits-all statistical analysis. Accordingly, this presentation focuses on the process necessary for meaningful and dynamic statistical analysis, including: parsing your mission statement to discover categories of evaluation, brainstorming key indicators that relate directly to these categories, leveraging your organization's current statistical analyses, and evaluating your methods to ensure future adaptability.

Notes
  • Anecdote: not satisfied with making Reference staffing decisions based on circulation statistics… Do they actually relate?
  • When was the last time we took a look at our mission statements? When were they last revised? What do they say about our organizations? What sorts of promises do we make to our constituents in documents such as these?
  • Use “indicator” examples from DCL spreadsheets, just to get the ball rolling. (See next slide.)
  • I already added a “Tours/Programs” indicator. I am aware that most of the audience will be unfamiliar with how we run things, but I think they *will* be familiar with what reference librarians do.
  • And the one item I had someone give me information for was later replaced by a different statistic we already kept.
  • Anecdote: I could have reported “minutes” instead of “hours” of computer use to make my branch look better. Because the figures fed into an average, the minute counts would have been technically correct, but the resulting comparison would have been completely misleading, unfair, and unethical.
  • Transcript

    • 1. Putting “Value” in Evaluation: Building Relevant, Dynamic Statistical Analysis
      Presented by Joshua Johnson @ #ULA2013
      See Josh Blog: www.joshinglibrarian.info
      See Josh Tweet: @JoshinLibrarian
    • 2. Don’t We Already Use Statistical Analysis?
      We’ve been doing this for years;
    • 3. Quotable Research/Literature Review
      Librarians are well known as gatherers of statistics and other data. However, we do not always use the information that we gather, preferring instead to point to the numbers themselves as evidence of our work.
      Blake & Schleper, “From Data to Decisions…”
    • 4. Quotable Research/Literature Review
      What is the worth of the library in the networked-computer age? How do shifts in use patterns reflect changes in customers’ valuations of library services, and how would customers prefer that library resources be added or reallocated? What benefits are conferred on different types of library customers by their variant uses of … libraries? And, how can those benefits be measured?
      Holt & Elliott, “Measuring Outcomes…”
    • 5. Quotable Research/Literature Review
      Librarians may be good at counting, but as a whole the profession is not trained to evaluate and analyze statistics. Librarians offer numbers often as proof of the value of their work with little thought as to whether those numbers really establish anything of value… Library managers need to think critically about which statistics are useful.
      Tim Spindler, “Statistical Analysis Models: Applications for Libraries”
    • 6. Quotable Research/Literature Review
      It doesn’t really matter whether you can quantify your results. What matters is that you rigorously assemble evidence—quantitative or qualitative—to track your progress. If the evidence is primarily qualitative, think like a trial lawyer assembling the combined body of evidence. If the evidence is primarily quantitative, then think of yourself as a laboratory scientist assembling and assessing the data.
      Jim Collins, Good to Great and the Social Sectors
    • 7. Examine Current Methods of Analysis
      • Are you basing reference staffing decisions on circulation data?
      • Are you comfortable extrapolating future trends from your current data sets?
      • Does the data you collect match the services you currently provide?
      • Is the process by which you collect and synthesize your data easily transferred from one staff member to another in the event of a promotion or retirement?
    • 8. WHERE DO I START?
      Okay, I’ll bite. I’m rethinking my library’s statistical analysis anyway;
    • 9. Evaluate Your Mission Statement
      • Does your mission statement accurately describe what your library does?
      • What promises are you making in your mission statement?
      • Which of your services, collections, etc. are related to each promise?
      • Are you considering adding to your services and unsure if you can handle the change without adding staff?
    • 10. Evaluate Your Mission Statement
      For example, part of the mission statement of the American Library Association: “The American Library Association was created to provide leadership for the development, promotion, and improvement of library and information services and the profession of librarianship in order to enhance learning and ensure access to information for all.”
    • 11. Evaluate Your Mission Statement
      Let’s highlight a few of the promises that reasonable people might assume based on the quote:
      • ALA members will be better at promotion.
      • ALA members will be better librarians.
      • Library patrons served by ALA members will be better served than those served by non-ALA members.
      • Library patrons served by ALA members will receive better access to information than those served by non-ALA members.
    • 12. Brainstorm Related Key Indicators
      • What factors, practices, or personnel influence your ability to make good on these promises?
      • What physical or monetary limitations could impact your organization’s ability to fulfill promises?
      • How might you be able to assess the organization’s perceived value or measure your effectiveness at providing services or staffing?
      • Would the organization be better able to fulfill some promises if it changed the promises it made or cut some services in favor of others?
    • 13. What Do I Mean by Indicators?
      In this case, the indicators represent daily, measurable tasks performed by each set of employees. I consider the lists of indicators “in process.”
    • 14. TAKE THE PROCESS FOR A SPIN
      Now that we’ve talked about the process,
    • 15. Parsing the Mission Statement
      • Form groups and select a “scribe.”
      • Examine the sample mission statement as a group and write down as many of the promises it makes as possible in the time allotted.
      • Was it difficult? Was it worth it?
    • 16. Brainstorming Key Indicators
      • Stay in the same groups.
      • Take your sample mission statements and the list of promises your group made in the last exercise and switch them with another group.
      • Read the mission statement as necessary, but focus on the promises made by the statement.
    • 17. Brainstorming Key Indicators
      • Make a list of indicators (services, practices, etc.) that relate to these promises.
      • As your group brainstorms, keep in mind:
        • What factors, practices, or personnel influence the organization’s ability to make good on these promises?
        • What physical or monetary limitations could impact the organization’s ability to fulfill promises?
        • In what ways might you be able to assess the organization’s perceived value or measure your effectiveness at providing services or staffing?
        • Would the organization be better able to fulfill some promises if it changed the promises it made, or cut some services in favor of others?
    • 18. Process Evaluation
      • How difficult was this process?
      • How productive was it?
      • What other documents could be used in addition to mission statements?
      • What kinds of ideas did it spark for examining your own workplace processes?
      • How could we streamline the brainstorming process?
    • 19. Process Evaluation
      What reference indicators might reasonably be added, based on your experience in libraries?
    • 20. WHERE DO I GO FROM HERE?
      That was interesting, and perhaps useful, but
    • 21. Leverage Current Statistical Analyses
      • Reuse the data and statistical analysis you already have. You almost certainly keep some form of raw statistical record-keeping or statistical analysis. There’s no need to completely reinvent the wheel.
      • Track additional data as needed; make sure it is relevant.
    • 22. Leverage Your Current Statistical Analyses
      • Nearly all of the information used in this project was pulled directly from statistics already kept by the library system.
      • Excel, for example, will pull information from other Excel files and update it automatically; this makes pulling data for your own projects a breeze. (A scripted equivalent is sketched after the transcript.)
    • 23. “FIND & REPLACE” & MACROS SIMPLIFY REPETITIVE TASKS.Leverage Current Statistical Analyses• Take advantage of time/labor saving computer programs.Spreadsheets often allow you to use macros and a widerange of other helps.• The example below contains instructions forcopying/updating monthly statistics using the “find &replace” function available in many spreadsheets. Eachtype of information can be updated in a slightly differentway.
    • 24. Leverage Your Current Statistical Analyses
      • Be logical: be certain the picture you paint with statistics is as objective as possible; some statistics are misleading or unethical.
      • Compare apples to apples: this can be more difficult than you think. Statistics make more sense when you find ways to compare commonalities.
    • 25. Methods Should Allow Adaptability
      • Think about how you lay out your information; why organize it over and over?
      • Will your methods be easily passed to your successor?
    • 26. Evaluate Methods to Ensure Adaptability
      • What libraries “do” in 5-10 years will change, so expect it.
      • When planning, leave room to grow, even if it is just a blank space in the sheet.
      • Review promises and indicators for relevance and value to ensure the most valuable evaluation of your organization.
    • 27. Brainstorming Helps for Key Indicators
      Here are some examples to get you going; they are not exhaustive. Many come from Blake & Schleper (2004).
      Quantitative:
      • Circulation statistics (most commonly "check-outs," but also in-house uses, etc.)
      • Website analytics (hits, unique visitors, duration, etc.)
      • Subject/date analysis (examining average publication dates in a given subject or collection)
      • Cost-per-use analysis (a worked sketch appears after the transcript)
      • ILL requests
    • 28. Brainstorming Helps for Key Indicators
      Qualitative:
      • Comparison of collections to peer organizations
      • User input
        • Questionnaires
        • Interviews
      • Compare holdings to collection development policies
      • Comparison of overall services to peer institutions
        • Print collections
        • Electronic holdings
        • eBooks/vendors
        • Study space
        • Training/community outreach
        • Staffing levels
    • 29. Brainstorming Helps for Key Indicators
      Qualitative (continued):
      • Physical examination of the collection (wear & tear or dust)
      • Anecdotes (patrons’ or colleagues’ opinions)
    • 30. QUESTIONS?
      Are there any
    • 31. References & Suggested Resources
      Blake, J. C., & Schleper, S. P. (2004). From data to decisions: Using surveys and statistics to make collection management decisions. Library Collections, Acquisitions, & Technical Services, 28(4).
      Collins, J. (2005). Good to great and the social sectors: Why business thinking is not the answer. Boulder, CO: J. Collins.
      Holt, G., & Elliott, D. (2003). Measuring outcomes: Applying cost-benefit analysis to middle-sized and smaller public libraries. Library Trends, 51(3), 424-440.
      Spindler, T. (2009). Statistical analysis models: Applications for libraries. Library Publications, Paper 11. http://docs.rwu.edu/librarypub/11
    • 32. Presented @ #ULA2013 by Joshua Johnson
      Josh’s Blog: www.joshinglibrarian.info
      Josh’s Tweets: @JoshinLibrarian
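
The sketches below are not part of the original deck; they are minimal Python illustrations of points made in slides 22, 23, and 27, with hypothetical file names, sheet names, and figures.

For slide 22 (reusing statistics you already keep): a minimal sketch that reads an existing workbook with pandas rather than linking Excel files together. The workbook name, sheet name, and column headers are assumptions for illustration.

    # Sketch: reuse statistics the library already keeps rather than re-entering them.
    # Assumes a workbook "monthly_stats.xlsx" with a sheet "Reference" whose columns
    # include "Branch", "Month", and "Reference Questions" (hypothetical names).
    import pandas as pd

    def load_reference_stats(path="monthly_stats.xlsx"):
        """Pull the existing reference-desk counts into a DataFrame."""
        return pd.read_excel(path, sheet_name="Reference")

    def questions_by_branch(stats):
        """Total reference questions per branch, e.g. for a staffing report."""
        return stats.groupby("Branch")["Reference Questions"].sum()

    if __name__ == "__main__":
        print(questions_by_branch(load_reference_stats()))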
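
For slide 23 (“find & replace” to update monthly statistics): the original slide showed a spreadsheet screenshot that is not reproduced in the transcript. As a rough stand-in, this sketch performs the same kind of substitution over cell text and formulas with openpyxl; the file name, sheet name, and month labels are assumptions.

    # Sketch: a scripted equivalent of using "find & replace" to roll a summary
    # sheet forward one month. File and sheet names are hypothetical.
    from openpyxl import load_workbook

    def roll_month_forward(path, old="Jan", new="Feb", sheet="Summary"):
        """Replace month references in every text or formula cell on one sheet."""
        wb = load_workbook(path)
        ws = wb[sheet]
        for row in ws.iter_rows():
            for cell in row:
                if isinstance(cell.value, str) and old in cell.value:
                    cell.value = cell.value.replace(old, new)
        wb.save(path)

    # Example usage (assumed file name):
    # roll_month_forward("branch_summary.xlsx", old="Jan", new="Feb")

As with any blanket find-and-replace, check the result by hand; a month abbreviation can appear in text you did not intend to change.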
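
For slide 27 (cost-per-use analysis): the arithmetic is simply what an item or subscription cost divided by how often it was used. A small sketch with invented figures:

    # Sketch: cost-per-use for a handful of subscriptions (figures are invented).
    subscriptions = {
        # name: (annual cost in dollars, recorded uses this year)
        "Database A": (5000.00, 1250),
        "Database B": (1200.00, 40),
    }

    def cost_per_use(cost, uses):
        """Annual cost divided by use count; None when there were no recorded uses."""
        return cost / uses if uses else None

    for name, (cost, uses) in subscriptions.items():
        cpu = cost_per_use(cost, uses)
        print(f"{name}: ${cpu:.2f} per use" if cpu is not None else f"{name}: no recorded uses")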
