Deja Vu Ja Vu
Published in: Education, Technology
Transcript

  • 1. Déjà vu, ja vu From Outputs to Outcomes & Outcomes to Outputs … and Back Again Alan W. Zimmerman Consultant, PL System Administration & Finance Wisconsin Department of Public Instruction Keith Curry Lance Director Library Research Service Colorado State Library & University of Denver
  • 2. What We’ll Talk About
    • Outputs vs. outcomes
    • Output measures
      • Traditional, newer, on the horizon
      • State detail, Bertot/McClure e-measures, state e-measures
    • Analytical perspectives
    • Contexts of use
    • Tools
    • From output to outcome measures
  • 3. Outputs vs. Outcomes
    • Definitions
      • Amount of service provided vs. how it made a difference for the end-user
    • “Ownership”
      • Library or project vs. user or indirect beneficiary
    • Perspectives
      • One person’s output is another’s outcome!
  • 4. FSCS Output Measures
    • Traditional
    • Per capita: Library visits, reference transactions, circulation, interlibrary loan
    • Children’s as percent of total circulation, program attendance
  • 5. More FSCS Measures
    • Newer
      • Users of e-resources per capita
      • Program attendance per capita
    • On the horizon
      • Registration as percent of population
      • Alternatives to users of e-resources:
        • Users of public Internet computers per capita
        • Visits to library’s home page
  • 6. More Detailed State Measures
    • Circulation by format (e.g., books, videos, audio books)
    • Reference questions by source (e.g., in-person, phone, e-mail, chat)
    • Interlibrary loan (e.g., returnable vs. non-returnable, in-/out-of-state, by library type)
    • Programs/attendance by type (e.g., story hours, reading clubs, literacy training)
  • 7. Outputs from Network Performance Measures (Bertot & McClure)
    • Public access workstation users
    • Formal user IT training
    • Point-of-use IT training
    • Virtual reference transactions
  • 8. Electronic/Digital Output Measures from States
    • IT training attendance (IA, KY, MD, NM, TX)
    • Database searches (AZ, LA, PA, WA)
    • Virtual visits (LA, NC, VA, WA)
    • Virtual reference transactions (FL, LA, WA)
    • Users of public access computers (MD, VA)
    • OPAC sessions (NC, WA)
    • Items examined (PA, VA)
  • 9. PLUS a few unique items …
    • Public access computer circulation (Nevada)
    • Number of electronic holds (Washington)
    • Number of electronic renewals (Washington)
    • Percent of time public Internet access terminals in use (Washington)
  • 10. Analytical Perspectives
    • Simple Reporting
    • Lending Perspective
    • Trend Analysis
    • Peer Comparison
    • Comparative Trend Analysis
  • 11. Simple Reporting
    • Totals
    • Per capita figures
  • 12. Totals Example
  • 13. Per Capita Example
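A per capita figure is just an annual total divided by the library's legal service area population. A minimal sketch of the arithmetic (all figures hypothetical, for illustration only):

```python
# Per capita output measure: annual total / legal service area population.
# Both numbers below are hypothetical.
annual_circulation = 755_000
service_area_population = 50_000

circulation_per_capita = annual_circulation / service_area_population
print(f"Circulation per capita: {circulation_per_capita:.2f}")  # 15.10
```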
  • 14. Lending Perspective
    • Per capita figures
    • Day-in-the-life figures
      • Wichita PL annual report
      • In an average day, WPL …
      • Registered almost 50 new customers
      • Circulated over 5,400 items
      • Answered over 900 questions
      • Provided 409 public computer sessions
      • Attracted 183 customers to library programs
    • Comparisons of library outputs with those of other public agencies, nonprofits, private sector
      • On daily basis, US public libraries circulate 4 times as many items as FedEx delivers.
      • Annual visits to CO libraries outnumber ski lift ticket sales 6-to-1.
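Day-in-the-life figures like Wichita's come from dividing annual totals by the number of days the library is open. A sketch, with hypothetical annual totals and days open chosen only to illustrate the arithmetic:

```python
# Day-in-the-life figures: annual totals divided by days open per year.
# All numbers are hypothetical.
days_open = 360
annual_totals = {
    "new registrations": 18_000,
    "items circulated": 1_950_000,
    "questions answered": 325_000,
}
for measure, total in annual_totals.items():
    print(f"In an average day: {total // days_open} {measure}")
```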
  • 15. Trend Analysis
    • Year-to-year change (up or down)
    • Change over time
  • 16. Year to Year Change Example
  • 17. Change Over Time Example
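Year-to-year change is the percent difference between consecutive annual figures; change over time strings those differences together across a run of years. A sketch with hypothetical annual visit counts:

```python
# Year-to-year percent change across hypothetical annual visit counts.
visits = {2002: 180_000, 2003: 195_000, 2004: 190_000, 2005: 210_000}

years = sorted(visits)
for prev, curr in zip(years, years[1:]):
    pct = (visits[curr] - visits[prev]) / visits[prev] * 100
    print(f"{prev} -> {curr}: {pct:+.1f}%")
```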
  • 18. Peer Comparison
    • Rankings
      • Hennen ratings, FSCS state rankings
    • Percentiles, quartiles
    • Average
  • 19. Ranking Example #1
  • 20. Ranking Example #2
  • 21. Percentiles Example
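A percentile rank places one library's figure against its peer group: the share of peer values falling below it. A sketch, with a hypothetical peer group of circulation-per-capita values:

```python
# Percentile rank of one library's figure within a hypothetical peer group.
def percentile_rank(values, x):
    """Percent of peer values strictly below x."""
    below = sum(1 for v in values if v < x)
    return 100 * below / len(values)

peer_circ_per_capita = [4.2, 5.8, 6.1, 7.3, 8.0, 9.5, 10.2, 11.7, 12.4]
print(f"{percentile_rank(peer_circ_per_capita, 9.5):.0f}th percentile")
```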
  • 22. Comparative Trend Analysis
    • Differences in trends associated with some library characteristic (e.g., LSA population, legal basis type, metro status)
  • 23. Comparative Trend Analysis Example
  • 24. Contexts of Use
    • Accountability (Reporting)
    • Advocacy
    • Planning
    • Evaluation (including Standards)
  • 25. Tools
    • Bibliostat CONNECT software
    • NCES Compare Libraries tool
    • LRS-i peer comparison & historical analysis tools
  • 26. Bibliostat Connect Software
    • Functionality: user selects peers & data for comparison, software generates tables & charts
    • Datasets: FSCS, PLDS, state(s), Census data
    • Availability: http://connect.informata.com/ (via statewide or local subscription)
  • 27. NCES Compare Libraries Tool
    • Functionality: user selects peers & data for comparison, site generates tables & charts
    • Datasets: FSCS only
    • Availability: http://nces.ed.gov/surveys/libraries/compare/Index.asp
  • 28. LRS-i Tools
    • Functionality: peer comparison & historical analysis (one or more libraries), site generates tables & charts
    • Datasets: State only
    • Availability: http://www.lrs.org/interactive/index.asp
  • 29. From Output to Outcome Measures
    • Is it possible to bridge the gap between outputs and outcomes?
    • To what extent is stating outcomes a matter of perspective (user’s vs. library’s)?
    • What’s the relationship between intended outcomes and outputs? In the absence of true outcome data, can a rhetorical connection be made?
    • When is one person’s output another’s outcome?
  • 30. From Output to Outcome Measures
    • Attendance at Pleasantville Public Library’s 2005 series of summer-job-seeking workshops for teenagers was 2,500.
    • Outcome: By June 15, 2005, 2,250 of 2,500 teenagers (90%) who attended workshops at library found jobs.
    • Outcome-like: In 2005, 2,500 teenagers learned how to seek a summer job successfully at library workshops.
  • 31. From Output to Outcome Measures
    • Enrollment in Pleasantville’s summer reading program (SRP) was 5,000. 4,500 completed it.
    • Outcome: Of 4,500 children who completed SRP, 95% maintained or gained reading skills, compared to 65% of all elementary students.
    • Outcome-like: Of 5,000 children enrolled in SRP, 90% completed it, improving odds they would maintain or improve reading skills over summer.
  • 32. From Output to Outcome Measures
    • During FY 2004-05, the AskMarian virtual reference service answered 42,000+ questions statewide.
    • Outcome: As a result of using AM, 1/3 of users received help with homework or a school project, 1/4 obtained info for work, and 1/10 learned more about an issue.
    • Outcome-like: 42,000 AM users gained info they could use at school or work, to be better-informed community members, or to meet other info needs.
  • 33. From Output to Outcome Measures
    • FY 2004-05, statewide ILL statistics: 250,000 provided, 300,000 received. FY 2002-03, 200,000 & 250,000, respectively.
    • Outcome: After implementation of the state’s new fast-track interlibrary loan system, interlibrary lending increased by 25% and borrowing by 20%.
    • Remember, output for one or more libraries may be the state’s outcome.
  • 34. From Output to Outcome Measures
    • Remember 3 Keys to Creating Outcome-Like Output Statements
    • Express from user’s viewpoint, not library’s
    • Remember that intentions anticipate outcomes (at least theoretically), and connect output to those intentions rhetorically
    • Remember that one person’s (or agency’s) output may be another’s outcome
  • 35. From Output to Outcome Measures
    • Exercise
