Déjà vu, ja vu


  1. Déjà vu, ja vu: From Outputs to Outcomes & Outcomes to Outputs … and Back Again
     Alan W. Zimmerman, Consultant, PL System Administration & Finance, Wisconsin Department of Public Instruction
     Keith Curry Lance, Director, Library Research Service, Colorado State Library & University of Denver
  2. What We'll Talk About
     - Outputs vs. outcomes
     - Output measures
       - Traditional, newer, on the horizon
       - State detail, Bertot/McClure e-measures, state e-measures
     - Analytical perspectives
     - Contexts of use
     - Tools
     - From output to outcome measures
  3. Outputs vs. Outcomes
     - Definitions
       - Amount of service provided vs. how it made a difference for the end user
     - "Ownership"
       - Library or project vs. user or indirect beneficiary
     - Perspectives
       - One person's output is another's outcome!
  4. FSCS Output Measures
     - Traditional
       - Per capita: library visits, reference transactions, circulation, interlibrary loan
       - Children's circulation as a percent of total circulation; children's program attendance
  5. More FSCS Measures
     - Newer
       - Users of e-resources per capita
       - Program attendance per capita
     - On the horizon
       - Registration as a percent of population
       - Alternatives to users of e-resources:
         - Users of public Internet computers per capita
         - Visits to the library's home page
  6. More Detailed State Measures
     - Circulation by format (e.g., books, videos, audio books)
     - Reference questions by source (e.g., in person, phone, e-mail, chat)
     - Interlibrary loan (e.g., returnable vs. non-returnable, in-/out-of-state, by library type)
     - Programs/attendance by type (e.g., story hours, reading clubs, literacy training)
  7. Outputs from Network Performance Measures (Bertot & McClure)
     - Public access workstation users
     - Formal user IT training
     - Point-of-use IT training
     - Virtual reference transactions
  8. Electronic/Digital Output Measures from States
     - IT training attendance (IA, KY, MD, NM, TX)
     - Database searches (AZ, LA, PA, WA)
     - Virtual visits (LA, NC, VA, WA)
     - Virtual reference transactions (FL, LA, WA)
     - Users of public access computers (MD, VA)
     - OPAC sessions (NC, WA)
     - Items examined (PA, VA)
  9. Plus a Few Unique Items …
     - Public access computer circulation (Nevada)
     - Number of electronic holds (Washington)
     - Number of electronic renewals (Washington)
     - Percent of time public Internet access terminals are in use (Washington)
  10. Analytical Perspectives
      - Simple Reporting
      - Lending Perspective
      - Trend Analysis
      - Peer Comparison
      - Comparative Trend Analysis
  11. Simple Reporting
      - Totals
      - Per capita figures
  12. Totals Example
  13. Per Capita Example
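Simple reporting reduces to two calculations: raw annual totals, and those totals divided by the legal service area (LSA) population. A minimal sketch, using hypothetical figures rather than any data from the slides:

```python
# Simple reporting: annual totals and per capita figures.
# All figures below are hypothetical, for illustration only.
annual_totals = {"visits": 412_000, "circulation": 950_000, "reference": 61_000}
lsa_population = 85_000  # legal service area population

# Per capita = total divided by LSA population, rounded for reporting.
per_capita = {m: round(v / lsa_population, 2) for m, v in annual_totals.items()}
print(per_capita)
```

The same division underlies all the FSCS per capita measures; only the numerator changes.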
  14. Lending Perspective
      - Per capita figures
      - Day-in-the-life figures
        - Wichita PL annual report: in an average day, WPL …
          - registered almost 50 new customers
          - circulated over 5,400 items
          - answered over 900 questions
          - provided 409 public computer sessions
          - attracted 183 customers to library programs
      - Comparisons of library outputs with those of other public agencies, nonprofits, and the private sector
        - On a daily basis, US public libraries circulate 4 times as many items as FedEx delivers.
        - Annual visits to CO libraries outnumber ski lift ticket sales 6 to 1.
  15. Trend Analysis
      - Year-to-year change (up or down)
      - Change over time
  16. Year-to-Year Change Example
  17. Change Over Time Example
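Both flavors of trend analysis are percent-change calculations: year over year for the short view, first year to last for change over time. A sketch with hypothetical circulation figures:

```python
# Trend analysis: year-to-year change and change over a longer span.
# Circulation figures are hypothetical.
circulation = {2001: 880_000, 2002: 905_000, 2003: 921_000, 2004: 950_000}

years = sorted(circulation)
# Year-to-year change (up or down):
for prev, curr in zip(years, years[1:]):
    pct = 100 * (circulation[curr] - circulation[prev]) / circulation[prev]
    print(f"{prev}->{curr}: {pct:+.1f}%")

# Change over the whole period:
span = 100 * (circulation[years[-1]] - circulation[years[0]]) / circulation[years[0]]
print(f"{years[0]}->{years[-1]}: {span:+.1f}%")
```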
  18. Peer Comparison
      - Rankings
        - Hennen ratings, FSCS state rankings
      - Percentiles, quartiles
      - Averages
  19. Ranking Example #1
  20. Ranking Example #2
  21. Percentiles Example
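Rankings and percentiles come from the same sort of a peer group on one measure. A sketch, with hypothetical peer libraries and values (not from the slides), using the common "percent of peers scoring below you" definition of percentile:

```python
# Peer comparison: rank one library among its peers and express its
# position as a percentile. Libraries and values are hypothetical.
peers = {"Lib A": 3.1, "Lib B": 7.4, "Lib C": 5.0, "Lib D": 9.2,
         "Lib E": 4.4, "Our Library": 6.8}  # e.g., circulation per capita

ordered = sorted(peers, key=peers.get, reverse=True)  # highest first
rank = ordered.index("Our Library") + 1

# Percentile: share of the peer group with a lower value.
below = sum(v < peers["Our Library"] for v in peers.values())
percentile = 100 * below / len(peers)
print(f"rank {rank} of {len(ordered)}, {percentile:.0f}th percentile")
```

Quartiles are just the 25th, 50th, and 75th percentile cut points of the same sorted list.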
  22. Comparative Trend Analysis
      - Differences in trends associated with some library characteristic (e.g., LSA population, legal basis type, metro status)
  23. Comparative Trend Analysis Example
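Comparative trend analysis is the previous percent-change calculation applied separately to groups defined by a library characteristic. A sketch grouping hypothetical libraries by LSA population band:

```python
# Comparative trend analysis: compare circulation growth across an
# LSA population band. All libraries and figures are hypothetical.
libraries = [
    {"band": "<25K", "circ_2000": 40_000,  "circ_2005": 44_000},
    {"band": "<25K", "circ_2000": 55_000,  "circ_2005": 57_000},
    {"band": "25K+", "circ_2000": 400_000, "circ_2005": 480_000},
    {"band": "25K+", "circ_2000": 600_000, "circ_2005": 690_000},
]

growth = {}
for band in ("<25K", "25K+"):
    start = sum(r["circ_2000"] for r in libraries if r["band"] == band)
    end = sum(r["circ_2005"] for r in libraries if r["band"] == band)
    growth[band] = round(100 * (end - start) / start, 1)
print(growth)
```

The same grouping works for legal basis type or metro status; only the key changes.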
  24. Contexts of Use
      - Accountability (reporting)
      - Advocacy
      - Planning
      - Evaluation (including standards)
  25. Tools
      - Bibliostat CONNECT software
      - NCES Compare Libraries tool
      - LRS-i peer comparison & historical analysis tools
  26. Bibliostat CONNECT Software
      - Functionality: user selects peers & data for comparison; software generates tables & charts
      - Datasets: FSCS, PLDS, state(s), Census data
      - Availability: / (via statewide or local subscription)
  27. NCES Compare Libraries Tool
      - Functionality: user selects peers & data for comparison; site generates tables & charts
      - Datasets: FSCS only
      - Availability:
  28. LRS-i Tools
      - Functionality: peer comparison & historical analysis (one or more libraries); site generates tables & charts
      - Datasets: state only
      - Availability:
  29. From Output to Outcome Measures
      - Is it possible to bridge the gap between outputs and outcomes?
      - To what extent is stating outcomes a matter of perspective (user's vs. library's)?
      - What's the relationship between intended outcomes and outputs? In the absence of true outcome data, can a rhetorical connection be made?
      - When is one person's output another's outcome?
  30. From Output to Outcome Measures
      - Output: Attendance at Pleasantville Public Library's 2005 series of summer-job-seeking workshops for teenagers was 2,500.
      - Outcome: By June 15, 2005, 2,250 of the 2,500 teenagers (90%) who attended workshops at the library had found jobs.
      - Outcome-like: In 2005, 2,500 teenagers learned how to seek a summer job successfully at library workshops.
  31. From Output to Outcome Measures
      - Output: Enrollment in Pleasantville's summer reading program (SRP) was 5,000; 4,500 children completed it.
      - Outcome: Of the 4,500 children who completed the SRP, 95% maintained or gained reading skills, compared to 65% of all elementary students.
      - Outcome-like: Of the 5,000 children enrolled in the SRP, 90% completed it, improving the odds they would maintain or improve reading skills over the summer.
  32. From Output to Outcome Measures
      - Output: During FY 2004-05, the AskMarian virtual reference service answered 42,000+ questions statewide.
      - Outcome: As a result of using AskMarian, 1/3 of users received help with homework or a school project, 1/4 obtained information for work, and 1/10 learned more about an issue.
      - Outcome-like: 42,000 AskMarian users gained information they could use at school or work, to be better-informed community members, or to meet other information needs.
  33. From Output to Outcome Measures
      - Output: FY 2004-05 statewide ILL statistics: 250,000 items provided, 300,000 received. FY 2002-03: 200,000 and 250,000, respectively.
      - Outcome: After implementation of the state's new fast-track interlibrary loan system, interlibrary lending increased by 25% and borrowing by 20%.
      - Remember: an output for one or more libraries may be the state's outcome.
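The 25% and 20% figures on the ILL slide follow directly from the stated totals: lending rose from 200,000 to 250,000 and borrowing from 250,000 to 300,000. The arithmetic:

```python
# Percent increase = 100 * (new - old) / old, applied to the
# statewide ILL totals given on the slide.
def pct_increase(old, new):
    return 100 * (new - old) / old

print(pct_increase(200_000, 250_000))  # lending: 25.0
print(pct_increase(250_000, 300_000))  # borrowing: 20.0
```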
  34. From Output to Outcome Measures
      - Remember the 3 keys to creating outcome-like output statements:
        1. Express the statement from the user's viewpoint, not the library's.
        2. Remember that intentions anticipate outcomes (at least theoretically), and connect the output to those intentions rhetorically.
        3. Remember that one person's (or agency's) output may be another's outcome.
  35. From Output to Outcome Measures
      - Exercise