Electronic Resource Assessment: Adventures in Engagement

Published in: Technology, Education

  1. Electronic Resource Assessment: Adventures in Engagement
     John Tofanelli | Colleen Major | Jeff Carroll
     Columbia University Libraries
  2. MOVING TOWARDS ENGAGEMENT
     • The larger assessment environment
     • Methods for involving selectors in the assessment process
     • An innovation that facilitates involvement: Google fund calendars
  3. THE FISCAL ENVIRONMENT
  4. THE COMMERCIAL PUBLISHING ENVIRONMENT
     Proliferations
     • Publications that go forth and multiply
     • Packages and/or databases that overlap in content
     • Databases and e-journals available from more than one vendor, each providing access through its own interface
  5. THE COMMERCIAL PUBLISHING ENVIRONMENT
     Challenges
     • All-or-nothing aggregator packages
     • Licensing agreements that limit the scope of cancellation activity within e-journal packages
     • Columbia Libraries have over 98,000 ongoing e-resource purchases (includes serials & continuing service fees)
     • Payments are due throughout the fiscal year
     • Vendors must typically be notified of serials cancellations 30 days before the renewal date
  6. THE DATA ENVIRONMENT
     • E-Resource Management Systems
     • Standardized Usage Data (COUNTER)
  7. USAGE STATISTICS: STANDARDIZED & EXPLAINED
     COUNTER (Counting Online Usage of NeTworked Electronic Resources):
     "an international initiative serving librarians, publishers and intermediaries by setting standards that facilitate the recording and reporting of online usage statistics in a consistent, credible and compatible way."
     http://www.projectcounter.org/about.html
     • COUNTER Code of Practice: Release 1 published 2003
     • Draft Release 4 currently available for public comment
  8. THE INTERNAL WORKPLACE ENVIRONMENT
     As found at Columbia University
     • More than forty subject selectors
     • Seven library divisions: History & Humanities; Science & Engineering; Art & Architecture; Area Studies; Social Sciences; Theology; Rare Books & Manuscripts
     • Many selectors not yet trained in use of new tools available in the data environment
     • Assessment practices uneven across disciplinary areas
  9. WHEN, WHY, AND HOW DO WE ASSESS E-RESOURCES?
     Chapter 1: Where Our Journey Began
     The Conservative, Minimalist, and Reactive Approach
     • assessment is largely confined to the pre-purchase trial period
     • subsequent assessment is done primarily in response to known problems
     • characteristic of a growth economy
     • emphasis is on expansion of online collections
  10. WHEN, WHY, AND HOW DO WE ASSESS E-RESOURCES?
     Chapter 12: Where We Need to Be Now
     The Expansive, Holistic, and Proactive Approach
     • assessment is an ongoing periodic duty
     • assessment can discover or anticipate problems
     • characteristic of a constrained economy
     • emphasis is on managing what we have; growing collections judiciously; ensuring value for dollar
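The "value for dollar" emphasis above is commonly operationalized as cost per use: subscription cost divided by COUNTER-reported full-text requests for the same period. A minimal sketch of that calculation follows; the resource names and figures are invented examples, not Columbia data.

```python
# Illustrative sketch only: cost per use as a value-for-dollar metric.
# All names and numbers below are hypothetical examples.

def cost_per_use(annual_cost, fulltext_requests):
    """Return annual cost divided by usage; infinite if never used."""
    if fulltext_requests == 0:
        return float("inf")  # paid for, never used
    return annual_cost / fulltext_requests

# Hypothetical payment records paired with COUNTER usage totals.
resources = {
    "Journal Package A": (25000.00, 13400),
    "Database B": (8000.00, 310),
}

# Rank from worst to best value, the order a selector would review.
for name, (cost, uses) in sorted(resources.items(),
                                 key=lambda kv: cost_per_use(*kv[1]),
                                 reverse=True):
    print(f"{name}: ${cost_per_use(cost, uses):.2f} per use")
```

A ranking like this only flags candidates for review; a low-use resource may still be essential to a small program, which is why selector judgment stays in the loop.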
  11-16. Engaging Selectors in Assessment
     • Make assignments (working groups, projects, etc.)
     • Announce new projects, workflows, etc. to selectors
     • Follow up in the weeks after an announcement to see who has made progress
     • Enlist 1 or 2 of these selectors to present at a follow-up session as example/testimonial
     • Solicit feedback from individuals and divisions
  17-20. Make Assignments
     • E-Resource Assessment Task Force (2008)
     • E-Resource Assessment Working Group (2009 and beyond)
     Both groups consisted of:
     • Selectors from various divisions
     • E-Resources librarian
     • Asst. Director for Collection Development
  21-26. Announce, Follow Up, Enlist, Track
     • E-Resources Assessment Working Group announces new projects
     • Group follows up after a period of time
       o educational offerings in the use of electronic resource management tools
     • Ask selectors who have become engaged to present on their findings and demonstrate their methodology
       o inspirational presentations from librarians who have improved collections using such tools
     • Evaluate the level of engagement by selectors
  27. Google Calendar
     • Created e-resource renewal reminder system in Google Calendar
     • Announced and demonstrated capabilities of the calendar at a meeting of the Selectors Discussion Group
     • Followed up with selectors to see who had actually made use of the calendar
     • Enlisted engaged selectors to present their findings and methodologies
     • Tracked level of overall engagement by looking at calendar usage
  28. Creation of an e-resource reminder system in Google Calendar
     • What we had been using
     • What we wanted:
       o easy to use
       o easy to implement
       o pushed to users
       o played off existing efforts and strategies
     • How we did it
     Python code is available at: https://github.com/nadaoneal/gapps_python_whatever
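The reminder logic behind such a calendar can be sketched as follows. This is a hypothetical reconstruction, not the Columbia script linked above: it assumes all-day events with 5-month and 3-month advance reminders (matching the later slides) plus a 30-day vendor-notification deadline, and builds payloads in the shape the Google Calendar API v3 expects.

```python
# Hypothetical sketch of renewal-reminder event generation (not the
# actual Columbia code; see the GitHub repository above for theirs).
from datetime import date, timedelta

def months_before(d, n):
    """Return the date n calendar months before d (day clamped to 28
    to avoid invalid dates such as Feb 30)."""
    month, year = d.month - n, d.year
    while month < 1:
        month += 12
        year -= 1
    return date(year, month, min(d.day, 28))

def reminder_events(title, renewal_date):
    """Build all-day event payloads in the Google Calendar API v3
    shape ('summary' plus 'start'/'end' dicts with a 'date' key)."""
    def event(summary, day):
        iso = day.isoformat()
        return {"summary": summary,
                "start": {"date": iso},
                "end": {"date": iso}}
    return [
        event(f"{title}: 5-month renewal reminder",
              months_before(renewal_date, 5)),
        event(f"{title}: 3-month renewal reminder",
              months_before(renewal_date, 3)),
        # 30-day notice window per the licensing constraints on slide 5.
        event(f"{title}: last day to notify vendor",
              renewal_date - timedelta(days=30)),
        event(f"{title}: renewal/payment due", renewal_date),
    ]

if __name__ == "__main__":
    for ev in reminder_events("Journal Package X", date(2012, 7, 1)):
        print(ev["start"]["date"], ev["summary"])
```

In a live version each payload would be posted to a shared fund calendar via the API's events.insert call, so reminders are pushed to selectors rather than pulled.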
  29. Screenshot: one month view
  30. Screenshot: expenditure event view
  31. Screenshot: 3 month event reminder
  32. Screenshot: 5 month event reminder
  33. Screenshot: staff work view
  34. Screenshot: searching across calendars
  35. Thank you. Image: http://www.openclipart.org/detail/159643/thank-you-pinned-by-juliobaha
