How Do We Measure Success In Digital Repositories

Speaker notes: While there are different types of digital repositories (subject repositories, digital archives, institutional repositories), I am going to focus on institutional repositories. Much of what I cover will be applicable to all kinds, but some will be specific to IRs. When trying to measure success, you first define what you consider to be successful, then set goals and criteria to reach that point, then benchmark your success over time.

1. How Do We Measure Success in Building Digital Repository Collections?
   Richard R. Bernier
   Reference and Electronic Services Librarian
   April 4, 2008
2. How do you define successful?
   • It depends on your overall mission and goals for your IR. That is the starting point upon which to build and set criteria for measuring the success of the IR.
3. A definition for a successful IR
   • A successful IR is one that has become fully integrated into the scholarly communication mission and culture of its institution, thereby gaining full support and participation from faculty, the library, and the administration.
   • Tangible indicators would include a high number of repository deposits, a high number of document downloads, administrative support and commitment, and suitable management elements in place.
4. Some goals
   • Create avenues for greater access to information by making it free and open
   • Expand readership of research material, thus expanding its impact
   • Permanently preserve scholarly materials in electronic form
5. Some goals
   • A high percentage of faculty submitting to the repository
   • A large number of items in the collection
   • A library that is sufficiently funded and staffed, with a system managed in accordance with community standards
   • An administration that recognizes the IR as an important function of the library
6. So how do we measure this?
   • Measured by how mature the repository is:
   • 1) Content: size and participation
      - Size and scope of the collection, growth rate, acceptance by the user community, usage
   • 2) Maturity of the system
      - Financial and administrative support
      - Preservation policy and practice
      - Management policy and practice
      - Workflow establishment
7. IR size and participation
   • Total number of items in the IR
   • How does it compare with IRs at similarly sized institutions?
      - Who would serve as good examples?
8. IR size and participation
   • Percentage of university faculty who have deposited items in the IR
   • Must take into account:
      - The difference between the number of depositors and the number of authors
      - The number of authors per article, with each author counted
      - Articles with multiple authors from multiple institutions
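A minimal sketch of the depositor-versus-author distinction in code, assuming a hypothetical list of deposit records and a faculty roster (neither structure comes from the slides):

    # Hypothetical deposit records: each item names one depositor but may
    # list several authors, so authorship participation exceeds depositorship.
    deposits = [
        {"depositor": "smith", "authors": ["smith", "jones", "lee"]},
        {"depositor": "jones", "authors": ["jones"]},
    ]
    faculty = {"smith", "jones", "lee", "patel"}  # current faculty roster

    depositors = {d["depositor"] for d in deposits} & faculty
    authors = {a for d in deposits for a in d["authors"]} & faculty

    print(f"Depositor rate: {len(depositors) / len(faculty):.0%}")  # 50%
    print(f"Author rate:    {len(authors) / len(faculty):.0%}")     # 75%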
9. IR size and participation
   • Of the total number of authors, how many submitted 1 article, 2, 3, etc.?
      - X number of authors make up x% of content
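The "X authors make up x% of content" figure is a frequency count over the deposit records. A sketch, assuming hypothetical (author, item) pairs pulled from repository metadata:

    from collections import Counter

    # Hypothetical (author, item_id) pairs; not a structure the slides define.
    pairs = [("smith", 1), ("smith", 2), ("smith", 3), ("jones", 4), ("lee", 5)]

    per_author = Counter(author for author, _ in pairs)  # items per author
    total_items = len({item for _, item in pairs})

    # How many authors submitted 1 item, 2 items, 3 items, ...?
    for n_items, n_authors in sorted(Counter(per_author.values()).items()):
        share = n_items * n_authors / total_items
        print(f"{n_authors} author(s) deposited {n_items} item(s): {share:.0%} of content")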
10. IR size and participation
   • Percentage of published peer-reviewed articles being deposited
      - A database of all institutional research output would be helpful for comparison
   • Of those not being deposited, why?
      - Publisher won't allow it?
      - Embargo on self-archiving?
      - Other (authors don't have time, don't know whether they can, don't want to)
11. IR characteristics
   • By class (e.g. Education, Engineering, Science, Social Sciences)
   • By subclass (Biology, Sociology, etc.)
   • By department
   • By version (preprints, postprints, etc.)
   • By type (report, journal article, conference paper, working paper)
   • By date (deposits per year)
   • Number of depositors (author vs. non-author)
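Once the metadata is exported, each of these breakdowns is a one-line tally. A sketch, assuming hypothetical records whose field names are illustrative only (real field names depend on the IR platform):

    from collections import Counter

    # Hypothetical metadata records for three deposited items.
    items = [
        {"class": "Engineering", "type": "journal article", "version": "postprint", "year": 2007},
        {"class": "Science", "type": "conference paper", "version": "preprint", "year": 2008},
        {"class": "Engineering", "type": "report", "version": "postprint", "year": 2008},
    ]

    # Tally the collection along each facet named on the slide.
    for facet in ("class", "type", "version", "year"):
        print(facet, dict(Counter(item[facet] for item in items)))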
12. Overall cost
   • Running an IR can be very expensive
   • Monitoring costs can help determine the overall value of the collection
   • Separate startup costs from yearly costs:
      - MIT: $285,000 per year for personnel and systems
      - U of Oregon: 2,280-3,190 staff hours for fiscal year 2003-2004
      - U of Rochester: $200,000 for Oct 2003 - Sept 2004
13. Cost per deposit
   • Total IR costs / total number of items deposited
   • A poor evaluation on its own: it does not account for what is being gained
      - e.g. the effect on the serials budget, which might not be known for years and depends on other institutions' involvement
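As a worked example, using MIT's reported yearly figure from slide 12 and a purely hypothetical deposit count (the slides give no deposit totals):

    yearly_cost = 285_000        # MIT's personnel and systems cost, per slide 12
    deposits_per_year = 1_500    # hypothetical; not from the slides

    print(f"Cost per deposit: ${yearly_cost / deposits_per_year:,.2f}")  # $190.00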
14. Usage assessment
   • Downloads: the raw number of downloads
   • Know where traffic is coming from (searches within the IR or outside sources, e.g. Google, OAIster)
   • Ability to sort out:
      - Administrative-end access
      - Artificial inflation (robots, crawlers, spam bots)
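A minimal sketch of the kind of filtering meant here, assuming already-parsed (IP, user agent) pairs and a hand-maintained bot list; a production IR would rely on its platform's statistics module or a community standard such as the COUNTER rules rather than this:

    # Substrings that mark known robots and crawlers; extend as needed.
    BOT_MARKERS = ("bot", "crawler", "spider", "slurp")
    ADMIN_IPS = {"10.0.0.5"}  # hypothetical administrative workstations

    def is_countable(ip: str, user_agent: str) -> bool:
        """Count a download only if it is neither admin traffic nor a known bot."""
        ua = user_agent.lower()
        return ip not in ADMIN_IPS and not any(m in ua for m in BOT_MARKERS)

    hits = [
        ("66.249.66.1", "Googlebot/2.1"),  # crawler: excluded
        ("192.0.2.44", "Mozilla/5.0"),     # real reader: counted
        ("10.0.0.5", "Mozilla/5.0"),       # admin workstation: excluded
    ]
    print(sum(is_countable(ip, ua) for ip, ua in hits))  # 1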
15. Citation analysis
   • Ability to determine how often articles are being cited
      - Still hard to do for journals not indexed by ISI
   • Article-level impact measures are still being developed
   • Download and citation assessments will both help convince authors to deposit into an IR
16. Maturity of the "project"
   • "Project" indicates:
      - It's experimental
      - It's temporary
      - Funding is temporary and might dry up
      - It's separate from "regular library tasks"
   • Transform the IR from a "project" into a core part of the library's mission.
17. Library mission
   • Starting point: the library's mission statement and goals; the IR should outlive the current library administration
   • Show commitment through:
      - Funding: should be a line item in the budget
         • Staffing, systems, etc.
18. Library mission
   • Staffing
      - Project administrator
      - Adequate tech support
      - Integration of IR needs with other staff (subject liaisons, outreach, other value-added services)
   • Systems: plan for future system needs
      - System and server upgrades, additional servers, vendor price increases, additional staffing as needed
19. Institutional mission
   • Institutional recognition / mandate
      - A mandate on the library to maintain the IR
      - Financial commitment
      - Board of Trustees recognition
      - Eventually, a (faculty-approved) mandate requiring faculty to deposit their peer-reviewed articles into the IR
20. A mature system
   • Show commitment toward meeting community standards:
      - Interoperability
      - Metadata
      - Preservation
      - Trusted repository status
21. Trusted repository
   • Merriam-Webster's definition of trust:
      - assured reliance on the character, ability, strength, or truth of someone or something
      - one in which confidence is placed
      - a charge or duty imposed in faith or confidence or as a condition of some relationship
      - something committed or entrusted to one to be used or cared for in the interest of another
22. Trusted repository
   • Trust can come in three varieties:
      1) How institutions earn the trust of their designated communities
      2) How institutions trust third-party providers
      3) How users trust the documents provided to them by a repository
23. Attributes of a trusted digital repository
   • Open Archival Information System (OAIS) compliance
   • Administrative responsibility
   • Organizational viability
   • Financial sustainability
   • Technological and procedural suitability
   • System security
   • Procedural accountability
24. OAIS
   [OAIS reference model diagram; image not captured in the transcript]
25. Trusted repository certification
   • Center for Research Libraries
   • National Archives and Records Administration
   • RLG-NARA Task Force on Digital Repository Certification:
      - "Trustworthy Repositories Audit & Certification: Criteria and Checklist"
26. Conclusion
   • Define success
   • Set goals to create a mature system:
      - Content development
      - Maturity of the system
27. Questions?
   Richard Bernier
   Reference & Electronic Services Librarian
   Rose-Hulman Institute of Technology
   [email_address]

   Selected Bibliography
   • Thibodeau, Kenneth. "What Constitutes Success in a Digital Repository?" Workshop on Digital Curation & Trusted Repositories: Seeking Success, June 15, 2006.
   • "Trustworthy Repositories Audit & Certification: Criteria and Checklist." RLG-NARA Task Force on Digital Repository Certification, February 2007.
   • "Trusted Digital Repositories: Attributes and Responsibilities." An RLG-OCLC Report. RLG, Mountain View, CA, 2002.
   • Xia, Jingfeng, and Li Sun. "Assessment of Self-Archiving in Institutional Repositories: Depositorship and Full-Text Availability." Serials Review, January 2007, 14-20.
