
Keepit Course 5: Tools for Assessing Trustworthy Repositories


This presentation provides a quick overview of two key, and complementary, tools used to measure trust of digital repositories. First it focusses on Trustworthy Repositories Audit and Certification (TRAC), leading towards another tool, DRAMBORA, that is applied more extensively in the next presentation. The presentation was given as part of the final module of a 5-module course on digital preservation tools for repository managers, presented by the JISC KeepIt project. For more on this and other presentations in this course look for the tag ’KeepIt course’ in the project blog


  1. Tools for assessing trustworthy repositories. A quick overview of TRAC leading to DRAMBORA, by Steve Hitchcock. Haven’t we met somewhere before? Previously on this JISC KeepIt course…
  2. How AIDA was built:
     1. An Audit Checklist for the Certification of Trusted Digital Repositories. This is the document originally produced by RLG-NARA, which later became Trustworthy Repositories Audit and Certification (TRAC): Criteria and Checklist
     2. The Cornell University Survey of Institutional Readiness
     3. Summary of RLG-OCLC Framework Component Characteristics
     4. Network of Expertise in long-term STORage (NESTOR)
  3. DRAMBORA and DAF
     • DRAMBORA (Digital Repository Audit Method Based on Risk Assessment)
       – Requires much self-insight and preparedness
       – Sees everything as risk management
       – Predicated on TDR
       – A bit complex
     • DAF (Data Audit Framework)
       – Concerned with one asset type (research data)
       – Aims to improve management
       – Helps to measure value of research data
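DRAMBORA’s "everything as risk management" stance can be illustrated with a small sketch: assessors identify risks and score each one, and the scores are combined to rank which risks need attention first. The fields, scales and example risks below are hypothetical illustrations, not DRAMBORA’s own register format; a simple probability × impact product stands in for the toolkit’s severity calculation.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    """One entry in a DRAMBORA-style risk register (illustrative fields)."""
    identifier: str
    description: str
    probability: int   # hypothetical scale: 1 (rare) .. 6 (frequent)
    impact: int        # hypothetical scale: 1 (negligible) .. 6 (catastrophic)

    @property
    def severity(self) -> int:
        # Combine likelihood and impact; a plain product is used
        # here purely for illustration.
        return self.probability * self.impact

# A tiny, made-up register of the kind a self-audit might produce
register = [
    Risk("R01", "Loss of funding for repository staff", 3, 5),
    Risk("R02", "Obsolescence of ingest file formats", 4, 3),
    Risk("R03", "Hardware failure of storage array", 2, 4),
]

# Rank the register so the highest-severity risks surface first
for risk in sorted(register, key=lambda r: r.severity, reverse=True):
    print(f"{risk.identifier}: severity {risk.severity} - {risk.description}")
```

The ranking, not the absolute numbers, is the point: it gives a repository manager a defensible order in which to respond to weaknesses.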
  4. Overview
     • KRDS1 Aim – investigate costs, develop model and recommendations
     • Method – detailed analysis of 4 models: LIFE1/2 & NASA CET in combination with OAIS and UK Research TRAC
     • Plus literature review; 12 interviews; 4 detailed case studies.
  5. Overview
     • KRDS1 Aim – investigate costs, develop model and recommendations
     • Method – detailed analysis of 4 models: LIFE1/2 & NASA CET in combination with OAIS and UK Research TRAC. Sorry, wrong TRAC: this is HEFCE’s Transparent Approach to Costing
     • Plus literature review; 12 interviews; 4 detailed case studies.
  6. LIFE3 Integration (slide diagram; only the tool names are recoverable): DROID, Planets Content Profile, FITS, DRAMBORA, Plato, Planets Preservation Tool, JISC Policy Framework, Data Audit Framework.
  7. Digital Preservation
     Reference Models:
       - Records Management, ISO 15489:2000
       - OAIS: Open Archival Information System, ISO 14721:2003
     Audit & Certification Initiatives:
       - RLG-National Archives and Records Administration Digital Repository Certification Task Force: Trustworthy Repositories Audit & Certification: Criteria and Checklist (TRAC)
       - NESTOR: Catalogue of Criteria of Trusted Digital Repositories
       - DCC/DPE: DRAMBORA: Digital Repository Audit Method Based on Risk Assessment
  8. … because good research needs good data: Trustworthy Repositories Audit & Certification (TRAC) Criteria and Checklist
     • RLG/NARA assembled an international task force to address the issue of repository certification
     • TRAC is a set of criteria applicable to a range of digital repositories and archives, from academic institutional preservation repositories to large data archives, and from national libraries to third-party digital archiving services
     • Provides tools for the audit, assessment, and potential certification of digital repositories
     • Establishes the required audit documentation
     • Delineates a process for certification
     • Establishes appropriate methodologies for determining the soundness and sustainability of digital repositories
     KeepIt #5: University of Northampton, 30 March 2010
  9. TRAC: not quite a global standard
     • “These efforts to merge development of a certification process highlighted small but important differences between the criteria in this audit checklist and the nestor Criteria Catalogue, for example. For now, a single, standardized set of criteria and applicable rules have proven impractical for geopolitical reasons.” Version 1.0, February 2007
     Working towards ISO standardisation of Digital Repository Audit and Certification – wiki
  10. Structure of TRAC. There are three primary areas to be assessed within TRAC:
     1. Organizational Infrastructure
     2. Digital Object Management
     3. Technologies, Technical Infrastructure, Security
  11. Ten principles. In January 2007 representatives of four preservation organizations convened at the Center for Research Libraries in Chicago to seek consensus on core criteria for digital preservation repositories, to guide further international efforts on auditing and certifying repositories. … ten basic characteristics of digital preservation repositories … The key premise underlying the core requirements is that, for repositories of all types and sizes, preservation activities must be scaled to the needs and means of the defined community or communities.
  12. … because good research needs good data: 10 Characteristics of Digital Repositories – an intellectual context for the work:
     • Commitment to digital object maintenance
     • Organisational fitness
     • Legal & regulatory legitimacy
     • Effective & efficient policies
     • Acquisition & ingest criteria
     • Integrity, authenticity & usability
     • Provenance
     • Dissemination
     • Preservation planning & action
     • Adequate technical infrastructure
     © HATII UofGlasgow, 2007 (CRL/OCLC/NESTOR/DCC/DPE meeting, January 2007)
  13. Structure of checklist entries
     A1.1 Repository has a mission statement that reflects a commitment to the long-term retention of, management of, and access to digital information.
     The mission statement of the repository must be clearly identified and accessible to depositors and other stakeholders and contain an explicit long-term commitment.
     Evidence: mission statement for the repository; mission statement for the organizational context in which the repository sits; legal or legislative mandate; regulatory requirements.
  14. TRAC Criteria Checklist. Within TRAC, there are 84 individual criteria. Only 82 criteria to go!
  15. To certify or not to certify? That is the question.
     1. Take a spreadsheet with all 84 TRAC criteria.
     2. Select one.
     3. Decide whether you could certify your repository for this, based on where your repository is now or where you think it might be after participating in this course.
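The exercise above amounts to filling in one row per criterion in a spreadsheet and tallying the result across TRAC’s three sections. A minimal sketch of that bookkeeping, assuming a CSV layout and status labels of my own invention (only the section names and the criterion IDs A1.1, B2.10 and C2.2 come from the course material; the one-line criterion summaries are paraphrases):

```python
import csv
import io

# The three TRAC assessment areas, keyed by the letter that
# prefixes every criterion ID
SECTIONS = {
    "A": "Organizational Infrastructure",
    "B": "Digital Object Management",
    "C": "Technologies, Technical Infrastructure, Security",
}

# A tiny stand-in for the full 84-criterion spreadsheet; the
# "status" column and its values are hypothetical
rows = """id,criterion,status
A1.1,Mission statement commits to long-term retention,met
B2.10,Documented process for testing understandability,partial
C2.2,Procedures to monitor software technology change,not met
"""

met = partial = unmet = 0
for row in csv.DictReader(io.StringIO(rows)):
    section = SECTIONS[row["id"][0]]  # section letter prefixes each ID
    if row["status"] == "met":
        met += 1
    elif row["status"] == "partial":
        partial += 1
    else:
        unmet += 1
    print(f"{row['id']} ({section}): {row['status']}")

print(f"{met} met, {partial} partial, {unmet} not met of 84 criteria")
```

Worked across all 84 criteria, a tally like this gives the "where is my repository now?" snapshot the exercise asks for, and makes year-on-year comparison straightforward.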
  16. TRAC: audit or certification?
     • Audit is the basis for comparing local capabilities against a set of core criteria for a trusted digital repository.
     • Certification is a further step that some repositories will and/or must take for formal, objective recognition at the international or network level.
     • The result of any audit must be viewed in the context in which it was undertaken.
  17. TRAC applicability in diversity. The digital preservation community has come not only to recognize but to embrace the fact that not all repositories will be “equal” … a proliferation of repository types (institutional repositories, open-access repositories, digital repositories, digital preservation repositories, digital archives, etc.) on local, regional, national, and international levels. For many of these repositories, preservation is not the primary purpose or explicit priority. With that understanding, it is easy to comprehend why some repositories may not choose to pursue certification, just as it is easy to see why others should feel compelled (or perhaps be compelled) to pursue certification.
  18. TRAC in use: CRL reviews Portico
     • “Center for Research Libraries’ report marks the first public disclosure of a digital certification review conducted by an independent entity.”
     • Portico is a not-for-profit digital preservation service providing a permanent archive of electronic journals, books, and other scholarly content. As of October 2009, the Portico archive preserved over 14 million e-journal articles and 1,900 e-books.
     • CRL has concerns about Portico’s status on 12 of the 84 criteria
     ortico%20Audit%202010.pdf
  19. CRL reviews Portico: Organisational Infrastructure
     • Criterion A3.2: Repository has procedures and policies in place, and mechanisms for their review, update, and development as the repository grows and as technology and community practice evolve.
     • Portico’s policy infrastructure has improved considerably since the test audit in 2006, but some of these policies still suffer from internal contradictions and inconsistencies, specifically in the area of roles & responsibilities and job descriptions.
  20. CRL reviews Portico: Digital Object Management
     • Criterion B2.10: Repository has a documented process for testing understandability of the information content and bringing the information content up to the agreed level of understandability.
     • Portico needs to continue to identify what its community believes is necessary for “understandability” or usability of the preserved content. Portico should develop a process to support ongoing research into the needs of its community and determine what Portico stakeholders think is an understandable e-journal, e-journal article, e-book, etc. As those needs evolve, Portico should develop test scenarios to evaluate how well the archive meets those needs.
  21. CRL reviews Portico: System Infrastructure
     • Criterion C2.2: Repository has software technologies appropriate to the services it provides to its designated community and has procedures in place to receive and monitor notifications, and evaluate when software technology changes are needed.
     • Portico’s ability to disseminate content to users in the event of a major “trigger event” (for example, where all content from a large publisher with a large user base must be made available) is limited. This relates to Portico’s status as a dark archive.
  22. TRAC in use: eCrystals repository. TRAC is open-ended and exploratory, taking into account the vision, goals and plans for a repository, and is therefore more suited to repositories with an established long-term archival and preservation mandate. At the current stage of development of the eCrystals data repository we recommend self-assessment using the DRAMBORA toolkit as an instrument. The audit process in many ways is more important than actual certification, since it allows repositories to analyse and respond to their archives’ strengths and weaknesses in a systematic fashion. Also, DRAMBORA takes a more quantified approach to assessing repositories and would therefore work best for an established repository looking for self-assessment. Patel and Coles, A study of Curation and Preservation Issues in the eCrystals Data Repository and Proposed Federation, 7 Sept. 2007
  23. DRAMBORA in use: eCrystals (Patel and Coles)
     • Due to the recent rapid developments in this area, as well as the estimated time, effort and cost of undertaking an audit (the DRAMBORA documentation estimates 28–40 hours, depending on the scope and objectives), we have been unable to complete an audit by the end of Phase 3 (June 2007).
  24. DRAMBORA in use: eCrystals (Patel and Coles)
     • Having examined the criteria being used in the various audit checklists, it is clear that there is a need to establish the scope and objectives of an audit more explicitly and to relate them to eCrystals more closely.
     • For a long-term repository it would be beneficial to have regular audits, which periodically verify the proper functioning of records management procedures and systems and the authenticity and reliability of the records kept. Such monitoring is also useful in building up a profile of the repository over time in the face of a continuously changing environment. We suggest that a self-audit be undertaken at a frequency of once a year.
     See also Patel, Preservation Planning for Crystallography Data, 25 June 2009
     090625.pdf
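The once-a-year self-audit recommendation is easy to make operational: record the date of the last completed audit and flag when the next one falls due. A minimal sketch, assuming a fixed 365-day interval (the function names and the interval choice are mine, not from the DRAMBORA toolkit):

```python
from datetime import date, timedelta

AUDIT_INTERVAL = timedelta(days=365)  # "once a year", per the recommendation

def next_audit_due(last_audit: date) -> date:
    """Date by which the next self-audit should be completed."""
    return last_audit + AUDIT_INTERVAL

def is_overdue(last_audit: date, today: date) -> bool:
    """True once the repository has gone more than a year unaudited."""
    return today > next_audit_due(last_audit)

# Example: an audit completed at the end of Phase 3 (June 2007)
last = date(2007, 6, 30)
print(next_audit_due(last))                 # 2008-06-29
print(is_overdue(last, date(2008, 7, 1)))   # True
```

Kept alongside the audit reports themselves, such a check supports the profile-over-time monitoring that Patel and Coles describe.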