KeepIt Course 5: DRAMBORA: Risk and Trust and Data Management, by Martin Donnelly
This presentation provides an extensive introduction to the Digital Repository Audit Method Based On Risk Assessment (DRAMBORA). After considering how the ideas of risk and trust affect digital repositories, the DRAMBORA methodology is applied in two practical exercises. DRAMBORA can also be applied through an interactive Web-based version of the tool, and the presentation offers an illustrated guided tour of that version. The presentation was given as part of the final module of a five-module course on digital preservation tools for repository managers, presented by the JISC KeepIt project. It concludes by comparing DRAMBORA with DAF, the Data Asset Framework, also produced by the DCC; this brings the KeepIt course full circle, as the course began in module 1 with DAF. For more on this and other presentations in the course, look for the tag 'KeepIt course' on the project blog: http://blogs.ecs.soton.ac.uk/keepit/

Notes
  • Hard to define: but we must define data quality if we want to automate its handling. Multi-faceted: no single score covers all aspects of data quality, so many scores are needed, which is expensive. Highly application-specific: again, more than one score is needed, and the details of the scores are hard to reuse. Highly subjective: which makes it difficult to automate completely.
  • Physical: theft, vandalism, arson, building-related risks, storm, flood and other weather-related damage, and damage to vehicles, mobile plant and equipment.
  • In the last 50 years, computer science has witnessed numerous cycles of software development migration, and the literature contains many studies, case reports, and models. Several publications were very useful in developing our understanding of risk assessment of digital information. Rapid Development (McConnell 1996) is a monograph on the general problems associated with software development. In many respects, software development exhibits several of the same problems associated with basic digital preservation. While researching risk assessments, we were struck by the vast differences in basic definitions used by different disciplines. (For example, see Reinert, Bartell, and Biddinger [1994], Warren-Hicks and Moore [1995], McNamee [1996], Wilson and Crouch [1987], Starr [1969], and Lagadec [1982].) Numerous professions measure risk, and each assigns risks a unique vocabulary and context. The degree and type of risk associated with any data archive may be understood differently by administrators, operational staff members, and data users, depending upon their individual training and experience. The measurement of risk was equally problematic. One paper correlated risk level with the nonlinear relative probability of risk occurring (Kansala 1997). Another publication introduced an algebraic formula (McConnell 1996). In a third instance, a research group felt that cases where one could accurately assess the probability of a future event were rare because the information technology environment for software changes so rapidly. They preferred simple estimates, such as high, medium, and low, which they believed facilitated decision making (Williams, Walker, and Dorofee 1997). Risk-measurement scales, like risk definitions, are as distinctive as their developers.
  • Funded by the German Ministry of Education and Research
  • If auditors go to where the activity takes place, they will observe more than an explanation alone can convey, and will gain more insight into which follow-up questions to ask. Also ask for a demonstration of the procedure: what actually happens when a procedure takes place is often not documented, and some procedures take place in multiple or undefined locations.
  • Excerpt: The process was extremely insightful and highlighted possible areas where the DRAMBORA methodology could be improved, as well as a range of generic objectives, functions and concerns common to digital libraries. We concluded that the take-up and use of DRAMBORA would benefit from the introduction of an interactive tool and sharable registry of responses so that organisations undertaking self-assessment could profit from the experiences and responses of the organisations that they consider to be their peers. Further clarification was achieved about the specific classes of post-holders that should participate in the assessment process, as well as the optimal number of participants and most appropriate means of establishing conversational focus. We gained a better understanding of the practical ways in which organisations assess their risks; as a result we concluded that the original DRAMBORA risk impact and probability scores could be made less granular, and that more opportunities should be available for respondents to consider the severity of their risks in more relative terms, rather than in comparison with objective impact and probability metrics. These four assessments have also made it possible to develop a generic risk profile for digital libraries. Finally, from the perspective of each of the audited institutions, the process was overwhelmingly successful; testimonials from representatives of each described in detail the benefits of formally scrutinising the organisational characteristics and implicit challenges faced within their own digital library.
  • Risk assessment and the DRAMBORA methodology 15 July 2009 Donnelly, McHugh, Ross, Innocenti, Ruusalepp and Hofman AREAS OF EXPRESSION - Reputation and intangibles - Organisational viability - Service delivery - Technology
  • Trick is to work backwards from mandate and goals: what could prevent these from being achieved?
  • No single entry point into lifecycle. Tools cover different, but at times overlapping, phases of it
  • - The underlying methodologies are the same: both are self-management tools to assess the extent to which you’re meeting goals, be that running effective repositories or good data management - The chief difference lies in the context in which they’re applied. DRAMBORA focuses on repositories and how well they’re meeting their mission OR helps with planning the development of trusted repositories (PLATTER useful here too) DAF assesses the management of data in the earlier stages of the lifecycle, looking more at the work of researchers - Information may not always map directly between the two tools as each assessment will have different parameters and be a self-contained process, however they will inform each other. e.g. the risks / gaps noted in a DAF assessment will point to what’s needed from a repository or more formal curation environment e.g. the standards and technologies used for data creation could point to preservation issues or DRAMBORA risks e.g. tracking assets from the point of creation using DAF to deposit and ongoing risk assessment activity may have a positive impact on the assets’ authenticity for the longer-term
  • - Primary output from DAF is the register of data assets. This could feed into DRAMBORA assessment, but would also be a useful tool for repositories e.g. to track data / prompt ingest - Information on roles could map directly to DRAMBORA as some support / skills may overlap - Underlying context is a key area covered in DAF that can inform DRAMBORA. Funder requirements / legislative context will be similar Standards / best practice used at creation will affect longer-term curation - Data management risks will likely echo repository risks as both will have similar aims e.g. ensuring data integrity / authenticity, continued access, meaningful / reusable resources - Risks faced in one context could cast light on issues that may be faced in another OR may help to demonstrate value of other e.g. data better curated in repository context
  • Here is a visualisation of the DRAMBORA process and the points at which DAF information may feed through. Note: 'assets' has a broader definition in DRAMBORA - it includes software, physical assets, services, processes, people and intangibles. Considering the DAF asset register as a collection may be beneficial for providing a more well-defined context for DRAMBORA audits. In this respect, collections of assets could be assessed to reflect specific research projects, specific types of assets (e.g. images), a department or institution, or perhaps even a contributor perspective (i.e. a collection of assets reflecting the various work of a specific researcher).
  • - JISC funded project starting in November to integrate several related data management planning tools - Aim is to help institutions plan for and benchmark / assess their data management strategy AIDA about institutional preparedness for preservation: organisation; technology; resources LIFE about preservation costs – HATII will be developing the tool - Will work in collaboration with / provide support to 07/09 data infrastructure projects Let us know what overlaps you see between DRAMBORA and DAF methodologies / toolkits, or areas where you think they could be brought closer together
  • Transcript

    • 1. DRAMBORA: Risk and Trust and Data Management Martin Donnelly DCC, University of Edinburgh [email_address] (and Andrew McHugh, Sarah Jones, Joy Davidson, Seamus Ross, Raivo Ruusalepp, Perla Innocenti…)
    • 2. Order of Play
      • Part I: Risk and Trust in Digital Repositories
      • Part II: The DRAMBORA Methodology
        • How it was arrived at
        • Where it can take you
      • Part III: Risk Management Exercise
      • Part IV: DRAMBORA Interactive
        • An introductory overview
        • Preview of v2.0
      • Part V: DRAMBORA and DAF within the preservation lifecycle
        • Future systems integration
    • 3. Order of Play
      • Part I: Risk and Trust in Digital Repositories
      • Part II: The DRAMBORA Methodology
        • How it was arrived at
        • Where it can take you
      • Part III: Risk Management Exercise
      • Part IV: DRAMBORA Interactive
        • An introductory overview
        • Preview of v2.0
      • Part V: DRAMBORA and DAF within the preservation lifecycle
        • Future systems integration
    • 4. DRAMBORA
      • The Digital Repository Audit Method Based On Risk Assessment (DRAMBORA) was developed by the Digital Curation Centre (DCC) and DigitalPreservationEurope (DPE) to assist repository management and staff to identify, assess, manage, and mitigate risks.
      • Definition: risks describe challenges or threats that impede the achievement of repository objectives, obstruct activities, and prejudice the continued availability of essential assets.
      • In DRAMBORA, risks have several attributes: probability, impact, severity (a derived value, p*i), owner(s), and management strategies. Risks may also link to other risks. (See 'Anatomy of a Risk' below…)
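The derived severity value described above (probability multiplied by impact) can be sketched in code. This is an illustrative model only, not part of the DRAMBORA toolkit; the 1-6 scoring range is an assumption for the example, since an assessment would agree its own scales.

```python
# Illustrative sketch (not the DRAMBORA toolkit itself): severity derived
# as the product of a probability score and an impact score, p*i.

def severity(probability: int, impact: int) -> int:
    """Derive severity (p * i) from probability and impact scores.

    The 1-6 range used here is an assumption for illustration; a real
    assessment would define and document its own scoring scale.
    """
    if not (1 <= probability <= 6 and 1 <= impact <= 6):
        raise ValueError("scores must fall within the agreed scale")
    return probability * impact

# A risk with moderate probability but high impact still scores highly.
print(severity(3, 6))  # 18
```

Because severity is derived rather than assessed directly, re-scoring either input automatically re-ranks the risk.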
    • 5. DRAMBORA covers:
        • information assets (analogue/digital materials, databases, data files, contracts, agreements, documentation, policies and procedures);
        • software assets;
        • physical assets;
        • services and utilities;
        • business processes;
        • people (staffing and skills);
        • intangibles, such as reputation.
    • 6. Definition of a repository
      • We propose that a digital repository is differentiated from other digital collections by the following characteristics:
          • content is deposited in a repository, whether by the content creator, owner or third party;
          • the repository architecture manages content as well as metadata;
          • the repository offers a minimum set of basic services e.g. put, get, search, access control;
          • the repository must be sustainable and trusted, well-supported and well-managed.
          • Heery and Anderson (2005) ‘Digital Repositories Review’
          • http://www.jisc.ac.uk/uploaded_documents/digital-repositories-review-2005.pdf
      • For DRAMBORA, ‘repository’ is a broad term encompassing many different types of resource and collection
      • (N.B. despite its acronym, the DRAMBORA methodology and system may be used for analogue collections as well as digital content!)
    • 7. 10 Characteristics of Digital Repositories
      • An intellectual context for the work:
        • Commitment to digital object maintenance
        • Organisational fitness
        • Legal & regulatory legitimacy
        • Effective & efficient policies
        • Acquisition & ingest criteria
        • Integrity, authenticity & usability
        • Provenance
        • Dissemination
        • Preservation planning & action
        • Adequate technical infrastructure
      (CRL/OCLC/NESTOR/DCC/DPE meeting, January 2007) © HATII UofGlasgow, 2007
    • 8. Trustworthiness and Archival Stewardship
      • Trustworthiness is an increasingly sought after commodity
      • Decentralisation part of a normal progression (see UK AHDS)
      • Trustworthiness has wide reaching implications
        • external (financiers, depositors, creators, consumers)
        • internal (management, strategic planning)
    • 9. The Challenge of Building Trust
      • There is work going on now to define certification methodologies and processes for trusted digital repositories, but formal certification is still some way off. The DCC view is that the most effective way to build trust amongst stakeholder communities at this time is not necessarily through formal certification, but rather by the ability to:
      • illustrate that you know what risks threaten your ability to meet your mandate
      • provide evidence that you have considered these risks, understand them, and have appropriate measures in place to manage and mitigate them over time
    • 10. Trustworthy Repositories Audit & Certification (TRAC) Criteria and Checklist
      • RLG/NARA assembled an International Task Force to address the issue of repository certification
        • TRAC is a set of criteria applicable to a range of digital repositories and archives, from academic institutional preservation repositories to large data archives and from national libraries to third-party digital archiving services
      • Provides tools for the audit, assessment, and potential certification of digital repositories
      • Establishes required audit documentation
      • Delineates a process for certification
      • Establishes appropriate methodologies for determining the soundness and sustainability of digital repositories
    • 11. Risk and Repositories
    • 12. Types of preservation risk
      • Economic
      • Financial
      • Political
      • Contractual
      • Environmental
      • Technological
      • Physical
      • Organisational
      • Socio-cultural
      • Legal
    • 13. Standard Risk Management Model
    • 14. Risk Management and Digital Preservation
      • Lack of literature for risk-assessment in LIS compared with Computer Science
      • Differences in definitions used by different disciplines
      • Quantifying risk is problematic
      • The greatest challenge is the interpretation of the risk, i.e. to determine when a risk is acceptable
      • To manage this we create a risk register
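A risk register of the kind mentioned above can be modelled minimally as follows. This is a sketch under assumed field names and example risks, not DRAMBORA's actual register schema.

```python
# Minimal risk-register sketch: hold risks with probability and impact
# scores, derive severity (p*i), and list the most severe first.
# Field names and example risks are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Risk:
    identifier: str
    name: str
    probability: int  # assessed likelihood score
    impact: int       # assessed impact score

    @property
    def severity(self) -> int:
        # Severity is derived, not assessed directly.
        return self.probability * self.impact

register = [
    Risk("R01", "Server hardware failure", probability=4, impact=5),
    Risk("R02", "Loss of key staff member", probability=2, impact=4),
    Risk("R03", "Funding withdrawn", probability=1, impact=6),
]

# Rank risks so the most severe are considered first.
for risk in sorted(register, key=lambda r: r.severity, reverse=True):
    print(f"{risk.identifier}: {risk.name} (severity {risk.severity})")
```

Sorting by derived severity is what makes the register a decision aid rather than a flat list: the interpretation question ("when is a risk acceptable?") becomes a question of where to draw the line in the ranking.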
    • 15. The nestor Catalogue of Criteria
      • The nestor working group developed a Catalogue of Criteria for Trusted Digital Repositories…
        • Aimed at German memory organisations and institutions, service providers devising, planning and implementing digital repositories
        • Provides guidance, tools for self-checking, and potentially certification
        • Abstract criteria, applicable to a range of digital repositories and valid over a longer period
        • Basic principle: adequacy. Evaluation is always based on the objectives and tasks of the individual digital repository concerned
      www.digitalpreservation.de
    • 16. Top down approach: tried and tested
      • Many auditable domains benefit from objective criteria
        • Information and IT security
        • Financial regulation
      • But disregards diversity evident across preservation discipline
        • funding, scale, legislative responsibilities and restrictions, content types, technology and policy vary
      • Generic criteria are difficult to conceive
    • 17. The Risks of Objectivism
      • Difficulties associated with a generalisation of optimal repository characteristics
      • Do all repositories share singularity of purpose / uniform priorities?
      • Documenting a set of ‘blue sky’ aspirational repository qualities is useful – nestor and TRAC make compelling reference materials
      • But both check-lists are necessarily vague
    • 18. Order of Play
      • Part I: Risk and Trust in Digital Repositories
      • Part II: The DRAMBORA Methodology
        • How it was arrived at
        • Where it can take you
      • Part III: Risk Management Exercise
      • Part IV: DRAMBORA Interactive
        • An introductory overview
        • Preview of v2.0
      • Part V: DRAMBORA and DAF within the preservation lifecycle
        • Future systems integration
    • 19. The Evolution of an Audit Methodology
      • Pilot Audits Aiming to:
        • Develop
        • Validate
        • Refine
        • Deploy
      • A methodology for repository audit
    • 20. DRAMBORA Method
      • Discrete phases of (self-)assessment, reflecting the realities of audit
      • Preservation is fundamentally a risk management process:
        • Define Scope
        • Document Context and Classifiers
        • Formalise Organisation
        • Identify and Assess Risks
      • Builds audit into internal repository management procedures
    • 21. What does this mean in practice?
      • Establish organisational profile
      • Develop contextual understanding
      • Identify and classify repository activities and assets
      • Derive registry of pertinent risks
      • Undertake assessment of risks (and existing management means)
      • Commit to management strategies
    • 22. The Risks of Subjectivity
      • DRAMBORA is fundamentally ‘bottom-up’
      • Comparability and reproducibility of results are compromised
      • Improvement in self-assessment is limited by one’s own horizons (no external view)
      • How can repositories comment on unanticipated risks when they are unaware of available opportunities?
    • 23. Finding Islands of Objectivity
      • 80 or so sample risks included in methodology to prompt thinking... but many more were needed!
      • DRAMBORA Interactive may enable repositories to align their objectives, activities, strengths and shortcomings with other peer repositories’ responses
      • Ambition to collate these as a series of repository profiles, encapsulating key roles, responsibilities, functions and risks
    • 24. * Discussion Break *
      • Who cares about repository audit?
      • Who will pay for it?
        • Who are the beneficiaries?
      • Should submitting to audit be compulsory?
        • Carrot versus stick?
      • Is auditing worthwhile?
      • What are the drawbacks of self-assessment?
    • 25. The Audit Process in a bit more detail
    • 26. 6 key questions ahead of the audit
      • Why is the audit being done?
      • What exactly is to be audited?
      • Who will conduct the audit?
      • Where will the audit take place?
      • When will the audit take place?
      • How will the work be carried out?
    • 27. 6 questions for auditing: #1 Why?
      • Identify and manage risks
      • Verify compliance
      • Check effectiveness
      • Identify opportunities for improvements
      • Engender trust in stakeholder communities
    • 28. 6 questions for auditing: #2 What?
      • Digital repositories, digital libraries, digital archives…
      • Information collections
      • Those that purport to be OAIS ‘compliant’?
      • Ongoing projects vs. projects not yet started
    • 29. 6 questions for auditing: #3 Who?
      • Organisations, research centres, data centres, libraries, museums….
      • National and international remits
      • Public and private sector
      • Auditor(s): internal or external, AND
      • Members of staff with specific roles and responsibilities within the repository
    • 30. 6 questions for auditing: #4 Where?
      • Comfortable environment with Internet connection
      • Close to where the activity takes place
      • Where demonstrations are feasible
      • Where staff can discuss without interruption
    • 31. 6 questions for auditing: #5 When?
      • Plan well in advance
      • Schedule with consideration for the status of project and/or repository being audited
      • Schedule onsite activities over consecutive days, but with time allocated before and after for additional analysis and conclusion
    • 32. 6 questions for auditing: #6 How?
      • Familiarity with DRAMBORA (inc. the online system) and other complementary methodologies
      • Aggregate, accumulate and create appropriate documentation
      • Online and onsite
      • Communication is critical
    • 33. Risk Impact, Risk Management and DRAMBORA
    • 34. Risk Impact in the repository context
      • Impact can be considered in terms of:
        • impact on repository staff or public well-being
        • impact of damage to or loss of assets
        • impact of statutory or regulatory breach
        • damage to reputation
        • damage to financial viability
        • deterioration of product or service quality
        • environmental damage
        • loss of the ability to ensure digital object authenticity and understandability is the ultimate expression of impact
    • 35. Risk Management and DRAMBORA
      • The toolkit refrains from prescribing specific management policies
      • Instead, auditors should:
        • choose and describe risk management strategy
        • assign responsibility for adopted measures
        • define performance and timescale targets
        • reassess success iteratively
    • 36. DRAMBORA Workflow
      • Preliminary collection and analysis of repository documentation
      • Organise appointments and onsite visits with repository staff (managers, curators, IT, legal experts…)
      • Risk registry finalisation
      • Audit report finalisation
      • Impact on individuals and organisations
    • 37. DRAMBORA Sample Audits (i)
      • Sample audits carried out at…
        • The Michigan-Google Digitization Project and MBooks at the University of Michigan Library
        • Gallica at the Bibliothèque nationale de France
        • the Digital Library of the National Library of Sweden
        • CERN’s Document Server
      Ross, S., McHugh, A., Innocenti, P., Ruusalepp, R.: Investigation of the potential application of the DRAMBORA toolkit in the context of digital libraries to support the assessment of the repository aspects of digital libraries (Glasgow: DELOS NoE, August 2008) (ISBN: 2-912335-41-8)
    • 38. DRAMBORA Sample Audits (ii)
      • Key conclusions
        • Identified areas for future improvement in the DRAMBORA methodology
        • Clarified key roles in the audit process
        • Positive feedback received on direct and subsidiary benefits of carrying out audits
        • Genesis of DRAMBORA Interactive…
      Ross, S., McHugh, A., Innocenti, P., Ruusalepp, R.: Investigation of the potential application of the DRAMBORA toolkit in the context of digital libraries to support the assessment of the repository aspects of digital libraries (Glasgow: DELOS NoE, August 2008) (ISBN: 2-912335-41-8)
    • 39. Order of Play
      • Part I: Risk and Trust in Digital Repositories
      • Part II: The DRAMBORA Methodology
        • How it was arrived at
        • Where it can take you
      • Part III: Risk Management Exercise
      • Part IV: DRAMBORA Interactive
        • An introductory overview
        • Preview of v2.0
      • Part V: DRAMBORA and DAF within the preservation lifecycle
        • Future systems integration
    • 40. DRAMBORA stages in brief
      • Establish organisational profile;
      • Develop contextual understanding;
      • Identify and classify repository activities and assets;
      • Derive registry of pertinent risks;
      • Undertake assessment of risks (and existing management means);
      • Commit to management strategies.
    • 41. Defining and identifying risks
      • Definition: risks describe challenges or threats that impede the achievement of repository objectives, obstruct activities, and prejudice the continued availability of essential assets.
      • In DRAMBORA, risks have several attributes: probability, impact, severity (derived as the product of probability and impact), area of expression, owner(s), and management strategies. Risks may also link to other risks.
      • With DRAMBORA, you can choose to:
        • Recycle existing risks (a number of ‘off-the-shelf’ risks are available for you to select and modify); or
        • Develop new risks from scratch.
    • 42. Anatomy of a risk
      • Risk Identifier: A text string provided by the repository to uniquely identify this risk and facilitate references to it within risk relationship expressions
      • Risk Name: A short text string describing the risk
      • Risk Description: A longer text string offering a fuller description of this risk
      • Example Risk Manifestation(s): Example circumstances within which the risk will or may execute
      • Date of Risk Identification: Date that the risk was first identified
      • Nature of Risk: One of: Physical environment; Personnel, management and administration procedures; Operations and service delivery; Hardware, software or communications equipment and facilities
      • Owner: Name of the risk owner (usually the same as the owner of the corresponding activity)
      • Escalation Owner: The name of the individual who assumes ultimate responsibility for the risk in the event of the stated risk owner relinquishing control
    • 43. Anatomy of a risk
      • Stakeholders: Parties with an investment or assets threatened by the risk's execution, or with responsibility for its management
      • Risk Relationships: A description of each of the risks with which this risk has relationships
      • Risk Probability: The perceived likelihood of the execution of this particular risk
      • Risk Potential Impact: The perceived impact of the execution of this risk in terms of loss of digital objects' understandability and authenticity
      • Risk Severity: A derived value, representing the product of the probability and potential impact scores
      • Risk Management Strategy(ies): Description of policies and procedures to be pursued in order to manage (avoid and/or treat) the risk
      • Risk Management Activity(ies): Practical activities deriving from defined policies and procedures
      • Risk Management Activity Owner: Individual(s) responsible for performance of risk management activities
      • Risk Management Activity Target: A targeted risk-severity rating plus a risk reassessment date
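The risk record described on these slides can be sketched as a small data structure. This is purely illustrative (not part of the DRAMBORA toolkit); the field names follow the slides, and severity is derived as probability × impact, as the methodology specifies.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Risk:
    identifier: str          # unique ID, used in risk relationship expressions
    name: str                # short text string describing the risk
    description: str         # longer, fuller description
    nature: str              # e.g. "Operations and service delivery"
    owner: str               # usually the owner of the corresponding activity
    escalation_owner: str    # takes over if the owner relinquishes control
    probability: int         # perceived likelihood of execution
    impact: int              # perceived impact on authenticity/understandability
    stakeholders: List[str] = field(default_factory=list)
    relationships: List[str] = field(default_factory=list)  # related risk IDs

    @property
    def severity(self) -> int:
        # Severity is a derived value: probability * impact
        return self.probability * self.impact

# Hypothetical example risk
r = Risk("R01", "Media failure", "Storage media degrade over time",
         "Hardware, software or communications equipment and facilities",
         "Storage manager", "Repository director",
         probability=3, impact=5)
print(r.severity)  # 15
```

Making severity a derived property, rather than a stored field, mirrors the methodology: reassessing probability or impact automatically updates severity.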
    • 44. Risk Relationships
      • Atomic: where risks exist in isolation, with no relationships with other risks
      • Domino: where avoidance or treatment associated with a single risk renders the avoidance or treatment of another less effective
      • Complementary: where avoidance or treatment mechanisms associated with one risk also benefit the management of another
      • Contagious: where a single risk’s execution will increase the likelihood of another’s
      • Explosive: where the simultaneous execution of n risks has an impact in excess of the sum of each risk occurring in isolation
    • 45. Scenario for the Exercise
      • You work in an archive that has recently expanded its mandate to include the stewardship of digital materials…
        • How do you determine your ability to safeguard the data you accept?
        • How can you prove your trustworthiness to those depositing data and reusing the resources over time?
    • 46. Part I – Identify a risk (30 minutes)
      • Each group should identify one risk (based on your own experiences wherever possible), and complete the DRAMBORA worksheet.
      • Groups should complete:
        • name and description of the risk;
        • example manifestations of the risk;
        • nature of the risk;
        • risk owner(s);
        • stakeholders who would be affected;
        • if possible, relationships with other risks.
    • 47. Part II – Mitigate the risk (30 minutes)
      • Now identify what steps your archive might take to manage and mitigate the identified risk over time…
      • Each group should complete:
        • Risk management strategy/-ies;
        • Risk management activities;
        • Risk management activity owner(s).
    • 48. Benefits of Risk Assessment Exercise
      • Firmly established organisational mandate
      • Understanding of legal and regulatory framework within which you are working
      • Development and maintenance of a realistic risk register
      • Identification and collation of relevant policies and strategies
      • Identification of staff skills and gaps
      • Identification of strengths and weaknesses in operations
      • Precursor to self-audit or external audit
    • 49. Order of Play
      • Part I: Risk and Trust in Digital Repositories
      • Part II: The DRAMBORA Methodology
        • How it was arrived at
        • Where it can take you
      • Part III: Risk Management Exercise
      • Part IV: DRAMBORA Interactive
        • An introductory overview
        • Preview of v2.0
      • Part V: DRAMBORA and DAF within the preservation lifecycle
        • Future systems integration
    • 50. DRAMBORA Interactive www.repositoryaudit.eu
    • 51. DRAMBORA Interactive
      • System was developed as a labour-saving device following feedback on the initial paper-based DRAMBORA audit methodology
      • Essentially a means of guiding users through the audit process, and recording information
      • Reporting functionality built in, with other bells and whistles which make it more flexible and user-friendly than the paper-based process
    • 52. Step-by-Step
      • Create a new repository
        • complete name, institution and as many additional details as you wish
      • Create a corresponding user
        • this will enable you to log into the system; the initial user has coordinator status to oversee the audit
      • Create a staff member association
        • this describes the relationship between the user and the created repository
    • 53. Repository Registration
    • 54. Login
      • You will be sent a confirmation email – follow the link to finalise your registration
      • Now you can click on the ‘Home’ link to begin the audit process
      • The first step is to set up some more details about your repository, and about the audit itself
    • 55. Before the audit can start…
      • The most important initial steps are to:
        • Refine the repository characteristics
        • Make explicit the audit scope and purpose
        • Determine the structure for the audit
        • Define staff and allocate roles accordingly
      • These details can be updated at any time, but it’s worth spending time getting a reasonably full set of responses
    • 56. Repository Administration
      • Numerous fields are available to describe the repository
      • No two repositories are identical; diversity manifests itself in various ways
      • Repository profiling can help identify commonalities between repositories, and facilitate the exchange of experiences and ideas
    • 57. Repository Administration
    • 58. Define the Audit Scope
      • Auditors must make explicit the scope of the audit – no repository exists in a vacuum, and it is vital that a perimeter is introduced to determine that which is internal and external to the assessment
      • Also, the audit must be defined in terms of its chronological relationship with the repository. Does it precede the repository, or does it take a retrospective look at efforts already underway?
    • 59. Define Repository Scope
    • 60. Functional classes
      • Functional classes are a means of categorising audit information to facilitate the process and make reports more meaningful
      • You must select at least one functional class at this stage, and it is recommended that you spend some time here to ensure your choice is comprehensive
      • If you feel that available functional classes are insufficient you may define your own additional ones, although a default set of ten is provided (and recommended)
    • 61. Functional Classes
    • 62. Repository Staff
      • Staff are the real people that occupy the various roles in your repository
      • You can choose to associate individual staff members with DRAMBORA Interactive user accounts, but this is not necessary
      • Staff will need user accounts to log into the DRAMBORA tool themselves
      • As with all repository administration activities, only coordinators can create and edit staff members
    • 63. Add/Edit Repository Staff
    • 64. Repository Roles
      • Within DRAMBORA, roles are characterised by their function (e.g., Ingest, Dissemination, Financial Management, Preservation Planning…)
      • Their relationship to staff members is m-to-n (many-to-many): many staff members can perform a single role, and a given staff member may perform multiple roles.
      • Roles are used to associate activities, risks and risk management responsibilities with specific individuals or sets of individuals
    • 65. Add, Edit & Assign Roles
    • 66. User Administration
      • While logged in, a user can update his/her own details at any time
      • Coordinators can also limit the IP addresses that users may log in from, for security purposes; this supports wild cards
        • *.*.*.* for example permits access from any IP
        • 130.209.*.* permits access from anywhere on the 130.209.x.x network
      • You may wish to restrict access to only your own IP or local network range
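The wildcard behaviour described above can be interpreted as per-octet matching. The actual DRAMBORA Interactive implementation is not shown in these slides; this is a hedged sketch of how such patterns could work, with `ip_matches` as a hypothetical helper name.

```python
def ip_matches(pattern: str, address: str) -> bool:
    """Return True if a dotted-quad address matches a pattern
    where '*' stands for any value in that octet position."""
    p_parts = pattern.split(".")
    a_parts = address.split(".")
    if len(p_parts) != 4 or len(a_parts) != 4:
        return False
    return all(p == "*" or p == a for p, a in zip(p_parts, a_parts))

print(ip_matches("*.*.*.*", "192.168.0.1"))       # True: any IP allowed
print(ip_matches("130.209.*.*", "130.209.4.20"))  # True: on the 130.209.x.x network
print(ip_matches("130.209.*.*", "10.0.0.1"))      # False: outside the network
```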
    • 67. User Administration
    • 68. Beginning the Audit
      • Once the preparatory stages are complete, we visit the Assessment Centre to begin the audit
      • This corresponds closely with the DRAMBORA methodology; the first step is to define the repository’s mandate
      • DRAMBORA is an iterative process, and each stage can be returned to at any time
    • 69. Define Repository Mandate
      • The repository’s mandate is the first detail that we record
      • This describes the repository’s raison d’être
      • A repository may have multiple mandates, each associated with different contextual organisational levels
    • 70. Define Mandate
    • 71. Define Constraints
      • We then move on to record any constraints that the repository is subject to or influenced by
      • This should include any relevant factor that influences or informs the repository’s objectives or activities (e.g. policy, laws, technical constraints, or even less tangible cultural considerations such as lack of financial confidence)
      • External files can be linked to offer further information
    • 72. Define Constraints
    • 73. Define Objectives
      • At this stage we define each of the repository’s objectives
      • These can be associated with the constraints defined in the previous stage
      • Again, these are structured according to the repository’s functional classes
      • See the DPE Platter report for more information about SMART objectives
    • 74. Define Objectives
    • 75. Define Activities, Assets, Owners
      • This stage requires you to describe the specific activities undertaken within your organisation to complete individual objectives
      • An asset is anything that is required to facilitate the achievement of particular objectives, tangible or otherwise
      • You can also add details of required or related assets for each activity, and an owner (or role) that has responsibility for each activity
    • 76. Define Activities etc.
    • 77. Identify Risks
      • We now continue to identify risks
      • Users can choose to:
        • a) Recycle existing risks (a number of ‘off-the-shelf’ risks are presented to choose and modify)
        • b) Create a new risk from scratch
      • Search functionality is planned for the next software release
    • 78. Identify Risks
      • For each risk you must define a name and description, as well as details of its owner and the corresponding functional class
      • You can also describe
        • the nature of the risk, in simple terms
        • ways in which the risk might manifest itself
        • associated vulnerabilities worth noting
        • relationships with other risks
    • 79. Identify Risks
    • 80. Assess Risks
      • Once you have identified risks, the next step is to undertake risk assessment in order to determine their severity
      • Risk assessment can be done on a whole selection of risks at a time, either by functional class, or by a custom user-defined grouping
    • 81. Assess Risks
      • Three items of information are recorded in the process of assessing each risk
        • impact : the potential impact that the risk would have if it should occur
        • impact expression : the way in which negative effects of the risk’s occurrence manifest themselves
        • probability : the likelihood of the risk occurring
    • 82. Risk Assessment
    • 83. Manage Risks
      • The final stage of the audit is to define appropriate management measures and targets for each risk
      • You can record details of treatment or avoidance measures, as well as anticipated outcomes, and a future date at which point the risk might be reassessed
    • 84. Risk Relationships
      • Atomic: where risks exist in isolation, with no relationships with other risks
      • Domino: where avoidance or treatment associated with a single risk renders the avoidance or treatment of another less effective
      • Complementary: where avoidance or treatment mechanisms associated with one risk also benefit the management of another
      • Contagious: where a single risk’s execution will increase the likelihood of another’s
      • Explosive: where the simultaneous execution of n risks has an impact in excess of the sum of each risk occurring in isolation
    • 85. Manage Risks
    • 86. Reporting Audit Results
      • Users can export their risk register to HTML or to PDF, and a report customising tool is also available
      • DRAMBORA v2.0 will have more sophisticated reporting capabilities
      • We’re interested in hearing about the reporting mechanisms that would be of particular interest to users…
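As a rough illustration of the HTML export described above (this is not DRAMBORA's actual export code), a risk register can be rendered as a simple table, with severity derived from probability × impact:

```python
from html import escape

def register_to_html(risks):
    """Render a risk register (list of dicts with 'name', 'probability',
    'impact' keys) as an HTML table with a derived severity column."""
    rows = []
    for r in risks:
        severity = r["probability"] * r["impact"]
        rows.append(
            "<tr><td>{}</td><td>{}</td><td>{}</td><td>{}</td></tr>".format(
                escape(r["name"]), r["probability"], r["impact"], severity))
    return ("<table><tr><th>Risk</th><th>Probability</th>"
            "<th>Impact</th><th>Severity</th></tr>"
            + "".join(rows) + "</table>")

html = register_to_html([{"name": "Media failure",
                          "probability": 3, "impact": 5}])
```

Escaping risk names before embedding them keeps user-entered text from breaking the generated markup.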
    • 87. Audit Reporting
    • 88. Audit Snapshots
      • This feature allows users to record the state of their repository at any given time
      • Facilitates comparison at a later date: can be used to track improvements (or deterioration!) over time
      • A read-only view of the saved responses facilitates analysis of inter-relationships between repository information: a useful reporting tool in itself
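The comparison the snapshot feature enables can be sketched as a diff between two saved registers. The snapshot structure here (a mapping of risk identifiers to severity scores) is a hypothetical simplification for illustration, not DRAMBORA's stored format.

```python
def compare_snapshots(earlier, later):
    """Return {risk_id: (old_severity, new_severity)} for every risk
    whose severity changed, appeared, or disappeared between snapshots."""
    changes = {}
    for risk_id in sorted(set(earlier) | set(later)):
        old = earlier.get(risk_id)   # None if the risk is new
        new = later.get(risk_id)     # None if the risk was retired
        if old != new:
            changes[risk_id] = (old, new)
    return changes

# Hypothetical snapshots taken six months apart
jan = {"R01": 15, "R02": 6}
jun = {"R01": 10, "R02": 6, "R03": 4}
print(compare_snapshots(jan, jun))  # {'R01': (15, 10), 'R03': (None, 4)}
```

Here R01's severity fell after mitigation, R02 is unchanged and so omitted, and R03 is newly identified: exactly the improvement-or-deterioration tracking the slide describes.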
    • 89. Snapshot View
    • 90. Ongoing and future developments
      • Supporting JISC’s Research Data programme
      • DRAMBORA v2.0
        • Software currently being redesigned and recoded from scratch, linked to Integrated Data Management Planning (IDMP) work
        • Improved and more user-friendly graphical interface
        • More sophisticated reporting functionality
        • Better combinability to enable integration with DCC and third-party tools, such as DAF
      • Repository profiling (perhaps later…)
    • 91. Order of Play
      • Part I: Risk and Trust in Digital Repositories
      • Part II: The DRAMBORA Methodology
        • How it was arrived at
        • Where it can take you
      • Part III: Risk Management Exercise
      • Part IV: DRAMBORA Interactive
        • An introductory overview
        • Preview of v2.0
      • Part V: DRAMBORA and DAF within the preservation lifecycle
        • Future systems integration
    • 92. Digital Curation Lifecycle Model
      • The curation lifecycle model provides a common means of describing the range of curation actions and roles.
      • The use of the model will help to contextualise project outputs and identify practical workflows for new and existing tools and resources.
    • 93. What is DAF?
      • A set of methods to:
      • find out what data assets are being created and held;
      • explore how they’re stored, managed, shared and reused;
      • identify any risks e.g. misuse, data loss or irretrievability;
      • learn about researchers’ attitudes towards data;
      • suggest ways to improve ongoing data management.
    • 94. Overlaps and Differences
      • Both are self-management tools to assess the effectiveness of an approach to data management or preservation
      • DRAMBORA: repository focus; process emphasis; lifecycle: preservation phase
      • DAF: researcher focus; data emphasis; lifecycle: creation phase
    • 95. What is collected in DAF?
      • Register of data assets
      • Roles and responsibilities
        • e.g. who manages data, research or IT support available
      • Data management strategies / context
        • e.g. funder requirements, resources / support, standards used, awareness of best practice in curation…
      • Risks / recommendations
    • 96. Mappings to DRAMBORA
      • RISKS AND RECOMMENDATIONS
      • ROLES AND RESPONSIBILITIES
      • STRATEGIES, REQUIREMENTS, STANDARDS, BEST PRACTICE
      • DATA ASSET REGISTER
    • 97. Integrated Data Management Planning tool
      • AIDA: http://aida.jiscinvolve.org
      • LIFE: http://www.life.ac.uk/
      • DAF: http://www.data-audit.eu/
      • Coming soon…
    • 98. Contacts
      • To learn more about DRAMBORA, to request support, or to join the DRAMBORA user community, visit www.repositoryaudit.eu
      • For further information on DAF, see http://www.data-audit.eu/
      • THANK YOU
