This presentation provides an extensive introduction to the Digital Repository Audit Method Based On Risk Assessment
(DRAMBORA). After considering how the ideas of risk and trust might affect digital repositories, the DRAMBORA methodology is applied through two practical exercises. DRAMBORA can also be applied using an interactive Web-based version of the tool, and the presentation provides an illustrated guided tour of this version. The presentation was given as part of the final module of a 5-module course on digital preservation tools for repository managers, presented by the JISC KeepIt project. It concludes by comparing DRAMBORA with DAF, the Data Asset Framework, also produced by the DCC; this brings the KeepIt course effectively full circle, as the course began in module 1 with DAF. For more on this and other presentations in the course, look for the tag 'KeepIt course' in the project blog: http://blogs.ecs.soton.ac.uk/keepit/
KeepIt Course 5: DRAMBORA: Risk and Trust and Data Management, by Martin Donnelly
1. DRAMBORA: Risk and Trust and Data Management Martin Donnelly DCC, University of Edinburgh [email_address] (and Andrew McHugh, Sarah Jones, Joy Davidson, Seamus Ross, Raivo Ruusalepp, Perla Innocenti…)
36. DRAMBORA Workflow
- Preliminary collection and analysis of repository documentation
- Organise appointments and onsite visits with repository staff (managers, curators, IT, legal experts…)
- Risk registry finalisation
- Audit report finalisation
- Impact on individuals and organisations
42. Anatomy of a risk
Risk Identifier: A text string provided by the repository to uniquely identify this risk and facilitate references to it within risk relationship expressions
Risk Name: A short text string describing the risk
Risk Description: A longer text string offering a fuller description of this risk
Example Risk Manifestation(s): Example circumstances within which risk will or may execute
Date of Risk Identification: Date that risk was first identified
Nature of Risk: Physical environment; Personnel, management and administration procedures; Operations and service delivery; Hardware, software or communications equipment and facilities
Owner: Name of risk owner - usually the same as owner of corresponding activity
Escalation Owner: The name of the individual who assumes ultimate responsibility for the risk in the event of the stated risk owner relinquishing control
43. Anatomy of a risk
Stakeholders: Parties with an investment or assets threatened by the risk's execution, or with responsibility for its management
Risk Relationships: A description of each of the risks with which this risk has relationships
Risk Probability: This indicates the perceived likelihood of the execution of this particular risk
Risk Potential Impact: This indicates the perceived impact of the execution of this risk in terms of loss of digital objects' understandability and authenticity
Risk Severity: A derived value, representing the product of probability and potential impact scores
Risk Management Strategy(ies): Description of policies and procedures to be pursued in order to manage (avoid and/or treat) risk
Risk Management Activity(ies): Practical activities deriving from defined policies and procedures
Risk Management Activity Owner: Individual(s) responsible for performance of risk management activities
Risk Management Activity Target: A targeted risk-severity rating plus risk reassessment date
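The risk anatomy above, including the derived severity value (the product of the probability and potential impact scores), can be sketched as a simple data structure. This is a minimal illustrative sketch, not part of the DRAMBORA toolkit; the field names are taken from the slide, but the 1-5 scoring scale and the example values are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Risk:
    """Minimal sketch of a DRAMBORA-style risk record (illustrative only)."""
    identifier: str          # unique ID, used in risk relationship expressions
    name: str                # short text string describing the risk
    probability: int         # perceived likelihood score (assumed 1-5 scale)
    potential_impact: int    # perceived impact score (assumed 1-5 scale)
    owner: str = ""          # usually the owner of the corresponding activity
    relationships: list = field(default_factory=list)  # IDs of related risks

    @property
    def severity(self) -> int:
        # Severity is a derived value: the product of the probability
        # and potential impact scores, as described on the slide.
        return self.probability * self.potential_impact

r = Risk("R01", "Media obsolescence", probability=3, potential_impact=5)
print(r.severity)  # 15
```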
44. Risk Relationships
Atomic: where risks exist in isolation, with no relationships with other risks
Domino: where avoidance or treatment associated with a single risk renders the avoidance or treatment of another less effective
Complementary: where avoidance or treatment mechanisms associated with one risk also benefit the management of another
Contagious: where a single risk's execution will increase the likelihood of another's
Explosive: where the simultaneous execution of n risks has an impact in excess of the sum of each risk occurring in isolation
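The five relationship types can be modelled as a small enumeration over pairs of risk identifiers. The enum values and the helper below are an illustrative sketch (with hypothetical risk IDs), not part of the DRAMBORA toolkit.

```python
from enum import Enum

class RiskRelationship(Enum):
    ATOMIC = "risk exists in isolation"
    DOMINO = "treating one risk makes treating another less effective"
    COMPLEMENTARY = "treating one risk also benefits management of another"
    CONTAGIOUS = "one risk's execution raises the likelihood of another's"
    EXPLOSIVE = "simultaneous execution exceeds the sum of isolated impacts"

# A register of relationships between risk identifiers (hypothetical IDs).
register = [
    ("R01", "R02", RiskRelationship.CONTAGIOUS),
    ("R03", "R04", RiskRelationship.DOMINO),
]

def related(risk_id, register):
    """Return (other_id, relationship) pairs involving risk_id."""
    out = []
    for a, b, rel in register:
        if risk_id == a:
            out.append((b, rel))
        elif risk_id == b:
            out.append((a, rel))
    return out

for other, rel in related("R01", register):
    print(other, rel.name)  # R02 is contagiously related to R01
```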
94. Overlaps and Differences
Both are self-management tools to assess the effectiveness of an approach to data management or preservation.
DRAMBORA: Repository focus; Process emphasis; Lifecycle: Preservation phase
DAF: Researcher focus; Data emphasis; Lifecycle: Creation phase
Hard to define: but we must define it if we want to automate its handling. Multi-faceted: no one simple score covers all data quality; we need many such scores, which is expensive. Highly application-specific: again we need more than one score, and the details of the scores are hard to reuse. Highly subjective: which makes it difficult to completely automate.
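The point about needing many scores rather than one can be made concrete with a small sketch. The facet names, values, and weights below are entirely hypothetical; the aggregation simply illustrates how collapsing facets into one number both hides detail and bakes in application-specific weighting.

```python
# Hypothetical per-facet quality scores for a single record (0.0-1.0).
facets = {"completeness": 0.9, "accuracy": 0.7, "timeliness": 0.4}

# Any single aggregate hides the per-facet detail, and the weighting
# itself is application-specific, which is why such scores are hard
# to reuse across contexts. Weights here are illustrative assumptions.
weights = {"completeness": 0.5, "accuracy": 0.3, "timeliness": 0.2}
aggregate = sum(facets[f] * weights[f] for f in facets)
print(round(aggregate, 2))  # 0.74
```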
Physical: theft, vandalism, arson, building-related risks, storm, flood, other weather-related damage, damage to vehicles, mobile plant and equipment.
In the last 50 years, computer science has witnessed numerous cycles of software development migration, and the literature contains many studies, case reports, and models. Several publications were very useful in developing our understanding of risk assessment of digital information. Rapid Development (McConnell 1996) is a monograph on the general problems associated with software development. In many respects, software development exhibits several of the same problems associated with basic digital preservation.

While researching risk assessments, we were struck by the vast differences in basic definitions used by different disciplines. (For example, see Reinert, Bartell, and Biddinger [1994], Warren-Hicks and Moore [1995], McNamee [1996], Wilson and Crouch [1987], Starr [1969], and Lagadec [1982]). Numerous professions measure risk, and each assigns risks a unique vocabulary and context. The degree and type of risk associated with any data archive may be understood differently by administrators, operational staff members, and data users, depending upon their individual training and experience. The measurement of risk was equally problematic. One paper correlated risk level with the nonlinear relative probability of risk occurring (Kansala 1997). Another publication introduced an algebraic formula (McConnell 1996). In a third instance, a research group felt that cases where one could accurately assess the probability of a future event were rare because the information technology environment for software changes so rapidly. They preferred simple estimates, such as high, medium, and low, which they believed facilitated decision making (Williams, Walker, and Dorofee 1997). Risk-measurement scales, like risk definitions, are as distinctive as their developers.
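The preference noted above for coarse high/medium/low estimates over precise probabilities can be illustrated by banding a numeric severity score into categories. The thresholds and the 1-25 score range below are arbitrary assumptions for illustration, not drawn from any of the cited studies.

```python
def band(severity, thresholds=(8, 16)):
    """Map a numeric severity score (e.g. probability x impact on
    1-5 scales, giving 1-25) onto coarse categories. The threshold
    values are illustrative assumptions."""
    low_max, medium_max = thresholds
    if severity <= low_max:
        return "low"
    if severity <= medium_max:
        return "medium"
    return "high"

print([band(s) for s in (4, 12, 20)])  # ['low', 'medium', 'high']
```

The coarse bands are easier to agree on in a group assessment than exact probabilities, which is precisely the argument attributed to Williams, Walker, and Dorofee above.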
Funded by the German Ministry of Education and Research
If auditors go to where the activity takes place, they will observe more than a verbal explanation could convey, and will gain more insight into which follow-up questions to ask. Also ask for a demonstration of the procedure: where and when a procedure takes place is often not described, and some procedures take place in multiple or undefined locations.
Excerpt: The process was extremely insightful and highlighted possible areas where the DRAMBORA methodology could be improved, as well as a range of generic objectives, functions and concerns common to digital libraries. We concluded that the take-up and use of DRAMBORA would benefit from the introduction of an interactive tool and sharable registry of responses so that organisations undertaking self-assessment could profit from the experiences and responses of the organisations that they consider to be their peers. Further clarification was achieved about the specific classes of post-holders that should participate in the assessment process, as well as the optimal number of participants and most appropriate means of establishing conversational focus. We gained a better understanding of the practical ways in which organisations assess their risks; as a result we concluded that the original DRAMBORA risk impact and probability scores could be made less granular, and that more opportunities should be available for respondents to consider the severity of their risks in more relative terms, rather than in comparison with objective impact and probability metrics. These four assessments have also made it possible to develop a generic risk profile for digital libraries. Finally, from the perspective of each of the audited institutions, the process was overwhelmingly successful; testimonials from representatives of each described in detail the benefits of formally scrutinising the organisational characteristics and implicit challenges faced within their own digital library.
Risk assessment and the DRAMBORA methodology 15 July 2009 Donnelly, McHugh, Ross, Innocenti, Ruusalepp and Hofman AREAS OF EXPRESSION - Reputation and intangibles - Organisational viability - Service delivery - Technology
The trick is to work backwards from the mandate and goals: what could prevent these from being achieved?
There is no single entry point into the lifecycle. Tools cover different, but at times overlapping, phases of it.
- The underlying methodologies are the same: both are self-management tools to assess the extent to which you're meeting goals, be that running effective repositories or good data management
- The chief difference lies in the context in which they're applied. DRAMBORA focuses on repositories and how well they're meeting their mission, OR helps with planning the development of trusted repositories (PLATTER useful here too). DAF assesses the management of data in the earlier stages of the lifecycle, looking more at the work of researchers
- Information may not always map directly between the two tools, as each assessment will have different parameters and be a self-contained process; however, they will inform each other:
  e.g. the risks / gaps noted in a DAF assessment will point to what's needed from a repository or more formal curation environment
  e.g. the standards and technologies used for data creation could point to preservation issues or DRAMBORA risks
  e.g. tracking assets from the point of creation using DAF to deposit and ongoing risk assessment activity may have a positive impact on the assets' authenticity for the longer term
- Primary output from DAF is the register of data assets. This could feed into a DRAMBORA assessment, but would also be a useful tool for repositories, e.g. to track data / prompt ingest
- Information on roles could map directly to DRAMBORA as some support / skills may overlap
- Underlying context is a key area covered in DAF that can inform DRAMBORA. Funder requirements / legislative context will be similar. Standards / best practice used at creation will affect longer-term curation
- Data management risks will likely echo repository risks as both will have similar aims, e.g. ensuring data integrity / authenticity, continued access, meaningful / reusable resources
- Risks faced in one context could cast light on issues that may be faced in another, OR may help to demonstrate the value of the other, e.g. data better curated in a repository context
Here is a visualisation of the DRAMBORA process and the points at which DAF information may feed through. Note: 'assets' has a broader definition in DRAMBORA - it includes software, physical assets, services, processes, people and intangibles. Considering the DAF asset register as a collection may be beneficial for providing a more well-defined context for DRAMBORA audits. In this respect, collections of assets could be assessed to reflect specific research projects, specific types of assets (e.g. images), a department or institution, or perhaps even a contributor perspective (i.e. a collection of assets reflecting the various work of a specific researcher).
- JISC-funded project starting in November to integrate several related data management planning tools
- Aim is to help institutions plan for and benchmark / assess their data management strategy. AIDA is about institutional preparedness for preservation: organisation, technology, resources. LIFE is about preservation costs. HATII will be developing the tool
- Will work in collaboration with / provide support to 07/09 data infrastructure projects
Let us know what overlaps you see between the DRAMBORA and DAF methodologies / toolkits, or areas where you think they could be brought closer together