
Risk management and auditing

  1. Risk management and auditing — Dorothea Salo
  2. Threat model
     • “Preservation,” unmodified, means nothing. This is why it becomes such a bogeyman!
     • Two things you need to know first:
        • why you’re preserving what you’re preserving, and
        • what you’re preserving it against.
     • Libraries: your collection-development policy should inform the first question.
        • Your coll-dev policy doesn’t include local born-digital or digitized materials? This is a problem. Fix it.
     • The second question is your “threat model.”
  3. What is your threat model for print?
  4. Homelessness
  5. Water
  6. Flora and fauna
  7. Physical damage
  8. Loss or destruction
  9. Why did I just make you do that?
     • I’m weird.
     • I’m trying to destroy the myth that any given medium “preserves itself.”
     • Media do not preserve themselves. People preserve media, or media get bizarrely lucky.
     • We need not panic over digital preservation any more than we panic about print.
     • Approach digital preservation the same way you approach print preservation.
  10. Now... list important threats to digital data.
  11. Physical medium failure
  12. “Bitrot”
  13. File format obsolescence
  14. Forgetting what you have
  15. Forgetting what the stuff you have means
  16. Rights and DRM
  17. Lack (or disappearance) of organizational commitment
  18. One word: Geocities.
  19. Ignorance
     • “It’s in Google, so it’s preserved.” (Not even “Google Books”!)
     • “I make backups, so I’m fine.”
     • “I have a graduate student who takes care of these things.”
     • “Metadata? What’s that? I have to have it?”
     • “Digital preservation is an unsolvable problem, so why even try?” (I’ve heard this one from librarians. I bet you have too.)
  20. Apathy
  21. Mitigating the risks: planning and auditing tools
  22. Audit frameworks
     • Trusted Repository Audit Checklist (TRAC)
        • If you see “NARA/RLG” somewhere? That’s the framework that evolved into TRAC. Long story.
        • You can get an actual formal TRAC audit from CRL! Who has? Portico, Hathi, “Chronicle of Life,” two or three others. This audit is HARSH. (So don’t write off a repo because it hasn’t had a TRAC audit.)
        • If you hear the phrase “trusted digital repository,” it should mean that the repo has had (or is pursuing) a TRAC audit.
     • DRAMBORA
        • More flexible, less finger-shaking than TRAC.
        • Less of this “designated community” nonsense.
        • Less dependent on the OAIS model (which I consider a strength).
        • Encourages archives to consider and document their individual situations and think hard about risk mitigation.
  23. Newer: the SPOT model
     • Even less clunky than DRAMBORA.
     • I quite like this one.
     • “Identifying Threats to Successful Digital Preservation: the SPOT Model for Risk Assessment”
        • http://www.dlib.org/dlib/september12/vermaaten/09vermaaten.html
  24. So what do they audit?
     • Mission (and adherence to it)
     • Plans and policies, including contingency plans
     • Staff infrastructure
     • Operations documentation, including tech infrastructure and service infrastructure
     • Sustainable funding
     • “Doing the right things with the stuff”: identifiers, ingest, file format management, migration, etc.
     • NOTICE WHAT’S FIRST ON THE LIST. Remember, the tech part is the easy part!
  25. TRAC, DRAMBORA, and DH
     • TRAC, DRAMBORA, and SPOT are designed to audit repositories, not individual datasets, data files, or research projects.
        • They assume a lot of infrastructure and (in TRAC’s case) a long-term time horizon that you probably don’t have.
     • So if you’re trying to think through a project, where do you go?
        • TRAC and DRAMBORA are probably overkill!
        • (Though parts of DRAMBORA won’t hurt you.)
  26. Data Curation Profiles
     • Research project out of Purdue’s Distributed Data Curation Center (“D2C2”)
     • “Toolkit”: interview instrument, user guide for the interview instrument, worksheet.
     • Small library of completed profiles
     • Ignore the user guide. Grab the worksheet, and use the interview instrument for reference.
     • http://datacurationprofiles.org
        • You have to make a login to download the toolkit pieces.
  27. Mitigating specific risks
  28. Physical medium failure
     • Gold CDs are not the panacea we thought.
        • They’re not bad; they’re just hard to audit, so when they fail, they fail silently. Silent failure is DEADLY.
     • Current state of the art: get it on spinning disk.
     • Back up often. Distribute your backups geographically. Test them now and then.
        • Consider a LOCKSS cooperative agreement. Others have.
     • Bitrot-detection techniques may help here too.
     • Any physical medium WILL FAIL. Have a plan for when it does.
  29. “Digital forensics”
     • The art and science of investigating digital file formats and media:
        • Reading obsolete ones.
        • Reverse-engineering and/or documenting existing ones so they don’t go obsolete.
        • Ensuring secure deletion, when necessary.
        • Reconstructing what used to be on a physical storage medium. (Surprising how often this is possible!)
        • Audit trails for legal and records-management purposes.
     • AMAZING report (highly, highly recommended!): “Digital Forensics and Born-Digital Content in Cultural Heritage Institutions.” http://www.clir.org/pubs/abstract/pub149abst.html. Both computer-nerdy and humanities-nerdy in the best possible way.
  30. Avoiding “bitrot”
     • The term is sometimes used for “file format obsolescence.” I use it for “the bits flipped unexpectedly.”
     • Checking a file bit-by-bit against a backup copy is computationally impractical to do every day.
        • Though on ingest it’s a good idea to verify bit-by-bit!
     • Checksums
        • A file is, fundamentally, a great big number.
        • Do math on that number. Store the result as metadata.
        • To check for bitrot, redo the math and check the answer against the stored result. If they’re different, scream.
        • There are several checksum algorithms; for our purposes, which one you use doesn’t matter much.
        • “Hash collision”: it’s possible, but unlikely, for different files to have the same checksum. Potential hack vector!
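The checksum workflow above (compute on ingest, store the digest as metadata, recompute on audit and compare) can be sketched in a few lines of Python. The function names are illustrative, not from any particular repository system; SHA-256 is chosen here only because MD5 and SHA-1 have known collision attacks (the “potential hack vector” just mentioned).

```python
import hashlib

def file_checksum(path, algorithm="sha256", chunk_size=1 << 16):
    """Compute a checksum by streaming the file in chunks,
    so large files never need to fit in memory."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def fixity_ok(path, stored_digest, algorithm="sha256"):
    """Redo the math and compare against the stored result.
    A mismatch means the bits changed: scream."""
    return file_checksum(path, algorithm) == stored_digest
```

On ingest you would store `file_checksum(path)` alongside the file’s other metadata; a periodic audit job then calls `fixity_ok` and raises an alert on any mismatch, which is exactly the “redo the math and check” loop described on the slide.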
  31. Migration vs. emulation: dealing with obsolescence
     • Migration
        • Change the file to be usable in new software/hardware configurations.
        • Risks: information loss (FONTS!), imperfect transfer, choosing the wrong migration path.
        • Smart systems don’t throw away the old files!
     • Emulation
        • Keep the file; train new software/hardware to behave like the old.
        • Risks: imperfect emulation, impractical emulation.
        • Makes more sense for software (games!), less for files.
     • Pragmatically: redigitization.
  32. Finding tools
     • Migration
        • Current versions of the original software may be able to open old files.
        • Open-source software in the same genre may be able to translate proprietary file formats (often imperfectly). Open-source projects tend to maintain translators longer than you’d think.
        • Look on the web!
        • MIGRATE FAST. Once a file is damaged or obsolete, it’s probably too late.
     • Emulation
        • Look for the gamers! It’s WILD what they’ll emulate!
        • Look to the open-source community for operating-system and hardware-driver emulators.
        • Frankly, there’s a lot of hype and vaporware here.
  33. When is a PDF not a PDF?
     • When it’s a .doc with the wrong file extension.
     • When there’s no file extension on it at all.
     • When it’s so old it doesn’t follow the standardized PDF conventions.
     • When it’s otherwise malformed, made by a bad piece of software.
     • How do you know whether you have a good PDF? (Or .doc, or .jpg, or .xml, or anything else?)
  34. File format registries and testing tools
     • JHOVE: JSTOR/Harvard Object Validation Environment
        • Java software intended to be pluggable into other software environments.
        • Answers “What format is this thing?” and “Is this thing a good example of the format?”
        • Limited repertoire of formats.
     • PRONOM/DROID + GDFR = Unified Digital Format Registry
     • Wrapper tool: FITS, File Information Tool Set
        • JHOVE + DROID + various other testers. State of the art.
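Tools like DROID answer “what format is this thing?” partly by matching signature (“magic”) bytes rather than trusting the file extension. Here is a toy sketch of that idea in Python; the signature table is a tiny hand-picked sample I supplied for illustration, while real registries like PRONOM hold thousands of signatures, and real validators like JHOVE also check internal structure.

```python
# Toy format sniffer: match a file's leading bytes against known
# signatures instead of trusting its extension. Illustrative only;
# JHOVE/DROID/FITS go far deeper (structure validation, versions).
SIGNATURES = {
    b"%PDF-": "PDF",
    b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1": "OLE2 (legacy .doc/.xls/.ppt)",
    b"PK\x03\x04": "ZIP container (.docx, .epub, ...)",
    b"\xff\xd8\xff": "JPEG",
    b"\x89PNG\r\n\x1a\n": "PNG",
}

def sniff(path):
    """Return a format guess from the file's first bytes."""
    with open(path, "rb") as f:
        head = f.read(16)
    for magic, name in SIGNATURES.items():
        if head.startswith(magic):
            return name
    return "unknown"
```

A file named “report.pdf” that `sniff` reports as “OLE2” is really a mislabeled .doc: the first failure case on the previous slide, caught without opening the file in anything.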
  35. Thanks!
     • Copyright 2011 by Dorothea Salo.
     • This lecture and slide deck are licensed under a Creative Commons Attribution 3.0 United States License.
