Revisiting File Formats for Digitization
Steven T. Puglia
Digital Conversion Services Manager, Office of Strategic Initiatives
Library of Congress, 101 Independence Ave SE, Washington, DC 20540, USA
Phone: 202-707-5726, Email: firstname.lastname@example.org
Federal Agencies Digitization Guidelines Initiative
In general, within the digital library community, format and compression recommendations for master and derivative image files remain based on older perspectives regarding digitization, digital preservation, and IT/network/web technologies.
According to the reasoning that dominated until recently, compressed master files take up less storage, are easier to deliver to users with slow network connections, and are more convenient to handle internally.
In recent years, a number of institutions have come to question this tenet as storage costs have plummeted and network speeds have dramatically increased.
Compression creates a number of problems.
Another very important issue is that both lossy and lossless compression add yet another layer of complexity to the encoding of a file, making it even more difficult for future digital archaeologists trying to decipher its contents.
File Compression Strategies, ALA Mid-Winter Panel, 1999 (RLG DigiNews)
Sharpe advocated the use of visually lossless (but lossy) compression with certain types of originals. Because "practical people make practical decisions," time, conversion costs, staffing, and similar factors all enter into the decision-making process for digital reformatting.
He called attention to a handout that framed his position as a debate proposition:
1. Resolved, that visually lossless (yet lossy) compression of tonal images of illustrated book pages can be used to create high-quality digital masters if…conditions are met.
2. Further resolved, that such images are of sufficient quality to serve as preservation images for books which are: found in the stacks, not in the rare book room; likely to remain available somewhere in physical form.
3. Further resolved, that such images are of comparable or superior quality to accepted preservation approaches such as microfilm.
4. Further resolved, that cost matters in digital library image conversion projects, even though it is other people's money.
With a final nod to the improvements proposed for JPEG 2000, Sharpe argued that, at a minimum, the library and archival community should not close the door on the use of visually lossless compression.
So much digital data now moves around the globe that those who endeavor to measure it employ a new - or new to non-nerds - term.
Meet the exabyte.
Rise of Information in the Digital Age http://www.washingtonpost.com/wp-dyn/content/graphic/2011/02/11/GR2011021100614.html?sid=ST2011021100514
Really big data: The challenges of managing mountains of information, by John Brandon, October 18, 2011 http://www.computerworld.com/s/article/9220504/Really_big_data_The_challenges_of_managing_mountains_of_information
The Library of Congress processes 2.5 petabytes of data each year, which amounts to 40TB per week. Thomas Youkel, group chief of enterprise systems engineering at the Library, estimates the data load will quadruple in the next few years as the Library continues to carry out its dual mandates to serve up data for historians and preserve information in all its forms.
My case is that, as we have seen from the last few years' focus on "sustainability of digital preservation," the major problems in digital preservation are economic.
(Blue Ribbon Task Force on Sustainable Digital Preservation and Access - http://brtf.sdsc.edu/)
Andy Jackson, The British Library http://www.openplanetsfoundation.org/blogs/2011-01-12-format-obsolescence-and-sustainable-access This means that the long-term cost of preserving our collection scales not only with the size of the files, but also rises as the number of formats we are required to support is increased.
We cannot avoid dealing with compressed file formats, including lossy-compressed ones.
Currently, up to 375 billion digital photos are taken each year, and the number continues to increase – "How many photos have ever been taken?" by Jonathan Good, Sept. 15, 2011
This is orders of magnitude more raster image files than are being produced by digitization efforts.
We can expect that virtually all of these 375 billion digital photos per year are JPEG files.
The answer for digital preservation is not going to be to insist that all image files be saved in uncompressed formats.
David Rosenthal, Stanford University http://blog.dshr.org/2011/03/how-few-copies.html Compression reduces the redundancy within a single copy and increases the risk of damage. There are also techniques that increase the redundancy within a single copy and reduce the risk.
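Rosenthal's point about damage amplification can be illustrated with a small sketch, using Python's standard-library zlib as a stand-in for any lossless codec (the byte values and offsets here are arbitrary): a one-byte flip in an uncompressed copy damages exactly one byte, while the same flip in a compressed copy can render the entire stream undecodable.

```python
import zlib

data = b"pixel row " * 1000            # stand-in for uncompressed image data
comp = zlib.compress(data)

# Flip one byte in the uncompressed copy: the damage stays local.
raw = bytearray(data)
raw[500] ^= 0xFF
bytes_wrong = sum(a != b for a, b in zip(data, bytes(raw)))
print(bytes_wrong)                     # 1 byte differs

# Flip one byte in the compressed copy: decoding typically fails outright,
# or yields output that no longer matches the original.
bad = bytearray(comp)
bad[len(bad) // 2] ^= 0xFF
try:
    recovered = zlib.decompress(bytes(bad))
    damaged = recovered != data
except zlib.error:
    damaged = True
print(damaged)                         # True: the whole stream is affected
```

The asymmetry is the point: compression trades away the internal redundancy that would otherwise have confined the damage.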
If you ask the people who run large data centers what are the most important causes of data loss, you get a list like this:
Erik Hetzner, California Digital Library http://groups.google.com/group/digital-curation/msg/b487a1b0188f9c0c I see no reason to store, as a matter of policy, uncompressed files on our disks. In fact, I think we should be more aggressive about compressing files. (Hetzner focuses on lossless compression.)
Erik Hetzner, California Digital Library http://groups.google.com/group/digital-curation/msg/b487a1b0188f9c0c Even without error correcting codes, I don’t think the arguments for storing uncompressed data only as a matter of policy are strong at all. When we take error correcting codes into account, not compressing your data as a policy in order to keep a higher level of redundancy seems like the worst way to increase the redundancy of the data. Smart people have figured out how to make codes which can reliably correct limited errors in bytestreams. Why not use them?
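Hetzner's argument (compress, then add explicit redundancy with an error-correcting code) can be sketched in miniature with simple XOR parity, the same idea RAID-5 uses. Real repositories would use stronger codes such as Reed–Solomon; the four-shard layout here is an arbitrary illustrative choice.

```python
import zlib

payload = zlib.compress(b"master image bytes " * 500)

# Split the compressed payload into 4 equal data shards (pad first).
n = 4
payload += b"\x00" * ((-len(payload)) % n)
size = len(payload) // n
shards = [payload[i * size:(i + 1) * size] for i in range(n)]

# One parity shard: the bytewise XOR of all data shards.
parity = bytes(a ^ b ^ c ^ d for a, b, c, d in zip(*shards))

# Simulate losing any one shard, then rebuild it from the survivors:
# XOR of the remaining data shards plus parity recovers the lost shard.
lost = 2
survivors = [s for i, s in enumerate(shards) if i != lost] + [parity]
rebuilt = bytes(a ^ b ^ c ^ d for a, b, c, d in zip(*survivors))
assert rebuilt == shards[lost]        # the lost shard is fully recovered
```

The storage overhead here is one parity shard in five (20%), far less than the overhead of forgoing compression entirely, which is exactly Hetzner's point.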
Data corruption is and will remain a problem, and an active part of digital preservation will be overcoming it. The LOCKSS concept includes one approach for dealing with the problem – "…the bits and bytes are continually audited and repaired…to protect fragile digital content for the very long time." http://www.eecs.harvard.edu/~mema/publications/SOSP2003.pdf LOCKSS now has a 12-year track record.
Thus we can say that some digital content is going to get lost or damaged. This shouldn't be news; the same is true of analog content.
We have rules of thumb to guide us in trying to reduce the amount of loss and damage:
The more copies the safer
The less correlated the copies the safer
The more reliable each copy the safer
The faster failures are detected and repaired the safer
The less aggressive the compression the smaller the effect of damage
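The "faster failures are detected and repaired" rule can be sketched in the spirit of LOCKSS-style auditing (the function name and three-copy setup are illustrative, not taken from any particular system): hash each replica, take the majority digest as the consensus, and flag the outliers for repair.

```python
import hashlib
from collections import Counter

def audit_and_repair(copies):
    """Return (consensus bytes, indices of copies needing repair)."""
    digests = [hashlib.sha256(c).hexdigest() for c in copies]
    majority, _ = Counter(digests).most_common(1)[0]
    good = copies[digests.index(majority)]
    bad = [i for i, d in enumerate(digests) if d != majority]
    return good, bad

master = b"image scan bytes" * 100
# Three replicas, one silently corrupted in its first bytes.
copies = [master, master, b"corrupted" + master[9:]]
consensus, needs_repair = audit_and_repair(copies)
assert consensus == master and needs_repair == [2]
```

Run frequently enough, this kind of audit keeps the window between failure and repair short, which is what makes the "more copies" rule actually pay off.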
If image files are being brought into a managed environment, compression, particularly lossless compression, is much less of a concern. Conversely, if images are being stored on DVDs on a shelf, then compression raises the risks significantly.
One option for file format and compression (lossless and lossy): JPEG 2000
Barriers to JPEG 2000 adoption remain for many organizations (such as limited open-source tools), along with concerns about related risks (corruption and potential legal issues). These issues have been acknowledged within the broader cultural heritage digitization community.
A number of research studies have been conducted on the robustness of JPEG 2000, and they report similar results regarding susceptibility to corruption. Nevertheless, organizations have concluded that JPEG 2000 is an appropriate file format choice from a robustness perspective – "conclude that JPEG 2000 is a good current solution for our digital repositories." A Format for Digital Preservation of Images by Buonora and Liberati http://www.dlib.org/dlib/july08/buonora/07buonora.html
It is worth noting that the format includes some "resiliency" elements that add robustness and thereby counteract some effects of data loss. These resiliency elements are described in the notes at the bottom of the Sustainability of Digital Formats – Planning for Library of Congress web page (http://www.digitalpreservation.gov/formats/fdd/fdd000138.shtml).
Wellcome Library http://jpeg2000wellcomelibrary.blogspot.com/2010/06/we-need-how-much-storage.html In 2009, the Wellcome Library set out an ambitious vision to digitise a large proportion of its historic collections. This would take the annual digitisation activities of the Library from hundreds, or at most, thousands of images per year to several million images per year. … we realised this could see the generation of up to 30m images over 5 years. Exciting, but perhaps slightly daunting, considering we didn't yet have an infrastructure to fully support such a large collection of digital assets.
Wellcome Library- Anyone reading this blog will understand why the scale of the programme is key to the blog topic. When we asked our IT department to tell us how much it would cost to store 30m TIFF files - our de facto standard for the couple hundred thousand images in our existing picture library - we were stunned. Two petabytes of online, spinning disk storage with a top-of-the-line enterprise management system and remote backup would cost how much? We learned that the cost would be something like a fifth of our total budget for the entire digitisation programme.
Wellcome Library- Should we consider a lower-cost storage solution? Even tape back-up was quite expensive for that scale, and you can't serve images up online from tape anyway. We revised our image sizes, factoring in smaller and smaller resolutions and/or bit depths for material like the printed books, which didn't need full colour, high resolution images. We still couldn't afford the storage costs. Finally, we saw the light and started looking into a relatively new image format called JPEG 2000.
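The arithmetic behind the Wellcome figures is easy to reproduce. The 30 million image count is from the blog post; the ~70 MB per-image TIFF size and the 10:1 visually lossless JPEG 2000 ratio are illustrative assumptions, not Wellcome's published numbers.

```python
images = 30_000_000       # from the Wellcome blog post
tiff_mb = 70              # assumed size of one uncompressed master TIFF, MB
jp2_ratio = 10            # assumed visually lossless JPEG 2000 ratio

tiff_pb = images * tiff_mb / 1_000_000_000   # MB -> PB (decimal units)
jp2_pb = tiff_pb / jp2_ratio
print(f"TIFF: {tiff_pb:.1f} PB, JPEG 2000: {jp2_pb:.2f} PB")
```

Under these assumptions the uncompressed total lands at roughly the "two petabytes" Wellcome quoted, while the compressed collection fits in about a tenth of that.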
It is very possible that more digital images are produced by mass digitization efforts and saved as JPEG 2000 files than in any other file format. Despite concerns, and a clear need for organizational support in implementing JPEG 2000, far more cultural heritage organizations are using JPEG 2000 for digitization than most people realize.
Also, JPEG 2000 is widely implemented in other communities as well:
Law enforcement (facial image compression for biometrics).
The Department of Defense and the Intelligence Community have also adopted the ISO JPEG 2000 standard within the National Imagery Transmission Format (NITF) standard.
Conclusions: There is not a single answer to the question of file format for raster image files produced by digitization projects. There are a number of file formats worthy of consideration – suitable from technical, sustainability, fiscal, and other perspectives. Compression can represent a reasonable risk for appropriate efforts, and is likely a practical reality as digitization and digital preservation efforts scale. Not using compression likely represents a real risk, particularly given the dramatic and continued growth in digital data.