Metrics provide heuristics for time and cost estimates for processing projects. They can be used to evaluate processing techniques at varying levels, and they are an essential tool for evaluating the effectiveness of those techniques from a quantitative perspective. In addition, metrics provide an iterative way to create and revisit benchmarks at your institution, allowing archivists to set realistic expectations for productivity.
In their seminal article, Mark Greene and Dennis Meissner devote significant attention to the importance of archival metrics, spending a full five pages reviewing the few tracking efforts that had been undertaken in the United States since the 1970s. They worked under two assumptions: that processing includes arrangement, description, and minor conservation tasks such as removing fasteners and rehousing; and that processing archivists typically devote about 230 days per year to processing, after accounting for vacation, leave, meetings, and other commitments. In 1976, Texas A&M archivist Charles Schultz studied six repositories processing modern manuscript collections and found a processing average of 40 cubic feet per year. In 1978 and 1982, William Maher at the University of Illinois Urbana-Champaign assessed 76 series and collections, suggesting that archivists could process between 250 and 600 cubic feet per year. A smaller 1980 study by W.N. Davis at the California State Archives tracked all staff, including clerks and archivists of all levels, and found an average of 8 hours per cubic foot.
In a 1982 study of 30 NHPRC and 25 NEH processing grant projects, Karen Temple Lynch and Thomas E. Lynch found averages of 12.7 hours per cubic foot for twentieth-century collections and 10.6 hours per cubic foot for organizational records. Of note is their observation that none of the processing projects were completed before their anticipated end dates, and that a number of projects required additional time to finish. A 1985 Washington State study retrospectively tracked processing done by graduate student employees since 1975 and found much higher estimates of processing time.
Uli Haller’s 1987 study of two large congressional collections was significant for his observation that archivists lack standardization in the levels of arrangement, preservation, and access applied to archival material; it was one of the first studies to identify this problem. A 1995 study by archivists at the Billy Graham Center Archives used the earlier studies I’ve described to develop processing expectations in terms of time and cost estimates. Their findings were surprising: they were spending much more time and money on processing per cubic foot than they had anticipated. Their conclusion points out the need to standardize processing levels and areas of measurement in order to create greater uniformity in cost and time estimates across institutions.
Greene and Meissner explored metrics at length in order to demonstrate the lack of standardization in the profession, but they also used them to illustrate that metrics are a natural fit with more efficient archival processing. All of us are challenged to do more with less, including less staffing and less funding; efficient processing is a necessity of today’s economic realities. Given that we may never again see an administrator or granting agency who does NOT expect quantitative tracking of progress, archivists need to gather metrics as a way to justify resource needs. Processing more efficiently allows archivists to provide clear benchmarks and definitions for processing levels, which are easier to explain to administrators and funding agencies. Tracking time spent isn’t enough: by parsing out your processing levels, you can gauge, in an iterative way, how efficient your processing techniques are. Metrics are key to making sure that you are processing effectively. If you are processing at a “minimal level,” how can you verify that? Like peanut butter and jelly, metrics and efficient processing belong together…
The NGTS initiative to create efficient processing guidelines resulted in the creation of a sub-group focused on “defining a methodology and identifying a data gathering instrument for capturing processing rates.” Essentially, this group was charged with crafting minimal guidelines for which elements to track, along with tools for tracking processing.
We decided to start with a review of existing tools and resources, including: the UC system’s CLIR grant to process environmental collections and the resulting metrics report; the Philadelphia Area Consortium of Special Collections Libraries hidden collections project and their tracking worksheets; the OCLC report on assessment; and the comprehensive Harvard Center for the History of Medicine archival metrics database, as well as studies by its users at NC State. From this review, we moved forward with a series of interviews with archivists and librarians about how they track archival processing. We spoke via phone, email, and in person with representatives from… We wanted to ask specifically: What are the essential data points that you track? Who are the key users of your tools or tracking efforts? How do you ensure successful tracking and staff buy-in? And finally, what are the challenges and benefits of tracking metrics?
We summarized the results of these interviews in a report.
Overall, this lightning team agrees that while an instrument for tracking archival processing metrics like the one created at Harvard could be implemented in the University of California, such a tool would need to be created jointly and collaboratively among the UC campuses in order to promote its use and clarify expectations, benchmarks, and units of measurement. If such a tool were created, this lightning team highly recommends that it be a web-based tool that is accessible and scalable across the UC campuses. Staff time and financial resource constraints, in addition to concerns about the intended and potential usefulness of processing data, led our team to create a simple processing tracking spreadsheet template that could be implemented across the UC system.
Our chapter, added to the Lightning Team 2 manual for efficient processing, addresses the importance of processing plans in helping processors make educated decisions about processing levels and anticipated time and resource estimates for processing projects. The draft defines key units of measurement, defers to LT2's definitions of processing rates and levels as part of those units, and provides a template instrument for recording processing rates. Most significantly, the template instrument and chapter provide a minimum expectation for tracking archival processing, while leaving flexibility for institutions with greater resources to go above and beyond these ground-floor expectations. As you can see, we’ve identified the following elements to track… Ultimately, there is no prescriptive standard for which processing tasks are included, and we suggest that tasks can be tallied separately depending on the needs of the institution. Our simple spreadsheet calculates the processing rate per linear foot based on the extent and processing hours entered into the sheet.
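The spreadsheet's core arithmetic (processing hours divided by extent in linear feet) can be sketched in a few lines. This is an illustrative sketch only; the function name and the validation check are my own assumptions, not part of the actual template:

```python
def processing_rate(extent_linear_feet: float, processing_hours: float) -> float:
    """Hours of processing per linear foot, mirroring the spreadsheet's
    extent-and-hours calculation."""
    if extent_linear_feet <= 0:
        raise ValueError("extent must be a positive number of linear feet")
    return processing_hours / extent_linear_feet

# Example: a 20-linear-foot collection processed in 80 hours
print(processing_rate(20.0, 80.0))  # 4.0 hours per linear foot
```

The same rate, tracked per processing level, is what lets you compare techniques iteratively over time.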
In the Efficient Processing Guidelines, we point to several of the tools that we evaluated as well as the worksheet template that we developed. If there’s time I am happy to walk you through these resources. Thank you!
State of the Art: Methods and Tools for Archival Processing Metrics
Audra Eagle Yun, MLIS, CA
Acting Head of Special Collections and Archives
University of California, Irvine Libraries
Society of California Archivists
Annual General Meeting 2013
Why archival metrics?
• Heuristics for time and cost
• Evaluate processing techniques
• Quantitatively evaluate effectiveness
• Create and revisit benchmarks
A history of archival metrics
• Greene and Meissner’s review
– Tracking includes arrangement, description, and minor conservation
– Archivist works average of 230 days per year
• 1976: Charles Schultz study
– 40 cubic feet/year
• 1978 and 1982: William Maher study at University of Illinois
– general files 3.0 hours / cubic foot
– personal papers 6.9 hours / cubic foot
• 1980: W.N. Davis at California State Archives
– 8 hours / cubic foot, average of all staff
A history of archival metrics, continued
• 1982: Karen Temple Lynch and Thomas E. Lynch study of NHPRC and NEH processing grants
– 20th century collections average 12.7 hours / cubic foot
– organizational records average 10.6 hours / cubic foot
• 1985: Terry Abraham, Stephen Balzarine, Anne Frantilla at Washington State
– graduate workers average 5.5 days / cubic foot (<1 foot) or 3 days / cubic foot (>1 foot) for manuscripts
– 2 days / cubic foot for archival series
A history of archival metrics, continued
• 1987: Uli Haller, University of Washington
– Large 20th century collections, 3.8 hours / cubic foot
– Significant observation: lack of standardization of levels for arrangement, preservation, and access
• 1995: Paul Erickson and Robert Shuster, Billy Graham Center Archives
– Used prior studies to set expectations
– Significant observation: “we’re processing more intensively than we realized or intended”
– Actual averages: 15.1 hours and $375 / cubic foot
– Important conclusion: “It is almost accepted as a given in the literature that processing methodologies and local conditions vary so widely from archives to archives that figures developed at one institution are meaningless at another”
Metrics and efficient processing:
a love story
• Economic realities
• Justification for resources
• Gauging techniques
NGTS POT3 LT2B
• Alphabet soup
– University of California Libraries’ Next-Generation Technical Services, Power of Three Group 3, Lightning Team 2B
• Charge: “Define a methodology and identify a data gathering instrument for capturing processing rates, to facilitate cost/benefit analysis of processing approaches. Data collected will additionally assist campuses in evaluating local processing benchmarks.”
Environmental scan & interviews
• Literature and resource review
– CLIR UCEC
– CHoM database and users: NCSU, Princeton
• Email, phone, and in-person interviews
– Harvard Medical School Center for the History of Medicine
– UCLA Library Special Collections
– UCB Bancroft Library
– Stanford University Library, Special Collections & University Archives
– Free Library of Philadelphia
– Princeton University, Seeley G. Mudd Manuscripts Library
• Interview questions
– Key users of tracking tools
– Essential data points
– Ensuring success/buy-in
– Challenges and benefits
• Institutions that have implemented a tracking database or system like the Harvard Processing Metrics Database indicate that these tools can become integrated into the work structure, given team support and involvement in planning and implementation.
• Institutions that have chosen not to use a structured tracking system indicate that these tools are far more complex, granular, and time-consuming than is necessary. A few suggested that the level of detail expected for such systems is incompatible with MPLP techniques.
• Both users and non-users of a complex tracking database commented on the barriers to using Microsoft Access and suggested that a web-based solution would be more user-friendly.
• Interviewees suggested that clear expectations about units of measurement (time and linear feet) and processing plans are among the most useful aspects of processing metrics.
• Interviewees who were not tracking archival processing metrics advocate for resource allocation estimates (time and linear feet) that are created during planning and reviewed upon completion of processing projects.
• Some interviewees expressed concern about the possibility of tracking data being used to assess staff productivity or individual work quality.
• All interviewees discussed the importance of tracking processing work in some way in order to justify funding and staffing from resource allocators, as well as to provide more accurate information about expected timeframes and the perceived value of archival work. One interviewee suggested that archival metrics data can also assist in collection development decisions, contributing to estimates of how much time and money is needed to make certain types of collections available.
• Metrics not as a mandate, but as a facilitator of data-driven decision-making
• Can help create a set of common benchmarks across the UC campuses
• Metrics for justification of needs, demonstration of value
• Minimum baseline data elements