
State of the Art: Methods and Tools for Archival Processing Metrics



Presented to the 2013 Annual General Meeting of the Society of California Archivists.



  1. Methods and Tools for Archival Processing Metrics
     Audra Eagle Yun, MLIS, CA
     Acting Head of Special Collections and Archives, University of California, Irvine Libraries
     Society of California Archivists Annual General Meeting 2013, Berkeley, California
  2. Why archival metrics?
     • Heuristics for time and cost
     • Evaluate processing techniques quantitatively
     • Create and revisit benchmarks
  3. A history of archival metrics
     • Greene and Meissner’s review
     • Assumptions:
       – Tracking includes arrangement, description, and minor conservation
       – Archivist works an average of 230 days per year
     • 1976: Charles Schultz study
       – 40 cubic feet / year
     • 1978 and 1982: William Maher study at the University of Illinois
       – General files: 3.0 hours / cubic foot
       – Personal papers: 6.9 hours / cubic foot
     • 1980: W.N. Davis at the California State Archives
       – 8 hours / cubic foot, average of all staff
  4. A history of archival metrics, continued
     • 1982: Karen Temple Lynch and Thomas E. Lynch study of NHPRC and NEH processing grants
       – 20th-century collections: average 12.7 hours / cubic foot
       – Organizational records: average 10.6 hours / cubic foot
     • 1985: Terry Abraham, Stephen Balzarine, and Anne Frantilla at Washington State
       – Graduate workers: average 5.5 days / cubic foot (<1 foot) or 3 days / cubic foot (>1 foot) for manuscripts
       – 2 days / cubic foot for archival series
  5. A history of archival metrics, continued
     • 1987: Uli Haller, University of Washington
       – Large 20th-century collections: 3.8 hours / cubic foot
       – Significant observation: lack of standardization of levels for processing
     • 1995: Paul Erickson and Robert Shuster, Billy Graham Center Archives
       – Used prior studies to set expectations
       – Significant observation: “we’re processing more intensively than we realized or intended”
       – Actual averages: 15.1 hours and $375 / cubic foot
       – Important conclusion: “It is almost accepted as a given in the literature that processing methodologies and local conditions vary so widely from archives to archives that figures developed at one institution are meaningless at another”
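The per-foot rates surveyed above translate directly into project estimates. As a minimal sketch (the 230 workdays/year figure is the assumption noted on the earlier slide; the 7.5-hour workday and the 50-cubic-foot example at the 1980 Davis rate of 8 hours/cubic foot are illustrative, not from any of the studies cited):

```python
# Rough project-estimate helper built on an hours-per-cubic-foot rate.
HOURS_PER_DAY = 7.5    # assumed working hours in an archivist's day (illustrative)
DAYS_PER_YEAR = 230    # workdays/year, per the assumption noted in the slides

def processing_estimate(extent_cubic_feet, hours_per_cubic_foot):
    """Return (total_hours, workdays, archivist_years) for a collection."""
    hours = extent_cubic_feet * hours_per_cubic_foot
    days = hours / HOURS_PER_DAY
    return hours, days, days / DAYS_PER_YEAR

# Example: a 50-cubic-foot collection at 8 hours / cubic foot
hours, days, years = processing_estimate(50, 8)
print(f"{hours:.0f} hours ≈ {days:.1f} workdays ≈ {years:.2f} archivist-years")
# → 400 hours ≈ 53.3 workdays ≈ 0.23 archivist-years
```

As the Erickson and Shuster conclusion warns, the rate itself is the unreliable part: figures developed at one institution rarely transfer to another, so the multiplier should come from locally tracked data.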
  6. Metrics and efficient processing: a love story
     • Economic realities
     • Justification for resources
     • Gauging techniques
  7. NGTS POT3 LT2B
     • Alphabet soup: University of California Libraries’ Next-Generation Technical Services, Power of Three Group 3, Lightning Team 2B
     • Charge: “Define a methodology and identify a data gathering instrument for capturing processing rates, to facilitate cost/benefit analysis of processing approaches. Data collected will additionally assist campuses in evaluating local processing benchmarks.”
  8. Environmental scan & interviews
     • Literature and resource review
       – CLIR UCEC
       – PACSCL
       – OCLC
       – CHoM database and users: NCSU, Princeton
     • Email, phone, and in-person interviews
       – Harvard Medical School Center for the History of Medicine
       – UCLA Library Special Collections
       – UCB Bancroft Library
       – Stanford University Library, Special Collections & University Archives
       – Free Library of Philadelphia
       – Princeton University, Seeley G. Mudd Manuscripts Library
     • Topics
       – Key users of tracking tools
       – Essential data points
       – Ensuring success/buy-in
       – Challenges and benefits
  9. Interview findings
     • Institutions that have implemented a tracking database or system like the Harvard Processing Metrics Database indicate that these tools can become integrated into the work structure, given team support and involvement in planning and implementation.
     • Institutions that have chosen not to use a structured tracking system indicate that these tools are more complex, granular, and time-consuming than necessary. A few suggested that the level of detail such systems expect is incompatible with MPLP techniques.
     • Both users and non-users of a complex tracking database commented on the barriers to using Microsoft Access and suggested that a web-based solution would be more user-friendly.
     • Interviewees suggested that clear expectations about units of measurement (time and linear feet) and processing plans are among the most useful aspects of processing metrics.
     • Interviewees who were not tracking archival processing metrics advocate resource allocation estimates (time and linear feet) that are created during planning and reviewed upon completion of processing projects.
     • Some interviewees expressed concern about the possibility of tracking data being used to assess staff productivity or individual work quality.
     • All interviewees discussed the importance of tracking processing work in some way, both to justify funding and staffing to resource allocators and to provide more accurate information about the expected timeframe and perceived value of archival work. One interviewee suggested that archival metrics data can also assist collection development decisions by contributing to estimates of how much time and money is needed to make certain types of collections available.
  10. Recommendations
     • Metrics not as a mandate, but as a facilitator of data-driven decision making
     • Can help create a set of common benchmarks across institutions
     • Metrics for justification of needs and demonstration of value
     • Minimum baseline data elements identified
  11. Available tools
     • UC Libraries Baseline Archival Processing Metrics Spreadsheet
     • PACSCL/CLIR Hidden Collections Processing Project, Processing Worksheet
     • Processing Metrics Collaborative: Database Development Initiative
     • UC Libraries Archival Processing Metrics Worksheet: http://uclib-prd-old.cdlib.org/cdc/hosc/efficientprocessing/index.html
