Tufts/Yale Project Reports

On October 13, 2009, in Best Practices, Research, Software Reviews, by Chris Prom

Over the past week and a half, I spent a good amount of time reading and grappling with the Tufts/Yale “Fedora and the Preservation of University Records Project” reports. I am very glad I did, and I think the reports deserve a lot more attention than, to my knowledge, they have previously received. There are several reasons why I like them so much.

First, they challenged some of my own assumptions, and I hope that after having read them I will avoid making some very big mistakes over the next several months. Electronic records are a new area for me, and like many archivists, I would very much like to find an easy approach to e-records in general and my research project in particular. Of course I know how complex the issues are, but I deliberately conceptualized my research proposal in a very simple (possibly even naive) fashion: I was hoping that by doing software evaluation and working with real live records, I could somehow assemble a sort of jackknife that I could pull out of my pocket to bring myself and others in the profession at least several steps closer to solving e-records problems. Similarly, Eliot Wilczek and Kevin Glick (co-PIs on the Tufts/Yale project) thought they could accession some records into Fedora to evaluate whether it could serve as a good preservation environment for e-records. But during their project, they discovered not only that Fedora 2.1 had many shortcomings, but that they were asking the wrong question. As a result, they ended up refocusing their project to develop a true set of evaluation criteria for determining whether an institution (comprising its people, infrastructure, and applications) can ingest, maintain, and provide access to electronic records.

That is a very, very big task, and one I had (belatedly) come to think I was going to need to complete on my own, at least until I read their report. So, I am indebted to them for saving me a lot of work!

That brings me to the second reason I like their project so much: their reports provide a very useful framework that can be used not only to develop institution-specific workflows and project plans, but (more critically) to evaluate applications as a part of those workflows/plans. In other words, I feel just a little more able to grapple with the whole range of activities, software, and hardware that is involved in e-records appraisal, preservation, management, and access than I did before I read about their project. Specifically, the report provides at least three very valuable things:

  • Requirements for Trustworthy Recordkeeping Systems and the Preservation of Electronic Records in a University Setting. Report 5.1. The Records Preservation Section provides an amalgamation and crosswalk of requirements from numerous other standards*, conceived and executed within the context of 34 subsections of the OAIS reference model. For each subsection (e.g. “Receive Submission”), they list the features, behaviors, and qualities that an Institution, the institution’s infrastructure, a Juridical Person, a Natural Person, or the Preservation Application must, should, or may manifest in order to ensure the trustworthy preservation of authentic electronic records. While the list is daunting, it is less so than others I have seen, and it is doubly useful in that each requirement is mapped to the corresponding requirements in the other standards footnoted below.
  • Ingest Guide. Report 2.1. The ingest guide describes a detailed set of (non-linear) activities that a repository would need to undertake in order to ensure a trustworthy ingest process, one that would allow users to reach a reasonable conclusion that materials maintained in a repository are authentic, i.e. that they provide trustworthy evidence of the activities they document. The guide is split into two discrete activities, “Negotiate Submission Agreement” and “Transfer and Validate Records.” For each subsection (e.g. “Establish Relationships” or “Package Records in a SIP”), a decision tree is provided, along with a brief description of the step and a list of any data points or reports that the step either requires as an input or produces as an output. (For example, step A 1.3, “Identify Producer,” uses accession logs, activity logs, and (possibly) producer records as input and modifies the producer entry as output; see the sketch after this list.) Helpfully, an appendix
  • Maintain Guide. Report 3.1. The “Maintain Guide” focuses on one part of the overall system of policies, infrastructure, people, and tools necessary to deliver records to users in a way that preserves their continuity and ‘recordness’: the technologies and procedures needed to implement a preservation repository. It lists a series of regular and irregular high-level events (such as verifying record components or accessioning digital objects) that the application and procedures need to support in order to preserve both the record components and the archival information package (AIP), along with its associated preservation description information.
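
To make the ingest guide’s input/output bookkeeping concrete, here is a minimal sketch of my own (not from the reports; the class and field names are my invention) of how a step like A 1.3 could be modeled in Python:

    from dataclasses import dataclass, field

    # Hypothetical model of one ingest step as the guide describes it:
    # each step names the data objects and reports it consumes and the
    # ones it creates or modifies.
    @dataclass
    class IngestStep:
        step_id: str   # e.g. "A 1.3"
        name: str      # e.g. "Identify Producer"
        inputs: list = field(default_factory=list)   # data objects/reports required
        outputs: list = field(default_factory=list)  # data objects/reports produced or modified

    identify_producer = IngestStep(
        step_id="A 1.3",
        name="Identify Producer",
        inputs=["accession log", "activity log", "producer records"],
        outputs=["producer entry"],
    )

    # Given a list of such steps, one can ask which steps touch a
    # particular data object or report.
    steps = [identify_producer]
    print([s.name for s in steps if "accession log" in s.inputs])  # ['Identify Producer']
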

Even though the reports are complex, they make it possible (if still arduous) to define a set of applications and services that can manage the entirety of the e-records appraisal, submission, ingest, preservation, management, and dissemination functions.

In this respect, my project is at a critical juncture, because while I have been reading the Tufts/Yale reports, I’ve also been slowly building a list of applications to examine, test, and evaluate. But it only makes sense to test and evaluate them once they have been cross-referenced against a generic workflow and set of requirements (such as the ingest activities or maintenance events in the Tufts/Yale reports). That is the task I’ll be working on over the next week or so, hopefully in database form, since certain applications may cover multiple activities and certain activities may require more than one application; a rough sketch of the database idea follows.
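
To illustrate what I mean by “database form,” here is a rough sketch using Python’s built-in sqlite3 module. The table and column names are hypothetical and the sample rows are invented; the point is the many-to-many join table between applications and activities:

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE application (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE activity    (id INTEGER PRIMARY KEY, name TEXT);
    -- The join table captures the many-to-many relationship: one
    -- application may cover several activities, and one activity may
    -- require several applications.
    CREATE TABLE coverage (
        application_id INTEGER REFERENCES application(id),
        activity_id    INTEGER REFERENCES activity(id),
        notes          TEXT,
        PRIMARY KEY (application_id, activity_id)
    );
    """)

    # Invented sample data: one activity covered by two applications.
    con.executemany("INSERT INTO application VALUES (?, ?)",
                    [(1, "App A"), (2, "App B")])
    con.execute("INSERT INTO activity VALUES (1, 'Transfer and Validate Records')")
    con.executemany("INSERT INTO coverage VALUES (?, ?, ?)",
                    [(1, 1, "checksums only"), (2, 1, "full validation")])

    # Which applications cover which activities, and how well?
    for row in con.execute(
        "SELECT ap.name, ac.name, c.notes FROM coverage c "
        "JOIN application ap ON ap.id = c.application_id "
        "JOIN activity ac ON ac.id = c.activity_id"
    ):
        print(row)

The notes column is where the actual evaluation would live: how well, and with what caveats, a given application supports a given ingest activity or maintenance event.
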


* Among these I’d include documents such as the Indiana University Functional Requirements for Recordkeeping Systems; the InterPARES I Project’s “Requirements for Assessing and Maintaining the Authenticity of Electronic Records”; ISO 15489-1; MoReq2 and MoReq1; the Public Record Office’s Functional Requirements for Electronic Records Management (which was superseded by MoReq); UCSD’s Preserving the Electronic Records Stored in a Records Management Application (PERM project results); the U.S. Department of Defense 5015.2 standard; and the Pittsburgh Functional Requirements for Evidence in Recordkeeping.
