
The evolution of the collections management system


How the museum collections management system has developed


The Evolution of the Collections Database
Ian Rowson

This presentation is about how both changes in technology and wider influences have affected Collections Management System (CMS) development, and how they will continue to do so. As I represent Adlib Information Systems, this talk will have a strong Adlib flavour. I'm going to attempt to break it down into a chronological progression, and try not to get too bogged down in the technicalities, although of course they do play their part in the story.

The beginnings of collections automation

Libraries led the way with their use of computer systems to store catalogue data in the late 1960s, and as a software product, Adlib shares this heritage. Prior to this, of course, libraries had employed card catalogues. Computerised library systems largely rely on cataloguing according to the MARC standard, also developed in the 60s – one copy of a book is, after all, very much like another, and therefore cataloguing can become a largely automated process. Library catalogues were basically a replacement for the old card index system; workflow processes such as book purchasing and management of loans didn't follow until later.

Adlib came on the scene in the mid 70s, and was built using the FORTRAN 4 programming language. The software was designed as a generic 'information management' tool, in other words a software toolkit for building database applications, a bit like the modern Microsoft Access or FileMaker Pro. The first customer system was shipped in 1978.

A restriction on the uptake of automation in libraries was cost. This was way before PCs, so computers were large and expensive. Adlib software was designed to run on PRIME minicomputers. While researching this presentation I came across a great still from a TV advert for PRIME computers which I have to share with you. We can laugh at the idea of 'stepping into the 80s with Prime', but doesn't that picture remind us (those of us who are old enough to remember the 80s, anyway) how far things have moved on, technologically speaking?

And for me, this is the great dilemma of museum computing. We want to preserve our collections data, like our collections, into the future. But can we be sure that in 20 years' time we won't be laughing at the technology we use today? I think we probably will be. Technology races ahead at breakneck speed. Obsolescence of computing hardware, operating systems and software, not to mention data storage media, is among the greatest risks posed to our data. It's not just that these issues may arise if we're unlucky: they WILL arise, and so we have to be ready for them. If you take only one thought away from this session, I'd like you to ponder the data preservation issues in your own institution.

What led to the demise of PRIME, along with so many other computer manufacturers, was of course the emergence of the personal computer, the PC. Fortunately, Adlib as a company had anticipated and prepared for this eventuality, and our software had been successfully ported to MS-DOS, and then on through the
different versions of Windows; with each development, customers' data was safely carried forward into the new computing environment.

These changes happened in the form of a continuous evolution. At no point in time was a total cut-off imposed upon the users of the Adlib software which 'forced' them into adopting a new technology. Older implementations faded away 'naturally' and new technology was introduced gradually. This was, and still is, a deliberate choice, allowing users, but also the software developers and support staff, to live through smooth transitions. For this reason, new developments will continue to take place on multiple tracks, but with a convergence towards the same technology.

If planned correctly, a technology transition can occur as a natural process, almost unnoticed by the user. For example, at the moment we are in the midst of another such transition. Like many of our competitors of a similar age and heritage, Adlib started off running its own 'proprietary' or native database platform; this data format is unique to our software. Modern IT departments are reluctant to implement such databases, preferring instead to adopt the more widely employed database platforms, such as MS SQL Server and Oracle. About four years ago we adapted Adlib to be able to run on those platforms, and we've been gradually upgrading customer systems, on request, to use them. We anticipate that use of the proprietary database will eventually decline, but this process will likely take a fair number of years. We certainly have no plans any time soon to withdraw support for the many hundreds of 'native' Adlib systems which are currently out in the field.

I mentioned earlier how Adlib was developed as a 'database building kit'. The name 'Adlib' actually stands for 'adaptive library', meaning the structure of the system is flexible. Adlib's application-building toolset, which, true to the original concept from the 70s, is shipped with each copy of the software, enables the trained system administrator to carry out a whole range of tasks, including adding new databases, fields, indexes or screen layouts to the system. Such is the capability of this tool that we have no need to use externally provided database software within our organisation: all our internal systems, such as our customer relationship management and helpdesk databases, are built using our own software.

The slide shows one screen from the current version of Adlib Designer, which is a Windows-based application. Back in the early days of Adlib, this functionality was offered through a character-based interface that ran from the operating system prompt. This was powerful, but quite tricky to use.

A library cataloguing system was the first commercially available product built using Adlib, but it wasn't long before customers using this application came to us and asked if we could build them a database for recording their object collections as well. This was done on an ad-hoc basis, until the emergence of
Collections Trust's (at that time MDA) Spectrum standard gave clear direction for software developers about what a museum CMS should look like. Adlib in fact played a supporting role in the development of the Spectrum standard, and you can be sure we will continue to do so in future. Incidentally, the same approach was adopted for the development of the Adlib archive application in the late 90s, although this time the standard to be implemented in the software was the archival description standard ISAD(G).

What I've simplistically sketched out so far is a linear form of technical development, eventually leading to the Adlib Museum CMS package in use in over 1,500 institutions worldwide. However, developments such as this, which were mirrored across the world by many software companies, were not universally welcomed by the museum profession in the early days. In her MA thesis, entitled The Evolution of Museum Collection Management Software, Perian Sully describes how in the late 60s IBM and the US Metropolitan Museum of Art had convened a conference to discuss the future of computer technology in US museums. And I quote:

"This concern that curatorial or scholarly product would be overshadowed or undermined by the computer is a recurrent topic to this day. This fear was summarized in 1968 by curator J.C. Gardin, when discussing the institutional implications of collections technology. He asks if there is:

a) a danger of substituting superficial, mechanical knowledge for 'organic and deeper form of culture' gained from the personal work of curators,

b) a contradiction between rigidly organized data of the database and the intellectual viewpoints of personal curatorial files, and

c) a risk of subordinating individual research to 'de facto monopolies of information that may eventually have the power to control the whos and whats of scientific inquiry?'

Despite the early worries of curators that their oversight and knowledge would not be properly reflected within these new computer systems, the need for tracking and accountability of objects took centre stage with other professionals."
(Sully, P 2006)

In other words, the great motivator for the uptake of CMS in US museums was the need to carry out audits of collections, to be able to demonstrate accountability.

Sully continues:
"Museums of all sizes found that they needed to get their record-keeping in order. In the 1960s, large institutions had led the charge, but during the 1970s mid-sized museums realized that they, too, needed to make sure their records were in order. Fortunately, computers had decreased substantially in cost. The microcomputer became widely available to museums with fewer resources."
(Sully, P 2006)

Here in the UK, I would argue that although accountability was no doubt a driver in the early days, a great impetus was felt in the late 1990s from the New Labour government's 'e-learning' objectives, mainly set by the National Grid for Learning, the vision for which was first outlined in the report Connecting the Learning Society (Department for Education & Employment, 1997).

Then came a swathe of texts focused on delivering digital collections to fuel an 'educational provision' agenda, but these did tend to gloss over issues about the management or curatorship of the digital collections developed for this purpose. See, for example, A Netful of Jewels: New Museums in the Learning Age (National Museum Directors' Conference, 1999) and Building the Digital Museum: A National Resource for the Learning Age (Smith, 2000). Together these texts signalled a new direction in policy which aimed to fully establish learning as the central function of the museum, with new technologies deemed to be the method of delivering that service to the wider community (Smith, L 2000).

The funding possibilities which ran 'on the back of' these initiatives led to a great expansion of CMS implementation in the UK. This infamous 'rush to digitise' resulted in many projects that opened a window onto collections data, some of which were perhaps not quite ready to have a window opened on them – mainly for reasons of incomplete or unverified data. After all, what museum is not carrying a documentation backlog?

The overriding desire to open up collections data for educational and public access became a major justification for accessing funding to undertake a CMS project. This, of course, was made possible and desirable by the growth of the world wide web.

Current CMS have since matured to offer a bewildering array of functionality, not unlike business software applications such as MS Word. Who uses more than about 20% of the capabilities of these?

"Although (like in the library model beforehand) CMS began as simple tools for cataloguing collections, they are now used to track inventories, donor information, condition reports, artist biographies, exhibition information, bibliographic texts, and curatorial papers as well as present multimedia files and interface with the museum's Website. The function is shifting from being a collections management system to a content management system."
(Sully, P 2006)

We now take for granted such features as image and multimedia management, driven by the need to provide exciting interactive material for web users, but enabled by the capabilities of the inexpensive, powerful PC.
Also web-driven is support for the 'social networking' phenomenon: we are incorporating into Adlib products the ability to capture user-generated content such as comments, tagging and uploaded images.

However, despite all this functionality, from my own personal experience (and this is borne out by Sully's research), the CMS installed in the average museum remains quite severely under-utilised. I wonder why this should be?

Sully did some research into this issue. She tells us that Richard Gerrard looked at the number of failed projects in the past and suggested that failure was a historical trend, because there was often early enthusiasm for new features, buoyed by an infusion of grants. This, he said, created inflated expectations on the part of users, a lack of critical examination by developers, and resistance within the institution's administrative structure. "Soon thereafter, the feature which promises this great advancement in productivity is abandoned in favor of the next technological wonder." (Sully, P 2006)

My take on this is that the purchase and installation of a CMS often seems to be championed by a particular member of staff. When they leave to go to another job, systems can then often seem to 'drift' without direction. What is really needed is for a specific member of staff to be assigned to manage the system, but often this does not happen in a smaller institution. It is much more reliant on the interests of particular personalities, whose main job is invariably something else.

I'm going to bring things right up to date, to look at how current trends are shaping the CMS of the future. A key driver of development at the moment is the API, the application programming interface. But what is an API, and why would you need one?

Modern computer program design (service-oriented architecture, or SOA) promotes the breaking up of complex applications into small, manageable components that communicate with each other using APIs. Designing programs in this way not only makes a system flexible and scalable, but also provides a platform for integration between different software components (even from different vendors). Adlib currently supports this model to some degree.

Let me give you a real-world example: a fairly recent development, the Adlib image handler API. The idea behind this is as follows. Adlib, like most other CMS packages, has the in-built capability to display linked images of collection objects. However, many customers are already using other software with similar capabilities, such as content management and/or digital asset management packages, leading to overlap and duplication of functionality. Images which are stored in one software package need to be accessible from the others.

APIs offer a solution to this problem. The Adlib media handler separates the image management function out from our software, in such a way that it can easily be accessed either by Adlib or by other, external software. Furthermore, the possibility is also raised that images held in other software (such as a DAMS) could be linked to by the Adlib CMS, instead of using the Adlib image handler.
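To make the idea a little more concrete, here is a minimal sketch, in Python, of how an external application might request an image from a stand-alone media service over HTTP. The endpoint, parameter names and response format are purely illustrative assumptions for this talk; they are not the actual Adlib image handler interface.

    # Illustrative client for a hypothetical stand-alone media-handling service.
    # The URL and query parameters are assumptions made for this sketch; they are
    # not Adlib's actual image handler API.
    import urllib.parse
    import urllib.request

    MEDIA_SERVICE_URL = "https://collections.example.org/media"  # hypothetical endpoint

    def fetch_image(object_number: str, width: int = 640) -> bytes:
        """Request a derivative of the image linked to the record identified by object_number."""
        query = urllib.parse.urlencode({"record": object_number, "width": width})
        with urllib.request.urlopen(f"{MEDIA_SERVICE_URL}?{query}") as response:
            return response.read()  # raw image bytes, ready to cache or display

    # Any consumer (the CMS itself, a public website, or a DAMS) can call the same
    # service, so image storage and resizing live in exactly one place, e.g.:
    # thumbnail = fetch_image("1998.42.7", width=200)

Because every consumer talks to the same small service, the image management function is written, and maintained, only once.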
But we are not stopping there. In future, all programs in the Adlib suite will eventually follow the SOA paradigm. To support this, a new set of APIs is being developed, supporting both data access and metadata access. The modules will be accessible through web services and as 'traditional' (managed) DLLs. External stakeholders (including customers) were invited to cooperate in the API development process earlier this year, and development is already under way.

Another current development from the IT world is that of cloud computing. But what does it mean? I've turned to Wikipedia for an explanation:

Cloud computing is a style of computing in which information technology resources are provided as a service over the Internet. Users need not have knowledge of, expertise in, or control over the technology infrastructure "in the cloud" that supports them. The term cloud is used as a metaphor for the Internet, based on how the Internet is depicted in computer network diagrams, and is an abstraction for the complex infrastructure it conceals.

The concept incorporates infrastructure as a service (IaaS) and software as a service (SaaS), as well as other technology trends from the last couple of years that have the common theme of reliance on the Internet for satisfying the computing needs of the users. Cloud computing services usually provide common business applications online that are accessed from a web browser, while the software and data are stored on remote servers.

The key driver behind cloud computing is that users can avoid capital expenditure on hardware and software, instead paying a provider only for what they use. Consumption is billed on a utility basis (e.g. resources consumed, like electricity) or a subscription basis (e.g. time-based, like a newspaper), with little or no upfront cost. Other benefits of this time-sharing style of approach are low barriers to entry, shared infrastructure and costs, low management overhead and immediate access to a broad range of applications. Users can generally terminate the contract at any time (thereby avoiding return-on-investment risk and uncertainty), and the services are usually covered by service level agreements.

According to Nicholas Carr, the strategic importance of information technology is diminishing as it becomes standardised and cheaper. He argues that the cloud computing paradigm shift is similar to the displacement of electricity generators by electricity grids early in the 20th century.
(Wikipedia 2009)

Adlib has been offering its CMS systems 'in the cloud' as a subscription service for a couple of years now, and while we have a few, mainly commercial, customers using these services, generally speaking uptake from the museum sector has been slow. I'd suggest there are a couple of possible reasons for this:

  • In the UK, museums have so far been able to access funding for capital projects from a variety of sources. Funding which pays an annual fee, on the other hand, is more difficult to raise.

  • Museums are reluctant to hand over custody of their data to an outside organisation, and of course there are risks associated with this which must be managed.

Wikipedia lists seven security issues which one should discuss with a cloud-computing vendor in order to mitigate risks:

  1. Who has access to your data?
  2. Is the vendor willing to undergo external audits and/or security certifications?
  3. Data location: does the provider allow any control over the location of your data?
  4. Data segregation: is data encryption available?
  5. Recovery: what will happen to data in the case of a disaster? Is complete restoration offered and, if so, how long would it take?
  6. Investigative support: does the vendor have the ability to investigate any inappropriate or illegal activity?
  7. Long-term viability: what will happen to data if the company goes out of business? How will data be returned, and in what format?

In practice, one can best determine data-recovery capabilities by experiment: asking to get back some data, seeing how long it takes, and verifying that it is correct.
(Wikipedia 2009)
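Purely as an illustration of that 'get some data back and check it' test, the sketch below compares a locally kept export with the copy returned by a provider. It is a minimal example with hypothetical file names; a real test would also look at turnaround time and record counts.

    # A rough sketch of the data-recovery check described above.
    # File names are hypothetical examples, not any vendor's actual export format.
    import hashlib
    from pathlib import Path

    def checksum(path: Path) -> str:
        """Return the SHA-256 digest of a file's contents."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    original = Path("collection_export_original.xml")  # snapshot kept in-house before the test
    returned = Path("collection_export_returned.xml")  # copy supplied back by the provider

    if checksum(original) == checksum(returned):
        print("Returned data matches the original export.")
    else:
        print("Mismatch: the returned data differs from the original export.")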
Our brand name ADLIB stands for 'ADaptive LIBrary' system, and although the use of our software is no longer restricted to just libraries, the 'adaptive' or 'flexible' qualification has always been retained as the key benefit of using our software:

  • Flexibility in the form of the Adlib Designer toolkit, which allows the trained system administrator to make changes to both the database structure and the behaviour of the software.

  • Flexibility in the form of APIs, which allow tight integration with other software applications in use within the institution and allow data to be re-purposed in audio tours, on the web or by digital asset management systems.

  • Flexibility in the different ways you can run the software: by traditional purchase, or 'in the cloud' as a service.

In the next generation of products we want to move a step further and place even more flexibility in the hands of the actual user of the system, as opposed to the system administrator. In current versions this process has already started, for instance by enabling users to adapt their own toolbar, or to generate reports on the fly using the 'print wizard'. This principle will be implemented throughout, with more 'personal' preference settings. One can think of search behaviour (e.g. default truncation or the provision of lists) or the appearance of the software (allowing the user to add style sheets or personal output formats, or to change colour schemes).

One thing you can be sure of is that development of the Adlib product range will remain at the leading edge of CMS development. We understand that technology is a shifting sand on which to build, but we employ proven strategies to deal with that. Adlib has the experience and the capability to help all collecting institutions secure their data for future generations.

References

SMITH, L. (ed.) (2000) Building the Digital Museum: A National Resource for the Learning Age. Cambridge: MDA.

SULLY, P. (2006) Inventory, Access, Interpretation: The Evolution of Museum Collection Management Software. MA thesis, John F. Kennedy University. Available at: http://conference.archimuse.com/biblio/inventory_access_interpretation_the_evolution_of_muse.html

WIKIPEDIA (2009) API. http://en.wikipedia.org/wiki/API

WIKIPEDIA (2009) Cloud computing. http://en.wikipedia.org/wiki/Cloud_computing
