The Content Task Group has investigated various options for subscribing to, purchasing, and creating digital content. In addition, we investigated copyright and worked with University Counsel to create fair use guidelines for images; we created a metadata subgroup, which is currently working on Digital Asset Management implementation; and we continue to explore other content issues for the Digital Asset Management System. Early on we discovered that the best and least expensive way to provide content to the CU system would be through a subscription service. The focus quickly turned to ARTstor, and the majority of our time over the last few months was spent investigating ARTstor, coordinating and holding demos for the Steering Committee and other interested parties, setting up a two-week trial period, organizing a faculty focus group to test ARTstor, and creating a questionnaire for those faculty across the system who agreed to participate. The Content portion of today's presentation will focus on the ARTstor trial and faculty feedback, and conclude with a recommendation that the CU system subscribe to ARTstor as a means of providing the critical mass of content necessary to transition to digital images.
The Steering Committee charged the Infrastructure Group with investigating digital asset management systems, focusing on issues of functionality, scalability, and interoperability. The group evaluated 19 digital asset management systems using a comprehensive list of criteria; we will describe this evaluation process in greater detail later in the presentation. The group determined that Luna Insight is the only system that satisfies the majority of the evaluation criteria.
The trial period ran January 18th through February 1st, 2005; it was the second week of the new semester for UCB but the first for Auraria and UCCS. Faculty members from Boulder, Auraria, and Colorado Springs who teach with images were identified prior to the two-week ARTstor trial period and asked to provide feedback on ARTstor in a questionnaire. The questionnaire was created and distributed to the faculty focus group participants during the first days of the trial and returned on or near February 4th. It included general questions on image use, ARTstor's ease of use, content, and image quality, and then asked for the participants' overall assessment of ARTstor. Special thanks to Cindy Hashert and Judith Rice-Jones for coordinating trials, training, and focus groups and for compiling feedback for UCD and UCCS.
We had a remarkably high rate of return on the questionnaires, receiving a total of 51 replies from a wide variety of disciplines. To view the feedback from each campus and the complete list of disciplines included in the Boulder trial, as well as a copy of this report, please see our website (http://ucblibraries.colorado.edu/art/DAM.htm). You will also receive the URL in an email sent later today. It should be noted that several of the responses from UCCS were email responses rather than fully completed questionnaires.
*At the Boulder campus, 19 participants attended training and/or demo sessions. *We do not have information on participation in training/demo sessions at Auraria and UCCS. *At the training sessions in Boulder, we gave a brief introduction to ARTstor, its functions, and its content. We then addressed technical requirements, the procedure for registering, and the ARTstor Help screen. This was followed by a demonstration of the three ways to search for images, how to create and use image groups, and how to download and use the Offline Image Viewer. At the training sessions, which were held in one of Norlin Library's electronic classrooms, participants were given ample time to work individually at computer workstations with assistance. At the demos, participants were invited to ask questions and suggest searches. Several faculty who could not make the training sessions or demos were given one-on-one training sessions that covered all of the above. Web demonstrations held in the Auraria Library gave an introduction to ARTstor, its functions, and its content; technical requirements, registration, and ways of searching were also covered. Faculty on the Auraria campus had problems with remote access and the Offline Image Viewer and, in some cases, received guidance to overcome these problems. We will now go over the questionnaire and the responses we received.
35mm slides are still the most common means of presenting images in the classroom. Many people are turning to PowerPoint and websites in order to incorporate digital images into their teaching.
Participants from the three campuses used anywhere from 0 to 2,500 images per course, with an average of 300-1,000.
Faculty respondents outside the disciplines of art and architecture use images in their teaching to illustrate historical concepts, as backdrops for discussions, to illustrate cultural issues, enhance historical materials, and to enhance textual materials. Many of the faculty who participated noted that they are creating multi-media courses and websites as study aids for the students.
Use of ARTstor ranged from 1 to 12 times, with an average of 3-4 times.
Not all participants met the minimum system requirements for the Offline Image Viewer and others simply did not attempt to download it.
45 participants responded to this question. The majority, 53%, felt that the content in ARTstor was very easy to find. Most comments were favorable. Of the unfavorable comments, one respondent noted that he disliked computers, two thought that the search engine could be slow, and others had concerns about metadata.
22 participants answered this question. Not everyone downloaded the Offline Image Viewer on their own time, but those who attended training sessions in Boulder used it in the electronic classroom. The OIV was the main source of technical difficulties, the most frequent of which was failure to meet the system requirements listed on ARTstor's web site. That being said, the vast majority of respondents found it moderately or very easy to use. One stated that it was a "vast improvement over PowerPoint." Some felt that any difficulty they experienced was due to their unfamiliarity with the new software, and many were concerned about the availability of technical support and training.
Only one person incorporated personal images into the Offline Image Viewer and she mentioned that the Offline Image Viewer highlighted the extraordinary quality of the ARTstor images.
Almost half of respondents felt that ARTstor's content would meet at least 50% of their image needs. Many participants outside of the disciplines of art and architecture thought that ARTstor's content would meet most if not all of their image needs. Participants in the disciplines of art and architecture believed that ARTstor would provide the majority of content needed to teach the very large and often generalized survey courses that require thousands of images a semester. Those requiring images of contemporary art and architecture felt that it was lacking in this area. Very few participants believed that the content would not meet any of their needs at present. Most participants were reassured by the list of upcoming collections to be added to ARTstor, and understood that ARTstor's current collection is in its nascence.
Responses to this question ranged from “it has everything” to “obviously a much expanded database.” Photography and contemporary art and architecture were mentioned repeatedly.
The vast majority felt that the overall image quality was good to excellent. Comments included "much better than some of the older slides" and "much better than our own scans." Many noted that image quality varies in ARTstor. Overall, respondents were impressed with the quality, especially when images were downloaded into the Offline Image Viewer. It is important to note that the Offline Image Viewer is an essential means of assessing the quality of ARTstor's images; one can only view ARTstor's images at 72 ppi in the online browser.
Most respondents said that a product like ARTstor would make them more likely to teach with digital images, one replied that he would not use it with its current inventory and the lack of smart classrooms around campus, and several replied “maybe,” if the content met their needs. The responses were overwhelmingly positive and included remarks such as: “absolutely,” “it’s so easy and convenient,” and, “Use of this wonderful tool would undoubtedly lead to new strategies in teaching in many disciplines.” Many people were, however, concerned with the lack of smart classrooms and worried about the technical support that would be made available.
A total of 10 participants answered this question. Of those, one replied that he would definitely use it and others were primarily concerned with classroom support and equipment.
The majority of assessments were favorable, and included comments such as "excellent, I can't wait," "the sooner the university subscribes the better," "it will become the standard source of digital images for art historians in the US," "the collection already available would satisfy the content I teach in my classes," and "ARTstor is the greatest!" There were some mixed and negative reviews, with most criticisms centering on holes in ARTstor's content, technical difficulties, reservations about digital images vs. slides in general, and concerns about cost. While the respondents were pleased overall with the content, many noted that additional local content would be necessary to make the complete transition to digital images. One person stated, "I stopped looking when I found the limited number of post-1960s works; this is a big limitation for me." Many participants again noted their concern over technical support, training, and the lack of smart classrooms. One faculty member remarked, "Great! Definitely need to purchase; but—we need to purchase equipment to support it in our classrooms, and we need to maintain it." In addition, we should note that some faculty members feel apprehensive about, and even resistant to, digital technologies. This will always be the case when transitioning to new media; there are, for instance, still faculty who mourn the loss of lantern slides.
We noted three issues that could have affected responses: training, tech support, and the nature of image use. Those who received training generally had a better understanding of the product and were better able to use the tools in ARTstor. Some individuals who were technically savvy were able to explore and use ARTstor effectively without training, although some comments lead us to believe that these participants may not have fully understood all aspects of the product. During the two-week trial there were a total of 34 tech support inquiries on the Boulder campus, outside of the training sessions and demos. Most were simple questions that could be answered by consulting the technical requirements page on the ARTstor web site or the troubleshooting section of ARTstor's Help pages. The Offline Image Viewer was the source of most technical difficulties. Three inquiries could not be resolved using ARTstor's online resources; we e-mailed ARTstor, which responded to each inquiry the same day. There appeared to be no significant correlation between an individual's opinion of ARTstor and the frequency or number of images he or she uses in teaching.
Overall, the response to ARTstor was quite positive, and demonstrated that it would likely be a valuable and heavily used resource. Participants repeatedly expressed the importance of smart classrooms, training, and technical support, including classroom support. This will be true of any digital resource that may be used in the classroom. The Content Task Group highly recommends that the University of Colorado system subscribe to ARTstor.
This is what ARTstor will cost at the three CU campuses.
ARTstor would ultimately save the CU system decades of work and millions of dollars. As you can see from these figures, the cost per image would be significantly less than if we were to purchase the equivalent number of images. ARTstor would provide the "critical mass" of content that would allow us to focus on building the unique digital image content required by our local curricula. Because ARTstor will never provide all of the content we need, we still need a digital asset management system.
The Content Task Group has made a strong case for a system-wide subscription to ARTstor, and you might be asking yourselves why we also need to purchase digital asset management software. First, CU's image collections are not part of ARTstor, and ARTstor does not currently offer hosting services. Second, ARTstor does not accommodate audio, video, or animation, nor does it contain images outside the disciplines of art and architecture. Third, CU will require local control of our image collections: if a faculty member needs an image added on a short time frame, we need to be able to do that, and other issues pertaining to privacy and copyright will need to be managed at the local level.
When the steering committee was formed last May, our goal was to choose a common software platform that could be adopted for use throughout the CU system. Currently, we have multiple digital image silos across the system that are not connected and don’t communicate with each other. A common software platform will provide CU with a unified interface, cost-sharing potential among collaborators, and the SIGNIFICANT advantage of CU faculty, staff, and students having to learn only ONE system.
However, even with common software, the MANAGEMENT of how the system is deployed must reflect how each campus, department, or unit is best able to accommodate it. It would be naïve to think that the entire CU system would be ready to join in this digital endeavor at the same time. Each unit will have its own unique needs in the areas of staffing, budgets, and tech support, and its own timeline for embracing the digital transition.
The Infrastructure Group recommends Luna Insight as the common platform, and recognizes that multiple implementations are necessary to accommodate the variety of organizational styles and IT environments across the CU system, or even on a single campus, for that matter. The digital asset management system is a counterpart of the integrated library system; in fact, that successful system-wide collaboration is an excellent model to follow. The libraries across the system collectively agreed upon the ILS software vendor, but each library runs its own server and software. Can you imagine how cumbersome it would be for one library to manage the server and software for every other library in the system? I'd also like to note here that when I refer to "implementation," I'm talking about an instance of Insight software running on a server. Luna Insight is designed to share collections across servers, so CU students, faculty, and staff can have a single unified digital library regardless of the number of Insight servers deployed across the system.
Before going any further, I'd like to summarize the task group's work to date. We identified 19 potential digital asset management systems through research and conversations with colleagues. Then we identified 75 criteria to evaluate the software on system architecture, functionality, scalability, interoperability, and vendor services. In the process of creating these evaluation criteria we realized that it would be difficult to compare open source software with turn-key solutions, because they represent two very different approaches. We returned to the Steering Committee for advice on which to pursue. After a discussion of each option, the Steering Committee instructed us to look for a turn-key solution, because open source software requires a greater outlay of resources in the long run for local development and technical support. Based on the 75 evaluation criteria, we wrote a survey with 10 broad-based questions to collect information from the vendors. 11 of the vendors responded to our request for information, generating over 200 pages of documentation. We evaluated the results and eliminated systems for a multitude of reasons. To list a few of the common problems: several of the DAMS we investigated were designed for commercial purposes, and we did not feel they would work in an academic environment. Some systems are designed around proprietary databases, and we were concerned about issues of interoperability and future migration. Many systems offer limited metadata models, and we are looking for a system that will accommodate data from a broad range of disciplines.
The task group found that Luna Insight was the only system that met the majority of the evaluation criteria, and there were no close seconds. Insight is an enterprise solution with an open, modular architecture. It will scale from a small departmental deployment to larger multi-departmental implementations. Insight's standards-based design and XML gateway ensure interoperability with a variety of other systems used by campuses in the system. We were also impressed by Luna's university client base: everyone from Cornell to MIT to the CDL is using Insight to manage their digital collections.
So, at the last meeting the task group was prepared to endorse Luna's digital asset management system, Insight. Luna also offers a cataloging tool called Inscribe, which comes bundled with the system. While investigating the software, we discovered that some universities that use Insight prefer other cataloging tools to Inscribe, so at the December meeting we said that we would evaluate the suitability of Inscribe as a cataloging tool. A metadata subgroup was formed, headed by Chris Cronin. Chris will report on that group's work.
So how much is all this going to cost? Right now, Luna is offering the Insight software for $5,000, which represents a savings of $15,000. Not only that, but they are currently offering several key components, like the XML gateway, bundled with the standard license; normally these modules must be purchased separately. The annual maintenance and support contract runs between $4,000 and $6,000, depending on the level of service we require. Luna offers a full suite of services, and I think we will definitely want to contract with them for installation and some initial training. The cost for services varies; the figures you see here are just estimates based on Luna's averages. So we could get started with the software for as little as $20,000.
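To see how those figures roll up, here is a minimal back-of-the-envelope sketch. The license and maintenance numbers come from the Luna quotes above; the $10,000 services figure is an assumed placeholder for installation and initial training (Luna's actual service quotes would replace it).

```python
# Rough first-year cost sketch for a single Insight implementation.
# License and maintenance figures are Luna's quoted numbers; the
# services figure is an ASSUMED placeholder, not a Luna quote.

license_fee = 5_000            # discounted from the usual $20,000 price
maintenance = (4_000, 6_000)   # annual range, depending on service level
assumed_services = 10_000      # hypothetical installation + initial training

low = license_fee + maintenance[0] + assumed_services
high = license_fee + maintenance[1] + assumed_services
print(f"Estimated first-year startup cost: ${low:,}-${high:,}")
```

Under these assumptions the total lands in the $19,000-$21,000 range, consistent with the "as little as $20,000" figure above.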
As I've said before, the Insight software can be distributed over multiple servers and still appear to our faculty and staff to be a single digital library. This is a significant advantage, because it allows each academic unit within the CU system to adopt Insight in a way that works within that unit's organizational structure, IT environment, budget, and timeline. There are multiple ways that Insight implementations can be managed, but I'm only going to touch on two scenarios, just to provide some ideas about how we can get started. In the first case, an individual department manages its own implementation of Insight. This approach makes the most sense for departments or academic units that have the staff and resources to manage all aspects of the system and that prefer to control the way the system is managed and operated. Not all departments have the resources to manage their own implementations, however. So in the second scenario, departments with similar digital asset management needs collaborate on a shared implementation. It's difficult to talk about these scenarios in the abstract, so to make them concrete: the Libraries exemplify the individual model, and Art and Architecture the shared model. These models apply to academic units on all three campuses.
All three campuses can participate. Academic units can collaborate within a campus or across the system as appropriate. Academic units can join in when they are ready. Multiple servers means maximum flexibility. Students, faculty, and staff of the CU system can access one digital library for their research, teaching, and learning.
Scenario 1: Hybrid of the Centralized and Distributed Implementations
*The collection manager, user manager, and associated database reside on a modest server (to keep down the cost of the database license)
*The Browser Insight and Media Server components should each be separately scalable servers; we can probably start small
*Recommends/requires some degree of duplicate infrastructure so that system administrators can perform upgrades and tests without incurring substantial downtime of the service
*Budgetary estimates (assuming Oracle is the database): $60K for hardware + Oracle, then factor in $20K/year for maintenance, support, and renewal and replacement (three- or four-year replacement cycle)
Scenario 2: Centralized Implementation for a Small Collection
*MySQL 4.1 database
*Starting off: Architecture = 40 GB (compressed) / 364 GB (uncompressed); Library = 31 GB (compressed) / 523 GB (uncompressed); add 10 GB in the first year
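To put the Scenario 2 storage figures in one place, here is a small illustrative calculation using only the numbers stated above (the 10 GB growth figure is applied to the first year only, as the slide indicates).

```python
# Illustrative storage estimate for Scenario 2, using the starting
# figures from the slide: (compressed GB, uncompressed GB) per collection.
collections_gb = {
    "Architecture": (40, 364),
    "Library": (31, 523),
}
first_year_growth_gb = 10  # stated growth during the first year

compressed_total = sum(c for c, _ in collections_gb.values())
uncompressed_total = sum(u for _, u in collections_gb.values())

print(f"Compressed starting total:   {compressed_total} GB")    # 71 GB
print(f"Uncompressed starting total: {uncompressed_total} GB")  # 887 GB
print(f"Uncompressed after year 1:   {uncompressed_total + first_year_growth_gb} GB")  # 897 GB
```

In other words, a small-collection server would need roughly 71 GB for the compressed derivatives and close to 900 GB if the uncompressed masters are also kept online through the first year.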
Digital Asset Management Task Group Recommendations: Content and Digital Library Software for the CU System 24 February 2005
* A 15% discount on the ACF was given to participants who subscribed to ARTstor prior to December 31, 2004. It may still be possible to negotiate this discount.
** The AAF is prorated until the archive reaches 500,000 images – expected 1/01/2006. For 2005, participants pay 80% of the AAF.
*** Previously we had classified UCCS as a small institution, when in fact it is classified as a Carnegie Masters I institution, which makes it a "medium"-size institution under ARTstor's classification system.
A common software platform still allows for multiple servers / multiple implementations
Two choices for implementation models:
Individual Department manages Insight implementation (server + license)
Multiple Departments share an Insight implementation (server + license)
New CU partners can join in phases
Shared Insight Implementation (example: UCD College of Architecture and Planning, UCB Department of Art and Art History, UCD College of Arts and Media)
Individual Departmental Insight Implementation (example: UCB Libraries)
UCCS / UCDHSC / UCB
Students, Faculty, and Staff of the CU system