Methodology for Technology Review, Assessment, and Recommendation:

Texas Center for Digital Knowledge – University of North Texas
MERIC – The MetaResource: Metadata Education and Research Information Center

Methodology for Technology Review, Assessment, and Recommendation
Draft

Dr. William E. Moen <wemoen@unt.edu>
Texas Center for Digital Knowledge, SLIS 5223
University of North Texas
February 4, 2006
Version Control

For each revision, record the version; the date and time of modification; the name of the modifier; the section of the document where the changes have been made; and a brief description of the changes.

Document Title:
Document Filename:
Original Creation Date:
Original Author:

Version | Date and Time of Modification | Name of Modifier | Section Modified | Brief Description of the Changes
Table of Contents

Introduction
Categories of Candidate Technologies
  Out-of-the-Box Software
  Toolkit Software
  Home-Grown Software
What to Do: Review? Assessment? Or Recommendation?
How to Proceed
Timeline
Introduction

This document presents some ideas for carrying out work related to reviewing, assessing, and recommending specific technologies for the MetaResource application. The requirements for the application have been stated to some extent in documents prepared by the MERIC Board, and those documents also list some possible technologies that could be considered. Ultimately, the decision about appropriate technologies has to be informed by the specific requirements for the application. However, the timeline for our work on the MetaResource application requires that we begin exploring the different technologies in parallel with the identification, clarification, and prioritization of the functional requirements.

For the initial work on reviewing candidate technologies, the focus should be on preparing a good description of the technologies. Separate working groups of students will be assigned to one or more technologies and will prepare a general overview of each of the assigned technologies for discussion with other students and the MERIC Board.

Categories of Candidate Technologies

We might think of candidate technologies as belonging to one of the following categories:

  • An out-of-the-box software application that would be customized for the MetaResource application
  • A set of software tools or modules (i.e., a toolkit) that would require programming and customization for the MetaResource application
  • A home-grown approach using a variety of tools to build the MetaResource application

Let me try to describe these in a bit more detail.

Out-of-the-Box Software

Two candidate technologies mentioned by the MERIC Board fit into this category: DSpace and E-Prints. Both of these software applications are used for institutional repositories.
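Both packages are built around metadata records; DSpace, for example, stores a Dublin Core-based record for each deposited item by default. As a rough illustration only (the values and identifier below are invented, and the elements are plain unqualified Dublin Core rather than either package's exact internal schema), such a record might look like:

```xml
<oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
           xmlns:dc="http://purl.org/dc/elements/1.1/">
  <dc:title>Introduction to Metadata Schemas</dc:title>
  <dc:creator>Doe, Jane</dc:creator>
  <dc:subject>metadata education</dc:subject>
  <dc:date>2006-02-04</dc:date>
  <dc:type>Text</dc:type>
  <dc:identifier>http://example.org/meric/item/1</dc:identifier>
</oai_dc:dc>
```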
They both provide methods for submitting digital objects to the repository, use metadata to describe and control the objects, and provide methods for searching and browsing the collections of objects. We are currently using DSpace for a pilot project of a UNT institutional repository called STARchive <http://meta.lis.unt.edu/starchive/index.jsp>. For these applications, one downloads and installs the software package, configures it, and then customizes it for the specific application. One of the key issues for this category of software is its potential for configuration and customization to meet the functional requirements. This category provides the least flexibility, but is likely the most time-efficient in terms of getting a prototype running.

Toolkit Software

I'm not sure if this is the best label for this category, but there are some candidate technologies that need more effort than the out-of-the-box software, though possibly less than the home-grown approach. My sense of this category is that it will require higher levels of programming skill and effort than DSpace or E-Prints. It is likely, though, that toolkits will provide more flexibility in customization and options than the out-of-the-box applications. The two technologies I have in mind for review in this category are Fedora and Keystone.

  • "Fedora open source software gives organizations a flexible service-oriented architecture for managing and delivering their digital content." <http://www.fedora.info/>
  • "The Keystone Digital Library Suite is a family of Open Source digital content management, portal management and information discovery software packaged together to provide libraries, museums and archives with state-of-the-art digital library services." <http://www.indexdata.com/keystone/>

One of the key issues with this category of technologies is the level of effort, skills, and knowledge needed to implement even a prototype application.

Home-Grown Software

The third category of technologies is basically a name for building the entire application from the ground up. Obviously, in this approach, we would use existing tools such as web servers, database management systems, scripting languages, etc. Typically, such approaches provide the most flexibility and control over the application. And typically, such an approach is geared to one application – it may or may not be generalized or even re-used for other applications. If we decided to use this approach, it would be necessary to have all functional requirements ready, choose specific software to address the requirements, and then tie all the software together for the application. While a simple MySQL database-driven website with metadata could be built as a prototype in relatively short order, this approach would require a lot of effort in preparing an architecture for the MetaResource application to provide future-proofing and extensibility when new requirements or tools are needed.

What to Do: Review? Assessment? Or Recommendation?

For Work Area C, we need to accomplish three different things. First, we need to review the candidate technologies and develop a basic understanding of what they can do. Second, we need to assess each of the short-listed candidate technologies against the functional requirements for the MetaResource application.
Finally, we need to make a recommendation on a technology to use for the MetaResource application.

For the initial work on this Work Area, the working groups will be carrying out a fact-finding mission to review and develop an understanding of the technologies. Since some of the requirements for the MetaResource are described in the MERIC Board documents, it should be possible to do a very, very tentative assessment of the technologies relative to the requirements. However, the full assessment of the candidate technologies will be done once we have a stable set of functional requirements. The initial work should be primarily descriptive rather than evaluative. However, if it is possible to make any evaluative judgments about the "fit" of a technology for the MetaResource application, you can include those. Those judgments have to be based on a good understanding of the technology being reviewed. In addition, it would be very helpful if you look at any actual implementations using the technologies to help you understand how they function and are used.

How to Proceed

For this Work Area, I am establishing three working groups. Each working group will have one or two technologies to review and understand. You will work together to develop a technical report describing each of the technologies you are assigned. The working groups are:

  • DSpace and E-Prints Working Group: 3 students
  • Fedora Working Group: 3 students
  • Keystone Working Group: 3 students

My rationale for one group addressing two technologies is: 1) we have some experience with DSpace, and 2) E-Prints and DSpace are similar since they were developed primarily as technologies to support institutional repository applications. Fedora and Keystone may be a bit more complex and deserve separate treatment. I am excluding a working group for a home-grown approach at this time. To me, the effort required for building a good and extensible home-grown system is likely beyond our resources, and the MERIC Board has indicated an implicit preference for using an existing technology.

As soon as the working group assignments are made, I will set up discussion categories in the MERIC Project Discussion tool <http://meta.lis.unt.edu/slis5223/forum/> for each of these working groups. Do all your communication within those discussion categories; I will be monitoring and answering questions there.

Next, begin reading the documentation available about the technologies. The websites for the technologies are:

  • DSpace: <http://www.dspace.org>
  • E-Prints: <http://www.eprints.org/>
  • Fedora: <http://www.fedora.info/>
  • Keystone: <http://www.indexdata.com/keystone/>

You will likely need to go beyond the documentation available on those websites to get a fuller understanding of the technologies, but you should begin at the product websites and mine them as deeply as possible for good, authoritative information. To help your understanding, try to find implementations of each of the technologies and see what the people involved with those implementations have written about them.

As you review and gain understanding of the technologies, begin writing up a technical report for each technology you are assigned. I will work with all the working groups to see if we can come up with a somewhat standard format to describe various aspects of the technologies. For example, we would want a section of the report that describes the operating system requirements (Linux, Unix, Windows, etc.), hardware and software requirements, and lists and descriptions of available functionality. Make sure you keep track of the documents (title, author, date, URL, etc.) you discover and include those in a list of references in your technical report.
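For comparison as you review the packaged technologies, the kind of "simple database-driven prototype" mentioned in the home-grown category above can be sketched in a few lines. This is an illustration only, not a proposed design: it uses Python with SQLite standing in for MySQL, and the table and field names are invented for the example.

```python
import sqlite3

# Minimal metadata store standing in for a MySQL-backed prototype.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE resource (
           id INTEGER PRIMARY KEY,
           title TEXT NOT NULL,
           creator TEXT,
           subject TEXT
       )"""
)

def add_resource(title, creator, subject):
    """Insert one metadata record and return its id."""
    cur = conn.execute(
        "INSERT INTO resource (title, creator, subject) VALUES (?, ?, ?)",
        (title, creator, subject),
    )
    return cur.lastrowid

def search(term):
    """Very naive keyword search across title and subject."""
    like = f"%{term}%"
    rows = conn.execute(
        "SELECT title FROM resource WHERE title LIKE ? OR subject LIKE ?",
        (like, like),
    )
    return [r[0] for r in rows]

add_resource("Dublin Core Primer", "Doe, Jane", "metadata schemas")
add_resource("XML Basics", "Roe, Richard", "markup languages")
print(search("metadata"))  # -> ['Dublin Core Primer']
```

A toy like this shows why the home-grown estimate above is what it is: storage and keyword search are trivial, but submission workflows, access control, and extensibility are where the real effort would go.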
Timeline

Your work on the descriptive part of the technology review, assessment, and recommendation will be carried out prior to the onsite meeting at the end of February. Again, this initial work will focus only on description and understanding, reflected in your technical report. I would hope that all Functional Requirements Working Groups and all Technology Review Working Groups can produce a complete and near-final draft of their respective reports by February 20 for distribution to all students.