NISO Webinar: Discovery & Delivery: Innovations & Challenges
 


Today’s library discovery services are primarily based upon indexes derived from journals, e-books and other electronic information of a scholarly nature. The content comes from a range of information providers and products--commercial, open access, institutional, etc. By indexing the content in advance, discovery services have the ability to deliver more sophisticated services with instant performance, compared to the federated search techniques used previously. Libraries increasingly rely on index-based discovery services as their strategic interfaces through which their patrons gain access to the rapidly growing breadth of information that may be available to them.

This webinar will discuss the challenges of operating a centralized index-based discovery system. Learn about their strengths and weaknesses, the need for standards and best practices in this arena, how libraries and providers can assess usage, and how libraries can satisfy audiences with different needs, ranging from undergraduates to faculty across every discipline.

  • Publishers must decide what content is appropriate and at what level. Discovery services must respect the rights of the publisher and be sensitive to their business needs. Information providers need trust that the information indexed is correct and updated, sharing of information on the use of the indexed content, assurance that users are shown only what they are allowed to see, and authority (indication of the source of the record). Fair linking by discovery providers is typically in the hands of the library via OpenURL link resolvers. Publishers also need a way to assess use of their content in discovery services. Complexity and uncertainty pose barriers to participation.
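  Since fair linking typically runs through OpenURL link resolvers, a rough sketch of how a discovery layer might construct an OpenURL 1.0 (NISO Z39.88-2004) link for a journal article may help. The resolver address and helper function here are hypothetical; the citation is the one used later in the usability-testing section of this webinar.

```python
from urllib.parse import urlencode

def build_openurl(resolver_base, article):
    """Build an OpenURL 1.0 (NISO Z39.88-2004) KEV query string for a
    journal article, the kind of link a library link resolver receives."""
    params = {
        "url_ver": "Z39.88-2004",
        "ctx_ver": "Z39.88-2004",
        "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",
        "rft.genre": "article",
        "rft.atitle": article["title"],
        "rft.jtitle": article["journal"],
        "rft.volume": article["volume"],
        "rft.spage": article["start_page"],
        "rft.date": article["year"],
    }
    return resolver_base + "?" + urlencode(params)

# Hypothetical resolver address, for illustration only.
link = build_openurl(
    "https://resolver.example.edu/openurl",
    {"title": "Desegregation as a Cold War Imperative",
     "journal": "Stanford Law Review",
     "volume": "41", "start_page": "61", "year": "1988"},
)
```

  A resolver at that base address would parse these key/value pairs and decide, based on the library's holdings, which full-text target to link the user to.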
  • The ecosystem currently works reasonably well, but it is still governed by private agreements between discovery service providers and content providers and relies on a wide range of formats and data exchange processes. Complexity and uncertainty pose barriers to participation.
  • The goal of this work item is to create a working group to define standards and/or best practices for the new generation of library discovery services that are based on indexed search. These discovery services are primarily based upon indexes derived from journals, ebooks and other electronic information of a scholarly nature. Project goals include: provision of effective means for libraries to evaluate the breadth and depth of content indexed by discovery services and the degree of availability of content to different institutions and to different users; development of a set of best practices that can help streamline the process by which information providers work with discovery service vendors, including creation of a common vocabulary; streamlining of the interaction and communication between vendors and information providers, including activation of libraries' subscriptions; definition of models for fair linking from the discovery service to the publisher content; and determination of what usage statistics should be collected, for whom, and how these should be disseminated.
  • Talking points: The Five Laws were proposed by S.R. Ranganathan (a mathematician and librarian from India) to outline his view for operating a library system. He is considered one of the early thinkers advancing the view that librarianship is a science. While his original vision was focused on a world of physical media, his laws are founded in providing a library system focused on the user (the "reader" in his terminology).
  • Given the wide array of media, devices, professional networks, and platforms available to users today, we propose the following revision to the Five Laws. The Five Laws can be updated and reinterpreted to accommodate the variety of media now available, but the focus must remain on the varying needs and goals of users.
  • At the highest level, users' goals differ widely, and so does what they value.
  • In our industry, the focus of analytics has been on usage. Usage analytics focus on understanding the discovery and consumption of documents, including COUNTER reporting of variables such as "Searches" and "Retrieves." ProQuest has begun to weave in analysis from two other sources, which has led to opportunities for breakthrough research that will allow ProQuest to become more efficient and effective with its platform over time. Behavioral analytics let us understand how users are getting to our site: visits, page views, time on site, and specific actions can be tracked. Attitudinal analytics let us see who the user is and what they want. This leads to the ability to track satisfaction, which we think may hold promise as a proxy for performance from a user perspective, and to perform segment analysis on specific personas so we can match visitors with actions.
  • In this scenario, the group starts their search for "space tourism" via a discovery service such as Summon. They click through to the ProQuest platform and land on the Washington Post article on the subject. They then decide to run another search. Because they arrive at the ProQuest platform searching all databases in their subscription, subsequent searches "hit" every database in the subscription, and this activity shows up on usage reports.
  • In the talk, mention that we discovered the cause by analyzing some referring-URL data; that is part of the behavioral data we segue to next.
  • This is a view of visits to our new platform per day. [Should we hide the Y-axis label?] We have included the 30-day moving average. Precise measurement of visits on a daily basis allows us to see exactly when there are deviations. Notice the very consistent pattern for six straight weeks. In the seventh week there is an anomaly. Why? That is February 14th, Valentine's Day; apparently, not many were studying on that day. Notice the spike in traffic near the end of March (just before most US spring breaks) and the dip in early April (vacation, Easter). Continuous monitoring lets us see regular patterns in the data, which give us a credible baseline, so we can detect even small changes that deviate from the pattern.
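  The baseline-and-deviation monitoring described in this note can be sketched as follows. The visit counts, window, and threshold are made-up illustrations of the idea, not ProQuest's actual implementation.

```python
# Compare daily visits to a trailing 30-day moving average and flag
# days that deviate from that baseline beyond a fractional threshold.
def flag_anomalies(daily_visits, window=30, threshold=0.25):
    """Return indices of days whose visits deviate from the trailing
    moving average by more than `threshold` (as a fraction)."""
    anomalies = []
    for i in range(window, len(daily_visits)):
        baseline = sum(daily_visits[i - window:i]) / window
        if baseline and abs(daily_visits[i] - baseline) / baseline > threshold:
            anomalies.append(i)
    return anomalies

# Illustrative series: a steady ~1000 visits/day with one dip,
# e.g. a holiday such as Valentine's Day.
visits = [1000] * 40
visits[35] = 600
print(flag_anomalies(visits))  # -> [35]
```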
  • After making a new release, we found that pages viewed per visit were down. This was considered a key performance indicator. However, after reviewing data on satisfaction and time spent on site, we found that by making our platform more efficient and effective, we may have been improving the platform despite diminishing this metric.
  • We can confidently state that end users (students and professors) have higher satisfaction than librarians. The numbers are so polarized that we may run the risk of diminishing satisfaction among end users if we improve satisfaction among librarians. Is the difference due to task, visit frequency, or persona differences?
  • This chart shows a trend line of student satisfaction. Notice the December dip, perhaps due to exam season and tight deadlines. We would have expected a similar drop in May; however, we were able to maintain similar satisfaction levels. It will be interesting to see whether we are able to gain satisfaction in September.
  • This graphic shows student satisfaction by week. The trend line is satisfaction; the bars in the background show the number of surveys collected for the week. The green boxes (that fade in) show our performance leading up to academic semesters. In both cases our satisfaction was relatively low.
  • This chart shows the primary purpose of visitors to our platform and the satisfaction differences among them. Those who are working on an ongoing research project, selecting or exploring a research project, or looking for two or three good articles have slightly higher than average satisfaction. Those who are on our platform to find a quote, statistic, or fact, are looking for a specific document, or have followed a link to a specific article have lower than average satisfaction. This shows that the expectations of the audience can have a dramatic impact on satisfaction.
  • What you see on this graph is a comparison of student satisfaction by school type (community college, undergraduate, graduate school, and elementary/high school). We are able to perform segment analysis on the data with the goal of finding out whether there are significant differences in satisfaction by group. The blue dots represent the indexed satisfaction score on a 100-point scale. The upper bound (green dash) and the lower bound (red dash) represent the 90% confidence interval (as calculated from the sample size of respondents and the standard deviation of the satisfaction data). The highest-satisfaction group is community colleges; the lowest is elementary/high schools. It is interesting to note that undergraduates have slightly higher than average satisfaction and graduate students slightly lower, though both are close to the overall average. This reinforces that we are matching the intended "sweet spot" for our platform, which is "intermediate searchers."
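  The 90% confidence bounds on the indexed satisfaction score can be sketched as below. The note does not give the exact calculation, so this assumes the usual normal approximation (mean plus or minus 1.645 times the standard error), and the survey scores are invented.

```python
import math

def confidence_interval_90(scores):
    """90% confidence interval for the mean of satisfaction scores
    (0-100 scale), using the normal approximation:
    mean +/- z * s / sqrt(n), with z = 1.645 for a two-sided 90% CI."""
    n = len(scores)
    mean = sum(scores) / n
    # Sample variance (n - 1 denominator)
    var = sum((x - mean) ** 2 for x in scores) / (n - 1)
    margin = 1.645 * math.sqrt(var) / math.sqrt(n)
    return mean - margin, mean + margin

# Hypothetical segment of survey responses
lo, hi = confidence_interval_90([72, 68, 75, 80, 71, 77, 69, 74])
```

  With a larger sample size or a smaller spread of responses, the interval tightens, which is why segments with few respondents show wide green/red bounds on the chart.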
  • Depending on the school type, the tasks are different. Four key columns: select or explore a topic; articles for an assignment; research project (high for graduate school); specific article (elementary in alignment with graduate).
  • Retrieves per search is higher for hospitals, which have lower satisfaction, while the inverse is true for the nursing and medical schools.

Presentation Transcript

  • [insert web address for NISO webinar page]Discovery and Delivery: Innovations and Challenges September 26, 2012 Speakers: Lucy Harrison, Timothy Babbitt, David Bietila
  • Introducing the Open Discovery Initiative. NISO Webinar: Discovery and Delivery: Innovations and Challenges. Lucy Harrison, Florida Virtual Campus, D2D Liaison. September 26, 2012
  • Topics • What are centralized indexes? • What are their strengths and weaknesses? • What is the NISO ODI initiative? • How will it help improve the discovery landscape?
  • Evolution of Library Search • Card catalogs • Online catalogs • Federated search tools • Next-generation library catalogs (discovery interfaces) • Index-based discovery services
  • Discovery Interfaces [diagram: a search runs against a local index (ILS data, digital collections) while a federated search engine sends real-time queries to remote sources such as ProQuest, EBSCOhost, MLA Bibliography, and ABC-CLIO; responses are merged into the search results]
  • Index-based Discovery [diagram: ILS data, digital collections, and content from ProQuest, EBSCOhost, MLA Bibliography, ABC-CLIO, and other sources are harvested and indexed in advance into a consolidated index, which serves the search results]
  • Strengths of Index-based Discovery • Fast response time (vs. federated search) • Structured metadata: improves search & retrieval, enables faceted navigation, improves integration of search results • Indexing the full text of content amplifies access
  • Issues with Index-based Discovery • Important to understand depth of indexing: currency, dates covered, full text or citation only, quality of metadata • Uneven participation diminishes impact • Ecosystem dominated by private agreements • Complexity and uncertainty pose barriers to participation
  • Need to Bring Order to Chaos • Libraries need the ability to understand and evaluate tools, content, providers • Information providers need confidence that their content is being treated fairly • Service providers need the ability to more efficiently integrate content
  • Key Areas for Libraries • Strategic investments (in subscriptions and discovery solutions) • Expect comprehensive representation of resources in discovery indexes • Need to be able to evaluate the depth and quality of these index-based discovery products • Usage reporting
  • Collection Coverage Questions • How well does the index cover the body of scholarly content? • Why do some publishers not participate? • How can libraries understand the differences in coverage among competing services? • How are your library’s content packages represented by the discovery service? • Which resources are not represented in index? • Is content indexed at the citation or full-text level? • What is the quality of the metadata? • What are the restrictions for non-authenticated users?
  • Key Areas for Service Providers • Encourage information providers to participate • Lower thresholds of technical involvement • Clarify the business rules associated with involvement • Common industry standards and definitions • Usage reporting
  • Key Areas for Information Providers • Discovery brings uncertainty • Want to expose content widely (increase usage), but • There are trust issues, with access/authentication and with "fair" linking • Private agreements • Usage reporting
  • Need a healthy ecosystem among discovery service providers, libraries, and information providers
  • OPEN DISCOVERY INITIATIVE: Promoting Transparency in Discovery
  • ODI Pre-History • June 26, 2011: Exploratory meeting @ ALA Annual • July 2011: NISO expresses interest • Aug 7, 2011: Proposal drafted by participants submitted to NISO
  • ODI Proposal: Define standards and/or best practices for index-based discovery services – Evaluate the breadth and depth of content – Evaluate availability of content to different institutions and to different users – Streamline workflows – Define models for fair linking – Determine what usage statistics should be collected and disseminated
  • ODI Pre-History • June 26, 2011: Exploratory meeting @ ALA Annual • July 2011: NISO expresses interest • Aug 7, 2011: Proposal drafted by participants submitted to NISO • Aug 2011: Proposal accepted by D2D • Vote of approval by NISO membership • Oct 2011: ODI launched • Feb 2012: ODI Workgroup formed
  • ODI Charge and Objectives • Improve information services to end users as mediated through index-based discovery services • Create an environment that broadens stakeholder participation and ensures confidence • Foster development of best practices and effective means of assessment
  • Specific Benefits: Librarians – Can offer their users as wide a range of content as possible via their discovery service of choice – Can better evaluate discovery services to address their needs
  • Specific Benefits: Information providers – Have the confidence that the discovery service providers are handling their content in an appropriate manner – Are therefore encouraged to make available the widest range of content, in terms of breadth and depth, for indexing by the discovery service providers
  • Specific Benefits: Discovery service providers – Receive more standardized and efficient integration with the information providers through common industry definitions and communications
  • Balance of Constituents
    Libraries: Marshall Breeding (Co-Chair); Michele Newberry, Florida Virtual Campus; Jamene Brooks-Kieffer, Kansas State University; Sara Brownmiller, University of Oregon; Laura Morse, Harvard University; Ken Varnum, University of Michigan; Lucy Harrison, Florida Virtual Campus (D2D liaison/observer)
    Information Providers: Lettie Conrad, SAGE Publications; Linda Beebe, American Psychological Assoc.; Roger Schonfeld, ITHAKA/JSTOR/Portico; Aaron Wood, Alexander Street Press; Jeff Lang, Thomson Reuters; Peter Noerr, MuseGlobal
    Service Providers: Jenny Walker, Ex Libris Group (Co-Chair); John Law, Serials Solutions; Michael Gorrell, EBSCO Information Services; David Lindahl, University of Rochester (XC); Jeff Penka, OCLC (D2D liaison/observer)
  • Organization • Reports in NISO through the Discovery to Delivery topic committee (D2D) • Staff support from NISO (Nettie Lagace) • Co-Chairs – Jenny Walker (Ex Libris) – Marshall Breeding (Library Consultant) • D2D Observers – Jeff Penka (OCLC), Lucy Harrison (FLVC)
  • ODI Project Goals: 1. Identify, possibly through surveys or other questionnaires, the needs and requirements of the three stakeholder groups in this area of work • Created subgroups for information gathering: – Level of Indexing – Library Rights – Technical Formats – Usage Statistics – Fair Linking
  • (Goal 1, continued) • Conducted interviews with stakeholders • Created survey with input from all subgroups • Survey is currently live (closes October 4)
  • https://www.surveymonkey.com/s/QBXZXSB
  • (Goal 1, continued) • Analyze results as input to Goal 2
  • ODI Project Goals: 2. Create recommendations and tools to streamline the process by which information providers, discovery service providers, and librarians work together to better serve libraries and their users
  • ODI Project Goals: 3. Provide effective means for librarians to assess the level of participation by information providers in discovery services, to evaluate the breadth and depth of content indexed, and the degree to which this content is made available to the user
  • Specific Deliverables • Standard vocabulary • NISO Recommended Practice: – Data format & transfer – Communicating content rights – Levels of indexing, content availability – Linking to content – Usage statistics – Evaluating compliance • Inform and promote adoption
  • Timeline (Milestone – Target Date):
    Appointment of working group – December 2011
    Approval of charge and initial work plan – March 2012
    Agreement on process and tools – June 2012
    Survey completed – Oct 4, 2012
    Completion of information gathering – October 2012
    Completion of initial draft – January 2013
    Completion of final draft – May 2013
  • Connect with ODI • ODI Project website: http://www.niso.org/workrooms/odi/ • Interest group mailing list: http://www.niso.org/lists/opendiscovery/ • Email ODI: odi@niso.org
  • Seeing Discovery Through User-Colored Glasses. NISO Webinar: Discovery and Delivery: Innovations and Challenges. Timothy Babbitt, Senior Vice President, Platform Management, ProQuest
  • Understanding What Is Valuable to Users: Foundation of Librarianship. S.R. Ranganathan's The Five Laws of Library Science (Madras, India: Madras Library Association, 1931): 1. Books are for use. 2. Every reader his [or her] book. 3. Every book its reader. 4. Save the time of the reader. 5. The library is a growing organism. Even at a time when the emphasis was entirely on physical media, the focus was on the individual goals and needs of each type of user (the reader).
  • The Five Laws Updated
    Ranganathan: 1. Books are for use. 2. Every reader his [or her] book. 3. Every book its reader. 4. Save the time of the reader. 5. The library is a growing organism.
    Updated Laws: 1. Information in all of its forms is for use. 2. Every researcher their information. 3. Every medium and delivery platform its user. 4. Enable efficient discovery by the user. 5. The library is part of an evolving research ecosystem.
    The proliferation of web-based information tools allows us to track users and their behavior with increasing precision, but understanding the unique needs and behaviors of different researchers requires deep analysis and interpretation of a variety of data; traditional usage data does not tell the whole story.
  • How do we measure value in the evolving research landscape? Traditional Model:  Usage data to measure value (searches and retrievals); more usage = more value. Growing Trend:  Web-based analytics, with behavioral and attitudinal dimensions segmented by types of users. Key Question:  How can we combine both approaches, and what can they provide that traditional usage data cannot?  Usage data = what they did.  Other analytics = what they were trying to do; did they succeed; what was the context; and who was doing it?
  • Value Differs by User: Who is doing the research matters. Librarians:  Making their patrons successful  Promoting services of the library to students and faculty with confidence  Delivering services that are convenient for their users. Faculty:  Extending their influence and reach in their discipline through published research  Efficiently directing students to materials that meet learning objectives  Obtaining research grants. Students:  Completing coursework in accordance with faculty directives  Remaining in compliance with source attribution policies  Accessing information conveniently
  • Conventional Wisdom Might Tell Us  High use = high value  High satisfaction = high value  Use (searches and retrievals) is homogeneous. Adding attitudinal and behavioral dimensions to traditional usage data allows us to challenge many long-held assumptions.
  • Three Dimensions of Analysis [diagram of three overlapping dimensions]
    Usage – focus of analysis: the discovery and consumption of documents (COUNTER usage reporting, e.g. Searches and Retrieves)
    Behavioral – focus of analysis: what is the user's context? (page views per visit, time on site, clickstream, etc.)
    Attitudinal – focus of analysis: what is the user's satisfaction?
  • Three Dimensions of Analysis [diagram repeated, highlighting Usage: the discovery and consumption of documents (COUNTER usage reporting, e.g. Searches and Retrieves)]
  • Usage in the new age of discovery: a case study. Earlier this year, a library contacted us about a large increase in search usage in one of their databases:
    Searches run, 2011: Jan 50, Feb 250, Mar 43, Apr 199, …
    Searches run, 2012: Jan 265, Feb 616, Mar 1176, Apr 847, …
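  One quick way to quantify the jump reported above is a month-by-month year-over-year ratio; a minimal sketch using the slide's own figures:

```python
# Year-over-year comparison of the monthly search counts from the slide.
searches_2011 = {"Jan": 50, "Feb": 250, "Mar": 43, "Apr": 199}
searches_2012 = {"Jan": 265, "Feb": 616, "Mar": 1176, "Apr": 847}

# Ratio of 2012 searches to 2011 searches, per month.
growth = {m: round(searches_2012[m] / searches_2011[m], 1)
          for m in searches_2011}
print(growth)  # Mar 2012 shows a roughly 27x increase over Mar 2011
```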
  • Web-scale discovery and workflow: In this scenario, the user starts their search for "space tourism" via a discovery service such as Summon. Finding a document they like, they click through to the ProQuest platform and land at the document level. They then decide to run another search, which "hits" all databases: by default the user runs an all-database search, thus showing as a "hit" against every database in the library's subscription.
  • Usage in the new age of discovery  The cause? Heavy utilization of a new web-scale discovery service.  Users now used our website differently than users who started at the library "Databases A-Z" page; the change in how their users were coming to our site led to a change in usage.  Usage data showed the effect of the change in their patrons' research environment, but determining the cause required looking beyond the usage data. For more details, see the white paper "Usage in the New Age of Discovery."
  • Three Dimensions of Analysis [diagram repeated, highlighting Behavioral: what is the user's context? (page views per visit, time on site, clickstream, etc.)]
  • Behavioral Analysis: Precise Measure of Visits
  • Behavioral Analytics: Pages Viewed per Visit
  • Three Dimensions of Analysis [diagram repeated, highlighting Attitudinal: what is the user's satisfaction?]
  • Attitudinal – Satisfaction Comparison Survey Methodology  Over 19,500 Surveys  Data from November 2011 through September 2012  Continuous monitoring  Predictive modeling Librarians Faculty Students
  • Satisfaction Comparison: Role
  • Satisfaction Comparison: Role (Number of Respondents)
  • Attitudinal Analysis: Student Satisfaction Trend Line
  • Student Satisfaction: By Week
  • Satisfaction Comparison: Primary Purpose
  • Satisfaction Comparison: Primary Purpose (Number of Respondents)
  • Attitudinal Analysis – Satisfaction Comparison: School Type
  • Attitudinal Analysis – Satisfaction Comparison: School Type
  • Cross Tab: Primary Purpose and School Type. Data from November 8, 2011 through August 12, 2012
  • "Deeper Dive" with Segment Analysis. Data: Nov. 8, 2011 through Aug 7, 2012
  • Seeing Value from a User Perspective  We need the triangulation of  Usage data  Behavioral data  Attitudinal data  to drive understanding of:  What did the users do?  What was the context of use?  What were they trying to do, and were they successful?  Important because the research ecosystem is changing, e.g. web-scale discovery
  • Toward understanding what is valuable to users. Updated Laws: 1. Information in all of its forms is for use. 2. Every researcher their information. 3. Every medium and delivery platform its user. 4. Enable efficient discovery by the user. 5. The library is part of an evolving research ecosystem.
  • Next steps  Complete integration and analysis of  Usage data  Behavioral data  Attitudinal data  Give libraries a deep understanding of value from user segments  Next stop…content and search analysis!
  • Collecting Patron Perspectives on Discovery Tools. NISO Webinar: Discovery and Delivery: Innovations and Challenges. September 26, 2012. David Bietila, Web Program Director, The University of Chicago Library
  • Data Sources [diagram: the discovery tool draws on finding aids, databases & indexes, digital collections, the catalog, and web pages]
  • Assessment Questions Product Characteristics  Technical  Functional  Interface Relevance to Users  Define use case  Novice users of databases and electronic resources who need to find articles on a topic.  Assess product’s applicability to this use case
  • 1. User Comments
    Topic: User attitudes toward the product, and specific aspects valued by users.
    Method: Posted a link to a comment form in the header. Set up an info table for three days. Required minimal effort to collect data.
    Sample questions: What type of resource were you looking for with Articles Plus? Please share any positive or negative comments about your experience.
    Results: Unprecedented proportion of positive comments. Users valued the speed of this search and the ability to search both books and articles in one place.
    Functional & interface assessment
  • Sample comment records:
    Resource sought: Journal article. Successful? Yes. Affiliation: Graduate or Professional School Student. Comment: "This was a quick search and gave me exactly what I needed."
    Resource sought: Academic journal articles. Successful? No. Affiliation: Graduate or Professional School Student. Comment: "Record included books, needed a way to filter this out; results from Time magazine, not what I was looking for at all."
    Resource sought: Both books and articles on certain topics (ones related to philosophy, psychology and literature). Successful? Partial. Affiliation: Graduate or Professional School Student. Comment: "Its wonderful to have ONE place where you can search for both articles and books! However, it seems like more books should show up because some books relevant to my search showed up in Lens but not in Articles Plus. If you dont choose this search tool, please do adopt some search tool that allows comprehensive searching of both books and articles!"
  • 2. Usage Statistics
    Topic: Usage relative to major databases.
    Method: Examined data presented in the discovery tool's admin interface. Compared usage with that of high-traffic databases (Web of Science, JSTOR, Academic Search Premier).
    Results: Observed increasing usage, reaching parity in the initial year with Academic Search Premier. Verified that users were finding and using the tool.
    Interface assessment
  • [chart: Searches per Month, comparing the EDS Foundation Index (Articles Plus) with Web of Science, JSTOR, and Academic Search Premier; y-axis from 0 to 50,000]
  • 3. Usability Testing
    Topic: Clarity of the search interface, including functional elements and labels; user ability to complete article discovery tasks.
    Method: Assigned representative article-searching tasks. Recorded screen activity and spoken comments for analysis.
    Sample questions: Can you locate the full text of the following article? Mary L. Dudziak, "Desegregation as a Cold War Imperative," 41 Stanford Law Review 61 (1988). You are working to prepare a summary of developments in the field of astronomy; can you locate five articles on astronomy that were published in Nature in 2010?
    Functional & interface assessment
  • [table: usability testing results for 8 sessions (Graduate: Public Policy, CMES, MAPH, MAPSS; Undergrad: Undeclared, Anthro, Physics, English) across 10 tasks, each marked success, ~ partial, or X failure; most tasks were completed successfully]
  • 3. Usability Testing Results • Cut or relabeled certain facets • Made heavily used features more prominent. • Removed unused or confusing features • Determined collections to be retained or removed from coverage. Functional & Interface Assessment
  • Iterative Evaluation [diagram: cycle of Use Case → Evaluate → Refine Tool]
  • Other Use Cases Experienced researchers  Searching interdisciplinary topics  Searching outside their primary area  Searching within their primary area
  • 4. Subject Assessments
    Topic: Applicability of the discovery tool for use by advanced researchers in disciplines with a variety of requirements.
    Method: Enlisted bibliographers and subject specialists as proxies for different user constituencies. Created an evaluation rubric to ensure comparable results across the more than 40 disciplines assessed.
    Results: Determined which disciplines would be best served by the tool. Identified strength of coverage in numerous disciplines.
    Functional assessment
  • Subject Evaluation Form
  • Subjects – Social Sciences:
    Subject | First 25 results | Compared to core database results | Compared to JSTOR/Google Scholar results
    Anthropology/Geography/Maps | ~ | ~ | ~
    Gender Studies | ~ | ~ | -
    History | + | + | +
    International Political Economy | ~ | - | ~
    Psychology | + | - | ~
    Sexuality Studies | + | + | +
  • Outcomes & Future Assessment Recommendation to purchase Improvements  Local configuration  Enhancement requests Marketing & Communication Benefit of Iterative Assessment Future assessment  Applicability of new coverage  Revisiting usability  Overlap with other search tools
  • THANK YOU. Thank you for joining us today. Please take a moment to fill out the brief online survey. We look forward to hearing from you!