Bearcat Search: Implementing Federated Searching at the Newman Library
Michael Waldman, Lisa Ellis, Stephen Francoeur, Joseph Hartnett, Rita Ormsby
Teaching and Technology Conference, Baruch College, New York, NY, March 28, 2008
Federated Search
What is it?
Why use it at Baruch?
Spring 2006: product selection/committee formed
Implementation Plan
Phase I: Setting It Up
Phase II: Testing and Tweaking
Phase III: Public Launch
Phase IV: Optimization and Expansion
Phase I: Setting It Up
Setting It Up
Database selection
Subject selection
Branding
Page design
Phase II: Testing and Tweaking
Testing and Tweaking
Are things working as planned?
Testing and more testing
Problems reported on wiki and submitted to vendor
Value of a responsive vendor and customer service team
Tempering High Hopes with Realistic Expectations
Marketing and promoting as a tool of discovery
Contending with time delays or baffling results
Decisions to improve functionality (e.g., changing the default search from keyword to title)
Our Learning Curve
Query and results flow
Databases don’t play nice with each other
Connectors
Gateways: Z39.50, screen scraping, HTML, XML, SRU/SRW
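For the technically curious: the gateway protocols above differ mainly in how a search is expressed on the wire. SRU, for instance, is just an HTTP GET carrying a CQL query as URL parameters. A minimal sketch of building such a request in Python (the endpoint URL here is hypothetical; real SRU base URLs vary by vendor):

```python
from urllib.parse import urlencode

def build_sru_url(base_url, query, max_records=10):
    """Build an SRU 1.1 searchRetrieve request URL.

    SRU expresses a search as plain GET parameters; the CQL
    query itself travels in the `query` parameter.
    """
    params = {
        "operation": "searchRetrieve",
        "version": "1.1",
        "query": query,
        "maximumRecords": max_records,
        "recordSchema": "dc",  # ask for Dublin Core records back
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint, for illustration only.
url = build_sru_url("http://sru.example.edu/catalog", 'title="federated search"')
print(url)
```

Because the whole search fits in one URL, a federated search engine can fan the same query out to many SRU targets and parse the XML responses, which is far less fragile than screen scraping HTML.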
Phase III: Public Launch
Public Launch
Pre-launch: committee members explained Bearcat Search to colleagues; comments sought before and after launch; marketing efforts begin
After launch: individual librarians demonstrate Bearcat Search
Early Comments on Bearcat Search
93% of early respondents were satisfied with search results and said they would use Bearcat Search again.
“Saved me a lot of time and found what I needed.” --Novice searcher
“Found something I didn’t know existed.” --Professor searching the topic of his ongoing research
Suggestions received:
“Break down the subjects even more--like marketing in business”
“Add a search box on homepage.”
“Make it faster”
Usage
Statistics, March 1 – March 23:
1,400 search sessions
3,000 searches
Phase IV: Optimization and Expansion
Optimization
Usability testing
More informed instruction
Improved interaction between databases and 360 Search
Expansion
Deployment of specialty search boxes
Addition of more databases
Monitoring, modifying, moving ahead
Recommended Reading
Boyd, John, et al. “The One-Box Challenge: Providing a Federated Search That Benefits the Research Process.” Serials Review 32.4 (Dec. 2006): 247-254.
Cervone, Frank. “What We've Learned from Doing Usability Testing on OpenURL Resolvers and Federated Search Engines.” Computers in Libraries 25.9 (Oct. 2005): 10-14.
Cox, Christopher. “An Analysis of the Impact of Federated Search Products on Library Instruction Using the ACRL Standards.” Libraries and the Academy 6.3 (2006): 253-267.
Lampert, Lynn D., and Katherine J. Dabbour. “Librarian Perspective on Teaching Metasearch and Federated Search Techniques.” Internet Reference Services Quarterly 12.1/2 (April 2007): 253-278.
Lederman, Sol. “Content Access Basics - Part I - Screen Scraping.” Federated Search Blog. 27 Dec. 2007.
----. “Content Access Basics - Part II - XML.” Federated Search Blog. 30 Dec. 2007.
----. “Content Access Basics - Part III - OpenSearch.” Federated Search Blog. 4 Jan. 2008.
----. “Content Access Basics - Part IV - SRU/SRW/Z39.50.” Federated Search Blog. 16 Jan. 2008.
----. “What Is a Connector.” Federated Search Blog. 13 Dec. 2007.
Wilkin, John. “Metasearch vs. Google Scholar.” John Wilkin’s Blog. 5 Nov. 2007.
Wruber, Laura, and Kari Schmidt. “Usability Testing of a Metasearch Interface: A Case Study.” College and Research Libraries 68.4 (July 2007): 292-311.
Location of These Slides
http://www.slideshare.net/newmanlibrary