LITA Forum 2010


Presentation given at the 2010 LITA National Forum


  1. “Quick Search” It Is Not: Testing Response Times of Traditional and NextGen Catalogs
      Nina McHale
      Margaret Brown-Sica
      LITA Forum 2010
  2. Esteemed Researchers
  3. Our Research
      Forthcoming: Margaret Brown-Sica, Jeffrey Beall, and Nina McHale, “Next-Generation Library Catalogs and the Problem of Slow Response Time,” Information Technology and Libraries 29, no. 4 (December 2010): 207–216.
  4. Not-So-Quick-Search
  5. Our Research Questions
      Are NextGen catalogs—or traditional catalogs that add NextGen content—too slow?
      Do 2.0/NextGen features slow them down too much?
  6. Our Conclusions
      Yup.
      Features such as cover art, reviews, tagging, etc., can significantly increase the amount of data, and therefore time, required to return a catalog record page.
      Performance factors, particularly speed, should be required criteria for librarians and vendors evaluating and designing products.
  7. Speed Standards?
      The W3C does not set forth standards.
      Jakob Nielsen:
        0.1 second: feels “instantaneous”
        1.0 second: feels “uninterrupted”
        10 seconds: “About the limit for keeping the user’s attention focused on the dialogue.” Give the user “time-remaining” feedback.
      Jakob Nielsen, Usability Engineering (San Francisco: Morgan Kaufmann, 1994), 135.
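Nielsen’s thresholds can be expressed as a small classifier. A minimal sketch; the function name and short labels are ours, paraphrasing the slide:

```python
def nielsen_rating(seconds):
    """Classify a page response time against Nielsen's usability thresholds."""
    if seconds <= 0.1:
        return "instantaneous"       # user perceives no delay at all
    if seconds <= 1.0:
        return "uninterrupted"       # flow of thought is preserved
    if seconds <= 10.0:
        return "attention strained"  # about the limit for keeping focus
    return "attention lost"          # give the user time-remaining feedback
```

By this scale, the 0.9172 s Skyline load reported later rates “uninterrupted,” while the 11.5734 s WorldCat Local average falls past the 10-second limit.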
  8. Our Method
      During a busy time in the semester, we recorded response times, in seconds, of permalinks for three catalog records.
      Tested our classic and NextGen catalogs plus three others.
      3 books × 5 catalogs × 3 times per day × 13 days = 585 data points.
      Collecting several data points in this way ensured that the data was consistent.
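A single permalink timing of this kind can be reproduced in a few lines of Python. This is a simplified stand-in for the hosted tool the study actually used: it times one HTML fetch only, not the embedded images, CSS, and JavaScript a browser would also request, and the example URL is hypothetical:

```python
import time
import urllib.request

def measure_response(url, timeout=30):
    """Return the elapsed seconds to fetch a page, from request to last byte."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # drain the body so the last byte is included in the timing
    return time.perf_counter() - start

# e.g. measure_response("http://catalog.example.edu/record=b1234567")
```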
  9. Additional Catalogs Tested
      Library of Congress Catalog
        Voyager
        Traditional catalog
      University of Texas at Austin
        Innovative Interfaces
        Traditional catalog with added NextGen elements
      University of Southern California
        Sirsi/Dynix
        Traditional catalog with added NextGen elements
  10. Books Used
      Hard Lessons: The Iraq Reconstruction Experience. Washington, DC: Special Inspector General, Iraq Reconstruction, 2009. (OCLC number 302189848)
      Ehrenreich, Barbara. Nickel and Dimed: On (Not) Getting by in America. 1st ed. New York: Metropolitan Books, 2001. (OCLC number 256770509)
      Langley, Lester D. Simón Bolívar: Venezuelan Rebel, American Revolutionary. Lanham: Rowman & Littlefield Publishers, c2009. (OCLC number 256770509)
  11. Permalink Examples
      {CKEY}
  12. Testing Tools
      WebSitePulse™
      Allows testing on any web page/site; does not require server installation
      Similar services:
      Your favorite?
  13. WebSitePulse™
  14. WebSitePulse™ Results
      Horizontal bar:
        Gives a visual representation of load time for each item (image files, JavaScript files, style sheets, etc.)
        Provides a quick indication of “sticking points”
      Table:
        Provides specifics about file size and delivery time for each item
        DNS, Connect, Redirect, First Byte, Last Byte, Error
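The phase columns in such a report (DNS/Connect, First Byte, Last Byte) can be approximated with a raw socket. This is a rough illustrative sketch, not WebSitePulse’s actual method: plain HTTP only, no redirects, no error handling:

```python
import socket
import time
from urllib.parse import urlparse

def timing_phases(url):
    """Return rough cumulative timings (seconds) for connect, first byte, last byte."""
    parts = urlparse(url)
    host = parts.hostname
    port = parts.port or 80
    t0 = time.perf_counter()
    sock = socket.create_connection((host, port), timeout=30)  # DNS + TCP connect
    connect = time.perf_counter() - t0
    try:
        path = parts.path or "/"
        sock.sendall(f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n".encode())
        sock.recv(1)                        # block until the first byte arrives
        first_byte = time.perf_counter() - t0
        while sock.recv(4096):              # drain until the server closes
            pass
        last_byte = time.perf_counter() - t0
    finally:
        sock.close()
    return {"connect": connect, "first_byte": first_byte, "last_byte": last_byte}
```

A large gap between “first byte” and “last byte” points at heavy page content; a large “connect” value points at the server itself, as seen later with Sirsi/Dynix.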
  15. Numbers Crunched: Average Response Time in Seconds
      Auraria’s Skyline: 1.2930
      Auraria’s WCL: 11.5734
      Library of Congress: 2.1530
      University of Texas at Austin: 3.4997
      University of Southern California: 4.1085
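Ranking these averages makes the spread easy to see; the dictionary below simply restates the figures from the slide:

```python
from statistics import mean

# Average response times in seconds, as reported on the slide
averages = {
    "Auraria Skyline": 1.2930,
    "Auraria WCL": 11.5734,
    "Library of Congress": 2.1530,
    "UT Austin": 3.4997,
    "USC": 4.1085,
}

ranked = sorted(averages, key=averages.get)  # fastest to slowest
overall = mean(averages.values())            # mean across the five catalogs
```

The classic Skyline catalog is fastest, WorldCat Local is nearly nine times slower, and every catalog exceeds Nielsen’s one-second “uninterrupted” threshold on average.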
  16. Individual Catalog Test Results
      After the data was analyzed, we took a closer look at each individual catalog, using the Hard Lessons catalog record.
      WebSitePulse™ allowed us to glimpse the inner workings of each catalog.
      Findings confirmed that the extra data and load times came from 2.0/NextGen content.
  17. “Skyline,” Auraria Library
  18. Skyline Test Results: Graph
  19. Skyline Test Results: Table
  20. Skyline Findings
      Missing favicon (item 4)
      0.9172 seconds: “uninterrupted” per Nielsen
      14 items, for a total of 84.64 K:
        9 GIFs
        2 CSS
        1 JavaScript
      Good performance, but an interface that only a librarian could love
  21. WorldCat@Auraria
  22. WorldCat@Auraria Results: Graph
  23. WorldCat@Auraria Results: Table
  24. WorldCat@Auraria Findings
      Reference & Instruction librarians’ observations corroborated
      10.3615 seconds
      31 items, for a total of 633.09 K, to load:
        10 CSS files
        10 JavaScript files
        8 GIFs/PNGs
      No single NextGen feature slowed down load time, but the multitude of files created an unacceptable delay
  25. Library of Congress Catalog
  26. Library of Congress Catalog Results: Graph
  27. Library of Congress Catalog Results: Table
  28. Library of Congress Catalog Findings
      Overall, second fastest of all five catalogs tested
      1.2900 seconds
      Only six items and 19.27 K to load:
        2 CSS files
        3 GIFs
      Like Skyline: fast, but has that “legacy look”
  29. University of Texas at Austin
  30. UT Austin Results: Graph
  31. UT Austin Results: Table
  32. University of Texas at Austin Findings
      Added NextGen features:
        Cover art
        LibraryThing’s Catalog Enhancement
          Supports recommendations, tag browsing, alternate editions/translations
      2.4674 seconds: user experience interrupted
      19 items, 138.84 K
      Cover art nearly doubles response time
        Item 14: script on the ILS that queries Amazon for art
  33. University of Southern California: HOMER
  34. USC Results: Graph
  35. USC Results: Table
  36. USC Findings
      Slowest among the traditional catalogs; Sirsi/Dynix takes longer to make the initial connection (item 1 on graph)
      8.7295 seconds (though the average was 4.1085 seconds)
      16 items, 148.47 K
      While attractive and well integrated, Syndetic Solutions content (cover art, summary, author biography, and table of contents) adds 1.2 seconds to load time
  37. Is the Content Worth the Wait?
      “The new database seems based on I don’t need suggestions, and poor ones at that, of related books when I use the library. I don’t need to see what other borrowers thought of the book. The information I need is poorly displayed. It is hard to cut and paste. It takes several screens to scan through, instead of the much quicker scroll in the traditional format…. It supplies distracting, if not useless information (a picture of the cover, the distance to other libraries—as if I need to know how far Provo is).”
      —Auraria Campus Faculty Member
  38. Our Conclusions
      Make performance testing part of the evaluation process for vendor products
      Adhere to industry standards for acceptable response times when testing
      Optimize delivery of 2.0/NextGen features as much as possible
      Conduct user testing to ensure that the content is “worth the wait” in users’ minds
  39. Questions? Comments?
      Nina McHale
      @ninermac