Search Analytics: Diagnosing what ails your site


Lou Rosenfeld's presentation on search analytics, given at Web Manager University, October 27, 2006.

  • 1. Search Analytics: Diagnosing what ails your site Web Manager University September 27, 2006 Louis Rosenfeld
  • 2. About me
    • Information architecture (IA) consultant; formerly president of Argus Associates
    • Publisher and founder, Rosenfeld Media
    • Background in librarianship/information science; consults for Fortune 500s
    • Co-author, Information Architecture for the World Wide Web (3rd edition out this fall)
    • Co-founder, Information Architecture Institute and UXnet
  • 3. AOL Searcher #4417749
    • Interests
      • 60 single men
      • aameetings in georgia
      • plastic surgeons in gwinnett county
      • applying to west point
      • bipolar
      • panic disorders
      • yerba mate
      • shedless dogs
      • movies for dogs
      • new zealand real estate
    • Thelma Arnold
      • 62-year-old widow
      • Lilburn, GA resident
    NY Times, August 9, 2006: “A Face Is Exposed for AOL Searcher No. 4417749”
  • 4. Our Inadvertent Search Analytics Education, courtesy AOL
    650,000 searchers; 21,000,000 queries
  • 5. Analyze This:
    • Keywords: focis; 0; 11/26/04 12:57 PM; XXX.XXX.XXX.2
    • Keywords: focus; 167; 11/26/04 12:59 PM; XXX.XXX.XXX.2
    • Keywords: focus pricing; 12; 11/26/04 1:02 PM; XXX.XXX.XXX.2
    • Keywords: discounts for college students; 0; 11/26/04 3:35 PM; XXX.XXX.XXX.59
    • Keywords: student discounts; 3; 11/26/04 3:35 PM; XXX.XXX.XXX.59
    • Keywords: ford or mercury; 500; 11/26/04 3:35 PM; XXX.XXX.XXX.126
    • Keywords: (ford or mercury) and dealers; 73; 11/26/04 3:36 PM; XXX.XXX.XXX.126
    • Keywords: lorry; 0; 11/26/04 3:36 PM; XXX.XXX.XXX.36
    • Keywords: “safety ratings”; 3; 11/26/04 3:36 PM; XXX.XXX.XXX.55
    • Keywords: safety; 389; 11/26/04 3:36 PM; XXX.XXX.XXX.55
    • Keywords: seatbelts; 2; 11/26/04 3:37 PM; XXX.XXX.XXX.55
    • Keywords: seat belts; 33; 11/26/04 3:37 PM; XXX.XXX.XXX.55
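The records above follow a simple query; hit count; timestamp; IP layout. A minimal parsing sketch, assuming that semicolon-delimited format (the field names and the `parse_log_line` helper are illustrative, not part of any real tool):

```python
from datetime import datetime

def parse_log_line(line):
    """Parse one record shaped like:
    'Keywords: focus; 167; 11/26/04 12:59 PM; XXX.XXX.XXX.2'
    into a dict of query, hit count, timestamp, and client IP."""
    prefix = "Keywords: "
    if not line.startswith(prefix):
        return None
    query, hits, stamp, ip = [f.strip() for f in line[len(prefix):].split(";")]
    return {
        "query": query,
        "hits": int(hits),
        "time": datetime.strptime(stamp, "%m/%d/%y %I:%M %p"),
        "ip": ip,
    }

record = parse_log_line("Keywords: focus; 167; 11/26/04 12:59 PM; XXX.XXX.XXX.2")
```

Once records are structured like this, every analysis in the following slides (frequency ranking, zero-result reports, time buckets) is a short pass over the parsed list.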
  • 6. The Head, the Long Tail, and the Interesting Stuff in Between Sorting queries by frequency results in a Zipf Distribution Can we improve performance for the most popular queries?
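Ranking queries by frequency is a one-liner once the log is parsed; the resulting ranked list is the Zipf curve, head first. A small illustration (the sample queries are invented):

```python
from collections import Counter

# invented sample queries; a real log would supply these
queries = ["focus", "safety", "focus", "seat belts", "focus",
           "safety", "lorry", "student discounts", "focus"]

ranked = Counter(queries).most_common()   # head of the Zipf curve first
top_query, top_count = ranked[0]
head_share = top_count / len(queries)     # share of all traffic from the #1 query
```

Even with this toy data, one query accounts for a large slice of all searches, which is exactly why tuning the short head pays off.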
  • 7. User Research: What do they want?…
    • SA is a true expression of users’ information needs (often surprising: e.g., SKU numbers at LL Bean; URLs at IBM)
    • Provides context by displaying aspects of single search sessions
  • 8. User Research: …who wants it?…
    • What can you learn from knowing these things?
      • What specific segments want; determined by:
        • Security clearance
        • IP address
        • Job function
        • Account information
      • Which pages they initiate searches from
  • 9. User Research: …and when do they want it?
    • Time-based variation (and clustered queries)
    • By hour, by day, by season
    • Helps determine “best bets” and “guide” development
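One way to surface time-based variation is to bucket searches by hour; the same approach works per day or per season. A sketch, assuming the timestamp format from the log samples earlier in the deck (the timestamps are invented):

```python
from collections import Counter
from datetime import datetime

# invented timestamps in the log's "%m/%d/%y %I:%M %p" format
stamps = ["11/26/04 12:57 PM", "11/26/04 12:59 PM",
          "11/26/04 3:35 PM", "11/26/04 3:36 PM", "11/26/04 3:37 PM"]

searches_by_hour = Counter(
    datetime.strptime(s, "%m/%d/%y %I:%M %p").hour for s in stamps
)
```

Spikes in a given hour or season can then be cross-referenced with the queries clustered in that bucket.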
  • 10. Search Entry Interface Design: “The Box” or something else?
    • SA identifies “dead end” points (e.g., 0 hits, 2000 hits) where assistance could be added (e.g., revise search, browsing alternative)
    • Syntax of queries informs selection of search features to expose (e.g., use of Boolean operators, fielded searching)
    … OR…
  • 11. Search Results Interface Design: Which results where?
    • The #10 result is clicked through more often than #s 6, 7, 8, and 9 (ten results per page)
    From SLI Systems
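A position anomaly like that can be detected from a clickthrough log: count clicks per result position and flag positions that out-click positions ranked above them. A sketch (the (query, clicked-position) pairs are invented):

```python
from collections import Counter

# invented (query, clicked result position) pairs
clicks = [("ford focus", 1), ("ford focus", 1), ("safety", 10),
          ("seat belts", 2), ("safety", 10), ("lorry", 1)]

clicks_by_position = Counter(pos for _, pos in clicks)
# a position is anomalous if it out-clicks some position ranked above it
anomalies = sorted(
    p for p in clicks_by_position
    if any(clicks_by_position[p] > clicks_by_position.get(q, 0)
           for q in range(1, p))
)
```

Here position 10 draws more clicks than position 2, suggesting the ranking (or the result snippets above it) deserves a look.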
  • 12. Search Results Interface Design: How to sort results?
    • Financial Times has found that users often include dates in their queries
    • Obvious but effective improvement: Allow users to sort by date
  • 13. Navigation: Any improvements?
    • Michigan State University builds A-Z index automatically based on frequent queries
  • 14. Navigation: Where does it fail?
    • Track and study pages (excluding main page) where search is initiated
      • Are there obvious issues that would cause a “dead end”?
      • Are there user studies that could test/validate problems on these pages?
    • Sandia Labs analyzes most requested documents to test content independent of site structure; results used to improve structure
  • 15. Search System: What to change?
    • Identify new functionality: Financial Times added spell checking
    • Retrieval algorithm modifications:
      • Deloitte, Barnes & Noble use SA to demonstrate that basic improvements (e.g., Best Bets) are insufficient
      • Financial Times weights company names higher
  • 16. Metadata Development: How do users express their needs?
    • SA provides a sense of tone: how users’ needs are expressed
      • Jargon (e.g., “cancer” vs. “oncology,” “lorry” vs. “truck,” acronyms)
      • Length (e.g., number of terms/query)
      • Syntax (e.g., Boolean, natural language, keyword)
  • 17. Metadata Development: Which metadata values?
    • SA helps in the creation of controlled vocabularies
    • Terms are fodder for metadata values (e.g., “cell phone,” “JFK” vs. “John Kennedy,” “country music”), especially for determining preferred terms
    • Works with tools that cluster synonyms, enabling concept searching and thesaurus development
  • 18. Metadata Development: Which metadata attributes?
    • SA helps in the creation of vocabularies
    • Simple cluster analysis can detect metadata attributes (e.g., “product,” “person,” “topic”)
    • Look for variations between short head and long tail (Deloitte intranet: “known-item” queries are common; research topics are infrequent)
    (Chart: known-item queries in the short head vs. research queries in the long tail)
  • 19. Content Development: Do we have the right content?
    • SA identifies content that can’t be found (0 results)
    • Does the content exist? If so, there are wording, metadata, or spidering problems
    • If not, why not?
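The zero-result check is easy to automate once hit counts are in the log. A minimal sketch, with invented (query, hit count) pairs echoing the slide 5 format:

```python
from collections import Counter

# invented (query, hit count) pairs parsed from a search log
log = [("focis", 0), ("focus", 167), ("discounts for college students", 0),
       ("lorry", 0), ("lorry", 0), ("safety", 389)]

# most frequent dead ends first: candidates for spelling fixes,
# synonym rings, or genuinely missing content
dead_ends = Counter(q for q, hits in log if hits == 0).most_common()
```

Sorting dead ends by frequency tells you which gaps to triage first: "lorry" failing repeatedly points at a vocabulary problem, while a one-off typo like "focis" points at spell checking.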
  • 20. Content Development: Are we featuring the right stuff?
    • Clickthrough tracking helps determine which results should rise to the top (example: SLI Systems)
    • Also suggests which “best bets” to develop to address common queries
  • 21. Organizational Impact: Educational opportunities
    • SA is a way to “reverse engineer” how your site performs in order to:
      • Sensitize organization to analytics, specifically related to findability
      • Sensitize content owners/authors to benefits of good practices around content titling, tagging, and navigational placement
  • 22. Organizational Impact: Rethinking how you do things
    • Financial Times learns about breaking stories from their logs by monitoring spikes in company names and individuals’ names and comparing with their current coverage
    • Discrepancy = possible breaking story; reporter is assigned to follow up
    • Next step? Assign reporters to “beats” that emerge from SA
  • 23. The Ideal SLA report 1/2 (from Avi Rappoport)
    • # searches for each week/month/quarter/year
    • Top 1% of queries (cluster by stem if possible)
    • Top 10% of no-matches queries
    • Top 10% of low-matches queries? (1 to 4 hits, or more depending on site size)
    • # empty searches
    • Changes in these over the last week/month/quarter/year
    • Changes’ correlation to changes in the site, search engine, company profile
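Several of the metrics on this slide fall out of a few passes over the same (query, hits) records. A minimal sketch with invented data (an empty string stands in for an empty search):

```python
from collections import Counter

# invented (query, hit count) records; "" is an empty search
log = [("focus", 167), ("focus", 167), ("focus", 167), ("safety", 389),
       ("focis", 0), ("", 12), ("seatbelts", 2), ("lorry", 0)]

top_queries = Counter(q for q, _ in log if q).most_common()
no_matches = sorted({q for q, h in log if h == 0 and q})
low_matches = sorted({q for q, h in log if 1 <= h <= 4})   # 1-4 hits
empty_searches = sum(1 for q, _ in log if not q)
```

Run weekly or monthly, the same report makes the slide's final point measurable: changes in these numbers can be lined up against changes to the site and the search engine.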
  • 24. The Ideal SLA report 2/2 (from Avi Rappoport)
    • Queries showing significant increases
    • Patterns in less-frequent queries: names? places? web site addresses?
    • Top pages retrieved in search results and the queries that retrieved them
    • Queries that retrieved the best/most important pages
    • For search zones, create reports for each zone (will have significant impact on no-matches data)
  • 25. SA as User Research Method: Sleeper, but no panacea
    • Benefits
      • Non-intrusive
      • Inexpensive and (usually) accessible
      • Large volume of “real” data
      • Represents actual usage patterns
    • Drawbacks
      • Provides an incomplete picture of usage: was user satisfied at session’s end?
      • Difficult to analyze: where are the commercial tools?
    • Ultimately an excellent complement to qualitative methods (e.g., task analysis, field studies)
  • 26. SA headaches: What gets in the way?
    • Lack of time
    • Few useful tools for parsing logs, generating reports
    • Tension between those who want to perform SA and those who “own” the data (chiefly IT)
    • Ignorance of the method
    • Hard work and/or boredom of doing analysis
    • From summer 2007 survey (134 responses)
  • 27. Please Share Your SA Knowledge: Visit our “book in progress” site
    • Site URL:
    • Feed URL:
    • Site contains:
    • Reading list
    • Survey results
    • Perl script for parsing logs
    • Log samples
    • … and more
  • 28. Contact Information
    • Louis Rosenfeld LLC
    • 902 Miller Avenue
    • Ann Arbor, Michigan 48103 USA
    • [email_address]
    • +1.734.302.3323 voice
    • +1.734.661.1655 fax