
Search Quality - A Business-Friendly Perspective


Have you ever been asked to fix a bad search experience? Have you ever been asked to predict the outcome of a change to your search algorithm? Fixing search problems can feel like a game of whack-a-mole: fixing one set of queries breaks another. It's a frustrating exercise in guesswork and trade-offs. But you're responsible for the overall performance of a search system, and the powers that be want assurances. Wouldn't it be great if you could report, with confidence, the overall quality of search? Search quality metrics such as MAP, MRR, and nDCG exist to help the search engineer, but mapping them to real-world business goals is challenging. And pointing to obscure metrics in the face of contrary perceptions satisfies no one. Can we do better?

In this talk, we'll look at some of these evaluation metrics and how to use them. Then we'll examine the gap between these metrics and the business perspective on search quality. Finally, we'll cover tactics you can use to tell an effective story about search quality that non-experts can understand.


Search Quality - A Business-Friendly Perspective

1. Search Quality - A Business-Friendly Perspective
2. Peter Fries, Search Consultant
3. Search Quality
4-5. Successful Delivery of Search Projects
6. Search Project Quality
7. A METHOD TO THE MADNESS: 1. DO YOUR HOMEWORK 2. GATHER PRELIMINARIES 3. INTERVIEWS 4. ANALYSIS 5. ROADMAPPING 6. PRIORITIZE & PLAN 7. EXECUTE ITERATIVELY (a. IRON TRIANGLE, b. FEEDBACK LOOPS)
8. WHAT MADNESS?
9-16. Search Quality Anti-Patterns (progressive build): ● Search Quality is Bug Squashing ● Search Quality is Too Hard For You ● Search Quality is Off Limits ● Search Quality is a Feeling ● Search Quality is Relevance ● Search Quality is Mysterious!
17-18. A METHOD TO THE MADNESS (agenda recap)
19. DO YOUR HOMEWORK
20. GATHER PRELIMINARIES
21. TALK TO THE PEOPLE
22. A METHOD TO THE MADNESS (agenda recap)
23. ● Big Picture ○ Use Cases, Domain, IR Vertical ● Classes of Queries ○ Navigational, Informational, Transactional, Research ○ Length, Category, Intent ● Distribution of Queries ○ Head/Tail Analysis ○ Over Tail, Over Time ○ Relative Performance, Outliers ○ Quantify Scale/Impact ● Connect to Revenue ● Points of Interest Analysis (a head/tail analysis sketch follows the slide list)
24. A METHOD TO THE MADNESS (agenda recap)
25. ROADMAPPING
26. Searching for People. Motivation: many customers search for movies by actor name; ~10% of a random query sample includes an actor name. These queries perform poorly relative to their peers as measured by click-through. We believe that recognizing named entities (people's names) will bring their performance in line with the peer group. IMPACT: HIGH; RISK: LOW; LOE: MEDIUM; SCOPE: named-entity queries; SCALE: ~10%; MEASURE: relative CTR; VALUE: $XX.XX (a relative-CTR sketch follows the slide list)
27-28. A METHOD TO THE MADNESS (agenda recap)
29-32. Iron Triangle of Search Quality (progressive build): MEASUREMENT, DATA, OPTIMIZATION
33. Search Evaluation Metrics ● MRR ● MAP ● Precision@K ● NDCG ● LGTM (a metric computation sketch follows the slide list)
34. RELEVANCE JUDGMENTS ● relevance is approximately query similarity ● relevance is binary ● judgment lists agree with users ● judgment lists are complete and consistent
35. Search Quality - A Business-Friendly Perspective
36. What are we talking about? 1. Does it deliver value? 2. At what cost?
37. Who are we talking about? 1. Marketing: a. define offerings, b. attract and retain customers. 2. Management: a. set goals, b. plan and allocate resources.
38. HIERARCHY OF BUSINESS OBJECTIVES: REVENUE, ENGAGEMENT, CONVERSIONS, RETENTION, MARKET SHARE, CUSTOMER SATISFACTION, CTR, ABANDONMENT, ACQUISITION
39. Search Behaviors ● behaviors make sense ● behaviors are measurable ○ even without relevance data ● behaviors tell a story ● you can map behaviors to user tasks
40. HIERARCHY OF BUSINESS OBJECTIVES (as on slide 38), with search behaviors mapped onto it: SEARCH-EXITS, SERP-TO-CART, THRASHING, POGO-STICKING, PAGINATION, SEARCHES-PER-SESSION, REVENUE-PER-SEARCH, ZERO-CLICKS, DWELL-TIME, SEARCH-RATE (a behavior-signal sketch follows the slide list)
41. A METHOD TO THE MADNESS (agenda recap)
42. Development Framework for Search: LAB, OPERATIONS, INTEGRATION, CLICKSTREAM, A/B TESTING, EVALUATION, OFFLINE RELEVANCE, ONLINE, AUTOMATED TESTING, IRON TRIANGLE (an A/B test sketch follows the slide list)
43. A METHOD TO THE MADNESS (agenda recap)
44. THANK YOU!
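
Slide 23 calls for a head/tail analysis of the query distribution. The sketch below is a minimal illustration of that idea, assuming a plain-text query log with one normalized query per line; the file name and the 80% cumulative-traffic cutoff for the "head" are hypothetical choices, not from the deck.

    from collections import Counter

    # Hypothetical query log: one normalized query string per line.
    with open("queries.log", encoding="utf-8") as f:
        counts = Counter(line.strip().lower() for line in f if line.strip())

    total = sum(counts.values())
    head, covered = [], 0

    # Walk queries from most to least frequent until 80% of traffic is covered.
    for query, n in counts.most_common():
        covered += n
        head.append((query, n))
        if covered / total >= 0.80:
            break

    print(f"{len(head)} distinct queries ({len(head) / len(counts):.1%} of the vocabulary)")
    print(f"cover 80% of {total} searches; everything else is the long tail")
    print("top of the head:", head[:10])

Splitting the log this way is what lets you quantify scale and impact per query class before promising any improvement.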
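Slide 26 scopes a roadmap item by the click-through rate of actor-name queries relative to their peer group. Here is a rough sketch of that comparison, assuming a simple list of search events with per-search click counts; the sample data, the actor list, and the has_actor_name check are hypothetical stand-ins for a real named-entity recognizer.

    def ctr(rows):
        """Click-through rate: share of searches with at least one result click."""
        clicked = sum(1 for r in rows if r["clicks"] > 0)
        return clicked / len(rows) if rows else 0.0

    def has_actor_name(query, actor_names):
        """Hypothetical named-entity check: does the query mention a known actor?"""
        q = query.lower()
        return any(name in q for name in actor_names)

    # Hypothetical sample of search events: query text plus result-click count.
    searches = [
        {"query": "humphrey bogart", "clicks": 0},
        {"query": "casablanca", "clicks": 2},
        {"query": "ingrid bergman movies", "clicks": 1},
        {"query": "film noir classics", "clicks": 1},
    ]
    actor_names = {"humphrey bogart", "ingrid bergman"}

    actor_q = [s for s in searches if has_actor_name(s["query"], actor_names)]
    peer_q = [s for s in searches if not has_actor_name(s["query"], actor_names)]

    print(f"actor-name queries: {len(actor_q) / len(searches):.0%} of the sample")
    print(f"CTR, actor-name: {ctr(actor_q):.2f} vs. peers: {ctr(peer_q):.2f}")

The gap between the two CTR figures is the "relative performance" the roadmap item promises to close.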
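Slides 33 and 34 cover the offline evaluation metrics and the judgment lists they depend on. The sketch below shows how MRR, MAP, Precision@K, and NDCG are conventionally computed from per-query judgment lists; the two sample queries and their binary judgments are made up for illustration.

    import math

    def precision_at_k(rels, k):
        """Fraction of the top-k results judged relevant (binary judgments)."""
        return sum(rels[:k]) / k

    def reciprocal_rank(rels):
        """1/rank of the first relevant result, or 0.0 if none is relevant."""
        for i, r in enumerate(rels, start=1):
            if r:
                return 1.0 / i
        return 0.0

    def average_precision(rels):
        """Mean of precision@k at each rank holding a relevant result
        (normalized by relevant results found, a common simplification)."""
        hits, score = 0, 0.0
        for i, r in enumerate(rels, start=1):
            if r:
                hits += 1
                score += hits / i
        return score / hits if hits else 0.0

    def ndcg(gains, k):
        """Normalized DCG over graded gains, with a log2 rank discount."""
        def dcg(gs):
            return sum(g / math.log2(i + 1) for i, g in enumerate(gs[:k], start=1))
        ideal = dcg(sorted(gains, reverse=True))
        return dcg(gains) / ideal if ideal else 0.0

    # Hypothetical judgments for two queries: 1 = relevant, 0 = not relevant.
    judged = {
        "casablanca": [1, 0, 1, 1, 0],
        "humphrey bogart movies": [0, 0, 1, 0, 1],
    }

    n = len(judged)
    print("MRR:   ", sum(reciprocal_rank(r) for r in judged.values()) / n)
    print("MAP:   ", sum(average_precision(r) for r in judged.values()) / n)
    print("P@3:   ", sum(precision_at_k(r, 3) for r in judged.values()) / n)
    print("NDCG@5:", sum(ndcg(r, 5) for r in judged.values()) / n)

Averaging over a fixed query set is what makes these numbers comparable across configurations, which is exactly why the judgment-list assumptions on slide 34 matter so much.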
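Slides 39 and 40 map observable search behaviors onto the business-objective hierarchy. As a minimal sketch, the snippet below computes three such signals (zero-click rate, search-exit rate, pogo-sticking rate) from a hypothetical per-session event list; real clickstream schemas and event names will differ.

    # Hypothetical session logs: each session is an ordered list of events.
    sessions = [
        ["search", "click", "dwell_long"],                      # satisfied
        ["search", "exit"],                                     # search exit
        ["search", "click", "back", "click", "back", "click"],  # pogo-sticking
        ["search"],                                              # zero-click
    ]

    def zero_click_rate(sessions):
        """Share of sessions whose searches produced no result click at all."""
        return sum("click" not in s for s in sessions) / len(sessions)

    def search_exit_rate(sessions):
        """Share of sessions that end immediately after a search."""
        return sum(s[-1] == "exit" and "search" in s for s in sessions) / len(sessions)

    def pogo_stick_rate(sessions, bounces=2):
        """Share of sessions bouncing back from results at least `bounces` times."""
        return sum(s.count("back") >= bounces for s in sessions) / len(sessions)

    print("zero-click rate: ", zero_click_rate(sessions))
    print("search-exit rate:", search_exit_rate(sessions))
    print("pogo-stick rate: ", pogo_stick_rate(sessions))

None of these signals require relevance judgments, which is what makes them useful for telling a business-friendly story about search quality.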
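Slide 42 pairs offline relevance work in the lab with online clickstream A/B testing in operations. The deck does not prescribe a statistical test; as one hedged illustration, here is a standard two-proportion z-test on per-variant CTR, using only the Python standard library and made-up experiment counts.

    import math

    def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
        """z statistic and two-sided p-value for a difference in two CTRs."""
        p_a, p_b = clicks_a / n_a, clicks_b / n_b
        pooled = (clicks_a + clicks_b) / (n_a + n_b)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        # Two-sided p-value from the standard normal CDF.
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    # Hypothetical experiment: searches and clicked searches per variant.
    z, p = two_proportion_z(clicks_a=4_100, n_a=50_000, clicks_b=4_350, n_b=50_000)
    print(f"z = {z:.2f}, p = {p:.4f}")

A significant lift in online CTR (or any of the behavior signals above) is the evidence that connects an offline relevance win back to the business objectives on slide 38.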
