An Engaging Click
Ricardo Baeza-Yates
SIGIR 2013 – Industry Talk

A good search engine is one to which users come regularly, type their queries, get their results, and leave quickly. In user engagement metrics from web analytics, this translates into a low dwell time, often a low CTR, but a very high return rate. But user engagement is not just about this. User engagement is a multifaceted, complex phenomenon, giving rise to a number of approaches for its measurement: self-reporting (e.g. questionnaires); observational methods (e.g. facial expression analysis, desktop actions); and of course web analytics using online behaviour metrics. These methods represent various trade-offs between the scale of the data analyzed and the depth of understanding. For instance, surveys are hardly scalable but offer rich, qualitative insights, whereas click data can be collected at large scale but are more difficult to analyze. This talk will present various efforts aimed at combining approaches to measure engagement, seeking to provide insights into what makes an engaging experience. The talk will focus on what makes users click or not click, and what this means in terms of user engagement.

SIGIR 2013 Industry Track: Keynote by Ricardo Baeza-Yates - VP, Yahoo! Research Europe & Latin America

  1. Ricardo Baeza-Yates, SIGIR 2013 – Industry Talk: An Engaging Click
  2. Why is it important to engage users?
     •  In today's wired world, users have enhanced expectations about their interactions with technology … resulting in increased competition amongst the purveyors and designers of interactive systems.
     •  In addition to utilitarian factors, such as usability, we must consider the hedonic and experiential factors of interacting with technology, such as fun, fulfillment, play, and user engagement.
  3. CTR and user engagement
     [image slide: CTR]
  4. CTR and entertainment-driven search (Slaney, 2011)
     Multimedia search activities are often driven by entertainment needs, not by information needs.
  5. CTR and factual needs
     "I just wanted the phone number … I am totally satisfied" ☺
  6. This talk
     What is user engagement?
     What are the characteristics of user engagement?
     How to measure user engagement?
     What is an engaging click?
       1.  inter-session metric
       2.  online multi-tasking
       3.  serendipity
     Work on user engagement across web applications; implications for search.
  7. Engagement is on everyone's mind
     http://thenextweb.com/asia/2013/05/03/kakao-talk-rolls-out-plus-friend-home-a-revamped-platform-to-connect-users-with-their-favorite-brands/
     http://socialbarrel.com/70-percent-of-brand-engagement-on-pinterest-come-from-users/51032/
     http://iactionable.com/user-engagement/
     http://www.cio.com.au/article/459294/heart_foundation_uses_gamification_drive_user_engagement/
     http://www.localgov.co.uk/index.cfm?method=news.detail&id=109512
     http://www.trefis.com/stock/lnkd/articles/179410/linkedin-makes-a-90-million-bet-on-pulse-to-help-drive-user-engagement/2013-04-15
  8. What is user engagement?
     User engagement is a quality of the user experience that emphasizes the positive aspects of interaction – in particular the fact of being captivated by the technology (Attfield et al, 2011).
     It is the emotional, cognitive and behavioural connection that exists, at any point in time and over time, between a user and a technological resource:
     •  user feelings: happy, sad, excited, …
     •  user interactions: click, read, comment, buy, …
     •  user mental states: involved, lost, concentrated, …
  9. Considerations in the measurement of user engagement
     •  Short term (within a session) and long term (across multiple sessions)
     •  Laboratory vs. field studies
     •  Subjective vs. objective measurement
     •  Large scale (dwell time of 100,000 people) vs. small scale (gaze patterns of 10 people)
     •  User engagement as process vs. product
     One is not better than the other; it depends on the aim.
  10. Characteristics of user engagement (I)
      Focused attention (Webster & Ho, 1997; O'Brien, 2008)
      •  Users must be focused to be engaged
      •  Distortions in the subjective perception of time are used to measure it
      Positive affect (O'Brien & Toms, 2008)
      •  Emotions experienced by the user are intrinsically motivating
      •  An initial affective "hook" can induce a desire for exploration, active discovery or participation
      Aesthetics (Jacques et al, 1995; O'Brien, 2008)
      •  The sensory, visual appeal of an interface stimulates the user and promotes focused attention
      •  Linked to design principles (e.g. symmetry, balance, saliency)
      Endurability (Read, MacFarlane & Casey, 2002; O'Brien, 2008)
      •  People remember enjoyable, useful, engaging experiences and want to repeat them
      •  Reflected in, e.g., the propensity of users to recommend an experience, a site or a product
  11. Characteristics of user engagement (II)
      Novelty (Webster & Ho, 1997; O'Brien, 2008)
      •  Novelty, surprise, unfamiliarity and the unexpected
      •  Appeals to users' curiosity; encourages inquisitive behavior and promotes repeated engagement
      Richness and control (Jacques et al, 1995; Webster & Ho, 1997)
      •  Richness captures the growth potential of an activity
      •  Control captures the extent to which a person is able to achieve this growth potential
      Reputation, trust and expectation (Attfield et al, 2011)
      •  Trust is a necessary condition for user engagement
      •  An implicit contract among people and entities which is more than technological
      Motivation, interests, incentives and benefits (Jacques et al, 1995; O'Brien & Toms, 2008)
      •  Difficulties in setting up "laboratory"-style experiments
      •  Why should users engage?
  12. [image-only slide]
  13. Measuring user engagement
      Self-reported engagement
      •  Measures: questionnaire, interview, report, product reaction cards, think-aloud
      •  Characteristics: subjective; short- and long-term; lab and field; small-scale; product outcome
      Cognitive engagement
      •  Measures: task-based methods (time spent, follow-on task); physiological measures (e.g. EEG, SCL, fMRI, eye tracking, mouse tracking)
      •  Characteristics: objective; short-term; lab and field; small- and large-scale; process outcome
      Interaction engagement
      •  Measures: web analytics metrics + models
      •  Characteristics: objective; short- and long-term; field; large-scale; process outcome
  14. Large-scale measurement of user engagement – web analytics
      Intra-session measures:
      •  dwell time / session duration
      •  play time (video)
      •  click-through rate (CTR)
      •  mouse movement
      •  number of pages viewed (click depth)
      •  conversion rate (mostly for e-commerce)
      •  amount of UGC (comments)
      Inter-session measures:
      •  fraction of return visits
      •  time between visits (inter-session time, absence time)
      •  total view time per month (video)
      •  lifetime value (number of actions)
      •  number of sessions per unit of time
      •  total usage time per unit of time
      •  number of friends on site (social networks)
      •  amount of UGC (comments)
      Intra-session engagement measures our success in attracting the user to remain on the site for as long as possible; inter-session engagement can be measured directly or, for commercial sites, by observing lifetime customer value. A sketch of both kinds of measures follows this list.
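A minimal sketch of how one intra-session and one inter-session measure could be computed from a session log. The log schema and column names are invented for illustration; pandas is used for convenience:

```python
import pandas as pd

# Toy session log: one row per session (hypothetical schema)
sessions = pd.DataFrame({
    "user":  ["u1", "u1", "u1", "u2", "u2"],
    "start": pd.to_datetime(["2013-05-01 09:00", "2013-05-02 10:00",
                             "2013-05-05 09:30", "2013-05-01 12:00",
                             "2013-05-01 18:00"]),
    "duration_s": [310, 45, 120, 600, 30],  # dwell time (intra-session)
    "pageviews":  [12, 2, 5, 20, 1],        # click depth (intra-session)
})

# Intra-session: average dwell time and click depth per user
intra = sessions.groupby("user")[["duration_s", "pageviews"]].mean()

# Inter-session: absence time = gap between a user's consecutive sessions
sessions = sessions.sort_values(["user", "start"])
sessions["absence"] = sessions.groupby("user")["start"].diff()

print(intra)
print(sessions[["user", "start", "absence"]])
```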
  15. Dependency on task
      •  Engagement varies by task: a user who accesses a website to check email (goal-specific) has different engagement patterns from one browsing for leisure.
      •  In (Yom-Tov et al, 2013), sessions in which 50% or more of the visited sites belonged to the user's 5 most common sites were classified as goal-specific:
         –  38% of sessions were goal-specific
         –  most users (92%) had both goal-specific and non-goal-specific sessions
         –  average downstream engagement was 0.16 in goal-specific sessions vs. 0.2 in non-goal-specific sessions
      The labelling rule is sketched below.
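A sketch of that labelling rule as described on the slide. The session representation (lists of site domains per user) is hypothetical, and this is our reading of the rule, not the authors' code:

```python
from collections import Counter

def goal_specific_labels(user_sessions, threshold=0.5, top_n=5):
    """user_sessions: all sessions of one user, each a list of site domains."""
    counts = Counter(site for session in user_sessions for site in session)
    top_sites = {site for site, _ in counts.most_common(top_n)}
    # A session is goal-specific if >= threshold of its visits fall
    # within the user's top-N most common sites.
    return [
        sum(site in top_sites for site in session) / len(session) >= threshold
        for session in user_sessions
    ]

sessions = [
    ["mail.com", "mail.com", "news.com", "cal.com", "cal.com", "news.com"],
    ["social.com", "social.com", "video.tv", "video.tv"],
    ["rare1.org", "rare2.org", "rare3.org", "mail.com"],
]
print(goal_specific_labels(sessions))  # [True, True, False]
```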
  16. [image-only slide]
  17. User engagement in search – "relevance"
      •  Click-through rate (CTR)
      •  Dwell time (search result)
      •  Time to first click
      •  Skipping
      •  Abandonment rate
      •  Number of query reformulations
      •  Search engine switching
      •  Interleaving
      •  Cumulative gain family of metrics
      •  …
  18. [image-only slide]
  19. Click vs. cursor – heat-map
      Estimating search result relevance: the role of hovering (Huang et al, 2011)
      Bing – Microsoft employees – 366,473 queries; 21,936 unique cookies; 7,500,429 cursor moves or clicks
  20. Mouse movement – what can hovering tell us about relevance? (Huang et al, 2011)
      •  Click-through rate: % of clicks when the URL is shown (per query)
      •  Hover rate: % of hovers over the URL (per query)
      •  Unclicked hover: median time the user hovers over a URL without clicking (per query)
      •  Max hover time: maximum time the user hovers over a result (per SERP)
      A toy computation of these signals follows.
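An illustrative computation of the four hover signals above. The log schema (one row per result impression, with hover and click information) is invented:

```python
import pandas as pd

log = pd.DataFrame({
    "query":    ["q1", "q1", "q1", "q2", "q2"],
    "url":      ["a",  "b",  "a",  "c",  "c"],
    "hovered":  [True, True, False, True, True],
    "clicked":  [True, False, False, False, True],
    "hover_ms": [600,  1500, 0,     300,  250],
})

# CTR and hover rate per shown (query, url) pair
per_url = log.groupby(["query", "url"]).agg(
    ctr=("clicked", "mean"),
    hover_rate=("hovered", "mean"),
)

# Unclicked hover: median hover time on impressions hovered but not clicked
mask = log.hovered & ~log.clicked
unclicked_hover = log[mask].groupby(["query", "url"])["hover_ms"].median()

# Max hover time per SERP (here: per query)
max_hover = log.groupby("query")["hover_ms"].max()

print(per_url, unclicked_hover, max_hover, sep="\n")
```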
  21. (Dupret & Lalmas, 2013)
      •  Domain: Yahoo! Answers Japan
      •  Study: inter-session engagement metric
      If users find a web application interesting, engaging or useful, they will return to it sooner.
  22. Absence time and survival analysis (Dupret & Lalmas, 2013)
      •  Easy to implement and interpret
      •  Can compare many things in one go
      •  No need to estimate baselines
      •  But needs lots of data to account for noise
      Survival analysis: a high hazard rate means a short absence. A minimal sketch follows this list.
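A minimal sketch of absence-time survival analysis in this spirit, using the open-source lifelines library on synthetic data. The bucket label, 14-day censoring window and exponential absence times are all invented:

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(0)
raw_absence = rng.exponential(scale=3.0, size=500)  # days until next visit
observed = raw_absence < 14                         # returned within window
durations = np.minimum(raw_absence, 14)             # right-censor at 14 days

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed, label="bucket A")
print(kmf.median_survival_time_)  # typical absence time for this bucket

# A steeper drop of the survival curve (higher hazard rate) = shorter
# absence = users return sooner, which the study reads as higher engagement.
```

Fitting one curve per ranking bucket and comparing them is how "many things can be compared in one go".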
  23. Using absence time to compare 6 ranking functions (buckets) on Yahoo! Answers Japan
      1.  Returning relevant results is important, but not enough to keep users returning to the search application.
      2.  Clicks after the 5th result reflect a poorer user experience; users cannot find what they are looking for.
      3.  No click means a bad user experience.
      4.  Clicking lower in the ranking suggests a more careful choice by the user.
      5.  Clicking at the bottom is a sign of a low-quality overall ranking.
      6.  Users who find their answers quickly (click sooner) return sooner to the search application.
      7.  Returning to the same search result page is a worse user experience than reformulating the query.
  24. [image-only slide]
  25. Online multi-tasking
      Users spend more and more of their online sessions multi-tasking, e.g. emailing, reading news, searching for information → ONLINE MULTI-TASKING
      •  navigating between sites, using browser tabs, bookmarks, etc.
      •  seamless integration of social network platforms into many services
      •  leaving a site is not a "bad thing"!
      (fictitious navigation between sites within an online session)
      181K users, 2 months of browser data, 600 sites, 4.8M sessions:
      •  only 40% of sessions have no site revisitation
      •  hyperlinking, backpaging and teleporting
  26. (Lehmann et al, 2013)
      •  Domain: 700+ web applications
      •  Study: online multi-tasking
      Online multi-tasking affects the way users interact (or engage) with sites.
  27. Online multi-tasking – and search
      181K users, 2 months of browser data, 600 sites, 4.8M sessions; only 40% of sessions have no site revisitation.
      •  commonly accessed sites between visits → search 22%, navigation 12%, social 8%
      •  for some sites (e-commerce), the same sites are accessed between visits → one task?
      •  no patterns for sites such as mail, social → anchor, habit?
      •  longer time between visits → a different task (new search)
      •  more vs. less time spent at each revisit → increased vs. shifted attention
  28. Navigating between sites – hyperlinking, backpaging and teleporting
      Example click-stream (timestamp, page, navigation type; parenthesized rows are backpaging steps):
        1346242507    1  T
        1346242567    2  L
        1346242627    3  L
        (1346242687)  1  B
        1346242687    4  L
        1346242747    5  T
        1346329147    6  L
        (1346329207)  5  B
        1346329207    7  L
        (1346329267)  2  B
        1346329267    8  L
      Navigation path: 1 - 2 - 3 - 1 - 4 - 5 - 6 - 5 - 7 - 2 - 8, split into logical navigation click-trees.
      [figure: (a) interaction-data click-stream, (b) navigation-path click-stream, (c) logical navigation click-trees, (d) interaction-data tree-stream, (e) navigation-path tree-stream]
      Navigation types: [L] hyperlinking, [B] backpaging, [T] teleporting.
      The number of backpaging actions is an under-estimate (browser back button, or the user returning to one of several open tabs/windows).
      A sketch of recovering these actions from the stream follows.
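A sketch that recovers the action sequence above from a (page, referral) click-stream. The rule is our reading of the slide's example, not the paper's exact algorithm: no referral means teleporting (T); a referral that is not the page seen immediately before implies an intermediate backpaging step (B) before the hyperlink (L):

```python
def navigation_actions(stream):
    """stream: list of (page, referral) pairs in time order."""
    actions, current = [], None
    for page, referral in stream:
        if referral is None:
            actions.append((page, "T"))          # teleport (typed URL, tab)
        else:
            if referral != current:
                actions.append((referral, "B"))  # implicit backpage first
            actions.append((page, "L"))          # then follow a hyperlink
        current = page
    return actions

stream = [(1, None), (2, 1), (3, 2), (4, 1),
          (5, None), (6, 5), (7, 5), (8, 2)]
for page, act in navigation_actions(stream):
    print(page, act)
# Reproduces the slide's sequence: 1T 2L 3L 1B 4L 5T 6L 5B 7L 2B 8L
```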
  29. Revisitation and navigation patterns
      [figure: per-category plots showing, for the k-th visit to a site (k = 1…9), the proportion of users still revisiting, the average attention, and the share of hyperlinking, backpaging and teleporting; auction sites show complex attention (p = 0.24), search sites increasing attention (p < 0.05), mail sites decreasing attention (p < 0.05)]
  30. Online multi-tasking – and web search
      •  48% of sites are visited at least 9 times
      •  Revisitation "level" depends on the site
      •  10% of users accessed a site 9+ times (23% for search sites); 28% at least four times (44% for search sites)
      •  Activity on a site decreases with each revisit, but activity on many search (and adult) sites increases
      •  Backpaging usually increases with each revisit, but hyperlinking remains an important means to navigate between sites
  31. [image-only slide]
  32. Networked user engagement: engagement across a network of sites
      •  Large online providers (AOL, Google, Yahoo!, MSN, etc.) offer not one service (site), but a network of sites
      •  Each service is usually optimized individually, with some effort to direct users between them
      •  The success of a service depends on itself, but also on how it is reached from other services (user traffic)
  33. Measuring downstream engagement (Yom-Tov et al, 2012)
      [figure: a user session moving across provider sites; the downstream engagement of site A is the percentage of session time remaining after the visit to site A]
      One plausible computation is sketched below.
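A minimal sketch of the measure on a toy session. The session representation (ordered site/dwell-time pairs) is assumed, as is the reading that the visited site's own dwell time does not count as "remaining":

```python
def downstream_engagement(session, site):
    """session: ordered (site, dwell_seconds) pairs for one user session."""
    total = sum(secs for _, secs in session)
    for i, (name, _) in enumerate(session):
        if name == site:
            remaining = sum(secs for _, secs in session[i + 1:])
            return remaining / total  # % of session time after the visit
    return None  # site not visited in this session

session = [("news", 120), ("siteA", 60), ("mail", 180), ("video", 240)]
print(downstream_engagement(session, "siteA"))  # 420 / 600 = 0.7
```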
  34. Influential features (Yom-Tov et al, under submission)
      o  Time of day
      o  Number of (non-image/non-video) links to Yahoo! sites in the HTML body
      o  Average rank of Yahoo! links on the page
      o  Number of (non-image/non-video) links to non-Yahoo! sites in the HTML body
      o  Number of span tags (tags that allow adding style to content or manipulating content, e.g. via JavaScript)
      Findings:
      o  Link placement and the number of Yahoo! links can influence downstream engagement – not new, but here shown to hold also across sites
      o  Links to non-Yahoo! sites have a positive effect on downstream engagement – possibly because, when faced with an abundance of outside links, users decide to focus their attention on a central content provider rather than visit a multitude of external sites
  35. (Bordino, Mejova & Lalmas, 2013)
      •  Domain: social media (Yahoo! Answers and Wikipedia)
      •  Study: serendipity (in entity search)
      Interesting search results may promote serendipitous browsing.
  36. Yahoo! Answers vs Wikipedia
      Yahoo! Answers: a community-driven question & answer portal
      •  67,336,144 questions & 261,770,047 answers
      •  January 1, 2010 – December 31, 2011
      •  minimally curated; opinions, gossip, personal info; variety of points of view
      Wikipedia: the English-language community-driven encyclopedia
      •  3,795,865 articles as of the end of December 2011
      •  curated, high-quality knowledge; variety of niche topics
      Entity search: we build an entity-driven serendipitous search system based on entity networks extracted from Wikipedia and Yahoo! Answers.
      Serendipity: finding something good or useful while not specifically looking for it; serendipitous search systems provide relevant and interesting results. A toy version of the entity-network retrieval appears below.
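A toy version of retrieving related entities over such a network. The talk only says "random walk" (slide 38), so personalized PageRank, a common random-walk-with-restart choice, is used here as an assumption; the graph and edge weights are invented:

```python
import networkx as nx

# Invented entity network (edge weight = strength of association)
g = nx.Graph()
g.add_weighted_edges_from([
    ("Steve Jobs", "Steve Wozniak", 5.0),
    ("Steve Jobs", "Timothy Cook", 4.0),
    ("Steve Jobs", "IPad", 3.0),
    ("IPad", "Kindle", 1.0),
    ("Steve Wozniak", "Timothy Cook", 1.0),
])

query = "Steve Jobs"
# Random walk with restart at the query node (recent networkx versions
# accept a partial personalization dict).
scores = nx.pagerank(g, alpha=0.85, personalization={query: 1.0},
                     weight="weight")
related = sorted((e for e in g.nodes if e != query),
                 key=scores.get, reverse=True)
print(related[:5])  # entities most related to the query entity
```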
  37. [image slide: Wikipedia vs Yahoo! Answers]
  38. Retrieval
      Entities most related to a query entity are retrieved using a random walk.
      Measure          Wikipedia   Yahoo! Answers   Combined
      Precision @ 5    0.668       0.724            0.744
      MAP              0.716       0.762            0.782
      Query entities: Justin Bieber, Nicki Minaj, Katy Perry, Shakira, Eminem, Lady Gaga, Jose Mourinho, Selena Gomez, Kim Kardashian, Miley Cyrus, Robert Pattinson, Adele (singer), Steve Jobs, Osama bin Laden, Ron Paul, Twitter, Facebook, Netflix, IPad, IPhone, Touchpad, Kindle, Olympic Games, Cricket, FIFA, Tennis, Mount Everest, Eiffel Tower, Oxford Street, Nürburgring, Haiti, Chile, Libya, Egypt, Middle East, Earthquake, Oil spill, Tsunami, Subprime mortgage crisis, Bailout, Terrorism, Asperger syndrome, McDonald's, Vitamin D, Appendicitis, Cholera, Influenza, Pertussis, Vaccine, Childbirth
      3 labels per query–result pair; gold standard; quality control.
      Example top-5 results for "Steve Jobs":
      •  Yahoo! Answers: Jon Rubinstein, Timothy Cook, Kane Kramer, Steve Wozniak, Jerry York
      •  Wikipedia: System 7, PowerPC G4, SuperDrive, Power Macintosh, Power Computing Corp.
      Annotator agreement (overlap): 0.85; average overlap in the top 5 results: < 1.
      The two evaluation measures are sketched below.
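The two measures in the table, computed from binary relevance labels on toy data (MAP is simply this average precision averaged over queries):

```python
def precision_at_k(ranked, relevant, k=5):
    """Fraction of the top-k results that are relevant."""
    return sum(1 for r in ranked[:k] if r in relevant) / k

def average_precision(ranked, relevant):
    """Mean of precision values at each rank where a relevant hit occurs."""
    hits, score = 0, 0.0
    for i, r in enumerate(ranked, start=1):
        if r in relevant:
            hits += 1
            score += hits / i
    return score / max(len(relevant), 1)

ranked = ["e1", "e2", "e3", "e4", "e5"]      # toy result list
relevant = {"e1", "e3", "e4"}                # toy relevance labels
print(precision_at_k(ranked, relevant))      # 3 relevant in top 5 -> 0.6
print(average_precision(ranked, relevant))   # (1/1 + 2/3 + 3/4) / 3 ≈ 0.806
```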
  39. Serendipity: "making fortunate discoveries by accident"
      Serendipity = unexpectedness + relevance
      •  |relevant & unexpected| / |unexpected| – the number of serendipitous results out of all unexpected results retrieved
      •  |relevant & unexpected| / |retrieved| – serendipitous results out of all results retrieved
      "Expected"-result baselines from web search:
      •  Top: 5 entities that occur most frequently in top-5 search results from Bing and Google – WP 0.63 (0.58), YA 0.69 (0.63)
      •  Top – WP: same as above, but excluding the Wikipedia page from the results – WP 0.63 (0.58), YA 0.70 (0.64)
      •  Rel: top 5 entities in the related query suggestions provided by Bing and Google – WP 0.64 (0.61), YA 0.70 (0.65)
      •  Rel + Top: union of Top and Rel – WP 0.61 (0.54), YA 0.68 (0.57)
      The two ratios are computed directly below.
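A direct computation of the two serendipity ratios, given per-result relevance and unexpectedness judgments (all sets here are toy data):

```python
retrieved = {"r1", "r2", "r3", "r4", "r5"}
relevant  = {"r1", "r2", "r4"}
expected  = {"r1", "r3"}           # e.g. results also surfaced by web search
unexpected = retrieved - expected

serendipitous = relevant & unexpected           # relevant AND unexpected
print(len(serendipitous) / len(unexpected))     # 2/3: share of unexpected
print(len(serendipitous) / len(retrieved))      # 2/5: share of retrieved
```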
  40. Interestingness ≠ Relevance
      Interesting > Relevant:
      •  Oil Spill → Penguins in Sweaters (WP)
      •  Robert Pattinson → Water for Elephants (WP)
      •  Lady Gaga → Britney Spears (WP)
      Relevant > Interesting:
      •  Egypt → Cairo Conference (WP)
      •  Netflix → Blu-ray Disc (YA)
      •  Egypt → Ptolemaic Kingdom (WP & YA)
  41. Assessing "interestingness"
      Following (Arguello et al, 2011):
      1.  Labelers provide pairwise comparisons between results
      2.  These are combined into a reference ranking
      3.  The result ranking is compared to the reference ranking using Kendall's tau
      Similarity (Kendall's tau-b) between result sets and the reference ranking:
      •  Which result is more relevant to the query? – WP 0.162, YA 0.336
      •  If someone is interested in the query, would they also be interested in the result? – WP 0.162, YA 0.312
      •  Even if you are not interested in the query, is the result interesting to you personally? – WP 0.139, YA 0.324
      •  Would you learn anything new about the query from the results? – WP 0.167, YA 0.307
      A minimal tau-b computation is sketched below.
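A minimal tau-b computation with SciPy on toy ranks. Tau-b is the variant that handles the ties which pairwise-derived reference rankings typically contain:

```python
from scipy.stats import kendalltau

reference = [1, 2, 3, 4, 5]   # reference rank of five results (toy data)
system    = [2, 1, 3, 3, 5]   # system rank for the same results (note tie)

tau, p_value = kendalltau(reference, system, variant="b")
print(round(tau, 3), round(p_value, 3))  # 1.0 = identical orderings
```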
  42. [image-only slide]
  43. Take-away messages
      •  Search is not just about specific information needs
      •  People search for many other reasons:
         –  navigation
         –  transaction
         –  fun (ECIR 2012 workshop)
         –  etc.
      •  Engagement in search means viewing search activities as part of the user's current overall task
      •  We never know what we will get if we are ready to explore
         –  Users do things that no one expects, not even them! (like staying inside Yahoo! despite having many links to go elsewhere)
         –  So a link is not everything, for search too!
      •  Summarizing: we need to look at engagement in a broader way
  44. Thank you
      Acknowledgements: Mounia Lalmas, Janette Lehmann, Georges Dupret, Ilaria Bordino, Yelena Mejova and Elad Yom-Tov.