Measuring Web User Engagement: a cauldron of many things.



In the online world, user engagement refers to the quality of the user experience that emphasizes the positive aspects of the interaction with a web application and, in particular, the phenomena associated with wanting to use that application longer and frequently. User engagement is a multifaceted, complex phenomenon; this gives rise to a number of potential approaches for its measurement. Common ways of measuring user engagement include: self-reporting (e.g., questionnaires); observational methods (e.g., facial expression analysis, speech analysis, desktop actions); and web analytics using online behavior metrics that assess users’ depth of engagement with a site. These methods represent various tradeoffs between the scale of data analyzed and the depth of understanding. For instance, surveys are small-scale but deep, whereas clicks can be collected on a large scale but provide shallow understanding. However, little is known about how to validate these measurements and relate them to one another. This talk will present various efforts aiming at combining techniques from web analytics (in particular clicks) with existing work on user engagement coming from the domains of information science, multimodal human computer interaction and cognitive psychology.

This is a revised presentation of a keynote given at TPDL 2012. New work includes online multi-tasking and an exploration of mouse movement.

Published in: Technology



  • 1. Measuring Web User Engagement: a cauldron of web analytics, focused attention, positive affect, user interest, saliency, mouse movement & multi-tasking. Mounia Lalmas, Yahoo! Labs Barcelona
  • 2. Click-through rate as proxy of user engagement!
  • 3. Click-through rate as proxy of relevance! Multimedia search activities are often driven by entertainment needs, not by information needs. M. Slaney, Precision-Recall Is Wrong for Multimedia, IEEE Multimedia Magazine, 2011.
  • 4. Click-through rate as proxy of user satisfaction! I just wanted the phone number … I am totally satisfied.
  • 5. In this talk – results, messages & questions: 1. Big data and in-depth focused user studies, “a must”! 2. Users “multi-task” online, what does this mean? 3. Mouse movement is hard to “experiment with” and/or “interpret”. 4. Using crowd-sourcing “I think” worked fine.
  • 6. This talk is not about aesthetics … but see later (e-commerce – skating)
  • 7. This talk is not about usability (music repository)
  • 8. User Engagement – connecting three sides. User engagement is a quality of the user experience that emphasizes the positive aspects of interaction – in particular the fact of being captivated by the technology. Successful technologies are not just used, they are engaged with. User feelings: happy, sad, excited, …; user mental states: concentrated, lost, involved, …; user interactions: click, read, comment, recommend, buy, … The emotional, cognitive and behavioural connection that exists, at any point in time and over time, between a user and a technological resource. S. Attfield, G. Kazai, M. Lalmas and B. Piwowarski. Towards a science of user engagement (Position Paper), WSDM Workshop on User Modelling for Web Applications, 2011.
  • 9. Characteristics of user engagement: Focused Attention; Novelty; Positive Affect; Richness & Control; Aesthetics; Reputation, Trust & Expectation; Endurability; Motivation, Interests, Incentives & Benefits. H.L. O'Brien. Defining and Measuring Engagement in User Experiences with Technology. PhD Thesis, 2008. H.L. O'Brien & E.G. Toms. JASIST 2008, JASIST 2010.
  • 10. Measuring user engagement
  • 11. Connecting three measurement approaches. [Diagram: USER ENGAGEMENT at the centre of three measurement approaches – interaction engagement, self-reported engagement and cognitive engagement.]
  • 12. Models of user engagement … towards a taxonomy? [Diagram: interaction, self-reported and cognitive engagement.]
  • 13. Models of user engagement. Online sites differ concerning their engagement! Games: users spend much time per visit. Search: users come frequently and do not stay long. Social media: users come frequently and stay long. Special: users come on average once. Service: users visit the site when needed. News: users come periodically.
  • 14. Data and Metrics. Interaction data, 2M users, July 2011, 80 US sites. Popularity – #Users: number of distinct users; #Visits: number of visits; #Clicks: number of clicks. Activity – ClickDepth: average number of page views per visit; DwellTimeA: average time per visit. Loyalty – ActiveDays: number of days a user visited the site; ReturnRate: number of times a user visited the site; DwellTimeL: average time a user spends on the site.
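The eight metrics on slide 14 are straightforward aggregates over a visit log. As a minimal sketch (the per-visit dict format and field names here are hypothetical, not the paper's actual schema), they could be computed as:

```python
from collections import defaultdict

def engagement_metrics(visits):
    """Popularity/activity/loyalty metrics from slide 14.

    `visits` is a hypothetical list of per-visit records:
    {"user": str, "day": int, "clicks": int, "page_views": int, "dwell_secs": float}
    Loyalty metrics are averaged over users.
    """
    users = {v["user"] for v in visits}
    n_visits = len(visits)

    per_user_days = defaultdict(set)    # distinct active days per user
    per_user_visits = defaultdict(int)  # visit count per user
    per_user_dwell = defaultdict(float) # total dwell time per user
    for v in visits:
        per_user_days[v["user"]].add(v["day"])
        per_user_visits[v["user"]] += 1
        per_user_dwell[v["user"]] += v["dwell_secs"]

    n = len(users)
    return {
        # Popularity
        "#Users": n,
        "#Visits": n_visits,
        "#Clicks": sum(v["clicks"] for v in visits),
        # Activity (per visit)
        "ClickDepth": sum(v["page_views"] for v in visits) / n_visits,
        "DwellTimeA": sum(v["dwell_secs"] for v in visits) / n_visits,
        # Loyalty (per user)
        "ActiveDays": sum(len(d) for d in per_user_days.values()) / n,
        "ReturnRate": sum(per_user_visits.values()) / n,
        "DwellTimeL": sum(per_user_dwell.values()) / n,
    }
```

In practice these would be computed per site, producing one 8-dimensional metric vector per site for the clustering step that follows.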
  • 15. Methodology. Two model families: general models (8 metrics, 8 dimensions) and time-based models (8 metrics per time span – weekdays, weekend – giving 16 dimensions). Kernel k-means with a Kendall tau rank correlation kernel; number of clusters based on the eigenvalue distribution of the kernel matrix; significant metric values identified with Kruskal-Wallis/Bonferroni. Resulting clusters (models): 6 general, 5 time-based. Analysing cluster centroids = models.
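The kernel on slide 15 compares two sites by the rank correlation of their metric vectors rather than by their raw values. A minimal sketch of the Kendall tau kernel (ignoring ties, which a production implementation would handle, e.g. via `scipy.stats.kendalltau`):

```python
import itertools

def kendall_tau(x, y):
    """Kendall tau rank correlation between two metric vectors.
    Counts concordant minus discordant pairs, normalised by the
    total number of pairs; no tie correction."""
    n = len(x)
    concordant = discordant = 0
    for i, j in itertools.combinations(range(n), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

def kernel_matrix(sites):
    """Pairwise Kendall tau kernel over the sites' metric vectors;
    kernel k-means then clusters sites using this matrix."""
    return [[kendall_tau(a, b) for b in sites] for a in sites]
```

Using ranks makes the kernel insensitive to the very different scales of the eight metrics (e.g. #Clicks vs DwellTime), which is presumably why a rank correlation kernel was chosen over a Euclidean one.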
  • 16. Models of user engagement [6 general]: interest-specific, periodic, media, e-commerce, media (daily), search – models based on engagement metrics only. Popularity, activity and loyalty are independent of each other. Popularity and loyalty are influenced by external and internal factors, e.g. frequency of publishing new information, events, personal interests. Activity depends on the structure of the site.
  • 17. Time-based [5 models]. Models based on engagement over weekdays and weekend: work-related, daily news, hobbies, interest-specific, weather. Time-based models ≠ general models. Next: put all and more together! Let machine learning tell you more!
  • 18. Models of user engagement – Recap & Next. User engagement is complex and standard metrics capture only a part of it. User engagement depends on time (and users). First step towards a taxonomy of models of user engagement … and associated metrics. Next: more sites, more models; user demographics, time of the day, geo-location, etc.; online multi-tasking. J. Lehmann, M. Lalmas, E. Yom-Tov and G. Dupret. Models of User Engagement, UMAP 2012.
  • 19. Online multi-tasking. [Diagram: interaction, self-reported and cognitive engagement.]
  • 20. Online multi-tasking. 181K users, 2 months of browser data, 600 sites, 4.8M sessions. Only 40% of the sessions have no site revisitation. Hyperlinking, backpaging and teleporting (fictitious navigation between sites within an online session): leaving a site is not a “bad thing”! Users spend more and more of their online session multi-tasking, e.g. emailing, reading news, searching for information. ONLINE MULTI-TASKING: navigating between sites, using browser tabs, bookmarks, etc.; seamless integration of social network platforms into many services.
  • 21. Navigating between sites – hyperlinking, backpaging and teleporting. The number of backpaging actions is an under-estimate!
  • 22. Revisitation and navigation patterns
  • 23. Online multi-tasking – Some results. 48% of sites were visited at least 9 times. Revisitation “level” depends on the site. 10% of users accessed a site 9+ times (23% for search sites); 28% at least four times (44% for search sites). Activity on a site decreases with each revisit, but activity on many search and adult sites increases. Backpaging usually increases with each revisit, but hyperlinking remains an important means to navigate between sites.
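Revisitation figures like those on slide 23 reduce to counting how often each site appears across session traces. A minimal sketch, assuming a hypothetical session format (each session a list of site identifiers in visit order):

```python
from collections import Counter

def revisitation_stats(sessions):
    """Share of distinct sites visited at least k times, in the spirit of
    slide 23 (e.g. "48% of sites visited at least 9 times").

    `sessions` is a hypothetical list of site-visit sequences, e.g.
    [["mail", "news", "mail"], ["search", "shop", "search"]].
    """
    counts = Counter(site for session in sessions for site in session)

    def share(k):
        # Fraction of distinct sites whose total visit count is >= k.
        return sum(1 for c in counts.values() if c >= k) / len(counts)

    return {"visited_4+": share(4), "visited_9+": share(9)}
```

The per-user variants reported on the slide (10% of users accessed a site 9+ times) would apply the same counting within each user's sessions separately.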
  • 24. Online multi-tasking – Recap & Next. J. Lehmann, M. Lalmas & G. Dupret. Online Multi-Tasking and User Engagement. Submitted for publication, 2013.
  • 25. Focused attention, positive affect & saliency. [Diagram: interaction, self-reported and cognitive engagement.]
  • 26. Saliency, attention and positive affect. How the visual catchiness (saliency) of “relevant” information impacts user engagement metrics such as focused attention and emotion (affect). Focused attention refers to the exclusion of other things; affect relates to the emotions experienced during the interaction. Saliency model of visual attention developed by Itti and Koch. L. Itti and C. Koch. A saliency-based search mechanism for overt and covert shifts of visual attention. Vision Research, 40, 2000.
  • 27. Manipulating saliency non-salient condition salient condition Web page screenshot Saliency maps
  • 28. Study design. 8 tasks = finding the latest news or headline on a celebrity or entertainment topic. Affect measured pre- and post-task using the Positive (e.g. “determined”, “attentive”) and Negative (e.g. “hostile”, “afraid”) Affect Schedule (PANAS). Focused attention measured with a 7-item focused attention subscale (e.g. “I was so involved in my news tasks that I lost track of time”, “I blocked things out around me when I was completing the news tasks”) and perceived time. Interest level in topics (pre-task) and questionnaire (post-task), e.g. “I was interested in the content of the web pages”, “I wanted to find out more about the topics that I encountered on the web pages”. 189 (90+99) participants from Amazon Mechanical Turk.
  • 29. Saliency and positive affect. When headlines are visually non-salient, users are slow at finding them, report more distraction due to web page features, and show a drop in affect. When headlines are visually catchy or salient, users find them faster, report that it is easy to focus, and maintain positive affect. Saliency is helpful in task performance, focusing/avoiding distraction and in …
  • 30. Saliency and focused attention. Adapted the focused attention subscale from the online shopping domain to the entertainment news domain. Users reported it was “easier to focus in the salient condition”, BUT there was no significant improvement in the focused attention subscale and no difference in perceived time spent on tasks. User interest in web page content is a good predictor of focused attention, which in turn is a good predictor of positive affect.
  • 31. Saliency and user engagement – Recap & Next. The interaction of saliency, focused attention, and affect, together with user interest, is complex. Next: include web page content as a quality of user engagement in the focused attention scale; a more “realistic” (interactive) user reading experience; bio-metrics (mouse-tracking, eye-tracking, facial expression, etc.). L. McCay-Peet, M. Lalmas, V. Navalpakkam. On saliency, affect and focused attention, CHI 2012.
  • 32. Mouse tracking, positive affect, attention. [Diagram: interaction, self-reported and cognitive engagement.]
  • 33. Mouse tracking … and user engagement. 324 users from Amazon Mechanical Turk (between-subject design). Two domains (BBC and Wikipedia); two tasks (reading and quiz); “normal vs ugly” interface. Questionnaires (qualitative data): focused attention, positive affect, novelty, interest, usability, aesthetics; plus demographics, handedness & hardware. Mouse tracking (quantitative data): movement speed, movement rate, click rate, pause length, percentage of time still.
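The quantitative features on slide 33 can all be derived from a time-stamped cursor event stream. A minimal sketch, assuming a hypothetical event format and a chosen stillness threshold (both assumptions, not the study's actual pipeline):

```python
import math

def cursor_features(events, still_gap=0.5):
    """Cursor features in the spirit of slide 33.

    `events` is a hypothetical time-ordered list of (t_secs, x, y, kind)
    tuples, with kind in {"move", "click"}. `still_gap` is an assumed
    threshold: inter-event gaps longer than it count as pauses.
    """
    duration = events[-1][0] - events[0][0]
    moves = [e for e in events if e[3] == "move"]
    clicks = [e for e in events if e[3] == "click"]

    # Total path length of the cursor, in pixels.
    dist = sum(math.hypot(b[1] - a[1], b[2] - a[2])
               for a, b in zip(moves, moves[1:]))

    # Pauses: inter-event gaps exceeding the stillness threshold.
    pauses = [b[0] - a[0] for a, b in zip(events, events[1:])
              if b[0] - a[0] > still_gap]
    still = sum(pauses)

    return {
        "movement_speed": dist / duration,       # pixels per second
        "movement_rate": len(moves) / duration,  # move events per second
        "click_rate": len(clicks) / duration,    # clicks per second
        "mean_pause_len": still / len(pauses) if pauses else 0.0,
        "pct_time_still": still / duration,      # fraction of time paused
    }
```

Real cursor loggers sample at a fixed rate, so the stillness threshold would be tuned to the sampling interval; the point is that each questionnaire-side construct is matched by a cheap behavioural aggregate on the tracking side.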
  • 34. “Ugly” vs “Normal” Interface (BBC News)
  • 35. “Ugly” vs “Normal” (Wikipedia)
  • 36. Mouse tracking can tell about: age; hardware (mouse vs trackpad); task. Searching: “There are many different types of phobia. What is Gephyrophobia a fear of?” Reading: (Wikipedia) Archimedes, Section 1: Biography.
  • 37. Mouse tracking could not tell much on
  • 38. Mouse tracking and user engagement — Recap & Next. High level of ecological validity. Age, task, and hardware. Do we have a Hawthorne Effect??? “Usability” vs engagement. “Even uglier” interface? I don’t think so. Within- vs between-subject design? Next: sequence of movements; automatic clustering. D. Warnock and M. Lalmas. An Exploration of Cursor Tracking Data. Submitted for publication, 2013.
  • 39. Connecting three measurement approaches. The value of a click? [Diagram: interaction, self-reported and cognitive measurement.]
  • 40. Thank you. Collaborators: Ioannis Arapakis, Ricardo Baeza-Yates, Georges Dupret, Janette Lehmann, Lori McCay-Peet (Dalhousie University), Vidhya Navalpakkam, David Warnock (Glasgow University), Elad Yom-Tov, and many others at Yahoo! Labs. Contact: