User engagement in the digital world

  1. User Engagement in the Digital World Mounia Lalmas Yahoo! Labs Barcelona mounia@acm.org
  2. A bit about myself
     1999-2008: Lecturer (assistant professor) to Professor at Queen Mary, University of London
     2008-2010: Microsoft Research/RAEng Research Professor at the University of Glasgow
     2011-: Visiting Principal Scientist at Yahoo! Labs Barcelona
     Research topics: XML/structured retrieval and evaluation (INEX); quantum theory to model interactive information retrieval; aggregated search; bridging the digital divide; models and measures of user engagement
  3. Outline • Characteristics and measurement • (my) Vision and focus • Some results … and ideas • What next?
  5. User Engagement – connecting three sides
     User engagement is a quality of the user experience that emphasizes the positive aspects of interaction – in particular the fact of being captivated by the technology. Successful technologies are not just used, they are engaged with.
     User feelings: happy, sad, excited, bored, … User mental states: concentrated, lost, involved, … User interactions: click, read, comment, recommend, buy, …
     The emotional, cognitive and behavioural connection that exists, at any point in time and over time, between a user and a technological resource.
     S. Attfield, G. Kazai, M. Lalmas and B. Piwowarski. Towards a science of user engagement (Position Paper), WSDM Workshop on User Modelling for Web Applications, 2011.
  6. Would a user engage with this web site? http://www.nhm.ac.uk/
  7. Would a user engage with this web site? (content) http://www.amazingthings.org/ (art event calendar)
  8. Would a user engage with this web site? (aesthetics) http://www.lowpriceskates.com/ (e-commerce – skating)
  9. Would a user engage with this web site? (navigation) http://chiptune.com/ (music repository)
  10. Would a user engage with this web site? (navigation) http://www.theosbrinkagency.com/ (photographer)
  11. Characteristics of user engagement (I)
     Focused attention: Users must be focused to be engaged; distortions in the subjective perception of time are used to measure it.
     Positive affect: Emotions experienced by the user are intrinsically motivating; an initial affective hook can induce a desire for exploration, active discovery or participation.
     Aesthetics: Sensory, visual appeal of the interface stimulates and promotes focused attention; linked to design principles (e.g. symmetry, balance, saliency).
     Endurability: People remember enjoyable, useful, engaging experiences and want to repeat them; reflected in e.g. the propensity of users to recommend an experience/a site/a product.
  12. Characteristics of user engagement (II)
     Novelty: Novelty, surprise, unfamiliarity and the unexpected appeal to user curiosity, encourage inquisitive behavior and promote repeated engagement.
     Richness and control: Richness captures the growth potential of an activity; control captures the extent to which a person is able to achieve this growth potential.
     Reputation, trust and expectation: Trust is a necessary condition for user engagement; an implicit contract among people and entities which is more than technological.
     Motivation, interests, incentives, and benefits: Why should the user engage? Difficulties in setting up "laboratory" style experiments.
  13. Forrester Research – The four I's
     Involvement: Presence of a user; measured by e.g. number of visitors, time spent.
     Interaction: Action of a user; measured by e.g. CTR, online transactions, uploaded photos or videos.
     Intimacy: Affection or aversion of a user; measured by e.g. satisfaction ratings, sentiment analysis in blogs, comments, surveys, questionnaires.
     Influence: Likelihood that a user advocates; measured by e.g. forwarded content, invitations to join.
     Measuring Engagement, Forrester Research, June 2008.
  14. Measuring user engagement
     Self-reported engagement – Measures: questionnaires, interviews, reports, product reaction cards. Characteristics: subjective; user study (lab/online); mostly qualitative.
     Cognitive engagement – Measures: task-based methods (time spent, follow-on task); neurological measures (e.g. EEG); physiological measures (e.g. eye tracking, mouse tracking). Characteristics: objective; user study (lab/online); mostly quantitative; scalability an issue?
     Interaction engagement – Measures: web analytics + "data science"; information retrieval metrics + user models. Characteristics: objective; data study; quantitative; large scale.
  15. Objective measures – Online activities as a proxy of user engagement
  16. Online measures as a proxy of user engagement! Measuring user engagement and interpreting metrics is hard!
  17. Online measures as a proxy of user engagement! Multimedia search activities are often driven by entertainment needs, not by information needs. M. Slaney, Precision-Recall Is Wrong for Multimedia, IEEE Multimedia Magazine, 2011.
  18. Outline  Characteristics and measurement  (my) Vision and focus  Some results … and ideas  What next?
  19. What affects user (& site) engagement? Web page & site style? Web page & site content?
  20. User engagement… connecting three sides
     The three sides: emotional + cognitive + behavioral user engagement; site properties: layout + links + saliency + content + sentimentality, …
     Measurements and methodologies, within and across sites: online analytics metrics (dwell time, CTR, …) + complex networks metrics + new metrics + survival analysis + questionnaires, surveys, … + crowd-sourcing + biometrics (eye tracking, mouse tracking, …)
     Goals: models of user engagement + metrics of user engagement
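Survival analysis is named above as one methodology for within-site dwell time. A minimal sketch, assuming the lifelines library and synthetic dwell times (no actual study data):

```python
# Sketch: Kaplan-Meier estimate over per-visit dwell times.
# Synthetic data; the slide only names survival analysis as a methodology.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(0)
dwell_seconds = rng.exponential(scale=60.0, size=1000)  # time on page per visit
observed = rng.random(1000) < 0.9                       # False = censored visit

kmf = KaplanMeierFitter()
kmf.fit(durations=dwell_seconds, event_observed=observed, label="dwell time")
print(kmf.median_survival_time_)      # median dwell time
print(kmf.survival_function_.head())  # P(user still on page at time t)
```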
  21. User Engagement – connecting three measurement approaches [Venn diagram: interaction engagement, self-reported engagement, …]
  22. Diagnostic and what we have done Diagnostic: work exists, but fragmented. In particular: o  What and how to measure depend on services and goals o  Lack of understanding of how to relate subjective and objective measures The rest of this talk: 1.  Models of user engagement 2.  Attention & affect & saliency 3.  Attention & affect & gaze & sentimentality 4.  Attention & affect & mouse tracking (results pending)
  23. Outline  Characteristics and measurement  (my) Vision and focus  Some results … and ideas  What next?
  24. User Engagement – connecting three measurement approaches [Venn diagram: interaction engagement, self-reported engagement, …]
  25. Models of user engagement – Online sites differ concerning their engagement!
     Games: users spend much time per visit. Search: users come frequently and do not stay long. Social media: users come frequently and stay long. Special: users come on average once. Service: users visit the site when needed. News: users come periodically.
     Is it possible to model these differences?
  26. Data and Metrics
     Interaction data: 2M users, July 2011, 80 US sites
     Popularity – #Users: number of distinct users; #Visits: number of visits; #Clicks: number of clicks
     Activity – ClickDepth: average number of page views per visit; DwellTimeA: average time per visit
     Loyalty – ActiveDays: number of days a user visited the site; ReturnRate: number of times a user visited the site; DwellTimeL: average time a user spends on the site
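As a rough illustration of how these eight metrics could be derived, here is a minimal pandas sketch; the log schema (one row per visit with columns site, user, day, page_views, clicks, dwell_seconds) is an assumption, not the actual Yahoo! data format.

```python
# Sketch: the eight engagement metrics per site from an assumed visit log.
import pandas as pd

visits = pd.read_csv("visits.csv")  # hypothetical log file

per_site = visits.groupby("site").agg(
    users=("user", "nunique"),               # Popularity: #Users
    visits=("user", "size"),                 # Popularity: #Visits
    clicks=("clicks", "sum"),                # Popularity: #Clicks
    click_depth=("page_views", "mean"),      # Activity: ClickDepth
    dwell_time_a=("dwell_seconds", "mean"),  # Activity: DwellTimeA
)

per_user_site = visits.groupby(["site", "user"]).agg(
    active_days=("day", "nunique"),          # Loyalty: ActiveDays
    return_rate=("day", "size"),             # Loyalty: ReturnRate
    dwell_time_l=("dwell_seconds", "mean"),  # Loyalty: DwellTimeL
)
per_site = per_site.join(per_user_site.groupby("site").mean())  # average loyalty over users
print(per_site.head())
```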
  27. Methodology
     General models: 8 metrics as dimensions; 6 clusters (models). Time-based models: 8 metrics per time span (weekdays, weekend), i.e. 16 dimensions; 5 clusters (models).
     Kernel k-means with a Kendall tau rank correlation kernel. Number of clusters chosen from the eigenvalue distribution of the kernel matrix. Significant metric values identified with Kruskal-Wallis/Bonferroni. Analysing cluster centroids = models.
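A minimal sketch of this clustering step, assuming each site is represented by its vector of metric values. scikit-learn ships no kernel k-means, so spectral clustering on the same precomputed kernel stands in for it here; shifting Kendall's tau into [0, 1] to obtain a non-negative affinity is also an assumption.

```python
# Sketch: Kendall-tau rank correlation kernel over site metric vectors,
# eigenvalue inspection, then clustering (spectral clustering as a
# stand-in for kernel k-means).
import numpy as np
from scipy.stats import kendalltau
from sklearn.cluster import SpectralClustering

X = np.random.rand(80, 8)  # hypothetical: 80 sites x 8 engagement metrics

n = X.shape[0]
K = np.zeros((n, n))
for i in range(n):
    for j in range(i, n):
        tau, _ = kendalltau(X[i], X[j])
        K[i, j] = K[j, i] = (tau + 1) / 2  # shift tau from [-1, 1] into [0, 1]

# Eigenvalue distribution of the kernel matrix guides the number of clusters.
print(np.sort(np.linalg.eigvalsh(K))[::-1][:10])

labels = SpectralClustering(n_clusters=6, affinity="precomputed",
                            random_state=0).fit_predict(K)
# Reading each cluster centroid (mean metric vector) gives one model.
centroids = np.array([X[labels == c].mean(axis=0) for c in range(6)])
```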
  28. Models of user engagement [6 general]
     Popularity, activity and loyalty are independent of each other. Popularity and loyalty are influenced by external and internal factors, e.g. frequency of publishing new information, events, personal interests. Activity depends on the structure of the site.
     Figure labels: interest-specific, periodic media, e-commerce, media (daily), search. Models based on engagement metrics only.
  29. Time-based [5 models]
     Models based on engagement over weekdays and weekends. Figure labels: work-related, daily news, hobbies, interest-specific, weather.
     Time-based models ≠ general models. Next: put all and more together – let machine learning tell you more!
  30. Models of user engagement – Recap & Next • User engagement is complex and standard metrics capture only a part of it • User engagement depends on time (and users) • First step towards a taxonomy of models of user engagement … and associated metrics • Next • More sites, more models? • Interaction between sites (online multi-tasking) • User demographics, time of the day, geo-location, etc. J. Lehmann, M. Lalmas, E. Yom-Tov and G. Dupret. Models of User Engagement, UMAP 2012.
  31. Online multi-tasking
     July 2011, 25M sessions; avg session length 26 min (sd 44); 1.7 Yahoo! sites and 4.9 external sites per session (sd 3.1 and 8.6). Leaving a site is not a "bad thing"! (fictitious navigation between sites within an online session)
     Users spend more and more of their online session multi-tasking, e.g. emailing, reading news, searching for information → ONLINE MULTI-TASKING: navigating between sites, using browser tabs, bookmarks, etc; seamless integration of social network platforms into many services
  32. User Engagement – connecting three measurement approaches [Venn diagram: interaction engagement, self-reported engagement, …]
  33. Saliency, attention and positive affect • How the visual catchiness (saliency) of "relevant" information impacts user engagement metrics such as focused attention and emotion (affect) • Focused attention refers to the exclusion of other things • Affect relates to the emotions experienced during the interaction • Saliency model of visual attention developed by Itti and Koch. L. Itti and C. Koch. A saliency-based search mechanism for overt and covert shifts of visual attention. Vision Research, 40, 2000.
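As a rough way to reproduce saliency maps like those on the next slide, the sketch below uses OpenCV's spectral-residual saliency detector (from opencv-contrib-python). Note this is Hou & Zhang's algorithm, used here as an accessible stand-in, not the Itti-Koch model the study references.

```python
# Sketch: saliency map of a web page screenshot via OpenCV's
# spectral-residual detector (a stand-in for the Itti-Koch model).
import cv2

image = cv2.imread("screenshot.png")  # hypothetical page screenshot

detector = cv2.saliency.StaticSaliencySpectralResidual_create()
ok, saliency_map = detector.computeSaliency(image)  # float map in [0, 1]
assert ok

cv2.imwrite("saliency_map.png", (saliency_map * 255).astype("uint8"))
```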
  34. Manipulating saliency: non-salient condition vs salient condition. [Figure: web page screenshots and their saliency maps]
  35. Study design
     • 8 tasks = finding the latest news or headline on a celebrity or entertainment topic
     • Affect measured pre- and post-task using the Positive and Negative Affect Schedule (PANAS); positive items e.g. "determined", "attentive"; negative items e.g. "hostile", "afraid"
     • Focused attention measured with the 7-item focused attention subscale, e.g. "I was so involved in my news tasks that I lost track of time", "I blocked things out around me when I was completing the news tasks", and perceived time
     • Interest level in topics (pre-task) and questionnaire (post-task), e.g. "I was interested in the content of the web pages", "I wanted to find out more about the topics that I encountered on the web pages"
     • 189 (90+99) participants from Amazon Mechanical Turk
  36. PANAS (10 positive items and 10 negative items) • You feel this way right now, that is, at the present moment [1 = very slightly or not at all; 2 = a little; 3 = moderately; 4 = quite a bit; 5 = extremely] [randomize items] Negative items: distressed, upset, guilty, scared, hostile, irritable, ashamed, nervous, jittery, afraid. Positive items: interested, excited, strong, enthusiastic, proud, alert, inspired, determined, attentive, active. D. Watson, L.A. Clark, A. Tellegen. Development and validation of brief measures of positive and negative affect: The PANAS Scales. Journal of Personality and Social Psychology, 47, 1988.
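Standard PANAS scoring sums the ten positive and ten negative item ratings separately, so each scale ranges from 10 to 50. A minimal sketch with made-up responses:

```python
# Sketch: scoring PANAS (each item rated 1-5; sums range 10-50 per scale).
POSITIVE = ["interested", "excited", "strong", "enthusiastic", "proud",
            "alert", "inspired", "determined", "attentive", "active"]
NEGATIVE = ["distressed", "upset", "guilty", "scared", "hostile",
            "irritable", "ashamed", "nervous", "jittery", "afraid"]

def panas_scores(ratings: dict[str, int]) -> tuple[int, int]:
    positive = sum(ratings[item] for item in POSITIVE)
    negative = sum(ratings[item] for item in NEGATIVE)
    return positive, negative

# Made-up participant ratings, pre- and post-task:
pre = {item: 3 for item in POSITIVE + NEGATIVE}
post = {**pre, "excited": 5, "attentive": 4, "afraid": 1}
print(panas_scores(pre), panas_scores(post))  # (30, 30) -> (33, 28)
```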
  37. 7-item focused attention subscale (part of the 31-item user engagement scale)
     5-point scale (strongly disagree to strongly agree)
     1. I lost myself in this news tasks experience
     2. I was so involved in my news tasks that I lost track of time
     3. I blocked things out around me when I was completing the news tasks
     4. When I was performing these news tasks, I lost track of the world around me
     5. The time I spent performing these news tasks just slipped away
     6. I was absorbed in my news tasks
     7. During the news tasks experience I let myself go
     H.L. O'Brien. Defining and Measuring Engagement in User Experiences with Technology. PhD Thesis, 2008.
  38. Saliency and positive affect • When headlines are visually non-salient, users are slow at finding them, report more distraction due to web page features, and show a drop in affect • When headlines are visually catchy or salient, users find them faster, report that it is easy to focus, and maintain positive affect • Saliency helps task performance, focusing/avoiding distraction, and maintaining positive affect
  39. Saliency and focused attention • Adapted the focused attention subscale from the online shopping domain to the entertainment news domain • Users reported it "easier to focus in the salient condition" BUT no significant improvement in the focused attention subscale or differences in perceived time spent on tasks • User interest in web page content is a good predictor of focused attention, which in turn is a good predictor of positive affect
  40. Saliency and user engagement – Recap & Next • Interaction of saliency, focused attention, and affect, together with user interest, is complex • Next: • include web page content as a quality of user engagement in the focused attention scale • more "realistic" (interactive) user reading experience • biometrics (mouse tracking, eye tracking, facial expression, etc). L. McCay-Peet, M. Lalmas, V. Navalpakkam. On saliency, affect and focused attention, CHI 2012.
  41. User Engagement – connecting three measurement approaches [Venn diagram: interaction engagement, self-reported engagement, …]
  42. Gaze, sentimentality, interest … and user engagement • News + comments • Sentiment, interest • 57 users (lab-based) • Reading task (114) • Questionnaire (qualitative data) • Record mouse tracking, eye tracking, facial expression, EEG signal (quantitative data). Three metrics: gaze, focused attention and positive affect
  43. Interesting content promotes user engagement metrics • All three metrics: focused attention, positive affect & gaze • What is the right trade-off? (news is news) • Can we predict? provider, editor, writer, category, genre, visual aids, …, sentimentality, … • Role of user-generated content (comments): as a measure of engagement? to promote engagement?
  44. Lots of sentiments but with negative connotations! • Positive affect (and interest, enjoyment and wanting to know more) correlates • positively with sentimentality (lots of emotions) • negatively with positive polarity (happy news). SentiStrength (from -5 to 5 per word): sentimentality = sum of absolute values (amount of sentiment); polarity = sum of values (direction of the sentiment: positive vs negative). M. Thelwall, K. Buckley, G. Paltoglou, Sentiment strength detection for the social web. JASIST, 63(1), 2012.
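The two quantities above are direct sums over SentiStrength-style per-word scores in [-5, 5]. A minimal sketch with a tiny hypothetical lexicon (SentiStrength itself also handles negation, boosters, etc.):

```python
# Sketch: sentimentality and polarity as defined on the slide,
# from per-word sentiment scores in [-5, 5].
LEXICON = {"love": 4, "hate": -4, "sad": -3, "great": 3}  # hypothetical scores

def sentiment_features(text: str) -> tuple[int, int]:
    scores = [LEXICON.get(word, 0) for word in text.lower().split()]
    sentimentality = sum(abs(s) for s in scores)  # amount of sentiment
    polarity = sum(scores)                        # direction: positive vs negative
    return sentimentality, polarity

print(sentiment_features("I love this great story"))         # (7, 7): emotional, happy
print(sentiment_features("love and hate in equal measure"))  # (8, 0): emotional, mixed
```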
  45. Effect of comments on user engagement • 6 rankings of comments: most replied, most popular, newest; sentimentality high, sentimentality low; polarity plus, polarity minus • Longer gaze on: newest and most popular for interesting news; most replied and high sentimentality for non-interesting news • Can we leverage this to prolong user attention?
  46. Gaze, sentimentality, interest – Recap and Next • Interesting and "attractive" content! • Sentiment as a proxy of focused attention, positive affect and gaze? • Next • Larger-scale study • Other domains (beyond daily news!) • Role of social signals (e.g. Facebook, Twitter) • Lots more data: mouse tracking, EEG, facial expression. I. Arapakis, M. Lalmas, B. Cambazoglu, M.-C. Marcos, J. Jose. Examining User Engagement through the Prism of Interest, Sentiment and Gaze, Submitted for Publication, 2012.
  47. User Engagement – connecting three measurement approaches [Venn diagram: interaction engagement, self-reported engagement, …]
  48. Mouse tracking … and user engagement • 400 users from Amazon Mechanical Turk • Two domains (BBC and Wikipedia) • Two tasks (reading and quiz) • "Normal vs Horrible" interface • Questionnaires (qualitative data) • Mouse tracking (quantitative data) • Interaction data (page views, dwell time) • Results pending! … Hawthorne effect!
  49. Mouse tracking … and user engagement (Taxonomy? Correlation vs Causation? Measurement? … )
  50. Outline  Characteristics and measurement  (my) Vision and focus  Some results … and ideas  What next?
  51. [Venn diagram: interaction engagement, self-reported engagement, …]
  52. Outline – user engagement: digital world, digital libraries, mobile & tablet
  53. Thank you Collaborators Ioannis Arapakis Ricardo Baeza-Yates Georges Dupret Janette Lehmann Lori McCay-Peet Vidhya Navalpakkam David Warnock Elad Yom-Tov and many others at Yahoo! Labs