Analytics Update: Affective Computing - Cheil Innovation Session July 2013
 


Affective Computing (computers understanding and responding to our emotions) may seem a little scary, but actually the technology is really quite exciting. In this presentation our Strategic Analyst, Tara Davanzati takes us through what it is, what's new with it, and how it's being used in advertising.

  • I recently read a story in the New York Times about a woman in the car with her family, all shouting (not at each other, but at the iPhone's navigation system). When she interrupted to point out that the phone couldn't tell they were shouting and upset, so what was the point, her family's response was that it should. Tech is becoming faster and more intuitive, but can't it also understand our feelings? Link: NYTimes – If Our Gadgets Could Measure Our Emotions http://www.nytimes.com/2013/06/02/technology/if-our-gadgets-could-measure-our-emotions.html?pagewanted=all
  • It seems the term already exists, and the world of tech innovation is finally starting to make sense of our emotions and put them to use. Source: Wikipedia – Affective Computing http://en.wikipedia.org/wiki/Affective_computing
  • In marketing, we already know that using emotion to convey messages and inspire purchase decisions is effective. In fact, it's 15% more effective than rational messaging. And we all know the success John Lewis has had with its Christmas adverts – the latest one helped deliver sales of £142m. So there have been new developments in the marketing world aimed at understanding our emotional responses to adverts. Links: Neuroscience Marketing – Emotional Ads Work Best http://www.neurosciencemarketing.com/blog/articles/emotional-ads-work-best.htm | Marketing Week – John Lewis Festive Social Strategy 'Most Effective' http://www.marketingweek.co.uk/news/john-lewis-festive-social-strategy-most-effective/4005147.article
  • A company called Affectiva, a spin-off founded by some amazing people at the M.I.T. Media Lab, has created Affdex. If you've seen it on Forbes.com (and you still can if you search "Interactive Smile"), you'll find all the latest Super Bowl ads; when you click on an ad, it enables your webcam and then watches you as the ad plays. Links: Interactive: Analyse Your Smile http://www.forbes.com/2011/02/28/detect-smile-webcam-affectiva-mit-media-lab.html | http://www.affdex.com/
  • The technology uses facial recognition to understand your engagement with the ad: surprise, smile, confusion, dislike, attention, and valence (the overall positive or negative character of your response). We have some 10,000 different expressions, and each is made up of specific signals, such as raised or furrowed eyebrows, a smile, and so on. Facial recognition technology is beginning to read the visual cues we use in conversation to show the people we're talking to that we like and acknowledge what we're hearing. Link: http://www.youtube.com/watch?v=4d8ZDSyFS2g
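The expression-to-engagement mapping described above can be sketched as a weighted scoring of facial signals ("action units"). This is a toy illustration only: the unit names, emotion labels, and weights below are invented for the example, not Affectiva's actual model.

```python
# Toy sketch: score emotions for one video frame from detected
# facial signals. All names and weights are illustrative.

# Observed facial signals for one frame: signal -> intensity (0..1)
frame = {"smile": 0.9, "brow_raise": 0.3, "brow_furrow": 0.0}

# Each emotion is scored as a weighted sum of facial signals.
EMOTION_WEIGHTS = {
    "joy":       {"smile": 1.0, "brow_furrow": -0.5},
    "surprise":  {"brow_raise": 1.0},
    "confusion": {"brow_furrow": 1.0, "brow_raise": 0.4},
}

def score_emotions(frame):
    scores = {}
    for emotion, weights in EMOTION_WEIGHTS.items():
        total = sum(weights.get(unit, 0.0) * value
                    for unit, value in frame.items())
        scores[emotion] = max(0.0, min(1.0, total))  # clamp to 0..1
    return scores

print(score_emotions(frame))
```

A real system would first locate the face and extract these signal intensities from pixels, frame by frame, then aggregate the per-frame scores into the engagement curve shown for the ad.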
  • What we get is an actual engagement figure from people who viewed our ad. This is a completely opt-in service, as I imagine viewers would have a hard time letting advertisers view and record their faces, but it is a world better than the engagement and ad-effectiveness measures we currently use. Seen on the right is how we find out whether people liked our latest campaign: we go through mentions of the ad on social media and decide whether each is a positive comment, a negative one, or somewhere in between. The usual finding is that most sit in this 'in between' band. If we could get clues from facial recognition, we might see that people are interested, but not necessarily overwhelmed with joy. Link: Interactive: Analyse Your Smile http://www.forbes.com/2011/02/28/detect-smile-webcam-affectiva-mit-media-lab.html
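The "traditional" measure described above, bucketing social mentions into positive, negative, or in between, can be sketched with simple keyword matching. The word lists are illustrative only; real sentiment tools are far more sophisticated.

```python
# Toy sketch of the manual social-listening approach: classify each
# mention of an ad as positive, negative, or in between. The word
# lists are made up for illustration.

POSITIVE = {"love", "great", "brilliant", "amazing"}
NEGATIVE = {"hate", "awful", "boring", "annoying"}

def classify_mention(text):
    words = set(text.lower().split())
    pos = len(words & POSITIVE)
    neg = len(words & NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "in between"  # the band most mentions land in

mentions = [
    "love the new christmas ad",
    "that advert was so boring",
    "saw the ad on tv last night",
]
print([classify_mention(m) for m in mentions])
```

Note how the neutral bucket swallows everything without an opinion word, which is exactly the "mostly in between" problem the facial-recognition data could help with.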
  • So it's all well and good to scan our facial responses, but what about the less noticeable things we do? Our bodies respond to the environment and outside stressors in involuntary ways. The sympathetic division of the autonomic nervous system (responsible for our fight-or-flight response) gives us some good indicators of how we're reacting to what we're seeing and dealing with. As the diagram on the right shows, our pupils dilate, we start to sweat, and our heart rate goes up. What if we could measure these responses?
  • Well, developments at Microsoft with the Xbox One Kinect are doing just that. This is the first step in developing emotional insights in real time. The new Kinect is a vast improvement on its predecessor. New Scientist has reported that the new Xbox will have HD colour and infrared cameras "that can see if your eyes are open or closed in the dark. It will be able to detect your pulse from fluctuations in skin tone and, by measuring how light reflects off your face, it will know when you start to sweat." Link: New Scientist – Latest Kinect Sensors Allow Games to Feed Off Your Fear http://www.newscientist.com/article/mg21829195.900-latest-kinect-sensors-allow-games-to-feed-off-your-fear.html#.Ue_YQo2R_pU
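The pulse-from-skin-tone idea quoted above boils down to finding the dominant frequency in a brightness signal sampled from the face. A minimal sketch, assuming one brightness reading per video frame; the naive Fourier search below illustrates the principle (remote photoplethysmography) and is not Kinect's actual method.

```python
# Toy sketch: estimate heart rate from a per-frame skin-brightness
# signal by finding the strongest frequency in the plausible pulse
# band (0.7-3 Hz, i.e. 42-180 bpm).
import math

def estimate_bpm(signal, fps):
    n = len(signal)
    mean = sum(signal) / n
    centred = [s - mean for s in signal]
    best_freq, best_power = 0.0, 0.0
    # Naive discrete Fourier transform, restricted to the pulse band.
    for k in range(1, n // 2):
        freq = k * fps / n
        if not 0.7 <= freq <= 3.0:
            continue
        re = sum(c * math.cos(2 * math.pi * k * i / n)
                 for i, c in enumerate(centred))
        im = sum(c * math.sin(2 * math.pi * k * i / n)
                 for i, c in enumerate(centred))
        power = re * re + im * im
        if power > best_power:
            best_freq, best_power = freq, power
    return best_freq * 60  # Hz -> beats per minute

# Synthetic 10-second clip at 30 fps with a faint 1.2 Hz (72 bpm)
# flicker riding on a constant brightness of 100.
fps, seconds = 30, 10
signal = [100 + 0.5 * math.sin(2 * math.pi * 1.2 * t / fps)
          for t in range(fps * seconds)]
print(round(estimate_bpm(signal, fps)))  # -> 72
```

In practice the brightness fluctuation is tiny and buried in noise, so real systems average over skin regions and filter heavily before this frequency step.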
  • Games using this technology would be able to use the data (heart rate, sweating, etc.) to alter your experience in real time. So if you're really pumped about killing off a bunch of aliens, the game might produce more to keep you interested, or fewer if it looks like you're sweating too much. Furthermore, if the technology incorporated pupil dilation (which is likely not far off), it could detect whether you've hit the limit of your mental capacity. According to New Scientist, "your pupils dilate when you are engaged in a challenge and return to normal when you've given up because something is tedious or too hard. If Kinect could detect pupil sizes it would be able to scale the game's difficulty in real-time." A system that notices you're at your wits' end with a baddie might offer ways to help you, like prompting you to buy extra lives or better weapons… or, if you're like me, perhaps a game that's a little less stressful, like Kinectimals. Link: New Scientist – Latest Kinect Sensors Allow Games to Feed Off Your Fear http://www.newscientist.com/article/mg21829195.900-latest-kinect-sensors-allow-games-to-feed-off-your-fear.html#.Ue_YQo2R_pU
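The real-time difficulty scaling described above might look something like the loop below; the thresholds, readings, and difficulty range are entirely made up for illustration.

```python
# Toy sketch: nudge game difficulty (1..10) from biometric readings.
# Overwhelmed players get relief; bored players get a challenge.
# All thresholds are invented for the example.

def adjust_difficulty(difficulty, heart_rate, sweating, pupils_dilated):
    if heart_rate > 120 or sweating:
        return max(1, difficulty - 1)   # overwhelmed: ease off
    if heart_rate < 70 and not pupils_dilated:
        return min(10, difficulty + 1)  # bored or disengaged: ramp up
    return difficulty                   # engaged: leave it alone

print(adjust_difficulty(5, heart_rate=130, sweating=True, pupils_dilated=True))
print(adjust_difficulty(5, heart_rate=65, sweating=False, pupils_dilated=False))
```

A shipping game would smooth these readings over time rather than react to single samples, otherwise one startled heartbeat would whipsaw the difficulty.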
  • If the technology can exist for gaming, there's not much stopping it on other emerging and mass-market platforms, like mobile, where gaming is huge. As the NY Times article about the frustrating iPhone navigation system states, "Gadgets would be more efficient if they could respond when we're frustrated, bored or too busy to be interrupted, yet they would also be intrusive in ways we can't even fathom today. It sounds like a science-fiction movie, and in some ways it is. Much of this technology is still in its early stages, but it's inching closer to reality." Some may shudder at the idea, but frankly I'm excited to no end. Link: NYTimes – If Our Gadgets Could Measure Our Emotions http://www.nytimes.com/2013/06/02/technology/if-our-gadgets-could-measure-our-emotions.html?pagewanted=all

Presentation Transcript

  • © 2013 Cheil Europe Ltd. All rights reserved. Affective Computing is the study and development of systems and devices that can recognize, interpret, process, and simulate human affects. In other words: a device that measures and responds to your emotions. We will never fully reach artificial intelligence until we build emotional intelligence.
  • Emotion sells. The John Lewis Christmas ad helped the department store deliver its best-ever weekly performance, with sales reaching £142m. It also posted record online sales, 46 per cent ahead of the previous year.
  • Thanks! Want to know more, or have a chat about the latest analytics? Get in touch: Tara.davanzati@cheil.com