Eye tracking and neuromarketing primer


This presentation was given to the Testing and Assessment Methods class (MS in Human Factors in Information Design) at Bentley University.


  1. Eye Tracking and Neuromarketing Primer: Quantitative to boost the qualitative. Dan Berlin, MBA, MSHFID
  2. Hi! • Psychology and generally geeky background • Graduated from Bentley in 2008 • Two years at One to One Interactive ▫ Usability and neuromarketing studies • Left this past summer to start my own consultancy • dan@berlinconsulting.net • Twitter: @banderlin
  3. Agenda • Eye Tracking ▫ Background ▫ Equipment ▫ Methods ▫ The great debate • Neuromarketing ▫ Background ▫ The Players ▫ The debate continues • War Stories • Q&A
  4. Eye Tracking – history • Eye tracking has been around since the late 19th century
  5. Eye Tracking – equipment • Two main players: Tobii & SMI ▫ Tobii: Based in Sweden, offers the same equipment for scientific research, & has assistive technology products ▫ SMI: Based in Germany, offers high-end & integrated equipment for scientific research
  6. Eye Tracking – equipment • Tobii models: 1750, T60/120, T60 XL, X60/120, Tobii Glasses ▫ 1750 & X60: old technology is old ▫ You probably don’t need 120 Hz ▫ T60 & XL: depends on your needs ▫ Tobii Studio and Axure wireframes do not play nicely together ▫ Glasses: brand new  Depends on IR markers  Only 30 Hz  Small DVR
  7. Eye Tracking – equipment • SMI models: RED & iView X HED ▫ RED: 60/120 Hz (also have a 250 Hz model)  Use a screen up to 300” ▫ iView X HED: up to 200 Hz  Germaphobes: gotta clean that hat  Uses a notebook or subcomputer  No IR markers ▫ Software advantage: moving AOIs & better statistical analysis
  8. When is Eye Tracking Appropriate? • The age-old question… • In usability studies ▫ NOT during think-aloud  It is natural for a participant to look at the moderator  And they will look at the parts of the screen that they are talking about ▫ Does retrospective think-aloud alleviate this?  It asks participants to remember what they were unconsciously thinking  More likely: primacy and recency effects (Michael Summers, TrueAction) ▫ Allocate a few tasks to eye tracking where the user does not think aloud ▫ Avoid bias: make up a story for the calibration
  9. When is Eye Tracking Appropriate? • In benchmark studies ▫ Comparing user behavior across different design or interaction concepts ▫ No think-aloud, just explore the site  Can be done with a static composition or a wireframe  Compare, compare, compare – there are no absolute benchmarks ▫ Use metrics to determine if participants are looking at areas of interest  Not all AOIs are equal – some should be more important to the business ▫ Static pages: 10 to 20 second exposure  Otherwise: the big red blob – they look everywhere • Eye tracking is a tool, not a methodology!
  10. Eye Tracking Output • The typical outputs from eye tracking: ▫ Fixations & duration ▫ Time to 1st fixation ▫ Gaze plots & heat maps ▫ Areas of interest (AOIs) ▫ Pupil dilation
  11. Eye Tracking Metrics • Fixations vs. duration ▫ Basically, they are the same  Both measure levels of active attention and cognition ▫ We will never know if an increased duration indicates confusion or interest ▫ Fixations per second is the traditional measure of active attention • Gaze plots and heat maps ▫ Eye candy and not much else – but clients love them ▫ Bolster your eye candy with data! • Pupil dilation ▫ Impossible to measure accurately – don’t use it
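As a sketch of the "fixations per second" measure mentioned above: given exported fixation records, the rate is simply the fixation count normalized by task time. The record layout here (`start_ms`, `duration_ms`) is an illustrative assumption, not any particular vendor's export format.

```python
# Fixations per second: the traditional measure of active attention.
# Field names are hypothetical; real exports (e.g. from Tobii Studio
# or SMI BeGaze) use their own column names.

def fixations_per_second(fixations, task_duration_ms):
    """fixations: list of dicts with 'start_ms' and 'duration_ms'."""
    if task_duration_ms <= 0:
        raise ValueError("task duration must be positive")
    return len(fixations) * 1000.0 / task_duration_ms

fixations = [
    {"start_ms": 0,   "duration_ms": 220},
    {"start_ms": 300, "duration_ms": 180},
    {"start_ms": 650, "duration_ms": 400},
]
rate = fixations_per_second(fixations, task_duration_ms=1500)
# 3 fixations over 1.5 seconds -> 2.0 fixations per second
```

Normalizing by task time is what makes rates comparable across tasks of different lengths, which matters when comparing design treatments.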
  12. Ok, so how should I use eye tracking metrics? • Use areas of interest to compare metrics ▫ How many fixations are in (un)important AOIs?  Will determine if an important AOI needs more emphasis ▫ How do fixations in similar AOIs compare between different design treatments?  Will determine which design better achieves business goals ▫ How long does it take participants to get to a particular AOI? (time to 1st fixation)  You only have a few seconds to impress a user – are they looking where you want them to?
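The AOI comparisons above can be sketched as a small hit-testing routine: count fixations landing inside each rectangular AOI and record the time of the first hit. All names and the (x, y, w, h) rectangle convention are illustrative assumptions; eye tracking suites compute these metrics for you.

```python
# Sketch of AOI-based metrics: fixation count per AOI and time to
# first fixation. AOIs are axis-aligned rectangles (x, y, w, h);
# fixations carry a gaze point and a start time in milliseconds.

def in_aoi(fx, fy, aoi):
    x, y, w, h = aoi
    return x <= fx < x + w and y <= fy < y + h

def aoi_metrics(fixations, aois):
    """fixations: list of (x, y, start_ms); aois: dict name -> rect."""
    metrics = {name: {"count": 0, "time_to_first_ms": None} for name in aois}
    # Process fixations in chronological order so the first hit per
    # AOI yields the time-to-first-fixation metric.
    for fx, fy, start_ms in sorted(fixations, key=lambda f: f[2]):
        for name, rect in aois.items():
            if in_aoi(fx, fy, rect):
                m = metrics[name]
                m["count"] += 1
                if m["time_to_first_ms"] is None:
                    m["time_to_first_ms"] = start_ms
    return metrics

aois = {"logo": (0, 0, 200, 100), "cta": (400, 300, 150, 60)}
fixations = [(50, 40, 120), (420, 320, 900), (60, 30, 1400)]
m = aoi_metrics(fixations, aois)
# logo: 2 fixations, first at 120 ms; cta: 1 fixation, first at 900 ms
```

Comparing these numbers between an important AOI and an unimportant one (or between two design treatments) is exactly the kind of apples-to-apples comparison the slide recommends.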
  13. Participant Recruitment • Make sure you ask about eye ailments ▫ Retina & cornea damage, eye cancer & tumors, macular degeneration, cataracts, conjunctivitis, and nystagmus ▫ Not necessarily problematic: amblyopia, glaucoma, and strabismus • If possible, you want to use the data from everyone you bring in ▫ Add questions to your screener to ensure you can eye track your participants
  14. The great eye tracking debate • If a person looks at something, does that mean that he comprehends it? ▫ Maybe • Does a long duration indicate confusion or interest? ▫ It depends • Does eye tracking really help the usability cause? ▫ I think so • Isn’t it all just crap? ▫ Not really – eye tracking gives us data to back up qualitative findings
  15. Neuromarketing – background • Modern neuromarketing is born from fMRIs ▫ Brain loci with oxygenated blood ▫ Perform a task or present a stimulus and watch where the blood goes ▫ Relies on knowledge about brain loci… and a large fMRI machine • Some current vendors measure EEG instead ▫ Electrical activity on the scalp ▫ Brain waves indicate what the person is experiencing  Pleasure, anxiety, fight or flight, etc ▫ Relies on interpretations of EEGs
  16. Neuromarketing – the players • NeuroFocus: EEG + GSR (Berkeley) ▫ Owned by Nielsen ▫ Just proposed “NeuroStandards” for the field • EmSense: EEG (San Francisco) ▫ Developed own headset • InnerScope: Biometrics (Boston) ▫ Campbell’s label • Sands Research: EEG + Biometrics (El Paso) ▫ Super Bowl study • OTOinsights: EEG + self report (Boston) ▫ Uses Emotiv headset • Buyology: Consultant (NYC) ▫ Think-tank • Neurosciencemarketingblog.com is a great source of information
  17. Neuromarketing – studies • Daimler-Chrysler fMRI study ▫ Attractive cars light up the facial recognition area of the brain • Campbell’s Soup label • There is no “buy” button in the brain!
  18. Neuromarketing – the great debate • Is neuromarketing ethical? • How do we know the results are accurate? • Show me the ROI!
  19. War Stories
  20. Creep Map • 1 minute exposure • These are the only hotspots on the entire page • When asked why this design comp was given a low rating, the response: “because she’s fat” • Oy vey
  21. Q&A