Online survey methodology and eye tracking

By Lars Kaczmirek

  1. Online survey methodology and eye-tracking: Experiments about respondents' behavior in reading e-mail invitations, question understanding, and response options. Lars Kaczmirek, GESIS – Leibniz Institute for the Social Sciences, Mannheim, Cologne, Bonn, Berlin, Germany.
  2. Acknowledgements. This presentation reports on work which is part of a collaboration between Lars Kaczmirek (GESIS – Leibniz Institute for the Social Sciences), Timo Lenzner (GESIS – Leibniz Institute for the Social Sciences), and Mirta Galesic (Max Planck Institute for Human Development, Berlin, Germany). It has been presented at the following conferences:
     Lenzner, T., Kaczmirek, L., & Galesic, M. (2010). Good or bad? That is the question! Identifying poor survey question wording using eye tracking. Paper presented at the General Online Research Conference (GOR10), May 26-28, 2010, Pforzheim, Germany.
     Lenzner, T., Kaczmirek, L., & Galesic, M. (2010). Seeing through the eyes of the respondent: An eye-tracking study on survey question comprehension. Paper presented at the 65th Annual AAPOR Conference, May 13-16, 2010, Chicago, Illinois.
     Kaczmirek, L., Lenzner, T., & Galesic, M. (2009). Is this e-mail relevant? An eye-tracking experiment on how potential respondents read e-mail invitations. Paper presented at the 4th Internet Survey Methodology (ISM) Workshop, September 17-19, 2009, Bergamo, Italy.
     The second experiment is published in: Lenzner, T., Kaczmirek, L., & Galesic, M. (in press). Seeing through the eyes of the respondent: An eye-tracking study on survey question comprehension. International Journal of Public Opinion Research.
  3. Thesis: Is the potential problem related to reading and gazing? Then eye tracking may help to pinpoint the problem, enabling survey researchers to reach a better understanding of the nature of the problem and to find a solution.
  4. Overview:
     • Introduction to eye-tracking
     • E-mail invitation and placement of the invitation link
     • Identifying problems in questions
     • Placement of answer boxes
  5. Introduction to eye-tracking
  6. Participants:
     • Approx. 47 participants, depending on research question
     • Recruited from the respondent pool maintained by the MPI, Berlin
     • 61% female (n = 27)
     • Mean age: 26 (SD = 3.7), range: 19-34
     • Education: 100% received 12 or more years of schooling
     • 78% university students (n = 30)
  7. Video
  8. E-mail invitation and placement of the invitation link
  9. E-mail invitation experiment
  10. Areas of interest (AOI)
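AOI analysis assigns each fixation to a screen region and accumulates per-region fixation counts and dwell times. As a minimal sketch of how such metrics are typically computed (the AOI names, coordinates, and fixation record format below are illustrative, not the study's actual data structures):

```python
# Illustrative sketch: aggregating fixations into areas of interest (AOIs).
# AOI names, rectangles, and the fixation format are hypothetical examples.

from dataclasses import dataclass

@dataclass
class Fixation:
    x: float           # screen coordinates in pixels
    y: float
    duration_ms: float

# Each AOI is an axis-aligned rectangle: (left, top, right, bottom).
AOIS = {
    "salutation": (50, 80, 700, 140),
    "topic": (50, 150, 700, 210),
    "incentive_and_time": (50, 220, 700, 280),
    "survey_link": (50, 290, 700, 330),
}

def aoi_metrics(fixations):
    """Return fixation count and total dwell time (ms) per AOI."""
    metrics = {name: {"count": 0, "dwell_ms": 0.0} for name in AOIS}
    for fix in fixations:
        for name, (left, top, right, bottom) in AOIS.items():
            if left <= fix.x <= right and top <= fix.y <= bottom:
                metrics[name]["count"] += 1
                metrics[name]["dwell_ms"] += fix.duration_ms
                break  # non-overlapping AOIs: at most one match per fixation
    return metrics

print(aoi_metrics([Fixation(300, 180, 250), Fixation(320, 300, 180)]))
```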
  11. Experimental conditions: 2 x 2 design
     • Position of incentive and time: top vs. bottom
     • Font style of topic, incentive, and time: normal vs. bold

                     position top   position bottom
       font normal        1                3
       font bold          2                4
  12. Examples: (1) top-normal vs. (4) bottom-bold
  13. Decision to participate: how long it takes (p = 0.20). The link should be placed after the information. Bold font speeds up the decision process (p = 0.16).
  14. The decision to participate increases if relevant information is placed before the link: Χ²(3) = 10.1, p = .018*; Fisher's exact test, p = .03*; 92% clicks (top) vs. 64% clicks (bottom). Font style showed no difference: Χ²(3) = 2.2, p = .52; Fisher's exact test, p = .72; descriptively, the bold-font condition showed 83% clicks vs. 75% for normal font. (Bar charts show the number of participants, n = 47.)
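Tests like these are computed from a contingency table of clicked vs. not-clicked by condition. The slide's Χ²(3) implies a four-condition table; the 2x2 sketch below only illustrates the top-vs-bottom contrast. The cell counts are hypothetical round numbers chosen to mirror the reported click rates (92% vs. 64%), since the exact counts are not given on the slide:

```python
# Illustrative sketch of the reported significance tests.
# Cell counts are hypothetical; only the click rates (about 92% vs. 64%)
# are taken from the slide.

from scipy.stats import chi2_contingency, fisher_exact

# Rows: relevant information placed before the link ("top") vs. after ("bottom").
# Columns: clicked the survey link, did not click.
table = [[22, 2],    # "top": ~92% clicks (hypothetical counts)
         [16, 9]]    # "bottom": ~64% clicks (hypothetical counts)

chi2, p_chi2, dof, _ = chi2_contingency(table)
_, p_fisher = fisher_exact(table)

print(f"chi-square({dof}) = {chi2:.2f}, p = {p_chi2:.3f}")
print(f"Fisher's exact test: p = {p_fisher:.3f}")
```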
  15. Conclusions:
     • Providing substantial information after the link interferes with the decision to participate, increases the time needed to come to a decision, and reduces the number of participants.
     • Information relevant for the decision should be placed before the link.
  17. Identifying problems in questions
  18. Some evidence… Difficult survey questions increase response error:
     • satisficing behavior (Krosnick, 1991)
     • breakoff (Galesic, 2006)
     • reporting of inaccurate answers (Schober & Conrad, 1997)
     • variability of question interpretations (Belson, 1981)
  19. Advice from psycholinguistics. Avoid:
     • Low-frequency words
     • Vague or imprecise relative terms
     • Vague or ambiguous noun phrases
     • Complex syntax
     • Complex logical structures
     • Low syntactic redundancy
     • Bridging inferences
     (Graesser et al., 2006; Lenzner et al., 2009)
  20. Example: low-frequency words (LFRW).
     Poor: During the last four weeks, how often did you suffer from somatic pain?
     Better: During the last four weeks, how often did you suffer from physical pain?
  21. Real video example:
     • Red dots show eye-gaze fixations
     • "In your free time, how often do you attend cultural events?" (a vague or ambiguous noun phrase)
  22. Eye-tracking experiment: questionnaire
     • 28 x 2 questions (text-feature version vs. control)
     • 4 questions per text feature (2 attitudinal, 1 factual, 1 behavioral question)
     • Adapted from ISSP, ALLBUS, SOEP
     • Topics: social inequality, environment, health, leisure time
     • Language: German
  23. Results: relevant effect for the whole question. (Charts: question fixation counts and question fixation times.)
  24. Results: fixation times are longer on problematic areas. (Chart: fixation times.) Analysis: general linear models with repeated measures and reading rate/fixation rate as a covariate.
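The slide names repeated-measures general linear models with a reading-rate covariate but does not state the analysis software. As an assumed analogue, a linear mixed model with a random intercept per participant expresses the same idea; the data frame and column names below are hypothetical toy data that only show the model form:

```python
# Illustrative analogue of the slide's repeated-measures GLM: a linear
# mixed model with a per-participant random intercept and reading rate
# as a covariate. All data and column names are hypothetical.

import pandas as pd
import statsmodels.formula.api as smf

# Long format: one row per participant x question version.
df = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3, 4, 4],
    "version": ["feature", "control"] * 4,   # problematic wording vs. control
    "fixation_time_ms": [5200, 4100, 6100, 4800, 4900, 4500, 5600, 4300],
    "reading_rate": [210, 210, 180, 180, 230, 230, 200, 200],  # words/min
})

model = smf.mixedlm(
    "fixation_time_ms ~ version + reading_rate",  # covariate-adjusted effect
    data=df,
    groups=df["participant"],                     # repeated measures
)
print(model.fit().summary())
```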
  25. Results: heatmap (low-frequency word). The heatmap contrasts the low-frequency word „utile“ with its higher-frequency synonym „useful“.
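Gaze heatmaps like this one are commonly built by placing a duration-weighted Gaussian kernel at each fixation and summing over the screen. A minimal sketch with hypothetical fixation data and a hypothetical screen size:

```python
# Illustrative sketch of a fixation heatmap: a duration-weighted Gaussian
# kernel is placed at each fixation point. All data are hypothetical.

import numpy as np
import matplotlib.pyplot as plt

WIDTH, HEIGHT = 1024, 768   # assumed screen resolution in pixels
SIGMA = 35.0                # kernel spread in pixels

# (x, y, duration_ms) fixations, e.g. clustering on one difficult word.
fixations = [(400, 300, 250), (420, 305, 400), (700, 310, 150)]

ys, xs = np.mgrid[0:HEIGHT, 0:WIDTH]
heat = np.zeros((HEIGHT, WIDTH))
for x, y, dur in fixations:
    heat += dur * np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2 * SIGMA ** 2))

plt.imshow(heat, cmap="hot")   # hot spots mark heavily fixated regions
plt.axis("off")
plt.savefig("heatmap.png", dpi=150)
```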
  26. Results: gaze plot (complex syntax). Left-embedded vs. right-embedded syntax: sentences beginning with many subordinate clauses embedded in the main clause.
  27. Conclusions. Eye tracking shows:
     • 6 out of 7 text features reduce question comprehensibility (longer fixation times and higher fixation counts)
     • Effects persist over different question types
     • Survey designers should avoid these text features when writing questions
  29. Placement of answer boxes:
     • Paper vs. web
     • Right: follows the flow of reading
     • Left: easier and nicer layout, easy mapping between response option and answer box
  30. Example
  31. Answer boxes on the left side:
     • Fewer gaze switches (counted as in the sketch below), p = .002
     • Fewer fixations, p = .0001
     • Shorter fixation length, p = .0001
     • Small effect sizes
     • Whereas fixation count/length on the response options did not differ!
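A gaze switch here means a transition of consecutive fixations between two regions, such as the response-option AOI and the answer-box AOI. A small sketch of how such switches can be counted from a fixation sequence (the AOI labels are illustrative):

```python
# Illustrative sketch: counting gaze switches between two AOIs,
# e.g. response options vs. answer boxes. AOI labels are hypothetical.

def count_gaze_switches(aoi_sequence):
    """Count transitions between distinct AOIs in a fixation sequence,
    ignoring fixations outside both AOIs (None)."""
    labeled = [a for a in aoi_sequence if a is not None]
    return sum(1 for prev, cur in zip(labeled, labeled[1:]) if prev != cur)

# options -> boxes -> options -> boxes = 3 switches.
sequence = ["options", "options", "boxes", None, "options", "boxes"]
print(count_gaze_switches(sequence))  # -> 3
```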
  33. Benefits and drawbacks of eye tracking
     Benefits:
     • Direct and non-reactive measurement
     • Identifies hot spots
     • Identifies neglected parts
     • Identifies problematic behavioral patterns (when combined with screen recording)
     • No bias in ability to express oneself verbally
     Drawbacks:
     • High initial costs
     • Experience required in analysis
     • Specific analyses are time-consuming
     • Missing: studies assessing the validity of the method in comparison to other well-established methods such as cognitive interviewing or expert reviews
  34. Thank you!
     lars.kaczmirek@gesis.org
     www.kaczmirek.de
     www.gesis.org/online
