MW18 Presentation: Anonymous And Cheap: Experimenting With Unobtrusive Methods Of Measuring User Experience And Engagement For In-Gallery Interactives


By Brian Hewitt, Corning Museum of Glass, USA

Anonymously used in-gallery digital interactives may have lower barriers to visitor usage than downloaded apps, purpose-built devices, or even Web apps. But because of that anonymity, and because they are typically single-"page," limited-purpose, context-sensitive, nonlinear, or non-conversion-oriented applications, they can present challenges for meaningful engagement data collection and user experience analysis. Both high-tech and low-tech tools and techniques exist, but these can be expensive, imprecise, intrusive, or time-consuming. In this paper, we examine inexpensive and unobtrusive methods of gathering user experience and engagement data, which we have been exploring over the course of several exhibitions and multiple installed digital interactives. Inexpensive security-style cameras, combined with data forms geared toward rapid, simple input, allow for more naturalistic observation of more visitors in less time, and for the collection of standardized, quantitative data.

Programming of the interactives can include custom analytics events and rough session estimations based on user activity and idle times. These methods are not necessarily meant to be definitive but may provide alternatives where other methods are not feasible or desirable. In cases where other tools are available, combining these techniques may provide a richer picture of the overall effectiveness of installed interactives.
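The "rough session estimations based on user activity and idle times" described above can be sketched as follows. This is an illustrative reconstruction under assumed conventions, not the museum's actual code; the function names (`sessionize`, `meanEventsPerSession`) and the idle-timeout value are invented for the example.

```typescript
/**
 * Split a time-ordered list of analytics event timestamps (milliseconds)
 * into "user sessions": any gap longer than idleTimeoutMs is treated as
 * one anonymous visitor walking away and another starting fresh.
 */
function sessionize(timestamps: number[], idleTimeoutMs: number): number[][] {
  const sessions: number[][] = [];
  let current: number[] = [];
  for (const t of timestamps) {
    // A long idle gap closes the current session and opens a new one.
    if (current.length > 0 && t - current[current.length - 1] > idleTimeoutMs) {
      sessions.push(current);
      current = [];
    }
    current.push(t);
  }
  if (current.length > 0) sessions.push(current);
  return sessions;
}

/** Mean events per session, e.g. pinch-to-zoom interactions per "user session". */
function meanEventsPerSession(sessions: number[][]): number {
  if (sessions.length === 0) return 0;
  const total = sessions.reduce((n, s) => n + s.length, 0);
  return total / sessions.length;
}
```

With a 90-second idle timeout, for instance, a stream of pinch-to-zoom event timestamps yields per-session counts whose mean corresponds to figures like the "3.7 pinch-to-zoom interactions per 'user session'" reported in the slides. The obvious caveat, acknowledged in the paper's framing, is that one "session" may actually be several visitors in quick succession, or one visitor with a long pause.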

Published in: Education
  1. ANONYMOUS & CHEAP Experimenting with Unobtrusive Methods of Measuring User Experience and Engagement for In-Gallery Interactives
  2. Anonymous: Solution and Problem Visitors: Low barrier to entry NO DATA
  3. Naturalistic Observation (Lurking)
  4. One task One interpretive point Usable ≠ Engaging
  5. What these experiments were… and weren’t
     WERE: • UX engagement • Stop-gap • Trial and error/iterative
     WEREN’T: • Educational/interpretive goal measurement • Exhaustive or definitive • High-tech
  6. “Engagement is a category of user experience characterized by attributes of challenge, positive affect, endurability, aesthetic and sensory appeal, attention, feedback, variety/novelty, interactivity, and perceived user control” - O’Brien & Toms. Question 1: What is “engaged”?
  7. Questions about experience
     • What proportion of visitors are using them?
     • Do visitors understand function?
     • Do the interactives support the context?
     • What is the emotional response?
     Are there things we can measure to benchmark future projects?
  8. First Line of Attack: Idle Timers & Custom Event Analytics
  9. Idle Timer as User Session Proxy
  10. Mean: 3.7 pinch-to-zoom interactions per “user session”
  11. Idle Timer as User Session Proxy
  12. Mean: 0.3 interactions with interpretive content per “user session” [Chart: Blaschka Map: User Sessions and Map Pin Taps, comparing Map Pin Tap Events against User Sessions]
  13. Second Approach: Video Observations and Optimized Forms
  14. The Setup • Two cameras • Two rooms • Four digital interactives
  15. Deferred Observations • Unobtrusive • Non-real-time
  16. Optimized Forms • Two tools • Multiple iterations • Lots of trial and error
  17. Counting Visitors and Interactive Usage
  18. Results
  19. Tiffany “Workshop”: Percentage of Visitors Who Used At Least One Digital Interactive, by Age Category: 42% Overall, 64% Child/teen, 32% Adult, 47% Senior
  20. Digging Deeper • Individual observations of behaviors and interactions • Branching logic forms for data collection optimized to visitor behaviors
  21. Mosaic Theater How do you measure engagement with a passive experience?
  22. Mosaic Theater: Passive Engagement [Chart: Mosaic Theater: Visitor Time Spent: less than 1 minute, 30%; 1-2 minutes, 13%; 2-4 minutes, 18%; 4-8 minutes, 17%; more than 8 minutes, 22%]
  23. Mosaic Theater: Interaction
  24. Tiffany Workshop Interactives: the Payoff
  25. Workshop Interactives: Content Engagement
  26. Workshop Interactives: Disengagement
     • Satisfied with information and/or experience: user purposefully examined most or all of the content and/or engaged in discussion with others about the information presented
     • Interruption or distraction: interruption of attention by another person or activity
     • Usability issue(s): repeated unsuccessful attempts at interaction or visible frustration
     • Boredom or disinterest: user quickly perused or otherwise spent little time or attention on the information or application
     • Time constraint: summoned away before the user appeared ready to depart. This category was sometimes difficult to distinguish from “interruption or distraction”
     • Unclear: could not infer a reason for disengagement, or otherwise unclear
  27. Where Do We Go From Here? XKCD
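Slide 20 mentions "branching logic forms for data collection optimized to visitor behaviors." The paper does not publish its form schema, but one minimal way such a form could be modeled is as a table of questions whose answers select the next question, so observers only see fields relevant to the behavior they just watched. All question and answer names below are hypothetical:

```typescript
// Hypothetical branching-logic observation form: each answer maps to the
// id of the next question, or null when the observation record is complete.
type Question = {
  id: string;
  prompt: string;
  options: Record<string, string | null>; // answer -> next question id
};

const form: Record<string, Question> = {
  used: {
    id: "used",
    prompt: "Did the visitor use the interactive?",
    options: { yes: "disengagement", no: null }, // non-users skip the rest
  },
  disengagement: {
    id: "disengagement",
    prompt: "Why did the visitor disengage?",
    options: { satisfied: null, interrupted: null, usability: null, bored: null },
  },
};

/** Walk the form for one observation, returning the questions actually shown. */
function path(start: string, answers: Record<string, string>): string[] {
  const shown: string[] = [];
  let id: string | null = start;
  while (id !== null) {
    shown.push(id);
    const next = form[id].options[answers[id]];
    id = next ?? null; // missing answer or terminal option ends the walk
  }
  return shown;
}
```

The design point is the one the slides make: a non-user costs the observer one tap, while the branch for engaged users expands into the detailed disengagement categories of slide 26.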
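Slide 22's time-spent chart implies bucketing each observed visitor's dwell time into labeled ranges and reporting the share of visitors per range. A sketch, assuming dwell times were recorded in seconds from the video observations (the function name and bucket representation are invented):

```typescript
// Time-spent buckets matching the labels on the Mosaic Theater chart.
const BUCKETS = [
  { label: "less than 1 minute", maxSec: 60 },
  { label: "1-2 minutes", maxSec: 120 },
  { label: "2-4 minutes", maxSec: 240 },
  { label: "4-8 minutes", maxSec: 480 },
  { label: "more than 8 minutes", maxSec: Infinity },
];

/** Percentage of observed visitors falling into each dwell-time bucket. */
function timeSpentDistribution(dwellSec: number[]): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const b of BUCKETS) counts[b.label] = 0;
  for (const s of dwellSec) {
    // First bucket whose upper bound exceeds this dwell time.
    const bucket = BUCKETS.find((b) => s < b.maxSec)!;
    counts[bucket.label] += 1;
  }
  const total = dwellSec.length || 1; // guard against an empty observation set
  const pct: Record<string, number> = {};
  for (const [label, n] of Object.entries(counts)) {
    pct[label] = (100 * n) / total;
  }
  return pct;
}
```

Feeding per-visitor dwell times from the deferred video observations into a function like this yields a distribution in the form shown on the slide.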
