Proposed Working Memory Measures for Evaluating Information Visualization Tools.

Presenter: Laura Matzen, Laura McNamara, Kerstan Cole, Alisa Bandlow, Courtney Dornburg, Travis Bauer
BELIV 2010 Workshop
http://www.beliv.org/beliv2010/

  1. Proposed Working Memory Measures for Evaluating Information Visualization Tools
     Laura Matzen, Laura McNamara, Kerstan Cole, Alisa Bandlow, Courtney Dornburg & Travis Bauer
     Sandia National Laboratories, Albuquerque, NM 87185
     This work was funded by Sandia's Laboratory Directed Research and Development program as part of the Networks Grand Challenge (10-119351).
     Sandia National Laboratories is a multi-program laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
  2. Evaluation of information visualization tools
     Evaluations typically developed for a single, specific task and tool
     Time-consuming, expensive, results can't be generalized
     We propose using measures of cognitive resources to create standardized evaluation metrics
  3. Why assess cognitive resources?
     All analysis tasks are cognitively demanding
     Human cognitive resources are finite
     Well-designed interfaces should free cognitive resources for making sense of data
     Reduce the cognitive burden of searching and manipulating data
  4. Working Memory
     Mental workspace underlying all complex cognition
     Has a limited, measurable capacity
     Often used as a performance metric in other domains
  5. Proposed methodology
     Evaluate visual analytics interfaces using a dual-task methodology:
     Primary task: interaction with the interface
     Secondary task: test of working memory capacity
     Performance on the secondary task should correspond to the cognitive resources that would be available for sensemaking in a real-world analysis task (a scoring sketch follows the slide transcript)
  6. Example working memory task
     Sternberg task (Sternberg, 1969)
     Memory set → delay → probe items
     Low-load memory set: M G J
     High-load memory set: D K H Y R Q
     (Slide figure: the memory set M G J followed by example probe letters F, M, W. A trial-structure sketch also follows the slide transcript.)
  7. Comparisons of different interface designs
     (Slide figure with working memory probe letters M, W, F; not reproduced in the transcript.)
  8. Later, Compare Different Visualizations of the Same Dataset
     (Slide figure with working memory probe letters M, W, F; not reproduced in the transcript.)
  9. By this time next year…
     Pilot Study 1: compare two interface designs for a simple video player with a tagging feature
     Pilot Study 2: compare two different interface designs for a visual text analytics application developed at Sandia
     Use the NASA-TLX to develop convergent evidence
     Both studies should provide insight into the use of working-memory-based metrics for interface assessment
     …we should have data!
  10. Comments, please!
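
The dual-task methodology on slide 5 implies a simple analysis: score the secondary (working memory) task separately for each interface condition and compare, on the reasoning that higher secondary-task performance indicates more spare cognitive capacity under that interface. The Python sketch below illustrates that comparison only; the condition labels, trial records, and function name are hypothetical stand-ins, not part of the original study design.

    # Hypothetical scoring sketch for the proposed dual-task evaluation.
    # Each record pairs an interface condition ("A" or "B") with whether the
    # concurrent working-memory probe was answered correctly. The trial data
    # below are illustrative, not real results.

    from statistics import mean

    trials = [
        ("A", True), ("A", True), ("A", False), ("A", True),
        ("B", True), ("B", False), ("B", False), ("B", True),
    ]

    def secondary_task_accuracy(trials, condition):
        """Proportion of working-memory probes answered correctly under one interface."""
        outcomes = [correct for cond, correct in trials if cond == condition]
        return mean(1.0 if c else 0.0 for c in outcomes)

    for condition in ("A", "B"):
        acc = secondary_task_accuracy(trials, condition)
        print(f"Interface {condition}: secondary-task accuracy = {acc:.2f}")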
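
Slide 6 names the Sternberg (1969) item-recognition task as a candidate secondary measure: a memory set, a delay, then a probe judged as a member or non-member of the set. The console-based Python sketch below mirrors that trial structure using the low- and high-load sets from the slide; it is a minimal illustration, assuming that real stimulus timing and response collection would be handled by dedicated experiment software rather than input() and time.sleep().

    # Minimal console sketch of a Sternberg (1969) item-recognition trial:
    # memory set -> delay -> probe. Timing and response handling here are
    # crude stand-ins for proper experiment software.

    import random
    import time

    LOW_LOAD = ["M", "G", "J"]                   # low-load memory set (3 items)
    HIGH_LOAD = ["D", "K", "H", "Y", "R", "Q"]   # high-load memory set (6 items)

    def run_trial(memory_set, delay_s=2.0):
        """Show a memory set, wait, then probe with a letter that is in the
        set on half of trials. Returns (probe, correct, reaction_time_s)."""
        print("Memorize:", " ".join(memory_set))
        time.sleep(delay_s)          # study/retention period
        print("\n" * 30)             # crude 'screen clear' before the probe

        # Probe drawn from the memory set half the time, otherwise a lure.
        if random.random() < 0.5:
            probe = random.choice(memory_set)
        else:
            lures = [c for c in "ABCDEFGHIJKLMNOPQRSTUVWXYZ" if c not in memory_set]
            probe = random.choice(lures)

        start = time.monotonic()
        answer = input(f"Was '{probe}' in the set? (y/n): ").strip().lower()
        rt = time.monotonic() - start

        correct = (answer == "y") == (probe in memory_set)
        return probe, correct, rt

    if __name__ == "__main__":
        for memory_set in (LOW_LOAD, HIGH_LOAD):
            probe, correct, rt = run_trial(memory_set)
            print(f"Probe {probe}: {'correct' if correct else 'incorrect'}, RT = {rt:.2f} s\n")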
