Comparative Evaluation of Two Interface Tools in Performing Visual Analytics Tasks
Dong Hyun Jeong*, Tera Marie Green†, William Ribarsky*, Remco Chang*
*Charlotte Visualization Center, UNC Charlotte
†School of Interactive Arts and Technology, Simon Fraser University
Motivation
Human interaction & flow of cognition: important for problem solving.
"Visualization design should avoid, as much as possible, menus or other actions that take the user outside of the frame of the task" (Green et al., 2008)
Previous literature:
In the visualization community, pull-down menus may require the human to sort through and think about menu items [Green et al.].
In the HCI community, direct comparisons have found no clear difference between menus and direct-manipulation icons [Lim et al.].
Objective
To show the effectiveness of two interface tools, Floating-Menu and Interactive-Icon, in a highly interactive visual analytics tool.
A comparative evaluation: quantitative and qualitative evaluations to find the differences between the two.
A Visual Analytics Tool
A genomic visualization (called GVis).
Uses a publicly available biological database (GenBank) from NCBI (the National Center for Biotechnology Information).
System overview: Floating-Menu and Interactive-Icon.
Information Representation
Representing information corresponding to each option selection:
Number of published articles
Year information about the publication
Titles of the publishing journals
Number of matched results based on the search query
Comparative Study
Within-subject study
College students: 31 participants (12 males and 19 females)
Most participants were familiar with neither visualization nor biology.
Tasks: finding additional information (publications) related to specific organisms using the two interface tools.
Evaluation Results (Accuracy)
Accuracy: about 54.8% (17.0±7.9) with Floating-Menu and 46.1% (14.3±5.0) with Interactive-Icon of the participants answered correctly.
Repeated-measures ANOVA: the accuracy difference is not statistically significant across the two interfaces (p=0.24).
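A repeated-measures comparison like the one above can be sketched in a few lines; with exactly two within-subject conditions, a one-way repeated-measures ANOVA reduces to a paired t-test (F = t²). The per-participant scores below are invented placeholders, not the study's data:

```python
import math

def paired_f_statistic(cond_a, cond_b):
    """F statistic for a within-subject comparison of two conditions.

    With two conditions, repeated-measures ANOVA is equivalent to a
    paired t-test, so F = t^2 with (1, n-1) degrees of freedom.
    """
    n = len(cond_a)
    diffs = [a - b for a, b in zip(cond_a, cond_b)]
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    t = mean_d / math.sqrt(var_d / n)
    return t * t

# Hypothetical per-participant accuracy scores (NOT the study's data)
floating_menu    = [0.6, 0.5, 0.7, 0.4, 0.6, 0.5]
interactive_icon = [0.5, 0.4, 0.6, 0.5, 0.5, 0.4]
print(round(paired_f_statistic(floating_menu, interactive_icon), 3))  # prints 4.0
```

In practice a library routine (e.g. a repeated-measures ANOVA in a statistics package) would also report the p-value; the sketch only shows where the F statistic comes from.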
Evaluation Results (Speed)
Speed: no statistically significant difference between the two interfaces.
Pearson's correlation coefficient: a trend between the time spent and the difficulty of the task (r=.47, p<.0001).
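The time-versus-difficulty trend is an ordinary Pearson correlation; a minimal pure-Python sketch, using made-up difficulty ratings and completion times rather than the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson's correlation coefficient for two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical task-difficulty ratings and completion times in seconds
difficulty = [1, 1, 2, 2, 3, 3]
time_spent = [30, 35, 50, 45, 70, 80]
print(round(pearson_r(difficulty, time_spent), 2))  # prints 0.97
```

A moderate positive r, as in the study's r=.47, would indicate that harder tasks tend to take longer without the relationship being anywhere near deterministic.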
Evaluation Results (post-task questionnaire)
Easiness: about 60% (18.6±0.5) with Floating-Menu and about 43% (13.3±4.0) with Interactive-Icon of the participants reported all 3 tasks to be "easy" or "very easy".
Helpfulness: about 74% (23±3.6) with Floating-Menu and about 65% (20±3.6) with Interactive-Icon of the participants reported the tool to be "helpful" or "very helpful" in solving the tasks.
Evaluation Results (post-application & study questionnaire)
Learnability: about 67% and 51% of the participants rated Floating-Menu and Interactive-Icon, respectively, as easy to use ("very easy" or "easy").
Preference: no significant difference, but there was a gender difference.
Discussion
The two interface tools, Floating-Menu and Interactive-Icon, perform similarly both quantitatively and qualitatively.
Limitations of the comparative evaluation method: quantitative measures (time & accuracy) and qualitative measures (users' feedback) may not be well suited to evaluating a highly interactive visual analytics tool.
Importance of preserving the human's flow of cognition: Interactive-Icon might support the human's flow of cognition better than Floating-Menu.
With this comparative evaluation, the difference between the two interface tools cannot be generalized.
Q&A Charlotte Visualization Center UNC Charlotte email@example.com