JChueke HCID Open Day_apr2012_pt02


  1. Methodology: iGoogle empirical study
     1. OBJECT AND PARTICIPANTS
     1.1. iGoogle personal web portal (identified as suffering from poor perceptible affordance (PA) for drag-and-drop interaction).
     1.2. Ten (10) participants: three (3) of beginner, four (4) of intermediate and three (3) of advanced expertise with the Internet and computers. No previous knowledge of the portal.
     2. OBSERVATION PROTOCOL (Eye Tracking)
     2.1. 10-second 'Observation Phase' (no comments from the facilitator, no verbalization from the subject).
     2.2. 15-minute 'Observation with Think Aloud' (TA) phase (no interaction of any kind). Questions: 'Q1: What is this website for?', 'Q2: What can you do in this kind of website?', 'Q3: Do you think it is possible to change your screen the way you like it?', 'Q4: Is it possible to move anything in there?'
     3. DATA ANALYSIS
     3.1. Deductive approach for QUANTITATIVE data analysis (from eye tracking).
     3.2. Inductive approach for QUALITATIVE data analysis (from verbalizations).
     3.3. Conclusions and theory (generalization).
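The protocol above can be expressed as data, which makes the session length easy to check. This is a minimal illustrative sketch: the phase names, durations and probe questions come from the slide, but the structure itself (and the `total_session_seconds` helper) is an assumption, not the study's actual instrumentation.

```python
# Observation protocol as data. Phase durations and questions are taken
# from the slide; everything else is an illustrative assumption.
PROTOCOL = [
    {"phase": "observation", "duration_s": 10, "think_aloud": False},
    {"phase": "observation_think_aloud", "duration_s": 15 * 60, "think_aloud": True,
     "questions": [
         "Q1: What is this website for?",
         "Q2: What can you do in this kind of website?",
         "Q3: Do you think it is possible to change your screen the way you like it?",
         "Q4: Is it possible to move anything in there?",
     ]},
]

def total_session_seconds(protocol) -> int:
    """Upper bound on per-participant observation time (excluding setup)."""
    return sum(step["duration_s"] for step in protocol)
```

Summing the two phases gives a ceiling of just over 15 minutes of recorded gaze data per participant.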
  2. Methodology • Gazeplot comparison: beginner vs. advanced participant. [Gazeplot figures for Participant 54 and Participant 22, labelled Beginner Expertise and Advanced Expertise]
  3. Quantitative Data Analysis: iGoogle
  4. Quantitative Data Analysis: AOI
  5. Quantitative Data Analysis: Categories [AOI label overlay: site-level labels Logo, Search, Menu_H, Menu_V, You; per-widget labels (Weather, D&T, You Tube, SI, Sports, CNET, Simplify, Epicurious) for Link, Bar, Options, Content and, where present, Sign In]
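The AOI labels above are later collapsed into the four categories C1–C4. The grouping rule below is a sketch inferred from the label naming on the slides (site-level elements vs. widget title bars vs. widget content vs. remaining widget controls); the exact label strings and the assumption that the title `Bar` is the drag handle (C4: Widget Move) are mine, not the study's code.

```python
# Hypothetical mapping of AOI labels to the four analysis categories.
# Label list is a subset of the slide's labels, for illustration.
AOI_LABELS = [
    "Logo", "Search", "Menu_H", "Menu_V",
    "Weather_Link", "Weather_Bar", "Weather_Options", "Weather_Content",
    "YouTube_Link", "YouTube_Bar", "YouTube_Options", "YouTube_Content",
]

def categorize(label: str) -> str:
    """Map one AOI label to its category code (assumed grouping rule)."""
    if label in ("Logo", "Search", "Menu_H", "Menu_V", "You"):
        return "C1"  # site control: page-level chrome
    if label.endswith("_Content"):
        return "C3"  # widget content
    if label.endswith("_Bar"):
        return "C4"  # widget move: title bar assumed to be the drag handle
    return "C2"      # widget control: links, options, sign-in, etc.

CATEGORY_OF = {label: categorize(label) for label in AOI_LABELS}
```

Grouping ~40 fine-grained AOIs into four categories is what makes the per-query comparisons on the following slides tractable.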
  6. Quantitative Data Analysis: Categories [AOIs grouped into four categories overlaid on the iGoogle layout: C1 Site Control, C2 Widget Control, C3 Widget Content, C4 Widget Move]
  7. Quantitative Data Analysis: Categories [category overlay figure]
  8. Quantitative Data Analysis: Categories [category overlay figure]
  9. Quantitative Data Analysis: Categories [category overlay figure]
  10. Quantitative Data Analysis: Categories [category overlay figure]
  11. Quantitative Data Analysis: Heatmap • 'Q1: What is this website for?'
  12. Quantitative Data Analysis: Heatmap • 'Q2: What can you do in this kind of website?'
  13. Quantitative Data Analysis: Heatmap • 'Q3: Do you think it is possible to change your screen the way you like it?'
  14. Quantitative Data Analysis: Heatmap • 'Q4: Is it possible to move anything in there?'
  15. Quantitative Data Analysis: Real Value x Estimated Value • First Fixation Duration (seconds) chart, presenting queries x AOI categories. [Observed/estimated multipliers: x0.29, x3.2, x0.44, x1.41, x0.35, x4.03, x0.5, x1.73]
  16. Quantitative Data Analysis: Real Value x Estimated Value • Total Fixation Duration (seconds) chart, presenting queries x AOI categories. [Observed/estimated multipliers: x0.89, x10.0, x1.36, x4.34, x0.74, x8.4, x1.14, x3.6]
  17. Quantitative Data Analysis: Real Value x Estimated Value • Fixation Count (saccades) chart, presenting queries x AOI categories. [Observed/estimated multipliers: x2.5, x28.8, x3.9, x12.0, x2.9, x32.8, x4.46, x14.0]
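The "real value x estimated value" comparison in these charts can be sketched as follows: the estimated measure for a category is assumed proportional to its share of on-screen area, and the reported multipliers are observed divided by estimated, so a value above 1 means a category drew more attention than its size alone would predict. The helper names and all numbers below are illustrative assumptions, not the study's data.

```python
# Sketch of the observed-vs-area-estimated comparison behind the
# "Real Value x Estimated Value" charts. All inputs are made up.

def expected_by_area(total_measure: float, area_px: dict) -> dict:
    """Split a total measure (e.g. total fixation duration in seconds)
    across categories in proportion to their on-screen area."""
    total_area = sum(area_px.values())
    return {c: total_measure * a / total_area for c, a in area_px.items()}

def observed_over_expected(observed: dict, expected: dict) -> dict:
    """The 'x' multipliers: >1 means the category attracted more
    fixation than its area share would predict."""
    return {c: observed[c] / expected[c] for c in observed}

# Illustrative run: small control areas, one large content area.
areas = {"C1": 100_000, "C2": 50_000, "C3": 800_000, "C4": 50_000}
expected = expected_by_area(10.0, areas)   # 10 s of total fixation
observed = {"C1": 1.0, "C2": 2.0, "C3": 6.0, "C4": 1.0}
ratios = observed_over_expected(observed, expected)
```

With these made-up inputs, the small control/move areas come out above 1 and the large content area below 1, which is the shape of the result the conclusions slide reports for C2 and C4.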
  18. Empirical Study Conclusions
     • Area 2 (Widget Control) and Area 4 (Widget Move) were observed on a larger scale than the projected numbers based on area size.
     • Beginner-expertise participants were unable to describe the difference between regular portals and personal web portals.
     • No novice participant could spot the drag-and-drop interactions.
     • We could infer whether the questions had an effect on participants' gaze over the home page displayed. The answer is: yes for the very first question, and not particularly for the following ones.
     • I asked different questions about control and participants looked for similar things, which suggests they are mostly scanning over familiar items that they know have that property. There were quite a few accidental landings on content, which is expected, since it is hard to avoid looking at pictures.
     • Where someone looks doesn't tell you what he or she is thinking. Only qualitative data will start unveiling what people are thinking about while they're looking.
  19. Conclusions and Future Work
     • By developing a methodology for an empirical study that focuses on observation prior to any interaction, we aim to identify which elements people focus on in NUI screens.
     • A prototype with Post-WIMP characteristics and a NUI mode of interaction will be built in order to understand how users visually scan such interfaces to obtain the gist of their interactive potential.
     • Quantitative (eye tracking) and qualitative (verbalizations) data will be combined to produce conclusions about what kind of information can be obtained with the protocol, and how this data can be adapted to indicate better design of interactions with NUI systems.
     • Variables that would affect how one interprets an interface:
     • FAMILIARITY with the technology.
     • VISIBILITY of controls. Perceptible affordances issue. Beyond sight, beyond mind...
     • People TEST the environment to get a response. SCAFFOLDING and WITHDRAWING concept.
  20. Thank you for your attention! Jacques Chueke (Jacques.chueke.1@city.ac.uk)
  21. Bibliography
     • Beaudouin-Lafon, M. (2000). "Instrumental Interaction: An Interaction Model for Designing Post-WIMP User Interfaces". CHI '00: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, The Hague, The Netherlands: ACM Press, pp. 446–453. doi:10.1145/332040.332473. ISBN 1-58113-216-6. http://www.daimi.au.dk/CPnets/CPN2000/download/chi2000.pdf
     • Breeze, J. "Eye Tracking: Best Way to Test Rich App Usability". UX Magazine, accessed 25 November 2010. http://www.uxmag.com/technology/eye-tracking-the-best-way-to-test-rich-app-usability
     • Buxton, W. (2001). "Less is More (More or Less)", in P. Denning (Ed.), The Invisible Future: The Seamless Integration of Technology in Everyday Life. New York: McGraw Hill, pp. 145–179.
     • ITU Internet Reports 2005: The Internet of Things – Executive Summary.
     • van Dam, A. (1997). "Post-WIMP User Interfaces". Communications of the ACM 40(2): 63–67. doi:10.1145/253671.253708.
     • Dourish, P. (2004). Where the Action Is: The Foundations of Embodied Interaction. A Bradford Book: The MIT Press, USA.
     • Ehmke, C. and Wilson, S. (2007). "Identifying Web Usability Problems from Eye-Tracking Data". People and Computers XXI – HCI... but not the way we know it: Proceedings of HCI 2007. British Computer Society.
     • Gaver, W. (1991). "Technology Affordances". ACM 0-89791-383-3/91/0004/0079.
     • Gentner, D. and Nielsen, J. (1996). "The Anti-Mac Interface". Communications of the ACM 39(8): 70–82. http://www.useit.com/papers/anti-mac.html
     • Jacob, R. et al. (2008). "Reality-Based Interaction: A Framework for Post-WIMP Interfaces". CHI '08: Proceedings of the Twenty-Sixth Annual SIGCHI Conference on Human Factors in Computing Systems, Florence, Italy: ACM, pp. 201–210. doi:10.1145/1357054.1357089. ISBN 978-1-60558-011-1.
  22. Bibliography
     • McGrenere, J. and Ho, W. (2000). "Affordances: Clarifying and Evolving a Concept". Proceedings of Graphics Interface 2000, Montreal, May 2000.
     • McNaughton, J. "Utilizing Emerging Multi-touch Table Designs". Technology Enhanced Learning Research Group, Durham University. TR-TEL-10-01.
     • Nielsen, J. (1993). "Noncommand User Interfaces". Communications of the ACM 36(4): 83–99. doi:10.1145/255950.153582. http://www.useit.com/papers/noncommand.html
     • Norman, D. (1999). "Affordance, Conventions and Design". ACM Interactions (May–June 1999), pp. 38–42.
     • Picard, R. (1998). Affective Computing. The MIT Press, Cambridge, Massachusetts; London, England.
     • Preece, J., Sharp, H. and Rogers, Y. (2009). Interaction Design: Beyond Human-Computer Interaction (2nd edition). John Wiley & Sons, Ltd., West Sussex, UK.
     • Ramduny-Ellis, D., Dix, A., Hare, J. and Gill, S. (2009). "Physicality: Towards a Less-GUI Interface" (Preface). Proceedings of the Third International Workshop on Physicality, Cambridge, England.
     • Sorensen, M. (2009). "Making a Case for Biological and Tangible Interfaces". Proceedings of the Third International Workshop on Physicality, Cambridge, England.
     • Sternberg, R. (2009). Cognitive Psychology. Wadsworth, Cengage Learning, Belmont, CA, USA.
     • Vyas, D., Chisalita, C. and van der Veer, G. (2006). "Affordance in Interaction". ECCE '06: Proceedings of the 13th European Conference on Cognitive Ergonomics: Trust and Control in Complex Socio-Technical Systems. ACM, New York, NY, USA. ISBN 978-3-906509-23-5.
     • Wigdor, D. and Wixon, D. (2011). Brave NUI World: Designing Natural User Interfaces for Touch and Gesture. Morgan Kaufmann Publishers, USA.
