hcid2011 - Gesture Based Interfaces: Jacques Chueke (HCID, City University London)

One of the many talks at hcid2011 held at City University London on 19th April 2011

Slide notes
  • Welcome to my session, entitled ‘Beyond Mouse & Keyboard: Post-WIMP and Novel Forms of Interaction’. My name is Jacques. I’m certain we are about to have an interesting chat about a subject that I love: the evolution of digital interfaces and novel forms of interaction. If you’re wondering what accent this is, it’s Brazilian Portuguese. I’m a PhD researcher at the Centre for HCI Design. I’m a teacher in Brazil, teaching usability for web and software on postgraduate degrees, and graphic design at PUC-Rio university. It’ll be a great pleasure to share with you some thoughts about my research. Shall we start?
  • It’s happening for sure. There’s an obvious interest in novel forms of interaction within the industry: games (Wii, PlayStation Move, Xbox 360 Kinect), Ubicomp, PCs, the arts. Mark D. Weiser (July 23, 1952 – April 27, 1999) was a chief scientist at Xerox PARC in the United States. Weiser is widely considered to be the father of ubiquitous computing, a term he coined in 1988. While Weiser worked for a variety of computer-related startups, his seminal work was in the field of ubiquitous computing while leading the computer science laboratory at PARC, which he joined in 1987. His ideas were significantly influenced by his father’s reading of Michael Polanyi’s “The Tacit Dimension”. He became head of the computer science laboratory in 1988 and chief technology officer in 1996, authoring more than eighty technical publications.
  • I’m going to let you decide what’s wrong with this picture.
  • 1980: the epitome of the desktop metaphor. Abstract tasks vs. RBI (Reality-Based Interaction): based on physical skills, limbs as extensions, gaze, notions of depth/perspective, knowledge of the real world (physics, inertia); 3D interactions, for instance; more organic. Jacob (2008): ‘A useful interface will rarely entirely mimic the real world, but will necessarily include some unrealistic or artificial features and commands. In fact, much of the power of using computers comes from this multiplier effect—the ability to go beyond a precise imitation of the real world. For example, in a GUI, one might want to go beyond realistically pointing to and dragging individual files to more abstract commands like Archive all files older than 180 days or Delete all files that contain the text string “reality-based”.’ (FILTER)
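Jacob's "multiplier effect" above is easy to make concrete: the two abstract commands he names are simple predicates over a file tree that no sequence of realistic point-and-drag gestures can express. A minimal sketch (function names are mine, purely illustrative):

```python
import os
import time

def files_older_than(root, days):
    """Select files under `root` last modified more than `days` days ago."""
    cutoff = time.time() - days * 86400
    selected = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) < cutoff:
                selected.append(path)
    return selected

def files_containing(root, needle):
    """Select files under `root` whose text content contains `needle`."""
    selected = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    if needle in f.read():
                        selected.append(path)
            except OSError:
                continue  # unreadable file: skip, as a file manager would
    return selected
```

"Archive all files older than 180 days" then reduces to archiving `files_older_than(home, 180)`: a one-line abstract command rather than an imitation of physical manipulation.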
  • REALISTIC vs. NON-REALISTIC / ABSTRACTION vs. REALISM. Andries “Andy” van Dam (born 8 December 1938, Groningen) is a Dutch-born American professor of computer science and former Vice-President for Research at Brown University in Providence, Rhode Island. Together with Ted Nelson he contributed to the first hypertext system, HES, in the late 1960s. He co-authored Computer Graphics: Principles and Practice along with J.D. Foley, S.K. Feiner, and John Hughes. He also co-founded the precursor of today’s ACM SIGGRAPH conference. Currently, Professor van Dam is teaching at Brown University. He teaches Introduction to Computer Graphics, as well as one first-year course every fall. He is also serving on the Technical Board of Microsoft Research, as Chairman of the Rhode Island Governor’s Science and Technology Advisory Council (STAC), and as Chairman of the IEEE James H. Mulligan, Jr. Education Medal committee. In 1994 he was inducted as a Fellow of the Association for Computing Machinery, and a chaired professorship was recently endowed in his honor at Brown University.
  • Very interested in this moment of first interaction with Post-WIMP/NUI without prior learning. How should the interface react? What should it display? How do we design to maximize efficiency and reduce learning/memory/attention load (cognitive issues)? How do we signal properly what can be clicked on and what can be done, even before any interaction takes place? That justifies the use of eye tracking/eye gaze. Although the technology of interaction is evolving, what is being presented on the ‘screen’ is not. If what is displayed on the screen does not match the possibilities brought by new technologies for interaction, we might not experience the full embodied condition and the easiest, most intuitive mode of interaction that could lighten our cognitive burden/learning process. There are more objects of interest than meets the eye. Personal computing requires 3-5 small tasks in order to access a larger application to perform a major task. WIMP: Window, Icon, Menu, Pointing device. WYSIWYG: ‘what you see is what you get’; the term is used in computing to describe a system in which content displayed during editing appears very similar to the final output, which might be a printed document, web page, or slide presentation. Direct Manipulation: Shneiderman, 1983. HCI: plans, procedures, tasks and goals. GOMS: Goals, Operators, Methods, and Selection rules (idealized goals...). HCD, Human-Centered Design: the tool dictates the activities. “In the early 1980s Xerox launched Star, the first commercial system with a Graphical User Interface (GUI) and the first to use the ‘desktop’ metaphor to organize a user’s interactions with the computer. Despite the perception of huge progress, from the perspective of design and usage models, there has been precious little progress in the intervening years. In the tradition of Rip van Winkle, a Macintosh user from 1984 who just awoke from a 17-year sleep would have no more trouble operating a ‘modern’ PC than operating a modern car.” (Buxton, 2001) “Rip Van Winkle” is a short story by the American author Washington Irving published in 1819: Rip is told that he has apparently been away from the village for twenty years. “Skill” in these computational modes is like remembering a recipe in order to use an application for a desired outcome. GUI models are performatory by nature, rather than exploratory based upon the senses they use. In addition, GUI has no intended specialization, therefore usability is hampered even more. Sorensen (2009). “Technological systems set up barriers between things – barriers between applications, barriers between files, barriers between activities, barriers between media, barriers between users, and so on. Applications on a PC exist in different worlds; even those that are integrated into ‘suites’ still maintain barriers between different forms of content and different forms of interaction with that content.” Dourish (2005: 197). “The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information” is one of the most highly cited papers in psychology. It was published in 1956 by the cognitive psychologist George A. Miller of Princeton University’s Department of Psychology in Psychological Review. It is often taken to argue that the number of objects an average human can hold in working memory is 7 ± 2. This is frequently referred to as Miller’s Law (not to be confused with his theory of communication, also called Miller’s Law). Recent research has demonstrated not only that the “law” is based on a misinterpretation of Miller’s paper, but that the correct number is probably around three or four. In his article, Miller discussed a coincidence between the limits of one-dimensional absolute judgment and the limits of short-term memory.
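The GOMS family mentioned above includes the Keystroke-Level Model (KLM), which predicts expert task time by summing fixed operator times. A minimal sketch; the operator values are the classic Card, Moran & Newell estimates, while the example sequence is my own illustration:

```python
# Classic KLM operator times, in seconds (Card, Moran & Newell).
KLM_TIMES = {
    "K": 0.20,  # keystroke (average skilled typist)
    "P": 1.10,  # point with the mouse at a target
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
    "B": 0.10,  # press or release the mouse button
}

def klm_estimate(ops):
    """Predict expert task time for an operator sequence like 'HMPBB'."""
    return sum(KLM_TIMES[op] for op in ops)

# Example: delete a file via a menu -- home to mouse, think, point at
# the file, click (press + release), think, point at the menu item, click.
delete_via_menu = klm_estimate("HMPBBMPBB")  # ~5.7 seconds
```

Models like this are exactly the "idealized goals" caveat in the note: they predict error-free expert performance, not the exploratory first contact with a Post-WIMP interface that this research targets.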
  • We view the affordances of an artefact as the possibilities (for both thinking and doing) that are signified by the users during their interaction with the artefact. Acknowledging the work of Baerentsen & Trettvik, we propose an interaction-centered view of affordance, which we call Affordance in Interaction. From this view, affordances of an artefact are not the properties of the artefact but a relationship that is socially and culturally constructed between the users and the artefact in the lived world. This view strongly suggests that affordance emerges during a user’s interaction with the environment. In addition, the affordance-in-interaction view focuses on the ‘active interpretations’ of the users interacting with the artefact. From this view, users are actively participating in the interaction with the artefact and continuously interpreting the situation and constructing and re-building meanings about the artefact. We suggest that affordances can be better understood as an interpretative relationship between users and the artefact. Vyas et al (2006). SEMIOTICS, SIGN: Charles Morris (Syntactic – Semantic – Pragmatic); Peirce (Representamen – Object – Interpretant x Icon – Index – Symbol)
  • I made a distinction for the purpose of better clarifying how Perceptible Affordances operate in Post-WIMP: as a bridge between the interface layer (visual, acoustic, haptic) and the mode of interaction. The trick is to show what is really possible to be done rather than what is apparently possible. ‘In today’s screen design sometimes the cursor shape changes to indicate the desired action (e.g., the change from arrow to hand shape in a browser), but this is a convention, not an affordance. After all, the user can still click anywhere, whatever the shape of the cursor. Now if we locked the mouse button when the wrong cursor appeared, that would be a real affordance, although somewhat ponderous. The cursor shape is visual information: it is a learned convention. When you learn not to click unless you have the proper cursor form, you are following a cultural constraint.’ Norman (1999). (1) PERCEIVED AFFORDANCE x (2) CULTURAL CONSTRAINT = CONVENTION + SYMBOLIC COMMUNICATION (SYMBOLIC MEANING IS ARBITRARY – A LEARNED CONVENTION). EXAMPLE OF PERCEPTIBLE AFFORDANCE: SLIDER/BUTTON.
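Norman's distinction above — cursor shape as a learned convention versus a locked button as a real constraint — can be modeled in a toy sketch (class and method names are mine, purely illustrative, not any real GUI toolkit API):

```python
class Canvas:
    """Toy screen with one clickable region, modeling Norman's distinction."""

    def __init__(self, hotspot, lock_outside=False):
        self.hotspot = hotspot            # (x1, y1, x2, y2) clickable area
        self.lock_outside = lock_outside  # True = enforce a real constraint

    def _inside(self, x, y):
        x1, y1, x2, y2 = self.hotspot
        return x1 <= x <= x2 and y1 <= y <= y2

    def cursor_at(self, x, y):
        # Convention: the 'hand' cursor merely *signals* clickability.
        return "hand" if self._inside(x, y) else "arrow"

    def click(self, x, y):
        # With lock_outside, clicking elsewhere is physically impossible:
        # the constraint itself, not the cursor picture, is the affordance.
        if self.lock_outside and not self._inside(x, y):
            return "button locked"
        return "clicked" if self._inside(x, y) else "nothing happens"
```

In the conventional `Canvas`, the cursor changes but the user can still click anywhere; only with `lock_outside=True` does the interface constrain action rather than just advertise it, which is Norman's "real affordance, although somewhat ponderous".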
  • The creation of novel visual metaphors/reactive interfaces should be tested in order to verify whether their Perceptible Affordances efficiently convey possible interactions within Post-WIMP/NUI.
  • Tangible and social computing both reflect this central concern with embodiment. The tangible computing work attempts to capitalize on our physical skills and our familiarity with real-world objects. It also tries to make computation manifest to us in the world in the same way as we encounter other phenomena, both as a way of making computation fit more naturally with the everyday world and as a way of enriching our experiences with the physical. It attempts to move computation and interaction out of the world of abstract cognitive processes and into the same phenomenal world as our other sorts of interactions. Pg. 102-103, Dourish (2004). So the idea of the invisible interface is too simplistic. It frames interface interaction as an all-or-nothing issue. In arguing against the tyranny of complex interfaces that interfere with the job of getting things done, it misidentifies the problem, demonizes the interface, and abandons altogether the idea that the interface might mediate user action. (…) The notion of the invisible interface correctly identifies the inflexible obtrusiveness of conventional interfaces as problematic. Embodied interaction provides some conceptual tools for understanding how the interface might move into the background without disappearing altogether. Dourish (2005: 202-203)
  • The most important achievement of TUI is bridging the gap between input and output by displaying outputs and inputs on the same surface, helping to integrate perception and action seamlessly into one environment. Sorensen (2009), apud Sharlin (2004). MS Surface (2007): specialized applications. Activity Centered Design: the tool is the way (Norman, 2005). An additive rather than transformative quality to the GUI. When a tool is physically acted upon, the result is twofold: causality is instantly observed and time is inherently felt. This coupling between user and tool allows for embodiment. ACD models and TUIs, alternatively, are more open-ended in application, requiring the user to explore rather than CONFORM or ADAPT to the tool. ‘In other words, as we act through technology that has become ready-to-hand, the technology itself disappears from our immediate concerns. We are caught up in the performance of the work; our mode of being is one of “absorbed coping.” The equipment fades into the background. This unspoken background against which our actions are played out is at the heart of Heidegger’s view of being-in-the-world.’ Dourish (2004: 109). PHYSICAL EXPLORATION + VISUAL EXPLORATION (FAULTY…) = REACTIVE INTERFACE. MENTION MEZATOP – MEIRION WILLIAMS. Martin Heidegger (September 26, 1889 – May 26, 1976; German pronunciation: [ˈmaɐ̯tiːn ˈhaɪdɛɡɐ]) was an influential German philosopher known for his existential and phenomenological explorations of the “question of Being.” His best-known book, Being and Time, is considered to be one of the most important philosophical works of the 20th century, and he has been influential beyond philosophy, in literature, psychology, and artificial intelligence. “I think, therefore I am” reflected a doctrine that we “occupy” two different and separate worlds, the world of physical reality and the world of mental existence. This doctrine, called Cartesian dualism, holds that mind and body are quite different; thinking and being are two different sets of phenomena. … So Heidegger rejected the dualism of mind and body altogether. He argued that thinking and being are fundamentally intertwined. … From his perspective, the meaningfulness of everyday experience lies not in the head, but in the world. Pg. 107, Dourish (2004)
  • Accessibility: the technology conveys opportunities for people with special needs to interface with digital devices (multi-sensory). MENTION GReAT (Gesture Recognition in Aphasia Therapy) – SAM – MICROSOFT WORK.
  • I’ve created this installation at the University of Plymouth, i-DAT Centre. It will include eye gaze in the future. Accidental interactions: help initiate participation, legitimize participation, invite people in. Evaluation apprehension: effects amplified by the screen. Generalized rather than specific gestures enable expressiveness. Gesture as performance. Live Wall: the installation “Live Wall” is intended to engage a wider public with emerging scientific, philosophical and technological issues in instructive, engaging and memorable ways. New insights into the practical and philosophical meaning of technology are achieved through the participation of the user at the deepest level of cognitive experience and natural behaviours as a form of action/input/interaction – generating reaction/output/response. MENTION COLLECTIVE SEEPER: EVAN GRANT.
  • Let’s revisit the concept of Perceptible Affordances vs. installations (digital art). Here I substituted the Perceptible Affordances with SYMBOLIC? SYNESTHETIC? ABSTRACT interaction. No SEMANTICS or specific language; purely reactive. The trick is to show what is really possible to be done rather than what is apparently possible. ‘In today’s screen design sometimes the cursor shape changes to indicate the desired action (e.g., the change from arrow to hand shape in a browser), but this is a convention, not an affordance. After all, the user can still click anywhere, whatever the shape of the cursor. Now if we locked the mouse button when the wrong cursor appeared, that would be a real affordance, although somewhat ponderous. The cursor shape is visual information: it is a learned convention. When you learn not to click unless you have the proper cursor form, you are following a cultural constraint.’ Norman (1999). (1) PERCEIVED AFFORDANCE x (2) CULTURAL CONSTRAINT = CONVENTION + SYMBOLIC COMMUNICATION (SYMBOLIC MEANING IS ARBITRARY – A LEARNED CONVENTION).
  • The way we perceive things is changing. We need to re-interpret the shift we’re living through and review the very language that would better convey the message of Post-WIMP/NUI and encompass the experience of the interaction itself. The interface should change to encompass TUI and NUI, rather than just co-exist with additive features in an already exceeded GUI. Research about the different kinds of feedback (multi-sensorial) should take place in order to encompass more efficiently the possibilities of interfacing with eyes, gestures, voice, touch, emotions and the very mind itself. ‘GUI additions such as Natural User Interfaces, Microsoft’s Surface Computer, eye-tracking and other haptic interfaces are not transforming the underlying problems created with the GUI.’ Sorensen (2009)
    1. Beyond Mouse & Keyboard: Post-WIMP and Novel Forms of Interaction<br />Jacques Chueke<br />Open Day, Centre for HCI Design<br />London, UK, April 2011<br />Master in Design, PUC-Rio, RJ, Brazil<br />PhD Researcher at the Centre for HCI Design<br />Faculty of Informatics, City University London<br />1<br />
    2. “The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.” Weiser (1991)<br />Is this possible?<br />Is this desirable?<br />2<br />
    3. It’s not supposed to be like this…<br />Gmail Motion, April 2011<br />3<br />
    4. Post-WIMP<br /><ul><li>In computing, post-WIMP refers to work on user interfaces, mostly graphical user interfaces, which attempt to go beyond the paradigm of Windows, Icons, Menus and a Pointing device. E.g. virtual (VR) and augmented reality (AR), Tangible User-Interface (TUI), ubiquitous and pervasive computing, context-aware computing, mobile devices, console gaming, Affective Computing, Reality-Based Interactions (RBI), Organic-User Interfaces (OUI), Brain-Computer Interface (BCI) and Natural-User Interface (NUI). </li></ul>BumpTop 3D, 2007 - 2010<br />4<br />
    5. Post-WIMP<br /><ul><li>Defined by van Dam as interfaces “containing at least one interaction technique not dependent on classical 2D widgets such as menus and icons”. Ultimately it will involve all senses in parallel, natural language communication and multiple users. Communications of the ACM, 1997 </li></ul>MIT Media Lab: DepthJS – 2011<br />5<br />
    6. The Question<br /><ul><li>Regarding the visual UI (User-Interface), are Perceptible Affordances within Post-WIMP interfaces (with NUI modes of interaction) efficiently signaling possible interactions? Regular WIMP-GUI might not be accomplishing this task efficiently. </li></ul>6<br />
    7. <ul><li> "Affordance" means what you can do to an object. For example, a checkbox affords turning on and off, and a slider affords moving up or down. "Perceived affordances" are actions you understand just by looking at the object, before you start using it (or feeling it, if it's a physical device rather than an on-screen UI element). All of this is discussed in Don Norman's book The Design of Everyday Things (a.k.a. POET: Psychology of Everyday Things). </li></ul>Jakob Nielsen's Alertbox, February 19, 2008: Top-10 Application-Design Mistakes. http://www.useit.com/alertbox/application-mistakes.html<br />7<br />Perceptible Affordances<br /><ul><li> The concept of affordance has been used in HCI to solve problems related to the usability of designed systems. The concept was originally coined by Gibson (1986), introduced to the HCI field by Norman (1988) and further appropriated by Gaver (1991), Bærentsen & Trettvik (2002), amongst others. Vyas, D. (2006)</li></li></ul><li>Perceptible Affordances in Post-WIMP<br />Post-WIMP GUI [INTERFACE LAYER]<br />OUTPUT<br />- LESS SYMBOLIC<br />- MORE INTUITIVE (USER)<br />- MORE REACTIVE (COMPUTER)<br />PERCEPTIBLE AFFORDANCE<br />TUI/NUI/RBI/VR/AR/BMI/OUI [MODE OF INTERACTION LAYER]<br />INPUT<br />8<br />
    8. Hypothesis<br /><ul><li>The paradigms that define GUI and establish the conventions for manipulation, besides the presentation of the visual interface itself - within different digital environments (e.g. Mac and Windows OS and the WWW) - do not match or adequately encompass the most suitable use of Post-WIMP interactions. </li></ul>9<br />Pranav Mistry, inventor of SixthSense, 2009<br />
    9. (…) The notion of the invisible interface correctly identifies the inflexible obtrusiveness of conventional interfaces as problematic. Embodied interaction provides some conceptual tools for understanding how the interface might move into the background without disappearing altogether. <br />Dourish (2005: 202-203)<br />10<br />
    10. TUI: Embodiment<br /><ul><li>Activity Centered Design: The tool is the way (Norman, 2005) + Embodiment (Heidegger ‘being in the world’ and Dourish): We’re the tool.</li></ul>Microsoft Surface, 2007<br />11<br />
    11. NUI: Intuitive<br /><ul><li>Assistive Technology: Tobii P-10 at the SmartLab (UEL).</li></ul>Tobii P-10 equipment, Oct. 2010<br />12<br />
    12. Social Aspects: Installations<br /><ul><li> Live Wall: The objective is to create an installation where I'm interested in the ethnographic and sociological aspects of the experience that takes place between the beholder and the reactive environment. All interaction will occur with regular gestures, eye gaze, movement and sound issued by the observer/participant. </li></ul>Live Wall 01, Jacques Chueke. Plymouth, 2010<br />Live Wall 02, Jacques Chueke. Plymouth, 2010<br />13<br />
    13. Perceptible Affordances in Digital Art<br />Digital Art [INTERFACE LAYER]<br />OUTPUT<br />- No Prior Learning, besides knowledge of the world<br />- Reactive<br />SYMBOLIC / SYNESTHETIC / ABSTRACT<br />TUI/NUI/RBI/VR/AR/BMI/OUI [MODE OF INTERACTION LAYER]<br />INPUT<br />14<br />
    14. Conclusions<br /><ul><li>Novel technologies for interaction will be used to elicit user exploration of new and visually unfamiliar digital interfaces, to understand how users visually scan such interfaces to obtain the gist of their interactive potential. Perceptible Affordances theory, within HCI and Cognitive Psychology, will be used to better understand those issues.
    15. We propose a view that identifies some fraction of a user interface as based on the Post-WIMP theme (1) plus some other fraction that provides computer-only functionality (2) that is not realistic. As a design approach or metric, the goal would be to make the first category as large as possible and use the second only as necessary, highlighting the tradeoff explicitly. </li></ul>Jacob et al., 2008<br />15<br />
    16. Bibliography<br />Beaudouin-Lafon, M. (November 2000). "Instrumental Interaction: An Interaction Model for Designing Post-WIMP User Interfaces". CHI '00: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. The Hague, The Netherlands: ACM Press. pp. 446–453. doi:10.1145/332040.332473. ISBN 1-58113-216-6. http://www.daimi.au.dk/CPnets/CPN2000/download/chi2000.pdf. <br />Breeze, James. Eye Tracking: Best Way to Test Rich App Usability. UX Magazine, accessed 25 November 2010. (http://www.uxmag.com/technology/eye-tracking-the-best-way-to-test-rich-app-usability) <br />Buxton, W. (2001). Less is More (More or Less), in P. Denning (Ed.), The Invisible Future: The Seamless Integration of Technology in Everyday Life. New York: McGraw Hill, 145–179.<br />ITU Internet Reports 2005: The Internet of Things – Executive Summary.<br />van Dam, A. (February 1997). "Post-WIMP User Interfaces". Communications of the ACM (ACM Press) 40 (2): pp. 63–67. doi:10.1145/253671.253708. <br />Dourish, P. Where the Action Is: The Foundations of Embodied Interaction. A Bradford Book: The MIT Press, USA, 2004.<br />Gaver, W. Technology Affordances. Copyright 1991 ACM 0-89791-383-3/91/0004/0079.<br />Gentner, D. and Nielsen, J. (August 1996). "The Anti-Mac Interface". Communications of the ACM (ACM Press) 39 (8): pp. 70–82. http://www.useit.com/papers/anti-mac.html. <br />Jacob, R. et al. (2008). "Reality-Based Interaction: A Framework for Post-WIMP Interfaces". CHI '08: Proceedings of the Twenty-Sixth Annual SIGCHI Conference on Human Factors in Computing Systems. Florence, Italy: ACM. pp. 201–210. doi:10.1145/1357054.1357089. ISBN 978-1-60558-011-1.<br />16<br />
    17. Bibliography<br />McGrenere, J., Ho, W. (2000). Affordances: Clarifying and Evolving a Concept. Procs. of Graphics Interface 2000, Montreal, May 2000.<br />McNaughton, J. Utilizing Emerging Multi-touch Table Designs. Technology Enhanced Learning Research Group - Durham University. TR-TEL-10-01.<br />Nielsen, J. (April 1993). "Noncommand User Interfaces". Communications of the ACM (ACM Press) 36 (4): pp. 83–99. doi:10.1145/255950.153582. http://www.useit.com/papers/noncommand.html. <br />Norman, D. (1999). Affordance, Conventions and Design. In ACM Interactions (May + June, 1999), 38-42. <br />Picard, R. Affective Computing. The MIT Press, Cambridge, Massachusetts. London, England, 1998.<br />Ramduny-Ellis, D.; Dix, A.; Hare, J.; Gill, S. Physicality: Towards a Less-GUI Interface (Preface). Procs. Third International Workshop on Physicality. Cambridge, England, 2009.<br />Sorensen, M. Making a Case for Biological and Tangible Interfaces. Procs. Third International Workshop on Physicality. Cambridge, England, 2009.<br />Sternberg, R. Cognitive Psychology. Wadsworth, Cengage Learning. Belmont, CA, USA, 2009.<br />Vyas, D., Chisalita, C., Veer, G. Affordance in Interaction. ECCE '06: Proceedings of the 13th European Conference on Cognitive Ergonomics: Trust and Control in Complex Socio-Technical Systems. ACM, New York, NY, USA, 2006. ISBN 978-3-906509-23-5.<br />17<br />
