Affordances and gestural interaction on multi-touch
interface systems: building new mental models
Adriano Bernardo Renzi1* and Sydney Freitas2
1 Esdi-UERJ, Senac-RJ, UX Design professor, Rio de
Janeiro, Brazil
adrianorenzi@gmail.com
2 Esdi-UERJ, Industrial Design professor, Rio de
Janeiro, Brazil
sydneyfreitas@terra.com.br
Abstract. This paper investigates the gestural-interaction mental models of users working with touch screen technology for the first time. The research used the Think-aloud Protocol technique for behavioral observation while participants performed tasks on two different apps on an iPad 2. The tasks helped reveal users' recognition of functionality and the gestures they produced to complete each objective. The conclusions drawn from the observed results are discussed in the light of mental model directives and of Buxton's and Spool's theories on innovation and on the factors needed for new technology to achieve high user acceptance.
Keywords: user experience, mental model, think-aloud protocol.
1. Introduction: comparative experiments with mouse and
touch screen tablets
Shneiderman and Sears' comparative research in 1989 on precision, cognitive learning and function perception when users interact with a mouse versus a touch screen presented results implying easier learning, higher precision and fewer erroneous actions with the touch screen device. This easier interaction became especially evident when using 16x16 and 32x32 pixel icons.
The research was developed during the 1980s, when few people had access to computer technology and personal computers were still far from being in every home. Some of the participants were interacting with both the touch screen pad and the mouse for the first time. The affordance of both devices rested on users' mental models of how they would work, built from previous gestural experiences drawn mostly from physical, day-to-day life rather than from computer interaction.
From the late 1980s to recent years, interaction with computers and mobile devices has changed progressively, and our affordance perceptions, mental models and expectations have changed with it. People's virtual and physical lives once belonged to two different domains, but they have been merging throughout years of technological advancement and new possibilities of digital interaction. Lately it has become hard to consider the two worlds apart: friends in restaurants or cafés talk about events in their mutual social networks, or check information online to add to the conversation. New words are being adopted, new jokes are emerging, and new ways of acquiring knowledge are available. The perception of interactions is changing fast, and physical life is no longer the primary reference for understanding the features of innovative devices or systems. The perception of functionality and the process of learning how systems and artifacts work are related to the user's mental model, which is a result of the user's primary and secondary references, knowledge and previous experiences with analogous interactions.
The user's mental model changes and develops according to experience, continuous learning and the affordances perceived during interaction with systems. Each person's mental model influences his or her perception of the interaction and functionality of a new device (Renzi 2010). The term affordance, introduced by the psychologist James J. Gibson in 1977, refers to the cognitive perception of the possibilities for action that an object or system offers a person. This perception of possibilities directly shapes one's mental model of how the object or system works: the better the affordance, the closer the mental model will be to the reality of usage.
This research set out to observe users' perception of icon functionality and gestural interaction on a tablet multi-touch interface, based on each person's previous experiences and references. The participants chosen for the experiment were people who had never used any touch screen device before.
2. Online questionnaire: pre-selection of participants
A preliminary online hybrid questionnaire was applied in undergraduate classes (design majors) in order to select participants with no previous experience with touch screen devices. The questionnaire was created with Google Docs to easily reach the participants and quickly collect the results for analysis, and remained open to participants for two weeks.
Following Moura and Ferreira's (2005) and Mucchielli's (2004) recommendations on how to structure a survey so that answers remain undistorted, it started with broader topics and progressed to more specific subjects. According to the authors, all questions should be very clear to the participants, especially when presented online with no direct contact with the researcher to clarify doubts (Renzi 2008).
Thirty-six students participated, mostly aged seventeen to twenty-two. The questionnaire had nine questions divided into two parts:
- three open-ended questions on participants' personal information (name, e-mail, age)
- six multiple-choice questions on participants' experience with the web and with touch screen devices
Based on the results collected through the questionnaire, thirteen students who had never used mobile multi-touch interface systems were pre-selected for the second phase of the experiment. From this sample, five students accepted the invitation to take part in the second phase.
3. Think-aloud Protocol: mapping users' experience
The Think-aloud Protocol technique was applied to better understand users' interactions with mobile multi-touch interface systems. The iPad 2 (running iOS 6) was chosen as the device for the observation.
According to Villanueva (2004), the technique consists of a researcher observing users performing specific tasks within a controlled environment, while the users describe their actions and thoughts aloud in real time. The researcher records the users' actions through written notes, video or voice recording. Filming and voice recording have the advantage of capturing the users' exact steps and descriptions, while written notes depend on the researcher's experience in observing reactions and quickness in writing down the relevant actions of the experiment (Renzi 2010). On the other hand, taking written notes has the advantage of creating an informal observation environment, resulting in more fluid sessions with the participant, whereas filming captures every movement for later analysis but can be perceived as intimidating. For this research the chosen approach
was to use both voice recording and written notes, in order to keep the users in an informal environment while recording all their spoken actions. The app Voice Recorder HD was used for this phase, recording participants as they performed the required tasks.
Whenever users showed reluctance to verbalize actions and thoughts during the Think-aloud Protocol, questions related to their actions were posed to keep the flow of verbalization going, following Xiao's (2000) experiments: the researcher used general questions as users advanced through the proposed tasks, such as "what do you think about the tool?", "did you understand the steps?", "any doubts regarding the tour?" and "do you have suggestions?". According to Xiao, inserting questions during the session helped identify problems from the perspective of users with varying degrees of skill with the virtual system, as well as gather constructive ideas for improving the tool.
3.1 Tasks for gestural interaction
Two tasks were established to help understand the interaction flow and to point out problems in functionality perception. The apps used for the tasks were Calendar, a built-in schedule agenda, and Starblitz, a single-player spaceship game, both running on iOS 6.2.
Although very different in nature, the two apps require many distinct types of touch interaction from the participants in order to fulfill the proposed tasks: touch-and-drag, extended (long) touch, vertical spin, flip touch and indirect multi-touch with both hands.
The first task asked participants to create a new event/appointment in the Calendar app and include detailed information about the planned schedule: beginning and ending hours, location of the event and personal notes. As part of the same task, a second event had to be inserted one week after the first, with a similar amount of information. The first task was considered finished after users completed the registration or announced they were giving up.
The Calendar app layout can be seen in figure 1. The app simulates an open leather journal, and the layout can be switched among Day, Week, Month, Year and List views. All recorded sessions started on the Day view. Participants explored all layout options, but performed the task of inserting the two appointments using the Day view.
Fig. 1. On the top left corner is the current day and its relation to the month, while on the top right the hour-by-hour schedule for inserting appointments appears. On the bottom there is a timeline with a Today button on the left and a plus (+) button on the right. The registration of a new event creates a blue box that graphically indicates the beginning and ending of an appointment; it can be relocated, edited or deleted at any time by clicking on it.
The second task involved the game Starblitz, in which each participant had to play and survive for as long as possible. In a space scenario (fig. 2), users had to pilot a spacecraft, defend themselves from alien ships and collect a green space mineral called Iridium to deposit at the proper space stations. The user's spacecraft also had an ally ship, controlled automatically by the app, that helped protect the user. Reaching secondary features, such as changing weapons, was not required as part of the task, but was noted as part of the observations and analysis.
Each participant started the task at the first level. Users were free to navigate the game scenario until they were forced to defend themselves.
The game layout (fig. 2) presents a top view of the user's spaceship and a computer-controlled ally ship. In the upper section of the screen, the system shows the player's money and level. On the bottom there are two circular controls, used for movement (left, blue circle-controller) and for firing (right, red circle-controller). It is necessary to use the index fingers of both hands at the same time
to control both features. To direct a control, one touches the circle-controller and drags it in the desired direction, either to move the ship or to aim the firepower.
Other features on the bottom part of the screen are: a circular-arrows icon to change the type of weapon in use, an exclamation icon to use items such as a medical kit, and a central bar indicating the player's current amount of health.
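For readers interested in how two simultaneous on-screen controllers of this kind are typically handled, the following is a minimal, hypothetical Swift/UIKit sketch. Starblitz's actual implementation is not known and is not reproduced here; the view name `DualStickView` and the placeholder methods are illustrative assumptions only. The sketch shows the key mechanism: enabling multi-touch and routing each active finger to either the movement or the firing control.

```swift
import UIKit

/// Hypothetical view hosting two virtual joysticks: movement on the left half,
/// firing on the right half. Each active touch is routed to the side it began on.
final class DualStickView: UIView {
    private var moveTouch: UITouch?
    private var fireTouch: UITouch?

    override init(frame: CGRect) {
        super.init(frame: frame)
        // Without this flag, UIKit delivers only one touch at a time,
        // so the two controllers could never be used simultaneously.
        isMultipleTouchEnabled = true
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let x = touch.location(in: self).x
            if x < bounds.midX, moveTouch == nil {
                moveTouch = touch          // left finger drives the blue movement stick
            } else if fireTouch == nil {
                fireTouch = touch          // right finger drives the red firing stick
            }
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        if let t = moveTouch, touches.contains(t) {
            updateMovementVector(from: t.location(in: self))
        }
        if let t = fireTouch, touches.contains(t) {
            updateFiringVector(from: t.location(in: self))
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        if let t = moveTouch, touches.contains(t) { moveTouch = nil }
        if let t = fireTouch, touches.contains(t) { fireTouch = nil }
    }

    // Placeholders: convert each stick's drag offset into a direction
    // for the ship or its weapons.
    private func updateMovementVector(from point: CGPoint) { /* move the ship */ }
    private func updateFiringVector(from point: CGPoint) { /* aim and fire */ }
}
```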
All participants were given a basic explanation of the game's objectives but, in order to test their cognitive reactions and affordances, no explanation regarding the controls was given.
Fig. 2. Starblitz space scenario layout, with the player's spacecraft centered and the ally ship near the top right. An alien enemy approaches from the bottom. Circle-controllers appear on both lower sides.
4. Observed interactions
In the first task there are two different ways to create a new event in the iOS 6 Calendar: clicking on the + button, placed at the bottom right of the screen, or
touching the desired hour for the event with an extended touch. Either way, a new blue box pops up in which to include the title, notes, place and duration of the event.
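As an illustration of these two entry points, the sketch below shows how such dual creation paths are commonly wired up in UIKit. This is a hedged reconstruction in modern Swift, not Apple's Calendar code (which shipped for iOS 6 and is not public); `addButton`, `dayGridView` and `presentNewEventEditor` are hypothetical names. The point it makes is that the long-press path is registered programmatically and leaves no visual trace on screen, which is why first-time users in the study never discovered it.

```swift
import UIKit

final class DayViewController: UIViewController {
    // Hypothetical views standing in for the "+" button and the hour-by-hour day grid.
    let addButton = UIButton(type: .contactAdd)
    let dayGridView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Path 1: a short tap on the "+" button creates a new event.
        addButton.addTarget(self, action: #selector(addTapped), for: .touchUpInside)

        // Path 2: an extended touch (about half a second or more) on the day grid
        // also creates an event at the pressed hour. Nothing on screen hints at
        // this gesture.
        let longPress = UILongPressGestureRecognizer(target: self,
                                                     action: #selector(gridLongPressed(_:)))
        longPress.minimumPressDuration = 0.5
        dayGridView.addGestureRecognizer(longPress)
    }

    @objc private func addTapped() {
        presentNewEventEditor(at: nil)
    }

    @objc private func gridLongPressed(_ gesture: UILongPressGestureRecognizer) {
        guard gesture.state == .began else { return }
        // Map the touch location to an hour slot (placeholder logic).
        let point = gesture.location(in: dayGridView)
        presentNewEventEditor(at: point)
    }

    private func presentNewEventEditor(at point: CGPoint?) {
        // Placeholder for showing the blue event box / editor popover.
        print("New event editor requested at \(String(describing: point))")
    }
}
```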
During the observations, participants had a hard time figuring out where to click in order to start a new event. None perceived the possibility of an extended-touch interaction with the iPad. All participants tried every icon that resembled a 3D button or an internet text link. Users mostly tried to click (a short, quick touch) directly on hours, dates or icons with a 3D-button look. They also tried double-clicks and sequential clicking to achieve the objective. All observed users showed frustration and surprise when a single short click did not create an event on the desired date.
In all observations, the + button was used only as a last resort, after testing every graphic that could suggest interactivity. Two participants gave up and were able to complete the task only by reusing a previously existing event and changing its information to their needs; these two opened the existing event by clicking directly on it.
After opening a new event, users have to indicate the beginning and ending hours by spreading the edges of the blue box (representing the extension of the event) to the desired time span, or by clicking on the hours' text link. The event box then increases its size vertically and the hours are presented as shown in fig. 3. To change the numbers, one has to spin them vertically, simulating a rotary date rubber-stamp, and stop on the desired hour. Four of the five participants tried, without success, to short-click directly on the numbers, two of them with increasing frustration while clicking repeatedly and double-clicking. All users took a long time to find the correct gestural interaction to change the numbers: four of the five understood the feature only after accidentally brushing against the rotary control and making the numbers rotate, and one participant quit. Even with the rounded perspective illusion of the virtual rotary control, participants could not relate the image to a vertical-spin interaction for rotating the numbers.
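The spinning hour selector behaves like the standard iOS wheel picker, which responds to vertical drags and flicks rather than taps on the numbers. Below is a minimal sketch using a plain UIDatePicker as a stand-in for the Calendar's rotary control; the actual control inside the Calendar event editor is Apple's own and is not reproduced here, and `startPicker` and `timeChanged` are hypothetical names.

```swift
import UIKit

final class EventTimeViewController: UIViewController {
    // Stand-in for the Calendar's rotary hour selector: a standard wheel-style picker.
    let startPicker = UIDatePicker()

    override func viewDidLoad() {
        super.viewDidLoad()

        startPicker.datePickerMode = .dateAndTime
        if #available(iOS 13.4, *) {
            // Force the spinning-wheel appearance on modern iOS versions;
            // on iOS 6 the wheel was the only style available.
            startPicker.preferredDatePickerStyle = .wheels
        }
        startPicker.minuteInterval = 5

        // Values change only when the wheel is spun (vertical drag or flick);
        // tapping or double-tapping a number has no effect, matching the
        // behaviour that confused the participants.
        startPicker.addTarget(self, action: #selector(timeChanged(_:)), for: .valueChanged)

        view.addSubview(startPicker)
    }

    @objc private func timeChanged(_ picker: UIDatePicker) {
        print("Event start set to \(picker.date)")
    }
}
```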
Once the procedure for creating an event had been discovered, creating the second event one week later presented no difficulties. The flipping action to change pages in the calendar was perceived easily, and users preferred a short click-and-drag on the right corner rather than simply clicking on the right edge. This action was used by the participants to advance one week and create the second event of the proposed task.
Fig. 3. Selecting the desired hour for the event simulates a rotary date rubber-stamp.
During the second task (surviving as long as possible while playing Starblitz), similar reactions regarding gestural interaction were noticed. As soon as the game started and the player's spacecraft appeared, all participants instinctively short-clicked with one finger directly on the spacecraft to move it. A second instinctive action observed was to click-and-drag the vessel in the direction they wanted it to move.
As in the previous task with the Calendar app, frustration with the ship's lack of response eventually led users to touch (and drag) the spacecraft repeatedly and nervously. The users' choice of hand followed their natural side: right-handed participants used the right index finger, while left-handed participants used the left one. Since the ally craft is similar to the player's spaceship, users also tried the same touch-and-drag interaction with it, with no success. Some of the users' frustrated remarks were recorded: "I am trying to move it but it's not working" and "I can't do it! I can't do it!"
As firing enemy spaceships approached, all participants' immediate reaction was to click harder and faster on their own spacecraft and on the alien enemies, hoping the action would mark a target for automatic fire from their ship. These attempts frequently resulted in rapid, forceful, repeated tapping on the iPad 2.
The length of time each user took to perceive the dual-touch possibility of the controllers influenced how long they survived. Four users died in less than three minutes (the quickest death took two minutes), but one player stood out by recognizing the controls faster and survived for eight minutes and thirty seconds; he was also the only player who understood how to change the type of ammunition using the rotation icon. Players who took longer to recognize the functionality of the circle-controllers panicked as enemy ships closed in and fired upon them, and by the time they understood the controls their health was no longer high enough to keep them alive much longer.
During the open interview with the participants, after both tasks had been performed, users confirmed being surprised by the many possibilities of gestural interaction with the iPad 2 beyond short-click and click-and-drag.
5. Conclusion and discussion
The use of the Think-aloud Protocol made it possible to observe the participants' reactions, flow and perceptions of functionality. It was important to compare these observations with their previous interaction experiences and technological references collected in the online survey; the comparison made it possible to draft characteristics of the students' mental models regarding gestural interaction.
While observing participants trying to finish their tasks, it was possible to verify that users had difficulty perceiving and understanding actions that required indirect interaction. All users showed a tendency to interact through direct short-click touches or click-and-drag actions, reproducing their previous experiences and references with mouse interaction and desktop computers.
The users' choice of icons to interact with, or to reach a desired function, demonstrated reference to and similarity with the dynamics of internet navigation, visual hierarchy and links. In the Calendar app, the + icon seemed imperceptible, partly because of its hidden location in the bottom corner and partly because of users' perception of its functionality. Users tried clicking the + icon only after exhaustively experimenting with every graphic or text element that bore any resemblance to an internet link. The option of touching the desired date for two seconds to open a new event was never perceived or considered as a possibility by any participant, as the later interviews confirmed.
Similar reactions were observed in the procedure for marking the beginning and ending of events through the vertical-spin interaction. Although the mechanism graphically simulated a rotary date rubber-stamp, the possibility of rotating the
numbers by spinning them vertically never crossed any participant's mind. Once more, interaction attempts were based on direct short-clicks; after repeated touching, only accidental finger slips showed users how the control worked.
The use of both hands to control the spaceship in the Starblitz game seemed an alien concept to the users. All participants reacted naturally, following the web and mouse cultural convention of clicking one thing at a time. The possibility of interacting with two icons at the same time was declared a complete surprise. Users perceived the circle-controllers as movement radars, as in computer games played with mouse and keyboard.
In both tasks, the gestural interaction affordances users perceived most easily in their first reactions were (1) direct short-clicks, (2) click-and-drag and (3) page flips. All choices of what to click on and experiment with, as well as expectations of responses, mirrored internet interaction models and links on websites, computer software icons and mouse actions. The participants' mental models, based on previous experience with the internet and desktop computers, influenced their perception of functionality and affordances while interacting with a touch screen device. Users unconsciously reach for similarities with previous experiences, personal references and learning concepts from reference groups (Renzi 2010, p. 37). The observed results indicate that users' references are based on commonly used devices, their related interactions and possibly online advertising about touch screen features, which mostly depicts direct short-clicks, click-and-drag gestures, image expansion and contraction, and page flipping. These references seem to influence not only users' perception of affordances, but also their expectations of the iPad's use and benefits.
From Shneiderman's comparative tests in the late 1980s to the recent spread of touch screen mobile devices around the world, references, user experiences, affordances and mental models are continuously being constructed. The evolution of mental models through new experiences builds cultural change and, consequently, new expectations and new cognitive readings, which become the base reference for innovations and new interaction concepts. All participants absorbed new references from the experiment and now hold new mental models regarding touch screen interaction; if they were contacted again for new tasks on the iPad 2, the results would be very different, based on those new mental models.
A brand new product bringing innovative concepts, without references and mental models built over the years, could offer interactions too hard to understand or accept. Innovation is the result of a long period of research, testing and consumers' gradual familiarization with new concepts. Bill Buxton (2013), principal researcher at Microsoft Research and professor at the Technical University of Eindhoven and the University of Toronto, presents a graphical representation of the "long nose" of innovation spanning 20 years, from concept and first tests to the product's highest advancements.
Jared Spool (2013), founder of User Interface Engineering and working with user experience since 1978, shows that an innovative product has to be built on certain factors to achieve high acceptance and wide adoption; one of these factors is market maturity.
Just as years of using the mouse, the computer and the internet have built up the experiences, cultural conventions and references that helped establish mental models and expectations for touch screen devices and gestural interaction, future technologies and concept innovations must consider the references and experiences being developed now in order to meet users' expectations with easy cognitive affordances in the next 5-10 years.
6. Bibliography
1. Buxton, Bill. Why eBay is a Better Prototyping Tool than a 3D Printer, The Long Nose, and other Tales of History. Interaction South America, Recife (2013)
2. Moura, Maria Lúcia Seidl de; Ferreira, Maria Cristina. Projetos de pesquisa: elaboração, redação e apresentação. p. 144. Rio de Janeiro: Eduerj (2005)
3. Mucchielli, Roger. O questionário na pesquisa psicossocial. Martins Fontes, São Paulo (1979)
4. Renzi, Adriano Bernardo. Usabilidade na procura e compra de livros em livrarias online. Dissertation (Master of Science). Esdi-UERJ, Rio de Janeiro (2010)
5. Renzi, Adriano Bernardo; Freitas, Sydney; Santos, Robson. Expectativas dos usuários nos processos de procura e decisão de compras de livros em lojas virtuais e livrarias: um modelo mental. ABERGO, Porto Seguro, Bahia (2008)
6. Sears, Andrew; Shneiderman, Ben. High Precision Touchscreens: Design Strategies and Comparisons with a Mouse. Human-Computer Interaction Laboratory, Department of Computer Science, University of Maryland, College Park, MD (1989)
7. Spool, Jared. Mobile & UX: Inside the Eye of the Perfect Storm. Interaction South America, Recife (2013)
8. Spool, Jared. Understanding the Kano Model - A Tool for Sophisticated Designers. User Interface Engineering article (2011) <http://www.uie.com/articles/kano_model/>
9. Villanueva, Rochelle de Asa. Think-aloud protocol and heuristic evaluation of non-immersive, desktop photo-realistic virtual environments. Dissertation (Master of Science), University of Otago, Dunedin, New Zealand (2004)
10. Xiao, D. Y. Experiencing the library in a panorama virtual reality environment. Library Hi Tech, vol. 18, no. 2, pp. 177-184 (2000)
More Related Content

What's hot

Introduction hci
Introduction hciIntroduction hci
Introduction hcisawsan slii
 
Master of Exhibit Design at La Sapienza University, Introduction and Lesson 1
Master of Exhibit Design at La Sapienza University, Introduction and Lesson 1Master of Exhibit Design at La Sapienza University, Introduction and Lesson 1
Master of Exhibit Design at La Sapienza University, Introduction and Lesson 1Salvatore Iaconesi
 
Paper id 21201494
Paper id 21201494Paper id 21201494
Paper id 21201494IJRAT
 
Affective computing
Affective computingAffective computing
Affective computingAnkit Moonka
 
What Is Interaction Design
What Is Interaction DesignWhat Is Interaction Design
What Is Interaction DesignGraeme Smith
 
MindTrek2011 - ContextCapture: Context-based Awareness Cues in Status Updates
MindTrek2011 - ContextCapture: Context-based Awareness Cues in Status UpdatesMindTrek2011 - ContextCapture: Context-based Awareness Cues in Status Updates
MindTrek2011 - ContextCapture: Context-based Awareness Cues in Status UpdatesVille Antila
 
Contents of Internet of Things(IoT) By Thakur Pawan & Pathania Susheela
Contents of Internet of Things(IoT) By Thakur Pawan & Pathania SusheelaContents of Internet of Things(IoT) By Thakur Pawan & Pathania Susheela
Contents of Internet of Things(IoT) By Thakur Pawan & Pathania SusheelaGovt. P.G. College Dharamshala
 
Introduction of Internet of Things(IoT) By Thakur Pawan & Pathania Susheela
Introduction of Internet of Things(IoT) By Thakur Pawan & Pathania SusheelaIntroduction of Internet of Things(IoT) By Thakur Pawan & Pathania Susheela
Introduction of Internet of Things(IoT) By Thakur Pawan & Pathania SusheelaGovt. P.G. College Dharamshala
 
Understanding the Privacy Implications of Using Context-based Awareness Cues ...
Understanding the Privacy Implications of Using Context-based Awareness Cues ...Understanding the Privacy Implications of Using Context-based Awareness Cues ...
Understanding the Privacy Implications of Using Context-based Awareness Cues ...Ville Antila
 
What is Human Computer Interraction
What is Human Computer InterractionWhat is Human Computer Interraction
What is Human Computer Interractionpraeeth palliyaguru
 
Psychology Human Computer Interaction
Psychology Human Computer InteractionPsychology Human Computer Interaction
Psychology Human Computer InteractionSeta Wicaksana
 
Affective computing short -external
Affective computing   short -externalAffective computing   short -external
Affective computing short -externaldiannepatricia
 
Interaction 09 Introduction to Interaction Design
Interaction 09 Introduction to Interaction DesignInteraction 09 Introduction to Interaction Design
Interaction 09 Introduction to Interaction DesignDave Malouf
 
PACER: Fine-grained Interactive Paper via Hybrid Camera and Touch Gestures on...
PACER: Fine-grained Interactive Paper via Hybrid Camera and Touch Gestures on...PACER: Fine-grained Interactive Paper via Hybrid Camera and Touch Gestures on...
PACER: Fine-grained Interactive Paper via Hybrid Camera and Touch Gestures on...Chunyuan Liao
 
Human-Centered Artificial Intelligence: Reliable, Safe & Trustworthy
Human-Centered Artificial Intelligence: Reliable, Safe & TrustworthyHuman-Centered Artificial Intelligence: Reliable, Safe & Trustworthy
Human-Centered Artificial Intelligence: Reliable, Safe & TrustworthyJalnaAfridi
 
MOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATION
MOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATIONMOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATION
MOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATIONijma
 
RoutineMaker: Towards End-User Automation of Daily Routines Using Smartphones
RoutineMaker: Towards End-User Automation of Daily Routines Using SmartphonesRoutineMaker: Towards End-User Automation of Daily Routines Using Smartphones
RoutineMaker: Towards End-User Automation of Daily Routines Using SmartphonesVille Antila
 

What's hot (20)

Introduction hci
Introduction hciIntroduction hci
Introduction hci
 
Human Computer Interaction of an Information System
Human Computer Interaction of an Information SystemHuman Computer Interaction of an Information System
Human Computer Interaction of an Information System
 
Master of Exhibit Design at La Sapienza University, Introduction and Lesson 1
Master of Exhibit Design at La Sapienza University, Introduction and Lesson 1Master of Exhibit Design at La Sapienza University, Introduction and Lesson 1
Master of Exhibit Design at La Sapienza University, Introduction and Lesson 1
 
Paper id 21201494
Paper id 21201494Paper id 21201494
Paper id 21201494
 
Affective computing
Affective computingAffective computing
Affective computing
 
What Is Interaction Design
What Is Interaction DesignWhat Is Interaction Design
What Is Interaction Design
 
Hci activity#1
Hci activity#1Hci activity#1
Hci activity#1
 
MindTrek2011 - ContextCapture: Context-based Awareness Cues in Status Updates
MindTrek2011 - ContextCapture: Context-based Awareness Cues in Status UpdatesMindTrek2011 - ContextCapture: Context-based Awareness Cues in Status Updates
MindTrek2011 - ContextCapture: Context-based Awareness Cues in Status Updates
 
Contents of Internet of Things(IoT) By Thakur Pawan & Pathania Susheela
Contents of Internet of Things(IoT) By Thakur Pawan & Pathania SusheelaContents of Internet of Things(IoT) By Thakur Pawan & Pathania Susheela
Contents of Internet of Things(IoT) By Thakur Pawan & Pathania Susheela
 
Introduction of Internet of Things(IoT) By Thakur Pawan & Pathania Susheela
Introduction of Internet of Things(IoT) By Thakur Pawan & Pathania SusheelaIntroduction of Internet of Things(IoT) By Thakur Pawan & Pathania Susheela
Introduction of Internet of Things(IoT) By Thakur Pawan & Pathania Susheela
 
Understanding the Privacy Implications of Using Context-based Awareness Cues ...
Understanding the Privacy Implications of Using Context-based Awareness Cues ...Understanding the Privacy Implications of Using Context-based Awareness Cues ...
Understanding the Privacy Implications of Using Context-based Awareness Cues ...
 
What is Human Computer Interraction
What is Human Computer InterractionWhat is Human Computer Interraction
What is Human Computer Interraction
 
Iot and sensors
Iot and sensorsIot and sensors
Iot and sensors
 
Psychology Human Computer Interaction
Psychology Human Computer InteractionPsychology Human Computer Interaction
Psychology Human Computer Interaction
 
Affective computing short -external
Affective computing   short -externalAffective computing   short -external
Affective computing short -external
 
Interaction 09 Introduction to Interaction Design
Interaction 09 Introduction to Interaction DesignInteraction 09 Introduction to Interaction Design
Interaction 09 Introduction to Interaction Design
 
PACER: Fine-grained Interactive Paper via Hybrid Camera and Touch Gestures on...
PACER: Fine-grained Interactive Paper via Hybrid Camera and Touch Gestures on...PACER: Fine-grained Interactive Paper via Hybrid Camera and Touch Gestures on...
PACER: Fine-grained Interactive Paper via Hybrid Camera and Touch Gestures on...
 
Human-Centered Artificial Intelligence: Reliable, Safe & Trustworthy
Human-Centered Artificial Intelligence: Reliable, Safe & TrustworthyHuman-Centered Artificial Intelligence: Reliable, Safe & Trustworthy
Human-Centered Artificial Intelligence: Reliable, Safe & Trustworthy
 
MOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATION
MOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATIONMOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATION
MOBILE AUGMENTED REALITY APPLICATION FOR PRIMARY SCHOOL EDUCATION
 
RoutineMaker: Towards End-User Automation of Daily Routines Using Smartphones
RoutineMaker: Towards End-User Automation of Daily Routines Using SmartphonesRoutineMaker: Towards End-User Automation of Daily Routines Using Smartphones
RoutineMaker: Towards End-User Automation of Daily Routines Using Smartphones
 

Viewers also liked

Invasive Species Adaptive Managment Plan (NFCLP)
Invasive Species Adaptive Managment Plan (NFCLP)Invasive Species Adaptive Managment Plan (NFCLP)
Invasive Species Adaptive Managment Plan (NFCLP)Michael McPeak
 
Demanda contra la República y contra el Estado
Demanda contra la República y contra el EstadoDemanda contra la República y contra el Estado
Demanda contra la República y contra el EstadoGresskiki
 
AIESEC IN CAMEROON RECEPTION BOOKLET
AIESEC IN CAMEROON RECEPTION BOOKLETAIESEC IN CAMEROON RECEPTION BOOKLET
AIESEC IN CAMEROON RECEPTION BOOKLETNancy Supaki
 
artigo_02_v-1_n-1_ano-1
artigo_02_v-1_n-1_ano-1artigo_02_v-1_n-1_ano-1
artigo_02_v-1_n-1_ano-1Adriano Renzi
 
Accessibility Study of Touch and Gesture Interaction with Seniors
Accessibility Study of Touch and Gesture Interaction with SeniorsAccessibility Study of Touch and Gesture Interaction with Seniors
Accessibility Study of Touch and Gesture Interaction with SeniorsAndre Luiz Abrahão
 
Homestead Michael Schumacher World Tower - The Epitome Of Luxury, Homestead M...
Homestead Michael Schumacher World Tower - The Epitome Of Luxury, Homestead M...Homestead Michael Schumacher World Tower - The Epitome Of Luxury, Homestead M...
Homestead Michael Schumacher World Tower - The Epitome Of Luxury, Homestead M...Aadhar Homes
 
Imax technology (venkat)
Imax technology (venkat)Imax technology (venkat)
Imax technology (venkat)Venkata Krishna
 
Mind reading computer report
Mind reading computer reportMind reading computer report
Mind reading computer reportIshan Khan
 
optical fibre communication seminar report for brech.
optical fibre communication seminar report for brech.optical fibre communication seminar report for brech.
optical fibre communication seminar report for brech.abhishek birla
 
cell phone jammer report
cell phone jammer reportcell phone jammer report
cell phone jammer reportSameer Gupta
 
Lifi Seminar Report Full
Lifi Seminar Report FullLifi Seminar Report Full
Lifi Seminar Report FullArpit Gupta
 

Viewers also liked (20)

2016_LEM2_BAB1
2016_LEM2_BAB12016_LEM2_BAB1
2016_LEM2_BAB1
 
Invasive Species Adaptive Managment Plan (NFCLP)
Invasive Species Adaptive Managment Plan (NFCLP)Invasive Species Adaptive Managment Plan (NFCLP)
Invasive Species Adaptive Managment Plan (NFCLP)
 
Demanda contra la República y contra el Estado
Demanda contra la República y contra el EstadoDemanda contra la República y contra el Estado
Demanda contra la República y contra el Estado
 
Prezi tra
Prezi traPrezi tra
Prezi tra
 
Lauren A. Blakley
Lauren A. BlakleyLauren A. Blakley
Lauren A. Blakley
 
AIESEC IN CAMEROON RECEPTION BOOKLET
AIESEC IN CAMEROON RECEPTION BOOKLETAIESEC IN CAMEROON RECEPTION BOOKLET
AIESEC IN CAMEROON RECEPTION BOOKLET
 
artigo_02_v-1_n-1_ano-1
artigo_02_v-1_n-1_ano-1artigo_02_v-1_n-1_ano-1
artigo_02_v-1_n-1_ano-1
 
Accessibility Study of Touch and Gesture Interaction with Seniors
Accessibility Study of Touch and Gesture Interaction with SeniorsAccessibility Study of Touch and Gesture Interaction with Seniors
Accessibility Study of Touch and Gesture Interaction with Seniors
 
Proposed Methods of IP Spoofing Detection & Prevention
Proposed Methods of IP Spoofing Detection & Prevention Proposed Methods of IP Spoofing Detection & Prevention
Proposed Methods of IP Spoofing Detection & Prevention
 
Homestead Michael Schumacher World Tower - The Epitome Of Luxury, Homestead M...
Homestead Michael Schumacher World Tower - The Epitome Of Luxury, Homestead M...Homestead Michael Schumacher World Tower - The Epitome Of Luxury, Homestead M...
Homestead Michael Schumacher World Tower - The Epitome Of Luxury, Homestead M...
 
Imax technology (venkat)
Imax technology (venkat)Imax technology (venkat)
Imax technology (venkat)
 
finalgoogle
finalgooglefinalgoogle
finalgoogle
 
Mind reading computer report
Mind reading computer reportMind reading computer report
Mind reading computer report
 
Seminar report
Seminar reportSeminar report
Seminar report
 
3 d internet report
3 d internet report3 d internet report
3 d internet report
 
Optical computers pdf
Optical computers pdfOptical computers pdf
Optical computers pdf
 
optical fibre communication seminar report for brech.
optical fibre communication seminar report for brech.optical fibre communication seminar report for brech.
optical fibre communication seminar report for brech.
 
cell phone jammer report
cell phone jammer reportcell phone jammer report
cell phone jammer report
 
Lifi Seminar Report Full
Lifi Seminar Report FullLifi Seminar Report Full
Lifi Seminar Report Full
 
Google glass
Google glassGoogle glass
Google glass
 

Similar to adrianorenzi_duxu2014

SMARCOS Abstract Paper submitted to ICCHP 2012
SMARCOS Abstract Paper submitted to ICCHP 2012SMARCOS Abstract Paper submitted to ICCHP 2012
SMARCOS Abstract Paper submitted to ICCHP 2012Smarcos Eu
 
Towards_multimodal_emotion_recognition_i.pdf
Towards_multimodal_emotion_recognition_i.pdfTowards_multimodal_emotion_recognition_i.pdf
Towards_multimodal_emotion_recognition_i.pdfSHEEJAMOLPT
 
HCI LAB MANUAL
HCI LAB MANUAL HCI LAB MANUAL
HCI LAB MANUAL Um e Farwa
 
Musstanser Avanzament 4 (Final No Animation)
Musstanser   Avanzament 4 (Final   No Animation)Musstanser   Avanzament 4 (Final   No Animation)
Musstanser Avanzament 4 (Final No Animation)Musstanser Tinauli
 
An investigation into the physical build and psychological aspects of an inte...
An investigation into the physical build and psychological aspects of an inte...An investigation into the physical build and psychological aspects of an inte...
An investigation into the physical build and psychological aspects of an inte...Jessica Navarro
 
Supporting relationships with awareness systems
Supporting relationships with awareness systemsSupporting relationships with awareness systems
Supporting relationships with awareness systemsOnno Romijn
 
Towards the Design of Intelligible Object-based Applications for the Web of T...
Towards the Design of Intelligible Object-based Applications for the Web of T...Towards the Design of Intelligible Object-based Applications for the Web of T...
Towards the Design of Intelligible Object-based Applications for the Web of T...Pierrick Thébault
 
Methods and tools for human centered ICT: from human values to real-life inno...
Methods and tools for human centered ICT: from human values to real-life inno...Methods and tools for human centered ICT: from human values to real-life inno...
Methods and tools for human centered ICT: from human values to real-life inno...Human Centered ICT
 
Emotion-oriented computing: Possible uses and applications
Emotion-oriented computing: Possible uses and  applicationsEmotion-oriented computing: Possible uses and  applications
Emotion-oriented computing: Possible uses and applicationsAndré Valdestilhas
 
The empathic companion_a_character-based_interface
The empathic companion_a_character-based_interfaceThe empathic companion_a_character-based_interface
The empathic companion_a_character-based_interfaceCociaPodinaIoanaRoxa
 
Pakistan Journal of Science (Vol. 68 No.2June, 2016) 164 .docx
Pakistan Journal of Science (Vol. 68 No.2June, 2016)  164 .docxPakistan Journal of Science (Vol. 68 No.2June, 2016)  164 .docx
Pakistan Journal of Science (Vol. 68 No.2June, 2016) 164 .docxbunyansaturnina
 
Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...ijujournal
 
Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...ijujournal
 
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...ijujournal
 
5th PaeLife Newsletter in English
5th PaeLife Newsletter in English5th PaeLife Newsletter in English
5th PaeLife Newsletter in EnglishPaelife Consortium
 
Importance of UX-UI in Android/iOS Development- Stackon
Importance of UX-UI in Android/iOS Development- StackonImportance of UX-UI in Android/iOS Development- Stackon
Importance of UX-UI in Android/iOS Development- Stackonnajam gs
 
1User Interface Development User Interface Dev.docx
1User Interface Development User Interface Dev.docx1User Interface Development User Interface Dev.docx
1User Interface Development User Interface Dev.docxfelicidaddinwoodie
 
IRJET - Mutecom using Tensorflow-Keras Model
IRJET - Mutecom using Tensorflow-Keras ModelIRJET - Mutecom using Tensorflow-Keras Model
IRJET - Mutecom using Tensorflow-Keras ModelIRJET Journal
 

Similar to adrianorenzi_duxu2014 (20)

SMARCOS Abstract Paper submitted to ICCHP 2012
SMARCOS Abstract Paper submitted to ICCHP 2012SMARCOS Abstract Paper submitted to ICCHP 2012
SMARCOS Abstract Paper submitted to ICCHP 2012
 
Towards_multimodal_emotion_recognition_i.pdf
Towards_multimodal_emotion_recognition_i.pdfTowards_multimodal_emotion_recognition_i.pdf
Towards_multimodal_emotion_recognition_i.pdf
 
HCI LAB MANUAL
HCI LAB MANUAL HCI LAB MANUAL
HCI LAB MANUAL
 
C0353018026
C0353018026C0353018026
C0353018026
 
Musstanser Avanzament 4 (Final No Animation)
Musstanser   Avanzament 4 (Final   No Animation)Musstanser   Avanzament 4 (Final   No Animation)
Musstanser Avanzament 4 (Final No Animation)
 
An investigation into the physical build and psychological aspects of an inte...
An investigation into the physical build and psychological aspects of an inte...An investigation into the physical build and psychological aspects of an inte...
An investigation into the physical build and psychological aspects of an inte...
 
Supporting relationships with awareness systems
Supporting relationships with awareness systemsSupporting relationships with awareness systems
Supporting relationships with awareness systems
 
Towards the Design of Intelligible Object-based Applications for the Web of T...
Towards the Design of Intelligible Object-based Applications for the Web of T...Towards the Design of Intelligible Object-based Applications for the Web of T...
Towards the Design of Intelligible Object-based Applications for the Web of T...
 
Methods and tools for human centered ICT: from human values to real-life inno...
Methods and tools for human centered ICT: from human values to real-life inno...Methods and tools for human centered ICT: from human values to real-life inno...
Methods and tools for human centered ICT: from human values to real-life inno...
 
Emotion-oriented computing: Possible uses and applications
Emotion-oriented computing: Possible uses and  applicationsEmotion-oriented computing: Possible uses and  applications
Emotion-oriented computing: Possible uses and applications
 
The empathic companion_a_character-based_interface
The empathic companion_a_character-based_interfaceThe empathic companion_a_character-based_interface
The empathic companion_a_character-based_interface
 
Pakistan Journal of Science (Vol. 68 No.2June, 2016) 164 .docx
Pakistan Journal of Science (Vol. 68 No.2June, 2016)  164 .docxPakistan Journal of Science (Vol. 68 No.2June, 2016)  164 .docx
Pakistan Journal of Science (Vol. 68 No.2June, 2016) 164 .docx
 
Ice b poster
Ice b posterIce b poster
Ice b poster
 
Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...
 
Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...Usability engineering of games a comparative analysis of measuring excitement...
Usability engineering of games a comparative analysis of measuring excitement...
 
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
USABILITY ENGINEERING OF GAMES: A COMPARATIVE ANALYSIS OF MEASURING EXCITEMEN...
 
5th PaeLife Newsletter in English
5th PaeLife Newsletter in English5th PaeLife Newsletter in English
5th PaeLife Newsletter in English
 
Importance of UX-UI in Android/iOS Development- Stackon
Importance of UX-UI in Android/iOS Development- StackonImportance of UX-UI in Android/iOS Development- Stackon
Importance of UX-UI in Android/iOS Development- Stackon
 
1User Interface Development User Interface Dev.docx
1User Interface Development User Interface Dev.docx1User Interface Development User Interface Dev.docx
1User Interface Development User Interface Dev.docx
 
IRJET - Mutecom using Tensorflow-Keras Model
IRJET - Mutecom using Tensorflow-Keras ModelIRJET - Mutecom using Tensorflow-Keras Model
IRJET - Mutecom using Tensorflow-Keras Model
 

adrianorenzi_duxu2014

  • 1. Affordances and gestural interaction on multi-touch interface systems: building new mental models Adriano Bernardo Renzi1 *, and Sydney Freitas2 1 Esdi-UERJ, Senac-RJ, UX Design professor, Rio de Janeiro, Brazil adrianorenzi@gmail.com 2 Esdi-UERJ, Industrial Design professor, Rio de Janeiro, Brazil sydneyfreitas@terra.com.br Abstract. This paper investigates users' gestural interaction mental models using touch screen technology for the first time. The research used Think-aloud Protocol technique for behavioral observation while performing tasks on two different Apps using an Ipad2 device. The tasks helped perceive users' recognition of functionality and gestural responses for each objective completion. The conclusions based on the observed results are discussed through mental model directives and Buxton and Spool theories of innovation and factors to achieve high acceptance of users on new technology. Keywords: user experience, mental model, think-aloud protocol. 1. Introduction: comparative experiments with mouse and touch screen tablets Shneiderman and Sears comparative research in 1989 regarding precision, cognitive learning and function perception on user's interaction with mouse and with touch screen technology presented results that implied an easier learning, higher precision and lesser error actions with a touch screen device. This easier interaction became evident specially when using icons with 16x16 pixels and 32x32 pixels. adfa, p. 1, 2011. © Springer-Verlag Berlin Heidelberg 2011
  • 2. The research was developed during the 80's, when just a few people had access to computer technology and personal computer were still far from being in every home. Some of the participants had their first time experience interacting with both touch screen pad and mouse devices. The affordance of both devices was based on users' mental model of how they would work, based on previous gesture experiences, which mostly were users' physical day-to-day life, apart from computer interaction. From the late 80's to recent years, interactions with computer and mobile devices changed progressively. Our affordance perceptions, mental models and expectations changed with it. People's virtual and physical life belonged to two different domains, but have been merging together throughout years of technology advancement and new possibilities of digital interaction. Lately, it has been hard to consider the two worlds apart. Friends in restaurants, or cafés, talk about events in their mutual social networks, or check information online to add to the conversation. New words are being adopted, new jokes are emerging, new ways of acquiring knowledge are available. The perception of interactions are changing fast and physical life is no longer the primary reference to understanding features of innovative devices or systems. The perception of functionality and the learning process of how systems and artifacts work are related to the user's mental model. This mental model is a result of the users' primary and secondary references, knowledge and previous experiences with analogous interactions. The users' mental model changes and develops according to their experiences, continuous learning and perceived affordances during their interaction with systems. The mental model of each person influences his perception of interaction and functionality of a new device (Renzi 2010). The term affordance, introduced by the psychologist James J. Gibson in 1977, refers to the cognitive perception of possibilities of actions that an object or system offers a person. This perception of possibilities implies directly on the creation of one's mental model regarding the functionality of the object or system. The better the affordance, the closest the mental model will be to the reality of usage. This research proposes to observe user's perception of icons functionality and gestural interaction on a tablet multi-touch interface system based on each person's previous experiences and references. The participants chosen for this experiment were people who have never previously used any touch screen interaction artifact. 2. Online questionnaire: pre-selection of participants adfa, p. 2, 2011. © Springer-Verlag Berlin Heidelberg 2011
  • 3. A previous online hybrid questionnaire was applied in undergraduate classes (design major) in order to select participants with no previous experience with touch screen devices. The questionnaire was created with the help of Google docs to easily reach the participants and quickly collect the results for analysis. The questionnaire had been open to participants for the period of two weeks. Following Moura and Ferreira (2005) and Mucchielli (2004) recommendations on how to structure a survey to keep the answers without distortion, it started with broader topics and progressed to more specific subjects. Accordingly to the authors, all questions should be very clear to the participants, specially when being presented online and with no direct contact with the researcher to clarify any doubts (Renzi 2008). Thirty six students participated, mostly with ages from seventeen to twenty two. The questionnaire had nine questions divided into two parts: - three essay questions related to participants' personal information (name, e-mail, age) - six multiple choice questions focused on participants' experience with web and touch screen devices Based on the results collected through the questionnaire, thirteen students who had never previously used mobile multi-touch interface systems were pre-selected to be part of the second phase of the experiment. From this sample, five students accepted invitations to be in the second phase of the experiment. 3. Think-aloud Protocol: mapping user's experience Think-aloud Protocol Technique was applied to better understand user's interactions with mobile multi-touch interface systems. It was decided to use the Ipad2 (installed with the iOS 6 system) as the device for the observation. According to Villanueva (2004) the technique consists of a researcher observing users doing specific tasks within a controlled environment. The users' actions and thoughts are to be described verbally aloud by themselves on real time. The researcher records the users' actions by written notifications, video shooting or voice recorder. Filming and voice recording have the advantage of capturing the exact steps and descriptions of users, while written notifications depends on the researcher experience with observing reactions and quickness in writing down relevant actions of the experiment (Renzi 2010). However, the use of writing notes have the advantage of creating an informal observing environment, resulting in more flowing sessions with the participant. While the choice of filming captures every movement for future analysis, but could be considered intimidating. For this research the chosen direction adfa, p. 3, 2011. © Springer-Verlag Berlin Heidelberg 2011
  • 4. was to use both voice recording and written notifications in order to keep the users in an informal environment and record all their spoken actions. The App Voice Recorder HD was used for this phase, where the participants could be recorded while doing the required tasks at the same time. When noticing some reluctancy from the users in verbalizing actions and thoughts during the Think-aloud Protocol, questions related to the users' actions were placed to keep the flow of verbalization of their thoughts, based on Xiao's (2000) experiments: the researcher used general questions as users advanced on the proposed tasks for the session: “what do you think about the tool?”, “did you understand the steps?”, “any doubts regarding the tour?”, “do you have suggestions?”. According to Xiao, the insert of questions during the session helped to identify problems from the perspective of users of varying degrees of skill with the virtual system, as well as get constructive ideas for improving the tool. 1.1 Tasks for gestural interaction Two tasks were established to help understand the interaction flow and to point out problems in functionality perception. The Apps used for the the tasks were Calendar, a built-in schedule agenda, and Starblitz, a spaceship one-person game. Both developed for the system iOS 6.2. Although very different in nature, the two Apps require from the participants many and distinct types of touch interaction in order to fulfill the tasks proposed: touch and drag, lengthy touch, vertical spin, flip touch and indirect multitouch with both hands. The proposed first task urged students to create a new event/appointment on the Calendar App and include detailed information regarding the planned schedule: beginning and ending hours, location of event and personal notes. As part of the same task, a second event had to be inserted one week later from the first event with similar amount of information. The first task was considered finalized after users completed the registry or announced withdraw. The Calendar App layout can be visualized on figure 1. The app simulate an open leather journal. One can choose the focus of the layout to Day, Week, Month, Year and List. All sessions recorded had the starting point on the Day layout. Participants explored all layout options, but preformed the task of inserting the two appointments using the Day layout. adfa, p. 4, 2011. © Springer-Verlag Berlin Heidelberg 2011
  • 5. Fig.1. On the top left corner is the current day and its relation to the month, while on the right top site the hour-by-hour schedule to insert appointment appears. On the bottom there is a timeline with a today button on the left and a plus button on the right. The registry of a new event creates a blue square that graphicaly indicate the beginning and ending of an appointment. It can be relocated, edited or deleted anytime by clicking on it. The second task was directed for the game Starblitz, where each participant had to play the game and survive for as long as they could. In a spacial scenario (fig.2), the users had to pilot the spacecraft, defend themselves from alien ships and collect a green space mineral called Iridium to deposit it on proper space stations. The user’s spacecraft also had an ally ship who helped protect the user. The ally ship was controlled automatically by the app. The ability to reach-on second features, such as change of weapons, were not compelled as part of the task, but was notified as part of observations and analysis. Each participant started the task at first level. Users were free to navigate through the system scenario until they were forced to defend themselves. The game layout (fig.2) presents an upper view of the user´s spaceship and a computer controlled ally ship. On the upper seccion of the screen, the system shows the players´s money and level. On the bottom side, there are two circular controls, used for movement (left blue circle-controller) and for firing (right red circle- controller). It is necessary to use the index fingers from both hands at the same time to adfa, p. 5, 2011. © Springer-Verlag Berlin Heidelberg 2011
be able to control both features. In order to point the controls in the desired direction, one needs to touch the circle-controller and drag it in that direction, either to move the ship or to direct the firepower. Other features on the bottom part of the screen are: a circular-arrows icon to change the type of weapon in use, an exclamation icon to use items such as a medical kit, and a central bar indicating the player's current amount of health. All participants were given basic explanations about the objectives of the game but, in order to test their cognitive reactions and affordances, no explanations regarding the controls were given.

Fig.2. Starblitz space scenario layout with the player's spacecraft centered and the ally space shuttle near the top right. An alien enemy approaches from the bottom. Circle-controllers on both lower sides.

4. Observed interactions

On the first proposed task, there are two different ways to create a new event in the iOS 6 Calendar: clicking on the + button, placed on the bottom right of the screen, or
touching the desired hour for the event with an extended touch. Either way, a new blue box pops up in which to include the title, notes, place and duration of the event.

During the observations, participants had a hard time figuring out where to click in order to start a new event. None perceived the possibility of an extended-touch interaction with the iPad. All participants tried every icon with a resemblance to a 3D button or an internet text link. Users mostly tried to click (a short and quick touch) directly on hours, dates or icons with a 3D-button look. They also tried to interact with double-clicks and sequential clicking to achieve the objective. All observed users demonstrated frustration and surprise when not succeeding with a single short click to create an event on the desired date. During all observations, the use of the + button came as a last resort, after testing each and every graphic that could resemble interactiveness. Two participants gave up and were able to complete the task only by reusing a previously existing event and changing its information to their needs. These two opened the previous event by clicking directly on it.

After opening up a new event, users have to indicate the beginning and ending hours by spreading the edges of the blue box (representing the extension of the event) up to the desired time extent or by clicking on the hours' text link. The event box increases its size vertically and the hours are presented as shown in fig. 3. To change the numbers, one has to spin vertically, simulating a rotating date rubber stamp, and stop it on the desired hour. Four of the five participants tried, without success, to short-click directly on the numbers; two of them showed increasing frustration while clicking repeatedly (including double-clicks). All users took a long time to find the correct gestural interaction to change the numbers. Four of the five participants understood the feature by accidentally bumping into the rotating artifact, making the numbers rotate. One participant quit. Even with the rounded perspective-view illusion of the virtual rotating artifact, the participants could not relate the image to a vertical spin interaction to rotate the numbers.

After discovering the procedure to create an event, the creation of the second event one week later came out with no difficulties. The flipping action to change pages in the calendar was perceived easily, and users used the short-click-and-shift on the right corner rather than just clicking on the right edge. This action was used by the participants to advance one week and create the second event, part of the proposed task.
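The discoverability gap around the extended touch can be illustrated with a small interaction sketch. The snippet below is a minimal, hypothetical Swift/UIKit reconstruction, not Apple's actual Calendar code (the class, durations and callback names are illustrative assumptions): a day-view slot reacts only to a long press, while a plain tap, the gesture every participant tried first, is recognized but produces no event and no feedback.

```swift
import UIKit

// Hypothetical sketch of an hour-slot view that creates an event only on an
// extended touch. A short tap is swallowed silently, which is why tap-only
// mental models never discover the feature.
final class HourSlotView: UIView {

    var onCreateEvent: ((Date) -> Void)?   // called when the blue event box should open
    let slotDate: Date

    init(slotDate: Date) {
        self.slotDate = slotDate
        super.init(frame: .zero)

        // The affordance lives only in this recognizer: about half a second of contact.
        let longPress = UILongPressGestureRecognizer(target: self,
                                                     action: #selector(handleLongPress))
        longPress.minimumPressDuration = 0.5
        addGestureRecognizer(longPress)

        // A short click is recognized but deliberately does nothing visible.
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap))
        addGestureRecognizer(tap)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc private func handleLongPress(_ gesture: UILongPressGestureRecognizer) {
        guard gesture.state == .began else { return }
        onCreateEvent?(slotDate)           // open the new-event box at this hour
    }

    @objc private func handleTap(_ gesture: UITapGestureRecognizer) {
        // Intentionally empty: the short click users expected gives no response.
    }
}
```

Because the short tap yields no feedback at all, the only visible path to success is the + button, which matches the order in which participants eventually found it.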
Fig.3. Selecting the desired hour for the event simulates a rotating date rubber stamp.

During the second task (survive as long as possible playing the game Starblitz), similar reactions regarding gestural interactions were noticed. As soon as the game started and the player's spacecraft appeared, all participants reacted instinctively by short-clicking with one finger directly on the spacecraft to move it. A second instinctive action observed was to click-and-drag the space vessel in the desired direction of movement. As in the previous task with the Calendar App, the frustration with the non-response of the ship eventually led the users to repeatedly touch (and drag) the spacecraft nervously. The users' choice of hand to interact followed their natural side: right-handed participants used the right index finger, while left-handed participants used the left one. Since the ally craft is similar to the player's spaceship, users tried the same touch-and-drag interaction with it, but with no success. Some of the users' frustrations were recorded: “I am trying to move it but it's not working” and “I can't do it! I can't do it!”

As firing enemy alien spaceships approached, all participants' immediate reaction was to click more strongly and more rapidly on their own spacecraft and on the alien enemies, hoping the action would mark the target for automatic fire from their ship. These attempts frequently resulted in repeated quick, strong touching on the iPad2.
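The controls that defeated most participants are, in essence, two independent virtual joysticks that must be read simultaneously. The sketch below is a hypothetical Swift/UIKit reconstruction, not Starblitz's actual code (the class name, stick positions and radii are illustrative assumptions): touches become commands only if they begin inside one of the circle-controllers, so direct taps and drags on the spacecraft itself produce no response at all.

```swift
import UIKit

// Hypothetical sketch of the two circle-controllers: movement on the left,
// firing on the right. Touches anywhere else (e.g. directly on the spacecraft)
// are ignored, which is why direct taps and drags on the ship failed.
final class DualStickControlView: UIView {

    var onMove: ((CGVector) -> Void)?      // left stick: movement direction
    var onFire: ((CGVector) -> Void)?      // right stick: firing direction

    private var leftTouch: UITouch?
    private var rightTouch: UITouch?

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true      // both index fingers at the same time
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    private func leftStickCenter() -> CGPoint {
        return CGPoint(x: 80, y: bounds.height - 80)
    }

    private func rightStickCenter() -> CGPoint {
        return CGPoint(x: bounds.width - 80, y: bounds.height - 80)
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let p = touch.location(in: self)
            // A touch only becomes a control input if it starts on a stick.
            if leftTouch == nil, p.distance(to: leftStickCenter()) < 60 {
                leftTouch = touch
            } else if rightTouch == nil, p.distance(to: rightStickCenter()) < 60 {
                rightTouch = touch
            }
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        if let t = leftTouch, touches.contains(t) {
            onMove?(vector(from: leftStickCenter(), to: t.location(in: self)))
        }
        if let t = rightTouch, touches.contains(t) {
            onFire?(vector(from: rightStickCenter(), to: t.location(in: self)))
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        if let t = leftTouch, touches.contains(t) { leftTouch = nil }
        if let t = rightTouch, touches.contains(t) { rightTouch = nil }
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        touchesEnded(touches, with: event)
    }

    private func vector(from center: CGPoint, to point: CGPoint) -> CGVector {
        return CGVector(dx: point.x - center.x, dy: point.y - center.y)
    }
}

private extension CGPoint {
    func distance(to other: CGPoint) -> CGFloat {
        let dx = x - other.x
        let dy = y - other.y
        return (dx * dx + dy * dy).squareRoot()
    }
}
```

A scheme like this also requires isMultipleTouchEnabled, i.e. two concurrent touches tracked one per hand, which runs directly against the one-click-at-a-time convention discussed in the conclusion.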
The length of time that each user took to perceive the dual-touch possibility of the controllers influenced their survival time. Four users died in less than three minutes (the quickest death took 2 minutes), but one player stood out with faster control recognition and survived for 8 minutes and thirty seconds. He was the only player who also understood how to change the type of ammunition using the rotation icon. Players who took longer to recognize the functionality of the circle-controllers panicked as enemy ships got closer and fired upon them; by the time they understood the controls, their health was no longer high enough to keep them alive for much longer.

During the open interview with the participants, after both tasks were performed, users confirmed being surprised by the many possibilities of gestural interaction with the iPad2 besides short-click and click-and-drag.

5. Conclusion and discussion

The use of the Think-aloud Protocol made it possible to observe the participants' reactions, flow and perceptions of functionality. It was important to compare these observations with their previous interaction experiences and the technological references collected in the online survey. The comparison made it possible to draft characteristics of the students' mental models regarding gestural interaction.

During the observation of participants trying to finish their tasks, it was possible to verify that users had difficulty perceiving and understanding actions that required indirect interaction. All users showed a tendency to interact with direct short-click touches or click-and-drag actions, reproducing their previous experiences and references with mouse interaction and desktop computers. The users' choice of icons to interact with or to reach a desired function demonstrates reference and similarity to the dynamics of internet navigation, visual hierarchy and links.

On the Calendar App, the + icon seemed imperceptible, partially because of its hidden location in the bottom corner and partially because of users' perception of its functionality. Users experimented with clicking the + icon only after exhaustively trying every graphic or text with any resemblance to an internet link. The option of touching the desired date for 2 seconds to open a new event was never perceived or considered as a possibility by any participant, as the later interviews confirmed.

Similar reactions were observed during the procedure to mark the beginning and ending of events through the vertical spinning interaction. Although the mechanism graphically simulated a rotating date rubber stamp, the possibility of rotating the
numbers by spinning them vertically never crossed any of the participants' minds. Once more, the interaction attempts were based on direct short-clicks and, after repetitive touching, only accidental finger slips showed the users how it worked.

The use of both hands to control the spaceship in the Starblitz game seemed an alien concept to the users. All participants reacted naturally based on the web and mouse cultural convention of clicking one thing at a time. The possibility of interacting with two icons at the same time was declared a complete surprise. Users perceived the circle-controllers as movement radars, as in computer games played with mouse and keyboard.

In both proposed tasks, the easiest gestural interaction affordances perceived by users for their first reaction were (1) direct short-clicks, (2) click-and-drag and (3) page flips. All choices of what to click on and which interactions to experiment with, as well as response expectations, were similar to internet interaction models and links on websites, computer software icons and mouse actions. Their mental models, based on previous experiences with the internet and desktop computers, influenced their perception of functionality and affordances while interacting with a touch screen device. Users unconsciously reach for similarities with previous experiences, personal references and learning concepts from reference groups (Renzi 2010, p.37).

The observed results indicate users' references are based on commonly used devices, their related interactions and possibly online advertising about touch screen features, mostly depicting direct short-clicks, the click-and-drag gesture, image expansion-contraction and page flipping. These references seem to influence not only users' perception of affordances, but also their expectations of the iPad's use and benefits.

From Shneiderman's comparative tests in the late 80's to the recent spread of touch screen mobile devices around the world, references, user experiences, affordances and mental models are continuously being constructed. The evolution of mental models through new experiences builds cultural change and, consequently, new expectations and new cognitive readings, becoming the base reference for innovations and new interaction concepts. All participants absorbed new references after the experiment and now hold new mental models regarding touch screen interaction. If contacted again for new tasks with the iPad2, the results would be very different, based on these new mental models.

A brand new product bringing innovative concepts, without reference and mental models built over the years, could bring interactions too hard to understand or accept. Innovation is the result of a long period of research, tests and consumers' gradual familiarity with new concepts. Bill Buxton (2013), principal researcher at Microsoft Research and professor at the Technical University of Eindhoven and the University of Toronto, shows a graphical representation of the “long nose of innovation” spanning roughly 20 years, from concept and first testing to the product's highest advancements.
Jared Spool (2013), founder of User Interface Engineering and working with user experience since 1978, shows that an innovative product has to be built on factors that lead to high acceptance and widespread adoption. One of these factors is Market Maturity. Just as the use of the mouse, computer and internet over the years has built up experiences, cultural conventions and references that help establish mental models and expectations for touch screen devices and gestural interaction, future technology and concept innovations must consider the references and experiences being developed now in order to meet users' expectations in the next 5-10 years with easy cognitive affordances.

6. Bibliography

1. Buxton, Bill. Why eBay is a Better Prototyping Tool than a 3D Printer, The Long Nose, and other Tales of History. Interaction South America, Recife (2013)
2. Moura, Maria Lúcia Seidl de; Ferreira, Maria Cristina. Projetos de pesquisa: elaboração, redação e apresentação. p. 144. Eduerj, Rio de Janeiro (2005)
3. Mucchielli, Roger. O questionário na pesquisa psicosocial. Martins Fontes, São Paulo (1979)
4. Renzi, Adriano Bernardo. Usabilidade na procura e compra de livros em livrarias online. Dissertation (Master of Science). Esdi – UERJ, Rio de Janeiro (2010)
5. Renzi, Adriano Bernardo; Freitas, Sydney; Santos, Robson. Expectativas dos usuários nos processos de procura e decisão de compras de livros em lojas virtuais e livrarias: um modelo mental. ABERGO, Porto Seguro, Bahia (2008)
6. Sears, Andrew; Shneiderman, Ben. High Precision Touchscreens: Design Strategies and Comparisons with a Mouse. Human-Computer Interaction Laboratory, Department of Computer Science, University of Maryland, College Park, MD (1989)
7. Spool, Jared. Mobile & UX: Inside the Eye of the Perfect Storm. Interaction South America, Recife (2013)
8. Spool, Jared. Understanding the Kano Model – A tool for sophisticated designers. User Interface Engineering article (2011) <http://www.uie.com/articles/kano_model/>
9. Villanueva, Rochelle de Asa. Think-aloud protocol and heuristic evaluation of non-immersive, desktop photo-realistic virtual environments. Dissertation (Master of Science), University of Otago, Dunedin, New Zealand (2004)
10. Xiao, D. Y. Experiencing the library in a panorama virtual reality environment. Library Hi Tech, vol. 18, no. 2, pp. 177-184 (2000)