2. The Human: History of User Interface Design
• Human–Machine Interaction (HMI) is the study of interaction between
people (users) and computers.
• Also concerned with the design, evaluation and implementation
of interactive computing systems for human use and with the
study of major phenomena surrounding them.
• With today's technology and tools, and the motivation of the Digital
India mission, we should be able to create effective and usable
interfaces and screens.
• Yet there are still systems that are inefficient, confusing and
unusable. Is it because system developers (1) don't care?
(2) don't possess common sense? (3) don't have the time?
(4) don't know what really makes a good design?
(Resource : HMI Dr. Kalbande Wiley Publication)
8. Future: Augmented Reality
• The virtual world is brought into the user's reality.
• e.g. in sci-fi movies, a human enters a video game, or one of the
game characters comes into our real world.
9. I/O channels
• The part of a computer and its software that people can see,
hear, touch, talk to, or otherwise understand or direct.
• The user interface has essentially two components: input and
output.
• Input is how a person communicates his / her needs to the
computer.
• Some common input components are the keyboard, mouse,
trackball, one's finger, and one's voice.
• Output is how the computer conveys the results of its
computations and requirements to the user.
• Today, the most common computer output mechanism is the
display screen, followed by mechanisms that take advantage of
a person's auditory capabilities: voice and sound.
(Resource: HCI Alan Dix Pearson)
10. Contd…
• Vision:
1. The human eye
2. Visual perception
3. Perceiving size and depth
4. Perceiving brightness
5. Perceiving color
6. Reading
• Hearing:
1. The human ear
2. Processing Sound
• Touch: Thermoreceptors (heat and cold), nociceptors (intense
pressure) and mechanoreceptors (pressure)
• Movement: Speed and Accuracy
(Resource: HCI Alan Dix Pearson)
11. Contd….
• The user interface has essentially two components: input and output.
• Input is how a person communicates his / her needs to the computer,
e.g. the keyboard, mouse, trackball.
• Output is how the computer conveys the results of its computations
and requirements to the user, e.g. the display screen.
12. Hardware, Software and Operating
environments
• Hardware: choose the hardware as per the requirements of the user, and
top it up with the appropriate software application.
• Software: the tools with which we create an effective user interface,
e.g. front-end developer tools such as VB, HTML, PHP and animators can
create an audio/visual experience for the user.
• Operating environments: our design decisions should pass the user-level
acceptance test, and modifications should be provided promptly after
any suggestion.
1. Friends, family members and colleagues are not representative of the
target users.
2. User requirements should be understood by a team, not by an
individual.
3. Goal should be to minimize user difficulties.
4. The hardware and software balance should be maintained.
(Resource : HMI Dr. Kalbande Wiley Publication)
13. PSYCHOPATHOLOGY OF EVERYDAY THINGS
Psychopathology of Everyday Things: how the design of our everyday things can make us crazy.
(Resource : HMI Dr. Kalbande Wiley Publication)
15. Human Centered Design
1. Feedback: The effect of everyday action.
Every single user action has to be
acknowledged immediately.
2. Constraints: Prevent the user from making
mistakes
3. Affordance: Convey the rules by leaving
visual clues.
4. Power of observation: Learn from the struggles
of others.
(Resource : HMI Dr. Kalbande Wiley Publication)
16. Norman’s Fundamental principles
• Affordance: Convey the rules by leaving visual clues. e.g. the lid of
a Tupperware lunchbox.
• Signifiers: A physical form of showing functionality to the users, such
as a sound, a printed word or an image. e.g. the Push and Pull labels
on a door.
17. • Perceived affordance: What the user understands by looking at the object. It
may not be the same as what the designer intended it to be. e.g. the up and down
arrows outside a lift (lift panel).
• Mapping: Presenting the relationship between two objects; the mapping
of actions to consequences. e.g. a fan regulator.
• Feedback: The effect of everyday action. Every single user action has to be
acknowledged immediately. e.g. home appliances.
• Constraints: Prevent the user from making mistakes. e.g. constraints for date
entry.
• Power of observation: Learn from the struggles of others. e.g. handling of
modern devices.
(Resource : HMI Dr. Kalbande Wiley Publication)
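Norman's constraints principle can be sketched in code: a date-entry field that rejects impossible values at entry time prevents the mistake, rather than reporting it later. A minimal sketch, assuming nothing beyond the standard library (the function name and rules are illustrative, not from the source):

```python
from datetime import date

def constrained_date(year: int, month: int, day: int) -> date:
    """Reject impossible dates at entry time instead of failing later."""
    if not 1 <= month <= 12:
        raise ValueError("month must be 1-12")
    try:
        # datetime itself enforces day-of-month limits, including leap years
        return date(year, month, day)
    except ValueError as exc:
        raise ValueError(f"invalid day for {year}-{month:02d}: {day}") from exc

# A dropdown-style constraint goes further: offer only legal choices,
# so an invalid month cannot even be typed.
MONTHS = list(range(1, 13))
```

The same idea underlies greyed-out menu items and range-limited sliders: the interface makes the erroneous action unavailable instead of validating after the fact.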
18. Psychology of Everyday Actions
Our aim should not be to make humans adapt to our product,
but to build products that can adapt to humans.
Seven stages of Action
1. Forming the goal
Execution
2. Forming the intention (plan)
3. Specifying an action (specify)
4. Executing the action (perform)
Evaluation
5. Perceiving the state of the world (perceive)
6. Interpreting the state of the world (reflect)
7. Evaluating the outcome (compare)
(Resource : HMI Dr. Kalbande Wiley Publication)
19. Psychology of everyday actions
Our aim should not be to make humans
adapt to our product, but to build
products that can adapt to humans.
Seven stages of Action
1. Forming the goal
Execution
2. Forming the intention (plan)
3. Specifying an action (specify)
4. Executing the action (perform)
Evaluation
5. Perceiving the state of the world
(perceive)
6. Interpreting the state of the world
(reflect)
7. Evaluating the outcome (compare)
To become an expert in some technology:
1. I want to become an expert in some
technology
2. Programming seems to be a good idea
3. Check the recent trends in
programming
4. Enroll in an online certificate course
and start the course
5. You learn the various concepts of the
technology
6. You interpret the effects against your
understanding
7. At the end, you check whether you are
able to develop an application using that
technology or not.
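The seven stages read naturally as a loop: execute, then evaluate, and repeat until the interpreted outcome matches the goal. A minimal sketch of that execution–evaluation cycle (all function names here are illustrative, not from the source):

```python
def seven_stage_cycle(goal, plan, perform, perceive, max_tries=10):
    """Norman's seven-stage cycle modelled as a loop (illustrative)."""
    for _ in range(max_tries):
        intention = plan(goal)          # 2. form the intention
        action = intention              # 3. specify the action (trivial here)
        world = perform(action)         # 4. execute the action
        state = perceive(world)         # 5. perceive the state of the world
        interpretation = state          # 6. interpret the state (trivial here)
        if interpretation == goal:      # 7. evaluate: compare with the goal
            return True
    return False                        # gulf of evaluation never closed
```

When the comparison in stage 7 fails, the user loops back to stage 2 with a revised plan, which is exactly what the numbered example above does course by course.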
20. Three Levels of Processing
• Reflective level: Plan, Compare
Careful analysis and reflection of all the incidents
or experiences
• Behavioral level: Specify, Reflect
Emotional brain takes control of decision making.
• Visceral level: Perform, Perceive
Human reacts to audio, visual and other aspects of
a product before experiencing it. The Look and feel
of the product dominates the user in this level.
(Resource : HMI Dr. Kalbande Wiley Publication)
22. Reasoning
Reasoning is the process by which we use the knowledge we have
to draw conclusions or infer something new about the domain of
interest.
• Deductive Reasoning: Deductive reasoning derives the logically
necessary conclusion from the given premises.
• Inductive Reasoning: Induction is generalizing from cases we
have seen to infer information about cases we have not seen.
• Abductive Reasoning: Abduction reasons from a fact to the
action or state that caused it. This is the method we use to derive
explanations for the events we observe.
(Resource: HCI Alan Dix Pearson)
23. Problem Solving
• Problem solving is the process of finding a solution to
an unfamiliar task, using the knowledge we have.
• Human problem solving is characterized by the
ability to adapt the information we have to deal with
new situations.
• Gestalt theory: problem solving is both reproductive (reusing
known responses) and productive (insight and restructuring of
the problem), not just trial and error.
• Problem space theory: the problem space comprises problem
states, and problem solving involves generating these states
using legal state transition operators.
(Resource: HCI Alan Dix Pearson)
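Problem space theory can be made concrete with a toy search: states are numbers, the legal transition operators are +3 and -2, and solving means generating successor states until the goal state is reached. A minimal breadth-first sketch (the operators and number states are an invented toy example, not from the source):

```python
from collections import deque

def solve(start, goal, operators):
    """Breadth-first search over a problem space defined by operators."""
    frontier = deque([(start, [start])])   # (state, path so far)
    seen = {start}
    while frontier:
        state, path = frontier.popleft()
        if state == goal:
            return path
        for op in operators:
            nxt = op(state)
            if nxt not in seen and 0 <= nxt <= 20:  # keep the space finite
                seen.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None  # goal state unreachable with these operators

OPERATORS = [lambda s: s + 3, lambda s: s - 2]
```

Each element of `path` is a problem state, and each step applies one legal operator, which is the whole content of the theory in miniature.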
24. Reasoning and problem solving
Reasoning is drawing inferences from a
knowledge base.
• Deductive Reasoning: Deductive
reasoning derives the logically necessary
conclusion from the given premises. e.g
All women of age above 60 years are
grandmothers. Shalini is 65 years.
Therefore, Shalini is a grandmother.
• Inductive Reasoning: Induction is
generalizing from cases we have seen to
infer information about cases we have not
seen. e.g: "Nita is a teacher. Nita is
studious. Therefore, All teachers are
studious."
• Abductive Reasoning: Abduction reasons
from a fact to the action or state that caused
it. This is the method we use to derive
explanations for the events we observe.
• Problem solving is the
process of finding a solution
to an unfamiliar task, using
the knowledge we have.
• It is characterized by the
ability to adapt the
information we have to deal
with new situations.
• Gestalt theory: problem solving
is both reproductive (reusing
known responses) and productive
(insight and restructuring of the
problem), not just trial and error.
• Problem space theory: the
problem space comprises problem
states, and problem solving
involves generating these states
using legal state transition
operators.
25. The computer: Devices
• Input devices for interactive use, allowing text entry:
traditional keyboard, phone text entry, speech and handwriting; pointing:
principally the mouse, but also touchpad, stylus and others; drawing and
selection from the screen; 3D interaction devices.
• Output display devices for interactive use: different types of screen, mostly
using some form of bitmap display; large displays and situated displays for
shared and public use; digital paper may be usable in the near future.
• Virtual reality systems and 3D visualization which have special interaction
and display devices.
• Various devices in the physical world: physical controls and dedicated
displays; sound, smell and haptic feedback; sensors for nearly everything,
including movement, temperature and bio-signs.
• Paper output and input: the paperless office and the less-paper office:
different types of printers and their characteristics, character styles and
fonts, scanners and optical character recognition.
(Resource: HCI Alan Dix Pearson)
26. Memory
• Short-term memory: RAM
• Long-term memory: magnetic and optical disks;
capacity limitations related to document and
video storage
• Access methods, as they limit or help the user.
(Resource: HCI Alan Dix Pearson)
27. Processing
• The effects when systems run too slow or too fast; the myth of the
infinitely fast machine.
Limitations on processing speed:
1. Computation bound: Long delays when using find/replace in a large
document.
2. Storage channel bound: Compressed data take less space to store, and is
faster to read in and out, but must be compressed before storage and
decompressed when retrieved.
3. Graphics bound: Most computers include a special-purpose graphics card
to handle many of the most common graphics operations.
4. Network capacity: when using a shared file on a remote machine, it is the
speed of the network, rather than that of the memory, that limits performance.
(Resource: HCI Alan Dix Pearson)
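The storage-channel trade-off in point 2 can be seen directly with Python's built-in zlib: compressed data occupies less space and moves through the channel faster, but costs CPU time on the way in and out. A small sketch:

```python
import zlib

# Repetitive data compresses well; real documents vary.
text = b"the quick brown fox jumps over the lazy dog " * 200

packed = zlib.compress(text)        # CPU cost paid before storage
restored = zlib.decompress(packed)  # CPU cost paid again on retrieval

# The ratio below is what the storage channel gains in exchange
# for the two compute passes above.
ratio = len(packed) / len(text)
```

Whether the trade pays off depends on which resource is the bottleneck: a storage-channel-bound system gains, a computation-bound one may lose.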
28. Interaction: Models
• Interaction models help us to understand what
is going on in the interaction between user and
system. They address the translations between
what the user wants and what the system does.
(Resource: HCI Alan Dix Pearson)
30. Ergonomics
• Ergonomics looks at the physical characteristics of the interaction
and how these influence its effectiveness.
• How the controls are designed, the physical environment in which
the interaction takes place, and the layout and physical qualities of
the screen.
• Human psychology and system constraints.
Arrangement of controls and displays
1. Functional controls and displays: functionally related controls and displays are placed together.
2. Sequential controls and displays: the arrangement reflects the order of their use in a typical interaction.
3. Frequency controls and displays: the most commonly used controls are made the most easily
accessible.
(Resource: HCI Alan Dix Pearson)
31. Styles, WIMP elements
• The interaction takes place within a social and organizational
context that affects both user and system. The dialog
between user and system is influenced by the style of the
interface.
1. command line interface
2. menus
3. natural language
4. question/answer and query dialog
5. form-fills and spreadsheets
6. WIMP: Windows, Icons, Menus, Pointers (plus buttons,
toolbars, palettes, dialog boxes)
7. point and click
8. three-dimensional interfaces.
(Resource: HCI Alan Dix Pearson)
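Two of the styles above can be contrasted in a few lines: a command-line interface makes the user recall and type a command, while a menu lets the user recognize and select from visible options. A minimal sketch (the command set is invented for illustration):

```python
COMMANDS = {"open": "opening file", "save": "saving file", "quit": "exiting"}

def command_line(line: str) -> str:
    """Command-line style: the user must recall the command name."""
    words = line.strip().split()
    cmd = words[0] if words else ""
    return COMMANDS.get(cmd, f"unknown command: {cmd!r}")

def menu(choice: int) -> str:
    """Menu style: the user selects by number from displayed options."""
    items = list(COMMANDS)  # displayed as: 1. open  2. save  3. quit
    if 1 <= choice <= len(items):
        return COMMANDS[items[choice - 1]]
    return "invalid choice"
```

The trade-off is classic: the command line is fast and scriptable for experts but opaque to novices, while the menu constrains the novice to valid choices at the cost of speed.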
32. Interactivity
• When looking at an interface, it is easy to focus on the visually
distinct parts (the buttons, menus, text areas), but the dynamics, the
way they react to a user's actions, are less obvious.
• The interaction takes place within a social and
organizational context that affects both user and
system.
• The choice and specification of appropriate sequences of actions and
corresponding changes in the interface state.
1. Understanding Experience
2. Designing Experience
3. Physical design and engagement
(Resource: HCI Alan Dix Pearson)
33. Designers are faced with many
constraints
• Ergonomic: You cannot physically push buttons if they are too small or too
close together.
• Physical: The size or nature of the device may force certain positions or styles of
control; for example, a dial like the one on the washing machine would not fit on
the MiniDisc controller; high-voltage switches cannot be as small as low-voltage
ones.
• Legal and safety: Cooker controls must be far enough from the pans that you
do not burn yourself, but also high enough to prevent small children turning them
on.
• Context and environment: The microwave's controls are smooth to make them
easy to clean in the kitchen.
• Aesthetic: The controls must look good.
• Economic: It must not cost too much!
• If we want people to want to use a device or application we need to understand
their personal values. Why should they want to use it? What value do they get from
using it?
(Resource: HCI Alan Dix Pearson)
34. Paradigms for Interaction
• Examples of effective strategies for building interactive
systems provide paradigms for designing usable interactive
systems.
• The evolution of these usability paradigms also provides a
good perspective on the history of interactive computing.
• These paradigms range from the introduction of timesharing
computers, through the WIMP and web, to ubiquitous and
context-aware computing.
The designer of an interactive system should ask two questions before
designing it:
1. How can an interactive system be developed to ensure its
usability?
2. How can the usability of an interactive system be demonstrated
or measured?
(Resource: HCI Alan Dix Pearson)
35. • Time sharing
• Video display units
• Programming toolkits
• Personal computing
• Window systems and the WIMP interface
• The metaphor
• Direct manipulation
• Language versus action
• Hypertext
• Multi-modality: the use of multiple human communication
channels; in this sense, nearly all interactive systems are multi-modal.
• Computer-supported cooperative work
• The world wide web
36. • Agent-based interfaces: software agents act on the user's behalf
rather than simply doing what they are told, e.g. an email filtering agent.
• Ubiquitous computing: computing is made to appear anytime
and everywhere.
• It draws on distributed computing, mobile computing,
location-aware computing, mobile networking, sensor networks,
human–computer interaction, context-aware smart home technologies,
and artificial intelligence.
• Sensor-based and context-aware interaction
(Resource: HCI Alan Dix Pearson)