WHAT WILL THE FUTURE BRING?
Punched cards (1725-1970) · Command line UI (1960-Present) · Graphical UI (1963-Present) · Natural UI (2007-Present) · XR (2010-Present)
The evolution of human interaction with computing systems – from punched cards through graphical and natural
interfaces – is marked by layers of abstraction, a gap that narrows as the interface evolves. This
reduced abstraction allows us to engage more closely and directly with the information we seek. As
we design for Extended Reality (XR), the level of abstraction between real and virtual is reduced further, until we
ourselves become the operating system.
Over the next five years, the border between our real and virtual worlds will blur as we experience digital and
physical objects together in mixed reality environments. Problem-solving, learning, and collaboration are changing
as holographic technologies like Microsoft’s HoloLens allow us to create knowledge using our whole bodies to
manipulate digital information. A new vocabulary of interactions is emerging that includes spatial mapping, sound,
gaze, emotion, and gesture. As information architects, our challenge is to let go of our 2D affordances and think
beyond the patterns we use today, toward universal interactions for a human-centered mixed reality of the future.
GESTURE
· Reaching into a pocket
· Looking at your wrist/watch
· Nodding or shaking your head
· Twisting a doorknob
· Taking a bite of food
SPATIAL MAPPING
· Proximity
· Physical environment
· Pinning digital items so they travel with you
· Breadcrumbs - virtual markers
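The spatial attributes above can be sketched in code. The snippet below is a minimal illustration, not any platform's API: a proximity check for triggering nearby content, and a hypothetical `follow_user` helper for a pinned item that travels with you.

```python
import math

def within_proximity(user_pos, item_pos, radius):
    """Return True when the user is inside the item's trigger radius."""
    dx, dy, dz = (u - i for u, i in zip(user_pos, item_pos))
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= radius

def follow_user(user_pos, offset=(0.0, 0.0, 0.5)):
    """Re-anchor a pinned item at a fixed offset from the user each frame."""
    return tuple(u + o for u, o in zip(user_pos, offset))
```

A breadcrumb, by contrast, would simply keep its original world position and light up when `within_proximity` becomes true.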
GAZE
· Selecting objects
· Triggering options on the periphery
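Gaze selection is commonly implemented as a ray cast from the eye along the view direction. As a hedged sketch (a generic ray-sphere test, not a specific headset SDK), an object is "gazed at" when the ray passes within the object's selection radius:

```python
import math

def gaze_hit(origin, direction, center, radius):
    """Return True if the gaze ray passes within `radius` of `center`."""
    # Vector from the eye to the object's center
    oc = [c - o for c, o in zip(center, origin)]
    # Normalize the gaze direction
    norm = math.sqrt(sum(d * d for d in direction))
    d = [x / norm for x in direction]
    # Project oc onto the gaze ray; negative t means the object is behind us
    t = sum(a * b for a, b in zip(oc, d))
    if t < 0:
        return False
    # Distance from the sphere center to the closest point on the ray
    closest = [o + t * x for o, x in zip(origin, d)]
    dist2 = sum((c - p) ** 2 for c, p in zip(center, closest))
    return dist2 <= radius * radius
```

In practice a dwell timer or a confirming gesture is layered on top, so that merely glancing past an object does not select it.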
SOUND
· Voice commands
· Interactive sonification
INPUT
· Choice of input modes
· Speech to text
· Keyboard
· Handwriting
· Fingerspelling
EMOTION
· Recognition of universal emotions
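One way to think about this emerging vocabulary is as a single event space that any modality can feed. The sketch below is a hypothetical dispatcher (the class and event names are illustrative, not from any XR framework) that routes gesture, gaze, sound, and emotion events to shared handlers:

```python
from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class InteractionEvent:
    modality: str                 # "gesture", "gaze", "sound", "emotion", ...
    name: str                     # e.g. "nod", "select", "voice_open"
    payload: dict = field(default_factory=dict)

class MultimodalDispatcher:
    """Route events from any modality into one shared interaction vocabulary."""

    def __init__(self):
        self._handlers: dict = {}

    def on(self, modality: str, name: str, handler: Callable):
        """Register a handler for a (modality, name) pair."""
        self._handlers[(modality, name)] = handler

    def dispatch(self, event: InteractionEvent) -> Optional[object]:
        """Invoke the matching handler, or return None if none is registered."""
        handler = self._handlers.get((event.modality, event.name))
        return handler(event) if handler else None
```

For example, a nod gesture and a spoken "yes" could both map to the same confirm handler, which is the kind of universal, modality-independent interaction the poster argues for.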
Reality 2.0 - Evolution of Interaction
Punched cards
Originally for weaving,
paper cards were used for
data processing in the
20th century.
Command line UI
Text commands are
entered one line at a time
in a dialog with the
computer system.
Graphical UI
Pointing devices allow users
to control an application by
clicking images of widgets
and icons on a screen.
Natural UI
Multi-touch gestures and
voice commands enable
direct interaction with
content, with less
dependence on visual
metaphors like buttons.
XR (Extended Reality)
Common intuitive gestures
combined with contextual
attributes allow us to
engage directly with virtual
and real worlds
simultaneously.
