
SXSW 2016 Human UI


"Smart phones have become a remote control for our lives and we’re too tethered to these devices. But is this how we want to interact w/ the world, staring at a small screen as we walk around like zombies? It’s time for designers to consider new ways to interact w/ the Internet of everything surrounding us. IoT has arrived with a growing focus on machine-to-machine communication, but human-to-machine communication is equally important in this new frontier of UI design. Think Human UI, where our movements, voice, thoughts, etc. cause systems to respond to us through our environment. Smart devices can play a role, but invisible & screen-less interfaces are critical for creating the best experiences."
https://vimeo.com/134469464

Published in: Design


  1. HUMAN UI: THE NEXT ERA OF INTERACTION DESIGN
  2. HELLO. Greg Carley, Head of Product Strategy, Chaotic Moon (chaoticmoon.com), a creative technology studio located in Austin, TX. Everything else: https://about.me/gregcarley
  3. ACCENTURE: Digital /// Strategy, Technology, Consulting, Operations. FJORD: Service Design + Living Services. CHAOTIC MOON: Product Realization + Human UI, Design & Innovation + Human Expectations, Interactive "Digitalization of Everything"
  4. HUMAN EXPECTATIONS ARE SHIFTING — EXPLICIT BECOMES IMPLICIT
  5. Interactions with machines will evolve to be less screen-dependent and more appealing to all of the human senses. EXPLICIT: • Lots of manual action required • Single-purpose apps • Limited data • Appeals to only a few senses • Screen-based
  6. Digitalization of Everything: People, Places and Things (IoT). When done right, experiences will feel magical. IMPLICIT: • Automatic • Default everything • Based on privacy • Interoperability • Appeals to many senses • Screens used only when necessary
  7. FIVE TECHNOLOGY DRIVERS: 1. Data and Analytics 2. Cloud Connectivity 3. Connected Sensors 4. Mobile Technology 5. User Interfaces
  8. LIVING SERVICES — DESIGNING FOR HUMAN BANDWIDTH AND CONVERSATIONS WITH DATA
  9. DESIGNING FOR HUMAN BANDWIDTH: Just as we have moved away from point-and-click devices toward touch screens, next we will see our bodies increasingly used as both a controller and an interface.
  10. DESIGNERS WILL NEED TO ASK THEMSELVES: WHAT IS THE QUICKEST, MOST RELIABLE WAY TO GET INFORMATION INTO AND OUT OF THE HUMAN BODY? TO DESIGN FOR A HUMAN INTERFACE, SUCH AS THE BODY, DESIGNERS WILL NEED TO MEASURE BANDWIDTH AGAINST USABILITY.
  11. CONVERSATIONS WITH DATA: Responding to natural human behavior will become a more important element of design as we move into a mixed environment of screens and smart objects without screens.
  12. HUMAN UI — THE COMBINATION OF BODY LANGUAGE + HUMAN SENSES TO COMMUNICATE NATURALLY WITH THE MACHINES AROUND US
  13. BODY LANGUAGE REMAINS A VITAL WAY IN WHICH WE TRANSMIT MEANING AND EMPHASIS. THE RISE OF PCS, LAPTOPS AND MOBILE PHONES HAS CREATED A REVOLUTION IN REMOTE COMMUNICATIONS — A WORLD WHERE BODY LANGUAGE SEEMS LESS IMPORTANT.
  14. BODY LANGUAGE CONSIDERATIONS: Although there are cultural variations, all over the world we 'read' other people through their body language, whether consciously or subconsciously.
  15. Body language will manifest itself in a number of ways: 1) Gestures 2) Intent 3) Face Speed. GAMING HAS ALREADY BEEN DOING IT.
  16. SKELETAL GESTURES: Speaking with our bodies can eliminate friction points from daily life. When implemented in a way that is useful to people, gestures will become something we barely think about.
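To make the skeletal-gesture idea concrete, here is a minimal, hypothetical sketch of how a system might classify a horizontal swipe from a window of wrist positions reported by a skeletal tracker (such as a depth camera). The 0.3 m displacement and 10-frame window are illustrative thresholds, not values from any real product.

```python
# Hypothetical sketch: classifying a horizontal "swipe" from a sequence of
# wrist x-positions (meters) sampled by a skeletal tracker. Thresholds are
# illustrative only.

def classify_swipe(wrist_x, window=10, min_disp=0.3):
    """Return 'swipe-right', 'swipe-left', or None for the latest window."""
    if len(wrist_x) < window:
        return None  # not enough samples yet
    recent = wrist_x[-window:]
    disp = recent[-1] - recent[0]  # net horizontal displacement
    if disp >= min_disp:
        return "swipe-right"
    if disp <= -min_disp:
        return "swipe-left"
    return None

samples = [0.0, 0.05, 0.1, 0.18, 0.25, 0.3, 0.36, 0.4, 0.45, 0.5]
print(classify_swipe(samples))  # swipe-right
```

A production system would also filter sensor noise and debounce repeated detections, but the design point stands: the user just moves, and the interface does the interpreting.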
  17. HAND GESTURES: Designers will have the opportunity to create standards for gestures, but will need to be sensitive to specific cultural meanings and 'gesture conflict.'
  18. INTENT: We use a wide array of subtle gestures and signals in our daily interactions with each other to signify intent. The UI of intent is very important when navigating the physical world.
  19. AUTOMATED AND ROBOTIC OBJECTS WILL NEED TO EXPRESS CLEAR INTENT. Self-driving cars, for instance, need to express to pedestrians waiting to cross the road whether or not it is safe for them to go.
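The self-driving-car example can be sketched as a small mapping from vehicle state to an externally displayed intent signal. The states and messages below are hypothetical, not from any real vehicle interface; the point is that intent is a designed output, not a side effect.

```python
# Hypothetical sketch: a self-driving car mapping internal state to an
# external intent signal for waiting pedestrians. States and messages
# are illustrative.

INTENT_SIGNALS = {
    "yielding": "WALK: safe to cross",
    "about_to_move": "WAIT: vehicle about to move",
    "cruising": "DO NOT CROSS",
}

def intent_signal(speed_mps, braking):
    """Choose an externally displayed intent message from vehicle state."""
    if speed_mps < 0.1 and braking:
        return INTENT_SIGNALS["yielding"]      # stopped and holding
    if speed_mps < 0.1:
        return INTENT_SIGNALS["about_to_move"]  # stopped but not committed
    return INTENT_SIGNALS["cruising"]            # in motion

print(intent_signal(0.0, braking=True))    # WALK: safe to cross
print(intent_signal(12.0, braking=False))  # DO NOT CROSS
```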
  20. FACE SPEED: As we begin to interact with objects embedded in our environment, and as that process becomes more human, we will become less tolerant of delays.
  21. HUMANIZED OBJECTS WILL BE EXPECTED TO RESPOND QUICKLY.
  22. HUMAN SENSES: SIGHT, SOUND, TOUCH, SMELL, TASTE. Proprioception: senses that perceive the body's own position, motion, and state (temperature, direction, balance, pain, kinesthetic). Interoception: internal senses normally stimulated from within the body (tension, pressure, stretch, itch, chemoreceptors, thirst, hunger). Perception not based on a specific sensory organ: e.g. time. Plus the body language considerations above: gesture, intent and face speed.
  23. SIGHT (Ophthalmoception): technically two senses, given the two distinct types of receptors present, one for color (cones) and one for brightness (rods). EYES. Inputs: • Cameras • Contacts • Eye Implant
  24. SIGHT. EYES. Outputs: • Screens (TVs, Tablets, Phones) • Glass (Windows, Mirrors, Visors) • Eyewear (Glasses, Goggles) • Projectors
  25. SIGHT. EYES. Applications: • GUI and NUI • AR/VR/MR • Facial Recognition • Mood • Emotional Resonance • Image/Video/Object Recognition • Gesture Control • Heat Mapping • Night Vision • Eye Tracking
  26. SOUND (Audioception): detecting vibrations along some medium, such as air or water, in contact with your eardrums. MOUTH. Inputs: • Microphone • Sonar
  27. SOUND. EAR. Outputs: • Speaker • Headphones • Hearing Aid
  28. SOUND. EARS & MOUTH. Applications: • Voice Simulation • Text to Speech • Translation • Recording • Noise Cancellation • Volume
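Voice as an input usually ends in a routing step: once a speech-to-text engine (not shown here) has produced a transcript, the system maps it to an action. A minimal keyword-matching sketch, with hypothetical command names:

```python
# Hypothetical sketch: routing a speech-to-text transcript to an action
# via simple keyword matching. Command phrases and action topics are
# illustrative, not a real device API.

COMMANDS = {
    "lights on": "home/lights/on",
    "lights off": "home/lights/off",
    "play music": "media/play",
}

def route(transcript):
    """Return the action topic for the first matching command, else None."""
    text = transcript.lower()
    for phrase, action in COMMANDS.items():
        if phrase in text:
            return action
    return None

print(route("Hey, turn the lights on please"))  # home/lights/on
```

Real assistants use intent classifiers rather than substring matching, but the shape is the same: sense, interpret, act, with no screen in the loop.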
  29. TOUCH (Tactioception): the body's ability to feel physical sensations, spanning several distinct modalities such as pressure, temperature, pain, and even itch. SKIN. Inputs: • Tactile Perception • Texture • Pressure • Temperature • Motion/Movement
  30. TOUCH. SKIN. Outputs: • Haptic Feedback • Vibrations
  31. TOUCH. SKIN. Applications: • Clothing • Wearables • Tattoos • Implants • Navigation (e.g. Tech Tats, Sentari)
  32. SMELL (Olfacoception): another sense that works off a chemical reaction; it combines with taste to produce flavors. NOSE. Inputs: • Sensors • Digital Device
  33. SMELL. NOSE. Outputs: • Digital Scent • Breath Analysis • Freshness Analysis • Health Detection • Odor Diffusion
  34. SMELL. NOSE. Applications: • Clothing • Respirator • Breathable Wear • Nose Ring • Nose Plug • Connected Home
  35. TASTE (Gustaoception): sometimes argued to be five senses by itself due to the differing types of taste receptors (sweet, salty, sour, bitter, and umami), but generally referred to as one sense. MOUTH. Inputs: • Sensors • Tooth Implant • E-cigarette • Ingestible • Eating Utensils
  36. TASTE. MOUTH. Outputs: • Taste Sensing • Taste Simulator • Teach Cooking
  37. TASTE. MOUTH. Applications: • Taste Profile • Meal Creation • Diet Recommendation • Medical
  38. MEASURING THE MOOD OF AN EVENT: We collected data from each individual portrait, live tweets from the event, and facial-detection camera data, and analyzed it with Watson User Modeling. We represented this analysis through an evolving animated visual that details the Big 5 personality traits.
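The aggregation step behind such a visual can be sketched simply: average per-person Big Five trait scores (as a personality-modeling service like Watson User Modeling might return, here assumed to be on a 0..1 scale) into one "event mood" profile. The attendee data below is made up for illustration.

```python
# Hypothetical sketch: averaging per-person Big Five trait scores into a
# single event-level profile to drive a mood visualization. Scores are
# assumed to be normalized to 0..1; the sample data is invented.

TRAITS = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]

def event_profile(people):
    """Average each Big Five trait across all attendee profiles."""
    return {t: sum(p[t] for p in people) / len(people) for t in TRAITS}

attendees = [
    {"openness": 0.8, "conscientiousness": 0.6, "extraversion": 1.0,
     "agreeableness": 0.7, "neuroticism": 0.2},
    {"openness": 0.6, "conscientiousness": 0.8, "extraversion": 0.5,
     "agreeableness": 0.9, "neuroticism": 0.4},
]
print(event_profile(attendees)["extraversion"])  # 0.75
```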
  39. STATE FARM POCKET AGENT. WHAT WE DELIVERED: • We gave Watson eyes: using facial detection, we tracked various actions from visitors near the tasting booth, like smiling and attention. • Individual conversations: we built a custom iPad app that allowed Watson to get to know visitors through a short, engaging conversation, using speech-to-text technology. • Concept expansion: we analyzed the Twitter feeds of event visitors to generate a list of topics, along with each individual's level of interest in each topic.
  40. NETWORK DATA IN ACTION
  41. COLLECTIVE PORTRAIT
  42. GET IN TOUCH: greg@chaoticmoon.com | AUSTIN OFFICE: 319 Congress, Suite 200, Austin, TX 78701 | PHONE: 512.420.8800 | CHAOTICMOON.COM
