Somatic Data Perception: Sensing Information Shadows

Mike Kuniavsky, User Experience Designer, Author at ThingM

ARE2011, Santa Clara, May 18, 2011
Augmented reality’s name problem

0.03 MP – 0.2 MP

Visual bandwidth (a super rough estimate). Eyes: 150 bits per second to 10,000 bits per second. Intel Core 2 front-side bus: 102,400,000 bits per second.

Secondary senses for secondary data

INFORMATION SHADOWS

Somatic data perception

Mike Kuniavsky, mikek@thingm.com


Editor's Notes

  1. First, let me tell you a bit about myself. I’m a user experience designer and entrepreneur. I was one of the first professional Web designers, in 1993. Since then I’ve worked on the user experience design of hundreds of websites. I also consult on the design of digital consumer products, and I’ve helped a number of consumer electronics and appliance manufacturers create better user experiences and more user-centered design cultures.
  2. We’re a micro-OEM. We design and manufacture a range of smart LEDs for architects, industrial designers, and hackers.
  3. I have a new startup called Crowdlight.
  4. [Roughly speaking, since we filed our IP: Crowdlight is a lightweight hardware networking technology that divides a space into small sub-networks. This can be used in AR to provide precise location information for registering location-based data onto the world, but it’s also useful for layering information precisely onto the world in many other ways. We think it’s particularly appropriate for the Internet of Things, for entertainment for lots of people, and for infusing information shadows into the world.]
  5. This talk is based on a chapter from my new book. It’s called “Smart Things” and it came out in September. In the book, I describe an approach for designing digital devices that combine software, hardware, physical, and virtual components.
  6. It sets the bar very high and implies that you need to fundamentally alter reality or you’re not doing your job. This in turn implies that you have to capture as much reality as possible, that you have to immerse people as much as possible.
  7. This leads naturally to trying to take over vision, since it’s how we most perceive the world around us. If we were bats, we would have started with hearing; if we were dogs, smell. But we’re humans, so for us reality is vision.
  8. The problem is that vision is a pretty low-bandwidth sense. Yes, it’s possibly the highest-bandwidth sense we have, but it’s still low bandwidth.
  9. This morning I decided to do a back-of-the-envelope estimate of how much bandwidth our vision has. It’s an estimate by a non-scientist, so excuse it if it’s way off. Anyway, I started with the fovea, which typically has between 30,000 and 200,000 cones in it.
  10. To compensate, our eyes move in saccades, which last between 20 ms and 200 ms, or 5 to 50 times per second.
  11. So this leads to a back-of-the-envelope calculation of eye bandwidth between 100 bits per second and 10,000 bits per second. That’s around four orders of magnitude slower than a modern front-side bus (a sketch of this arithmetic appears after these notes). The brain deals with this through a series of ingenious filters and adaptations to create the illusion of an experience of all reality, but at the core there’s only a limited amount of bandwidth available, and our visual senses are easily overwhelmed.
  12. In the late 70s and early 80s a number of prominent cognitive scientists measured all of this and showed that, roughly speaking, you can perceive and act on about four things per second. That’s four things, period. Not four novel things that just appeared in your visual system (that takes much longer), or the output of four new apps that you just downloaded. It’s four things, total.
  13. This is a long digression, but it brings me to my main point: augmented reality is the experience of contextually appropriate data in the environment. And that experience not only can, but MUST, use every sense available.
  14. Six years ago I proposed using actual heat to display data heat maps. This is a sketch from my blog at the time I wrote about it. The basic idea is to use a Peltier junction in an armband to create a peripheral sense of data as you move through the world. You can hook it up to Wi-Fi signal strength, or housing prices, or crime rates, or Craigslist apartment listings, and as you move through the world you can feel whether you’re getting warmer to what you’re looking for, because your arm actually gets warmer (a code sketch of this loop appears after these notes). This lets you use your natural sensory filters to determine whether the data is important. If it’s hot, it will naturally pop into your consciousness; otherwise it’s only there if you want it, and you can check in while doing something else, just as you gauge which direction the wind is blowing by which side of your face is cold. And you’re not adding more information to your already overstuffed primary sense channels.
  15. If AR is the experience of any kind of data by any sense, then we have the option to associate secondary data with secondary senses, creating hierarchies of information that match our cognitive abilities.
  16. So what I’m advocating is a change in language away from “augmented reality” to something more representative of the whole experience of data in the environment. I’m calling it “Somatic Data Perception,” and I close with a challenge to you: as you’re designing, think about what your secondary data is and what your secondary senses are, and how the two can be brought together.
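
Here is the bandwidth arithmetic from note 11 as a minimal Python sketch. The fixation rates and the front-side bus figure come from the talk itself; the bits-per-fixation range is an assumption chosen purely so that the output reproduces the 100 to 10,000 bits-per-second range quoted above.

```python
import math

# Saccades last 20-200 ms, giving roughly 5-50 fixations per second (note 10).
FIXATIONS_PER_SEC = (5, 50)

# Assumed usable information per fixation; chosen to reproduce the
# speaker's 100-10,000 bits/s back-of-the-envelope range.
BITS_PER_FIXATION = (20, 200)

low = FIXATIONS_PER_SEC[0] * BITS_PER_FIXATION[0]    # 100 bits/s
high = FIXATIONS_PER_SEC[1] * BITS_PER_FIXATION[1]   # 10,000 bits/s

# Intel Core 2 front-side bus figure from the slide, in bits per second.
FRONT_SIDE_BUS = 102_400_000

print(f"eye bandwidth: {low} to {high:,} bits/s")
print(f"bus vs. eye: ~{math.log10(FRONT_SIDE_BUS / high):.1f} orders of magnitude")
```

Running it prints an eye bandwidth of 100 to 10,000 bits/s and a gap of about 4.0 orders of magnitude, matching the slide’s numbers.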
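And here is note 14’s heat-display armband as a control-loop sketch, also in Python. Everything hardware-facing is hypothetical: `PeltierDriver` and `read_wifi_strength_dbm` stand in for whatever thermoelectric driver and data source an actual build would use; only the mapping idea comes from the talk.

```python
import random
import time


class PeltierDriver:
    """Hypothetical stand-in for a thermoelectric (Peltier) armband driver."""

    def set_power(self, level: float) -> None:
        # A real driver would PWM the Peltier junction; here we just log.
        print(f"peltier power: {level:.2f}")


def read_wifi_strength_dbm() -> float:
    """Hypothetical data source; swap in housing prices, crime rates, etc."""
    return random.uniform(-90.0, -30.0)  # a typical Wi-Fi RSSI range


def normalize(value: float, lo: float, hi: float) -> float:
    """Clamp a reading onto the 0..1 range the driver expects."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))


def run(driver: PeltierDriver, interval_s: float = 1.0) -> None:
    # Warmer arm means a stronger signal: the data rides a peripheral,
    # ambient channel instead of the already overloaded visual one.
    while True:
        rssi = read_wifi_strength_dbm()
        driver.set_power(normalize(rssi, -90.0, -30.0))
        time.sleep(interval_s)


if __name__ == "__main__":
    run(PeltierDriver())
```

The same loop works for any scalar information shadow; only the read function changes.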