What does it mean when your apps can see, hear and feel? Digital sensors are flooding into our daily lives. Mobile devices have microphones, cameras, accelerometers, digital compasses and now Near Field Communication (NFC) chips and scanners. But this is just the start: these small digital sensors are shrinking even further and, just as Mark Weiser predicted, soaking into the world around you.
In this session Rob will explore how you can use these new streams of sensor data to create innovative and dynamic user experiences. He’ll examine, from an experience point of view, the challenges and constraints of sensor-driven apps, and how you can work within this deeply technical domain, which covers issues such as sensor fusion, latency, calibration and newly extended mental models.
But if all that sounds a bit dry and technical, don’t panic! Rob’s focus here is on how to create experiences using these technologies rather than on the deep technical details themselves. He will, however, give you a clear overview of the work of the W3C Device API and Web Real-Time Communication (WebRTC) working groups, the ARStandards.org community and other related research activities. He’ll present tangible demonstrations showing how these new streams of “digital awareness” data can be integrated into the experiences you create to bring your apps to life. This will leave you re-thinking your current projects and asking, “Is that sense-able?”
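To give a flavour of what “sense-able” looks like in practice, here is a minimal browser-side sketch using the W3C DeviceOrientation Event specification, one output of the standards work mentioned above. It is an illustration under assumed conditions (a device with orientation sensors and a browser that exposes them), not one of Rob’s own demos.

```typescript
// Listen to one "digital awareness" stream: device orientation.
// alpha/beta/gamma are rotations in degrees; they may be null on
// hardware that lacks the relevant sensors, so check before using them.
window.addEventListener('deviceorientation', (event: DeviceOrientationEvent) => {
  const { alpha, beta, gamma } = event;
  if (alpha === null || beta === null || gamma === null) return;

  // Illustrative use only: tilt the page with the device, the kind of
  // sensor-driven experience the session explores.
  document.body.style.transform = `rotate(${gamma.toFixed(1)}deg)`;
});
```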
Rob will also explore where we are headed with all of this and how to plan for this mind-expanding journey. This is far from just a technical discussion: it will introduce the ethical and moral issues these sensors create, issues we will all need to address in our own lives, adding yet another aspect to the key question, “Is that sensible?”