Hi, I want to show you two things today. The first is how information will be apparent everywhere around us: on computers in different forms and shapes, on objects we don't see as computers, and on everyday things. The second is how there are new ways of interacting with these new computers, and this is where the revolution will be. I think you will see that it is like a dance between the devices and the Internet, and how this dance evolves both of them in ways we didn't believe possible a few years ago.
So who am I? My name is Hampus Jakobsson and I am one of the founders of the Swedish design and technology company TAT. TAT was founded in 2002 and is now 180 people in Sweden, the USA, Korea, Taiwan, and Japan. Our technology is in every seventh mobile phone built worldwide, shipping from companies like Nokia, Samsung, Motorola, and Sony Ericsson. To date, more than 500 million phones. Other customers include Google, Orange, T-Mobile, PayPal, and Volvo. So we don't only do mobile phones, but cars, tablets, and other electronics.
Enough about us. Let's start our journey together by looking back at the history of how we interact with machines, up to the present.
The first computers were really dedicated machines, each designed to handle a specific task. And the input and output were tailored to this: each function had a dedicated button, and the output could show only a very limited set of things.
As computers grew more capable and ended up in more and more situations, they turned from single-purpose machines into something more like what we see today, with multiple uses through varied applications. To operate these, a much more general-purpose input was suddenly needed.
The problem with these computers was that they demanded a lot from the user: reading manuals, remembering computer commands, and so forth.
And so the graphical user interface was invented! The mouse. Windows instead of text. Suddenly we had direct manipulation of the objects and actions we wanted, and didn't have to remember complex commands.
Let's take the same journey in the mobile industry. The first mobile phones also lacked richness of interaction; they too had a dedicated key for each function. And that was fine, since a mobile phone had a handful of functions: you could map a key to each without making the device complex. They were dedicated devices.
But as the number of functions on a mobile phone increased, this user interface was no longer sufficient. It was not possible to map a key to everything, so the menu was introduced. This worked well, but as mobile phones grew in complexity, so did the cognitive burden.
And we kept on adding features without changing the way we interacted with phones for ten years, until 2007, when we hit a ceiling: the functions a phone had were almost unknown, impossible to get an overview of for anybody but the writers of the manual. And then...
... direct manipulation was introduced again, this time in the mobile industry. The mobile market made a leap: it removed the mouse and used the finger for direct interaction. So really direct interaction.
Apple surprised the whole industry by introducing a mobile phone much less capable than the average phone on the market, lacking Wi-Fi, 3G, a camera worth taking pictures with, and so on. And it surprised everybody even more that there was demand for these easy-to-use but feature-weak devices. A great demand, as we know today.
What was the feature that made the system tip over and made keys impossible? For each function we had added, camera, radio, email, there were enough keys. I would say it was the Internet. Because once we really brought the Internet into the mobile world, the number of functions grew from hundreds to infinite. No one can predict, when building a truly Internet-capable and open device, what people will use it for. So the paradigm had to change.
And the interesting thing is that this way of interacting, touch screens, suddenly turned out to be the new way of interacting in general. It was a much more natural way. So suddenly the mobile industry led the user interface paradigm of computing somewhere new.
As the two industries converge around the common factor of Internet use, it changes computers and mobile devices as we know them: the tablets are born. This is true mobile computing. And nobody really knows what it means. A week doesn't pass without some wise person calling tablets a temporary thing, a fad. But at the same time they cannibalize both industries.
The Internet converged the two industries, computers and mobile, but as you will soon see, it doesn't end there. First the industries converged; now we are at the brink of an explosion.
We now have new devices that are neither computers nor mobile phones, but new dedicated devices, transformed by the Internet and the wisdom of interaction designers. Like the Kindle. E-ink is one of the things that tells me we will have screens everywhere. Because if dressing this wall, floor, or ceiling weren't expensive, and showing information there didn't consume battery, why shouldn't they be "screens"? Desks will be the first non-computer to be dressed with e-ink, I believe.
And screens that just show what is on another screen, but adapted to the context, will become more common. Like this new device, LiveView, from Sony Ericsson. It is a "remote screen" that allows you to see and interact with selected parts of your mobile phone while jogging or just walking around. Our home television might turn into this as well: just an extension of another device.
Or this keyboard from Art. Lebedev Studio, where dedicated keys are back. But this time it is the user who sets them. The keys are screens.
And this is not happening just in devices and computers. Let's look at what will happen with cars.
Why aren't the rearview mirrors screens? Flashing red when someone is in your blind spot and you are about to turn left. Or actually showing that "objects in mirror are closer than they appear".
But let's take a step back. A lot of these screens I have shown you are on a surface or device you can touch. What about these new 3D TVs? How do you interact there?
We will have screens, or information projected onto surfaces where there are no screens. But how do you interact with these?
Or projected screens? Or, in a while, holographic screens? What about when the object is in mid-air and not on a surface?
That means you would need some kind of super-smart computer that could recognize your movements, guess what you mean, and map that to the information. Some kind of science-fiction artificial intelligence. When will this come along?
Well, Microsoft is selling it to your kids as we speak. The product is called Kinect and is part of the Xbox 360 game console. With a handful of cameras and AI, it builds a computer model of your body, so you can interact with machines the way you interact with people. Right now. In your living room. And it costs something like five thousand yuan.
So, the future will be about information everywhere. And the user interface is the main differentiator.
Let's look at a normal day in 2014. Gesture interaction. Flexible screens. Mirrors that are screens, that recognize you and log you in with face recognition, controlled with gestures so that you don't smear the mirror. Large transparent surfaces that are screens. Desks that are screens, with electronic ink. Devices that pair by proximity. And devices that can see the orientation of other devices, for seamless interaction.