The Fourth Dimension of Virtual Reality


Bill Prensky, CTO FutureWork Institute, presents his perspective on The Fourth Dimension of Virtual Reality

  • It’s not a news flash to anyone that we’re getting closer and closer to the ultimate android helper. We’ve already been able to make them walk on two legs, listen and respond, and even learn new things as time progresses. When I first saw this clip I was nothing but awestruck; I was simply blown away. The speed at which we’re progressing in robotics, and our drive to make robots “human”-like, is increasing by the second. We’re coming up with new and exciting solutions, and when these things go on sale I will be among the first to get one (when they get affordable enough, that is). The leaps they make in creating new materials and technologies to get the results they need are just amazing. Who would have thought that we would be so close to mass-producing these things? One company is already preparing to mass-produce a children’s robot that will become a life companion: it learns over time, it talks to you, and it shows emotions. How freaky is that? When I was little I had a record to listen to if I wanted one of my teddy bears to talk to me. We have surely come a long way, and I just can’t wait for all this to take off. (Dr. David Hanson builds robots with character, TED Talk.)
  • Also in Dubai: The Pad has been labeled the world’s first ‘cybertecture’ apartment tower. The 24-storey, USD 136 million residential tower is designed by Hong Kong-based James Law Cybertecture and will have 231 'intelligent' apartments ranging from studios to one- and two-bedroom units, as well as a range of 'loft' homes. The “i” features in each apartment cover everything from communications and entertainment to shopping, fusing a plethora of technologically advanced features. Technologies such as real-time projection (iReality) allow residents to alter the façade of the apartment by projecting the beauty of the Caribbean or a view of the New York skyline onto their windows. Bathrooms are fitted with health-monitoring equipment (iHealth), while communication devices (iFamily & Friends) can be given to family and friends to allow residents to have dinner with loved ones from across the globe via a video conferencing link projected into their dining rooms. Rotating living and dining rooms (iRotation) give a 360-degree view of Dubai and the Business Bay, and reactive lighting that responds to incoming phone calls and the residents’ moods (iAmbience) gives the home an individual touch. 'The Pad' also has a shared clubhouse space (iClub) for the exclusive use of residents. The club will feature a gym, running track, half basketball court, a swimming pool fitted with an underwater audio system, and a Media Jacuzzi equipped with waterproof touch-screens. The Pad is set for completion in the fourth quarter of 2009.
  • 'SixthSense' is a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information. We've evolved over millions of years to sense the world around us. When we encounter something, someone or some place, we use our five natural senses to perceive information about it; that information helps us make decisions and choose the right actions to take. But arguably the most useful information that can help us make the right decision is not naturally perceivable with our five senses: the data, information and knowledge that mankind has accumulated about everything, which is increasingly all available online. Although the miniaturization of computing devices allows us to carry computers in our pockets, keeping us continually connected to the digital world, there is no link between our digital devices and our interactions with the physical world. Information is traditionally confined on paper or digitally on a screen. SixthSense bridges this gap, bringing intangible, digital information out into the tangible world and allowing us to interact with it via natural hand gestures. SixthSense frees information from its confines by seamlessly integrating it with reality, and thus making the entire world your computer. The SixthSense prototype comprises a pocket projector, a mirror and a camera, coupled in a pendant-like mobile wearable device. Both the projector and the camera are connected to the mobile computing device in the user’s pocket. The projector projects visual information, enabling surfaces, walls and physical objects around us to be used as interfaces, while the camera recognizes and tracks the user's hand gestures and physical objects using computer-vision-based techniques.
The software processes the video stream captured by the camera and tracks the locations of the colored markers (visual tracking fiducials) at the tips of the user’s fingers using simple computer-vision techniques. The movements and arrangements of these fiducials are interpreted into gestures that act as interaction instructions for the projected application interfaces. The maximum number of tracked fingers is constrained only by the number of unique fiducials, so SixthSense also supports multi-touch and multi-user interaction. The prototype implements several applications that demonstrate the usefulness, viability and flexibility of the system. The map application lets the user navigate a map displayed on a nearby surface using hand gestures similar to those supported by multi-touch systems, zooming in, zooming out or panning with intuitive hand movements. The drawing application lets the user draw on any surface by tracking the movement of the user’s index finger. SixthSense also recognizes the user’s freehand gestures (postures). For example, the system implements a gestural camera that takes photos of the scene the user is looking at by detecting the ‘framing’ gesture; the user can then stop by any surface or wall and flick through the photos he or she has taken. SixthSense also lets the user draw icons or symbols in the air with the index finger and recognizes those symbols as interaction instructions. For example, drawing a magnifying-glass symbol takes the user to the map application, while drawing an ‘@’ symbol lets the user check his mail. The system also augments physical objects the user is interacting with by projecting additional information about those objects onto them. For example, a newspaper can show live video news, or dynamic information can be provided on a regular piece of paper. The gesture of drawing a circle on the user’s wrist projects an analog watch.
The current prototype costs approximately $350 to build.
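The fiducial-tracking step described above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the actual SixthSense code: it assumes frames arrive as NumPy RGB arrays, uses two hypothetical marker colors, and stands in simple per-channel color thresholding and a toy two-finger zoom/pan rule for the real computer-vision pipeline.

```python
import numpy as np

# Hypothetical fiducial colors (RGB), one per fingertip marker.
FIDUCIALS = {
    "index": (255, 0, 0),
    "thumb": (0, 255, 0),
}

def track_fiducials(frame, tolerance=30):
    """Return the centroid (x, y) of each colored marker found in `frame`.

    frame: H x W x 3 uint8 RGB image. A marker is any pixel whose value is
    within `tolerance` of a fiducial's reference color on every channel.
    """
    positions = {}
    for name, color in FIDUCIALS.items():
        diff = np.abs(frame.astype(int) - np.array(color))
        mask = (diff <= tolerance).all(axis=2)
        ys, xs = np.nonzero(mask)
        if len(xs):
            positions[name] = (float(xs.mean()), float(ys.mean()))
    return positions

def classify_gesture(prev, curr):
    """Toy two-finger rule: if the index/thumb distance grows between two
    tracked frames, report 'zoom_in'; if it shrinks, 'zoom_out'; else 'pan'."""
    def dist(p):
        (x1, y1), (x2, y2) = p["index"], p["thumb"]
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    d0, d1 = dist(prev), dist(curr)
    if d1 > d0 + 2:
        return "zoom_in"
    if d1 < d0 - 2:
        return "zoom_out"
    return "pan"
```

Because each fingertip wears a uniquely colored cap, adding more tracked fingers (and thus multi-touch or multi-user input) only requires adding entries to the color table, which is the property the note above describes.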
  • You don't need to work for the secret service or as a jet fighter pilot to appreciate the sheer convenience – and craftiness – of being able to grab hold of crucial information without so much as lifting a finger or batting an eyelid. Students at the Fraunhofer Institute in Germany are developing a pair of interactive data eyeglasses that can project an image onto the retina from an organic light-emitting diode (OLED) micro-display, making the image appear as if it's a meter in front of the wearer. While similar headwear – sometimes referred to as head-mounted displays (HMDs) – only throws up a static image, the students are working on eye-tracking technology that allows wearers, with just the movement of the eyeball, to scroll through information or move elements about. The glasses are designed to provide information to wearers who don't have their hands free to operate a keyboard or mouse. Dr Michael Scholles, business unit manager at the Fraunhofer Institute for Photonic Microsystems (IPMS) in Dresden, believes these devices have a ready-made application in the medical field, where they could be used to quickly project vital patient information or medical imaging to doctors during a consultation or surgery. Scholles also sees applications in the construction industry, where the glasses could be used to project drawings or installation instructions. As the image needs to outshine the ambient light to be seen clearly against changing and highly contrasting backgrounds, OLEDs have been used to produce a high-luminance micro-display. While existing data glasses only display information, the German students are hoping to make the micro-display technology bi-directional and interactive, which will open up new uses, says Scholles.
The eye-tracking device the students are working on – which is fitted to the hinge of the glasses – will enable users to influence the projected content simply by moving their eyes or fixing on certain points in the image. New content can be displayed, and menus can be scrolled through or picture elements shifted. According to Scholles, they have concentrated on making the glasses inexpensive as well as small and light – the system’s eye tracker and image-reproduction unit, integrated into a CMOS chip, measure 19.3 mm by 17 mm. (David Greig)
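The eye-controlled scrolling described here boils down to mapping a gaze coordinate to a scroll velocity. The sketch below is a hypothetical illustration of that mapping, not the Fraunhofer prototype's actual control scheme (which is not public at this level of detail): the function name, dead-zone width and speed ramp are all assumptions.

```python
def scroll_velocity(gaze_y, dead_zone=0.2, max_speed=5):
    """Map a vertical gaze position to a scroll speed in lines per tick.

    gaze_y: normalized position on the micro-display, 0.0 = top, 1.0 = bottom.
    Looking inside the central dead zone holds the view still; outside it,
    speed ramps linearly from 0 up to max_speed at the display edge.
    Positive values scroll down, negative values scroll up.
    """
    offset = gaze_y - 0.5
    half = dead_zone / 2
    if abs(offset) <= half:
        return 0
    span = 0.5 - half                      # travel remaining to the edge
    ramp = (abs(offset) - half) / span     # 0.0 just past the dead zone, 1.0 at the edge
    return round(max_speed * ramp) * (1 if offset > 0 else -1)
```

A dead zone around the center matters in any gaze interface: without it, merely reading the content would scroll it away, so only a deliberate glance toward an edge moves the view.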
  • Original video: “Skinput: Your Arm as Input Device” (Mar 3, 2010). An armband uses sensors that can recognize and differentiate between taps at various locations on the forearm with 95% accuracy. Combined with a pico projector that displays menus and buttons on your arm, it lets you control devices such as computers and iPods. Advances in electronics have allowed mobile devices to become very small; instead of carrying extra surfaces, we can use the surfaces already around us (tables and walls) or the body itself. The bio-acoustic sensing technology allows the body to be used as a large finger-input surface with no electronics required on the skin: finger inputs are segmented and classified in real time, the waves that propagate through the body are read via a bio-acoustic sensing array, and different locations are acoustically distinct. A pico projector can also be incorporated, allowing sensing as well as projection of a dynamic graphical interface.
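Because different tap locations are acoustically distinct, the core of the recognition step is a classifier over acoustic feature vectors gathered during a short calibration round. The published Skinput system used an SVM over features from its sensing array; the sketch below substitutes the simplest possible stand-in, a nearest-centroid classifier, and every name and feature dimension in it is hypothetical.

```python
import numpy as np

def train_centroids(samples):
    """samples: {location_name: list of feature vectors} recorded while the
    user taps each forearm location a few times during calibration.
    Returns one mean feature vector (centroid) per location."""
    return {loc: np.mean(vecs, axis=0) for loc, vecs in samples.items()}

def classify_tap(features, centroids):
    """Assign a new tap's acoustic feature vector to the trained location
    whose centroid is nearest in Euclidean distance."""
    f = np.asarray(features, dtype=float)
    return min(centroids, key=lambda loc: float(np.linalg.norm(f - centroids[loc])))
```

In use, segmentation would first detect that a tap occurred and extract its feature vector from the sensor stream; `classify_tap` then maps that vector to one of the projected on-arm buttons.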
  • The “Human Singularity” refers to the radical fusion of the human body with technology to achieve levels of mental acuity and physical ability that eclipse anything humans have previously known. One critical social function that will be affected by the singularity is leadership, a chief defining factor of a society's values, relations, and objectives. Leaders will bear much of the burden of social evolution when the “Enhanced Singular Individuals” (ESIs) of the Singularity Era enter the general population of “Norms” (those without technological enhancements). The leaders of every organization and group will be compelled to come to terms with the ESIs' advanced capabilities and the tensions, ambitions, and alliances attendant upon them.
  • Textbooks replaced by “digipacks”: a rugged mobile personal computer and communications console with “Alice in Wonderland” multi-user virtual environment interfaces. Objective: obtain and create knowledge at the right time, in the right place, in the right way, on the right device, for the right person.
  • Self-learning; horizontal structures; from presumed authority to collective credibility; de-centered pedagogy; networked learning; open-source education; learning as connectivity/interactivity; campuses as mobilizing networks.

    1. The Fourth Dimension of Virtual Reality. William Prensky, Chant Newall Development Group. August 2010
    2. Knowledge and Information Have Moved from the Real to the Simulated (2D)… And are now moving to the hyper-real (3D): Classroom Learning, E-Learning, V-Learning (Virtual Reality)
    3. At CNDG, We Create Photo-Realistic Representations of Clients’ Ideas Representations in which you can live and work
    4. ..And Infuse It With Just the Right Amount of Detail…
    5. ..So That You Feel Like You Can Touch It and Hear It—With Mouse and Arrows
    6. With Just the Right Amount of Detail… Your mind fills in the rest, and the individual
    7. … Snaps Into Presence And Gets a Divided Sense of Self
    8. What Is Developing Now in the Real World That Leads to This Future?
    9. We Start to Move into a Fourth Dimension—or an Enhanced Reality with New Tools Some of us will have robots and avatars with empathy, which can “sense” and “perceive” our moods ...and react to our body movements
    10. Some of Us Will Soon Work With Anybots or Robot Avatars So we can work from anywhere with a wireless connection
    11. Some Will Live in “Cybertecture” Places with iReality and iFamily The Pad, Dubai, 2009
    12. Augmented Reality Will Blur the Lines Between the Real and Digital World Sixth Sense is a wearable gestural interface that augments the physical world digitally
    13. We May Use Pen Computers We Can Carry in Our Pockets With… Monitors and keyboards on any flat surface
    14. … And Interactive Data Glasses That Can Project an Image Onto the Retina Providing clear information & giving hands-free interaction
    15. … Or Skinput, Using Your Arm as an Input Device, Instead of Carrying Extra Surfaces Bio-acoustic sensing technology allows the body to be used as a finger-input surface
    16. Brain Chips Will Control Computers by 2020 Brain waves will replace the keyboard and mouse, dial phones and change TV channels
    17. Headsets That Read Brainwaves Will Enable Us to Operate Objects With Our Thoughts So wheelchairs may be moved by brain, not brawn
    18. The “Human Singularity”: the radical fusion of the human body with technology to achieve levels of mental acuity and physical ability that eclipse anything humans have previously known. Imagine Google and Wikipedia integrated into your brain! The ESIs (“Enhanced Singular Individuals”) will change our “Norms”: we will enter a new world which fuses the human body with technology
    19. “Digipacks” and Virtual Environments Will Transform Learning Textbooks replaced by “digipacks”; “Alice in Wonderland” multi-user virtual environment interfaces; obtain and create knowledge at the right time/place/way, on the right device, for the right person. iPad: First-Generation Digipacks
    20. One-Room Schoolhouses in Second Life with a Network of Learners Will Become the Norm Offering global access to education and information
    21. This schoolhouse is as local to Melbourne as it is to Boston And is available anytime, from anywhere
    22. We Are Already Seeing Students Who Think Differently, With Hypertext Minds That Leap Around and Are Not Limited by Geography With cognitive structures that are parallel, not sequential
    23. But Students Aren’t the Only Group That Is Changing More and more businesses are entering SL
    24. Large Group Meetings Are Now Commonly Held on 3D Platforms Virtual Global Inclusion Summit (Sodexo & Microsoft): 1,000 managers from 5 continents. On a one-day meeting, Sodexo saved $1,700,000 on travel, 7,200 office hours not missed, and 450,000 pounds of CO2e not emitted
    25. ..And Organizations Are Constantly Experimenting with Different Platforms 200 Fortune 500 clients experienced our V-DINE (Virtual D&I Networking Event)
    26. What Will This Future Look Like in Second Life?
    27. These Enhanced Realities Will Help to Create the Fourth Dimension of Virtual Reality A Hybrid of Real Life and Second Life
    28. When You Enter the Metaverse in This Hybrid World Will you know things and have a prescient knowledge?
    29. As We Find Second Life More and More Realistic and Immersive… What will happen when it becomes hyper-real?
    30. We Haven’t Even Begun to See or Sense the Limits of What We Can Do As we merge all our senses and enter into the 4th dimension of virtual reality, we create a new REALITY
    31. As We Walk Across This Bridge We Are Building Together We will begin to see a new vision and perception of what we are doing… because…
    32. The Future Is Not Someplace We Are Going, But Something We Are Creating... The paths to it are not found, but made, and the activity of making them changes both maker and destination.