
Building AR and VR Experiences for Web Apps with JavaScript



It is increasingly important to understand how AR and VR technologies are changing what is possible in modern web applications. There are many tools and technologies to choose from to tie all of the pieces together and start implementing AR and VR features, but not all of them are JavaScript- or web-friendly. In this talk, Hasan approaches AR and VR development from the perspective of a web app developer who is competent with modern JavaScript and web development tools. He also introduces a way to share what you build on the growing Oculus platform, and explains why Oculus is a great entry point for VR.

Objective
Learn how to approach your first feature or project involving AR or VR in your applications that run in browsers and mobile devices, all using JavaScript

Target Audience
Web application developers interested in building AR and VR driven features in their web applications

Assumed Audience Knowledge
JavaScript, web technology

Five Things Audience Members Will Learn
WebGL basics
Intro to React 360
Working with Three.js
What types of use cases to apply AR and VR technology to
Building for Oculus



  1. AR/VR in JavaScript Apps – FITC Web Unleashed 2018
  2. The Rise of Extended Reality (AR/VR)
     • AR (Augmented Reality) – Interaction with physical reality augmented with computer-generated input and output
     • VR (Virtual Reality) – Replace physical reality with a computer-generated one
     • Hardware costs plummeting (Oculus Go launched at F8 2018, $199)
     • Software for building AR and VR experiences getting better every day
  3. Introduction – Hasan Ahmad
     • Principal Consultant at DEV6
     • Developer Circles from Facebook (Toronto)
     • https://dev6.com
     • https://www.facebook.com/groups/DeveloperCirclesToronto/
     • https://twitter.com/has_ibr_ahm
  4. Industries that are embracing XR
     • Gaming
     • Media
     • Mobile Apps
     • Streaming
     • Education
     • Industrial
     • Retail
  5. AR/VR tech stack for web devs
     • A little bit of math and physics background
     • Smartphone or VR headset
     • Three.js
     • WebVR API / WebXR API
     • A-Frame
     • React 360
     • AR.js
     • 3D Content
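The "little bit of math" in this stack is mostly vectors and matrices: every library above positions objects by multiplying coordinates by transformation matrices. A runnable sketch of the core idea in plain JavaScript, using a 2D rotation (Three.js does the equivalent in 3D with `THREE.Matrix4`; `rotate2D` here is an illustrative helper, not a library function):

```javascript
// Rotate a 2D point around the origin by `angle` radians.
// This is a 2x2 rotation matrix applied to the vector [x, y];
// 3D engines do the same with 4x4 matrices.
function rotate2D([x, y], angle) {
  const cos = Math.cos(angle);
  const sin = Math.sin(angle);
  return [x * cos - y * sin, x * sin + y * cos];
}

// Rotating the point (1, 0) by 90 degrees lands on (0, 1).
const [rx, ry] = rotate2D([1, 0], Math.PI / 2);
console.log(rx.toFixed(2), ry.toFixed(2)); // → "0.00 1.00"
```

Comfort with this kind of transform makes the Scene/Camera/Geometry APIs in the following slides much less mysterious.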
  6. Three.js
     • https://threejs.org/
     • WebGL graphics library
     • Scene – What to display
     • Camera – What to view
     • Renderer – How to display
     • Geometry – Objects, textures, etc.
  7. $ npm install three
     ...
     import { Scene } from 'three';
     const scene = new Scene();

     OR

     <script src="https://fastcdn.org/three.js/73/three.min.js"></script>
  8. var scene = new THREE.Scene();
     var camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
     var renderer = new THREE.WebGLRenderer();
     renderer.setSize(window.innerWidth, window.innerHeight);
     document.body.appendChild(renderer.domElement);

     var geometry = new THREE.BoxGeometry(1, 1, 1);
     var material = new THREE.MeshBasicMaterial({color: 0x00ff00});
     var cube = new THREE.Mesh(geometry, material);
     scene.add(cube);
     camera.position.z = 5;
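The slide stops short of actually drawing anything: nothing appears on screen until a render loop calls `renderer.render()`. A minimal browser-only sketch, assuming the `scene`, `camera`, `renderer`, and `cube` variables from the slide above:

```javascript
// Classic Three.js animation loop: request the next frame,
// mutate the scene, then redraw it (roughly 60 times per second).
function animate() {
  requestAnimationFrame(animate);
  cube.rotation.x += 0.01;
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
}
animate();
```

Using `requestAnimationFrame` rather than `setInterval` lets the browser pause rendering when the tab is hidden and keeps drawing in sync with the display's refresh rate.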
  9. WebVR
     • Low-level API to gather info about the VR display + capabilities
     • Eye parameters, the data to render the scene for each eye
     • Field of view data
     • Position of the VR display (and velocity and acceleration)
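In practice, the legacy WebVR API is entered through `navigator.getVRDisplays()`. A browser-only sketch of feature detection (note that WebVR is deprecated in favor of WebXR and is disabled or removed in most current browsers):

```javascript
// Legacy WebVR feature detection.
if (navigator.getVRDisplays) {
  navigator.getVRDisplays().then((displays) => {
    if (displays.length > 0) {
      const display = displays[0];
      console.log('VR display found:', display.displayName);
      console.log('External display:', display.capabilities.hasExternalDisplay);
    } else {
      console.log('WebVR supported, but no displays are connected.');
    }
  });
} else {
  console.log('WebVR not supported; consider the WebXR polyfill.');
}
```

The `VRDisplay` object returned here is where the eye parameters, field-of-view data, and pose (position, velocity, acceleration) listed above are exposed.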
  10. WebXR (unstable!)
      • Evolution of the WebVR spec
      • Much faster than the WebVR API
      • Better architecture to support both VR and AR, and multiple view types:
        • Desktop
        • VR Headset
        • Smartphone
        • Magic Window
      • Touchscreen, Mouse, Gamepad, Controllers
  11. The future of the web is immersive (Google I/O ’18)
      https://www.youtube.com/watch?v=1t1gBVykneA
  12. WebXR-Polyfill
      • Best way to make sure code written to the XR spec (proposal) will actually work
      https://github.com/immersive-web/webxr-polyfill

      <script src='https://cdn.jsdelivr.net/npm/webxr-polyfill@latest/build/webxr-polyfill.js'></script>

      $ npm install --save webxr-polyfill
  13. function initXR() {
        xrButton = new XRDeviceButton({
          onRequestSession: onRequestSession,
          onEndSession: onEndSession
        });
        document.querySelector('header').appendChild(xrButton.domElement);
        if (navigator.xr) {
          navigator.xr.requestDevice().then((device) => {
            device.supportsSession({immersive: true}).then(() => {
              xrButton.setDevice(device);
            });
          });
        }
      }

      function onRequestSession(device) {
        device.requestSession({immersive: true}).then(onSessionStarted);
      }
  14. function onSessionStarted(session) {
        xrButton.setSession(session);
        session.addEventListener('end', onSessionEnded);
        gl = createWebGLContext({compatibleXRDevice: session.device});
        renderer = new Renderer(gl);
        scene.setRenderer(renderer);
        session.baseLayer = new XRWebGLLayer(session, gl);
        session.requestFrameOfReference('eye-level').then((frameOfRef) => {
          xrFrameOfRef = frameOfRef;
          session.requestAnimationFrame(onXRFrame);
        });
      }
  15. function onXRFrame(t, frame) {
        let session = frame.session;
        scene.startFrame();
        session.requestAnimationFrame(onXRFrame);
        let pose = frame.getDevicePose(xrFrameOfRef);
        if (pose) {
          gl.bindFramebuffer(gl.FRAMEBUFFER, session.baseLayer.framebuffer);
          gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
          for (let view of frame.views) {
            let viewport = session.baseLayer.getViewport(view);
            gl.viewport(viewport.x, viewport.y, viewport.width, viewport.height);
            scene.draw(view.projectionMatrix, pose.getViewMatrix(view));
          }
        }
        scene.endFrame();
      }
  16. A-Frame
      • Web framework originally from Mozilla for rendering AR and VR in web pages
      • Declarative syntax
      • 3D scene graph with markup language
  17. <head>
        <script src="https://aframe.io/releases/0.8.0/aframe.min.js"></script>
      </head>

      $ npm install aframe
      ...
      require('aframe');
  18. <body>
        <a-scene>
          <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
          <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
          <a-cylinder position="1 0.75 -3" radius="0.5" height="1.5" color="#FFC65D"></a-cylinder>
          <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
          <a-sky color="#ECECEC"></a-sky>
        </a-scene>
      </body>
  19. Loading 3D Models

      <a-assets>
        <a-asset-item id="cityModel"
          src="https://cdn.aframe.io/test-models/models/virtualcity/VC.gltf">
        </a-asset-item>
      </a-assets>
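The `<a-assets>` block only preloads the model; it still has to be referenced by an entity before it appears in the scene. A sketch using A-Frame's `gltf-model` component and the `cityModel` id from the slide (the position value is illustrative):

```html
<a-scene>
  <a-assets>
    <a-asset-item id="cityModel"
      src="https://cdn.aframe.io/test-models/models/virtualcity/VC.gltf">
    </a-asset-item>
  </a-assets>
  <!-- Reference the preloaded asset by its id -->
  <a-entity gltf-model="#cityModel" position="0 0 -10"></a-entity>
</a-scene>
```

Preloading through `<a-assets>` lets A-Frame block scene start until heavy geometry is fetched, instead of popping models in mid-experience.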
  20. https://aframe.io/
      https://aframe.io/aframe-school/#/
  21. React 360

      $ npm install -g react-360-cli
      $ react-360 init Hello360
      $ cd Hello360
      $ npm start
  22. React 360
      • You can use React to build VR web UIs
      • Render React Native components in 3D
  23. import React from 'react';
      import { AppRegistry, StyleSheet, Text, View } from 'react-360';

      export default class Hello360 extends React.Component {
        render() {
          return (
            <View style={styles.panel}>
              <View style={styles.greetingBox}>
                <Text style={styles.greeting}>
                  Welcome to React 360
                </Text>
              </View>
            </View>
          );
        }
      }
  24. React 360
      • Similar in architecture to React Native
      • Uses Web Workers to avoid the single-threaded computation limitation, which could impact performance and break immersion
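The Web Worker pattern React 360 relies on can be sketched in a few lines of browser-only JavaScript. This is the general pattern, not React 360's actual internals; `worker.js`, the message shape, and `applySceneUpdates` are all hypothetical names for illustration:

```javascript
// Run app logic in a Worker so the main thread's
// requestAnimationFrame render loop is never blocked.
const worker = new Worker('worker.js');

// Apply UI updates computed off the main thread.
worker.onmessage = (event) => {
  applySceneUpdates(event.data); // hypothetical helper
};

// Forward input events to the app code running in the worker.
window.addEventListener('click', (e) => {
  worker.postMessage({ type: 'click', x: e.clientX, y: e.clientY });
});
```

Because the worker and the window only exchange messages, slow application code degrades UI responsiveness rather than frame rate, which matters far more in VR, where dropped frames cause motion sickness.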
  25. React 360
      • Can also load 3D models, using Entity (multiple formats)

      // to reference a GLTF2 model
      <Entity source={{gltf2: asset('myModel.gltf')}} />

      // to reference an untextured OBJ model
      <Entity source={{obj: asset('myModel.obj')}} />

      // to reference an OBJ with matching MTL file
      <Entity source={{obj: asset('myModel.obj'), mtl: asset('myModel.mtl')}} />
  26. Augmented Reality
      • Similar problems solved in VR
      • Must be able to identify real-world geometry
      • Capable of marker-based AR at 60fps, even on budget smartphones
      https://aframe.io/blog/arjs/
  27. Building AR with A-Frame (AR.js)

      <script src="https://jeromeetienne.github.io/AR.js/aframe/build/aframe-ar.js"></script>

      <a-scene embedded arjs>
        <a-marker-camera preset='hiro'></a-marker-camera>
      </a-scene>
  28. Hiro Marker
  29. Building AR with A-Frame

      <body style='margin: 0px; overflow: hidden;'>
        <a-scene embedded arjs>
          <!-- create your content here. just a box for now -->
          <a-box position='0 0.5 0' material='opacity: 0.5;'></a-box>
          <!-- define a camera which will move according to the marker position -->
          <a-marker-camera preset='hiro'></a-marker-camera>
        </a-scene>
      </body>
  30. Wikitude SDK
      • Paid SDK that implements sophisticated AR algorithms, available as a plugin for native or Cordova projects
      • Free trial available (for experiments and education)
      • Instant Tracking, SLAM, SMART
      • Built on top of ARCore and ARKit
      • https://www.wikitude.com/augmented-reality-instant-tracking/
  31. 3D Assets
      • https://sketchfab.com/
      • https://poly.google.com/
      • Create or buy 3D content to build amazing AR and VR experiences
  32. Continued Study
      • https://www.khanacademy.org/math/linear-algebra
      • https://medium.com/@necsoft/three-js-101-hello-world-part-1-443207b1ebe1
      • https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API#Guides
      • https://github.com/mozilla/aframe-xr
      • https://aframe.io/blog/arjs/
  33. Summary
      • Extended Reality is of increasing interest to many industries
      • Web tech can get us quite far, even with today’s experimental APIs
      • There are a number of entry points into this tech stack; pick the right level of abstraction for you
  34. Thank You!

Editor's Notes

  • AR/VR/MR/XR terms are often used interchangeably

    In 2016, the Oculus Rift (the high-end experience) was about $800, and over $1000 the year before; it is currently in the $550 range

    https://www.zdnet.com/article/walmart-deploys-17000-oculus-go-headsets-to-train-its-employees/
  • Front-end training and consulting firm
    VoIP & chat apps for telecom (Mitel)
    Real-time telemetry and diagnostics apps for automotive service industry (FCA)
    Customized global developer platforms training (Facebook, BlackBerry)
    Modern search UI and experience for Ontario Electronic Land Registry (Teranet)
    B2B e-Commerce portal for retail giant (Adidas)

    DevC Toronto
    1500 member community
    Monthly meetups, guidance, training, networking, learn about latest tech from Facebook platform
  • https://www.zdnet.com/article/walmart-deploys-17000-oculus-go-headsets-to-train-its-employees/

    Talk about each image and how its market is relevant
  • We are going to walkthrough a variety of options to get started with these technologies, to get a better understanding of the state of the art in 2018.

    Once we understand where each tool or library is focused, hopefully we can make a decision about what is the appropriate level of detail for us

    How many of you would consider yourselves beginner web developers?
    How many would consider themselves to be intermediate? Expert?
    How many would consider themselves to be beginner AR/VR developers?
    How many would consider themselves to be intermediate or expert AR/VR?
  • Core building block of many libraries in this field
    WebGL is too low level for web developers (shader pipeline code, GPU programming) – basically the thinnest layer possible for a browser to access the GPU directly
    Fun to learn about, but need to dedicate a huge chunk of time to become a computer graphics expert
    Need strong understanding of mathematics in computer graphics (linear algebra, geometry, some physics)
    Not the best use of time when you have a real web project with tight deadlines

    Ricardo Cabello – Mr.Doob
  • Beware, not all the example code has been ported over to work with ES6 modules, this is an open issue on GitHub
  • Introductory three.js code

    Configures a scene, camera, and renderer, and draws a cube

    Much easier to work with than dealing with low-level WebGL API
  • Basic boilerplate and setup info to configure a VR session on a web page
  • Handles an enormous amount of boilerplate code to set up a VR session consistently across many different types of setups
  • https://caniuse.com/#search=webxr
  • // Checks to see if WebXR is available and, if so, queries a list of
    // XRDevices that are connected to the system.

    // Adds a helper button to the page that indicates if any XRDevices are
    // available and lets the user pick between them if there are multiple.

    // Is WebXR available on this UA?

    // Request an XRDevice connected to the system.

    // If the device allows creation of exclusive sessions set it as the
    // target of the 'Enter XR' button.
  • // Called when the user selects a device to present to. In response we
    // will request an exclusive session from that device.


    // Called when we've successfully acquired a XRSession. In response we
    // will set up the necessary session state and kick off the frame loop.
    // This informs the 'Enter XR' button that the session has started and
    // that it should display 'Exit XR' instead.
    // Listen for the session's 'end' event so we can respond if the user
    // or UA ends the session for any reason.
    // Create a WebGL context to render with, initialized to be compatible
    // with the XRDisplay we're presenting to.
    // Create a renderer with that GL context (this is just for the samples
    // framework and has nothing to do with WebXR specifically.)
    // Set the scene's renderer, which creates the necessary GPU resources.
    // Use the new WebGL context to create an XRWebGLLayer and set it as the
    // session's baseLayer. This allows any content rendered to the layer to
    // be displayed on the XRDevice.
    // Get a frame of reference, which is required for querying poses. In
    // this case an 'eye-level' frame of reference means that all poses will
    // be relative to the location where the XRDevice was first detected.
  • // Per-frame scene setup. Nothing WebXR specific here.
    // Inform the session that we're ready for the next frame.
    // Get the XRDevice pose relative to the Frame of Reference we created
    // earlier.
    // Getting the pose may fail if, for example, tracking is lost. So we
    // have to check to make sure that we got a valid pose before attempting
    // to render with it. If not in this case we'll just leave the
    // framebuffer cleared, so tracking loss means the scene will simply
    // disappear.
    // If we do have a valid pose, bind the WebGL layer's framebuffer,
    // which is where any content to be displayed on the XRDevice must be
    // rendered.
    // Clear the framebuffer
    // Loop through each of the views reported by the frame and draw them
    // into the corresponding viewport.
    // Draw this view of the scene. What happens in this function really
    // isn't all that important. What is important is that it renders
    // into the XRWebGLLayer's framebuffer, using the viewport into that
    // framebuffer reported by the current view, and using the
    // projection and view matrices from the current view and pose.
    // We bound the framebuffer and viewport up above, and are passing
    // in the appropriate matrices here to be used when rendering.
  • That is a lot of code just to create a VR scene. Good to know what the API is responsible for, but what if you just want to start building VR environments in the browser as quickly as possible?

    Is ~10 LOC quick enough?

    This is the best place to start for 90% of web devs

    https://aframe.io/


    Brilliant entity component framework by Mozilla, allows you to declaratively build and define a 3D VR scene with a natural HTML markup syntax

    Abstract a scene using entity component architecture
    Components and entities abstract 3d concepts like geometry, mesh, lighting, textures into single objects in a scene
    Can create higher level re-usable components
    Declaratively mark up a scene using high level re-usable components
  • Multiple ways to install, depending on your project's build setup.
  • Declaratively build a scene by nesting entities inside of <a-scene>

    Components are an entity's attributes

    Compose entities of each other

    Can extend this to a full interaction model
  • Load huge geometry efficiently
  • This fetches the latest version of the CLI and installs it on your system. After installation, we can use it to generate the initial code for our first project. Start by navigating to a directory where you would like to put your new project, and run the command to create a new project called Hello360.

    This creates a new directory called Hello360, with all of the files needed to run your project. Enter the directory to view them.

    During development, the bundler runs a local server that allows you to access your project. It serves up the files for your project, doing any necessary compilation or processing at request time. When you're ready to publish your project, you can instruct the bundler to build production versions of these files, so you can place them on any web server. For now, start the development server with the following command.

    After the first load, successive loads are much faster, even when you change your code. You can watch the progress in the terminal where you started your server, and once the app has loaded you should see something like this in your browser:
  • Previously React VR; the project has been renamed React 360. Focused on 3D and 360 UIs, and on photosphere and 360 video integration for web applications

    React 360 allows you to use familiar tools and concepts to create immersive 360 content on the web.

    Abstractions are focused on user interfaces and user interactions, rather than on a 3D scene framework (the A-Frame approach)
    Uses Three.js internally, but in the future will be open to working with WebGL directly
  • Executors
    Executors are pieces of the runtime that run your React application in a separate process from the main browser window. React 360 ships with two different executors, but chances are good that you don't need to worry about configuring this.
    Web Worker Executor
    Web Workers are a modern browser feature that allows code to run outside of the main window context. This is ideal for high-cost operations, or in our case, running code without interrupting the requestAnimationFrame loop. By default, React 360 uses a Web Worker to execute your React code. That means that any code found in your index.js runs inside a Web Worker environment, not the standard browser window
  • React 360 is still in VERY early stages

    Minimal API surface, not many built-in components and APIs yet. Oculus native APIs are the focus for game development; React 360 has transitioned to focusing on immersive 360 media for the web, which can be experienced with VR and AR equipment, but it's not necessarily a requirement.

    Not many examples yet
  • Windows Mixed Reality
    Smartphone cameras

    A good AR experience is generally understood to be harder to pull off than a VR one

    This is because VR developers have 100% control of user perception. AR needs to take into account data from the real world, at human scale and time

    Markers allow the AR engine to determine physical space that is being viewed by the camera
    Algorithms adjust over time, if the expected surface “drifts” from the actual surface, markers help the AR engine correct itself

  • Can be thought of as a “QR code” that activates the AR display
    Can create custom markers as well

    But AR.js supports another kind of marker, called barcode. Each of those markers contains a binary code which encodes a number; the example on the slide represents 5
  • While the tech is not quite mature, the capabilities of the existing tech are quite impressive
    The more time you spend working with AR and VR features, the more you start to feel like this is going to be the future for a lot of industries

  • Time for questions?