Creating the interfaces of the future with the APIs of today
Transcript

  • 1. Create the interfaces of the future with the web APIs of today @gebille
  • 2. Two “futuristic” interfaces using web APIs
  • 3. + web sockets + device orientation = + WebGL!!
  • 4. Data flow: remote.js —(α, β, ɣ)→ server.js —(α, β, ɣ)→ teapot.js
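The payload that travels along this pipeline is just three Euler angles serialized as JSON: remote.js stringifies them, the server broadcasts the string untouched, and teapot.js parses it back. A minimal round-trip sketch (the variable names here are illustrative, not from the slides):

```javascript
// What remote.js sends: three Euler angles as a JSON string.
var payload = JSON.stringify({ alpha: 30, beta: 45, gamma: 60 });

// ...after travelling remote.js -> server.js -> teapot.js...
// What teapot.js receives: the same angles as plain numbers again.
var data = JSON.parse(payload);
```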
  • 5. web sockets
  • 6. remote.js:
    var websocketServerUrl = 'ws://10.112.0.139:8080/';
    window.addEventListener('DOMContentLoaded', function init() {
      // init websocket connection
      // device orientation sync socket
      var ws = new WebSocket(websocketServerUrl);
      ws.onopen = function() {
        ws.opened = true;
      };
      // listen to device orientation
      window.addEventListener('deviceorientation', function(e) {
        if (ws.opened) {
          ws.send(JSON.stringify({
            alpha: e.alpha,
            beta: e.beta,
            gamma: e.gamma
          }));
        }
      });
    });
  • 7. server.js:
    // ws server
    var ws = require('websocket-server');
    var wsServer = ws.createServer();
    wsServer.addListener('connection', function(connection) {
      connection.addListener('message', function(msg) {
        // relay every incoming orientation message to all clients
        wsServer.broadcast(msg);
      });
    });
    wsServer.listen(8080);
  • 8. teapot.js:
    window.addEventListener('DOMContentLoaded', function init() {
      // connect to server using websockets
      var ws = new WebSocket('ws://10.112.0.139:8080/');
      ws.onopen = function() {
        ws.onmessage = function(e) {
          // convert the received angles from degrees to radians
          var data = JSON.parse(e.data),
              avalue = data.alpha / 180 * Math.PI,
              bvalue = data.beta / 180 * Math.PI,
              gvalue = data.gamma / 180 * Math.PI;
          teapot.rotation.set(gvalue, avalue, -bvalue);
        };
      };
    });
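The conversion inside teapot.js above can be factored into pure helpers, which makes the axis mapping explicit: deviceorientation angles arrive in degrees, while three.js rotations are radians. (degToRad and mapOrientation are hypothetical names, not part of the slides' code.)

```javascript
// Device-orientation angles are degrees; three.js wants radians.
function degToRad(deg) {
  return deg / 180 * Math.PI;
}

// Mirrors the slide's teapot.rotation.set(gvalue, avalue, -bvalue):
// gamma -> x rotation, alpha -> y rotation, negated beta -> z rotation.
function mapOrientation(alpha, beta, gamma) {
  return [degToRad(gamma), degToRad(alpha), -degToRad(beta)];
}
```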
  • 9. socket.io
  • 10. device orientation
  • 11. remote.js:
    // listen to device orientation
    window.addEventListener('deviceorientation', function(e) {
      angles.innerHTML = 'alpha: ' + e.alpha + ', beta: ' + e.beta + ', gamma: ' + e.gamma;
      if (ws.opened) {
        ws.send(JSON.stringify({
          alpha: e.alpha,
          beta: e.beta,
          gamma: e.gamma
        }));
      }
    });
  • 12. slideshare.net/gerbille/device-disorientation
  • 13. WebGL
  • 14. three.js
  • 15. // scene size
    var WIDTH = 724, HEIGHT = 512;
    // get the DOM element to attach to
    var container = document.getElementById('container');
    // create a WebGL renderer, set its size and append it to the DOM
    var renderer = new THREE.WebGLRenderer();
    renderer.setSize(WIDTH, HEIGHT);
    renderer.setClearColorHex(0x111111, 1);
    renderer.clear();
    container.appendChild(renderer.domElement);
    // create a scene
    var scene = new THREE.Scene();
  • 16. // camera settings: fov, aspect ratio, near, far
    var FOV = 45, ASPECT = WIDTH / HEIGHT, NEAR = 0.1, FAR = 10000;
    // create a camera and position it on the z axis (starts at 0,0,0)
    var camera = new THREE.PerspectiveCamera(FOV, ASPECT, NEAR, FAR);
    camera.position.z = 100;
    // add the camera to the scene
    scene.add(camera);
    // create some lights, position them and add them to the scene
    var spotlight = new THREE.SpotLight();
    spotlight.position.set(170, 330, -160);
    scene.add(spotlight);
    var ambilight = new THREE.AmbientLight(0x333333);
    scene.add(ambilight);
    // enable shadows on the renderer
    renderer.shadowMapEnabled = true;
  • 17. // add an object (teapot) to the scene
    var teapot;
    var loader = new THREE.JSONLoader(),
        createScene = function createScene(geometry) {
          var material = new THREE.MeshFaceMaterial();
          teapot = new THREE.Mesh(geometry, material);
          teapot.scale.set(8, 8, 8);
          teapot.position.set(0, -10, 0);
          scene.add(teapot);
          console.log('matrix ' + teapot.matrix);
          console.log('rotation ' + teapot.rotation.x);
        };
    loader.load('teapot-model.js', createScene);
    // draw
    renderer.render(scene, camera);
    animate();
    // animate
    function animate() {
      requestAnimationFrame(animate);
      renderer.render(scene, camera);
    }
  • 18. + getUserMedia = + WebGL!!
  • 19. getUserMedia
  • 20. <video id="camera" autoplay></video>
    var video = document.getElementById('camera');
    navigator.getUserMedia({ video: true }, function(stream) {
      video.src = window.URL.createObjectURL(stream) || stream;
    }, function() {
      // error...
    });

    ** to make sure your code works in ALL browsers, add these two lines first:
    navigator.getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia || navigator.msGetUserMedia;
    window.URL = window.URL || window.webkitURL || window.mozURL || window.msURL;
  • 21. headtrackr.js
  • 22. <canvas id="inputCanvas" width="320" height="240" style="display:none"></canvas>
    <video id="inputVideo" autoplay loop></video>
    <script>
      var videoInput = document.getElementById('inputVideo');
      var canvasInput = document.getElementById('inputCanvas');
      var htracker = new headtrackr.Tracker();
      htracker.init(videoInput, canvasInput);
      htracker.start();
    </script>
  • 23. // set up camera controller for head-coupled perspective
    headtrackr.controllers.three.realisticAbsoluteCameraControl(camera, 27, [0, 0, 50], new THREE.Vector3(0, 0, 0), { damping: 0.5 });

    * @param {THREE.PerspectiveCamera} camera
    * @param {number} scaling size of screen in 3d-model relative to vertical size of computer screen in real life
    * @param {array} fixedPosition array (x,y,z) w/ the position of the real-life screen in the 3d-model space coordinates
    * @param {THREE.Vector3} lookAt the object/position the camera should be pointed towards
    * @param {object} params optional object with optional parameters
  • 24. document.addEventListener('headtrackingEvent', function(event) {
      scene.fog = new THREE.Fog(0x000000, 1 + (event.z * 27), 3000 + (event.z * 27));
    }, false);

    * x : position of head in cms right of camera, as seen from the user's point of view (see figure)
    * y : position of head in cms above camera (see figure)
    * z : position of head in cms distance from camera (see figure)
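The fog distances above are linear in the head's distance z from the camera (in cm), scaled by the same factor 27 passed to the camera controller, so the fog tracks the viewer as they lean in or out. As a pure function (fogRangeForHead is a hypothetical name, not from the slides):

```javascript
// Reproduces the slide's fog parameters:
// near = 1 + z*27, far = 3000 + z*27,
// where z is head distance from the camera in cm
// and 27 is the screen-size scaling factor.
function fogRangeForHead(z) {
  return { near: 1 + z * 27, far: 3000 + z * 27 };
}
```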
  • 25. WebGL
  • 26. three.js
  • 27. // top wall
    plane1 = new THREE.Mesh(
      new THREE.PlaneGeometry(500, 3000, 5, 15),
      new THREE.MeshBasicMaterial({ color: 0xcccccc, wireframe: true })
    );
    plane1.rotation.x = Math.PI / 2;
    plane1.position.y = 250;
    plane1.position.z = 50 - 1500;
    scene.add(plane1);
  • 28. var geometry = new THREE.Geometry();
    geometry.vertices.push(new THREE.Vertex(new THREE.Vector3(0, 0, -80000)));
    geometry.vertices.push(new THREE.Vertex(new THREE.Vector3(0, 0, z)));
    var line = new THREE.Line(geometry, new THREE.LineBasicMaterial({ color: 0xeeeeee }));
    line.position.x = x;
    line.position.y = y;
    scene.add(line);
  • 29. github.com/luzc/wiimote
    auduno.github.com/headtrackr/examples/targets.html
    github.com/auduno/headtrackr
    slideshare.net/gerbille/device-disorientation
  • 30. shinydemos.com/touch-tracker
    github.com/operasoftware
  • 31. @gerbille
    github.com/luzc
    slideshare.net/gerbille
