Can you hear me now?

Presented at Berlin.js May 31st 2018 for the #jsconfeu Special. Almost every video call begins with the same clumsy questions. Can you hear me now? Did I just turn off my camera instead of my mic now? But what if we could take the awkward troubleshooting out of the conversation, and solve it with code instead? In this talk, Ingvild Indrebø will give you a glimpse into aspects of WebRTC, WebAudio and Canvas, by showing you how she used these technologies to build a user-friendly and accessible tool to make sure you’re all set for your video call.

Published in: Engineering

  1. Can you hear me now? Saying goodbye to clumsy video calls, thanks to code. Ingvild Indrebø @IngvildIndrebo
  2. People struggling is no joke.
  3. The technical perspective: • Get access to the camera • Get access to the microphone • Play sound from the speakers
  4. The technical perspective: • Get access to the camera • Get access to the microphone • Play sound from the speakers
  5. Is this really a user problem?
  6. Can you see yourself?
  7. WebRTC
  8. WebRTC allows browsers to communicate peer to peer.
  9. WebRTC can be used for all kinds of data transfer.
  10. getUserMedia → Grant permission → mediaStream
  11. getUserMedia navigator.mediaDevices .getUserMedia({ video: true, audio: true })
  12. getUserMedia navigator.mediaDevices .getUserMedia({ video: true, audio: true }) .then(playStream)
  13. getUserMedia navigator.mediaDevices .getUserMedia({ video: true, audio: true }) .then(playStream) .catch(handleError);
  14. <video id="video" playsinline/>
  15. <video id="video" playsinline/> function playStream(mediaStream) { }
  16. <video id="video" playsinline/> function playStream(mediaStream) { const videoEl = document.getElementById('video'); }
  17. <video id="video" playsinline/> function playStream(mediaStream) { const videoEl = document.getElementById('video'); videoEl.srcObject = mediaStream; }
  18. <video id="video" playsinline/> function playStream(mediaStream) { const videoEl = document.getElementById('video'); videoEl.srcObject = mediaStream; videoEl.play(); }
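The slides pass a handleError callback to .catch() but never show its body. A minimal sketch of what it might look like: the DOMException names below are the standard ones the getUserMedia spec defines for rejections, while the message strings (and the helper itself) are illustrative assumptions, not taken from the talk.

```javascript
// Hypothetical handleError for the .catch() on slide 13.
// The error names are the spec-defined DOMException names;
// the user-facing messages are illustrative.
function handleError(error) {
  switch (error.name) {
    case 'NotAllowedError':
      return 'Permission to use the camera/microphone was denied.';
    case 'NotFoundError':
      return 'No camera or microphone was found.';
    case 'NotReadableError':
      return 'The device is already in use by another application.';
    default:
      return 'Could not start the call: ' + error.name;
  }
}
```

Mapping error names to plain-language messages is exactly the kind of "solve it with code" the talk argues for: the user sees a fixable explanation instead of a silent failure.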
  19. Can you hear me?
  20. We already have access to the microphone. But can we hear anything?
  21. “WebAudio is a system for controlling audio on the Web, allowing developers to choose audio sources, add effects to audio, create audio visualizations, apply spatial effects (such as panning) and much more…” (https://developer.mozilla.org)
  22. Audio sources • computed mathematically (such as OscillatorNode) • recordings from audio/video files (<audio/>, <video/>) • WebRTC MediaStream
  23. Audio context: Input → Analyze + Effects → Destination
  24. WebAudio const audioContext = new AudioContext();
  25. WebAudio const audioContext = new AudioContext(); const audioSrc = audioContext.createMediaStreamSource( mediaStream );
  26. WebAudio const audioContext = new AudioContext(); const audioSrc = audioContext.createMediaStreamSource( mediaStream ); const analyser = audioContext.createAnalyser();
  27. WebAudio const audioContext = new AudioContext(); const audioSrc = audioContext.createMediaStreamSource( mediaStream ); const analyser = audioContext.createAnalyser(); audioSrc.connect(analyser);
  28. That we know it works isn’t enough.
  29. But… what is sound?
  30. Sound wave
  31. Fourier transformation
  32. Volume
  33. Amplitude
  34. function canWeHearYou(analyser) { const dataArray = new Uint8Array(analyser.frequencyBinCount); analyser.getByteFrequencyData(dataArray); }
  35. function canWeHearYou(analyser) { const dataArray = new Uint8Array(analyser.frequencyBinCount); analyser.getByteFrequencyData(dataArray); } dataArray: [3, 25, 179, 5, …]
  36. dataArray = [ , , , ]
  37. Frequencies: 0–22 Hz, 23–46 Hz, 47–68 Hz, 69–90 Hz, … Each bucket contains the total volume of that frequency range.
  38. Frequencies: 0–22 Hz, 23–46 Hz, 47–68 Hz, 69–90 Hz, … Each bucket contains the total volume of that frequency range.
  39. function canWeHearYou(analyser) { const dataArray = new Uint8Array(analyser.frequencyBinCount); analyser.getByteFrequencyData(dataArray); const sum = dataArray.reduce((a, b) => a + b, 0); return sum > 0; }
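The detection step in canWeHearYou can be factored into a pure function over the frequency data, so the logic is testable without a live AnalyserNode. This is a sketch, not the talk's code: hasSignal and its threshold are assumptions, added because a bare "sum > 0" can also fire on the noise floor of an open but silent mic.

```javascript
// Testable variant of the signal check from slide 39.
// dataArray holds byte values (0-255) per frequency bucket,
// as produced by analyser.getByteFrequencyData.
// The threshold of 10 is an arbitrary illustration value.
function hasSignal(dataArray, threshold = 10) {
  const sum = dataArray.reduce((a, b) => a + b, 0);
  const average = sum / dataArray.length;
  return average > threshold;
}
```

In the real check you would still fill dataArray from the analyser as on slide 34, and call this repeatedly (for example from a requestAnimationFrame loop) while the user speaks.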
  40. Visualization • Our dataArray gives us a sequence of numbers • We use Canvas to draw them
  41. <canvas id="canvas"/> function draw(dataArray) { }
  42. <canvas id="canvas"/> function draw(dataArray) { const canvas = document.getElementById('canvas');
  43. <canvas id="canvas"/> function draw(dataArray) { const canvas = document.getElementById('canvas'); const canvasCtx = canvas.getContext('2d');
  44. <canvas id="canvas"/> function draw(dataArray) { const canvas = document.getElementById('canvas'); const canvasCtx = canvas.getContext('2d'); canvasCtx.beginPath();
  45. <canvas id="canvas"/> function draw(dataArray) { const canvas = document.getElementById('canvas'); const canvasCtx = canvas.getContext('2d'); canvasCtx.beginPath(); canvasCtx.moveTo(0, 0);
  46. function draw(dataArray) { const sliceWidth = CANVAS_WIDTH/dataArray.length; let x = 0; for(let y of dataArray) { canvasCtx.lineTo(x, y); x += sliceWidth; }
  47. function draw(dataArray) { const sliceWidth = CANVAS_WIDTH/dataArray.length; let x = 0; for(let y of dataArray) { canvasCtx.lineTo(x, y); x += sliceWidth; } canvasCtx.lineTo(canvas.width, 0);
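The coordinate mapping inside the draw() build above can be pulled out into a pure helper, so it can be checked without a canvas. This is an extracted sketch, not the talk's code: toPoints is a hypothetical name, and canvasWidth stands in for the slides' CANVAS_WIDTH constant.

```javascript
// Maps frequency data to the (x, y) points the canvas path
// visits: x advances one sliceWidth per bucket, y is the
// bucket's byte value, mirroring the loop on slides 46-47.
function toPoints(dataArray, canvasWidth) {
  const sliceWidth = canvasWidth / dataArray.length;
  return Array.from(dataArray, (y, i) => [i * sliceWidth, y]);
}
```

In the browser you would feed these points to canvasCtx.lineTo inside the requestAnimationFrame loop that also refreshes dataArray from the analyser.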
  48. Demo
  49. What a mic can hear: ~80 Hz–15 kHz. What a human can hear: ~20 Hz–20 kHz.
  50. Can you hear the sound?
  51. Audio sources • computed mathematically (such as OscillatorNode) • recordings from audio/video files (<audio/>, <video/>) • WebRTC MediaStream
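The first source type on the list above, a mathematically computed OscillatorNode, is the simplest way to play a test tone and let users confirm their speakers work. A hedged sketch, since the talk doesn't show this code: the helper, its name, and the 440 Hz choice are all illustrative assumptions.

```javascript
// Hypothetical speaker test using an OscillatorNode.
// Connects the oscillator straight to the context's output,
// starts it, and schedules it to stop after durationSeconds.
function playTestTone(audioContext, durationSeconds = 1) {
  const oscillator = audioContext.createOscillator();
  oscillator.frequency.value = 440; // A4: audible on most speakers
  oscillator.connect(audioContext.destination);
  oscillator.start();
  oscillator.stop(audioContext.currentTime + durationSeconds);
  return oscillator;
}
```

Pairing this tone with the "can you hear the sound?" prompt turns the speaker check into a yes/no question the user can actually answer.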
  52. WebAudio <audio id="callingSound" src="…"/>
  53. WebAudio <audio id="callingSound" src="…"/> const audioContext = new AudioContext();
  54. WebAudio <audio id="callingSound" src="…"/> const audioContext = new AudioContext(); const audioEl = document.getElementById('callingSound');
  55. WebAudio <audio id="callingSound" src="…"/> const audioContext = new AudioContext(); const audioEl = document.getElementById('callingSound'); const audioSrc = audioContext.createMediaElementSource(audioEl);
  56. WebAudio <audio id="callingSound" src="…"/> const audioContext = new AudioContext(); const audioEl = document.getElementById('callingSound'); const audioSrc = audioContext.createMediaElementSource(audioEl); const analyser = audioContext.createAnalyser();
  57. WebAudio <audio id="callingSound" src="…"/> const audioContext = new AudioContext(); const audioEl = document.getElementById('callingSound'); const audioSrc = audioContext.createMediaElementSource(audioEl); const analyser = audioContext.createAnalyser(); audioSrc.connect(analyser);
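One detail the build above leaves implicit: unlike the microphone case, createMediaElementSource reroutes the element's audio through the Web Audio graph, so unless the graph is connected onward to the context's destination, the calling sound becomes silent. A hypothetical helper (the function and its name are not from the talk) wiring it up:

```javascript
// Routes an element source through an analyser while keeping
// it audible: source -> analyser -> speakers. Without the
// final connect, a media-element source produces no output.
function routeThroughAnalyser(audioContext, audioSrc, analyser) {
  audioSrc.connect(analyser);                 // element -> analyser
  analyser.connect(audioContext.destination); // analyser -> speakers
  return analyser;
}
```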
  58. There are no user problems.
  59. Thank you! confrere.com/test Ingvild Indrebø @IngvildIndrebo ingvild@confrere.com
