WebRTC Technical Overview and Introduction
Presentation Transcript

  • WebRTC Tutorial, 28 Nov 2012
  • November 27-29, 2012, South San Francisco Conference Center
    WebRTC Technical Overview and Introduction
    Alan Johnston, Distinguished Engineer, Avaya
    Dan Burnett, Director of Standards, Voxeo Labs
    November 28, 2012
  • WebRTC Tutorial Topics
    - What is WebRTC?
    - How to Use WebRTC: Peer Connection
    - WebRTC Peer-to-Peer Media
    - WebRTC Protocols and IETF Standards
    - WebRTC W3C API Overview
    - Pseudo Code Walkthrough
    - What's Next?
  • Announcement!
    - New book on WebRTC: http://webrtcbook.com
    - Available on Amazon as paperback and Kindle eBook
    - Also iBooks, B&N Nook, etc.
  • What is WebRTC?
  • The Browser RTC Function
    - New real-time communication (RTC) function built into browsers
    - Contains audio and video codecs, the ability to negotiate peer-to-peer connections, and echo cancellation and packet loss concealment
    - In the architecture diagram, the web server speaks on-the-wire signaling protocols (HTTP or WebSockets) to the browser's JavaScript/HTML/CSS application, which reaches the browser RTC function through the RTC APIs (alongside other APIs); the RTC function sits on native OS services and speaks on-the-wire media or data protocols directly to the other browser
    - In Chrome today, Mozilla soon, Internet Explorer and Safari eventually
  • What's New
  • What's New (continued)
  • WebRTC Support of Multiple Media
    - Multiple sources of audio and video are assumed and supported (e.g., microphone audio, application sharing video, front camera video, and rear camera video on Browser M on mobile; webcam video and stereo audio on Browser L on laptop)
    - All RTP media, voice and video, and RTCP feedback messages are multiplexed over the same UDP port and address
  • WebRTC Triangle
    - Both browsers (Browser M and Browser L) run the same HTML5 application, downloaded from the web server (the application)
    - A Peer Connection (audio, video, and/or data) is established between them with the help of the web server
  • WebRTC Trapezoid
    - Similar to the SIP Trapezoid: Browser M runs an HTML5 application from Web Server A (Application A), Browser T runs one from Web Server B (Application B), and a Peer Connection (audio and/or video) runs between the browsers
    - The web servers communicate using SIP or Jingle
    - Unclear how this really works on the web
  • WebRTC and SIP
    - The web server talks SIP to a SIP server, which talks SIP to a SIP client; the Peer Connection (audio and/or video) runs between Browser M and the SIP client
    - The Peer Connection appears as a standard RTP media session, described by SDP
    - The SIP endpoint must support the RTCWEB media extensions (ICE NAT traversal, Secure RTP, etc.)
    - There is also draft-ietf-sipcore-sip-websocket, which defines SIP transport over WebSockets
  • WebRTC and PSTN
    - The Peer Connection terminates on a PSTN gateway, which connects the call to an ordinary phone
    - Audio only
  • How to Use WebRTC
  • How to Use WebRTC
    - Obtain local media (adding more media until all media is added)
    - Set up the Peer Connection
    - Once the Peer Connection is established, attach media or data (attaching more as needed)
    - Close the connection when either browser hangs up
  • How to Use WebRTC: Obtain Local Media
    - getUserMedia(): audio and/or video, constraints, user permissions
    - LocalMediaStream
    - MediaStream: local or derived
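The getUserMedia() step above can be sketched as follows, in the callback style of the 2012-era API. This is a minimal sketch, not the tutorial's own code: the gum parameter stands in for navigator.getUserMedia, which exists only in a browser, so the surrounding flow can be followed (and exercised) elsewhere.

```javascript
// Sketch of the Obtain Local Media step (2012-era callback API).
// `gum` stands in for navigator.getUserMedia; in a real page you
// would call navigator.getUserMedia(constraints, onSuccess, onError)
// directly, which triggers the browser's permission prompt.
function obtainLocalMedia(gum, onStream) {
  var constraints = { "audio": true, "video": true };  // media we are asking for
  gum(constraints,
      function (stream) { onStream(stream); },         // success: a MediaStream
      function (error) { console.log("getUserMedia failed: " + error); });
}
```

In a browser, the success callback fires only after the user grants microphone/camera access, which is why the rest of the call setup is driven from callbacks rather than straight-line code.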
  • How to Use WebRTC: Set Up the Peer Connection
    - RTCPeerConnection: direct media between two peers
    - Handles ICE processing, SDP processing, identity verification, and statistics reporting
  • How to Use WebRTC: Attach Media or Data
    - addStream() (doesn't change media state!)
    - removeStream() (ditto!)
    - createOffer(), createAnswer()
    - setLocalDescription(), setRemoteDescription()
    - Applying the SDP answer makes the magic happen
    - createDataChannel()
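The offer half of the sequence above can be sketched like this. It is a sketch under stated assumptions: pc and signalingChannel are stand-ins for a real RTCPeerConnection and the application's own (non-standard) signaling channel, using the callback signatures shown elsewhere in this tutorial.

```javascript
// Sketch of the offer side: create an SDP offer, apply it as the
// local description, then relay it over the application-defined
// signaling channel (WebRTC does not standardize that channel).
function startCall(pc, signalingChannel) {
  pc.createOffer(function (offer) {
    pc.setLocalDescription(offer,
      function () {
        // Relay the offer only after the local description is applied.
        signalingChannel.send(JSON.stringify({ "sdp": offer }));
      },
      function (error) { console.log("setLocalDescription failed: " + error); });
  }, function (error) { console.log("createOffer failed: " + error); });
}
```

The answering side mirrors this with createAnswer() and, as the slide notes, it is applying the SDP answer via setRemoteDescription()/setLocalDescription() that actually starts media flowing.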
  • How to Use WebRTC: Close the Connection (either browser can close it)
  • WebRTC Peer-to-Peer Media
  • Media Flows in WebRTC (topology diagram: Browser M behind a home WiFi router, Browser D behind a router, Browser L behind a coffee shop WiFi router, and Browser T, all reaching the web server through the Internet)
  • Media without WebRTC (same topology)
  • Peer-to-Peer Media with WebRTC (same topology)
  • NAT Complicates Peer-to-Peer Media
    - Most browsers are behind NATs on the Internet, which complicates the establishment of peer-to-peer media sessions
  • Peer-to-Peer Media Through NAT
    - ICE hole punching can often establish a direct peer-to-peer session between browsers behind different NATs
  • P2P Media Can Stay Local to NAT
    - If both browsers are behind the same NAT, hole punching can often establish a connection that never leaves the NAT
  • ICE Uses STUN and TURN Servers
    - ICE hole punching uses STUN and TURN servers in the public Internet (in the diagram, a STUN server at 198.51.100.9 and a TURN server at 198.51.100.2) to help with NAT traversal
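In API terms, the browser learns about these servers through the iceServers configuration passed when the peer connection is created. A sketch using the documentation addresses from the figure (the credential value is a placeholder):

```javascript
// ICE server configuration matching the figure: one STUN server for
// public-address discovery, one TURN server as a relay of last resort.
// This is plain data; a browser would pass it to
// new RTCPeerConnection(configuration).
var configuration = {
  "iceServers": [
    { "url": "stun:198.51.100.9" },                             // STUN: discovery only
    { "url": "turn:198.51.100.2", "credential": "myPassword" }  // TURN: can relay media
  ]
};
```

ICE then gathers candidates from the local interfaces, the STUN-discovered public address, and the TURN relay, and tries them in that order of preference.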
  • Browser Queries STUN Server
    - The browser (private address 192.168.0.5) sends a STUN test packet to the STUN server to learn its public IP address (the address of the NAT, 203.0.113.4 in the diagram)
  • TURN Server Can Relay Media
    - In some cases hole punching fails, and a TURN media relay on the public Internet must be used
  • WebRTC Protocols and IETF Standards
  • WebRTC Protocols
    - Application layer: HTTP, WebSocket, SRTP, SDP, ICE, STUN, TURN
    - Transport layer: TLS, DTLS, TCP, SCTP, UDP
    - Network layer: IP
  • Data Channel Protocols
    - Protocol stack: data channel data runs over SCTP (congestion and flow control) over DTLS (security/confidentiality) over UDP (transport through the NAT, after ICE hole punching) over the Internet
    - The data channel provides a non-media channel between browsers
    - ICE is still used for NAT traversal
    - Used in gaming and other non-communication applications
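A minimal sketch of data channel use, assuming an already-created peer connection pc; the channel label "game" and the message contents are invented for illustration:

```javascript
// Sketch of data channel use: create a channel on an existing peer
// connection and exchange non-media messages over it. The channel
// behaves much like a WebSocket, but runs peer-to-peer over the
// SCTP/DTLS/UDP stack shown above.
function openGameChannel(pc) {
  var channel = pc.createDataChannel("game");
  channel.onmessage = function (evt) {
    console.log("peer says: " + evt.data);   // e.g., the opponent's move
  };
  channel.onopen = function () {
    channel.send(JSON.stringify({ "move": "e2e4" }));  // send once connected
  };
  return channel;
}
```

Because SCTP provides its own congestion and flow control, the application can treat the channel as a simple message pipe.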
  • A Joint Standards Effort
    - World Wide Web Consortium (W3C): standardizing the APIs (Application Programming Interfaces) used by JavaScript to access the RTC function; most work is in the WEBRTC Working Group
    - Internet Engineering Task Force (IETF): standardizing the protocols (bits on the wire); the Peer Connection will use RTP, SDP, and extensions; some work is in the RTCWEB Working Group, with lots of related work in MMUSIC, AVTCORE, etc.
  • IETF RTCWEB Documents
  • WebRTC Security
    - Signaling stack: HTML/CSS/JavaScript over HTTP (which transports the HTML/CSS/JavaScript) over TLS (confidentiality and authentication) over TCP (reliability and congestion control) over the Internet
    - Media is secured by Secure RTP
    - Control is secured by HTTPS (HTTP over TLS over TCP)
    - The browser confirms permissions for microphone and camera on each session
  • Codecs (RFC 6716)
    - Mandatory to Implement (MTI) audio codecs are settled (finally!)
    - Video is not!
  • WebRTC W3C API Overview
  • Local Media Handling
    Diagram (Browser M): four sources (microphone audio, application sharing video, front camera video, and rear camera video) feed four local media streams (labels "2dLe3js", "eR3l0s", "js4KMs", and "923fKs"), which are combined into three streams: a presentation stream (label "F8kdls") with tracks "Audio" and "Presentation", a presenter stream (label "8dFlf") with tracks "Audio" and "Presenter", and a demonstration stream (label "3dfdf2") with tracks "Audio" and "Demonstration".
    - Channels: encoded together; can't be manipulated individually
    - Tracks (MediaStreamTrack): exist only as part of streams; ordered and optionally labeled
    - Streams (MediaStream): all contained tracks are synchronized; can be created, transmitted, etc.
    - LocalMediaStream: returned from getUserMedia(); directly connected to its source; a permission check is required to obtain one
    - In this example: obtained 4 local media streams, created 3 media streams from them, and sent the streams over the Peer Connection
  • Transmitting Media
    - Signaling channel: non-standard, but must exist to set up the Peer Connection
    - Peer Connection: links together two peers; media streams are added and removed with addStream() and removeStream(); handlers for ICE or media changes; Data Channel support
  • Peer Connection
    - "Links" together two peers, via new RTCPeerConnection()
    - Generates session description offers/answers: createOffer(), createAnswer()
    - From SDP answers, initiates media: setLocalDescription(), setRemoteDescription()
    - Offers/answers MUST be relayed by application code!
    - ICE candidates can also be relayed and added by the app: addIceCandidate()
    - Think of the PC as an application helper
  • Peer Connection (continued)
    - Handlers for ICE or media change: onicecandidate, onicechange, onaddstream, onremovestream, onnegotiationneeded, and a few others
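A sketch of wiring two of these handlers, with pc and signalingChannel again as stand-ins for a real RTCPeerConnection and the application's signaling channel: ICE candidates are trickled to the far side as they are gathered, and newly arriving remote streams are handed to the application.

```javascript
// Sketch of handler wiring: relay each ICE candidate over the
// signaling channel as it is discovered, and surface any remote
// stream the far side adds to the connection.
function wireHandlers(pc, signalingChannel, onRemoteStream) {
  pc.onicecandidate = function (evt) {
    if (evt.candidate) {   // a null candidate signals end of gathering
      signalingChannel.send(JSON.stringify({ "candidate": evt.candidate }));
    }
  };
  pc.onaddstream = function (evt) { onRemoteStream(evt.stream); };
}
```

The receiving side feeds each relayed candidate back in via addIceCandidate(), which is how the two browsers converge on a working candidate pair.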
  • Peer Connection (continued)
    - New identity functions: setIdentityProvider(), getIdentityAssertion(); used to verify identity via a third party, e.g., Facebook Connect
    - New statistics API: getStats(); obtains statistics, local and remote, on bytes/packets transmitted, audio volume, etc.; may be useful for congestion-based adjustments
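A sketch of the kind of congestion-related measurement getStats() enables. The report format was still in flux at the time, so the bytesSent field below is an assumption about the report shape, and pc again stands in for a real RTCPeerConnection with a callback-style getStats().

```javascript
// Sketch: take two getStats() samples and report the growth in bytes
// sent, the kind of figure a congestion-based adjustment would watch.
// The "bytesSent" field name is an assumed report shape, not spec text.
function bytesSentDelta(pc, callback) {
  pc.getStats(function (first) {
    pc.getStats(function (second) {
      callback(second.bytesSent - first.bytesSent);
    });
  });
}
```

In practice an application would sample on a timer and compare deltas against the expected send rate before deciding to reduce video resolution or bitrate.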
  • Pseudo Code Walkthrough
  • Pseudo Code
    - Looks like real code, but . . .
    - The API is still in flux, so . . .
    - Don't expect this to work anywhere . . .
    - Yet
  • Back to the First Diagram
    - The mobile browser (Browser M, with microphone audio, application sharing video, front camera video, and rear camera video) "calls" the laptop browser (Browser L, with webcam video and stereo audio)
    - Each sends media to the other
  • Mobile Browser Code Outline

    var signalingChannel = createSignalingChannel();
    var pc;
    var configuration =
      {"iceServers":[{"url":"stun:198.51.100.9"},
                     {"url":"turn:198.51.100.2", "credential":"myPassword"}]};
    var microphone, application, front, rear;
    var presentation, presenter, demonstration;
    var remote_av, stereo, mono;
    var display, left, right;

    function s(sdp) {}   // stub success callback
    function e(error) {} // stub error callback

    getMedia();
    createPC();
    attachMedia();
    call();

    function getMedia() {
      navigator.getUserMedia({"audio": true}, function (stream) {
        microphone = stream;
      }, e);

      // get local video (application sharing)
      ///// This is outside the scope of this specification.
      ///// Assume that application has been set to this stream.

      constraint = {"video": {"mandatory": {"enumDirection": "front"}}};
      navigator.getUserMedia(constraint, function (stream) {
        front = stream;
      }, e);

      constraint = {"video": {"mandatory": {"enumDirection": "rear"}}};
      navigator.getUserMedia(constraint, function (stream) {
        rear = stream;
      }, e);
    }

    function createPC() {
      pc = new RTCPeerConnection(configuration);
      pc.onicecandidate = function (evt) {
        signalingChannel.send(
          JSON.stringify({"candidate": evt.candidate}));
      };
      pc.onaddstream = function (evt) {handleIncomingStream(evt.stream);};
    }

    function attachMedia() {
      presentation = new MediaStream([microphone.audioTracks.item(0),
                                      application.videoTracks.item(0)]);
      presentation.audioTracks.item(0).label = "Audio";
      presentation.videoTracks.item(0).label = "Presentation";
      presenter = new MediaStream([microphone.audioTracks.item(0),
                                   front.videoTracks.item(0)]);
      presenter.audioTracks.item(0).label = "Audio";
      presenter.videoTracks.item(0).label = "Presenter";
      demonstration = new MediaStream([microphone.audioTracks.item(0),
                                       rear.videoTracks.item(0)]);
      demonstration.audioTracks.item(0).label = "Audio";
      demonstration.videoTracks.item(0).label = "Demonstration";
      pc.addStream(presentation);
      pc.addStream(presenter);
      pc.addStream(demonstration);
    }

    function call() {
      pc.createOffer(gotDescription, e);
      function gotDescription(desc) {
        pc.setLocalDescription(desc, s, e);
        signalingChannel.send(JSON.stringify({"sdp": desc}));
      }
    }

    function handleIncomingStream(s) {
      if (s.videoTracks.length == 1) {
        av_stream = s;
        show_av(av_stream);
      } else if (s.audioTracks.length == 2) {
        stereo = s;
      } else {
        mono = s;
      }
    }

    function show_av(s) {
      display.src = URL.createObjectURL(s.videoTracks.item(0));
      left.src = URL.createObjectURL(s.audioTracks.item(0));
      right.src = URL.createObjectURL(s.audioTracks.item(1));
    }

    signalingChannel.onmessage = function (msg) {
      var signal = JSON.parse(msg.data);
      if (signal.sdp) {
        pc.setRemoteDescription(
          new RTCSessionDescription(signal.sdp), s, e);
      } else {
        pc.addIceCandidate(
          new RTCIceCandidate(signal.candidate));
      }
    };

    - We will look next at each of these . . .
    - . . . except for creating the signaling channel
  • Mobile Browser Produces . . .
    (Same local media handling diagram as before: four sources feeding four local media streams, combined into three labeled streams)
    - Four calls to getUserMedia()
    - Three calls to new MediaStream()
    - The app then labels all tracks and sends them
  • function getMedia()

    navigator.getUserMedia({"audio": true}, function (stream) {
      microphone = stream;
    }, e);

    // get local video (application sharing)
    ///// This is outside the scope of this specification.
    ///// Assume that application has been set to this stream.

    - Get audio
    - (Get window video: out of scope)
  • function getMedia(), continued

    constraint = {"video": {"mandatory": {"enumDirection": "front"}}};
    navigator.getUserMedia(constraint, function (stream) {
      front = stream;
    }, e);

    constraint = {"video": {"mandatory": {"enumDirection": "rear"}}};
    navigator.getUserMedia(constraint, function (stream) {
      rear = stream;
    }, e);

    - Get front-facing camera
    - Get rear-facing camera
  • function createPC()

    var configuration =
      {"iceServers":[{"url":"stun:198.51.100.9"},
                     {"url":"turn:198.51.100.2", "credential":"myPassword"}]};

    pc = new RTCPeerConnection(configuration);
    pc.onicecandidate = function (evt) {
      signalingChannel.send(
        JSON.stringify({"candidate": evt.candidate}));
    };
    pc.onaddstream =
      function (evt) {handleIncomingStream(evt.stream);};

    - Create the RTCPeerConnection
    - Set handlers
• Mobile browser consumes . . .

[Diagram: Browser M receives three media streams and routes the tracks of the selected one to output sinks]
    Audio & Video Stream, label "wlQ3kdds" (selected):
        Track "Video"  →  Display
        Track "Right"  →  Right Headphone
        Track "Left"   →  Left Headphone
    Stereo Stream, label "839dg": Tracks "Right", "Left"
    Mono Stream, label "dk38djs": Track "Mono"

•  Receives three media streams
•  Chooses one
•  Sends tracks to output channels
• Function handleIncomingStream()

    function handleIncomingStream(s) {
        if (s.videoTracks.length == 1) {
            av_stream = s;
            show_av(av_stream);
        } else if (s.audioTracks.length == 2) {
            stereo = s;
        } else {
            mono = s;
        }
    }

•  If the incoming stream has a video track, set it to av_stream and display it
•  If it has two audio tracks, it must be stereo
•  Otherwise, it must be the mono stream
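The stream-sorting rule on this slide can be exercised outside the browser with plain objects standing in for MediaStreams. This is a sketch for illustration only: the `{videoTracks, audioTracks}` array shape below is an assumption, not the real MediaStreamTrackList API.

```javascript
// Classify an incoming stream the way handleIncomingStream() does:
// one video track -> A/V stream, two audio tracks -> stereo, else mono.
function classifyStream(s) {
  if (s.videoTracks.length == 1) {
    return "av";
  } else if (s.audioTracks.length == 2) {
    return "stereo";
  } else {
    return "mono";
  }
}

console.log(classifyStream({ videoTracks: ["v"], audioTracks: ["a", "b"] })); // av
console.log(classifyStream({ videoTracks: [], audioTracks: ["l", "r"] }));    // stereo
console.log(classifyStream({ videoTracks: [], audioTracks: ["m"] }));         // mono
```

Note the ordering matters: a stream with both a video track and two audio tracks is treated as the A/V stream, since the video check runs first.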
• function attachMedia()

    presentation = new MediaStream([microphone.audioTracks.item(0),
                                    application.videoTracks.item(0)]);
    presentation.audioTracks.item(0).label = "Audio";
    presentation.videoTracks.item(0).label = "Presentation";
    presenter = new MediaStream([microphone.audioTracks.item(0),
                                 front.videoTracks.item(0)]);
    presenter.audioTracks.item(0).label = "Audio";
    presenter.videoTracks.item(0).label = "Presenter";
    . . .

•  Create new presentation & presenter streams
•  Label the tracks in the new streams
• function attachMedia()

    demonstration = new MediaStream([microphone.audioTracks.item(0),
                                     rear.videoTracks.item(0)]);
    demonstration.audioTracks.item(0).label = "Audio";
    demonstration.videoTracks.item(0).label = "Demonstration";
    pc.addStream(presentation);
    pc.addStream(presenter);
    pc.addStream(demonstration);

•  Create new demonstration stream
•  Attach all 3 streams to Peer Connection
• function call()

    function call() {
        pc.createOffer(gotDescription, e);
        function gotDescription(desc) {
            pc.setLocalDescription(desc, s, e);
            signalingChannel.send(JSON.stringify({ "sdp": desc }));
        }
    }

•  Ask browser to create SDP offer
•  Set offer as local description
•  Send offer to peer
• Function show_av(s)

    function show_av(s) {
        display.src = URL.createObjectURL(s.videoTracks.item(0));
        left.src = URL.createObjectURL(s.audioTracks.item(0));
        right.src = URL.createObjectURL(s.audioTracks.item(1));
    }

•  Turn streams into URLs
•  Set as source for media elements
• How do we get the SDP answer?

    signalingChannel.onmessage = function (msg) {
        var signal = JSON.parse(msg.data);
        if (signal.sdp) {
            pc.setRemoteDescription(
                new RTCSessionDescription(signal.sdp), s, e);
        } else {
            pc.addIceCandidate(
                new RTCIceCandidate(signal.candidate));
        }
    };

•  Magic signaling channel provides message
•  If SDP, set as remote description
•  If ICE candidate, tell the browser
• And now the laptop browser . . .

•  Watch for the following
   –  We set up media *after* receiving the offer
   –  but the signaling channel still must exist first!
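The signaling channel itself stays "magic" throughout the deck: createSignalingChannel() is never defined. A minimal in-memory stand-in (an assumption for illustration; a real app would carry these messages over WebSocket, XHR, or similar) shows the shape the rest of the code relies on: a send() method and an onmessage handler receiving { data: string }.

```javascript
// Hypothetical in-memory signaling "channel" pair: whatever one side
// send()s is delivered to the other side's onmessage as { data: string }.
// A real implementation would relay these messages through a server.
function createSignalingChannelPair() {
  var a = { onmessage: null }, b = { onmessage: null };
  a.send = function (data) {
    if (b.onmessage) b.onmessage({ data: data });
  };
  b.send = function (data) {
    if (a.onmessage) a.onmessage({ data: data });
  };
  return [a, b];
}

var pair = createSignalingChannelPair();
var caller = pair[0], callee = pair[1];

// The callee's handler must be installed before the caller sends the
// offer -- the ordering point this slide makes.
var received = [];
callee.onmessage = function (msg) {
  received.push(JSON.parse(msg.data));
};
caller.send(JSON.stringify({ "sdp": { "type": "offer" } }));
console.log(received[0].sdp.type); // offer
```

If the caller sent before callee.onmessage was set, the message would simply be lost here, which is exactly why the channel must exist first.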
• Signaling channel message is trigger

    signalingChannel.onmessage = function (msg) {
        if (!pc) {
            prepareForIncomingCall();
        }
        var signal = JSON.parse(msg.data);
        if (signal.sdp) {
            pc.setRemoteDescription(
                new RTCSessionDescription(signal.sdp), s, e);
            answer();
        } else {
            pc.addIceCandidate(new RTCIceCandidate(signal.candidate));
        }
    };

•  Set up PC and media if not already done
•  If SDP, *also* answer
• Function prepareForIncomingCall()

    function prepareForIncomingCall() {
        createPC();
        getMedia();
        attachMedia();
    }

•  No surprises here
•  Media obtained is a little different
•  But attached the same way
• Function answer()

    function answer() {
        pc.createAnswer(gotDescription, e);
        function gotDescription(desc) {
            pc.setLocalDescription(desc, s, e);
            signalingChannel.send(JSON.stringify({ "sdp": desc }));
        }
    }

•  createAnswer() automatically uses the value of remoteDescription when generating new SDP
• In real code . . .

•  The error callbacks must do something useful
•  Methods with callbacks are asynchronous!
   –  May want to wait for callbacks before continuing
   –  Consider using StratifiedJS, or some other JS async toolbox
   –  createOffer(), createAnswer(), setLocalDescription(), setRemoteDescription() are now queued
•  May want to use an Identity provider
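The sequencing caveat can be sketched by wrapping a callback-style method in a Promise, so successive steps wait for each other instead of racing. This is a generic illustration, not the WebRTC API: asyncStep and promisify below are hypothetical stand-ins for methods like createOffer(success, error).

```javascript
// Wrap a (args..., success, error) callback method in a Promise so
// steps can be chained rather than deeply nested.
function promisify(fn) {
  return function () {
    var args = Array.prototype.slice.call(arguments);
    return new Promise(function (resolve, reject) {
      fn.apply(null, args.concat([resolve, reject]));
    });
  };
}

// A fake async step standing in for e.g. createOffer(): it invokes its
// success callback on the next tick with a derived result.
function asyncStep(input, success, error) {
  setTimeout(function () { success(input + 1); }, 0);
}

var step = promisify(asyncStep);
step(1)
  .then(function (v) { return step(v); })  // waits for the first step
  .then(function (v) { console.log(v); }); // 3
```

Modern browsers later added promise-returning versions of createOffer(), createAnswer(), setLocalDescription(), and setRemoteDescription() directly, making this kind of wrapper unnecessary.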
• What's Next?

•  W3C and IETF standards still need to be finalized, including SDP use/interpretation
•  Browsers need to add support
   –  Chrome has much of this functionality now, in M23 without a flag
   –  Firefox will have it shortly (in nightly builds)
•  Interworking with SIP and Jingle needs to be finalized
• Questions?

http://webrtcbook.com
• Thank You

November 27-29, 2012, South San Francisco Conference Center