TADS Developer Summit WebRTC Dan Burnett


Dan Burnett, editor of the WebRTC specification and author of The WebRTC Book, provides an excellent tutorial on WebRTC at TADS, 21-22 Nov 2013 in Bangkok.


Transcript of "TADS Developer Summit WebRTC Dan Burnett"

  1. WebRTC: Introduction to WebRTC
     Dan Burnett, Chief Scientist, Tropo; Director of Standards, Voxeo
     Alan Johnston, Distinguished Engineer, Avaya
  2. WebRTC Tutorial Topics
     • What is WebRTC?
     • How to Use WebRTC
     • WebRTC Peer-to-Peer Media
     • WebRTC Protocols and IETF Standards
     • WebRTC W3C API Overview
     • Pseudo Code Walkthrough
     • Practical bits
  3. What is WebRTC?
  4. WebRTC is “Voice & Video in the browser”
     • Access to camera and microphone without a plugin
       – No proprietary plugin required!
     • Audio/video direct from browser to browser
     • Why does it matter?
       – Media can stay local
       – Mobile devices eventually dropping voice channel anyway
       – Games
  5. The Browser RTC Function
     [Diagram: a Web Server and a Signaling Server talk to the Web Browser over HTTP or WebSockets (JavaScript/HTML/CSS for the application, HTTP or WebSockets for signaling); inside the browser, the Browser RTC Function sits alongside Other APIs and the RTC APIs on top of Native OS Services, and speaks on-the-wire protocols (media or data) directly to the peer.]
     • WebRTC adds a new Real-Time Communication (RTC) function built in to browsers
       – No download
       – No Flash or other plugins
     • Contains
       – Audio and video codecs
       – Ability to negotiate peer-to-peer connections
       – Echo cancellation, packet loss concealment
     • In Chrome & Firefox today, Internet Explorer sometime and Safari eventually
  6. Benefits of WebRTC
     For Developer:
     • Streamlined development – one platform
     • Simple APIs – detailed knowledge of RTC protocols not needed
     • NAT traversal only uses expensive relays when no other choice
     • Advanced voice and video codecs without licensing
     For User:
     • No download or install – easy to use
     • All communication encrypted – private
     • Reliable session establishment – “just works”
     • Excellent voice and video quality
     • Many more choices for real-time communication
  7. WebRTC Support of Multiple Media
     [Diagram: Browser M on Mobile sends Microphone Audio, Application Sharing Video, Front Camera Video, and Rear Camera Video; Browser L on Laptop sends WebCam Video and Stereo Audio.]
     • Multiple sources of audio and video are assumed and supported
     • All media, voice and video, and feedback messages are multiplexed over the same transport address
  8. WebRTC Triangle
     [Diagram: Web Server (Application) at the top; Browser L and Browser M, each running the HTML5 application from the web server, joined by a Peer Connection (audio, video, and/or data).]
     • Both browsers running the same web application from web server
     • Peer Connection established between them with the help of the web server
  9. WebRTC Trapezoid
     [Diagram: Web Server A (Application A) and Web Server B (Application B) linked by SIP or Jingle; Browser M, running the HTML5 application from Web Server A, and Browser T, running the HTML5 application from Web Server B, joined by a Peer Connection (audio and/or video).]
     • Similar to SIP Trapezoid
     • Web Servers communicate using SIP or Jingle or proprietary
     • Could become important in the future.
  10. WebRTC and SIP
     [Diagram: Web Server linked to a SIP Server over SIP; Browser M joined to a SIP Client by a Peer Connection (audio and/or video).]
     • SIP (Session Initiation Protocol) is a signaling protocol used by service providers and enterprises for real-time communication
     • Peer Connection appears as a standard RTP session, described by SDP
     • SIP Endpoint must support RTCWEB media extensions
  11. WebRTC and Jingle
     [Diagram: Web Server linked to an XMPP Server over Jingle; Browser M joined to a Jingle Client by a Peer Connection (audio and/or video).]
     • Jingle is a signaling extension to XMPP (Extensible Messaging and Presence Protocol, aka Jabber)
     • Peer Connection SDP can be mapped to Jingle
     • Jingle Endpoint must support RTCWEB Media extensions
  12. WebRTC and PSTN
     [Diagram: Web Server; Browser M joined to a PSTN Gateway by a Peer Connection (audio); the gateway connects to a Phone.]
     • Peer Connection terminates on a PSTN Gateway
     • Audio Only
     • Encryption ends at Gateway
  13. WebRTC with SIP
     [Diagram: a SIP Proxy/Registrar Server reached over WebSocket (SIP); the Web Server serves HTTP (HTML5/CSS/JavaScript) to Browser M and Browser T, each running a JavaScript SIP UA; SRTP media flows directly between the browsers.]
     • Browser runs a SIP User Agent by running JavaScript from Web Server
     • SRTP media connection uses WebRTC APIs
     • Details in [draft-ietf-sipcore-websocket], which defines SIP transport over WebSockets
  14. WebRTC Signaling Approaches
     • Signaling is required for exchange of candidate transport addresses, codec information, media keying information
     • Many options – choice is up to web developer
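Since the signaling transport is left to the web developer, a WebSocket is a common choice. Below is a minimal sketch of the createSignalingChannel() helper assumed by the pseudo code later in this deck; the wss://signaling.example.com URL and the JSON-string framing are illustrative assumptions, not part of WebRTC itself:

     // Sketch: a signaling channel over a WebSocket. The server is assumed
     // to relay each message to the other peer. Open/close and error
     // handling are omitted.
     function createSignalingChannel() {
       var ws = new WebSocket("wss://signaling.example.com/call");
       var channel = {
         send: function (text) { ws.send(text); }, // forward JSON strings
         onmessage: null                           // application assigns this
       };
       ws.onmessage = function (evt) {
         if (channel.onmessage) { channel.onmessage(evt); }
       };
       return channel;
     }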
  15. How to Use WebRTC
  16. WebRTC usage in brief
     [Flow: Obtain Local Media (loop: get more media) → all media added → Set Up Peer Connection → Peer Connection established → Attach Media or Data (loop: attach more media or data) → ready for call → Exchange Offer/Answer]
  17. WebRTC usage in brief
     [Flow diagram as on slide 16.]
     • getUserMedia()
       – Audio and/or video
       – Constraints
       – User permissions
     • Browser must ask before allowing a page to access microphone or camera
     • MediaStream
     • MediaStreamTrack
       – Capabilities
       – States (settings)
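A minimal sketch of the capture step, in the callback style of getUserMedia() used throughout this deck (the logging is illustrative):

     // Ask for microphone and camera; the browser prompts the user first.
     navigator.getUserMedia(
       { "audio": true, "video": true },   // constraints: what we want
       function (stream) {                 // success: a local MediaStream
         console.log("audio tracks: " + stream.getAudioTracks().length);
         console.log("video tracks: " + stream.getVideoTracks().length);
       },
       function (error) {                  // failure: denied, no device, ...
         console.log("getUserMedia failed: " + error);
       });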
  18. WebRTC usage in brief
     [Flow diagram as on slide 16.]
     • RTCPeerConnection
       – Direct media
       – Between two peers
       – ICE processing
       – SDP processing
       – DTMF support
       – Data channels
       – Identity verification
       – Statistics reporting
  19. WebRTC usage in brief
     [Flow diagram as on slide 16.]
     • addStream()
       – Doesn't change media state!
     • removeStream()
       – Ditto!
     • createDataChannel()
       – Depends on transport
  20. WebRTC usage in brief
     [Flow diagram as on slide 16, ending in Exchange Session Descriptions.]
     • createOffer(), createAnswer()
     • setLocalDescription(), setRemoteDescription()
     • Applying SDP answer makes the magic happen
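Put together, the caller's side of the exchange looks roughly like this; pc is an RTCPeerConnection, sendToPeer() stands in for whatever signaling channel the application uses, and the callback signatures follow the 2013-era API shown in these slides:

     // Create an offer, apply it locally, then ship it to the peer.
     pc.createOffer(function (offer) {
       pc.setLocalDescription(offer,
         function () { sendToPeer(JSON.stringify({ "sdp": offer })); },
         logError);
     }, logError);

     // When the answer arrives, applying it is what starts ICE and media.
     function onAnswer(answerSdp) {
       pc.setRemoteDescription(new RTCSessionDescription(answerSdp),
         function () { /* media flows once ICE completes */ },
         logError);
     }

     function logError(err) { console.log(err); }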
  21. WebRTC usage – a bit more detail
     [Flow: Set Up Signaling Channel → Obtain Local Media (loop: get more media) → Set Up Peer Connection → Attach Media or Data (loop: attach more media or data) → Exchange Session Descriptions]
  22. SDP offer/answer
     • Session Descriptions
       – Session Description Protocol created for use by SIP in setting up voice (and video) calls
       – Describes real-time media at low level of detail
         • Which IP addresses and ports to use
         • Which codecs to use
     • Offer/answer model (JSEP)
       – One side sends an SDP offer listing what it wants to send and what it can receive
       – Other side replies with an SDP answer listing what it will receive and send
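To get a feel for what is exchanged, this sketch logs a few representative lines from an offer; the sample lines in the comments are only illustrative, and exact contents vary by browser:

     pc.createOffer(function (offer) {
       // offer.sdp is plain text; typical lines include:
       //   m=audio 49170 RTP/SAVPF 111 0   (media type, port, payload types)
       //   c=IN IP4 203.0.113.4            (connection address)
       //   a=rtpmap:111 opus/48000/2       (payload type 111 is Opus)
       //   a=candidate:... typ srflx ...   (an ICE candidate learned via STUN)
       offer.sdp.split("\r\n").forEach(function (line) {
         if (/^m=|^c=|^a=rtpmap|^a=candidate/.test(line)) { console.log(line); }
       });
     }, function (err) { console.log(err); });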
  23. WebRTC Peer-to-Peer Media
  24. Media Flows in WebRTC
     [Diagram: Web Server in the Internet; Browser M and Browser D behind a Home WiFi Router, Browser T behind another Router, Browser L behind a Coffee Shop WiFi Router.]
  25. Media without WebRTC
     [Diagram: same topology as the previous slide, showing the media paths taken without WebRTC.]
  26. Peer-to-Peer Media with WebRTC
     [Diagram: same topology, with media flowing directly between the browsers.]
  27. NAT Complicates Peer-to-Peer Media
     Most browsers are behind NATs on the Internet, which complicates the establishment of peer-to-peer media sessions.
     [Diagram: Router with NAT, Home WiFi with NAT, and Coffee Shop WiFi with NAT now sit between the browsers and the Internet.]
  28. What is a NAT?
     • Network Address Translator (NAT)
     • Used to map an inside address (usually a private IP address) to an outside address (usually a public IP address) at Layer 3
     • Network Address and Port Translation (NAPT) also changes the transport port number (Layer 4)
       – These are often just called NATs as well
     • One reason for NAT is the IP address shortage
  29. NAT Example
     [Diagram: Home WiFi with NAT. “Outside”: public IP address 203.0.113.4 on the Internet. “Inside”: private IP addresses 192.168.x.x – Browser M at 192.168.0.5, Browser T at 192.168.0.6.]
  30. NATs and Applications
     • NATs are compatible with client/server protocols such as web, email, etc.
     • However, NATs generally block peer-to-peer communication
     • Typical NAT traversal for VoIP and video services today uses a media relay whenever the client is behind a NAT
       – Often done with an SBC – Session Border Controller
       – This is a major expense and complication in existing VoIP and video systems
     • WebRTC has a built-in NAT traversal strategy: Interactive Connectivity Establishment (ICE)
  31. Peer-to-Peer Media Through NAT
     ICE connectivity checks can often establish a direct peer-to-peer session between browsers behind different NATs.
     [Diagram: same NATed topology, with a direct media path between the browsers.]
  32. ICE Connectivity Checks
     • Connectivity through NAT can be achieved using ICE connectivity checks
     • Browsers exchange a list of candidates
       – Local: read from network interfaces
       – Reflexive: obtained using a STUN Server
       – Relayed: obtained from a TURN Server (media relay)
     • Browsers attempt to send STUN packets to the candidate list received from other browser
     • Checks performed by both sides at same time
     • If one STUN packet gets through, a response is sent and this connection used for communication
       – TURN relay will be last resort (lowest priority)
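In the API, this candidate exchange surfaces as the onicecandidate handler. A minimal sketch, assuming pc and the JSON-relaying signalingChannel from earlier:

     // Relay each candidate the browser gathers to the peer ...
     pc.onicecandidate = function (evt) {
       if (evt.candidate) {
         signalingChannel.send(
           JSON.stringify({ "candidate": evt.candidate }));
       }
     };

     // ... and hand each candidate received from the peer to ICE, which
     // then runs the connectivity checks described above.
     signalingChannel.onmessage = function (msg) {
       var signal = JSON.parse(msg.data);
       if (signal.candidate) {
         pc.addIceCandidate(new RTCIceCandidate(signal.candidate));
       }
     };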
  33. P2P Media Can Stay Local to NAT
     If both browsers are behind the same NAT, connectivity checks can often establish a connection that never leaves the NAT.
     [Diagram: two browsers behind the same Home WiFi NAT with a local media path.]
  34. ICE Servers
     ICE uses STUN and TURN servers in the public Internet to help with NAT traversal.
     [Diagram: STUN Server at 198.51.100.9 and TURN Server at 198.51.100.2 in the Internet; Browser M at 192.168.0.5 behind a Home WiFi NAT whose public address is 203.0.113.4.]
  35. Browser Queries STUN Server
     Browser sends STUN test packet to STUN server to learn its public IP address (address of the NAT).
     [Diagram: as on the previous slide, with the STUN query from Browser M.]
  36. TURN Server Can Relay Media
     In some cases, connectivity checks fail, and a TURN Media Relay on the public Internet must be used.
     [Diagram: TURN Server acting as a media relay between the NATed browsers.]
  37. WebRTC Protocols and IETF Standards
  38. WebRTC: A Joint Standards Effort
     • Internet Engineering Task Force (IETF) and World Wide Web Consortium (W3C) are working together on WebRTC
     • IETF
       – Protocols – “bits on wire”
       – Main protocols are already RFCs, but many extensions in progress
       – RTCWEB (Real-Time Communications on the Web) Working Group is the main focus, but other WGs involved as well
       – http://www.ietf.org
     • W3C
       – APIs – used by JavaScript code in HTML5
       – http://www.w3c.org
  39. WebRTC Protocols
     Application Layer: HTTP, WebSocket, SDP, ICE, STUN, TURN, SRTP
     Transport Layer: TLS, TCP, DTLS, UDP, SCTP
     Network Layer: IP
     SIP is not shown as it is optional
  40. IETF RTCWEB Documents
     • Overview: “Overview: Real Time Protocols for Browser-based Applications” – draft-ietf-rtcweb-overview
     • Use Cases and Requirements: “Web Real-Time Communication Use-cases and Requirements” – draft-ietf-rtcweb-use-cases-and-requirements
     • RTP Usage: “Web Real-Time Communication (WebRTC): Media Transport and Use of RTP” – draft-ietf-rtcweb-rtp-usage
     • Security Architecture: “RTCWEB Security Architecture” – draft-ietf-rtcweb-security-arch
     • Threat Model: “Security Considerations for RTC-Web” – draft-ietf-rtcweb-security
     • Data Channel: “RTCWeb Data Channels” – draft-ietf-rtcweb-data-channel
     • JSEP: “JavaScript Session Establishment Protocol” – draft-ietf-rtcweb-jsep
     • Audio: “WebRTC Audio Codec and Processing Requirements” – draft-ietf-rtcweb-audio
     • Quality of Service: “DSCP and other packet markings for RTCWeb QoS” – draft-ietf-rtcweb-qos
  41. Codecs
     • Mandatory to Implement (MTI) audio codecs are settled on Opus (RFC 6716) and G.711 (finally!)
     • Video is not yet decided!
  42. WebRTC W3C API Overview
  43. Two primary API sections
     • Handling local media
       – Media Capture and Streams (getUserMedia) specification
     • Transmitting media
       – WebRTC (Peer Connection) specification
  44. Local Media Handling
     [Diagram: Browser M's sources (Microphone Audio, Application Sharing Video, Front Camera Video, Rear Camera Video) feed captured MediaStreams, whose tracks are recombined into created MediaStreams: a Presentation Stream (“Audio” + “Presentation” tracks), a Presenter Stream (“Audio” + “Presenter” tracks), and a Demonstration Stream (“Audio” + “Demonstration” tracks).]
     • In this example
       – Captured 4 local media streams
       – Created 3 media streams from them
       – Sent streams over Peer Connection
  45. Local Media Handling
     [Diagram: as on slide 44.]
     • Sources
       – Encoded together
       – Can't manipulate individually
  46. Local Media Handling
     [Diagram: as on slide 44.]
     • Tracks (MediaStreamTrack)
       – Tied to a source
       – Exist primarily as part of Streams; single media type
       – Globally unique ids; optionally browser-labeled
  47. Local Media Handling
     [Diagram: as on slide 44.]
     • Captured MediaStream
       – Returned from getUserMedia()
       – Permission check required to obtain
  48. Local Media Handling
     [Diagram: as on slide 44.]
     • MediaStream
       – All contained tracks are synchronized
       – Can be created, transmitted, etc.
  49. Local Media Handling
     • Settings
       – Current values of source properties (height, width, etc.)
       – Exposed on MediaStreamTrack
     • Capabilities
       – Allowed values for source properties
       – Exposed on MediaStreamTrack
     • Constraints
       – Requested ranges for track properties
       – Used in getUserMedia(), applyConstraints()
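A sketch of constraints in the draft “mandatory”/“optional” syntax these slides use; the exact constraint names kept changing while the spec was in flux, so treat them as illustrative:

     // Request HD-ish video: these are requested ranges, not settings.
     var constraints = {
       "video": {
         "mandatory": { "minWidth": 1280, "minHeight": 720 }, // must satisfy
         "optional":  [{ "frameRate": 30 }]                   // best effort
       }
     };
     navigator.getUserMedia(constraints, function (stream) {
       // Constraints can also be tightened later on a live track:
       var track = stream.getVideoTracks()[0];
       if (track.applyConstraints) {
         track.applyConstraints({ "mandatory": { "minWidth": 640 } });
       }
     }, function (err) { console.log(err); });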
  50. Transmitting media
     • Signaling channel
       – Non-standard
       – Must exist to set up Peer Connection
     • Peer Connection
       – Links together two peers
       – Add/Remove Media Streams
         • addStream(), removeStream()
       – Handlers for ICE or media change
       – Data Channel support
  51. Peer Connection
     • "Links" together two peers
       – Via new RTCPeerConnection()
       – Generates Session Description offers/answers
         • createOffer(), createAnswer()
       – From SDP answers, initiates media
         • setLocalDescription(), setRemoteDescription()
       – Offers/answers MUST be relayed by application code!
       – ICE candidates can also be relayed and added by app
         • addIceCandidate()
  52. Peer Connection
     • Handlers for signaling, ICE or media change
       – onsignalingstatechange
       – onicecandidate, oniceconnectionstatechange
       – onaddstream, onremovestream
       – onnegotiationneeded
       – A few others
  53. Peer Connection
     • “Extra” APIs
       – Data
       – DTMF
       – Statistics
       – Identity
     • Grouped separately in WebRTC spec
       – but really part of RTCPeerConnection definition
       – all are mandatory to implement
  54. Data Channel API
     • RTCDataChannel createDataChannel()
     • Configurable with
       – ordered
       – maxRetransmits, maxRetransmitTime
       – negotiated
       – id
     • Provides RTCDataChannel with
       – send()
       – onopen, onerror, onclose, onmessage*
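A small sketch of creating and using a channel on an existing RTCPeerConnection pc; the "chat" label and the partial-reliability options are illustrative:

     // An unordered channel that retransmits each message at most twice:
     // partial reliability suits game state better than file transfer.
     var channel = pc.createDataChannel("chat",
       { "ordered": false, "maxRetransmits": 2 });

     channel.onopen    = function ()    { channel.send("hello"); };
     channel.onmessage = function (evt) { console.log("peer: " + evt.data); };
     channel.onerror   = function (err) { console.log(err); };

     // The remote side learns about the channel via the peer connection:
     pc.ondatachannel = function (evt) {
       evt.channel.onmessage = function (m) { console.log(m.data); };
     };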
  55. DTMF API
     • RTCDTMFSender createDTMFSender()
       – Associates track input parameter with this RTCPeerConnection
     • RTCDTMFSender provides
       – boolean canInsertDTMF()
       – insertDTMF()
       – ontonechange
       – (other stuff)
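A sketch of sending tones (toward a PSTN gateway, say), following the createDTMFSender() shape on this slide; localStream is an assumed captured stream, and the numeric arguments are tone duration and inter-tone gap in milliseconds:

     // DTMF rides on an audio track already attached to the peer connection.
     var audioTrack = localStream.getAudioTracks()[0];
     var dtmf = pc.createDTMFSender(audioTrack);

     if (dtmf.canInsertDTMF) {
       dtmf.insertDTMF("#1234", 100, 70); // dial an extension, tone by tone
       dtmf.ontonechange = function (evt) {
         console.log("played tone: " + evt.tone);
       };
     }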
  56. Statistics API
     • getStats()
       – Callback returns statistics for given track
     • Statistics available (local/remote) are:
       – Bytes/packets transmitted
       – Bytes/packets received
     • May be useful for congestion-based adjustments
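A sketch of polling statistics in the callback form this slide describes; the report format varied between drafts and browsers, so this simply logs reports whole (pc and localStream assumed):

     // Poll statistics for one outgoing track every few seconds.
     var track = localStream.getAudioTracks()[0];
     setInterval(function () {
       pc.getStats(track, function (report) {
         console.log(report); // inspect bytes/packets sent vs. received
       }, function (err) { console.log(err); });
     }, 5000);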
  57. Identity API
     • setIdentityProvider(), getIdentityAssertion()
     • Used to verify identity via third party, e.g., Facebook Connect
     • Both methods are optional
     • onidentity handler called after any verification attempt
     • RTCPeerConnection.peerIdentity holds any verified identity assertion
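A sketch of the flow as described on this slide; "idp.example.org" is a placeholder provider, and browser support for identity was largely notional in 2013:

     // Tell the browser which Identity Provider (IdP) vouches for this user.
     pc.setIdentityProvider("idp.example.org", "default", "alice@example.org");

     // Ask for an assertion to include with the next offer/answer.
     pc.getIdentityAssertion();

     // Fires after the remote peer's assertion is (or fails to be) verified.
     pc.onidentity = function () {
       console.log("verified peer: " + pc.peerIdentity);
     };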
  58. Pseudo Code Walkthrough
  59. Pseudo Code
     • Close to real code, but . . .
     • No HTML, no signaling channel, not asynchronous, and API is still in flux
     • Don't expect this to work anywhere
  60. Back to first diagram
     [Diagram: Browser M on Mobile sends Microphone Audio, Application Sharing Video, Front Camera Video, and Rear Camera Video; Browser L on Laptop sends WebCam Video and Stereo Audio.]
     • Mobile browser "calls" laptop browser
     • Each sends media to the other
  61. Mobile browser code outline

     var signalingChannel = createSignalingChannel();

     var pc;
     var configuration =
       {"iceServers":[{"url":"stun:198.51.100.9"},
                      {"url":"turn:198.51.100.2",
                       "credential":"myPassword"}]};
     var microphone, application, front, rear;
     var presentation, presenter, demonstration;
     var remote_av, stereo, mono;
     var display, left, right;

     function s(sdp) {} // stub success callback
     function e(error) {} // stub error callback

     getMedia();
     createPC();
     attachMedia();
     call();

     • We will look next at each of these
     • . . . except for creating the signaling channel
  62. Mobile browser produces . . .
     [Diagram: sources (Microphone Audio, Application Sharing Video, Front Camera Video, Rear Camera Video) → captured MediaStreams → created MediaStreams: Presentation Stream (“Audio” + “Presentation” tracks), Presenter Stream (“Audio” + “Presenter” tracks), Demonstration Stream (“Audio” + “Demonstration” tracks).]
     • At least 3 calls to getUserMedia()
     • Three calls to new MediaStream()
     • App sends stream ids, then streams
  63. function getMedia() [1]

     navigator.getUserMedia({"audio": true }, function (stream) {
       microphone = stream;
     }, e);

     // get local video (application sharing)
     ///// This is outside the scope of this specification.
     ///// Assume that 'application' has been set to this stream.

     • Get audio
     • (Get window video – out of scope)
  64. function getMedia() [2]

     . . .
     constraint =
       {"video": {"mandatory": {"videoFacingModeEnum": "front"}}};
     navigator.getUserMedia(constraint, function (stream) {
       front = stream;
     }, e);

     constraint =
       {"video": {"mandatory": {"videoFacingModeEnum": "rear"}}};
     navigator.getUserMedia(constraint, function (stream) {
       rear = stream;
     }, e);

     • Get front-facing camera
     • Get rear-facing camera
  65. function createPC()

     var configuration =
       {"iceServers":[{"url":"stun:198.51.100.9"},
                      {"url":"turn:198.51.100.2",
                       "credential":"myPassword"}]};
     pc = new RTCPeerConnection(configuration);

     pc.onicecandidate = function (evt) {
       signalingChannel.send(
         JSON.stringify({ "candidate": evt.candidate }));
     };
     pc.onaddstream =
       function (evt) { handleIncomingStream(evt.stream); };

     • Create RTCPeerConnection
     • Set handlers
  66. Mobile browser consumes . . .
     [Diagram: incoming MediaStreams – an Audio & Video Stream (selected), a Stereo Stream (“Left” + “Right” tracks), and a Mono Stream (“Mono” track); the “Video” track goes to the Display, the “Left” and “Right” tracks to the left and right headphones.]
     • Receives three media streams
     • Chooses one
     • Sends tracks to output channels
  67. Function handleIncomingStream()

     if (st.getVideoTracks().length == 1) {
       av_stream = st;
       show_av(av_stream);
     } else if (st.getAudioTracks().length == 2) {
       stereo = st;
     } else {
       mono = st;
     }

     • If incoming stream has video track, set to av_stream and display it
     • If it has two audio tracks, must be stereo
     • Otherwise, must be the mono stream
  68. Function show_av(st)

     display.srcObject =
       new MediaStream([st.getVideoTracks()[0]]);
     left.srcObject =
       new MediaStream([st.getAudioTracks()[0]]);
     right.srcObject =
       new MediaStream([st.getAudioTracks()[1]]);

     • Wrap each track in a new MediaStream
     • Set as source for media elements
  69. Mobile browser code outline
     (Same outline as slide 61.)
     • We will look next at each of these
     • . . . except for creating the signaling channel
70. function attachMedia() [1]

presentation = new MediaStream(
    [microphone.getAudioTracks()[0],     // Audio
     application.getVideoTracks()[0]]);  // Presentation
presenter = new MediaStream(
    [microphone.getAudioTracks()[0],     // Audio
     front.getVideoTracks()[0]]);        // Presenter
demonstration = new MediaStream(
    [microphone.getAudioTracks()[0],     // Audio
     rear.getVideoTracks()[0]]);         // Demonstration
. . .

•  Create 3 new streams, all with the same audio but different video (see the check below)
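The bullet above is worth making concrete (this check is not in the slides): a single MediaStreamTrack can be a member of several MediaStreams, so all three streams reference the same microphone track while each holds a different video track.

// Run after attachMedia(): same audio track object in all three streams,
// different video track objects in each.
console.log(presentation.getAudioTracks()[0] ===
            presenter.getAudioTracks()[0]);  // true  (shared audio track)
console.log(presentation.getVideoTracks()[0] ===
            presenter.getVideoTracks()[0]);  // false (distinct video tracks)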
71. function attachMedia() [2]

pc.addStream(presentation);
pc.addStream(presenter);
pc.addStream(demonstration);

signalingChannel.send(
    JSON.stringify({"presentation": presentation.id,
                    "presenter": presenter.id,
                    "demonstration": demonstration.id}));

•  Attach all 3 streams to the Peer Connection
•  Send stream ids to the peer (before the streams!)
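For concreteness, the id message might look like the following on the wire (the id values here are purely illustrative; real ids are browser-generated). It must arrive before any media does, because the receiving side stores it and later matches each incoming stream's .id against it:

// Illustrative wire format of the id message:
// {"presentation":"f3a1...","presenter":"9bc2...","demonstration":"77d0..."}
// The laptop side saves this object as 'incoming' and consults it in
// handleIncomingStream() when the streams themselves arrive.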
72. Mobile browser code outline

getMedia();
createPC();
attachMedia();
call();

•  We will look next at each of these
•  . . . except for creating the signaling channel
73. function call()

pc.createOffer(gotDescription, e);

function gotDescription(desc) {
  pc.setLocalDescription(desc, s, e);
  signalingChannel.send(JSON.stringify({"sdp": desc}));
}

•  Ask browser to create SDP offer
•  Set offer as local description
•  Send offer to peer
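The callback signatures here follow the 2013 draft; the API that was eventually standardized is promise-based. A sketch of an equivalent call() under the later API (createAnswer() on the laptop side follows the same pattern):

// Promise-based equivalent of call() under the later, standardized API.
function call() {
  pc.createOffer()
    .then(function (desc) { return pc.setLocalDescription(desc); })
    .then(function () {
      // localDescription is set by now; send it to the peer as the offer
      signalingChannel.send(JSON.stringify({"sdp": pc.localDescription}));
    })
    .catch(e); // reuse the stub error callback
}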
74. How do we get the SDP answer?

signalingChannel.onmessage = function (msg) {
  var signal = JSON.parse(msg.data);
  if (signal.sdp) {
    pc.setRemoteDescription(
        new RTCSessionDescription(signal.sdp), s, e);
  } else {
    pc.addIceCandidate(
        new RTCIceCandidate(signal.candidate));
  }
};

•  Signaling channel provides the message
•  If SDP, set as remote description
•  If ICE candidate, tell the browser
75. And now the laptop browser . . .

•  Watch for the following
   –  We set up media *after* receiving the offer
   –  but the signaling channel still must exist first!
   –  Also, need to save incoming stream ids
76. Signaling channel message is trigger

signalingChannel.onmessage = function (msg) {
  if (!pc) { prepareForIncomingCall(); }
  var sgnl = JSON.parse(msg.data);
  . . .
};

•  Set up the PC and media if not already done

Full laptop-side listing:

var pc;
var configuration =
    {"iceServers": [{"url": "stun:198.51.100.9"},
                    {"url": "turn:198.51.100.2",
                     "credential": "myPassword"}]};
var webcam, left, right;
var av, stereo, mono;
var incoming;
var speaker, win1, win2, win3;

function s(sdp) {}   // stub success callback
function e(error) {} // stub error callback

var signalingChannel = createSignalingChannel();

function prepareForIncomingCall() {
  createPC();
  getMedia();
  attachMedia();
}

function createPC() {
  pc = new RTCPeerConnection(configuration);
  pc.onicecandidate = function (evt) {
    signalingChannel.send(
        JSON.stringify({"candidate": evt.candidate}));
  };
  pc.onaddstream =
      function (evt) { handleIncomingStream(evt.stream); };
}

function getMedia() {
  navigator.getUserMedia({"video": true},
      function (stream) { webcam = stream; }, e);

  var constraint =
      {"audio": {"mandatory": {"audioDirectionEnum": "left"}}};
  navigator.getUserMedia(constraint,
      function (stream) { left = stream; }, e);

  constraint = {"audio": {"mandatory": {"audioDirectionEnum": "right"}}};
  navigator.getUserMedia(constraint,
      function (stream) { right = stream; }, e);
}

function attachMedia() {
  av = new MediaStream(
      [webcam.getVideoTracks()[0],   // Video
       left.getAudioTracks()[0],     // Left audio
       right.getAudioTracks()[0]]);  // Right audio
  stereo = new MediaStream(
      [left.getAudioTracks()[0],     // Left audio
       right.getAudioTracks()[0]]);  // Right audio
  mono = left;  // Treat the left audio as the mono stream

  pc.addStream(av);
  pc.addStream(stereo);
  pc.addStream(mono);
}

function answer() {
  pc.createAnswer(gotDescription, e);
  function gotDescription(desc) {
    pc.setLocalDescription(desc, s, e);
    signalingChannel.send(JSON.stringify({"sdp": desc}));
  }
}

function handleIncomingStream(st) {
  if (st.id === incoming.presentation) {
    speaker.src = URL.createObjectURL(
        new MediaStream([st.getAudioTracks()[0]]));
    win1.src = URL.createObjectURL(
        new MediaStream([st.getVideoTracks()[0]]));
  } else if (st.id === incoming.presenter) {
    win2.src = URL.createObjectURL(
        new MediaStream([st.getVideoTracks()[0]]));
  } else {
    win3.src = URL.createObjectURL(
        new MediaStream([st.getVideoTracks()[0]]));
  }
}

signalingChannel.onmessage = function (msg) {
  if (!pc) { prepareForIncomingCall(); }
  var sgnl = JSON.parse(msg.data);
  if (sgnl.sdp) {
    pc.setRemoteDescription(
        new RTCSessionDescription(sgnl.sdp), s, e);
    answer();
  } else if (sgnl.candidate) {
    pc.addIceCandidate(new RTCIceCandidate(sgnl.candidate));
  } else {
    incoming = sgnl;
  }
};
77. Signaling channel message is trigger

signalingChannel.onmessage = function (msg) {
  . . .
  if (sgnl.sdp) {
    pc.setRemoteDescription(
        new RTCSessionDescription(sgnl.sdp), s, e);
    answer();
  } else if (sgnl.candidate) {
    pc.addIceCandidate(new RTCIceCandidate(sgnl.candidate));
  } else {
    incoming = sgnl;
  }
};

•  If SDP, *also* answer
•  But if neither SDP nor ICE candidate, it must be the set of incoming stream ids, so save it
78. Function prepareForIncomingCall()

createPC();
getMedia();
attachMedia();

•  No surprises here
•  Media obtained is a little different
•  But attached the same way
79. Function answer()

pc.createAnswer(gotDescription, e);

function gotDescription(desc) {
  pc.setLocalDescription(desc, s, e);
  signalingChannel.send(JSON.stringify({"sdp": desc}));
}

•  createAnswer() automatically uses the value of remoteDescription when generating the new SDP
80. Laptop browser consumes . . .

[Diagram: Browser L receives three MediaStreams — Presentation, Presenter, and Demonstration — each with one "Audio" track and one video track. The presentation audio is routed to the speaker; all three video tracks are routed to displays.]

•  Three input streams
•  All have the same number of audio and video tracks
•  Need stream ids to distinguish them
81. Function handleIncomingStream()

if (st.id === incoming.presentation) {
  speaker.srcObject = new MediaStream([st.getAudioTracks()[0]]);
  win1.srcObject = new MediaStream([st.getVideoTracks()[0]]);
} else if (st.id === incoming.presenter) {
  win2.srcObject = new MediaStream([st.getVideoTracks()[0]]);
} else {
  win3.srcObject = new MediaStream([st.getVideoTracks()[0]]);
}

•  Use ids to distinguish streams
•  Extract one audio and all video tracks
•  Assign to element sources
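Both onaddstream and URL.createObjectURL(stream) were later deprecated. A sketch of wiring the same handler through the newer ontrack event, which fires once per received track and lists the streams that track belongs to (the 'seen' guard is an assumption, added so handleIncomingStream() runs once per stream):

var seen = {};
pc.ontrack = function (evt) {
  evt.streams.forEach(function (stream) {
    if (!seen[stream.id]) {      // each stream handled only once
      seen[stream.id] = true;
      handleIncomingStream(stream);
    }
  });
};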
82. Laptop browser produces . . .

[Diagram: Browser L captures three MediaStreams — webcam video, a left microphone, and a right microphone — and composes them into an audio-and-video stream (video + left + right tracks), a stereo stream (left + right tracks), and a mono stream (the left capture reused).]

•  Three calls to getUserMedia()
•  Two calls to new MediaStream() (the mono stream reuses the left capture)
•  No stream ids needed
83. Function getMedia() [1]

navigator.getUserMedia({"video": true},
    function (stream) { webcam = stream; }, e);
. . .

•  Request webcam video
84. Function getMedia() [2]

. . .
constraint = {"audio": {"mandatory": {"audioDirectionEnum": "left"}}};
navigator.getUserMedia(constraint,
    function (stream) { left = stream; }, e);

constraint = {"audio": {"mandatory": {"audioDirectionEnum": "right"}}};
navigator.getUserMedia(constraint,
    function (stream) { right = stream; }, e);

•  Request left and right audio streams
•  Save them in the left and right variables
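The mandatory audioDirectionEnum constraint (like videoFacingModeEnum on the mobile side) came from an early draft and never shipped in browsers. In practice, capturing two specific microphones means enumerating devices and constraining by deviceId; a sketch under the assumption that the first two audio inputs are the left and right microphones:

// Assumption: the first two 'audioinput' devices are the left and right mics.
navigator.mediaDevices.enumerateDevices().then(function (devices) {
  var mics = devices.filter(function (d) { return d.kind === "audioinput"; });
  navigator.mediaDevices.getUserMedia(
      {audio: {deviceId: {exact: mics[0].deviceId}}})
    .then(function (stream) { left = stream; }, e);
  navigator.mediaDevices.getUserMedia(
      {audio: {deviceId: {exact: mics[1].deviceId}}})
    .then(function (stream) { right = stream; }, e);
});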