Learn how to develop and implement WebRTC using the new IETF and W3C standards. This session overviews the concepts and structure of WebRTC and how it is defined in the emerging standards, bringing everyone up to a clear understanding of WebRTC for the technical discussions in the next session.
The workshop includes specific examples of how to code real-time interactions. The session will be interactive, allowing for open and clear discussion.
WebRTC Overview by Dan Burnett
1. WebRTC
Introduction to WebRTC
Dan Burnett, Chief Scientist, Tropo; Director of Standards, Voxeo
Alan Johnston, Distinguished Engineer, Avaya
2. WebRTC Tutorial Topics
• What is WebRTC?
• How to Use WebRTC
• WebRTC Peer-to-Peer Media
• WebRTC Protocols and IETF Standards
• WebRTC W3C API Overview
• Pseudo Code Walkthrough
• Practical bits
AdhearsionConf 2013
4. WebRTC is "Voice & Video in the browser"
• Access to camera and microphone without a plugin
– No proprietary plugin required!
• Audio/video direct from browser to browser
• Why does it matter?
– Media can stay local
– Mobile devices are eventually dropping the voice channel anyway
– Games
5. The Browser RTC Function
[Diagram: a web server and signaling server connect to the web browser over HTTP or WebSockets, delivering JavaScript/HTML/CSS and carrying signaling; inside the browser, the RTC APIs and other APIs sit on the Browser RTC Function and native OS services; on-the-wire protocols carry media or data to the peer.]
• WebRTC adds a new Real-Time Communication (RTC) Function built in to browsers
– No download
– No Flash or other plugins
• Contains
– Audio and video codecs
– Ability to negotiate peer-to-peer connections
– Echo cancellation, packet loss concealment
• In Chrome & Firefox today, Internet Explorer sometime, and Safari eventually
6. Benefits of WebRTC
For Developer:
• Streamlined development – one platform
• Simple APIs – detailed knowledge of RTC protocols not needed
• NAT traversal only uses expensive relays when there is no other choice
• Advanced voice and video codecs without licensing
For User:
• No download or install – easy to use
• All communication encrypted – private
• Reliable session establishment – "just works"
• Excellent voice and video quality
• Many more choices for real-time communication
7. WebRTC Support of Multiple Media
[Diagram: Browser M on Mobile with microphone audio, application sharing video, front camera video, and rear camera video; Browser L on Laptop with webcam video and stereo audio.]
• Multiple sources of audio and video are assumed and supported
• All media – voice, video, and feedback messages – are multiplexed over the same transport address
8. WebRTC Triangle
[Diagram: Web Server (Application) at the top; Browser L and Browser M, each running the HTML5 application from the web server, joined by a Peer Connection (audio, video, and/or data).]
• Both browsers run the same web application from the web server
• A Peer Connection is established between them with the help of the web server
9. WebRTC Trapezoid
[Diagram: Web Server A (Application A) and Web Server B (Application B) communicate via SIP or Jingle; Browser M runs the HTML5 application from Web Server A and Browser T runs the one from Web Server B, with a Peer Connection (audio and/or video) between the browsers.]
• Similar to the SIP Trapezoid
• Web servers communicate using SIP, Jingle, or a proprietary protocol
• Could become important in the future
10. WebRTC and SIP
[Diagram: the Web Server signals over SIP to a SIP Server, which signals over SIP to a SIP Client; Browser M holds a Peer Connection (audio and/or video) to the SIP Client.]
• SIP (Session Initiation Protocol) is a signaling protocol used by service providers and enterprises for real-time communication
• The Peer Connection appears as a standard RTP session, described by SDP
• The SIP endpoint must support the RTCWEB media extensions
11. WebRTC and Jingle
[Diagram: the Web Server signals over Jingle to an XMPP Server, which signals over Jingle to a Jingle Client; Browser M holds a Peer Connection (audio and/or video) to the Jingle Client.]
• Jingle is a signaling extension to XMPP (Extensible Messaging and Presence Protocol, aka Jabber)
• Peer Connection SDP can be mapped to Jingle
• The Jingle endpoint must support the RTCWEB media extensions
12. WebRTC and PSTN
[Diagram: Browser M holds a Peer Connection (audio) to a PSTN Gateway, which connects to a phone.]
• The Peer Connection terminates on a PSTN gateway
• Audio only
• Encryption ends at the gateway
13. WebRTC with SIP
[Diagram: Browser M and Browser T, each running a JavaScript SIP UA, fetch HTML5/CSS/JavaScript from the Web Server over HTTP, signal to a SIP Proxy/Registrar Server over WebSocket (SIP), and exchange SRTP media directly.]
• The browser runs a SIP User Agent by running JavaScript from the web server
• The SRTP media connection uses the WebRTC APIs
• Details in [draft-ietf-sipcore-websocket], which defines SIP transport over WebSockets
14. WebRTC Signaling Approaches
• Signaling is required for the exchange of candidate transport addresses, codec information, and media keying information
• Many options – the choice is up to the web developer
16. WebRTC usage in brief
[Flow diagram: Obtain Local Media (looping to get more media until all media is added) → Set Up Peer Connection (until the Peer Connection is established) → Attach Media or Data (looping to attach more media or data until ready for call) → Exchange Offer/Answer.]
17. WebRTC usage in brief: Obtain Local Media
• getUserMedia()
– Audio and/or video
– Constraints
– User permissions
• The browser must ask before allowing a page to access the microphone or camera
• MediaStream
• MediaStreamTrack
– Capabilities
– States (settings)
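The constraints argument to getUserMedia() can be sketched as a small builder. This is a sketch only — the 2013 constraint vocabulary was still in flux, and `avConstraints` is a hypothetical helper, not part of any specification:

```javascript
// Hypothetical helper that assembles a getUserMedia() constraints
// object: plain booleans for "any audio/video", or a "mandatory"
// block when specific source properties are required.
function avConstraints(wantAudio, wantVideo, mandatoryVideo) {
  var c = { audio: !!wantAudio, video: !!wantVideo };
  if (wantVideo && mandatoryVideo) {
    c.video = { mandatory: mandatoryVideo };
  }
  return c;
}

// Audio-only request, as in the pseudo-code walkthrough later:
var audioOnly = avConstraints(true, false);

// Front-camera request:
var frontCam = avConstraints(false, true, { videoFacingModeEnum: "front" });

// In the browser these would be passed straight to getUserMedia():
// navigator.getUserMedia(frontCam, function (stream) { /* ... */ }, e);
```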
18. WebRTC usage in brief: Set Up Peer Connection
• RTCPeerConnection
– Direct media between two peers
– ICE processing
– SDP processing
– DTMF support
– Data channels
– Identity verification
– Statistics reporting
19. WebRTC usage in brief: Attach Media or Data
• addStream()
– Doesn't change media state!
• removeStream()
– Ditto!
• createDataChannel()
– Depends on transport
20. WebRTC usage in brief: Exchange Session Descriptions
• createOffer(), createAnswer()
• setLocalDescription(), setRemoteDescription()
• Applying the SDP answer makes the magic happen
21. WebRTC usage – a bit more detail
[Flow diagram: Set Up Signaling Channel → Obtain Local Media (get more media) → Set Up Peer Connection → Attach Media or Data (attach more media or data) → Exchange Session Descriptions.]
22. SDP offer/answer
• Session Descriptions
– The Session Description Protocol was created for use by SIP in setting up voice (and video) calls
– Describes real-time media at a low level of detail
• Which IP addresses and ports to use
• Which codecs to use
• Offer/answer model (JSEP)
– One side sends an SDP offer listing what it wants to send and what it can receive
– The other side replies with an SDP answer listing what it will receive and send
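The offer/answer exchange rides on whatever signaling channel the application provides. As a sketch, the receiving side just routes each incoming message to the right Peer Connection call. The {sdp}/{candidate} envelope below matches the pseudo-code walkthrough later in this deck; it is an application convention, not mandated by JSEP:

```javascript
// Routes one incoming signaling message to the appropriate handler.
function routeSignal(msgData, handlers) {
  var signal = JSON.parse(msgData);
  if (signal.sdp) {
    handlers.onDescription(signal.sdp);     // -> setRemoteDescription()
  } else if (signal.candidate) {
    handlers.onCandidate(signal.candidate); // -> addIceCandidate()
  }
}

// Exercise it with the two message kinds a peer would send:
var seen = [];
var handlers = {
  onDescription: function (d) { seen.push("sdp:" + d.type); },
  onCandidate: function (c) { seen.push("candidate"); }
};
routeSignal(JSON.stringify({ sdp: { type: "offer", sdp: "v=0 ..." } }), handlers);
routeSignal(JSON.stringify({ candidate: { candidate: "candidate:1 1 UDP ..." } }), handlers);
// seen is now ["sdp:offer", "candidate"]
```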
24. Media Flows in WebRTC
[Diagram: a Web Server on the Internet; Browser M and Browser D behind a home WiFi router, Browser T behind another router, and Browser L behind a coffee shop WiFi router.]
25. Media without WebRTC
[Diagram: the same topology as slide 24.]
26. Peer-to-Peer Media with WebRTC
[Diagram: the same topology as slide 24, with media flowing directly between the browsers.]
27. NAT Complicates Peer-to-Peer Media
Most browsers are behind NATs on the Internet, which complicates the establishment of peer-to-peer media sessions.
[Diagram: the same topology, now with NAT on the home WiFi, coffee shop WiFi, and standalone routers.]
28. What is a NAT?
• Network Address Translator (NAT)
• Used to map an inside address (usually a private IP address) to an outside address (usually a public IP address) at Layer 3
• Network Address and Port Translation (NAPT) also changes the transport port number (Layer 4)
– These are often just called NATs as well
• One reason for NAT is the IP address shortage
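The address-and-port rewriting can be made concrete with a toy NAPT table. This is only a sketch — real NATs differ in allocation, filtering, and timeout behavior — and it reuses the addresses from the NAT Example slide:

```javascript
// Toy NAPT: every inside (address, port) pair gets a fresh outside
// port on the NAT's single public address; later packets from the
// same inside pair reuse the existing mapping.
function makeNapt(publicIp, firstPort) {
  var map = {};
  var next = firstPort;
  return function translate(insideIp, insidePort) {
    var key = insideIp + ":" + insidePort;
    if (!(key in map)) { map[key] = next++; }
    return { ip: publicIp, port: map[key] };
  };
}

// 203.0.113.4 outside, 192.168.x.x inside, as in the NAT Example
// slide; the starting port 50000 is arbitrary.
var nat = makeNapt("203.0.113.4", 50000);
var m  = nat("192.168.0.5", 3456); // { ip: "203.0.113.4", port: 50000 }
var t  = nat("192.168.0.6", 3456); // { ip: "203.0.113.4", port: 50001 }
var m2 = nat("192.168.0.5", 3456); // same mapping reused: port 50000
```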
29. NAT Example
[Diagram: a home WiFi router with NAT; the "outside" public IP address is 203.0.113.4, the "inside" private IP addresses are 192.168.x.x – Browser M at 192.168.0.5, Browser T at 192.168.0.6.]
30. NATs and Applications
• NATs are compatible with client/server protocols such as web, email, etc.
• However, NATs generally block peer-to-peer communication
• Typical NAT traversal for VoIP and video services today uses a media relay whenever the client is behind a NAT
– Often done with an SBC – Session Border Controller
– This is a major expense and complication in existing VoIP and video systems
• WebRTC has a built-in NAT traversal strategy: Interactive Connectivity Establishment (ICE)
31. Peer-to-Peer Media Through NAT
ICE connectivity checks can often establish a direct peer-to-peer session between browsers behind different NATs.
[Diagram: the NAT topology from slide 27, with direct media paths between the browsers.]
32. ICE Connectivity Checks
• Connectivity through NAT can be achieved using ICE connectivity checks
• Browsers exchange a list of candidates
– Local: read from network interfaces
– Reflexive: obtained using a STUN server
– Relayed: obtained from a TURN server (media relay)
• Each browser attempts to send STUN packets to the candidate list received from the other browser
• Checks are performed by both sides at the same time
• If one STUN packet gets through, a response is sent and this connection is used for communication
– A TURN relay will be the last resort (lowest priority)
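The "TURN as last resort" behavior falls out of ICE's candidate priority formula (RFC 5245, section 4.1.2.1): relayed candidates get the lowest type preference, so they sort below everything else. A sketch of the arithmetic:

```javascript
// ICE candidate priority per RFC 5245:
//   priority = 2^24 * type-pref + 2^8 * local-pref + (256 - component-id)
function icePriority(typePref, localPref, componentId) {
  return 16777216 * typePref + 256 * localPref + (256 - componentId);
}

// RFC 5245's recommended type preferences: 126 for host, 100 for
// server-reflexive (STUN), 0 for relayed (TURN) candidates.
var host  = icePriority(126, 65535, 1);
var srflx = icePriority(100, 65535, 1);
var relay = icePriority(0,   65535, 1);
// host > srflx > relay, so the TURN relay is tried last.
```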
33. P2P Media Can Stay Local to NAT
If both browsers are behind the same NAT, connectivity checks can often establish a connection that never leaves the NAT.
[Diagram: Browser M and Browser D, both behind the same home WiFi NAT, exchange media locally.]
34. ICE Servers
ICE uses STUN and TURN servers in the public Internet to help with NAT traversal.
[Diagram: a STUN server at 198.51.100.9 and a TURN server at 198.51.100.2 on the public Internet; the home WiFi NAT's public address is 203.0.113.4, with Browser M at 192.168.0.5 behind it.]
35. Browser Queries STUN Server
The browser sends a STUN test packet to the STUN server to learn its public IP address (the address of the NAT).
[Diagram: as on slide 34, with the STUN query from Browser M crossing the NAT to the STUN server.]
36. TURN Server Can Relay Media
In some cases, connectivity checks fail, and a TURN server on the public Internet must be used as a media relay.
[Diagram: media between the browsers flows through the TURN server acting as a media relay.]
38. WebRTC: A Joint Standards Effort
• The Internet Engineering Task Force (IETF) and the World Wide Web Consortium (W3C) are working together on WebRTC
• IETF
– Protocols – "bits on wire"
– Main protocols are already RFCs, but many extensions are in progress
– The RTCWEB (Real-Time Communications on the Web) Working Group is the main focus, but other WGs are involved as well
– http://www.ietf.org
• W3C
– APIs – used by JavaScript code in HTML5
– http://www.w3.org
39. WebRTC Protocols
[Protocol stack diagram: application-layer protocols (HTTP, WebSocket, SDP, ICE, STUN, TURN, SRTP) over the transport layer – TLS over TCP, and DTLS and SCTP over UDP – over IP at the network layer.]
• SIP is not shown as it is optional
41. Codecs
• Mandatory to Implement (MTI) audio codecs are settled on Opus (RFC 6716) and G.711 (finally!)
• Video is not yet decided!
43. Two primary API sections
• Handling local media
– Media Capture and Streams (getUserMedia) specification
• Transmitting media
– WebRTC (Peer Connection) specification
44. Local Media Handling
[Diagram: Browser M's sources (microphone audio, application sharing video, front camera video, rear camera video) are captured as four MediaStreams, from which three MediaStreams are created – a Presentation stream ("Audio" and "Presentation" tracks), a Presenter stream ("Audio" and "Presenter" tracks), and a Demonstration stream ("Audio" and "Demonstration" tracks).]
• In this example
– Captured 4 local media streams
– Created 3 media streams from them
– Sent the streams over the Peer Connection
45. Local Media Handling
[Same media-handling diagram as slide 44.]
• Sources
– Encoded together
– Can't be manipulated individually
46. Local Media Handling
[Same media-handling diagram as slide 44.]
• Tracks (MediaStreamTrack)
– Tied to a source
– Exist primarily as part of streams; a single media type
– Globally unique ids; optionally browser-labeled
47. Local Media Handling
[Same media-handling diagram as slide 44.]
• Captured MediaStream
– Returned from getUserMedia()
– Permission check required to obtain
48. Local Media Handling
[Same media-handling diagram as slide 44.]
• MediaStream
– All contained tracks are synchronized
– Can be created, transmitted, etc.
49. Local Media Handling
• Settings
– Current values of source properties (height, width, etc.)
– Exposed on MediaStreamTrack
• Capabilities
– Allowed values for source properties
– Exposed on MediaStreamTrack
• Constraints
– Requested ranges for track properties
– Used in getUserMedia(), applyConstraints()
50. Transmitting media
• Signaling channel
– Non-standard
– Must exist to set up the Peer Connection
• Peer Connection
– Links together two peers
– Add/remove media streams
• addStream(), removeStream()
– Handlers for ICE or media change
– Data Channel support
51. Peer Connection
• "Links" together two peers
– Via new RTCPeerConnection()
– Generates Session Description offers/answers
• createOffer(), createAnswer()
– From SDP answers, initiates media
• setLocalDescription(), setRemoteDescription()
– Offers/answers MUST be relayed by application code!
– ICE candidates can also be relayed and added by the app
• addIceCandidate()
52. Peer Connection
• Handlers for signaling, ICE, or media change
– onsignalingstatechange
– onicecandidate, oniceconnectionstatechange
– onaddstream, onremovestream
– onnegotiationneeded
– A few others
53. Peer Connection
• "Extra" APIs
– Data
– DTMF
– Statistics
– Identity
• Grouped separately in the WebRTC spec
– but really part of the RTCPeerConnection definition
– all are mandatory to implement
54. Data Channel API
• RTCDataChannel createDataChannel()
• Configurable with
– ordered
– maxRetransmits, maxRetransmitTime
– negotiated
– id
• Provides RTCDataChannel with
– send()
– onopen, onerror, onclose, onmessage
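A sketch of how the configuration options above combine. The mutual exclusion of the two reliability limits follows the draft API, and `dataChannelInit` is a hypothetical helper, not part of the specification:

```javascript
// Hypothetical helper that builds the dictionary passed to
// createDataChannel(). At most one of the two reliability limits may
// be set; setting neither yields a reliable, TCP-like channel.
function dataChannelInit(opts) {
  opts = opts || {};
  if (opts.maxRetransmits !== undefined &&
      opts.maxRetransmitTime !== undefined) {
    throw new Error("set at most one of maxRetransmits/maxRetransmitTime");
  }
  return {
    ordered: opts.ordered !== undefined ? opts.ordered : true,
    maxRetransmits: opts.maxRetransmits,
    maxRetransmitTime: opts.maxRetransmitTime,
    negotiated: opts.negotiated !== undefined ? opts.negotiated : false,
    id: opts.id
  };
}

// Unreliable, unordered channel -- a common choice for game state:
var lossy = dataChannelInit({ ordered: false, maxRetransmits: 0 });
// In the browser:
//   var channel = pc.createDataChannel("game", lossy);
//   channel.onmessage = function (evt) { /* ... */ };
```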
55. DTMF API
• RTCDTMFSender createDTMFSender()
– Associates the track input parameter with this RTCPeerConnection
• RTCDTMFSender provides
– boolean canInsertDTMF()
– insertDTMF()
– ontonechange
– (other stuff)
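insertDTMF() takes a string of tones; a sketch of guarding the input before sending. The accepted alphabet (digits, A–D, `*`, `#`, and `,` as a pause) follows the draft API, and `safeInsertDtmf` is a hypothetical wrapper:

```javascript
// Hypothetical wrapper: check canInsertDTMF() and validate the tone
// string before handing it to an RTCDTMFSender.
var TONES = /^[0-9A-D#*,]*$/;
function safeInsertDtmf(sender, tones, durationMs) {
  if (!TONES.test(tones)) { return false; }
  if (!sender.canInsertDTMF()) { return false; }
  sender.insertDTMF(tones, durationMs);
  return true;
}

// Exercised against a stand-in sender object:
var sent = [];
var fakeSender = {
  canInsertDTMF: function () { return true; },
  insertDTMF: function (t, d) { sent.push([t, d]); }
};
var ok  = safeInsertDtmf(fakeSender, "1234#", 100); // accepted
var bad = safeInsertDtmf(fakeSender, "hello", 100); // rejected
```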
56. Statistics API
• getStats()
– Callback returns statistics for a given track
• Statistics available (local/remote) are:
– Bytes/packets transmitted
– Bytes/packets received
• May be useful for congestion-based adjustments
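The byte counters above are typically polled twice and differenced to estimate a bitrate for congestion-based adjustments. A sketch of the arithmetic — the snapshot shape here is an assumption for illustration; real getStats() reports vary by browser:

```javascript
// Derive a send bitrate in kbit/s from two statistics snapshots.
function bitrateKbps(prev, curr) {
  var bits = (curr.bytesSent - prev.bytesSent) * 8;
  var seconds = (curr.timestampMs - prev.timestampMs) / 1000;
  return bits / 1000 / seconds;
}

var rate = bitrateKbps(
  { bytesSent: 0,      timestampMs: 0 },
  { bytesSent: 125000, timestampMs: 1000 }
);
// 125000 bytes in 1 second = 1000 kbit/s
```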
57. Identity API
• setIdentityProvider(), getIdentityAssertion()
• Used to verify identity via a third party, e.g., Facebook Connect
• Both methods are optional
• The onidentity handler is called after any verification attempt
• RTCPeerConnection.peerIdentity holds any verified identity assertion
59. Pseudo Code
• Close to real code, but . . .
• No HTML, no signaling channel, not asynchronous, and the API is still in flux
• Don't expect this to work anywhere
60. Back to first diagram
[Diagram: the multiple-media diagram from slide 7 – Browser M on Mobile with microphone audio, application sharing video, and front and rear camera video; Browser L on Laptop with webcam video and stereo audio.]
• Mobile browser "calls" laptop browser
• Each sends media to the other
61. Mobile browser code outline
• We will look next at each of these . . . except for creating the signaling channel

    var signalingChannel = createSignalingChannel();
    var pc;
    var configuration =
      {"iceServers":[{"url":"stun:198.51.100.9"},
                     {"url":"turn:198.51.100.2",
                      "credential":"myPassword"}]};
    var microphone, application, front, rear;
    var presentation, presenter, demonstration;
    var remote_av, stereo, mono;
    var display, left, right;

    function s(sdp) {}   // stub success callback
    function e(error) {} // stub error callback

    getMedia();
    createPC();
    attachMedia();
    call();

    function getMedia() {
      // get local audio (microphone)
      navigator.getUserMedia({"audio": true}, function (stream) {
        microphone = stream;
      }, e);

      // get local video (application sharing)
      ///// This is outside the scope of this specification.
      ///// Assume that 'application' has been set to this stream.

      constraint =
        {"video": {"mandatory": {"videoFacingModeEnum": "front"}}};
      navigator.getUserMedia(constraint, function (stream) {
        front = stream;
      }, e);

      constraint =
        {"video": {"mandatory": {"videoFacingModeEnum": "rear"}}};
      navigator.getUserMedia(constraint, function (stream) {
        rear = stream;
      }, e);
    }

    function createPC() {
      pc = new RTCPeerConnection(configuration);
      pc.onicecandidate = function (evt) {
        signalingChannel.send(
          JSON.stringify({"candidate": evt.candidate}));
      };
      pc.onaddstream =
        function (evt) {handleIncomingStream(evt.stream);};
    }

    function attachMedia() {
      presentation =
        new MediaStream(
          [microphone.getAudioTracks()[0],     // Audio
           application.getVideoTracks()[0]]);  // Presentation
      presenter =
        new MediaStream(
          [microphone.getAudioTracks()[0],     // Audio
           front.getVideoTracks()[0]]);        // Presenter
      demonstration =
        new MediaStream(
          [microphone.getAudioTracks()[0],     // Audio
           rear.getVideoTracks()[0]]);         // Demonstration
      pc.addStream(presentation);
      pc.addStream(presenter);
      pc.addStream(demonstration);
    }

    signalingChannel.send(
      JSON.stringify({"presentation": presentation.id,
                      "presenter": presenter.id,
                      "demonstration": demonstration.id}));

    function call() {
      pc.createOffer(gotDescription, e);
      function gotDescription(desc) {
        pc.setLocalDescription(desc, s, e);
        signalingChannel.send(JSON.stringify({"sdp": desc}));
      }
    }

    function handleIncomingStream(st) {
      if (st.getVideoTracks().length == 1) {
        av_stream = st;
        show_av(av_stream);
      } else if (st.getAudioTracks().length == 2) {
        stereo = st;
      } else {
        mono = st;
      }
    }

    function show_av(st) {
      display.src = URL.createObjectURL(
        new MediaStream(st.getVideoTracks()[0]));
      left.src = URL.createObjectURL(
        new MediaStream(st.getAudioTracks()[0]));
      right.src = URL.createObjectURL(
        new MediaStream(st.getAudioTracks()[1]));
    }

    signalingChannel.onmessage = function (msg) {
      var signal = JSON.parse(msg.data);
      if (signal.sdp) {
        pc.setRemoteDescription(
          new RTCSessionDescription(signal.sdp), s, e);
      } else {
        pc.addIceCandidate(
          new RTCIceCandidate(signal.candidate));
      }
    };
62. Mobile browser produces . . .
[Diagram: the media-handling diagram from slide 44 – four captured MediaStreams feeding three created MediaStreams (Presentation, Presenter, Demonstration), each with its audio and video tracks.]
• At least 3 calls to getUserMedia()
• Three calls to new MediaStream()
• App sends stream ids, then streams
63. function getMedia() [1]
• Get audio
• (Get window video – out of scope)

    navigator.getUserMedia({"audio": true}, function (stream) {
      microphone = stream;
    }, e);

    // get local video (application sharing)
    ///// This is outside the scope of this specification.
    ///// Assume that 'application' has been set to this stream.
64. function getMedia() [2]
• Get front-facing camera
• Get rear-facing camera
• (This excerpt uses the newer "facingMode" constraint names; the outline used "videoFacingModeEnum")

    constraint =
      {"video": {"mandatory": {"facingMode": "user"}}};
    navigator.getUserMedia(constraint, function (stream) {
      front = stream;
    }, e);

    constraint =
      {"video": {"mandatory": {"facingMode": "environment"}}};
    navigator.getUserMedia(constraint, function (stream) {
      rear = stream;
    }, e);
65. Mobile browser code outline
• We will look next at each of these . . . except for creating the signaling channel
[Full code outline repeated as on slide 61.]
66. function createPC()

var configuration =
  {"iceServers":[{"url":"stun:198.51.100.9"},
                 {"url":"turn:198.51.100.2",
                  "credential":"myPassword"}]};

function createPC() {
  pc = new RTCPeerConnection(configuration);
  pc.onicecandidate = function (evt) {
    signalingChannel.send(
      JSON.stringify({ "candidate": evt.candidate }));
  };
  pc.onaddstream =
    function (evt) {handleIncomingStream(evt.stream);};
}

• Create RTCPeerConnection
• Set handlers
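One detail worth noting about the handler above: when ICE gathering completes, the browser fires `onicecandidate` one last time with `evt.candidate === null`. A minimal sketch (not from the slides; `makeIceCandidateHandler` is a hypothetical helper) that guards against sending that end-of-candidates marker over the wire:

```javascript
// Hypothetical factory wrapping the slide's onicecandidate logic so
// it can be reused and tested. The guard skips the final event, where
// evt.candidate is null (end of ICE gathering).
function makeIceCandidateHandler(send) {
  return function (evt) {
    if (evt.candidate) {
      send(JSON.stringify({ "candidate": evt.candidate }));
    }
  };
}
```

With a real connection this would be wired up as `pc.onicecandidate = makeIceCandidateHandler(function (m) { signalingChannel.send(m); });`.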
67. Mobile browser consumes

[Diagram: three incoming MediaStreams arrive at the mobile browser: an
Audio & Video Stream ("Video", "Left", and "Right" tracks), a Stereo
Stream ("Left" and "Right" tracks), and a Mono Stream ("Mono" track).
With the Audio & Video Stream selected, its tracks feed the output
sinks: Display, Left Headphone, and Right Headphone.]

• Receives three media streams
• Chooses one
• Sends tracks to output channels

AdhearsionConf 2013 67
68. Function handleIncomingStream()

function handleIncomingStream(st) {
  if (st.getVideoTracks().length == 1) {
    av_stream = st;
    show_av(av_stream);
  } else if (st.getAudioTracks().length == 2) {
    stereo = st;
  } else {
    mono = st;
  }
}

• If incoming stream has a video track, set to av_stream and display it
• If it has two audio tracks, must be stereo
• Otherwise, must be the mono stream
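The track-count dispatch above can be isolated as a pure function, which makes it easy to exercise without a browser. A sketch using plain objects to stand in for MediaStreams (`classifyStream` and `stubStream` are hypothetical names, not part of the slides):

```javascript
// Same branching as the slide's handleIncomingStream(), but returning
// a label instead of assigning globals.
function classifyStream(st) {
  if (st.getVideoTracks().length == 1) {
    return "av";
  } else if (st.getAudioTracks().length == 2) {
    return "stereo";
  }
  return "mono";
}

// Stub stream factory for illustration: mimics just the two track
// accessors the classifier needs.
function stubStream(videoCount, audioCount) {
  return {
    getVideoTracks: function () { return new Array(videoCount); },
    getAudioTracks: function () { return new Array(audioCount); }
  };
}
```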
69. Function show_av(st)

function show_av(st) {
  display.srcObject =
    new MediaStream([st.getVideoTracks()[0]]);
  left.srcObject =
    new MediaStream([st.getAudioTracks()[0]]);
  right.srcObject =
    new MediaStream([st.getAudioTracks()[1]]);
}

• Using new srcObject property on media element
• Set new stream as source
70. Mobile browser code outline

var signalingChannel = createSignalingChannel();
var pc;
var configuration =
  {"iceServers":[{"url":"stun:198.51.100.9"},
                 {"url":"turn:198.51.100.2",
                  "credential":"myPassword"}]};
var microphone, application, front, rear;
var presentation, presenter, demonstration;
var remote_av, stereo, mono;
var display, left, right;
function s(sdp) {}   // stub success callback
function e(error) {} // stub error callback

getMedia();
createPC();
attachMedia();
call();

function getMedia() {
  // get local audio (microphone)
  navigator.getUserMedia({"audio": true }, function (stream) {
    microphone = stream;
  }, e);
  // get local video (application sharing)
  ///// This is outside the scope of this specification.
  ///// Assume that 'application' has been set to this stream.
  constraint =
    {"video": {"mandatory": {"videoFacingModeEnum": "front"}}};
  navigator.getUserMedia(constraint, function (stream) {
    front = stream;
  }, e);
  constraint =
    {"video": {"mandatory": {"videoFacingModeEnum": "rear"}}};
  navigator.getUserMedia(constraint, function (stream) {
    rear = stream;
  }, e);
}

function createPC() {
  pc = new RTCPeerConnection(configuration);
  pc.onicecandidate = function (evt) {
    signalingChannel.send(
      JSON.stringify({ "candidate": evt.candidate }));
  };
  pc.onaddstream =
    function (evt) {handleIncomingStream(evt.stream);};
}

function attachMedia() {
  presentation =
    new MediaStream(
      [microphone.getAudioTracks()[0],     // Audio
       application.getVideoTracks()[0]]);  // Presentation
  presenter =
    new MediaStream(
      [microphone.getAudioTracks()[0],     // Audio
       front.getVideoTracks()[0]]);        // Presenter
  demonstration =
    new MediaStream(
      [microphone.getAudioTracks()[0],     // Audio
       rear.getVideoTracks()[0]]);         // Demonstration
  pc.addStream(presentation);
  pc.addStream(presenter);
  pc.addStream(demonstration);
  signalingChannel.send(
    JSON.stringify({ "presentation": presentation.id,
                     "presenter": presenter.id,
                     "demonstration": demonstration.id
                   }));
}

function call() {
  pc.createOffer(gotDescription, e);
  function gotDescription(desc) {
    pc.setLocalDescription(desc, s, e);
    signalingChannel.send(JSON.stringify({ "sdp": desc }));
  }
}

function handleIncomingStream(st) {
  if (st.getVideoTracks().length == 1) {
    av_stream = st;
    show_av(av_stream);
  } else if (st.getAudioTracks().length == 2) {
    stereo = st;
  } else {
    mono = st;
  }
}

function show_av(st) {
  display.src = URL.createObjectURL(
    new MediaStream([st.getVideoTracks()[0]]));
  left.src = URL.createObjectURL(
    new MediaStream([st.getAudioTracks()[0]]));
  right.src = URL.createObjectURL(
    new MediaStream([st.getAudioTracks()[1]]));
}

signalingChannel.onmessage = function (msg) {
  var signal = JSON.parse(msg.data);
  if (signal.sdp) {
    pc.setRemoteDescription(
      new RTCSessionDescription(signal.sdp), s, e);
  } else {
    pc.addIceCandidate(
      new RTCIceCandidate(signal.candidate));
  }
};

• We will look next at each of these
• . . . except for creating the signaling channel
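The listing never defines `createSignalingChannel()`; the slides treat it as given. A minimal sketch of one possible shape, assuming a WebSocket-like transport (any object with a `send()` method and an `onmessage` hook; the zero-argument version on the slides would presumably construct its own transport internally):

```javascript
// Hypothetical helper, not part of the slides: wraps a transport such
// as new WebSocket("wss://example.com/signaling") in the two-method
// interface the rest of the code relies on (send + onmessage).
function createSignalingChannel(transport) {
  var channel = {
    onmessage: null,  // the application assigns a handler later
    send: function (text) { transport.send(text); }
  };
  transport.onmessage = function (msg) {
    if (channel.onmessage) { channel.onmessage(msg); }
  };
  return channel;
}
```

Anything that delivers text messages between the two browsers works here; WebRTC deliberately leaves the signaling channel to the application.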
71. function attachMedia() [1]

function attachMedia() {
  presentation =
    new MediaStream(
      [microphone.getAudioTracks()[0],     // Audio
       application.getVideoTracks()[0]]);  // Presentation
  presenter =
    new MediaStream(
      [microphone.getAudioTracks()[0],     // Audio
       front.getVideoTracks()[0]]);        // Presenter
  demonstration =
    new MediaStream(
      [microphone.getAudioTracks()[0],     // Audio
       rear.getVideoTracks()[0]]);         // Demonstration
  . . .

• Create 3 new streams, all with same audio but different video
72. function attachMedia() [2]

  pc.addStream(presentation);
  pc.addStream(presenter);
  pc.addStream(demonstration);
  signalingChannel.send(
    JSON.stringify({ "presentation": presentation.id,
                     "presenter": presenter.id,
                     "demonstration": demonstration.id
                   }));
}

• Attach all 3 streams to Peer Connection
• Send stream ids to peer (before streams!)
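Announcing the stream ids before the streams arrive lets the receiver label each stream the moment `onaddstream` fires. That lookup can be sketched as a small pure function (`roleForStream` is a hypothetical name, mirroring the receiver-side branch shown later on slide 78):

```javascript
// Hypothetical helper: map an incoming stream's id to its role using
// the id set announced over the signaling channel. Anything that is
// neither the presentation nor the presenter must be the third stream.
function roleForStream(ids, streamId) {
  if (streamId === ids.presentation) { return "presentation"; }
  if (streamId === ids.presenter) { return "presenter"; }
  return "demonstration";
}
```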
73. Mobile browser code outline

getMedia();
createPC();
attachMedia();
call();

• We will look next at each of these
• . . . except for creating the signaling channel
74. function call()

function call() {
  pc.createOffer(gotDescription, e);
  function gotDescription(desc) {
    pc.setLocalDescription(desc, s, e);
    signalingChannel.send(JSON.stringify({ "sdp": desc }));
  }
}

• Ask browser to create SDP offer
• Set offer as local description
• Send offer to peer
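Because `s()` is a stub, the slide's version sends the offer without waiting for `setLocalDescription` to succeed. A variation that ships the offer only after the local description is applied, written as a hypothetical `startCall` helper so it works against any object exposing the callback-style API used here:

```javascript
// Hypothetical helper (not from the slides): create an offer, apply
// it as the local description, and only then send it to the peer.
function startCall(pc, send, onError) {
  pc.createOffer(function (desc) {
    pc.setLocalDescription(desc, function () {
      send(JSON.stringify({ "sdp": desc }));
    }, onError);
  }, onError);
}
```

The ordering matters little for this demo, but sequencing on success avoids signaling an offer the browser then rejects locally.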
75. How do we get the SDP answer?

signalingChannel.onmessage = function (msg) {
  var signal = JSON.parse(msg.data);
  if (signal.sdp) {
    pc.setRemoteDescription(
      new RTCSessionDescription(signal.sdp), s, e);
  } else {
    pc.addIceCandidate(
      new RTCIceCandidate(signal.candidate));
  }
};

• Signaling channel provides message
• If SDP, set as remote description
• If ICE candidate, tell the browser
76. And now the laptop browser . . .

• Watch for the following
  – We set up media *after* receiving the offer
  – but the signaling channel still must exist first!
  – Also, need to save incoming stream ids

AdhearsionConf 2013 76
77. Signaling channel message is trigger

signalingChannel.onmessage = function (msg) {
  if (!pc) {
    prepareForIncomingCall();
  }
  var sgnl = JSON.parse(msg.data);
  . . .
};

var pc;
var configuration =
  {"iceServers":[{"url":"stun:198.51.100.9"},
                 {"url":"turn:198.51.100.2",
                  "credential":"myPassword"}]};
var webcam, left, right;
var av, stereo, mono;
var incoming;
var speaker, win1, win2, win3;
function s(sdp) {}   // stub success callback
function e(error) {} // stub error callback

var signalingChannel = createSignalingChannel();

function prepareForIncomingCall() {
  createPC();
  getMedia();
  attachMedia();
}

function createPC() {
  pc = new RTCPeerConnection(configuration);
  pc.onicecandidate = function (evt) {
    signalingChannel.send(
      JSON.stringify({ "candidate": evt.candidate }));
  };
  pc.onaddstream =
    function (evt) {handleIncomingStream(evt.stream);};
}

function getMedia() {
  navigator.getUserMedia({"video": true }, function (stream) {
    webcam = stream;
  }, e);
  constraint =
    {"audio": {"mandatory": {"audioDirectionEnum": "left"}}};
  navigator.getUserMedia(constraint, function (stream) {
    left = stream;
  }, e);
  constraint =
    {"audio": {"mandatory": {"audioDirectionEnum": "right"}}};
  navigator.getUserMedia(constraint, function (stream) {
    right = stream;
  }, e);
}

function attachMedia() {
  av = new MediaStream(
    [webcam.getVideoTracks()[0],   // Video
     left.getAudioTracks()[0],     // Left audio
     right.getAudioTracks()[0]]);  // Right audio
  stereo = new MediaStream(
    [left.getAudioTracks()[0],     // Left audio
     right.getAudioTracks()[0]]);  // Right audio
  mono = left;  // Treat the left audio as the mono stream
  pc.addStream(av);
  pc.addStream(stereo);
  pc.addStream(mono);
}

function answer() {
  pc.createAnswer(gotDescription, e);
  function gotDescription(desc) {
    pc.setLocalDescription(desc, s, e);
    signalingChannel.send(JSON.stringify({ "sdp": desc }));
  }
}

function handleIncomingStream(st) {
  if (st.id === incoming.presentation) {
    speaker.src = URL.createObjectURL(
      new MediaStream([st.getAudioTracks()[0]]));
    win1.src = URL.createObjectURL(
      new MediaStream([st.getVideoTracks()[0]]));
  } else if (st.id === incoming.presenter) {
    win2.src = URL.createObjectURL(
      new MediaStream([st.getVideoTracks()[0]]));
  } else {
    win3.src = URL.createObjectURL(
      new MediaStream([st.getVideoTracks()[0]]));
  }
}

signalingChannel.onmessage = function (msg) {
  if (!pc) {
    prepareForIncomingCall();
  }
  var sgnl = JSON.parse(msg.data);
  if (sgnl.sdp) {
    pc.setRemoteDescription(
      new RTCSessionDescription(sgnl.sdp), s, e);
    answer();
  } else if (sgnl.candidate) {
    pc.addIceCandidate(new RTCIceCandidate(sgnl.candidate));
  } else {
    incoming = sgnl;
  }
};

• Set up PC and media if not already done
78. Signaling channel message is trigger

signalingChannel.onmessage = function (msg) {
  . . .
  if (sgnl.sdp) {
    pc.setRemoteDescription(
      new RTCSessionDescription(sgnl.sdp), s, e);
    answer();
  } else if (sgnl.candidate) {
    pc.addIceCandidate(new RTCIceCandidate(sgnl.candidate));
  } else {
    incoming = sgnl;
  }
};

• If SDP, *also* answer
• But if neither SDP nor an ICE candidate, it must be the set of incoming stream ids, so save it

var pc;
var configuration =
  {"iceServers":[{"url":"stun:198.51.100.9"},
                 {"url":"turn:198.51.100.2",
                  "credential":"myPassword"}]};
var webcam, left, right;
var av, stereo, mono;
var incoming;
var speaker, win1, win2, win3;

function s(sdp) {}   // stub success callback
function e(error) {} // stub error callback

var signalingChannel = createSignalingChannel();

function prepareForIncomingCall() {
  createPC();
  getMedia();
  attachMedia();
}

function createPC() {
  pc = new RTCPeerConnection(configuration);
  pc.onicecandidate = function (evt) {
    signalingChannel.send(
      JSON.stringify({ "candidate": evt.candidate }));
  };
  pc.onaddstream =
    function (evt) {handleIncomingStream(evt.stream);};
}

function getMedia() {
  navigator.getUserMedia({"video": true }, function (stream) {
    webcam = stream;
  }, e);
  var constraint =
    {"audio": {"mandatory": {"audioDirectionEnum": "left"}}};
  navigator.getUserMedia(constraint, function (stream) {
    left = stream;
  }, e);
  constraint =
    {"audio": {"mandatory": {"audioDirectionEnum": "right"}}};
  navigator.getUserMedia(constraint, function (stream) {
    right = stream;
  }, e);
}

function attachMedia() {
  av = new MediaStream(
    [webcam.getVideoTracks()[0],   // Video
     left.getAudioTracks()[0],     // Left audio
     right.getAudioTracks()[0]]);  // Right audio
  stereo = new MediaStream(
    [left.getAudioTracks()[0],     // Left audio
     right.getAudioTracks()[0]]);  // Right audio
  mono = left; // Treat the left audio as the mono stream
  pc.addStream(av);
  pc.addStream(stereo);
  pc.addStream(mono);
}

function answer() {
  pc.createAnswer(gotDescription, e);
  function gotDescription(desc) {
    pc.setLocalDescription(desc, s, e);
    signalingChannel.send(JSON.stringify({ "sdp": desc }));
  }
}

function handleIncomingStream(st) {
  if (st.id === incoming.presentation) {
    speaker.src = URL.createObjectURL(
      new MediaStream([st.getAudioTracks()[0]]));
    win1.src = URL.createObjectURL(
      new MediaStream([st.getVideoTracks()[0]]));
  } else if (st.id === incoming.presenter) {
    win2.src = URL.createObjectURL(
      new MediaStream([st.getVideoTracks()[0]]));
  } else {
    win3.src = URL.createObjectURL(
      new MediaStream([st.getVideoTracks()[0]]));
  }
}

signalingChannel.onmessage = function (msg) {
  if (!pc) {
    prepareForIncomingCall();
  }
  var sgnl = JSON.parse(msg.data);
  if (sgnl.sdp) {
    pc.setRemoteDescription(
      new RTCSessionDescription(sgnl.sdp), s, e);
    answer();
  } else if (sgnl.candidate) {
    pc.addIceCandidate(new RTCIceCandidate(sgnl.candidate));
  } else {
    incoming = sgnl;
  }
};
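The three-way dispatch in the onmessage handler can be exercised outside a browser. A minimal sketch, with classifySignal as a hypothetical helper (not part of the slides' code) that names the branch each JSON message shape would take:

```javascript
// Hypothetical helper mirroring the onmessage dispatch: a signaling
// message is an SDP description, an ICE candidate, or (by elimination)
// the application-defined set of incoming stream ids.
function classifySignal(sgnl) {
  if (sgnl.sdp) {
    return "sdp";        // setRemoteDescription(), then answer()
  } else if (sgnl.candidate) {
    return "candidate";  // addIceCandidate()
  } else {
    return "stream-ids"; // saved as `incoming` for handleIncomingStream()
  }
}

console.log(classifySignal({ sdp: { type: "offer", sdp: "v=0 ..." } }));
console.log(classifySignal({ candidate: { candidate: "candidate: ..." } }));
console.log(classifySignal({ presentation: "s1", presenter: "s2" }));
// → sdp, candidate, stream-ids
```

Note the catch-all branch: anything that is neither SDP nor a candidate is assumed to be the id map, so this only works because the application controls every message on the channel.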
79. Function prepareForIncomingCall()

function prepareForIncomingCall() {
  createPC();
  getMedia();
  attachMedia();
}

• No surprises here
• Media obtained is a little different
• But attached the same way
80. Function answer()

function answer() {
  pc.createAnswer(gotDescription, e);
  function gotDescription(desc) {
    pc.setLocalDescription(desc, s, e);
    signalingChannel.send(JSON.stringify({ "sdp": desc }));
  }
}

• createAnswer() automatically uses the value of remoteDescription when generating new SDP
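What gotDescription() actually puts on the wire can be sketched without a browser. The desc object below is a placeholder standing in for the RTCSessionDescription that createAnswer() would produce (its real SDP body is elided):

```javascript
// Placeholder for the RTCSessionDescription from createAnswer();
// a real description carries a full SDP body.
const desc = { type: "answer", sdp: "v=0 ..." };

// Same serialization gotDescription() performs before sending.
const wire = JSON.stringify({ "sdp": desc });
console.log(wire); // → {"sdp":{"type":"answer","sdp":"v=0 ..."}}

// The peer's onmessage handler parses it back; the presence of the
// sdp key is what routes it to setRemoteDescription().
const sgnl = JSON.parse(wire);
console.log(sgnl.sdp.type); // → answer
```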
81. Laptop browser consumes . . .

[Diagram: three incoming MediaStreams (Presentation, Presenter, Demonstration), each carrying an "Audio" track and a matching video track ("Presentation", "Presenter", "Demonstration"), feed the sinks in Browser L: a speaker and three displays. All video streams are selected.]

• Three input streams
• All have the same number of audio and video tracks
• Need stream ids to distinguish them
82. Function handleIncomingStream()

if (st.id === incoming.presentation) {
  speaker.srcObject =
    new MediaStream([st.getAudioTracks()[0]]);
  win1.srcObject =
    new MediaStream([st.getVideoTracks()[0]]);
} else if (st.id === incoming.presenter) {
  win2.srcObject =
    new MediaStream([st.getVideoTracks()[0]]);
} else {
  win3.srcObject =
    new MediaStream([st.getVideoTracks()[0]]);
}

• Use ids to distinguish streams
• Extract one audio and all video tracks
• Assign to element sources
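The id-based routing above can be tried with plain objects. routeStream is a hypothetical refactoring of the branch logic, with the media elements replaced by names so it runs anywhere; `incoming` is the id map saved earlier from the signaling channel:

```javascript
// Hypothetical refactoring of handleIncomingStream()'s branching:
// given a stream id and the saved `incoming` id map, report which
// sinks the stream's tracks would be attached to.
function routeStream(streamId, incoming) {
  if (streamId === incoming.presentation) {
    return ["speaker", "win1"]; // audio to speaker, video to win1
  } else if (streamId === incoming.presenter) {
    return ["win2"];
  } else {
    return ["win3"]; // anything else is the demonstration stream
  }
}

const incoming = { presentation: "pres-1", presenter: "cam-1" };
console.log(routeStream("pres-1", incoming)); // → [ 'speaker', 'win1' ]
console.log(routeStream("cam-1", incoming));  // → [ 'win2' ]
console.log(routeStream("demo-9", incoming)); // → [ 'win3' ]
```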
83. Laptop browser produces . . .

[Diagram: the sources (webcam, left microphone, right microphone) are captured as MediaStreams carrying "Video", "Left", and "Right" tracks, then combined into three created MediaStreams in Browser L: an Audio & Video stream, a Stereo stream ("Left" and "Right" tracks), and a Mono stream ("Mono" track).]

• Three calls to getUserMedia()
• Three calls to new MediaStream()
• No stream ids needed
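The track-sharing idea on this slide, where one captured track feeds several created streams, can be sketched with plain stand-in objects; makeTrack and the arrays below are illustrative stand-ins, not the MediaStream API:

```javascript
// Stand-ins for MediaStreamTracks; in a browser these would come
// from the three getUserMedia() captures.
function makeTrack(kind, label) { return { kind: kind, label: label }; }
const video = makeTrack("video", "webcam");
const leftAudio = makeTrack("audio", "left");
const rightAudio = makeTrack("audio", "right");

// Mirrors attachMedia(): the same track object appears in several
// streams at once, so no cloning (and no stream ids) is needed on
// the sending side.
const av = [video, leftAudio, rightAudio];
const stereo = [leftAudio, rightAudio];
const mono = [leftAudio];

console.log(av.length, stereo.length, mono.length); // → 3 2 1
console.log(stereo[0] === mono[0]); // → true (shared left track)
```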
84. Function getMedia() [1]

navigator.getUserMedia({"video": true}, function (stream) {
  webcam = stream;
}, e);

• Request webcam video
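The two audio requests in getMedia() differ from this video request only in their constraint object. A small sketch of how those constraints are shaped; note that "audioDirectionEnum" is the deck's illustrative constraint name, not one defined by the getUserMedia specification:

```javascript
// Hypothetical builder for the deck's left/right audio constraints;
// "audioDirectionEnum" is illustrative, not a standardized constraint.
function audioConstraint(direction) {
  return { "audio": { "mandatory": { "audioDirectionEnum": direction } } };
}

console.log(JSON.stringify(audioConstraint("left")));
// → {"audio":{"mandatory":{"audioDirectionEnum":"left"}}}
console.log(JSON.stringify(audioConstraint("right")));
// → {"audio":{"mandatory":{"audioDirectionEnum":"right"}}}
```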