My (quite boring) slides on what we needed to do in Janus to support multiple streams of the same type (e.g., 3 video streams) on the same PeerConnection.
1. “Sounds like a plan!”
Or how I added multistream to Janus using Unified Plan
Lorenzo Miniero
@elminiero
CommCon 2019
July 8th 2019, Latimer Estate, Buckinghamshire (UK)
2. Couldn’t find a good picture of me
Lorenzo Miniero
• Ph.D @ UniNA
• Chairman @ Meetecho
• Should probably eat more
Contacts and info
• lorenzo@meetecho.com
• https://twitter.com/elminiero
• https://www.slideshare.net/LorenzoMiniero
3. A few words on Meetecho
• Co-founded in 2009 as an academic spin-off
• University research efforts brought to the market
• Completely independent from the University
• Focus on real-time multimedia applications
• Strong perspective on standardization and open source
• Several activities
• Consulting services
• Commercial support and Janus licenses
• Streaming of live events (IETF, ACM, etc.)
5. First of all, what is Janus?
Janus
General purpose, open source WebRTC server
• https://github.com/meetecho/janus-gateway
• Demos and documentation: https://janus.conf.meetecho.com
• Community: https://groups.google.com/forum/#!forum/meetecho-janus
6. Modular architecture
• The core only implements the WebRTC stack
• JSEP/SDP, ICE, DTLS-SRTP, Data Channels, ...
• Plugins expose Janus API over different “transports”
• Currently HTTP / WebSockets / RabbitMQ / Unix Sockets / MQTT / Nanomsg
• “Application” logic implemented in plugins too
• Users attach to plugins via the Janus core
• The core handles the WebRTC stuff
• Plugins route/manipulate the media/data
• Plugins can be combined on client side as “bricks”
• Video SFU, Audio MCU, SIP gatewaying, broadcasting, etc.
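For reference, a minimal sketch of what “attaching to a plugin” looks like from a browser via janus.js (the callback structure is the library’s; the server URL and plugin name are just illustrative):

// janus.js usage sketch: init the library, create a session, attach to a plugin.
Janus.init({ debug: "all", callback: function() {
    const janus = new Janus({
        server: "wss://janus.example.org",   // illustrative URL
        success: function() {
            janus.attach({
                plugin: "janus.plugin.echotest",
                success: function(pluginHandle) {
                    // Application logic: send requests/offers via pluginHandle.send({ message, jsep })
                },
                onmessage: function(msg, jsep) {
                    // Asynchronous events and SDP answers coming from the plugin
                }
            });
        }
    });
}});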
12. A known limitation, though...
• Since day one, PeerConnections in Janus had a well-known “limitation”
• Only one stream and m-line per media type allowed
• PeerConnections limited to 1 audio + 1 video + 1 data channel
• Never limited the Janus functionality...
• You simply need more PeerConnections if you need more audio/video streams
• ... but bundling streams together can be really useful
• e.g., to reduce networking overhead and number of PeerConnections
Why not add it years ago, then?
• The lack of interoperability between Chrome and Firefox is what stopped us
• We really didn’t want to implement both just to drop one later
• Decided to focus on features instead (simulcast, SVC, etc.)
17. Multistream in WebRTC: so many plans!
• Plan B
• https://tools.ietf.org/html/draft-uberti-rtcweb-plan-00
• One m-line per media type (msid/SSRCs to identify streams)
• Originally implemented by Chrome
• Unified Plan (originally Plan A)
• https://tools.ietf.org/html/draft-roach-mmusic-unified-plan-00
• A separate m-line per media source (mid/rid to identify streams)
• Originally implemented by Firefox
• No Plan (!)
• https://tools.ietf.org/html/draft-ivov-rtcweb-noplan-01
• Attempt to reduce the number of offer/answer exchanges
• Never implemented, AFAICT?
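To make the difference concrete, here is a simplified (and heavily trimmed) sketch of how two video sources would show up in each plan; real offers obviously carry many more attributes:

Plan B (one video m-line, sources told apart by SSRC/msid):
m=video 9 UDP/TLS/RTP/SAVPF 96
a=ssrc:1111 msid:streamA trackA
a=ssrc:2222 msid:streamB trackB

Unified Plan (one m-line per source, each identified by its mid):
m=video 9 UDP/TLS/RTP/SAVPF 96
a=mid:0
m=video 9 UDP/TLS/RTP/SAVPF 96
a=mid:1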
29. A Plan to Unify them all!
• IETF consensus on Unified Plan was reached a long time ago
• IETF 87, Summer of 2013!
• https://webrtchacks.com/a-hitchhikers-guide-to-webrtc-standardization/
• Firefox was the first to implement it, a couple of years later
• https://hacks.mozilla.org/2015/03/webrtc-in-firefox-38-multistream-and-renegotiation/
• Chrome only started implementing it recently, though
• https://webrtc.org/web-apis/chrome/unified-plan/
Translated...
No more excuses for us, and time to work on it!
33. Thanks to Highfive for sponsoring the development!
https://github.com/meetecho/janus-gateway/pull/1459
34. How much of a refactoring of the Janus internals?
• For our existing PeerConnections, it didn’t matter much
• For one audio and one video stream, plans are basically interoperable
• ... well, kinda (you still need to be prepared for mid, transceivers, etc.)
• Multistream support required a considerable refactoring, though, of:
1. SDP parsing and generation utils
2. Support for multiple streams in the core
• Hardcoded references to single audio/video streams
• Routing and addressing of individual streams
3. Support for multiple streams in all (most?) plugins
4. And let’s not forget the client side of things!
41. SDP parsing and generation
• Janus includes “homemade” SDP utils to quickly parse/generate SDPs
• Used by the core to handle WebRTC SDP negotiation
• Used by plugins as well to negotiate what they want
• Particularly helpful when creating new SDPs
• janus_sdp_generate_offer() vs. janus_sdp_generate_answer()
• Variadic methods, with flags to customize the output
• m-lines and attributes as lists that can then be pruned/extended
• The 1 audio/1 video assumption, though, allowed for a simpler syntax
• “I want audio, but not video”
• “Use VP9 for video, and add this fmtp attribute”
48. Multistream in the Janus core
• Assumptions on the PeerConnection limits meant some hardcoded properties
• Audio related properties (all assuming a single stream)
• Video related properties (all assuming a single stream, and maybe simulcast)
• Single datachannel
• This had an impact on the media routing as well
• Incoming packet could be either audio or video (or data)
• Same level of multiplexing when talking to plugins as well
• Besides, some “legacy” code meant info was in different places
• Old stream/component structure based on ICE concepts
• Some properties of the same thing lived in the former, others in the latter
• Made sense initially, when we allowed non-bundle, but not anymore...
53. Demultiplexing traffic in the core
• Addressing traffic was updated as a consequence
• Not just "is it audio or video", but "which medium does this belong to?"
• Demultiplexing based on different information
• RTP vs RTCP vs data (exactly as before)
• For RTP/RTCP, mapping between known SSRC and existing medium
• For RTP, mapping via the mid/rid header extensions (if we need to figure out the SSRC)
• Media (and stats) are then internally addressed and routed based on index or mid
Wait, figure out SSRC?!
• Yep, it might happen, e.g., with the new Chrome simulcast!
• https://www.meetecho.com/blog/simulcast-janus-ssrc/
57. Making plugins multistream-aware
• Support for multistream in the core was of course only the first step
• If plugins don’t use it, it’s worthless!
• Main step was updating the media routing API
• incoming_rtp() and incoming_rtcp() updated with m-index info
• relay_rtp() and relay_rtcp() updated with m-index info as well
• Everything else up to plugins themselves
• e.g., SDP negotiation, support for managing multiple streams, etc.
Decided to only start with a few key plugins
• EchoTest
• Streaming
• VideoRoom
61. A multistream EchoTest plugin
• First plugin we updated was obviously the EchoTest
• Simple playground for both signalling and media
• Every packet on each stream is sent back
• Relatively small changes
• Updated usage of SDP utils to generate multistream answer
• Updated RTP routing methods/callbacks to make them aware of m-indexes
Works nicely already, although it still needs some tweaks
• Currently doesn’t support simulcast on more than one stream (because I’m lazy)
• PLIs should be generated on all video streams as well
69. A multistream Streaming plugin
• Streaming plugin required some more effort
• Same assumptions on single audio/video as in the core
• Ports for audio/video/data hardcoded in here as well
• Original mountpoint configuration quite rigid as a consequence
• Huge refactoring of mountpoint internals
• Mountpoints as generic array of configurable streams
• Each stream can be audio, video or data, with common properties
• Static configuration changed to take advantage of libconfig arrays
• Dynamic API updated as well to reflect this new flexibility
Much more flexible now!
• ... although the streams list can’t be modified once created, but whatever
72. Configuring a multistream mountpoint
multistream-test: {
    type = "rtp"
    id = 123
    description = "Multistream test (1 audio, 2 video)"
    media = (
        {
            type = "audio"
            mid = "a"
            label = "Audio stream"
            port = 5102
            pt = 111
            rtpmap = "opus/48000/2"
        },
        {
            type = "video"
            mid = "v1"
            label = "Video stream #1"
            port = 5104
            pt = 100
            rtpmap = "VP8/90000"
        },
        {
            type = "video"
            mid = "v2"
            label = "Video stream #2"
            port = 5106
            pt = 100
            rtpmap = "VP8/90000"
        }
    )
}
74. A multistream VideoRoom plugin
• VideoRoom plugin was even harder than that...
• Not only multistream subscribers, but multistream publishers as well!
• First decision was to keep publishers and subscribers separated
• Not “one PeerConnection to rule them all”, but two!
• All active streams on one PC (publishers)
• All passive streams on another PC (subscribers)
• Several reasons behind that
• Avoiding glare, of course (easier when O/A pattern is always the same)
• Smarter management of resources
• And most importantly, we’re not the only ones doing that! (e.g., mediasoup)
• Of course, old approach still supported
• Freedom to distribute publishers and subscribers however you want
83. A multistream VideoRoom plugin
• For the rest, several updates required
• Refactoring of publishers and subscribers as collections of streams
• Refactored media relationships at a stream level
• Updated code to use the new SDP utils for crafting SDPs
• Existing features now stream-specific, rather than publisher/subscriber specific
• e.g., in theory possible to send multiple different simulcast video streams
• Updated API to allow for dynamic updates on existing PeerConnections
• e.g., subscribe/unsubscribe at any time, with metadata on the streams
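To give an idea of what “subscribe with metadata on the streams” means in practice, here is a hypothetical janus.js request for a single subscriber PeerConnection spanning two publishers; videoroomHandle is a plugin handle attached to the VideoRoom, and the "streams"/"feed"/"mid" field names are illustrative assumptions, so check the PR documentation for the actual syntax:

// Hypothetical multistream subscription: one subscriber PeerConnection,
// streams coming from two different publishers. Field names are illustrative,
// not guaranteed to match the final VideoRoom API.
videoroomHandle.send({
    message: {
        request: "subscribe",
        streams: [
            { feed: 1234, mid: "v1" },   // only the "v1" video from publisher 1234
            { feed: 5678 }               // everything published by 5678
        ]
    }
});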
Two demos now available (and interoperable!)
1. videoroomtest.html: legacy approach, same as the old one
2. mvideoroomtest.html: multistream version of the demo above
91. Another challenge: data channels!
• With separate PeerConnections in the VideoRoom, relaying is easy
• If I want data from Alice, I’ll subscribe negotiating datachannels too
• Incoming data on that PeerConnection will only come from Alice
• What if the same PeerConnection is used for all subscriptions, though?
• Only one datachannel is created, and shared for all sources
• Who is the incoming data on that PeerConnection from?!
Problem solved by starting to use datachannels to their full potential
• Added support for multiple streams/labels (instead of just one, as before)
• Plugins can associate different labels with different sources
• Easy to demultiplex incoming messages from an application perspective
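At the browser level this only relies on standard WebRTC APIs: multiple datachannels share the same PeerConnection (and SCTP association), and the label is what lets the application tell sources apart. A minimal sketch of the subscriber side (nothing Janus-specific; in the Janus case it’s the plugin that picks the labels):

// Janus opens one datachannel per label on the shared PeerConnection; the
// browser just demultiplexes incoming messages by looking at the label.
const pc = new RTCPeerConnection();
pc.ondatachannel = (event) => {
    const dc = event.channel;            // e.g., label "alice" or "bob"
    dc.onmessage = (msg) => {
        console.log("Data from " + dc.label + ":", msg.data);
    };
};
// Creating additional labeled channels locally works the same way:
// pc.createDataChannel("some-label");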
94. Last step: updating the client side (janus.js)
• Client-side, the biggest thing to be aware of is transceivers
• https://blog.mozilla.org/webrtc/the-evolution-of-webrtc/
• https://webrtc.org/web-apis/chrome/unified-plan/
• In a nutshell, a way to pair a sender and a receiver
• e.g., a local track and a remote track
• Each transceiver maps to a specific m-line
• State can then be controlled/monitored there (e.g., media direction)
• As such, first step was to check if the browser supports transceivers
• Chrome >= 72 and Firefox >= 59 both do
• More specific checks also available to help in other cases
• Chrome may actually need an explicit way of enabling it
• new RTCPeerConnection({sdpSemantics: "unified-plan"});
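A minimal sketch of what this looks like with the standard browser API (the feature check shown is just one possible approach):

// Create a Unified Plan PeerConnection and add one transceiver per stream.
async function createMultistreamPc() {
    // sdpSemantics is only needed on Chrome versions where Plan B is still the default.
    const pc = new RTCPeerConnection({ sdpSemantics: "unified-plan" });
    if (typeof pc.addTransceiver !== "function") {
        throw new Error("Transceivers (Unified Plan) not supported by this browser");
    }
    // Each transceiver maps to its own m-line: 1 audio + 2 video here.
    pc.addTransceiver("audio", { direction: "sendrecv" });
    pc.addTransceiver("video", { direction: "recvonly" });
    pc.addTransceiver("video", { direction: "recvonly" });

    const offer = await pc.createOffer();
    await pc.setLocalDescription(offer);
    // mids are assigned during negotiation and identify each stream from then on.
    pc.getTransceivers().forEach(t => console.log(t.mid, t.direction));
    return pc;
}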
98. Last step: updating the client side (janus.js)
• Second step was updating how we notify local and remote media
• Before: onlocalstream / onremotestream
• Now: onlocaltrack / onremotetrack
• Before, we notified a MediaStream object
• No issue since it could only be 1 audio + 1 video
• Subsequent calls to the same callback would update the stream state
• Now that we can have multiple heterogeneous m-lines, we notify tracks instead
• Notification when track is added/unmuted/muted/removed (with info on mid)
• Each track is also played in a separate element, in the updated demos
• Avoids the “no audio while still waiting for video” annoyance
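Under the hood the new callbacks map to the browser’s per-track events; this is a sketch of the kind of handling the updated demos do, shown here with the standard ontrack event rather than the janus.js wrappers (pc is the underlying RTCPeerConnection, and the element handling is illustrative):

// One media element per remote track: audio starts playing as soon as it
// arrives, without waiting for the video track(s) on the same PeerConnection.
pc.ontrack = (event) => {
    const mid = event.transceiver.mid;   // which m-line this track belongs to
    const el = document.createElement(event.track.kind === "video" ? "video" : "audio");
    el.id = "remote-" + mid;
    el.autoplay = true;
    el.srcObject = new MediaStream([event.track]);
    document.body.appendChild(el);
    // When the track definitively goes away, drop its element too.
    event.track.onended = () => el.remove();
};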
Still missing...
An easy way to add/replace/remove local tracks (because, again, I’m lazy!)
103. Next steps?
• The effort is basically done, and we’ll probably merge soon
• Mostly reacting to feedback and bug reports right now
• JavaScript code needs some love, though...
• Of course, there’s always room for improvements!
• VideoRoom will need a new scaling mechanism (material for another talk!)
• Possibly extend multistream support to other plugins too? (e.g., Lua/Duktape)
• ... or maybe even cross-plugins! (if that makes sense)
Test test test!
• If you’re using Janus already, start playing with this!
• If you’ve never used Janus before, then it’s the perfect moment to start
106. See you soon in Napoli!
September 23-25, 2019, Napoli — https://januscon.it