Streaming Video to HTML

My presentation from cf.objective 2013 on building a DASH-264 player in HTML/JavaScript.

Transcript

  • 1. Streaming Video to HTML
      Jeff Tapper, Digital Primates, @jefftapper
  • 2. Who am I?
      • Senior Consultant at Digital Primates – building next generation client applications
      • Built video applications for many of the most attended streaming events
      • Developing Internet applications for 17 years
      • Author of 12 books on Internet technologies
  • 3. Agenda
      • Video and the Internet today
      • Understanding HTTP Streaming
      • What are the streaming options without a plugin?
      • Understanding MediaSource Extensions (MSE)
      • What is DASH
      • Making it work in a browser
      • Questions
  • 4. Video is dominating the Internet
      • Desktop: video makes up 50% of traffic at peak periods – notably 30% from Netflix and 11% from YouTube
      • Mobile: video traffic is growing exponentially
      (charts: Fixed Internet vs. Mobile Internet traffic)
  • 5. HTTP Adaptive Streaming
      (diagram: Media Capture & Encoding → Media Origin Servers → HTTP Cache Servers → Client Devices)
      1. Split the video into small segments
      2. Encode each segment at multiple bitrates
      3. Make each segment addressable via an HTTP URL
      4. Client makes a decision on which segment to download
      5. Client splices the segments together and plays back
  • 6. HTTP Streaming Landscape
      • Apple's HTTP Live Streaming (HLS)
      • Microsoft's Smooth Streaming
      • Adobe's HTTP Dynamic Streaming (HDS)
      • And many more…
  • 7. The challenge
      • Most agree that HTTP Streaming is the most efficient choice
      • Different devices support different streaming protocols
      • No one standard is currently supported ubiquitously
      • Results in media being served in several different formats to support the broadest range of devices
  • 8. What do browsers support?
      • Unfortunately, progressive download is the only ubiquitously supported option
      • Different browsers support different video codecs
        – H.264
        – WebM
        – etc.
      • Safari (iOS and Mac OS only) natively supports HLS
      • MediaSource Extensions in Chrome (and soon others)
  • 9. MediaSource Extensions (MSE)
      • MSE allow for pieces (segments) of media to be handed to the HTML5 video tag's buffer directly
      • This enables HTTP streaming in HTML
      • Not universally supported, yet
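    A minimal sketch of the flow this slide describes, assuming a hypothetical segment URL and codec string: create a MediaSource, attach it to a video element, then append fetched segment bytes to a SourceBuffer.

        var video = document.querySelector("video");
        var mimeCodec = 'video/mp4; codecs="avc1.42c01f"'; // assumed codec string

        if (window.MediaSource && MediaSource.isTypeSupported(mimeCodec)) {
            var mediaSource = new MediaSource();
            video.src = URL.createObjectURL(mediaSource);

            mediaSource.addEventListener("sourceopen", function () {
                var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
                var xhr = new XMLHttpRequest();
                xhr.open("GET", "video/segment-1.m4s", true); // hypothetical segment URL
                xhr.responseType = "arraybuffer";
                xhr.onload = function () {
                    // Hand the segment bytes directly to the video tag's buffer.
                    sourceBuffer.appendBuffer(new Uint8Array(xhr.response));
                };
                xhr.send();
            });
        }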
  • 10. What is MPEG-DASH?
      • DASH – Dynamic Adaptive Streaming over HTTP
      • International open standard, developed and published by ISO
      • Addresses both simple and advanced use cases
      • Enables highest-quality multiscreen distribution and efficient dynamic adaptive switching
      • Enables reuse of existing content, devices and infrastructure
      • Attempts to unify to a single standard for HTTP streaming
  • 11. DASH and codecs
      • The DASH specification is codec agnostic
      • Any existing or future codec can work with DASH
      • The DASH manifest describes which codec is used
        – Different codecs store the actual video data differently
  • 12. DASH264
      • H.264 is the dominant format today
      • Many vendors and service providers are committed to supporting/enabling DASH264
      • Provides support for today's requirements such as DRM
      • H.264 is backed by rigorous testing and conformance
  • 13. DASH Industry Forum
      • Addressing the dramatic growth of broadband video by recommending a universal delivery format that provides end users with the best possible media experience by dynamically adapting to changing network conditions
  • 14. DASH Industry Forum
      • Objectives:
        – Promote and catalyze market adoption of MPEG-DASH
        – Publish interoperability and deployment guidelines
        – Facilitate interoperability tests
        – Collaborate with standards bodies and industry consortia in aligning ongoing DASH standards development and the use of common profiles across industry organizations
      • Over 65 members
      • Visit http://dashif.org for more information
      • Released the DASH/264 standard
  • 15. Building a DASH player
      • We have built DASH players for several different platforms
        – Flash
        – Android
        – HTML5/JavaScript (dash.js)
      • dash.js is available as an open source project (BSD-3) on GitHub
      • dash.js is the reference player for the DASH Industry Forum (dashif.org)
  • 16. How to play a DASH Stream
      • Download manifest
      • Parse manifest
      • Determine optimal bandwidth for client
      • Initialize for bandwidth
      • Download segment
      • Hand segment to MSE
      • Check bandwidth to determine if a change is necessary
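    Compressing those steps into a sketch of the playback loop; parseManifest(), pickRepresentation(), and nextSegmentUrl() are hypothetical placeholders, not the dash.js API.

        function playDashStream(manifestUrl, sourceBuffer) {
            fetch(manifestUrl)
                .then(function (response) { return response.text(); })
                .then(function (xml) {
                    var manifest = parseManifest(xml);          // parse the MPD (hypothetical helper)
                    var index = 0;

                    function loadNext() {
                        var rep = pickRepresentation(manifest); // re-check bandwidth, pick a bitrate (hypothetical)
                        fetch(nextSegmentUrl(rep, index++))     // hypothetical URL builder
                            .then(function (r) { return r.arrayBuffer(); })
                            .then(function (bytes) {
                                // Hand the segment to MSE, then continue once the buffer is ready.
                                sourceBuffer.addEventListener("updateend", loadNext, { once: true });
                                sourceBuffer.appendBuffer(bytes);
                            });
                    }
                    loadNext();
                });
        }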
  • 17. Understanding DASH structure
      • Three types of files
        – Manifest (.mpd)
          • XML file describing the segments
        – Initialization file
          • Contains headers needed to decode bytes in segments
        – Segment files
          • Contain playable media
      • Includes:
        – 0…many video tracks
        – 0…many audio tracks
  • 18. DASH Manifest
      • Manifest contains:
        – Root node
        – 1 or more Periods
          • Periods contain 1 AdaptationSet per video stream and 1 AdaptationSet per audio stream
          • AdaptationSets contain:
            – ContentComponent nodes (for each video or audio track)
            – 1 or more Representation nodes
              » Each Representation describes a single bitrate
              » Representations contain data on finding the actual segments
              » There are different ways a Representation can describe segments
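    Roughly, that hierarchy could be pictured as the parsed JavaScript object below; the field names are illustrative, not the exact dash.js object model.

        var manifest = {
            minBufferTime: 4,
            periods: [{
                adaptationSets: [{
                    contentType: "video",
                    representations: [
                        { id: "hd", bandwidth: 1809954, codecs: "avc1.64001f", segmentInfo: { /* SegmentList, SegmentTemplate or SegmentBase */ } },
                        { id: "sd", bandwidth: 514864,  codecs: "avc1.42c01f", segmentInfo: { /* ... */ } }
                    ]
                }, {
                    contentType: "audio",
                    representations: [
                        { id: "aac", bandwidth: 128000, codecs: "mp4a.40.2", segmentInfo: { /* ... */ } }
                    ]
                }]
            }]
        };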
  • 19. Describing Representations
      • SegmentBase
        – Describes a stream with only a single segment per bitrate
        – Can be used for byte range requests
      • SegmentList
        – A SegmentList will contain a specific list of each SegmentURL (individual HTTP packet with media data)
        – Can be used for byte range requests
      • SegmentTemplate
        – Defines a known URL for the fragment with wildcards resolved at runtime to request a segment (see bbb.mpd)
        – Alternatively, can specify a list of segments based on duration
  • 20. SegmentList

        <Representation id="h264bl_hd" mimeType="video/mp4" codecs="avc1.42c01f"
                        width="1280" height="720" startWithSAP="1" bandwidth="514864">
          <SegmentList timescale="1000" duration="10000">
            <Initialization sourceURL="mp4-main-multi-h264bl_hd-.mp4"/>
            <SegmentURL media="mp4-main-multi-h264bl_hd-1.m4s"/>
            <SegmentURL media="mp4-main-multi-h264bl_hd-2.m4s"/>
            <SegmentURL media="mp4-main-multi-h264bl_hd-3.m4s"/>
            <SegmentURL media="mp4-main-multi-h264bl_hd-4.m4s"/>
            <SegmentURL media="mp4-main-multi-h264bl_hd-5.m4s"/>
            <SegmentURL media="mp4-main-multi-h264bl_hd-6.m4s"/>
            <SegmentURL media="mp4-main-multi-h264bl_hd-7.m4s"/>
            <SegmentURL media="mp4-main-multi-h264bl_hd-8.m4s"/>
  • 21. SegmentTemplate: fixed segment duration

        <AdaptationSet>
          <ContentComponent id="1" contentType="video"/>
          <SegmentTemplate initialization="BigBuckBunny_720p_1800kbps_44khz_track1_dash.mp4"/>
          <Representation id="1" mimeType="video/mp4" codecs="avc1.64001f"
                          width="1280" height="720" startWithSAP="1" bandwidth="1809954">
            <SegmentTemplate timescale="1000" duration="13809"
                             media="bbb_seg_BigBuckBunny_720p_1800kbps_44khz_track1$Number$.m4s"
                             startNumber="1"/>
          </Representation>
        </AdaptationSet>
  • 22. SegmentTemplate: variable segment duration

        <AdaptationSet group="2" mimeType="video/mp4" par="16:9" minBandwidth="475000"
                       maxBandwidth="6589000" minWidth="176" maxWidth="1680"
                       minHeight="99" maxHeight="944" segmentAlignment="true"
                       startWithSAP="1">
          <SegmentTemplate timescale="1000"
                           initialization="dash/ateam-video=$Bandwidth$.dash"
                           media="dash/ateam-video=$Bandwidth$-$Time$.dash">
            <SegmentTimeline>
              <S t="0" d="4171" />
              <S d="2503" />
              <S d="2961" />
              <S d="2461" />
              <S d="2127" r="2" />
              …
  • 23. dash.js player
  • 24. Tools used by dash.js
      Core Player
      • Q – asynchronous handling with promises
      • Dijon – DI / IoC
      • Jasmine – unit tests
      Web Site
      • jQuery – DOM manipulation
      • Flat-ui – UI elements
      • Flot – charting
      • Kendo – components
  • 25. Class Structure
      • The player is divided into two main packages.
      • streaming – Contains the classes responsible for creating and populating the MediaSource buffers. These classes are intended to be abstract enough for use with any segmented stream (such as DASH, HLS, HDS and MSS).
      • dash – Contains the classes responsible for making decisions specifically related to DASH.
  • 26. streaming package
  • 27. MediaPlayer.js
      • Exposes the top-level functions and properties to the developer (play, autoPlay, isLive, abr quality, and metrics).
      • The manifest URL and the HTML Video object are passed to the MediaPlayer.
  • 28. Context.js
      • The dependency mapping for the streaming package.
      • The context is passed into the MediaPlayer object, allowing different MediaPlayer instances to use different mappings.
  • 29. Stream.js
      • Loads/refreshes the manifest.
      • Creates SourceBuffers from the MediaSource.
      • Creates BufferManager classes to manage SourceBuffers.
      • Responds to events from the HTML Video object.
      • For a live stream, the live edge is calculated and passed to the BufferController instances.
  • 30. Debug.js
      • Convenience class for logging methods.
      • Default implementation is to just use console.log().
      • Extension point for tapping into logging messages.
  • 31. BufferController.js
      • Responsible for loading fragments and pushing the bytes into the SourceBuffer.
      • Once play() has been called, a timer is started to check the status of the bytes in the buffer.
      • If the amount of time left to play is less than Manifest.minBufferTime, the next fragment is loaded.
      • Records metrics related to playback.
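    The buffer check described here boils down to something like the sketch below; the loadNextFragment() helper and the timer interval are assumptions, not the actual dash.js internals.

        // Started once play() has been called.
        var timer = setInterval(function () {
            checkBuffer(video, sourceBuffer, manifest.minBufferTime);
        }, 500);

        function checkBuffer(video, sourceBuffer, minBufferTime) {
            if (sourceBuffer.buffered.length === 0) {
                loadNextFragment(); // hypothetical helper: fetch and append the next segment
                return;
            }
            var bufferedEnd = sourceBuffer.buffered.end(sourceBuffer.buffered.length - 1);
            var timeLeftToPlay = bufferedEnd - video.currentTime;

            // If less than minBufferTime is left buffered, load the next fragment.
            if (timeLeftToPlay < minBufferTime) {
                loadNextFragment();
            }
        }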
  • 32. FragmentLoader.js
      • Responsible for loading fragments.
      • Loads requests sequentially.
      ManifestLoader.js
      • Responsible for loading manifest files.
      • Returns the parsed manifest object.
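    One way to picture "loads requests sequentially" is promise chaining (dash.js uses the Q library; plain promises are shown here, and the queue shape is an assumption):

        // Each load waits for the previous one to finish before starting.
        var lastRequest = Promise.resolve();

        function loadFragment(url) {
            lastRequest = lastRequest.then(function () {
                return fetch(url).then(function (response) {
                    return response.arrayBuffer();
                });
            });
            return lastRequest; // resolves with this fragment's bytes
        }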
  • 33. AbrController.js
      • Responsible for deciding if the current quality should be changed.
      • The stream metrics are passed to a set of 'rules'.
      • Methods:
        – getPlaybackQuality(type, data)
          • type – the type of the data (audio/video)
          • data – the stream data
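    Conceptually, getPlaybackQuality() asks each rule for a suggested quality index and reconciles the answers; the rule interface and the "take the lowest suggestion" policy below are illustrative assumptions.

        var currentQuality = { video: 0, audio: 0 };
        var rules = [downloadRatioRule, insufficientBufferRule, limitSwitchesRule]; // hypothetical rule objects

        function getPlaybackQuality(type, data) {
            var suggestions = rules.map(function (rule) {
                return rule.checkIndex(currentQuality[type], data, type);
            });
            // Be conservative: the lowest suggested index wins.
            currentQuality[type] = Math.min.apply(null, suggestions);
            return currentQuality[type];
        }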
  • 34. DownloadRatioRule.js
      • Validates that fragments are being downloaded in a timely manner.
      • Compares the time it takes to download a fragment to how long it takes to play out a fragment.
      • If the download time is considered a bottleneck, the quality will be lowered.
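    The core comparison might look like this sketch; the metric names and the 2x headroom threshold for switching up are assumptions.

        function checkDownloadRatio(currentQuality, metrics) {
            var downloadSeconds = metrics.lastFragmentDownloadTime; // assumed metric, in seconds
            var playbackSeconds = metrics.lastFragmentDuration;     // assumed metric, in seconds
            var ratio = playbackSeconds / downloadSeconds;

            if (ratio < 1.0) {
                // Downloading is the bottleneck: step the quality down.
                return Math.max(currentQuality - 1, 0);
            }
            if (ratio > 2.0) {
                // Plenty of headroom: allow a step up.
                return currentQuality + 1;
            }
            return currentQuality;
        }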
  • 35. InsufficientBufferRule.js
      • Validates that the buffer doesn't run dry during playback.
      • If the buffer is running dry continuously, it likely means that the player has a processing bottleneck (video decode time is longer than playback time).
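    A sketch of that check; the dry-buffer counter and threshold are assumptions.

        var dryCount = 0;

        // Called whenever playback stalls because the buffer emptied.
        function onBufferRanDry(currentQuality) {
            dryCount++;
            // Repeated dry buffers suggest a decode/processing bottleneck, so lower quality.
            if (dryCount >= 3) {
                dryCount = 0;
                return Math.max(currentQuality - 1, 0);
            }
            return currentQuality;
        }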
  • 36. LimitSwitchesRule.js
      • Watches for competing rules to avoid constant bitrate switches.
      • If two or more rules are causing switches too often, this rule will limit the switches to give a better overall playback experience.
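    A sketch of limiting switch frequency; the window and limit values are arbitrary assumptions.

        var switchTimes = [];

        // Returns true if another bitrate switch should be allowed right now.
        function canSwitch(now) {
            var WINDOW_MS = 30000; // look at the last 30 seconds (assumed)
            var MAX_SWITCHES = 3;  // allow at most 3 switches per window (assumed)

            switchTimes = switchTimes.filter(function (t) { return now - t < WINDOW_MS; });
            if (switchTimes.length >= MAX_SWITCHES) {
                return false; // too many recent switches, hold the current quality
            }
            switchTimes.push(now);
            return true;
        }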
  • 37. dash package
  • 38. DashContext.js
      • Defines dependency mapping specific to the dash package.
        – Parser
        – Index Handler
        – Manifest Extensions
  • 39. DashParser.js
      • Converts the manifest to a JSON object.
      • Converts duration and datetime strings into number/date objects.
      • Manages inheritance fields.
        – Many fields are inherited from parent to child nodes in DASH.
        – For example, a BaseURL can be defined in the <MPD> node and all <Representation> nodes inherit that value.
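    The inheritance handling amounts to walking the parsed tree and copying missing fields down from the parent; a sketch with an assumed field list and node shape:

        // Copy inheritable fields (e.g. BaseURL) onto children that don't define their own value.
        var INHERITED_FIELDS = ["BaseURL", "segmentAlignment", "startWithSAP"]; // illustrative list

        function applyInheritance(node, parent) {
            if (parent) {
                INHERITED_FIELDS.forEach(function (field) {
                    if (node[field] === undefined && parent[field] !== undefined) {
                        node[field] = parent[field];
                    }
                });
            }
            (node.children || []).forEach(function (child) {
                applyInheritance(child, node);
            });
        }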
  • 40. DashHandler.js
      • Responsible for deciding which fragment URL should be loaded.
      • Methods:
        – getInitRequest(quality) – Returns an initialization request for a given quality, if available.
        – getSegmentRequestForTime(time, quality) – Returns a fragment URL to load for a given quality and a given time. Returns a Stream.vo.SegmentRequest object.
        – getNextSegmentRequest(quality) – Returns the next fragment URL to load. Assumes that getSegmentRequestForTime() has already been called.
        – getCurrentTime(quality) – Returns the time for the last loaded fragment index.
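    For the SegmentTemplate case, mapping a time to a URL essentially means computing a segment number and filling in the template wildcards; the sketch below is an illustration of that idea, not the real dash.js code.

        // Resolve a $Number$-style SegmentTemplate for a given time and representation.
        function getSegmentUrlForTime(representation, time) {
            var template = representation.media;    // e.g. "...track1$Number$.m4s"
            var segmentDuration = representation.duration / representation.timescale;
            var index = Math.floor(time / segmentDuration) + representation.startNumber;

            return template
                .replace("$Number$", index)
                .replace("$Bandwidth$", representation.bandwidth)
                .replace("$RepresentationID$", representation.id);
        }

        // With the 13.809 s segments from slide 21, a time of 42.5 s resolves to segment number 4.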
  • 41. DashHandler.js (cont'd)
      • Uses available information in the manifest (SegmentList, SegmentTemplate, SegmentBase).
      • When using a single, non-fragmented MP4 file, the SIDX box will be loaded to determine byte ranges for segments.
  • 42. Flow
      1. Create the Context and MediaPlayer instances.

           var context = new Dash.di.DashContext(),
               player = new MediaPlayer(context);

      2. Initialize the MediaPlayer and set the manifest URL.

           player.startup();
           player.setIsLive(false);
           player.attachSource(manifest_url);

      3. Attach the HTML Video element.

           video = document.querySelector(".dash-video-player video");
           player.autoPlay = true;
           player.attachView(video);
  • 43. Flow (cont'd)
      2. Call play() on the MediaPlayer (if autoPlay = false).
      3. The Stream object will be created and initialized with the manifest URL.
      4. The manifest is loaded and then parsed.
      5. MediaSource, SourceBuffers, and BufferControllers are created.
         – Create one BufferController per stream type (usually video and audio).
      6. Set the duration of the MediaSource to the duration of the manifest (or infinity for a live stream).
      7. If the stream is live, calculate the live edge.
      8. Call play() on the HTML video element.
      9. The BufferManager instances create a timer. When the timer ticks, the state of the buffers is checked.
  • 44. BufferManager.validate()
      1. Check to see if the buffers need more data.
         • Must be in a playing state.
         • Must not already be loading data.
         • Must require more data to be buffered: amountBuffered < manifest.minBufferTime
      2. If automatic ABR is enabled, check to see if the bitrate should be changed.
         • Ask AbrController for the new quality.
         • Rules will determine which bitrate to change to.
      3. If initial playback, seeking, or the bitrate has changed, load the initialization fragment (if available).
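    Those checks, taken together, look roughly like the sketch below; every name in it is an illustrative stand-in for the described logic rather than the actual dash.js code.

        function validate(state) {
            // 1. Do the buffers need more data?
            var needsData = state.playing &&
                            !state.isLoadingData &&
                            state.amountBuffered < state.manifest.minBufferTime;
            if (!needsData) {
                return;
            }
            // 2. Let the ABR rules pick a quality if automatic switching is on.
            if (state.autoSwitchBitrate) {
                state.quality = abrController.getPlaybackQuality(state.type, state.metrics);
            }
            // 3. Reload the initialization fragment when starting, seeking, or switching bitrate.
            if (state.initialPlayback || state.seeking || state.quality !== state.lastQuality) {
                loadInitializationFragment(state.quality); // if one is available
            }
            // ...then ask the IndexHandler for the next fragment request (slide 45).
        }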
  • 45. BufferManager.validate() (cont'd)
      4. Ask the IndexHandler for the next fragment request.
         • If seeking, pass the seek time to the IndexHandler.
         • Otherwise ask for the 'next' fragment.
         • Pass the bitrate to the IndexHandler.
      6. The IndexHandler returns a SegmentRequest indicating what action the BufferManager should take next.
         • "download" – Download and append the fragment to the buffer.
         • "stall" – Wait because the IndexHandler is not ready.
         • "complete" – Signal that the stream has completed playback.
      7. Repeat.
  • 46. Resources
      • GPAC
        – http://gpac.wp.mines-telecom.fr
        – Provides baseline test streams
        – Provides baseline player
      • MP4Parser
        – http://code.google.com/p/mp4parser/
        – Open source Java project
        – Allows for display of contents within boxes
      • DASH Industry Forum
        – http://www.dashif.org
        – Test vectors
        – Reference player
  • 47. Questions?