The document summarizes Chris Adamson's presentation on mobile movies with HTTP Live Streaming. The presentation covered what streaming is and how it differs from traditional broadcast media, introduced HTTP Live Streaming (HLS) as a way to stream media over HTTP, and described how HLS works by serving media in short file segments using a playlist file. It also discussed features of HLS like providing multiple variants for different bandwidths and encrypting file segments for security.
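The segment-plus-playlist mechanism summarized above can be illustrated with a short sketch: parse the variant entries from an HLS master playlist and pick one that fits the measured bandwidth. The playlist text and the naive attribute parsing are illustrative assumptions (a real client needs a full HLS parser).

```python
# Minimal sketch: read #EXT-X-STREAM-INF variant entries from an HLS master
# playlist and select one by bandwidth. Attribute parsing here is naive
# (it splits on commas, so quoted attributes like CODECS would break it).

def parse_master_playlist(text):
    """Return a list of (bandwidth, uri) pairs from a master playlist."""
    variants = []
    lines = text.strip().splitlines()
    for i, line in enumerate(lines):
        if line.startswith("#EXT-X-STREAM-INF:"):
            attrs = line.split(":", 1)[1]
            bandwidth = None
            for attr in attrs.split(","):
                key, _, value = attr.partition("=")
                if key.strip() == "BANDWIDTH":
                    bandwidth = int(value)
            # the URI of the variant playlist follows on the next line
            variants.append((bandwidth, lines[i + 1]))
    return variants

def select_variant(variants, measured_bps):
    """Pick the highest-bandwidth variant that fits the measured throughput."""
    fitting = [v for v in variants if v[0] <= measured_bps]
    return max(fitting) if fitting else min(variants)

MASTER = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
high/index.m3u8"""
```

This is also where the multiple-variants feature mentioned above comes in: the client re-runs the selection whenever its bandwidth estimate changes.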
MPEG Dynamic Adaptive Streaming over HTTP (DASH) is a new streaming standard that has recently been ratified as an international standard (IS). In comparison to other streaming systems, e.g., HTTP progressive download, DASH handles varying bandwidth conditions and provides smooth streaming. Furthermore, it enables NAT and firewall traversal, flexible and scalable deployment, and reduced infrastructure costs due to the reuse of existing Internet infrastructure components, e.g., proxies, caches, and Content Distribution Networks (CDNs). Recently, the Hypertext Transfer Protocol Bis (httpbis) working group of the IETF officially started the development of HTTP 2.0. Initially, three major proposals were submitted to the IETF, i.e., Google's SPDY, Microsoft's HTTP Speed+Mobility, and Network-Friendly HTTP Upgrade; SPDY was chosen as the working draft for HTTP 2.0. In this paper we implement MPEG-DASH over HTTP 2.0 (i.e., SPDY), demonstrating its potential benefits and drawbacks. Moreover, several experimental evaluations have been performed that compare HTTP 2.0 with HTTP 1.1 and HTTP 1.0 in the context of DASH. In particular, the protocol overhead, the performance for different round-trip times, and DASH with HTTP 2.0 in a lab test scenario have been evaluated in detail.
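The protocol-overhead comparison the abstract mentions can be sketched with back-of-the-envelope arithmetic: HTTP 1.0 opens a new TCP connection per segment request, while HTTP 1.1 reuses one persistent connection. All byte counts below are illustrative assumptions, not the paper's measurements.

```python
# Rough sketch of per-segment request overhead for a DASH session.
# HTTP/1.0: one TCP handshake plus one request header per segment.
# HTTP/1.1: one handshake for the session, then a header per segment.

TCP_HANDSHAKE_BYTES = 3 * 40      # assumed SYN / SYN-ACK / ACK, 40 bytes each
REQUEST_HEADER_BYTES = 300        # assumed size of one GET request header

def overhead_http10(num_segments):
    return num_segments * (TCP_HANDSHAKE_BYTES + REQUEST_HEADER_BYTES)

def overhead_http11(num_segments):
    return TCP_HANDSHAKE_BYTES + num_segments * REQUEST_HEADER_BYTES
```

HTTP 2.0 (SPDY) goes further by compressing headers and multiplexing requests on one connection, which is what makes the per-segment request pattern of DASH an interesting test case.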
libdash is a library that provides an object-oriented (OO) interface to the MPEG-DASH standard.
Features
- Cross-platform build system based on CMake, covering Windows, Linux, and Mac.
- Open source, licensed under the LGPL.
- Implements the full MPEG-DASH standard according to ISO/IEC 23009-1, Information Technology, Dynamic Adaptive Streaming over HTTP (DASH), Part 1: Media Presentation Description and Segment Formats.
- Handles the download and XML parsing of the MPD and, based on that, provides an OO interface to the MPD. Media elements, e.g., SegmentURL, SegmentTemplate, etc., are accessible in that OO structure and can be downloaded through libdash, which internally uses libcurl.
- Therefore, essentially all protocols that libcurl supports, e.g., HTTP, FTP, etc., are supported by libdash.
- It also provides a configurable download interface, which enables the use of external connections, implemented by the user of the library, for the download of media segments.
- The use of such external connections is shown in the libdash_networkpart_test project, which is part of the libdash solution as well as the cross-platform CMake system and is therefore usable on Windows, Linux, and Mac.
- The project contains a sample multimedia player, based on ffmpeg, that uses libdash for the playback of one of our dataset MPDs.
- Development is based on Windows; therefore the code contains a VS10 solution with additional tests and the sample multimedia player.
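Conceptually, the MPD handling described above amounts to parsing the XML Media Presentation Description into objects you can walk to find segment URLs. The sketch below shows that idea in Python (libdash itself is C++ and this is not its API), using a minimal hand-written MPD rather than one of the library's dataset MPDs.

```python
# Conceptual sketch of MPD handling: parse the XML into a structure that
# maps each Representation to its media segment URLs.
import xml.etree.ElementTree as ET

MPD_XML = """<MPD xmlns="urn:mpeg:dash:schema:mpd:2011">
  <Period>
    <AdaptationSet mimeType="video/mp4">
      <Representation id="720p" bandwidth="2500000">
        <SegmentList>
          <SegmentURL media="seg-720p-1.m4s"/>
          <SegmentURL media="seg-720p-2.m4s"/>
        </SegmentList>
      </Representation>
    </AdaptationSet>
  </Period>
</MPD>"""

DASH_NS = "{urn:mpeg:dash:schema:mpd:2011}"

def segment_urls(mpd_text):
    """Map representation id -> list of media segment URLs."""
    root = ET.fromstring(mpd_text)
    result = {}
    for rep in root.iter(DASH_NS + "Representation"):
        urls = [seg.get("media") for seg in rep.iter(DASH_NS + "SegmentURL")]
        result[rep.get("id")] = urls
    return result
```

In libdash the equivalent structure is exposed through the OO interface, and the actual segment downloads then go through libcurl or a user-supplied connection.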
Dynamic Adaptive Streaming over HTTP (DASH) is a convenient approach to transfer videos to the user in an adaptive and dynamic way. As a consequence, this system provides high bandwidth flexibility and is especially suitable for mobile use cases, where the bandwidth variations are tremendous. In this paper we have integrated the Scalable Video Coding (SVC) extensions of the Advanced Video Coding (AVC) standard into the recently ratified MPEG-DASH standard. Furthermore, we have evaluated our solution under restricted conditions using bandwidth traces from mobile environments and compared it with an improved version of our MPEG-DASH implementation using AVC as well as with major industry solutions.
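The core idea behind combining SVC with DASH can be sketched briefly: instead of switching between independent AVC streams, the client always fetches the base layer and greedily adds enhancement layers while bandwidth remains. The layer bitrates below are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of SVC-style layer selection for DASH.
# The base layer is mandatory for decodability; enhancement layers are
# added greedily while the available bandwidth allows.

LAYER_BITRATES = [500_000, 400_000, 600_000]  # base layer first, then enhancements

def layers_to_fetch(available_bps, layer_bitrates=LAYER_BITRATES):
    """Return how many layers fit the available bandwidth (at least the base)."""
    chosen = 1
    used = layer_bitrates[0]
    for bitrate in layer_bitrates[1:]:
        if used + bitrate <= available_bps:
            chosen += 1
            used += bitrate
        else:
            break
    return chosen
```

A practical benefit of this structure is that already-downloaded base-layer segments stay useful when bandwidth drops, whereas an AVC-only client would have to re-download at a lower quality.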
Our presentation from the Media Web Symposium 2013 in Berlin on the open source landscape around MPEG-DASH, as well as on cloud-based services for MPEG-DASH.
Nowadays video is an important part of the Web, and Web sites like YouTube, Hulu, etc. count millions of users consuming their content every day. However, these Web sites mainly use media players based on proprietary browser plug-ins (i.e., Adobe Flash) and do not leverage adaptive streaming systems. This paper presents a seamless integration of the recent MPEG standard on Dynamic Adaptive Streaming over HTTP (DASH) in the Web using the HTML5 video element. To this end, we present DASH-JS, a JavaScript-based MPEG-DASH client which adopts the Media Source API of Google's Chrome browser to present a flexible and potentially browser-independent DASH client. Furthermore, we present the integration of WebM-based media segments in DASH, giving a detailed description of the used container format structure and a corresponding Media Presentation Description (MPD). Our preliminary evaluation demonstrates the bandwidth adaptation capabilities to show the effectiveness of the system.
A PROXY EFFECT ANALYSIS AND FAIR ADAPTATION ALGORITHM FOR MULTIPLE COMPETING D..., by Christopher Mueller
Multimedia streaming technologies based on the Hypertext Transfer Protocol (HTTP) are very popular and used by many content providers such as Netflix, Hulu, and Vudu. Recently, ISO/IEC MPEG has ratified Dynamic Adaptive Streaming over HTTP (DASH), which extends traditional HTTP streaming with an adaptive component addressing the issue of varying bandwidth conditions that users face in networks based on the Internet Protocol (IP). Additionally, industry has already deployed several solutions based on such an approach, which simplifies large-scale deployment because the whole streaming logic is located at the client. However, these features may introduce drawbacks when multiple clients compete for a network bottleneck, due to the fact that the clients are not aware of the network infrastructure, such as proxies or other clients. This paper identifies and evaluates these negative effects using MPEG-DASH and Microsoft Smooth Streaming. Furthermore, we propose a novel adaptation algorithm introducing the concept of fairness regarding a cluster of clients. In anticipation of the results we can conclude that we achieve more efficient bottleneck bandwidth utilization and fewer quality switches.
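The fairness idea in the abstract can be sketched simply: if a cluster of N clients shares one bottleneck, each client targets an equal share of it rather than greedily probing for more. The representation ladder below is an assumed example, and this is the concept only, not the paper's algorithm.

```python
# Sketch of cluster-fair representation selection at a shared bottleneck.
# Each client computes its fair share and picks the highest representation
# that fits, avoiding the oscillation of greedy probing.

REPRESENTATIONS = [400_000, 800_000, 1_500_000, 3_000_000]  # bits per second

def fair_representation(bottleneck_bps, num_clients, ladder=REPRESENTATIONS):
    """Pick the highest representation that fits an equal bandwidth share."""
    share = bottleneck_bps / num_clients
    fitting = [r for r in ladder if r <= share]
    return max(fitting) if fitting else min(ladder)
```

With greedy clients, each one overestimates its share whenever the others pause, which produces exactly the quality switches the paper sets out to reduce.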
Peer-to-peer streaming technology has become one of the major Internet applications, as it offers the opportunity of broadcasting high-quality video content to a large number of peers at low cost. It is widely accepted that with efficient utilization of the peers' and server's upload capacities, peers can enjoy watching a high-bitrate video with minimal end-to-end delay. In this paper, we present a practical scheduling algorithm that works in the challenging condition where no spare capacity is available, i.e., it maximally utilizes the resources and broadcasts the maximum streaming rate. Each peer contacts only a small number of neighbours in the overlay network and autonomously subscribes to sub-streams according to a budget model, in such a way that the number of peers forwarding exactly one sub-stream is maximized. The hop-count delay is also taken into account to construct short-depth trees. Finally, we show through simulation that peers dynamically converge to an efficient overlay structure with a short hop-count delay. Moreover, the proposed scheme performs well in the homogeneous case and outperforms SplitStream in all simulated scenarios.
Switching, by Ravi Namboori, Babson University, USA
Ravi Namboori currently serves as vice president of Data Center Solutions at Unigen. Combining extensive IT experience and leadership skills, Ravi Namboori oversees the entire business unit, motivating and energizing his team to contribute to the strategic vision of the company.
Namboori brings experience from a number of Silicon Valley tech companies to Unigen. During his time as director of IT for Glu Mobile, he established the architecture and carried out the implementation of the company’s data center. Designed to be scalable and reliable, the center provided a high performance environment for Glu Mobile’s e-commerce and gaming business.
Earlier in his career, Ravi Namboori worked for Hewlett-Packard Company (HP) in the Netherlands as a network manager. In this role, he designed the LAN and call center systems for HP’s facility and carried out networking projects to client specifications.
In addition to Microsoft and Novell certifications, Namboori holds seven Cisco professional certifications including a teaching credential. He received his MBA from Babson University’s F. W. Olin Graduate School of Business and his undergraduate degree from DNR College.
Presentation showing how video can be transferred using ffmpeg and ffserver and played in mobile and desktop browsers.
HTTP: a protocol for transferring data on the Web.
Streaming: a method of transferring continuous data.
Description of Microsoft Silverlight technology.
Advantages over "standard streaming", download, and progressive download methods.
Silverlight session description and analysis using Wireshark.
Scraper sites are sites that copy content from other websites.
Web scraping is the method of collecting certain kinds of data from websites.
Also known as: screen scraping, site scraping, web harvesting, and web data extraction.
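A minimal scraping sketch using only Python's standard library: extract link targets from an HTML page. The HTML string stands in for a fetched page; a real scraper would download it first (e.g., with urllib) and must respect the site's terms of use.

```python
# Collect all href targets from anchor tags using the stdlib HTML parser.
from html.parser import HTMLParser

class LinkScraper(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

PAGE = '<html><body><a href="/a">A</a><p>text</p><a href="/b">B</a></body></html>'

scraper = LinkScraper()
scraper.feed(PAGE)
```

The same pattern, with different `handle_starttag` logic, covers most "certain kinds of data" extraction tasks before reaching for heavier third-party libraries.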
Slides for my Master Video session at Renaissance 2014. This session provided a high-level overview of some of AV Foundation's video playback and editing capabilities.
The demo app for this talk can be found at:
https://github.com/tapharmonic/AVFoundationEditor
Live streaming of video and subtitles with MPEG-DASH, by Cyril Concolato
This presentation was made at the MPEG meeting in Shanghai, China, in October 2012, related to the input contribution M26906. It gives the details about the demonstration made during the meeting. This demonstration showed the use of the Google Chrome browser to display synchronized video and subtitles, using the Media Source Extension draft specification and the WebVTT subtitle format. The video and DASH content were prepared using the GPAC MP4Box tool.
Welcome to a primer on back-propagation (of errors) as it applies to the training of neural networks. We answer the question: what is the contribution of the back-propagation technique?
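The contribution, in one sentence, is that applying the chain rule layer by layer yields every weight's gradient in a single backward pass. A tiny worked example, assuming a single sigmoid neuron with squared-error loss:

```python
# Forward and backward pass for one sigmoid neuron, L = 0.5 * (y - target)^2.
import math

def forward(w, b, x):
    z = w * x + b
    y = 1.0 / (1.0 + math.exp(-z))   # sigmoid activation
    return z, y

def backward(w, b, x, target):
    """Return dL/dw and dL/db via the chain rule."""
    _, y = forward(w, b, x)
    dL_dy = y - target               # derivative of the loss w.r.t. output
    dy_dz = y * (1.0 - y)            # derivative of the sigmoid
    dL_dz = dL_dy * dy_dz            # chain rule: propagate the error back
    return dL_dz * x, dL_dz          # dL/dw = dL/dz * dz/dw, dL/db = dL/dz
```

A numeric finite-difference check on `dL/dw` is a standard way to verify that such a backward pass is implemented correctly; in a multi-layer network the same `dL_dz` term is simply propagated through each preceding layer in turn.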
MPEG DASH – Tomorrow's Format Today by Nicolas Weil
Senior Solutions Architect, Akamai Technologies & Will Law, Chief Architect, Media Cloud Engineering, Akamai Technologies
As an open standard designed to help simplify video delivery across connected devices, MPEG-DASH is continuing to gain momentum in the OTT, broadcast and wireless industries. Join Akamai's DASH experts for a discussion on what differentiates the emerging standard from legacy formats along with a demonstration showing the ease of deploying DASH playback across devices. The panel will also highlight current deployments, offer a review of the industry and provide a three-year outlook.
Akamai Edge is the premier event for Internet innovators, tech professionals and online business pioneers who together are forging a Faster Forward World. At Edge, the architects, experts and implementers of the most innovative global online businesses gather face-to-face for an invaluable three days of sharing, learning and together pushing the limits of the Faster Forward World. Learn more at: http://www.akamai.com/edge
This presentation provides an overview of MPEG-DASH and future developments, namely common media application format and virtual reality/360-degree streaming.
This presentation is meant to cover the basics of what streaming media is. There is a definition given, a history, and how streaming media is used today, along with a video example.
Building A Streaming Apple TV App (CocoaConf DC, Sept 2016), by Chris Adamson
Apple TV offers a friendly SDK, full of familiar view controllers and Foundation classes, with everything an iOS developer needs to develop their own streaming channel. Except for… you know… the streaming part. In this session, we'll look at how Apple's HTTP Live Streaming video works -- from flat files or live sources -- and how to get it from your computer to a streaming server and then to an Apple TV. We'll also look at common challenges for building streaming channel apps, like serving metadata, protecting content, and supporting single sign-on.
My (quite boring) slides on what we needed to do in Janus to support multiple streams of the same type (e.g., 3 video streams) on the same PeerConnection.
This was a presentation to the IPTV working group run by the IDA in Singapore (chaired by A*Star / IR2). This was a public event while I worked at MSTV.
Slides for the presentation I did remotely at Open Source World, to talk about audio-only WebRTC applications, and what we've done in Janus to improve and cover the requirements so far.
Building A Streaming Apple TV App (CocoaConf San Jose, Nov 2016), by Chris Adamson
Apple TV offers a friendly SDK, full of familiar view controllers and Foundation classes, with everything an iOS developer needs to develop their own streaming channel. Except for… you know… the streaming part. In this session, we'll look at how Apple's HTTP Live Streaming video works -- from flat files or live sources -- and how to get it from your computer to a streaming server and then to an Apple TV. We'll also look at common challenges for building streaming channel apps, like serving metadata, protecting content, and supporting single sign-on.
Video Streaming: Broadcast quality on a shoestring budget, by netc2012
Kansas State University has been video streaming for over 10 years. Through changes in formats and technology, they have provided online video on campus and across the state. This session covers our newer tools, from video equipment to digital encoders and software.
Whatever Happened to Visual Novel Anime? (AWA/Youmacon 2018), by Chris Adamson
Ten years ago, visual novels were one of the most prominent sources for anime adaptations. Today, the flood has slowed to a trickle, and Steins;Gate 0 seems like the last gasp of a dying breed. This panel looks at what went right and wrong with VN anime, and whether they might ever make a comeback.
Whatever Happened to Visual Novel Anime? (JAFAX 2018), by Chris Adamson
Not long ago, adaptations of visual novels were a major inspiration for anime, with a dozen or more shows a year based on VNs. These include classics like Clannad, Fate/Stay Night, and Higurashi. Today, the flood has slowed to a trickle, with only six VN anime in 2017, all of them commercial flops. This panel will track the rise and fall of VN adaptations, the traits that make them good and bad for anime, and whether this year's Steins;Gate 0 represents a new hope for the genre or a last gasp.
Media Frameworks Versus Swift (Swift by Northwest, October 2017), by Chris Adamson
As much as we love Swift for developing our apps, playgrounds, and even on the server, there are some things for which Swift is not a good match. The media frameworks on iOS are a good example of this. Dropping into Core Audio can twist your Swift code so badly it’s hardly readable anymore. And there are parts of AV Foundation where using Swift is literally not allowed.
In this talk, we’ll show off some of these tricky scenarios, see what we can do to make things better, and think about what this means for the Swift language and its future prospects.
Fall Premieres: Media Frameworks in iOS 11, macOS 10.13, and tvOS 11 (CocoaCo..., by Chris Adamson
What’s Apple planning for its media frameworks in the next 12 months? What’s it doing with Apple TV, or the HTTP Live Streaming standard? We won’t know until the curtain drops on WWDC! In this talk, we’ll amass everything audio- and video-related that gets announced throughout the week, combine it with the solid base of frameworks already present in the Apple platforms, and figure out from there what we’re going to be playing with in 2018.
CocoaConf Chicago 2017: Media Frameworks and Swift: This Is Fine, by Chris Adamson
Swift is great for writing iOS and Mac apps, and its creators also mean for it to be used as a systems programming language. However, certain traits about Swift make it officially off-limits for use in some audio/video-processing scenarios. What's the deal, is it not fast enough or what? We'll look at what media apps can and can't do in Swift, and what you're supposed to do instead. We'll also look at strategies for knowing what responsibilities to dole out to Swift and to C, and how to make those parts of your code play nicely with each other.
(This is a longer version of a talk previously presented at Forward Swift 2017)
Forward Swift 2017: Media Frameworks and Swift: This Is Fine, by Chris Adamson
Swift is great for writing iOS and Mac apps, and its creators also mean for it to be used as a systems programming language. However, certain traits about Swift make it officially off-limits for use in some audio/video-processing scenarios. What's the deal, is it not fast enough or what? We'll look at what media apps can and can't do in Swift, and what you're supposed to do instead. We'll also look at strategies for knowing what responsibilities to dole out to Swift and to C, and how to make those parts of your code play nicely with each other.
Firebase: Totally Not Parse All Over Again (Unless It Is) (CocoaConf San Jose..., by Chris Adamson
With Facebook shutting down Parse, everybody knows to never again depend on a third party for their backend solution, right? Sure, and after you spend six months trying to write your own syncing service, how's that working? In 2016, Google has added a ton of features to Firebase, their popular backend-as-a-service solution. Firebase's primary offering is a realtime database in the cloud that syncs changes to and from multiple concurrent users, and their Swift-friendly iOS SDK makes it ideal for mobile use. In this session, you'll learn how to set up a Firebase backend and build an iOS app around it.
Firebase: Totally Not Parse All Over Again (Unless It Is), by Chris Adamson
With Facebook shutting down Parse, everybody knows to never again depend on a third party for their backend solution, right? Sure, and after you spend six months trying to write your own syncing service, how's that working? In 2016, Google has added a ton of features to Firebase, their popular backend-as-a-service solution. Firebase's primary offering is a realtime database in the cloud that syncs changes to and from multiple concurrent users, and their Swift-friendly iOS SDK makes it ideal for mobile use. In this session, you'll learn how to set up a Firebase backend and build an iOS app around it.
Video Killed the Rolex Star (CocoaConf San Jose, November 2015), by Chris Adamson
[updated from previous version to include Watch Connectivity, screenshots of WKInterfaceMovie]
watchOS 2.0 brings media functionality to Apple Watch, offering audio and video playback and audio capture. But lest you plan on writing Logic or Final Cut for the watch: what's available on the wrist has its limits, and you hit them quickly. In this session, we'll see what the WKInterfaceController offers us for miniature mobile media, and how we can get the benefits of AV Foundation and Core Audio by moving our movies, songs, and podcasts back and forth between the watch and the iPhone.
Video Killed the Rolex Star (CocoaConf Columbus, July 2015), by Chris Adamson
watchOS 2.0 brings media functionality to Apple Watch, offering audio and video playback and audio capture. But lest you plan on writing Logic or Final Cut for the watch: what's available on the wrist has its limits, and you hit them quickly. In this session, we'll see what the WKInterfaceController offers us for miniature mobile media, and how we can get the benefits of AV Foundation and Core Audio by moving our movies, songs, and podcasts back and forth between the watch and the iPhone.
Revenge of the 80s: Cut/Copy/Paste, Undo/Redo, and More Big Hits (CocoaConf C..., by Chris Adamson
When the first 128K Macs landed in 1984, it was the first time many of us could undo a mistake with just a keystroke, or exchange data between documents or applications with cut/copy/paste and the system clipboard. Fast forward 30 years and we all use this stuff… but do you know how to actually implement it? Especially on iOS, these everyday features are surprisingly absent from many developers' toolchests. In this session, we'll flash back to the era of Reagan, Rubik's Cubes, and Return of the Jedi, to see how these hot hits of the early 80s are represented in modern-day Cocoa.
Core Image: The Most Fun API You're Not Using, CocoaConf Atlanta, December 2014, by Chris Adamson
Graphics on iOS and OS X isn't just about stroking shapes and paths in Core Graphics and trying to figure out OpenGL. The Core Image framework gives you access to about 100 built-in filters, providing everything from photographic effects and color manipulation to face-finding and QR Code generation. It can leverage the power of the GPU to provide performance fast enough to perform complex effects work on real-time video capture. But even if you're not writing the next Final Cut Pro or Photoshop, it's easy to call in Core Image for simple tasks, like putting a blur in part of your UI for transitions or privacy reasons. In this session, we'll explore the many ways Core Image can make your app sizzle.
Stupid Video Tricks, CocoaConf Seattle 2014, by Chris Adamson
AV Foundation makes it reasonably straightforward to capture video from the camera and edit together a nice family video. This session is not about that stuff. This session is about the nooks and crannies where AV Foundation exposes what's behind the curtain. Instead of letting AVPlayer read our video files, we can grab the samples ourselves and mess with them. AVCaptureVideoPreviewLayer, meet the CGAffineTransform. And instead of dutifully passing our captured video frames to the preview layer and an output file, how about if we instead run them through a series of Core Image filters? Record your own screen? Oh yeah, we can AVAssetWriter that. With a few pointers, a little experimentation, and a healthy disregard for safe coding practices, Core Media and Core Video let you get away with some neat stuff.
Stupid Video Tricks, CocoaConf Las Vegas, by Chris Adamson
AV Foundation makes it reasonably straightforward to capture video from the camera and edit together a nice family video. This session is not about that stuff. This session is about the nooks and crannies where AV Foundation exposes what's behind the curtain. Instead of letting AVPlayer read our video files, we can grab the samples ourselves and mess with them. AVCaptureVideoPreviewLayer, meet the CGAffineTransform. And instead of dutifully passing our captured video frames to the preview layer and an output file, how about if we instead run them through a series of Core Image filters? Record your own screen? Oh yeah, we can AVAssetWriter that. With a few pointers, a little experimentation, and a healthy disregard for safe coding practices, Core Media and Core Video let you get away with some neat stuff.
Core Image: The Most Fun API You're Not Using (CocoaConf Columbus 2014), by Chris Adamson
Graphics on iOS and OS X isn't just about stroking shapes and paths in Core Graphics and trying to figure out OpenGL. The Core Image framework gives you access to about 100 built-in filters, providing everything from photographic effects and color manipulation to face-finding and QR Code generation. It can leverage the power of the GPU to provide performance fast enough to perform complex effects work on real-time video capture. But even if you're not writing the next Final Cut Pro or Photoshop, it's easy to call in Core Image for simple tasks, like putting a blur in part of your UI for transitions or privacy reasons. In this session, we'll explore the many ways Core Image can make your app sizzle.
Stupid Video Tricks (CocoaConf DC, March 2014)Chris Adamson
AV Foundation makes it reasonably straightforward to capture video from the camera and edit together a nice family video. This session is not about that stuff. This session is about the nooks and crannies where AV Foundation exposes what's behind the curtain. Instead of letting AVPlayer read our video files, we can grab the samples ourselves and mess with them. AVCaptureVideoPreviewLayer, meet the CGAffineTransform. And instead of dutifully passing our captured video frames to the preview layer and an output file, how about if we instead run them through a series of Core Image filters? Record your own screen? Oh yeah, we can AVAssetWriter that. With a few pointers, a little experimentation, and a healthy disregard for safe coding practices, Core Media and Core Video let you get away with some neat stuff.
AV Foundation makes it reasonably straightforward to capture video from the camera and edit together a nice family video. This session is not about that stuff. This session is about the nooks and crannies where AV Foundation exposes what's behind the curtain. Instead of letting AVPlayer read our video files, we can grab the samples ourselves and mess with them. AVCaptureVideoPreviewLayer, meet the CGAffineTransform. And instead of dutifully passing our captured video frames to the preview layer and an output file, how about if we instead run them through a series of Core Image filters? Record your own screen? Oh yeah, we can AVAssetWriter that. With a few pointers, a little experimentation, and a healthy disregard for safe coding practices, Core Media and Core Video let you get away with some neat stuff.
Get On The Audiobus (CocoaConf Atlanta, November 2013)Chris Adamson
Audiobus is an iOS app that allows other apps to work together as an audio-processing toolchain: play your MIDI keyboard into one app, run it through filters in other apps, and mix it in a third. All in real-time, foreground or background. That such a thing is possible on the locked down iOS platform is remarkable enough, but what's even more remarkable is that hundreds of audio apps have added Audiobus support in the few months since its debut, including Apple's own GarageBand. In this session, we'll take a look at the Audiobus SDK and see how to create inputs, outputs, and filters that can be managed by the Audiobus app to process audio in collaboration with other apps on the device.
Get On The Audiobus (CocoaConf Boston, October 2013)Chris Adamson
Audiobus is an iOS app that allows other apps to work together as an audio-processing toolchain: play your MIDI keyboard into one app, run it through filters in other apps, and mix it in a third. All in real-time, foreground or background. That such a thing is possible on the locked down iOS platform is remarkable enough, but what's even more remarkable is that hundreds of audio apps have added Audiobus support in the few months since its debut, including Apple's own GarageBand. In this session, we'll take a look at the Audiobus SDK and see how to create inputs, outputs, and filters that can be managed by the Audiobus app to process audio in collaboration with other apps on the device.
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation Introduction,
UI automation Sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
DevOps and Testing slides at DASA ConnectKari Kakkonen
My and Rik Marselis slides at 30.5.2024 DASA Connect conference. We discuss about what is testing, then what is agile testing and finally what is Testing in DevOps. Finally we had lovely workshop with the participants trying to find out different ways to think about quality and testing in different parts of the DevOps infinity loop.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdfPaige Cruz
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble….many organizations still relegate monitoring & observability as the purview of ops, infra and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party will share these foundational concepts to build on:
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Le nuove frontiere dell'AI nell'RPA con UiPath Autopilot™UiPathCommunity
In questo evento online gratuito, organizzato dalla Community Italiana di UiPath, potrai esplorare le nuove funzionalità di Autopilot, il tool che integra l'Intelligenza Artificiale nei processi di sviluppo e utilizzo delle Automazioni.
📕 Vedremo insieme alcuni esempi dell'utilizzo di Autopilot in diversi tool della Suite UiPath:
Autopilot per Studio Web
Autopilot per Studio
Autopilot per Apps
Clipboard AI
GenAI applicata alla Document Understanding
👨🏫👨💻 Speakers:
Stefano Negro, UiPath MVPx3, RPA Tech Lead @ BSP Consultant
Flavio Martinelli, UiPath MVP 2023, Technical Account Manager @UiPath
Andrei Tasca, RPA Solutions Team Lead @NTT Data
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Assuring Contact Center Experiences for Your Customers With ThousandEyes
Mobile Movies with HTTP Live Streaming (CocoaConf DC, March 2013)
1. Mobile Movies with HTTP Live Streaming
Chris Adamson • @invalidname
CocoaConf DC • March 23, 2013 • Herndon, VA
Livestreaming at http://ustream.tv/channel/invalidstream
Slides and code available on my blog: http://www.subfurther.com/blog
Monday, March 25, 13
20. What You'll Learn
• What streaming is (and isn't)
• Setting up HLS on the server
• Using HLS streams in iOS apps
• Real-world deployment
21. HLS: What It Is (and Isn't)
23. Broadcast Media
• Always live on some channel (a band of EM spectrum).
• Every client tuned to that channel sees the same thing, at the same time.
• One-way, one-to-many model.
24. Internet
• Generally one-to-one (host to host).
• Multicast IP is an exception, but is rare on the public Internet.
• Two-way communication over sockets.
• Routing can take many hops, via multiple transport media (wire, wifi, cellular, etc.).
26. Ye Olde Streaming
• Client makes a socket connection and keeps it open for the duration of the program.
• Server sends media at playback speed (plus buffering).
• Shoutcast: MP3 files served slowly over HTTP.
• Typically uses a special port number and special server software.
27. Streaming Problems
• Difficult and expensive to scale.
• Special port numbers routinely blocked by businesses, ISPs, firewalls, etc.
• Competing standards: Real Player, Windows Media, QuickTime (all with their own plugins).
• No wonder Flash won.
• Good luck holding a socket connection on cellular.
28. What If…
• We didn't need an always-on socket connection?
• We could just run over port 80?
• We could just adopt industry standards like H.264 and AAC instead of cooking up custom codecs?
29. HTTP Live Streaming
• Serves media as a series of short flat files, via HTTP, usually on port 80.
• Any web server will do.
• Client software reassembles the data into a continuous media stream.
• Spec does not specify contents, but Apple uses H.264 and AAC, just like all their media apps.
30. Serving up HLS
• Client URL is an .m3u8 playlist file
• Playlist points to the media segment files
33. The HLS playlist
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-PLAYLIST-TYPE:VOD
#EXTINF:9.975,
fileSequence0.ts
#EXTINF:9.975,
fileSequence1.ts
#EXTINF:9.975,
fileSequence2.ts
#EXTINF:9.9767,
fileSequence3.ts
#EXTINF:9.975,
[...]
#EXT-X-ENDLIST
• Format: .m3u8, just a list of files to play
• Metadata tags describe the contents
• Each segment file is preceded by metadata (e.g., duration)
• If no end tag, client refreshes periodically
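The playlist format above is simple enough to pull apart by hand. As a quick illustration (not part of the deck), here is a minimal Python sketch that parses the tags and segment entries shown on the slide; real .m3u8 files carry more tags than this handles:

```python
# Minimal sketch: parse an HLS media playlist like the one on the slide.
# Only the tags shown in the deck are handled; this is illustrative, not
# a complete .m3u8 parser.

def parse_media_playlist(text):
    """Return (tags, segments), where segments is a list of (duration, uri)."""
    tags = {}
    segments = []
    pending_duration = None
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith("#EXTINF:"):
            # "#EXTINF:9.975," -> segment duration in seconds
            pending_duration = float(line[len("#EXTINF:"):].rstrip(","))
        elif line.startswith("#EXT-X-"):
            key, _, value = line[1:].partition(":")
            tags[key] = value
        elif not line.startswith("#"):
            # A non-comment line is a segment URI, owned by the last #EXTINF
            segments.append((pending_duration, line))
            pending_duration = None
    return tags, segments

playlist = """#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-PLAYLIST-TYPE:VOD
#EXTINF:9.975,
fileSequence0.ts
#EXTINF:9.975,
fileSequence1.ts
#EXT-X-ENDLIST
"""

tags, segments = parse_media_playlist(playlist)
print(tags["EXT-X-TARGETDURATION"])   # -> 10
print(segments[0])                    # -> (9.975, 'fileSequence0.ts')
print("EXT-X-ENDLIST" in tags)        # -> True (VOD, not live)
```

The presence or absence of EXT-X-ENDLIST is exactly the VOD-versus-live distinction the later slides rely on.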
38. How is this better than a flat .m4v file?
• Streams can provide variants for different bandwidths (as we'll see…)
• Segments make it easier to scrub into the video
• Streams can be live video
39. The “Live” in HLS
• A playlist is a live stream if it doesn’t have an #EXT-X-ENDLIST tag
• Live playlist will generally just contain the last minute or so of segments
• Client will refresh the playlist every minute or so, download whatever segments it doesn’t already have, and queue them locally
• “Live” isn’t really “live” (often a minute behind)
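The refresh behavior slide 39 describes amounts to a small bookkeeping rule: on each playlist reload, line the entries up by EXT-X-MEDIA-SEQUENCE and download only the segments you haven't queued yet. A sketch (mine, not from the talk):

```python
# Sketch of the live-playlist refresh logic: given the sequence number of
# the last segment already queued and a freshly reloaded playlist (its
# EXT-X-MEDIA-SEQUENCE value plus its segment URIs, in order), return
# the (sequence, uri) pairs the client still needs to fetch.

def new_segments(last_seen_seq, media_sequence, uris):
    needed = []
    for offset, uri in enumerate(uris):
        seq = media_sequence + offset
        if seq > last_seen_seq:
            needed.append((seq, uri))
    return needed

# First poll: the sliding window starts at sequence 7 with three segments.
fetch1 = new_segments(-1, 7, ["seg7.ts", "seg8.ts", "seg9.ts"])
# A minute later the window has slid forward: only two segments are new.
fetch2 = new_segments(9, 9, ["seg9.ts", "seg10.ts", "seg11.ts"])
print(fetch1)  # all three segments
print(fetch2)  # only seg10 and seg11
```

Because the client is always a playlist-refresh interval plus a buffer behind the encoder, "live" lags by a minute or so, as the slide notes.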
46. mediafilesegmenter
• Splits an A/V file into segment files, creates the .m3u8 playlist
• Source must be .mov or .m4v with H.264 video, AAC audio
• Output segments will be MPEG-2 Transport Stream (.ts) files, or .aac if audio-only
• Segment paths are relative; use -b to prepend a URL stub
47. Technical Note TN2224
The following audio and video formats are supported:
• Video: H.264 Baseline Profile Level 3.0 (iPhone/iPod Touch), Main Profile Level 3.1 (iPad 1, 2)
• Audio: HE-AAC or AAC-LC up to 48 kHz, stereo audio OR MP3 (MPEG-1 Audio Layer 3) 8 kHz to 48 kHz, stereo audio
Note: iPhone 3G supports Baseline Profile Level 3.1. If your app runs on older iPhones, however, you should use H.264 Baseline Profile 3.0 for compatibility.
48. Yuna:HTTP Live Streaming tests cadamson$ mediafilesegmenter -f basic source/IMG_0251.MOV
Jun 24 2012 10:01:24.203: Using floating point is not backward compatible to iOS 4.1 or earlier devices
Jun 24 2012 10:01:24.204: Processing file /Users/cadamson/Documents/HTTP Live Streaming tests/source/IMG_0251.MOV
Jun 24 2012 10:01:24.338: Finalized /Users/cadamson/Documents/HTTP Live Streaming tests/basic/fileSequence0.ts
Jun 24 2012 10:01:24.375: segment bitrate 3.78028e+06 is new max
Jun 24 2012 10:01:24.468: Finalized /Users/cadamson/Documents/HTTP Live Streaming tests/basic/fileSequence1.ts
Jun 24 2012 10:01:24.554: Finalized /Users/cadamson/Documents/HTTP Live Streaming tests/basic/fileSequence2.ts
Jun 24 2012 10:01:24.631: Finalized /Users/cadamson/Documents/HTTP Live Streaming tests/basic/fileSequence3.ts
Jun 24 2012 10:01:24.717: Finalized /Users/cadamson/Documents/HTTP Live Streaming tests/basic/fileSequence4.ts
50. Variant Playlists
• One bitrate does not fit all: Mac on Ethernet versus iPhone on EDGE.
• Solution: encode your video at multiple bitrates, offer metadata in the playlist about what's available, and let the client figure out which to use.
• HLS clients automatically switch to the best variant for current network conditions, switching on the fly.
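The selection rule behind that automatic switching is worth making concrete. A hedged sketch (not Apple's actual heuristic, which also weighs buffer state and history): pick the highest-bitrate variant that fits under the currently measured bandwidth, falling back to the lowest variant when nothing fits. The variant URLs below are made up for illustration.

```python
# Sketch of bitrate-adaptive variant selection. variants is a list of
# (bitrate_bps, playlist_url) pairs in any order; measured_bps is the
# client's current throughput estimate.

def pick_variant(variants, measured_bps):
    candidates = sorted(variants)   # ascending by bitrate
    best = candidates[0]            # worst case: lowest-bitrate variant
    for bitrate, url in candidates:
        if bitrate <= measured_bps:
            best = (bitrate, url)   # keep the highest one that still fits
    return best

# Hypothetical variant set, like one a variant playlist might advertise.
variants = [(1_500_000, "hi/prog.m3u8"),
            (400_000, "cell/prog.m3u8"),
            (800_000, "wifi/prog.m3u8")]

print(pick_variant(variants, 1_000_000))  # -> (800000, 'wifi/prog.m3u8')
print(pick_variant(variants, 100_000))    # -> (400000, 'cell/prog.m3u8')
```

The client re-runs this decision as its bandwidth estimate changes, which is what produces the on-the-fly switching the slide mentions.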
51. variantplaylistcreator
• Creates a playlist that itself points to playlists created with mediafilesegmenter.
• Each entry contains metadata describing the bitrate and encoding of the variant.
• Tool takes argument pairs: file or URL of a variant .m3u8, and the metadata .plist created with mediafilesegmenter's -I flag
• First entry in the variant playlist is the default; client will try this one first
57. That's Great, but…
How do we keep people from stealing our stream?
58. Encryption
• HLS encrypts files, not transport.
• Easy to scale: still serving flat files, but now they're useless without decryption keys.
• Serving the keys still needs to be secure.
• Necessary, but not sufficient, for DRM.
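In an encrypted playlist, the pointer to the decryption key travels in an #EXT-X-KEY tag; the key request is exactly what your authentication scheme has to protect. As a small illustration (not from the talk; the key URI below is hypothetical), here is how a client might pull that tag apart before fetching the key and AES-decrypting segments:

```python
# Sketch: parse the attribute list of an #EXT-X-KEY tag, e.g.
#   #EXT-X-KEY:METHOD=AES-128,URI="...",IV=0x...
# The client fetches the key from URI (securing that request is the
# server's job) and uses key + IV to decrypt each segment.
import re

def parse_key_tag(line):
    attrs = {}
    body = line[len("#EXT-X-KEY:"):]
    # Attributes are KEY=value pairs; values may be quoted strings.
    for match in re.finditer(r'([A-Z-]+)=("[^"]*"|[^,]*)', body):
        key, value = match.group(1), match.group(2).strip('"')
        attrs[key] = value
    return attrs

# Hypothetical tag, shaped like what mediafilesegmenter -k emits.
tag = '#EXT-X-KEY:METHOD=AES-128,URI="https://example.com/keys/key0.bin",IV=0x9876'
attrs = parse_key_tag(tag)
print(attrs["METHOD"])  # -> AES-128
print(attrs["URI"])     # -> https://example.com/keys/key0.bin
```

Note that nothing here protects the key itself; as the next slides say, that is where HTTP auth or an HTTPS script comes in.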
59. Encrypting a playlist
Yuna:HTTP Live Streaming tests cadamson$ mediafilesegmenter -I -k keys -f encrypted/cellular source/IMG_0426_Cellular.m4v
Jun 24 2012 18:59:47.115: Using new key/iv rotation period; this is not backward compatible to iOS 3.1.* or earlier devices. Use the "-encrypt-iv=sequence" option for compatibility with those devices.
Jun 24 2012 18:59:47.115: Using floating point is not backward compatible to iOS 4.1 or earlier devices
Jun 24 2012 18:59:47.115: Processing file /Users/cadamson/Documents/HTTP Live Streaming tests/source/IMG_0426_Cellular.m4v
Jun 24 2012 18:59:47.152: changing IV
Jun 24 2012 18:59:47.160: Finalized /Users/cadamson/Documents/HTTP Live Streaming tests/encrypted/cellular/fileSequence0.ts
Jun 24 2012 18:59:47.160: segment bitrate 271257 is new max
66. Captions
• HLS supports CEA-608 closed captions in the MPEG-2 Transport Stream
• If using the file segmenter, add a closed-caption track (type 'clcp') to your source QuickTime .mov
• Or use Compressor and Sonic Scenarist
69. Opening an HLS stream
• Provide the .m3u8 URL to MPMoviePlayerController or AVPlayer
• Add the movie view or layer to your UI, customizing size or scaling if necessary
70. Create an MPMoviePlayerController
// create new movie player
self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:streamURL];
[self.moviePlayer prepareToPlay];
• This is the same as playing a local file or any other URL
71. Add it to your UI
[self.moviePlayer.view setFrame:self.movieContainerView.bounds];
[self.movieContainerView addSubview:self.moviePlayer.view];
self.moviePlayer.scalingMode = MPMovieScalingModeFill;
• Can inspect the moviePlayer's naturalSize, though it may change during playback (listen for MPMovieNaturalSizeAvailableNotification), or just setFullscreen:animated:
75. Accessing Encrypted Streams
• Media Player and AV Foundation can use NSURLCredentials that you've provided
• Place credentials in NSURLCredentialStorage
• Server can provide the keys securely(*) with HTTP Basic or Digest authentication, an HTTPS script, etc.
* - For various values of "secure"
79. Live Streaming
• mediastreamsegmenter mostly works like the file version, but takes its input from a UDP stream or a Unix pipe
• Only difference is that the .m3u8 file doesn't have an EXT-X-ENDLIST tag, so the client reloads periodically to fetch new segments
• How the heck do you create a UDP A/V stream?
80. You Don't
• None of Apple's tools create the required MPEG-2 stream
• This is a "third party opportunity"
• Which raises the question… buy or build?
81. Streaming in the Real World
It's not all about iPhones…
82. Streaming Clients
• Mobile devices: iPhone, iPad, iPod Touch… plus Android, Windows Mobile, etc.
• Mac and Windows PCs
• Game consoles
• Over-the-top (OTT) boxes: Apple TV, Roku
84. Playing HLS on Roku
Roku "channels" are programmed in the "BrightScript" programming language:
port = CreateObject("roMessagePort")
screen = CreateObject("roVideoScreen")
screen.SetMessagePort(port)
screen.SetPositionNotificationPeriod(30)
screen.SetContent(episode)
screen.Show()
episode is an "associative array" with key/value pairs for URLs, formats ("hls"), bitrates, etc.
85. Let's Get Practical
• HLS is the preferred format for Roku
• What other devices do you get for free?
• What other devices do you have to be on?
• How do you encode and deliver to the devices you need?
86. HLS Alternatives
• Flash still rules in the desktop/browser space, thanks in part to Mozilla's obstinacy about H.264 in <video> (irony alert: H.264 is the de facto standard for Flash video)
• Adobe Dynamic Streaming and Microsoft Smooth Streaming are highly similar to HLS: bitrate-adaptive streams over HTTP
87. MPEG-DASH
• Attempt at a standardized approach to HTTP adaptive-bitrate streaming. ISO/IEC 23009-1.
http://xkcd.com/927/
88. Emerging Consensus
• Flash for PCs
• HTTP Live Streaming for iOS
• Plus whatever other devices you need / are able to support
89. Real-World HLS
• Can you competently encode all your media at all the variant bitrates you need?
• Do you have a way to QC all your streams?
• Can you handle the server load?
90. Build or Buy: Services
• Provide hosting, live transcoding, bandwidth
• All-in-one: UStream, LiveStream, Justin.tv / Twitch.tv (all of which have iOS apps)
• May provide broadcast tools (Flash applet, Telestream Wirecast, etc.)
• Often free with ads; you pay to go ad-free, embed on your site, etc.
91. invalidstream
http://www.ustream.tv/channel/invalidstream
92. Production Demo
http://www.telestream.net/wirecast/
93. Content Delivery Networks
http://en.wikipedia.org/wiki/Content_delivery_network
94. Content Delivery Networks
• CDNs host your media on edge servers that are closer to your clients. Less strain on your servers and the backbones.
• Examples: Akamai, Limelight, EdgeCast
• Big media companies may have their own CDN
• Most already know how to do HLS
95. Buy or Build: Encoders
[Images of encoder options, priced at $25,000 and $4,000]
96. Buy or Build: Bandwidth
• Each HLS client will consume up to 1 GB/hour, depending on variant bitrates, client bandwidth, etc.
• Many CDNs charge around $0.20/GB.
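Those two numbers make the budgeting arithmetic easy to run yourself. A quick back-of-the-envelope helper using the slide's figures (1 GB per client-hour, roughly $0.20/GB at many CDNs):

```python
# Back-of-the-envelope bandwidth cost from slide 96's numbers.
# The defaults (1 GB per viewer-hour, $0.20/GB) come from the slide;
# substitute your own measured averages and CDN rates.

def stream_cost(viewers, hours, gb_per_viewer_hour=1.0, price_per_gb=0.20):
    gb = viewers * hours * gb_per_viewer_hour
    return gb * price_per_gb

print(stream_cost(100, 1))   # 100 viewers for an hour: about $20
print(stream_cost(1000, 2))  # 1,000 viewers for two hours: about $400
```

The per-viewer-hour figure is an upper bound; clients stuck on low-bitrate variants consume far less, which is why real bills depend on your variant mix.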
97. Bandwidth Costs
[Chart: prices from ScaleEngine, UStream, and LiveStream as of October 2012]
101. Self-Hosted Costs
[Chart: prices from MacMiniColo and Amazon EC2 as of March 2013]
105. Takeaways
• HLS is a very practical streaming solution
• Only part of the picture if you're multi-platform
• Encoding and serving correctly requires some care and expertise, and a lot of money
• Client-side software requirements are fairly simple
106. Q&A
Slides and code will be available on my blog:
http://www.subfurther.com/blog
http://www.slideshare.net/invalidname
@invalidname