This document discusses building a streaming video app for Apple TV. It covers topics like codecs, containers, livestreaming, adaptive streaming using HTTP Live Streaming (HLS), creating HLS streams with tools like mediafilesegmenter, and securing streams. HLS breaks video into small file segments delivered over HTTP, making streaming scalable and suitable for mobile. Variant playlists allow encoding at multiple bitrates to adapt to network conditions.
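The variant-playlist mechanism summarized above can be illustrated with a minimal HLS master playlist (the filenames, bitrates, and resolutions here are hypothetical, not taken from the talk):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
mid/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080
high/prog_index.m3u8
```

A client such as AVPlayer fetches this master playlist, picks the variant playlist that best matches measured network throughput, and switches between variants as conditions change; each variant playlist in turn lists the small media segments that tools like mediafilesegmenter produce.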
Building A Streaming Apple TV App (CocoaConf DC, Sept 2016) - Chris Adamson
Apple TV offers a friendly SDK, full of familiar view controllers and Foundation classes, with everything an iOS developer needs to develop their own streaming channel. Except for… you know… the streaming part. In this session, we'll look at how Apple's HTTP Live Streaming video works -- from flat files or live sources -- and how to get it from your computer to a streaming server and then to an Apple TV. We'll also look at common challenges for building streaming channel apps, like serving metadata, protecting content, and supporting single sign-on.
Video Killed the Rolex Star (CocoaConf San Jose, November 2015) - Chris Adamson
[updated from previous version to include Watch Connectivity, screenshots of WKInterfaceMovie]
watchOS 2.0 brings media functionality to Apple Watch, offering audio and video playback and audio capture. But lest you plan on writing Logic or Final Cut for the watch: what's available on the wrist has its limits, and you hit them quickly. In this session, we'll see what the WKInterfaceController offers us for miniature mobile media, and how we can get the benefits of AV Foundation and Core Audio by moving our movies, songs, and podcasts back and forth between the watch and the iPhone.
Stupid Video Tricks, CocoaConf Seattle 2014 - Chris Adamson
AV Foundation makes it reasonably straightforward to capture video from the camera and edit together a nice family video. This session is not about that stuff. This session is about the nooks and crannies where AV Foundation exposes what's behind the curtain. Instead of letting AVPlayer read our video files, we can grab the samples ourselves and mess with them. AVCaptureVideoPreviewLayer, meet the CGAffineTransform. And instead of dutifully passing our captured video frames to the preview layer and an output file, how about if we instead run them through a series of Core Image filters? Record your own screen? Oh yeah, we can AVAssetWriter that. With a few pointers, a little experimentation, and a healthy disregard for safe coding practices, Core Media and Core Video let you get away with some neat stuff.
Video Killed the Rolex Star (CocoaConf Columbus, July 2015) - Chris Adamson
watchOS 2.0 brings media functionality to Apple Watch, offering audio and video playback and audio capture. But lest you plan on writing Logic or Final Cut for the watch: what's available on the wrist has its limits, and you hit them quickly. In this session, we'll see what the WKInterfaceController offers us for miniature mobile media, and how we can get the benefits of AV Foundation and Core Audio by moving our movies, songs, and podcasts back and forth between the watch and the iPhone.
Get On The Audiobus (CocoaConf Atlanta, November 2013) - Chris Adamson
Audiobus is an iOS app that allows other apps to work together as an audio-processing toolchain: play your MIDI keyboard into one app, run it through filters in other apps, and mix it in a third. All in real-time, foreground or background. That such a thing is possible on the locked down iOS platform is remarkable enough, but what's even more remarkable is that hundreds of audio apps have added Audiobus support in the few months since its debut, including Apple's own GarageBand. In this session, we'll take a look at the Audiobus SDK and see how to create inputs, outputs, and filters that can be managed by the Audiobus app to process audio in collaboration with other apps on the device.
Get On The Audiobus (CocoaConf Boston, October 2013) - Chris Adamson
Audiobus is an iOS app that allows other apps to work together as an audio-processing toolchain: play your MIDI keyboard into one app, run it through filters in other apps, and mix it in a third. All in real-time, foreground or background. That such a thing is possible on the locked down iOS platform is remarkable enough, but what's even more remarkable is that hundreds of audio apps have added Audiobus support in the few months since its debut, including Apple's own GarageBand. In this session, we'll take a look at the Audiobus SDK and see how to create inputs, outputs, and filters that can be managed by the Audiobus app to process audio in collaboration with other apps on the device.
Stupid Video Tricks (CocoaConf DC, March 2014) - Chris Adamson
AV Foundation makes it reasonably straightforward to capture video from the camera and edit together a nice family video. This session is not about that stuff. This session is about the nooks and crannies where AV Foundation exposes what's behind the curtain. Instead of letting AVPlayer read our video files, we can grab the samples ourselves and mess with them. AVCaptureVideoPreviewLayer, meet the CGAffineTransform. And instead of dutifully passing our captured video frames to the preview layer and an output file, how about if we instead run them through a series of Core Image filters? Record your own screen? Oh yeah, we can AVAssetWriter that. With a few pointers, a little experimentation, and a healthy disregard for safe coding practices, Core Media and Core Video let you get away with some neat stuff.
Stupid Video Tricks, CocoaConf Las Vegas - Chris Adamson
AV Foundation makes it reasonably straightforward to capture video from the camera and edit together a nice family video. This session is not about that stuff. This session is about the nooks and crannies where AV Foundation exposes what's behind the curtain. Instead of letting AVPlayer read our video files, we can grab the samples ourselves and mess with them. AVCaptureVideoPreviewLayer, meet the CGAffineTransform. And instead of dutifully passing our captured video frames to the preview layer and an output file, how about if we instead run them through a series of Core Image filters? Record your own screen? Oh yeah, we can AVAssetWriter that. With a few pointers, a little experimentation, and a healthy disregard for safe coding practices, Core Media and Core Video let you get away with some neat stuff.
AV Foundation moves to center stage as the essential media framework on the device, offering support for playing, capturing, and even editing audio and video. Borrowing some of the core ideas from the Mac's QuickTime, while adding many new concepts of its own, AV Foundation offers extraordinary capabilities for application programmers. This talk will offer a high-level overview of what's in AV Foundation, and a taste of what it can do.
Managing Eclipse Preferences for Teams (EclipseCon 2011) - Netcetera
Netcetera senior software engineer Michael Pellaton describes the difficulty of managing consistent Eclipse preferences for entire development teams and presents tools that have been developed to improve the situation.
AV Foundation has grown from a simple audio player quietly added in iPhone OS 2.2 to an extraordinarily ambitious media-creation framework in iOS 4. The latest version provides highly-customizable audio and video capture, deep support for editing and Core Animation-based effects, and export. It is far more comprehensive than nearly any other platform's media framework, mobile or desktop. As a developer, this gives you incredible power… and one heck of a learning curve. In this session, we will take a ground-up look at AV Foundation, starting with its core concepts and patterns, and moving through its most practical and powerful capabilities. Along the way, we'll see how AV Foundation works with iOS' other media APIs -- Core Audio, Media Player, and Core Media -- and how it aggressively uses new iOS 4 programming paradigms like blocks and Grand Central Dispatch. Attendees will come away understanding the straight-arrow paths to AV Foundation's most important features like capture, editing, and export, and how its pieces might be combined in interesting ways to create even more powerful media applications.
Slides for my Master Video session at Renaissance 2014. This session provided a high-level overview of some of AV Foundation's video playback and editing capabilities.
The demo app for this talk can be found at:
https://github.com/tapharmonic/AVFoundationEditor
Some of my favourite bits of AVFoundation. Topics include capture, composition, a custom player and scrubber interface, synchronized CAAnimations, and real-time VFX.
JavaOne 2015: How I Rediscovered My Coding Mojo by Building an IoT/Robotics ... - Mark West
Is your project dragging you down? Are you stuck with the same old technologies? Are you bored with coding? If you answer “yes” to any of these questions, you may have lost your coding mojo—just like this session’s speaker did a few years back. Come hear how he learned new technologies and rediscovered his coding mojo by building an IoT/robotics prototype: a voice-controlled robot. Along the way, you’ll hear about HTML5 speech recognition, controlling hardware with Node.js and Johnny-Five, using WebSocket and MQTT for communication between components, and finally how you can easily combine the Raspberry Pi and Arduino platforms to gain ultimate power over your own projects.
AwesomeBox TV gathers additional information about users' videos:
- Cast, plot, and trailer for movies
- Subtitles for movies and TV shows
- Album and lyrics for music videos
Build HA Asterisk on Microsoft Azure using DRBD/Heartbeat - Sanjay Willie
This was presented during Microsoft Azure's BootCamp on April 25 2015 at Microsoft Malaysia. This particular session was about using OSS Asterisk on Azure with HA capabilities.
Firebase: Totally Not Parse All Over Again (Unless It Is) (CocoaConf San Jose... - Chris Adamson
With Facebook shutting down Parse, everybody knows to never again depend on a third party for their backend solution, right? Sure, and after you spend six months trying to write your own syncing service, how's that working? In 2016, Google has added a ton of features to Firebase, their popular backend-as-a-service solution. Firebase's primary offering is a realtime database in the cloud that syncs changes to and from multiple concurrent users, and their Swift-friendly iOS SDK makes it ideal for mobile use. In this session, you'll learn how to set up a Firebase backend and build an iOS app around it.
Firebase: Totally Not Parse All Over Again (Unless It Is) - Chris Adamson
With Facebook shutting down Parse, everybody knows to never again depend on a third party for their backend solution, right? Sure, and after you spend six months trying to write your own syncing service, how's that working? In 2016, Google has added a ton of features to Firebase, their popular backend-as-a-service solution. Firebase's primary offering is a realtime database in the cloud that syncs changes to and from multiple concurrent users, and their Swift-friendly iOS SDK makes it ideal for mobile use. In this session, you'll learn how to set up a Firebase backend and build an iOS app around it.
Revenge of the 80s: Cut/Copy/Paste, Undo/Redo, and More Big Hits (CocoaConf C... - Chris Adamson
When the first 128K Macs landed in 1984, it was the first time many of us could undo a mistake with just a keystroke, or exchange data between documents or applications with cut/copy/paste and the system clipboard. Fast forward 30 years and we all use this stuff… but do you know how to actually implement it? Especially on iOS, these everyday features are surprisingly absent from many developers' toolchests. In this session, we'll flash back to the era of Reagan, Rubik's Cubes, and Return of the Jedi, to see how these hot hits of the early '80s are represented in modern-day Cocoa.
Core Image: The Most Fun API You're Not Using (CocoaConf Columbus 2014) - Chris Adamson
Graphics on iOS and OS X isn't just about stroking shapes and paths in Core Graphics and trying to figure out OpenGL. The Core Image framework gives you access to about 100 built-in filters, providing everything from photographic effects and color manipulation to face-finding and QR Code generation. It can leverage the power of the GPU to provide performance fast enough to perform complex effects work on real-time video capture. But even if you're not writing the next Final Cut Pro or Photoshop, it's easy to call in Core Image for simple tasks, like putting a blur in part of your UI for transitions or privacy reasons. In this session, we'll explore the many ways Core Image can make your app sizzle.
Core Image: The Most Fun API You're Not Using, CocoaConf Atlanta, December 2014 - Chris Adamson
Graphics on iOS and OS X isn't just about stroking shapes and paths in Core Graphics and trying to figure out OpenGL. The Core Image framework gives you access to about 100 built-in filters, providing everything from photographic effects and color manipulation to face-finding and QR Code generation. It can leverage the power of the GPU to provide performance fast enough to perform complex effects work on real-time video capture. But even if you're not writing the next Final Cut Pro or Photoshop, it's easy to call in Core Image for simple tasks, like putting a blur in part of your UI for transitions or privacy reasons. In this session, we'll explore the many ways Core Image can make your app sizzle.
Forward Swift 2017: Media Frameworks and Swift: This Is Fine - Chris Adamson
Swift is great for writing iOS and Mac apps, and its creators also mean for it to be used as a systems programming language. However, certain traits about Swift make it officially off-limits for use in some audio/video-processing scenarios. What's the deal, is it not fast enough or what? We'll look at what media apps can and can't do in Swift, and what you're supposed to do instead. We'll also look at strategies for knowing what responsibilities to dole out to Swift and to C, and how to make those parts of your code play nicely with each other.
This presentation is devoted to the architecture of streaming services, special features of adaptive streaming, benefits and disadvantages of various streaming technologies and specific issues of media streaming apps development.
This presentation by Nazariy Mamrokha, GlobalLogic expert, was delivered at GlobalLogic Lviv C++ TechTalk on September 15, 2016. Learn more here: https://www.globallogic.com/ua/gl_news/globallogic-lviv-c-techtalk-summary/
Fall Premieres: Media Frameworks in iOS 11, macOS 10.13, and tvOS 11 (CocoaCo... - Chris Adamson
What’s Apple planning for its media frameworks in the next 12 months? What’s it doing with Apple TV, or the HTTP Live Streaming standard? We won’t know until the curtain drops on WWDC! In this talk, we’ll amass everything audio- and video-related that gets announced throughout the week, combine it with the solid base of frameworks already present in the Apple platforms, and figure out from there what we’re going to be playing with in 2018.
AWS re:Invent 2016: Accelerating the Transition to Broadcast and OTT Infrastr... - Amazon Web Services
In this session, we show how to seamlessly transition VOD, live, and other advanced media workflows from on-premises deployments to the cloud. Cinépolis will provide an overview of their transcoding solution on AWS and how they have seamlessly expanded the solution, increasing their customer reach. We'll show real-world examples of the API calls used to configure and control all elements of the workflow, including compression and origination, and how standard AWS services can be media-optimized with Elemental Technologies to form a robust live solution.
Glitch-Free A/V Encoding (CocoaConf Boston, October 2013) - Chris Adamson
The iPhone is the best iPod Apple's ever made, and the iPad has replaced the TV for many users. And while developers can use documentation and books to master the media frameworks (AV Foundation, Core Audio, and the rest), there's nothing in Xcode that will keep your audio from dropping out, fix artifacting on video with a lot of motion, or properly balance performance on the most-capable new Retina devices with backwards-compatibility with older ones. This session offers a ground-level intro to what's actually in your iTunes songs and streaming videos, and how to best encode them for the realities of iOS devices, their storage capacities and the networks they live on. We'll shoot, compress, and stream, all from a MacBook Air, and take a close look and listen to the results.
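As a back-of-the-envelope illustration of the encoding trade-offs this abstract alludes to (the bitrates and segment duration below are hypothetical, not from the session), segment size scales directly with bitrate and duration, which is what ties encoding choices to storage and network realities:

```python
def segment_size_bytes(bitrate_bps: int, duration_s: float) -> int:
    """Approximate size of one streamed segment: bits per second times
    seconds, divided by 8 bits per byte (ignores container overhead)."""
    return int(bitrate_bps * duration_s / 8)

# A 6-second segment at a 6 Mbps high-quality rung:
print(segment_size_bytes(6_000_000, 6))  # 4500000 bytes, ~4.5 MB
# The same segment at an 800 kbps cellular-friendly rung:
print(segment_size_bytes(800_000, 6))    # 600000 bytes, ~600 KB
```

The roughly 7.5x gap between those two numbers is why adaptive streaming encodes the same content at several bitrates rather than forcing one encode on every network.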
On-demand & Live Streaming with Amazon CloudFront in the Post-PC World (MED30... - Amazon Web Services
Learn how AWS customers are using Amazon CloudFront to deliver their video content to users over HTTP(S) using a number of modern streaming protocols such as Smooth Streaming, HLS, DASH, etc. You also learn about options for end-to-end security of your video content—through both encryption and preventing access from unauthorized users based on their location. Finally, we demonstrate how easy it is to use CloudFront to deliver both your on-demand and live video to a global audience with great performance.
Description of Microsoft Silverlight technology.
Advantages over "standard streaming", download, and progressive download methods.
Silverlight session description and analysis using Wireshark.
Building a Live/VOD Service with AWS Media Services That Even Novice Developers Can Follow Along – Ryunsik Hyun, AWS Solutions Architect :: A... - Amazon Web Services Korea
As COVID-19 drags on, daily life is changing in many ways, and contactless services are becoming the norm. Using a demo of the AWS Elemental media services, this session shows how to efficiently deliver Live/VOD content to a global audience: when you want to produce, process, and deliver content quickly and easily, you can hand off heavy lifting such as hardware to AWS and focus solely on delighting your viewers.
Windows Azure Media Services June 2013 Update - Mingfei Yan
This presentation covers the Windows Azure Media Services June 2013 update. It showcases dynamic packaging, the MPEG-DASH release, and a Live Streaming sneak peek. You can view the session video here: https://channel9.msdn.com/Events/Build/2013/3-549
Craft 2019 - "The Upside Down" Of The Web - Video Technologies - Máté Nádasdi
Video technologies are "The Upside Down" of the web for sure. Being a frontend engineer writing HTML5 video players, WebRTC broadcast clients, supporting 360/VR videos, or maybe writing interactive movies like Bandersnatch is like living in a parallel universe where everything and nothing is the same. Working with video is an exciting combination of all the trendy stuff out there like new Web APIs, ByteArrays, Workers, WebRTC, WebGL, etc.
In this talk, Máté would like to share important insights into video-specific frontend engineering today. The purpose of sharing this adventure is to give you some sense of this universe, to explain how a web video player works, to talk about the key layers and challenges, and to point out that every frontend engineer has the power in their hands to utilize this knowledge to boost the user experience in any kind of product.
TV is changing in 2017 ! Step into the future of Broadcast (www.tecsys.tv)Rony Weinfeld
We invite you to step into the future of Broadcast. Launch your TV channel directly from the cloud within minutes without any CAEPX investment (Hardware, space, power consumption, broadcast satellite / radio or IP links, maintenance, technical staffs etc). All on a flexible System as a service basis (OPEX) without commitment. More information and free demo registration is available on our web site: http://www.tecsys.tv
Similar to Building A Streaming Apple TV App (CocoaConf San Jose, Nov 2016) (20)
Whatever Happened to Visual Novel Anime? (AWA/Youmacon 2018)Chris Adamson
Ten years ago, visual novels were one of the most prominent sources for anime adaptations. Today, the flood has slowed to a trickle, and Steins;Gate 0 seems like the last gasp of a dying breed. This panel looks at what went right and wrong with VN anime, and whether they might ever make a comeback.
Whatever Happened to Visual Novel Anime? (JAFAX 2018)Chris Adamson
Not long ago, adaptations of visual novels were a major inspiration for anime, with a dozen or more shows a year based on VNs. These include classics like Clannad, Fate/Stay Night, and Higurashi. Today, the flood has slowed to a trickle, with only six VN anime in 2017, all of them commercial flops. This panel will track the rise and fall of VN adaptations, the traits that make them good and bad for anime, and whether this year's Steins;Gate 0 represents a new hope for the genre or a last gasp.
Media Frameworks Versus Swift (Swift by Northwest, October 2017)Chris Adamson
As much as we love Swift for developing our apps, playgrounds, and even on the server, there are some things for which Swift is not a good match. The media frameworks on iOS are a good example of this. Dropping into Core Audio can twist your Swift code so badly it’s hardly readable anymore. And there are parts of AV Foundation where using Swift is literally not allowed.
In this talk, we’ll show off some of these tricky scenarios, see what we can do to make things better, and think about what this means for the Swift language and its future prospects.
CocoaConf Chicago 2017: Media Frameworks and Swift: This Is FineChris Adamson
Swift is great for writing iOS and Mac apps, and its creators also mean for it to be used as a systems programming language. However, certain traits about Swift make it officially off-limits for use in some audio/video-processing scenarios. What's the deal, is it not fast enough or what? We'll look at what media apps can and can't do in Swift, and what you're supposed to do instead. We'll also look at strategies for knowing what responsibilities to dole out to Swift and to C, and how to make those parts of your code play nicely with each other.
(This is a longer version of a talk previously presented at Forward Swift 2017)
Mobile Movies with HTTP Live Streaming (CocoaConf DC, March 2013)Chris Adamson
If your iOS app streams video, then you're going to be using HTTP Live Streaming. Between the serious support for it in iOS, and App Store rules mandating its use in some cases, there realistically is no other choice. But where do you get started and what do you have to do? In this session, we'll take a holistic look at how to use HLS. We'll cover how to encode media for HLS and how to get the best results for all the clients and bitrates you might need to support, how to serve that media (and whether it makes sense to let someone else do it for you), and how to integrate the HLS stream into your app.
Core Audio in iOS 6 (CocoaConf Chicago, March 2013)Chris Adamson
Core Audio gets a bunch of neat new tricks in iOS 6, particularly for developers working with Audio Units. New effect units include an improved ability to vary pitch and playback speed, a digital delay unit, and OS X's powerful matrix mixer. There's now a new place to use units too, as the Audio Queue now offers developers a way to "tap" into the data being queued up for playback. To top it all off, a new "multi-route" system allows us to play out of multiple, multi-channel output devices at the same time.
Want to see, and hear, how all this stuff works? This section is the place to find out.
Look past the square braces and the damned header files and Objective-C -- the essential language of iOS development -- really isn't that different from other object-oriented languages. Classes, single-inheritance, polymorphism, implementation hiding... check, check, check, and check. So it's really not that difficult for old Java / Python / Ruby / C++ dogs to learn new tricks once they install Xcode, right?
To be a competent Obj-C programmer, not that hard.
To be a great Obj-C programmer... now that's another story.
In this session, we will look at traits that are unique to Objective-C, the tricks that bring out the expressiveness and power of the language. We'll also look at how to write idiomatic code that will be easily understood and maintained by other Objective-C developers. We'll look at how Automatic Reference Counting resembles but is really nothing like Garbage Collection, how properties put plain old instance variables to shame, how we loosely couple classes with delegates and notification, how blocks help us un-block our code by simplifying asynchronicity, and more.
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...Ramesh Iyer
In today's fast-changing business world, Companies that adapt and embrace new ideas often need help to keep up with the competition. However, fostering a culture of innovation takes much work. It takes vision, leadership and willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
PHP Frameworks: I want to break free (IPC Berlin 2024)Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation Introduction,
UI automation Sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Let's dive deeper into the world of ODC! Ricardo Alves (OutSystems) will join us to tell all about the new Data Fabric. After that, Sezen de Bruijn (OutSystems) will get into the details on how to best design a sturdy architecture within ODC.
"Impact of front-end architecture on development cost", Viktor TurskyiFwdays
I have heard many times that architecture is not important for the front-end. Also, many times I have seen how developers implement features on the front-end just following the standard rules for a framework and think that this is enough to successfully launch the project, and then the project fails. How to prevent this and what approach to choose? I have launched dozens of complex projects and during the talk we will analyze which approaches have worked for me and which have not.
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
Building A Streaming Apple TV App (CocoaConf San Jose, Nov 2016)
1. Building a Streaming Apple TV app
Chris Adamson (@invalidname)
CocoaConf San Jose • November, 2016
Slides available at slideshare.net/invalidname
Code available at github.com/invalidstream
10. Digital Media
• Codecs — How we encode (represent) audio or video in a digital form
• AAC, Apple Lossless, H.264, ProRes
• Containers — How we deliver encoded media
• MP3, .aac, Shoutcast, .mov, .mp4
13. Containers: QuickTime
• Dates back to the early 1990s
• Originally used Mac dual-fork files
• Content agnostic: nearly any media content can be contained in a QuickTime file
• MPEG-4 adopted QuickTime as the basis of its file format (1998)
15. QuickTime: OK for playback
• Resolve all references into a single file (“flatten”) or export (possibly re-encoding) to a file
• Dual-fork approach abandoned to support QuickTime for Windows (and later, Mac OS X)
16. QuickTime: Bad for Streaming
• Metadata at the end of the file meant you needed the whole file
• First solution: “Quick-Start”, which put metadata at the beginning of the file
• Later: QuickTime Streaming Server
• Also-ran to RealPlayer, Windows ASF, Flash
17. Streaming
• Different from progressive download in that the client only wants the media for the current playback time
• Live sources are possible
• Live sources are possible
18. Livestreaming
• Essential to some content types: breaking news, sports, one-of-a-kind events
• Chat rooms allow interaction with the audience during the program
• It's just fun
• Twitch.tv often has over 1 million simultaneous viewers
19. Bad old streaming
• Real, ASF, Flash, QTSS – socket-based connections
• Hard to scale
• Bad for mobile
• Not on port 80; frequently blocked
20. Adaptive streaming
• Send the “stream” as a series of small files over HTTP, and expect the client to reassemble them
• Easy to scale
• Good for mobile / inconsistent networks
• Runs on port 80; never blocked
21. HTTP Live Streaming
• Apple’s de facto standard for adaptive streaming
• Supported on iOS (iPhone, Apple TV) from day one
• Also: QuickTime, Safari, Microsoft Edge, Flash, Android, Roku, etc.
• Standards-based competitor: MPEG-DASH
24. Streaming playback
@IBAction func handlePlayButtonTapped(_ sender: AnyObject) {
    guard let demoItem = demoItem else { return }
    // Stop any existing playback before presenting a new player
    self.playerVC?.player?.pause()
    let playerVC = AVPlayerViewController()
    playerVC.showsPlaybackControls = true
    // AVPlayer takes the HLS playlist URL just like a local file URL
    let player = AVPlayer(url: demoItem.url)
    playerVC.player = player
    present(playerVC, animated: true)
    player.play()
    self.playerVC = playerVC
}
This is identical to local file playback
25. HLS, how do you even?
• Source media is split into short (~10 second) segments, as MPEG-2 transport stream (.ts) files
• iOS 10 / macOS Sierra support “Fragmented MP4”, which is potentially MPEG-DASH compatible
• An .m3u8 playlist provides a manifest of the files to be played back
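As a sketch, a minimal media playlist for three ten-second segments looks like this (the segment file names are illustrative):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
fileSequence0.ts
#EXTINF:10.0,
fileSequence1.ts
#EXTINF:10.0,
fileSequence2.ts
#EXT-X-ENDLIST
```

The client fetches the playlist, then requests each .ts segment in order over plain HTTP.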
28. mediafilesegmenter
• Takes a source MP4 file and splits it into segment files and an .m3u8 playlist
• The resulting directory can be hosted on any HTTP server to provide the stream
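A typical invocation might look like the following sketch; the output directory and target duration here are assumptions, so check the man page for the exact flags:

```
# Split source.mp4 into ~10-second .ts segments plus an index playlist
# -t: target segment duration in seconds
# -f: directory to write the segments and playlist into
mediafilesegmenter -t 10 -f ~/Sites/stream source.mp4
```

Point a web server at the output directory and the stream is live at the playlist's URL.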
31. Why is this better than a flat file?
• Easier to skip around
• Can be encrypted / DRM’ed
• Also, about that bitrate…
32. Variant Playlists
• Allow you to encode at multiple bitrates
• Client determines at runtime if it is keeping up, and whether to request a different variant for the next 10-second segment
• Instead of “buffering” stalls, viewers just see degraded image/sound quality
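A master playlist that offers, say, three bitrate tiers can be sketched like this (the bandwidth values, resolutions, and playlist paths are made up for illustration):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=500000,RESOLUTION=640x360
low/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1500000,RESOLUTION=1280x720
mid/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=4000000,RESOLUTION=1920x1080
high/prog_index.m3u8
```

The BANDWIDTH attribute is what lets the client pick a variant it can sustain on the current network.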
44. HLS livestreams
• Exactly the same as the VOD streams we’ve seen before
• Support variant playlists and encryption
• Difference: the playlist doesn’t have an #EXT-X-ENDLIST
• Client knows to refresh the playlist periodically to fetch new segment files
45. mediastreamsegmenter
• Works just like mediafilesegmenter to create the segment files and a playlist
• Takes as input an MPEG-2 transport stream
• Nothing in Apple’s developer toolchain can actually create such a stream
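By way of illustration, mediastreamsegmenter listens for an MPEG-2 transport stream on a UDP address and port; the sketch below is heavily hedged — the multicast address, port, and output directory are assumptions, and the flags should be verified against the man page:

```
# Listen for an MPEG-2 TS on UDP multicast 224.0.0.50:9123 and
# write segments plus a continuously-updated live playlist into ~/Sites/live
mediastreamsegmenter -f ~/Sites/live 224.0.0.50:9123
```

As the deck notes, something else (an encoder or streaming server) has to produce that transport stream in the first place.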
46. Streaming servers
• Services that create the stream files from an input stream you provide
• Can also transcode to lower bitrates and provide a variant playlist
• Can also transmux to non-HLS formats
• Generally you send them RTMP
50. Wowza
• Top streaming service provider
• Wowza Streaming Engine — runs on your server (real or virtual)
• Wowza Streaming Cloud — streaming as a service
55. Wirecast
• Proprietary streaming production software
• Mixes shots; captures from cameras, iOS devices, or the screen; highly customizable
• Prices start at $500
• Watch for a 1/3-off sale on Black Friday
57. Securing Streams
• With the -k option, mediafilesegmenter and mediastreamsegmenter will AES-128 encrypt segments, using keys from the specified location
• FairPlay Streaming for HLS provides serious DRM
• See WWDC 2015 session “Content Protection for HTTP Live Streaming”
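In the playlist, segment encryption shows up as an #EXT-X-KEY tag ahead of the segments it applies to; a minimal sketch (the key URL is hypothetical — clients fetch the key from it before decrypting):

```
#EXT-X-KEY:METHOD=AES-128,URI="https://example.com/keys/key1.bin"
#EXTINF:10.0,
fileSequence0.ts
```

Gating access to the key URL (e.g. behind authentication) is what actually protects the content.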
58. Takeaways
• Video streaming is the best thing the Apple TV does
• HLS streams are just flat files on a web server
• Create them with mediafilesegmenter or a streaming server
• For livestreams, send RTMP to the streaming server with OBS, Wirecast, etc.
59. Building a Streaming Apple TV app
Chris Adamson (@invalidname)
CocoaConf San Jose • November, 2016
Slides are available at slideshare.net/invalidname
Code is available at github.com/invalidstream