Core Audio in iOS 6 (CocoaConf Chicago, March 2013) - Chris Adamson
Core Audio gets a bunch of neat new tricks in iOS 6, particularly for developers working with Audio Units. New effect units include an improved ability to vary pitch and playback speed, a digital delay unit, and OS X's powerful matrix mixer. There's now a new place to use units too, as the Audio Queue now offers developers a way to "tap" into the data being queued up for playback. To top it all off, a new "multi-route" system allows us to play out of multiple, multi-channel output devices at the same time.
Want to see, and hear, how all this stuff works? This section is the place to find out.
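For a taste of how those new units get wired up, here is a minimal Swift sketch that looks up Apple's time/pitch format-converter unit by its component description and sets its playback rate. The constants come from AudioToolbox; which units shipped in which iOS release isn't re-verified here, and error handling is omitted.

```swift
import AudioToolbox

// A sketch: find and instantiate the time/pitch format converter unit,
// then ask it to play back at 1.5x speed. Error handling omitted for brevity.
var description = AudioComponentDescription(
    componentType: kAudioUnitType_FormatConverter,
    componentSubType: kAudioUnitSubType_NewTimePitch,
    componentManufacturer: kAudioUnitManufacturer_Apple,
    componentFlags: 0,
    componentFlagsMask: 0)

if let component = AudioComponentFindNext(nil, &description) {
    var timePitchUnit: AudioUnit?
    AudioComponentInstanceNew(component, &timePitchUnit)
    if let unit = timePitchUnit {
        AudioUnitInitialize(unit)
        // Rate > 1.0 speeds playback up without changing pitch.
        AudioUnitSetParameter(unit, kNewTimePitchParam_Rate,
                              kAudioUnitScope_Global, 0, 1.5, 0)
    }
}
```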
Stupid Video Tricks, CocoaConf Las Vegas - Chris Adamson
The document discusses various techniques for manipulating and processing video and audio using the AV Foundation framework in iOS and Mac OS X. It begins with an overview of AV Foundation and describes common tasks like playback, capture, and editing. It then demonstrates tricks like animating AVPlayerLayers and recording the screen. The document dives deeper into techniques for reading and manipulating subtitle, audio, and video tracks using the Core Media, Core Audio, Core Video, and Core Image frameworks. It provides code samples for applying filters to video in real time and writing the modified data back out.
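For a flavor of that sample-level access, here is a minimal Swift sketch that pulls decoded video frames out of a movie with AVAssetReader; the file path and pixel format are placeholder choices, and error handling is kept to a minimum.

```swift
import AVFoundation

// A sketch: read decoded BGRA frames straight out of a movie file.
let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/movie.mov"))  // placeholder path
let reader = try! AVAssetReader(asset: asset)
let videoTrack = asset.tracks(withMediaType: .video).first!

// Ask for 32-bit BGRA so the pixels are easy to inspect or hand to Core Image.
let output = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
])
reader.add(output)
reader.startReading()

while let sampleBuffer = output.copyNextSampleBuffer() {
    if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
        // Each CVPixelBuffer is one decoded frame; modify it, filter it,
        // or write it back out with an AVAssetWriter.
        _ = pixelBuffer
    }
}
```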
Core Audio in iOS 6 (CocoaConf Raleigh, Dec. '12) - Chris Adamson
The document summarizes Chris Adamson's presentation on Core Audio in iOS 6. It discusses the key components of Core Audio including engines like Audio Units that process audio streams, and helpers that deal with formats, file I/O, and session management. It provides examples of using Audio Units to process and render audio in a pull model, and creating an AURemoteIO unit to handle playback and capture on iOS.
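To make the pull model concrete, here is a minimal Swift sketch of creating a RemoteIO unit and attaching a render callback that Core Audio invokes whenever it needs more output samples. Filling the buffers with real audio is left as a comment, and error handling is omitted.

```swift
import AudioToolbox

// A sketch: create the RemoteIO unit and give it a render callback.
var ioDescription = AudioComponentDescription(
    componentType: kAudioUnitType_Output,
    componentSubType: kAudioUnitSubType_RemoteIO,
    componentManufacturer: kAudioUnitManufacturer_Apple,
    componentFlags: 0,
    componentFlagsMask: 0)

// The "pull": Core Audio calls this whenever it needs inNumberFrames more samples.
let renderCallback: AURenderCallback = { _, _, _, _, inNumberFrames, ioData in
    // Fill ioData's buffers with inNumberFrames of audio (or silence) here.
    return noErr
}

var remoteIO: AudioUnit?
if let component = AudioComponentFindNext(nil, &ioDescription) {
    AudioComponentInstanceNew(component, &remoteIO)
}
if let unit = remoteIO {
    var callback = AURenderCallbackStruct(inputProc: renderCallback, inputProcRefCon: nil)
    AudioUnitSetProperty(unit, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input, 0,
                         &callback, UInt32(MemoryLayout<AURenderCallbackStruct>.size))
    AudioUnitInitialize(unit)
    AudioOutputUnitStart(unit)
}
```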
Mobile Movies with HTTP Live Streaming (CocoaConf DC, March 2013) - Chris Adamson
The document summarizes Chris Adamson's presentation on mobile movies with HTTP Live Streaming. The presentation covered what streaming is and how it differs from traditional broadcast media, introduced HTTP Live Streaming (HLS) as a way to stream media over HTTP, and described how HLS works by serving media in short file segments using a playlist file. It also discussed features of HLS like providing multiple variants for different bandwidths and encrypting file segments for security.
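For reference, this is roughly what those playlist files look like: a variant playlist that points at streams for different bandwidths, and a media playlist that lists the short segments in order. The URLs, bandwidths, and durations are illustrative.

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=400000
low/prog_index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1200000
high/prog_index.m3u8
```

And one of the media playlists it refers to:

```
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
fileSequence0.ts
#EXTINF:10.0,
fileSequence1.ts
#EXT-X-ENDLIST
```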
Core Audio, the only media framework available since day one of the public iPhone SDK, offers extremely low latency and powerful access to the device's audio processing system... assuming you can handle what's renowned as one of the hardest APIs on the platform. In iOS 5, Core Audio gets even better, with great new features that had previously been burdensome, if not impossible, to develop on your own. Once the iOS 5 NDA drops, the shiny new bits will be available to all, and this talk will be one of your first chances to learn how they work. Attendees will learn the basics of Core Audio -- the engine APIs that process sound (Audio Queue, Audio Units, and OpenAL) and the helper APIs that get samples into and out of them -- and then look at where iOS 5 fills in some of the holes that have existed up to now.
Stupid Video Tricks, CocoaConf Seattle 2014 - Chris Adamson
AV Foundation makes it reasonably straightforward to capture video from the camera and edit together a nice family video. This session is not about that stuff. This session is about the nooks and crannies where AV Foundation exposes what's behind the curtain. Instead of letting AVPlayer read our video files, we can grab the samples ourselves and mess with them. AVCaptureVideoPreviewLayer, meet the CGAffineTransform. And instead of dutifully passing our captured video frames to the preview layer and an output file, how about if we instead run them through a series of Core Image filters? Record your own screen? Oh yeah, we can AVAssetWriter that. With a few pointers, a little experimentation, and a healthy disregard for safe coding practices, Core Media and Core Video let you get away with some neat stuff.
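Here is a minimal Swift sketch of the "run captured frames through Core Image" idea: a sample buffer delegate wraps each frame in a CIImage, applies a filter, and renders the result back into the pixel buffer. The filter choice is arbitrary, and wiring the delegate into an AVCaptureSession is omitted.

```swift
import AVFoundation
import CoreImage

final class FilteredFrameDelegate: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let context = CIContext()
    let sepia = CIFilter(name: "CISepiaTone")!   // any CIFilter chain works here

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Wrap the captured frame, filter it, and render the result back in place.
        sepia.setValue(CIImage(cvPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)
        sepia.setValue(0.8, forKey: kCIInputIntensityKey)
        guard let filtered = sepia.outputImage else { return }
        context.render(filtered, to: pixelBuffer)
        // From here, hand the buffer to a preview layer or an AVAssetWriter input.
    }
}
```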
Mobile devices are so useful because they can get on the net with their built-in wi-fi or cellular data radios. But how does this work? In iOS, we have a slew of networking APIs, each appropriate in different situations. From decades-old BSD sockets to the new-in-iOS-5 iCloud, there are a wide range of networking calls available to your app, and an equally wide range of semantics in how to use them. In this talk, we'll start with high-level abstractions like iCloud and other objects that can either load from or save to a URL, then progress down through the stack, grabbing arbitrary content from URLs, self-networking with Bonjour and Game Kit, and finally accessing the socket layer with CFNetwork and BSD sockets.
iCloud sample code at: http://dl.dropbox.com/u/12216224/conferences/codemash12/bonjour-icloud/CloudNotes.zip
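As a taste of the "grab arbitrary content from a URL" level of the networking stack described above, here is a minimal Swift sketch using URLSession (standing in for the era's NSURLConnection); the URL is a placeholder.

```swift
import Foundation

// A sketch: fetch the contents of a URL asynchronously.
let url = URL(string: "https://example.com/data.json")!   // placeholder URL
let task = URLSession.shared.dataTask(with: url) { data, response, error in
    if let error = error {
        print("request failed: \(error)")
    } else if let data = data {
        print("received \(data.count) bytes")
    }
}
task.resume()
```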
Stupid Video Tricks (CocoaConf DC, March 2014) - Chris Adamson
The document discusses various techniques for working with time-based media like video and audio using the AV Foundation framework in iOS and macOS. It provides an overview of common tasks like playback, capture, and editing using classes like AVPlayer, AVCaptureSession, and AVComposition. It then demonstrates more advanced tricks like animating an AVPlayerLayer and processing video frames in real time using Core Image filters. The document recommends exploring other related frameworks like Core Audio, Core Media, Video Toolbox, and Core Video for additional functionality and performance.
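The AVPlayerLayer trick mentioned above boils down to treating the player layer like any other CALayer. A minimal Swift sketch, with a placeholder movie path:

```swift
import AVFoundation
import QuartzCore

// A sketch: play a movie in an AVPlayerLayer and spin the layer with Core Animation.
let player = AVPlayer(url: URL(fileURLWithPath: "/path/to/movie.mov"))   // placeholder path
let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = CGRect(x: 0, y: 0, width: 320, height: 180)
// Add playerLayer to a view's layer tree before animating.

let spin = CABasicAnimation(keyPath: "transform.rotation.z")
spin.toValue = 2 * Double.pi
spin.duration = 4
spin.repeatCount = .infinity
playerLayer.add(spin, forKey: "spin")
player.play()
```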
This document provides an introduction and overview of the Roku SDK. It discusses the basics of developing Roku channels using the BrightScript programming language and built-in component library. It covers setting up the development environment, the file structure of Roku channels, common screens and objects, debugging, and preparing and loading content like audio and video. The document concludes with next steps like following Roku's design guidelines and publishing channels privately or to the public Roku channel store.
Everybody knows that iOS is a shiny, modern operating system with a sleek object-oriented framework, Cocoa Touch, that makes development uncluttered and easy. Everybody is wrong. As a successor to both Unix and the Classic Mac OS and OS X, iOS has a wide-ranging mass of frameworks and libraries, employing different design patterns and conventions and sometimes employing different programming languages. The developer who's new to iOS can go only so far with Objective-C and the UIKit frameworks and their modern friends before he or she discovers the need to go deeper. But what's down there? This session digs down into the iOS stack to show the lower levels of the platform's APIs: the Media Layer, Core Services, and the Core OS Layer. As we go, we'll have to abandon Objective-C in favor of plain ol' C, which is used for the Core Foundation framework that does the heavy lifting for Cocoa Touch's strings, collections, memory management, I/O and more. We'll also look at specialized low-level frameworks for security (including certificate management and the confounding but useful Keychain), CPU-accelerated math and DSP functions, high performance graphics and sound, and more. At the lowest level, we hit Unix, and we'll see how conventional Unix-style programming practices are often appropriate (and sometimes necessary) on iOS, including pthreads and BSD sockets.
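To give a sense of what "plain ol' C" looks like at that layer, here is a tiny Swift sketch that calls Core Foundation's string functions directly; toll-free bridging is what lets a Swift String cross over as a CFString.

```swift
import CoreFoundation

// A sketch: the C-level CFString API that backs Cocoa Touch's strings.
let phrase = "under the bridges" as CFString
let length = CFStringGetLength(phrase)
let range = CFStringFind(phrase, "bridge" as CFString, [])
print("length \(length), 'bridge' found at index \(range.location)")
```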
AV Foundation moves to center stage as the essential media framework on the device, offering support for playing, capturing, and even editing audio and video. Borrowing some of the core ideas from the Mac's QuickTime, while adding many new concepts of its own, AV Foundation offers extraordinary capabilities for application programmers. This talk will offer a high-level overview of what's in AV Foundation, and a taste of what it can do.
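A minimal Swift sketch of the editing side mentioned above: splice the first five seconds of one asset's video track into an AVMutableComposition, which can then be played or exported like any other asset. The file path and time range are placeholders.

```swift
import AVFoundation

// A sketch: build a composition from part of an existing movie and play it.
let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/clip.mov"))   // placeholder path
let composition = AVMutableComposition()
let compTrack = composition.addMutableTrack(withMediaType: .video,
                                            preferredTrackID: kCMPersistentTrackID_Invalid)!

if let sourceTrack = asset.tracks(withMediaType: .video).first {
    let fiveSeconds = CMTimeRange(start: .zero,
                                  duration: CMTime(seconds: 5, preferredTimescale: 600))
    try? compTrack.insertTimeRange(fiveSeconds, of: sourceTrack, at: .zero)
}

// A composition is itself an AVAsset, so playback (or export) works as usual.
let player = AVPlayer(playerItem: AVPlayerItem(asset: composition))
player.play()
```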