Neat things to do on a rainy day with
AVFoundation
Ryder Mackay
@rydermackay
TACOW
May 14, 2013
Don’t worry, I’m an “expert”
AVFoundation
• Mid-level Objective-C framework for playing,
recording and editing time-based media
• Available on iOS 4.0+ and Mac OS X 10.7+
Media Player, UIKit
    ↑
AV Foundation
    ↑
Core Audio, Core Media, Core Animation
Additional Frameworks
• QuartzCore
• Layer trees, animation
• Core Media
• CMTime, CMTimeRange, etc.
• CoreVideo
• CVPixelBufferRef, kCVPixelFormatTypes
A Brief History
iOS   AVFoundation Features
2.2   AVAudioPlayer
3.0   AVAudioRecorder, AVAudioSession
4.0   Capture, playback and editing
4.1   Read/write sample buffers, queue player
5.0   OpenGL ES compatibility, AirPlay
New in iOS 6.0
• Real-time access to video buffers
• Face tracking during capture
• Better support for encrypted streams
• Advanced synchronization features
AVAsset (abstract base class)
• AVURLAsset: local or remote
• AVComposition
• AVMutableComposition
An AVAsset contains one or more AVAssetTracks (audio, video, etc.);
each track is in turn made up of AVAssetTrackSegments.
@protocol AVAsynchronousKeyValueLoading
• Handler invoked on arbitrary thread;
dispatch_async to main queue
- (void)loadValuesAsynchronouslyForKeys:(NSArray *)keys
                      completionHandler:(void (^)(void))handler;
- (AVKeyValueStatus)statusOfValueForKey:(NSString *)key
                                  error:(NSError **)outError;
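A minimal loading sketch; `movieURL` is an illustrative name, not part of the API:

```objc
// Load the "tracks" and "duration" keys before touching them.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"tracks", @"duration"] completionHandler:^{
    // The handler runs on an arbitrary thread; hop to the main queue for UI work.
    dispatch_async(dispatch_get_main_queue(), ^{
        NSError *error = nil;
        AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
        if (status == AVKeyValueStatusLoaded) {
            // asset.tracks and asset.duration are now safe to read.
        }
    });
}];
```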
AVURLAsset
+ (AVURLAsset *)URLAssetWithURL:(NSURL *)URL
                        options:(NSDictionary *)options;
- (CMTime)duration;
- (NSArray *)tracks;
- (BOOL)isPlayable;
- (BOOL)isExportable;
- (BOOL)isReadable;
- (BOOL)isComposable;
AVURLAssetPreferPreciseDurationAndTimingKey
CMTime
• C struct representing rational number
• numerator: value, denominator: scale
• time in seconds = value / scale
• Flags: valid, +/-ve infinity, has been rounded
• A timescale of 600 conveniently
represents 24, 25 and 30 fps exactly
Playback
An AVPlayer plays a single AVPlayerItem; the item wraps an AVAsset,
with one AVPlayerItemTrack per AVAssetTrack.
An AVPlayerLayer displays the player's video output.
AVPlayer & item manage asset presentation state
AVPlayer
@property (nonatomic) float rate;
- (void)play;
- (void)seekToTime:(CMTime)time;
- (CMTime)currentTime;
- (AVPlayerStatus)status;
- (NSError *)error;
Playback Notifications
- (id)addPeriodicTimeObserverForInterval:(CMTime)interval
                                   queue:(dispatch_queue_t)queue
                              usingBlock:(void (^)(CMTime time))block;
- (id)addBoundaryTimeObserverForTimes:(NSArray *)times
                                queue:(dispatch_queue_t)queue
                           usingBlock:(void (^)(void))block;
- (void)removeTimeObserver:(id)observer;
typedef NSInteger AVPlayerActionAtItemEnd;
NSString * const AVPlayerItemDidPlayToEndTimeNotification;
AVPlayerLayer
+ (AVPlayerLayer *)playerLayerWithPlayer:(AVPlayer *)player;
@property (copy) NSString *videoGravity;
@property (nonatomic, readonly, getter=isReadyForDisplay) BOOL readyForDisplay;
Playback Summary
• Load asset tracks
• Create player and player item
• Create player layer
• Observe “readyForDisplay” property
• Insert layer into subtree
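The steps above, sketched end to end; `self.movieURL` and `self.playerLayer` are assumed properties on a view controller, not AVFoundation API:

```objc
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:self.movieURL options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
        AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
        AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
        layer.frame = self.view.bounds;
        layer.videoGravity = AVLayerVideoGravityResizeAspect;
        // Wait for readyForDisplay before inserting the layer,
        // to avoid a flash of black.
        [layer addObserver:self forKeyPath:@"readyForDisplay"
                   options:NSKeyValueObservingOptionNew context:NULL];
        self.playerLayer = layer;
        [player play];
    });
}];
```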
Generating Thumbnails
AVAssetImageGenerator
- (id)initWithAsset:(AVAsset *)asset;
- (CGImageRef)copyCGImageAtTime:(CMTime)requestedTime
                     actualTime:(CMTime *)actualTime
                          error:(NSError **)outError;
- (void)generateCGImagesAsynchronouslyForTimes:(NSArray *)requestedTimes
                             completionHandler:(void (^)(CMTime requestedTime,
                                                         CGImageRef image,
                                                         CMTime actualTime,
                                                         AVAssetImageGeneratorResult result,
                                                         NSError *error))handler;
- (void)cancelAllCGImageGeneration;
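A hedged thumbnailing sketch; `asset` is assumed already loaded, and the one-second request time is arbitrary:

```objc
AVAssetImageGenerator *generator =
    [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = YES; // honor rotation metadata
NSValue *time = [NSValue valueWithCMTime:CMTimeMakeWithSeconds(1.0, 600)];
[generator generateCGImagesAsynchronouslyForTimes:@[time]
    completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime,
                        AVAssetImageGeneratorResult result, NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded) {
            UIImage *thumbnail = [UIImage imageWithCGImage:image];
            // Hand the thumbnail off to the main queue for display.
        }
    }];
```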
Demo:
Custom Player
Editing
AVComposition mirrors the asset hierarchy:
AVAsset → AVComposition
AVAssetTrack → AVCompositionTrack
AVAssetTrackSegment → AVCompositionTrackSegment
AVMutableComposition
- (BOOL)insertTimeRange:(CMTimeRange)timeRange
                ofAsset:(AVAsset *)asset
                 atTime:(CMTime)startTime
                  error:(NSError **)outError;
- (void)scaleTimeRange:(CMTimeRange)timeRange
            toDuration:(CMTime)duration;
- (void)removeTimeRange:(CMTimeRange)timeRange;
timeRange must be valid or you’re gonna have a bad time
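The three methods in action, assuming two already-loaded source assets (`firstAsset` and `secondAsset` are illustrative names):

```objc
AVMutableComposition *composition = [AVMutableComposition composition];
NSError *error = nil;
// Append the first two seconds of each source asset back to back.
CMTimeRange range = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(2.0, 600));
[composition insertTimeRange:range ofAsset:firstAsset
                      atTime:kCMTimeZero error:&error];
[composition insertTimeRange:range ofAsset:secondAsset
                      atTime:composition.duration error:&error];
// Slow the second clip to half speed by doubling its duration.
CMTimeRange secondClip = CMTimeRangeMake(CMTimeMakeWithSeconds(2.0, 600),
                                         CMTimeMakeWithSeconds(2.0, 600));
[composition scaleTimeRange:secondClip
                 toDuration:CMTimeMakeWithSeconds(4.0, 600)];
```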
Demo:
Capture, Compose & Export
AVSynchronizedLayer
• Confers timing state upon sublayers
• Timing synced with AVPlayerItem instance
• +synchronizedLayerWithPlayerItem:
• When creating CAAnimations:
• Use AVCoreAnimationBeginTimeAtZero
• -setRemovedOnCompletion:NO
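A sketch tying those rules together; `playerItem` and the overlay layer are placeholders:

```objc
AVSynchronizedLayer *syncLayer =
    [AVSynchronizedLayer synchronizedLayerWithPlayerItem:playerItem];
CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
fade.fromValue = @0.0;
fade.toValue = @1.0;
fade.duration = 2.0;
// beginTime of 0 means "now" to Core Animation; use the AV constant instead.
fade.beginTime = AVCoreAnimationBeginTimeAtZero;
fade.removedOnCompletion = NO; // keep the animation around for replays/seeks
CALayer *titleLayer = [CALayer layer]; // hypothetical overlay content
[syncLayer addSublayer:titleLayer];
[titleLayer addAnimation:fade forKey:@"fade"];
```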
Demo:
AVSynchronizedLayer
Real-time Processing
Now Playing in 3D
• iOS 5:
• CVOpenGLESTextureCacheRef
• CVOpenGLESTextureRef
• +[CIImage imageWithCVPixelBuffer:options:]
• Binding between CVPixelBufferRef and GL textures
• Bypasses copying to/from CPU-controlled memory
• iOS 6:
• +[CIImage imageWithTexture:size:flipped:colorSpace:]
CVOpenGLESTextureCacheCreate(
CFAllocatorRef allocator,
CFDictionaryRef cacheAttributes,
CVEAGLContext eaglContext,
CFDictionaryRef textureAttributes,
CVOpenGLESTextureCacheRef *cacheOut)
CVOpenGLESTextureCacheCreateTextureFromImage(
CFAllocatorRef allocator,
CVOpenGLESTextureCacheRef textureCache,
CVImageBufferRef sourceImage,
CFDictionaryRef textureAttributes,
GLenum target, // GL_TEXTURE_2D
GLint internalFormat, // GL_RGBA
GLsizei width,
GLsizei height,
GLenum format, // GL_BGRA
GLenum type, // GL_UNSIGNED_BYTE
size_t planeIndex, // 0
CVOpenGLESTextureRef *textureOut)
Obtaining Pixel Buffers
Real-time:
• AVCaptureVideoDataOutput
• AVPlayerItemVideoOutput (iOS 6)
Offline:
• AVAssetReaderTrackOutput
• AVAssetReaderVideoCompositionOutput
AVPlayerItemVideoOutput
• Access pixel buffers during playback
• Request wakeup, poll w/ display link
• Each CADisplayLink fire: is there a new pixel buffer for the current item time?
• YES: copy the buffer, process & display
• NO: pause the display link, request a media-data-will-change
notification, and resume when it arrives
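The poll step might look like this inside the display-link callback, assuming a `videoOutput` property holding the AVPlayerItemVideoOutput:

```objc
// CADisplayLink callback: pull a pixel buffer for this vsync if one is ready.
- (void)displayLinkDidFire:(CADisplayLink *)link {
    CMTime itemTime =
        [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([self.videoOutput hasNewPixelBufferForItemTime:itemTime]) {
        CVPixelBufferRef buffer =
            [self.videoOutput copyPixelBufferForItemTime:itemTime
                                      itemTimeForDisplay:NULL];
        // Process & display the buffer, then release it.
        if (buffer) CVBufferRelease(buffer);
    }
}
```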
Demo:
Real-time VFX
Sample Code
Apple:
AVBasicVideoOutput
AVSimpleEditor
RosyWriter
Bob McCune – AVFoundationEditor:
https://github.com/tapharmonic/AVFoundationEditor
Bill Dudney – AVCoreImageIntegration:
https://github.com/bdudney/Experiments
Twitter: @rydermackay
ADN: @ryder
github.com/rydermackay
while (self.retainCount > 0) {
[self release];
}
