With iOS 4, AV Foundation moves to center stage as the essential media framework on the device, offering support for playing, capturing, and even editing audio and video. Borrowing some of the core ideas from the Mac's QuickTime, while adding many new concepts of its own, AV Foundation offers extraordinary capabilities for application programmers. This talk will offer a high-level overview of what's in AV Foundation, and a taste of what it can do.
3.
iPhone 2 Media Frameworks
• Core Audio: Low-level audio streaming
• Media Player: Full-screen video player
• AV Foundation: Obj-C wrapper for audio file playback (2.2 only)
4.
iPhone 3 Media Frameworks
• Core Audio: Low-level audio streaming
• Media Player: iPod library search/playback
• AV Foundation: Obj-C wrapper for audio file playback, recording
5.
iOS 4 Media Frameworks
• Core Audio: Low-level audio streaming
• Media Player: iPod library search/playback
• AV Foundation: Audio/video capture, editing, playback, export…
• Core Video: Quartz effects on moving images
• Core Media: Objects for representing media times, formats, buffers
12.
Capture: Old and Busted
• UIImagePickerController
  • Takes user out of your UI
  • Low configurability
  • No capture-time data access
• AVAudioRecorder
  • Audio only, to file only
13.
Capture: New Hotness
• AV Foundation capture classes
• Highly configurable
• Live callbacks with capture data
• Image/video preview to a CALayer
15.
AVCaptureDevice
• Represents an input (camera, microphone) or output (speakers) device
• Discover with +[devices], +[devicesWithMediaType:], +[defaultDeviceWithMediaType:], …
• Flash, white balance, exposure, focus settings for camera devices (see the sketch below)
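A minimal sketch of grabbing the default camera and enabling auto-flash, assuming iOS 4-era AV Foundation; error handling is abbreviated:

    #import <AVFoundation/AVFoundation.h>

    // Ask for the default video-capture device (the camera).
    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Device settings require holding the configuration lock.
    NSError *error = nil;
    if ([camera isFlashModeSupported:AVCaptureFlashModeAuto] &&
        [camera lockForConfiguration:&error]) {
        camera.flashMode = AVCaptureFlashModeAuto;
        [camera unlockForConfiguration];
    }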
16.
AVCaptureOutput
• A destination for captured data
• Files: AVCaptureFileOutput, AVCaptureMovieFileOutput
• Images: AVCaptureStillImageOutput
• Live data: AVCaptureAudioDataOutput, AVCaptureVideoDataOutput
17.
AVCaptureSession
• Coordinates the activity of audio and video capture devices
• Allows you to connect/disconnect inputs and outputs (see the sketch below)
• startRunning/stopRunning
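A sketch of wiring the camera into a session with a movie-file output; `camera` is the device from the previous slide:

    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Wrap the device in an input and attach it.
    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input && [session canAddInput:input])
        [session addInput:input];

    // Attach a file output; the session forms the connections.
    AVCaptureMovieFileOutput *movieOutput =
        [[AVCaptureMovieFileOutput alloc] init];
    if ([session canAddOutput:movieOutput])
        [session addOutput:movieOutput];

    [session startRunning];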
18.
AVCaptureVideoPreviewLayer
• Subclass of CALayer
• +[layerWithSession:]
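A sketch of putting live preview on screen, assuming the `session` above and a host `view`:

    AVCaptureVideoPreviewLayer *preview =
        [AVCaptureVideoPreviewLayer layerWithSession:session];
    preview.frame = view.bounds;      // size the preview to the host view
    [view.layer addSublayer:preview];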
19.
Data Output Callbacks
• Audio and video data outputs provide -[setSampleBufferDelegate:queue:]
• Delegates get -[captureOutput:didOutputSampleBuffer:fromConnection:]
• Sample buffer is a CMSampleBufferRef (see the delegate sketch below)
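A minimal delegate sketch for AVCaptureVideoDataOutput; registration is shown in comments, and the frame processing is left open:

    // Registration, e.g. in your setup code:
    //   dispatch_queue_t queue = dispatch_queue_create("videoQueue", NULL);
    //   [videoOutput setSampleBufferDelegate:self queue:queue];

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
      didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
             fromConnection:(AVCaptureConnection *)connection
    {
        // Core Media unwraps the video frame as a CVImageBuffer.
        CVImageBufferRef frame = CMSampleBufferGetImageBuffer(sampleBuffer);
        // ... inspect or process the pixel buffer here ...
    }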
20.
Core Media
• New in iOS 4
• Core Foundation opaque types for wrapping sample buffers, format descriptions, time structures
• Functions convert video samples to CVImageBuffer, audio to Core Audio AudioBufferList
21.
Core Media Time
• CMTime: value, timescale, flags, epoch
• Timescale is n-ths of a second
• Set timescale to a resolution appropriate to your media (e.g., 44100 for CD audio). QT convention is 600 for video (ask Chris why!)
• CMTimeConvertScale() (see the sketch below)
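A worked sketch of CMTime values at different timescales:

    #import <CoreMedia/CoreMedia.h>

    // Half a second at CD-audio resolution: value 22050, timescale 44100.
    CMTime halfSecond = CMTimeMake(22050, 44100);

    // The same instant re-expressed in QuickTime's 600 timescale: 300/600.
    CMTime inVideoScale =
        CMTimeConvertScale(halfSecond, 600, kCMTimeRoundingMethod_Default);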
22.
WWDC 2010 Session 409
Using the Camera with AV Foundation
Overview and best practices
Brad Ford
iPhone Engineering
24.
“iMovie is built entirely on exactly the same public API in AV Foundation that we’re presenting to you in iPhone 4.”
25.
“Boom Box” APIs
• Simple API for playback, sometimes recording
• Little or no support for editing, mixing, metadata, etc.
• Example: HTML 5 <audio> tag
26.
“Streaming” APIs
• Use “stream of audio” metaphor
• Strong support for mixing, effects, other real-time operations
• Example: Core Audio
27.
“Document” APIs
• Use “media document” metaphor
• Strong support for editing
• Mixing may be a special case of editing
• Example: QuickTime and AV Foundation
28.
Assets and Movies
• AVAsset: Collection of tracks representing timed media data
• QTMovie: Collection of tracks representing timed media data
29.
QuickTime movies
[Diagram: a Movie contains Tracks; each Track references its Media]
31.
AVAsset
• Superclass of all “movie”-like structures in AV Foundation
• Represents traits of all tracks taken together: size, duration
• Build your own with AVURLAsset (see the sketch below)
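A sketch of building an asset; the bundled file name is a hypothetical placeholder:

    // A hypothetical movie file shipped in the app bundle.
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"sample"
                                         withExtension:@"m4v"];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];

    // Traits of all tracks taken together.
    CMTime duration = asset.duration;
    CGSize size = asset.naturalSize;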
32.
AVComposition
• Subclass of AVAsset representing a combination of multiple file-based assets
• Tracks are AVCompositionTracks
• For editing, AVMutableComposition and AVMutableCompositionTracks (see the sketch below)
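A sketch of the basic edit: copying a source asset's video track into a mutable composition, assuming the `asset` built above:

    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    // Insert the entire source video track at the start of the timeline.
    AVAssetTrack *sourceTrack =
        [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    NSError *error = nil;
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                        ofTrack:sourceTrack
                         atTime:kCMTimeZero
                          error:&error];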
33.
Effects 1
• AVAudioMix, AVMutableAudioMix: set volumes or audio ramps at specific times (see the sketch below)
• AVVideoCompositionInstruction: provides a set of layer-based instructions for performing time-based opacity or affine transform ramps
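A sketch of a one-second fade-out via an audio mix, assuming an `audioTrack` in the composition:

    AVMutableAudioMixInputParameters *params =
        [AVMutableAudioMixInputParameters
            audioMixInputParametersWithTrack:audioTrack];

    // Ramp from full volume to silence over the final second.
    CMTime oneSecond = CMTimeMake(1, 1);
    CMTime fadeStart = CMTimeSubtract(composition.duration, oneSecond);
    [params setVolumeRampFromStartVolume:1.0f
                             toEndVolume:0.0f
                               timeRange:CMTimeRangeMake(fadeStart, oneSecond)];

    AVMutableAudioMix *mix = [AVMutableAudioMix audioMix];
    mix.inputParameters = [NSArray arrayWithObject:params];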
34.
Playback
• AVPlayer: Playback controller
• play, pause, seekToTime, etc.
• AVPlayerLayer: CALayer for presenting video from an AVPlayer (see the sketch below)
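A playback sketch, assuming `asset` and a host `view`:

    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:item];

    // Video is presented by a CALayer, just like capture preview.
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
    playerLayer.frame = view.bounds;
    [view.layer addSublayer:playerLayer];

    [player play];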
35.
Effects 2
• AVSynchronizedLayer: CALayer that synchronizes with an AVPlayerItem’s playback timing (see the sketch below)
• Use for overlays, titles, rendered images, Ken Burns effects, etc.
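An overlay sketch; `titleLayer` is a hypothetical CALayer built elsewhere:

    AVSynchronizedLayer *syncLayer =
        [AVSynchronizedLayer synchronizedLayerWithPlayerItem:item];

    // Sublayer animations now run on the player item's clock,
    // so they pause, seek, and change rate along with the video.
    [syncLayer addSublayer:titleLayer];
    [view.layer addSublayer:syncLayer];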
36.
Export
• AVAssetExportSession
• Must be created with a canned preset
• -[exportAsynchronouslyWithCompletionHandler:]
• Takes a block!
• Exporting CA effects is tricky… (see the sketch below)
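An export sketch, assuming the `composition` built earlier and a writable `outputURL`:

    AVAssetExportSession *exportSession =
        [[AVAssetExportSession alloc]
            initWithAsset:composition
               presetName:AVAssetExportPresetMediumQuality];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;

    // The completion handler is a block, called on an arbitrary thread.
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted)
            NSLog(@"export finished");
        else
            NSLog(@"export failed: %@", exportSession.error);
    }];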
37.
Other stuff
• AVAssetImageGenerator: used for generating thumbnails
• Not suitable for getting individual frames (but no GetMediaSample() equivalent either!)
• NSCoder, NSValue additions for wrapping CMTimes, CMTimeRanges
38.
WWDC 2010 Session 407
Editing Media with AV Foundation
Overview and best practices
Eric Lee
iPhone Engineering