Introduction to AV Foundation

In iOS 4, AV Foundation moves to center stage as the essential media framework on the device, offering support for playing, capturing, and even editing audio and video. Borrowing some of the core ideas from the Mac's QuickTime, while adding many new concepts of its own, AV Foundation offers extraordinary capabilities for application programmers. This talk offers a high-level overview of what's in AV Foundation and a taste of what it can do.


  1. Introduction to AV Foundation
     Chris Adamson
     CocoaHeads Ann Arbor — Aug. 12, 2010
  2. Media on iPhone / iOS
  3. iPhone 2 Media Frameworks
     • Core Audio: Low-level audio streaming
     • Media Player: Full-screen video player
     • AV Foundation: Obj-C wrapper for audio file playback (2.2 only)
  4. iPhone 3 Media Frameworks
     • Core Audio: Low-level audio streaming
     • Media Player: iPod library search/playback
     • AV Foundation: Obj-C wrapper for audio file playback, recording
  5. iOS 4 Media Frameworks
     • Core Audio: Low-level audio streaming
     • Media Player: iPod library search/playback
     • AV Foundation: Audio/video capture, editing, playback, export…
     • Core Video: Quartz effects on moving images
     • Core Media: Objects for representing media times, formats, buffers
  6. AV Foundation in iPhone 2.2
  7. AV Foundation in iPhone 3.2
  8. AV Foundation in iOS 4
  9. Size is relative
                 AV Foundation   QT Kit   android.media   QuickTime for Java
     Classes     56              40       24              576
     Methods     460             280      360             >10,000
  10. AV Foundation Classes
      • Capture
      • Assets and compositions
      • Playback, editing, and export
      • Legacy classes
  11. Media Capture
  12. Capture: Old and Busted
      • UIImagePickerController
          • Takes user out of your UI
          • Low configurability
          • No capture-time data access
      • AVAudioRecorder
          • Audio only, to file only
  13. Capture: New Hotness
      • AV Foundation capture classes
      • Highly configurable
      • Live callbacks with capture data
      • Image/video preview to a CALayer
  14. Capture Classes Seem Familiar?
      • QT Kit: QTCaptureAudioPreviewOutput, QTCaptureConnection, QTCaptureDecompressedAudioOutput, QTCaptureDecompressedVideoOutput, QTCaptureDevice, QTCaptureDeviceInput, QTCaptureFileOutput, QTCaptureInput, QTCaptureLayer, QTCaptureMovieFileOutput, QTCaptureOutput, QTCaptureSession, QTCaptureVideoPreviewOutput, QTCaptureView
      • AV Foundation: AVCaptureAudioDataOutput, AVCaptureConnection, AVCaptureDevice, AVCaptureFileOutput, AVCaptureInput, AVCaptureMovieFileOutput, AVCaptureOutput, AVCaptureSession, AVCaptureStillImageOutput, AVCaptureVideoDataOutput, AVCaptureVideoPreviewLayer
  15. AVCaptureDevice
      • Represents an input (camera, microphone) or output (speakers) device
      • Discover with +devices, +devicesWithMediaType:, +defaultDeviceWithMediaType:, …
      • Flash, white balance, exposure, focus settings for camera devices
  16. AVCaptureOutput
      • A destination for captured data
      • Files: AVCaptureFileOutput, AVCaptureMovieFileOutput
      • Images: AVCaptureStillImageOutput
      • Live data: AVCaptureAudioDataOutput, AVCaptureVideoDataOutput
  17. AVCaptureSession
      • Coordinates the activity of audio and video capture devices
      • Allows you to connect/disconnect inputs and outputs
      • startRunning / stopRunning
  18. AVCaptureVideoPreviewLayer
      • Subclass of CALayer
      • +layerWithSession:
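The capture pieces above fit together as one session wiring. A minimal sketch, with error handling abbreviated and manual retain/release as was idiomatic at the time; someView is an assumed UIView to host the preview:

```objc
#import <AVFoundation/AVFoundation.h>

AVCaptureSession *session = [[AVCaptureSession alloc] init];

// Find the default camera and wrap it in an input
AVCaptureDevice *camera =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *cameraInput =
    [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (cameraInput && [session canAddInput:cameraInput])
    [session addInput:cameraInput];

// Record to a QuickTime movie file
AVCaptureMovieFileOutput *movieOutput =
    [[[AVCaptureMovieFileOutput alloc] init] autorelease];
if ([session canAddOutput:movieOutput])
    [session addOutput:movieOutput];

// Show a live preview in a view's layer
AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.frame = someView.bounds;
[someView.layer addSublayer:previewLayer];

[session startRunning];
```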
  19. Data Output Callbacks
      • Audio and video data outputs provide -[setSampleBufferDelegate:queue:]
      • Delegates get -[captureOutput:didOutputSampleBuffer:fromConnection:]
      • Sample buffer is a CMSampleBufferRef
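A sketch of the callback path, with the setup statements and the delegate method shown together for brevity; it assumes self conforms to AVCaptureVideoDataOutputSampleBufferDelegate and session is an existing AVCaptureSession:

```objc
AVCaptureVideoDataOutput *videoOutput =
    [[[AVCaptureVideoDataOutput alloc] init] autorelease];
dispatch_queue_t frameQueue = dispatch_queue_create("frames", NULL);
[videoOutput setSampleBufferDelegate:self queue:frameQueue];
[session addOutput:videoOutput];

// Called on frameQueue for every captured video frame
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Core Media function to get at the pixels as a CVImageBuffer
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Do something fast with imageBuffer; if you fall behind, frames are dropped
}
```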
  20. Core Media
      • New in iOS 4
      • Core Foundation opaque types for wrapping sample buffers, format descriptions, time structures
      • Functions convert video samples to CVImageBuffer, audio to Core Audio AudioBufferList
  21. Core Media Time
      • CMTime: value, timescale, flags, epoch
      • Timescale is n-ths of a second
      • Set timescale to a resolution appropriate to your media (e.g., 44100 for CD audio). QT convention is 600 for video (ask Chris why!)
      • CMTimeConvertScale()
  22. WWDC 2010 Session 409, "Using the Camera with AV Foundation: Overview and best practices," Brad Ford, iPhone Engineering
  23. Assets and Composition
  24. "iMovie is built entirely on exactly the same public API in AV Foundation that we're presenting to you in iPhone 4."
  25. "Boom Box" APIs
      • Simple API for playback, sometimes recording
      • Little or no support for editing, mixing, metadata, etc.
      • Example: HTML 5 <audio> tag
  26. "Streaming" APIs
      • Use "stream of audio" metaphor
      • Strong support for mixing, effects, other real-time operations
      • Example: Core Audio
  27. "Document" APIs
      • Use "media document" metaphor
      • Strong support for editing
      • Mixing may be a special case of editing
      • Examples: QuickTime and AV Foundation
  28. Assets and Movies
      • AVAsset: Collection of tracks representing timed media data
      • QTMovie: Collection of tracks representing timed media data
  29. QuickTime movies
      [diagram: Movie containing Tracks, each Track containing a Media]
  30. AV Foundation assets
      [diagram: Movie containing Tracks, with no separate Media layer]
  31. AVAsset
      • Superclass of all "movie"-like structures in AV Foundation
      • Represents traits of all tracks taken together: size, duration
      • Build your own with AVURLAsset
  32. AVComposition
      • Subclass of AVAsset representing a combination of multiple file-based assets
      • Tracks are AVCompositionTracks
      • For editing: AVMutableComposition and AVMutableCompositionTrack
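A minimal sketch of building an editable composition from a file-based asset; movieURL is an assumed NSURL pointing at a local movie file:

```objc
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil];

AVMutableComposition *composition = [AVMutableComposition composition];
NSError *error = nil;

// Append the entire source asset at the end of the composition's timeline
[composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration])
                     ofAsset:asset
                      atTime:[composition duration]
                       error:&error];
```

Repeating this with other assets or partial time ranges is how clips get cut together.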
  33. Effects 1
      • AVAudioMix, AVMutableAudioMix: set volumes or audio ramps at specific times
      • AVVideoCompositionInstruction: provides a set of layer-based instructions for performing time-based opacity or affine transform ramps
  34. Playback
      • AVPlayer: Playback controller
      • play, pause, seekToTime:, etc.
      • AVPlayerLayer: CALayer for presenting video from an AVPlayer
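Wired together, playback looks roughly like this sketch; asset is any AVAsset, and someView is an assumed UIView to host the video layer:

```objc
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];

// AVPlayerLayer is the CALayer that actually shows the frames
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = someView.bounds;
[someView.layer addSublayer:playerLayer];

[player play];
```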
  35. Effects 2
      • AVSynchronizedLayer: CALayer that synchronizes with an AVPlayerItem's playback timing
      • Use for overlays, titles, rendered images, Ken Burns effects, etc.
  36. Export
      • AVAssetExportSession
      • Must be created with a canned preset
      • -[exportAsynchronouslyWithCompletionHandler:]
      • Takes a block!
      • Exporting CA effects is tricky…
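An export sketch under assumed names: composition is an AVAsset (such as the composition built earlier), and outputURL is a writable file NSURL:

```objc
AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetMediumQuality];
exporter.outputURL = outputURL;
exporter.outputFileType = AVFileTypeQuickTimeMovie;

// The completion handler is a block, called when the export finishes or fails
[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"export done: %@", exporter.outputURL);
    } else {
        NSLog(@"export failed: %@", exporter.error);
    }
}];
```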
  37. Other stuff
      • AVAssetImageGenerator: used for generating thumbnails
      • Not suitable for getting individual frames (but there's no GetMediaSample() equivalent either!)
      • NSCoder, NSValue additions for wrapping CMTimes, CMTimeRanges
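Thumbnail generation, sketched; asset is an assumed AVAsset, and the caller owns the returned CGImage:

```objc
AVAssetImageGenerator *generator =
    [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
NSError *error = nil;
CMTime actualTime;

// Ask for a frame at the 1-second mark; actualTime reports the nearby
// time the generator actually used
CGImageRef thumbnail =
    [generator copyCGImageAtTime:CMTimeMake(1, 1)
                      actualTime:&actualTime
                           error:&error];
// ... use thumbnail, then:
CGImageRelease(thumbnail);
```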
  38. WWDC 2010 Session 407, "Editing Media with AV Foundation: Overview and best practices," Eric Lee, iPhone Engineering
  39. Demo
  40. What's Next?
  41. What's Next Next?
      AVAssetExportSession.h:
        extern NSString *const AVAssetExportPreset1280x720
            __OSX_AVAILABLE_STARTING(__MAC_10_7,__IPHONE_4_0);
      CMTime.h:
        CM_EXPORT const CFStringRef kCMTimeScaleKey
            __OSX_AVAILABLE_STARTING(__MAC_10_7,__IPHONE_4_0);
  42. Contact Info
      • http://www.subfurther.com/blog
      • @invalidname
      • invalidname [at] gmail [dot] com
