iOS Media APIs (MobiDevDay Detroit, May 2013)

Transcript

  • 1. iOS Media APIs
    Chris Adamson • @invalidname
    MobiDevDay • Detroit, MI • May 4, 2013
    Slides will be posted to slideshare.net/invalidname
    (Slide footer: Thursday, May 2, 13)
  • 2. Where do I start?
  • 3. iOS Media APIs
    • AV Foundation
    • Core Media
    • Core Animation
    • Media Player
    • Core Audio
    • Audio Toolbox
    • Audio Units
    • Core MIDI
    • OpenAL
  • 4. Also!
    • UIKit
    • UIImagePickerController, remote control events
    • AirPlay
    • HTTP Live Streaming
  • 5. How Do I?
    • Play music on my title screen?
    • Play a video in my app?
    • Capture, edit, and export video?
    • Play streaming video from my web site?
    • Play user’s iTunes music?
    • Play a video on an Apple TV?
    • Play web radio?
    • Mix and perform effects on audio?
    • Use MIDI devices?
    • Create in-game sounds for a 3D game?
  • 6. How Do I?
  • 7. AV Foundation
  • 8. AV Foundation
    • High-level Obj-C audio/video framework
    • Debuted in iPhone OS 2.2, not fully baked until iOS 4
    • Consists of three distinct areas of functionality with little overlap
  • 9. (Slide: the full roster of AV Foundation classes in three columns — Audio: AVAudioPlayer, AVAudioRecorder, AVAudioSession, and related session-description classes; Capture: AVCaptureSession, AVCaptureDevice, and the AVCaptureInput/AVCaptureOutput families; Editing / Export / Playback: AVAsset, AVAssetTrack, AVComposition, AVAssetReader/Writer, AVAssetExportSession, AVPlayer, AVPlayerItem, AVPlayerLayer, and many more.)
  • 10. AVF audio classes
    • AVAudioPlayer – plays flat files or audio from an NSData
    • AVAudioRecorder – records from most-recently connected input device (built-in mic, headset, etc.)
    • AVAudioSession – negotiates with system for access to audio hardware
  • 11. AVAudioPlayer
    • initWithContentsOfURL:error: or initWithData:error:
    • URL must be local
    • Methods: play, playAtTime:, pause, stop
    • Properties: volume, pan, numberOfLoops, etc.
  • 12. Playing a song

    NSURL *songURL = [[NSBundle mainBundle] URLForResource:@"Bossa Lounger Long"
                                             withExtension:@"caf"];
    // set up av audio player
    NSError *err = nil;
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:songURL
                                                         error:&err];
    if (err) {
        NSLog (@"Error creating player: %@", err);
    } else {
        [self.player play];
    }
  • 13. How Do I?
  • 14. Video Playback
    • Two options: AVPlayer and MPMoviePlayerController
    • Since we’re already talking about AV Foundation, let’s do AVPlayer and friends for now
  • 15. AVF essentials
    • AVAsset – A time-based, playable item (local or remote)
    • AVPlayer – Handles playback of one or more AVPlayerItems
    • Each AVPlayerItem is associated with an AVAsset
    • Player does play/pause, seekToTime:, etc.
  • 16. AVPlayer video
    • AVPlayerLayer – A CALayer to present video from an AVPlayer
    • CALayer is not a UIResponder, doesn’t handle touches. You need to provide your own playback UI.
    • gravity property determines “stretching” of video to fit layer’s bounds
  • 17. Playing a video

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
    self.player = [AVPlayer playerWithPlayerItem:playerItem];
    NSArray *visualTracks = [asset tracksWithMediaCharacteristic:AVMediaCharacteristicVisual];
    if ([visualTracks count] > 0) {
        AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
        [playerView.layer addSublayer:playerLayer];
        playerLayer.frame = playerView.layer.bounds;
        playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    }
  • 18. How Do I?
  • 19. AVF Capture
    • AVCaptureSession – Coordinates capture activities
    • Discover AVCaptureDevices (mics, cameras), create AVCaptureInputs from them, connect to session
    • Create AVCaptureOutputs (file, data callbacks), connect to session
  • 20. AVF Capture

    // create capture session, attach default video input
    self.captureSession = [[AVCaptureSession alloc] init];
    NSError *setUpError = nil;
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType: AVMediaTypeVideo];
    if (videoDevice) {
        AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice
                                                                                 error:&setUpError];
        if (videoInput) {
            [captureSession addInput: videoInput];
        }
    }
    // create a preview layer from the session and add it to UI
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
    previewLayer.frame = captureView.layer.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [captureView.layer addSublayer:previewLayer];
    // start capture session and write to file
    captureMovieOutput = [[AVCaptureMovieFileOutput alloc] init];
    [captureSession addOutput:captureMovieOutput];
    [captureSession startRunning];
    [captureMovieOutput startRecordingToOutputFileURL:captureMovieURL
                                    recordingDelegate:self];
  • 21. AVF Editing
    • AVMutableComposition – An AVAsset for a multi-track movie you build from references to other AVAssets
    • AVMutableCompositionTrack – Built up with insertTimeRange:ofTrack:atTime:error:
  • 22. AVF Video Effects
    • Create an AVMutableVideoCompositionInstruction made up of AVMutableVideoCompositionLayerInstructions
    • Each instruction works with a video track
    • Instruction ramps opacity or affine transform over a time range
  • 23. AVF Text/Image effects
    • Create an AVSynchronizedLayer to show the composition
    • Create CALayers, CATextLayers and set CAAnimations on them, then add as sub-layers to the AVSynchronizedLayer
  • 24. AVF Export
    • Create AVAssetExportSession with one of the canned presets (audio only, or QuickTime .mov at preset size or quality)
    • If you used layer-based animations, add an AVVideoComposition, with an AVVideoCompositionCoreAnimationTool (yes, it’s as hard as it sounds)
    • Call exportAsynchronouslyWithCompletionHandler:
  • 25. AVF Editing

    -(IBAction) handlePerformTapped: (id) sender {
        NSLog (@"handlePerformTapped");
        NSError *performError = nil;
        // create composition
        self.composition = [[AVMutableComposition alloc] init];
        // create video tracks a and b
        // note: media types are defined in AVMediaFormat.h
        [trackA release];
        trackA = [self.composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                               preferredTrackID:kCMPersistentTrackID_Invalid];
        [trackB release];
        trackB = [self.composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                               preferredTrackID:kCMPersistentTrackID_Invalid];
        // create video instructions
        NSMutableArray *videoInstructions = [[[NSMutableArray alloc] init] autorelease];
        // create music track
        trackMusic = [self.composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
        CMTimeRange musicTrackTimeRange = CMTimeRangeMake(kCMTimeZero, musicTrackAudioAsset.duration);
        NSError *trackMusicError = nil;
        [trackMusic insertTimeRange:musicTrackTimeRange
                            ofTrack:[musicTrackAudioAsset.tracks objectAtIndex:0]
                             atTime:kCMTimeZero
                              error:&trackMusicError];
        if (trackMusicError) {
            NSLog(@"couldnt create trackMusic: %@", trackMusicError);
        } else {
            NSLog (@"created trackMusic");
        }
        // setup the player
        [compositionPlayer release];
        compositionPlayer = [[AVPlayer playerWithPlayerItem:
            [AVPlayerItem playerItemWithAsset: composition]] retain];
        [compositionPlayerLayer removeFromSuperlayer];
        compositionPlayerLayer = [[AVPlayerLayer playerLayerWithPlayer:compositionPlayer] retain];
        [compositionView.layer addSublayer:compositionPlayerLayer];
        compositionPlayerLayer.frame = compositionView.layer.bounds;
        compositionPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
        [updateScrubberTimer invalidate];
        [updateScrubberTimer release];
        updateScrubberTimer = [[NSTimer scheduledTimerWithTimeInterval:0.1
                                                                target:self
                                                              selector:@selector(updateScrubber:)
                                                              userInfo:nil
                                                               repeats:YES] retain];
        // the video tracks
        AVAssetTrack *sourceVideoTrack = [[sourceVideoAsset tracksWithMediaType: AVMediaTypeVideo] objectAtIndex: 0];
        // pad out the opening with five seconds of blank
        CMTime videoTracksTime = CMTimeMake(0, VIDEO_TIME_SCALE);
        CMTime postEditTime = CMTimeAdd (videoTracksTime,
            CMTimeMakeWithSeconds(FIRST_CUT_TRACK_A_IN_TIME, VIDEO_TIME_SCALE));
        [trackA insertEmptyTimeRange:CMTimeRangeMake(kCMTimeZero, postEditTime)];
        videoTracksTime = postEditTime;
        // first shot
        postEditTime = CMTimeAdd (videoTracksTime,
            CMTimeMakeWithSeconds(FIRST_CUT_DURATION, VIDEO_TIME_SCALE));
        CMTimeRange firstShotRange = CMTimeRangeMake(kCMTimeZero,
            CMTimeMakeWithSeconds(FIRST_CUT_DURATION, VIDEO_TIME_SCALE));
        [trackA insertTimeRange:firstShotRange
                        ofTrack:sourceVideoTrack
                         atTime:videoTracksTime
                          error:&performError];
        videoTracksTime = postEditTime;
        // track b needs to insert empty segment up to its first use
        [trackB insertEmptyTimeRange:CMTimeRangeMake(kCMTimeZero, videoTracksTime)];
        postEditTime = CMTimeAdd (videoTracksTime,
            CMTimeMakeWithSeconds(SECOND_CUT_TRACK_B_IN_TIME, VIDEO_TIME_SCALE));
        CMTimeRange secondShotRange = CMTimeRangeMake(
            CMTimeMakeWithSeconds(SECOND_CUT_SOURCE_TIME, VIDEO_TIME_SCALE),
            CMTimeMakeWithSeconds(SECOND_CUT_DURATION, VIDEO_TIME_SCALE));
        [trackB insertTimeRange:secondShotRange
                        ofTrack:sourceVideoTrack
                         atTime:videoTracksTime
                          error:&performError];
        videoTracksTime = postEditTime;
        // TODO: later segments
        // desperation cheese - works
        AVMutableVideoCompositionInstruction *transitionInstruction =
            [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        transitionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
        AVMutableVideoCompositionLayerInstruction *aInstruction =
            [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack: trackA];
        [aInstruction setOpacityRampFromStartOpacity:0.0 toEndOpacity:1.0
            timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(2.9, VIDEO_TIME_SCALE),
                                      CMTimeMakeWithSeconds(6.0, VIDEO_TIME_SCALE))];
        AVMutableVideoCompositionLayerInstruction *bInstruction =
            [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack: trackB];
        [bInstruction setOpacity:0 atTime:kCMTimeZero];
        transitionInstruction.layerInstructions = [NSArray arrayWithObjects:
            aInstruction, bInstruction, nil];
        [videoInstructions addObject: transitionInstruction];
        // end of desperation cheese
        // synchronized layer to own all the title layers
        AVSynchronizedLayer *synchronizedLayer = [AVSynchronizedLayer
            synchronizedLayerWithPlayerItem:compositionPlayer.currentItem];
        synchronizedLayer.frame = [compositionView frame];
        [self.view.layer addSublayer:synchronizedLayer];
        // main titles
        CATextLayer *mainTitleLayer = [CATextLayer layer];
        mainTitleLayer.string = NSLocalizedString(@"Running Start", nil);
        mainTitleLayer.font = @"Verdana-Bold";
        mainTitleLayer.fontSize = videoSize.height / 8;
        mainTitleLayer.foregroundColor = [[UIColor yellowColor] CGColor];
        mainTitleLayer.alignmentMode = kCAAlignmentCenter;
        mainTitleLayer.frame = CGRectMake(0.0, 0.0, videoSize.width, videoSize.height);
        mainTitleLayer.opacity = 0.0; // initially invisible
        [synchronizedLayer addSublayer:mainTitleLayer];
        // main title opacity animation
        [CATransaction begin];
        [CATransaction setDisableActions:YES];
        CABasicAnimation *mainTitleInAnimation = [CABasicAnimation animationWithKeyPath:@"opacity"];
        mainTitleInAnimation.fromValue = [NSNumber numberWithFloat: 0.0];
        mainTitleInAnimation.toValue = [NSNumber numberWithFloat: 1.0];
        mainTitleInAnimation.removedOnCompletion = NO;
        mainTitleInAnimation.beginTime = AVCoreAnimationBeginTimeAtZero;
        mainTitleInAnimation.duration = 5.0;
        [mainTitleLayer addAnimation:mainTitleInAnimation forKey:@"in-animation"];
        CABasicAnimation *mainTitleOutAnimation = [CABasicAnimation animationWithKeyPath:@"opacity"];
        mainTitleOutAnimation.fromValue = [NSNumber numberWithFloat: 1.0];
        mainTitleOutAnimation.toValue = [NSNumber numberWithFloat: 0.0];
        mainTitleOutAnimation.removedOnCompletion = NO;
        mainTitleOutAnimation.beginTime = 5.0;
        mainTitleOutAnimation.duration = 2.0;
        [mainTitleLayer addAnimation:mainTitleOutAnimation forKey:@"out-animation"];
        [CATransaction commit];
        // TODO: end credits
        // tell the player about our effects
        AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
        videoComposition.instructions = videoInstructions;
        videoComposition.renderSize = videoSize;
        videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps
        compositionPlayer.currentItem.videoComposition = videoComposition;
        // set up the duration label at the right side of the scrubber
        int durationSeconds = (int) CMTimeGetSeconds (composition.duration);
        self.durationLabel.text = [NSString stringWithFormat: @"%02d:%02d",
            durationSeconds / 60, durationSeconds % 60];
        // reset rate field and play/pause button state
        rateField.text = @"0.0";
        playPauseButton.selected = NO;
        NSLog (@"bottom of handlePerformTapped");
    }

    No, that’s not supposed to be legible.
  • 26. How Do I?
  • 27. HTTP Live Streaming
  • 28. HTTP Live Streaming
    • Apple-led semi-standard streaming format
    • Only stream format allowed on App Store
    • Streams video as series of small (10 sec.) files, via ordinary web server on port 80
    • Create streams with command-line tools, Pro apps (Compressor, Final Cut), server-side transcoders (Wowza)
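    For a sense of what the client actually fetches, here is a minimal variant playlist of the kind slide 28 describes (the paths and bitrates are made up for illustration): a master .m3u8 points at per-bitrate streams, each of which is itself a playlist of short media segments.

    ```
    #EXTM3U
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=150000
    low/index.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=640000
    high/index.m3u8
    ```

    The client measures its throughput and switches between the listed streams mid-playback, which is what makes HLS work over variable cellular connections.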
  • 29. (no transcript text on this slide)
  • 30. (no transcript text on this slide)
  • 31. HLS Advantages
    • Mobile-friendly: works over spotty cellular connections, stream can provide multiple bitrates (client switches on the fly)
    • No special server software required: can stream from Dropbox
    • Wide adoption: Roku, Xbox, Android, Google TV, etc.
  • 32. Client-side HLS
    • Create an AVPlayer or MPMoviePlayerController with the stream’s .m3u8 URL just like any other URL
    • There is no step 2
  • 33. How Do I?
  • 34. Media Player
  • 35. Media Player
    • Allows access to the device’s iTunes library
    • Audio-only: songs, podcasts, audiobooks
    • Discover contents with MPMediaQuery, or show an MPMediaPickerController
    • MPMediaItems have metadata (title, artist, album, cover art, etc.)
    • Play with MPMusicPlayerController
  • 36. MPMediaQuery

    NSString *searchText = searchController.searchBar.text;
    MPMediaQuery *query = [MPMediaQuery songsQuery];
    MPMediaPropertyPredicate *titlePredicate =
        [MPMediaPropertyPredicate predicateWithValue:searchText
                                         forProperty:MPMediaItemPropertyTitle
                                      comparisonType:MPMediaPredicateComparisonContains];
    [query addFilterPredicate:titlePredicate];
    MPMusicPlayerController *iPodController =
        [MPMusicPlayerController iPodMusicPlayer];
    [iPodController stop];
    [iPodController setQueueWithQuery: query];
    [iPodController play];
  • 37. MPMoviePlayerController
    • Simple video player, alternative to AVPlayer
    • Provides its own view and controls
    • Lighter-weight than AVPlayer, which helps on really old devices (iPhone 3GS)
  • 38. MP novelties
    • MPNowPlayingInfoCenter – Access to metadata shown on lock screen and external media displays (e.g., in-car entertainment systems)
    • MPVolumeView – System-wide volume slider, with AirPlay button if available
  • 39. Remote Controls
    • Receive play/pause, forward/back from headset, dock keyboard, in-car systems, other external devices
    • -[UIApplication beginReceivingRemoteControlEvents]
    • Must be able to become first responder
  • 40. How Do I?
  • 41. AirPlay
  • 42. AirPlay
    • Wireless audio/video streaming from iOS device to Apple TV, AirPort Express, some speakers
    • Unofficial third-party clients for Mac (Reflector, Air Sharing), PC, Android, etc.
  • 43. AirPlay API
    • There basically isn’t one
    • User either mirrors device, or uses AirPlay menu on MPVolumeView or MPMoviePlayerController
    • You can deny (but please don’t) with allowsAirPlay (MPMoviePlayerController), mediaPlaybackAllowsAirPlay (UIWebView), allowsExternalPlayback (AVPlayer)
  • 44. Second screens
    • If user connects to VGA/DVI via Dock/Lightning adapter, or connects to AirPlay without mirroring, your app will see a second screen
    • Discover with -[UIScreen screens], create new UIWindow for it
    • Chat on the device, video on the second screen?
  • 45. How Do I?
  • 46. Core Audio
  • 47. Core Audio
    • C frameworks for real-time audio processing
    • Legendary performance, legendary difficulty
    • Basis of OS X pro audio apps like Logic and GarageBand
  • 48. Core Audio
    • Audio Queue
    • Audio Units
    • OpenAL
    • Audio File Services
    • Audio Converter Svcs.
    • Extended Audio File Svcs.
    • Audio File Stream Svcs.
    • Audio Session Svcs.
  • 49. Core Audio
    • Small number of structures and functions
    • Most behavior is specified by getting and setting properties
    • All functions return OSStatus, must check for noErr before continuing
    • “Create”-style functions take pointer as a parameter and populate it
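    The check-every-OSStatus convention from slide 49 is usually wrapped in a tiny helper; the later listings in this deck call one named CheckError(). A minimal sketch of such a helper, with OSStatus and noErr stubbed so it builds without the Apple SDK (on iOS/OS X these come from the CoreAudio headers):

    ```c
    #include <stdio.h>
    #include <stdlib.h>
    #include <stdint.h>

    /* Stubs standing in for CoreAudioTypes.h (assumption: plain C build). */
    typedef int32_t OSStatus;
    enum { noErr = 0 };

    /* Bail out with a message if a Core Audio call returned an error.
       Real versions also decode four-char-code errors; this one just
       prints the raw numeric value. */
    static void CheckError(OSStatus error, const char *operation) {
        if (error == noErr) return;           /* success: do nothing */
        fprintf(stderr, "Error: %s (%d)\n", operation, (int)error);
        exit(1);
    }

    int main(void) {
        CheckError(noErr, "a call that succeeded, so nothing happens");
        printf("still running\n");
        return 0;
    }
    ```

    Wrapping every call this way keeps Core Audio code readable while still honoring the rule that no OSStatus may go unchecked.
    
    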
  • 50. Audio Queue
    • Convenience API for play-out or capture
    • Wrapped by AV Foundation’s AVAudioRecorder, AVAudioPlayer for file-only scenarios
    • For play-out, app provides buffers to play
    • For capture, queue provides the app with buffers of captured data
  • 51. Audio Queue (diagram)
  • 52. Audio Queue (diagram)
  • 53. Audio Queue (diagram)
  • 54. Audio Queue (diagram)
  • 55. Parsing Web Radio
  • 56. Parsing Web Radio (diagram)
    NSURLConnection delivers NSData buffers, containing audio and framing info. We pass it to Audio File Services.
  • 57. Parsing Web Radio (diagram)
    Audio File Services calls us back with parsed packets of audio data.
  • 58. Parsing Web Radio (diagram)
    We create an AudioQueueBuffer with those packets and enqueue it for play-out.
  • 59. How Do I?
  • 60. Audio Units
  • 61. Audio Units
    • Lowest level of audio processing available to third parties
    • I/O, effects, mixing, file player, synthesis
    • Extremely low latency I/O (< 10 ms)
  • 62. Pull Model (diagram: AUSomething)
  • 63. Pull Model (diagram: AudioUnitRender() pulls from AUSomething)
  • 64. Pull Model (diagram: AUSomething pulls from AUSomethingElse)
  • 65. Buses (aka, Elements) (diagram: AUSomething pulling from two AUSomethingElse units)
  • 66. AUGraph (diagram: the same units connected as a graph)
  • 67. AURemoteIO Buses (diagram: AURemoteIO)
  • 68. AURemoteIO Buses (diagram: bus 0 to output H/W)
  • 69. AURemoteIO Buses (diagram: bus 0 from app, bus 0 to output H/W)
  • 70. AURemoteIO Buses (diagram: adds bus 1 from input H/W)
  • 71. AURemoteIO Buses (diagram: adds bus 1 to app)
  • 72. Pass Through (diagram: bus 1 from input H/W routed to bus 0 to output H/W)
  • 73. Pass-Through with Effect (diagram: bus 1 from input H/W, through AUEffect, to bus 0 to output H/W)
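    The pull model in slides 62–64 means an audio unit asks *you* for samples: you register a render callback, and the unit invokes it whenever it needs the next inNumberFrames of audio. The sketch below shows the shape of such a callback synthesizing a sine wave; the Core Audio types are stubbed so it builds without the Apple SDK, and the real AURenderCallback signature carries additional parameters (action flags, a timestamp, and a bus number) omitted here for brevity.

    ```c
    #include <assert.h>
    #include <math.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Stubs standing in for CoreAudio types (assumption: no Apple SDK). */
    typedef int32_t  OSStatus;
    typedef uint32_t UInt32;
    typedef float    Float32;

    typedef struct {
        UInt32 mNumberChannels;
        UInt32 mDataByteSize;
        void  *mData;
    } AudioBuffer;

    typedef struct {
        UInt32      mNumberBuffers;
        AudioBuffer mBuffers[1];
    } AudioBufferList;

    /* The heart of the pull model: called each time the unit needs
       inNumberFrames more samples. Here we synthesize a 440 Hz sine. */
    static double gPhase = 0.0;
    static OSStatus SineRenderProc(void *inRefCon,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData) {
        const double sampleRate = 44100.0, freq = 440.0;
        const double twoPi = 6.283185307179586;
        Float32 *out = (Float32 *)ioData->mBuffers[0].mData;
        for (UInt32 i = 0; i < inNumberFrames; i++) {
            out[i] = (Float32)sin(gPhase);         /* one sample */
            gPhase += twoPi * freq / sampleRate;   /* advance phase */
        }
        return 0; /* noErr */
    }

    int main(void) {
        /* Simulate one pull of 64 frames, as AudioUnitRender() would. */
        Float32 samples[64] = {0};
        AudioBufferList abl = { 1, { { 1, sizeof(samples), samples } } };
        SineRenderProc(NULL, 64, &abl);
        assert(samples[0] == 0.0f && samples[1] > 0.0f); /* sine from phase 0 */
        printf("rendered 64 frames\n");
        return 0;
    }
    ```

    The phase variable persists across calls, which is why render callbacks typically stash state in the inRefCon pointer in real code.
    
    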
  • 74. Core Audio code

    -(void) setUpAUGraph {
        if (self.auGraph) {
            CheckError(AUGraphClose(self.auGraph),
                       "Couldnt close old AUGraph");
            CheckError (DisposeAUGraph(self.auGraph),
                        "Couldnt dispose old AUGraph");
        }
        CheckError(NewAUGraph(&_auGraph),
                   "Couldnt create new AUGraph");
        CheckError(AUGraphOpen(self.auGraph),
                   "Couldnt open AUGraph");
        AudioComponentDescription outputcd = {0};
        outputcd.componentType = kAudioUnitType_Output;
        outputcd.componentSubType = kAudioUnitSubType_RemoteIO;
        outputcd.componentManufacturer = kAudioUnitManufacturer_Apple;
        AUNode ioNode;
        CheckError(AUGraphAddNode(self.auGraph, &outputcd, &ioNode),
                   "couldnt add remote io node");
        // get the remote io unit from the node
        CheckError(AUGraphNodeInfo(self.auGraph, ioNode, NULL, &_ioUnit),
                   "couldnt get remote io unit");
        // effect unit here
        AudioComponentDescription effectcd = {0};
        effectcd.componentType = kAudioUnitType_FormatConverter;
        effectcd.componentSubType = kAudioUnitSubType_NewTimePitch;
        effectcd.componentManufacturer = kAudioUnitManufacturer_Apple;
        AUNode effectNode;
        CheckError(AUGraphAddNode(self.auGraph, &effectcd, &effectNode),
                   "couldnt get effect node [time/pitch]");
        // get effect unit from the node
        CheckError(AUGraphNodeInfo(self.auGraph, effectNode, NULL, &_effectUnit),
                   "couldnt get effect unit from node");
        // enable input on the remote io unit
        UInt32 oneFlag = 1;
        UInt32 busZero = 0;
        CheckError(AudioUnitSetProperty(self.ioUnit,
                                        kAudioOutputUnitProperty_EnableIO,
                                        kAudioUnitScope_Output,
                                        busZero,
                                        &oneFlag,
                                        sizeof(oneFlag)),
                   "Couldnt enable output on bus 0");
        UInt32 busOne = 1;
        CheckError(AudioUnitSetProperty(self.ioUnit,
                                        kAudioOutputUnitProperty_EnableIO,
                                        kAudioUnitScope_Input,
                                        busOne,
                                        &oneFlag,
                                        sizeof(oneFlag)),
                   "Couldnt enable input on bus 1");
        // set stream format that the effect wants
        AudioStreamBasicDescription streamFormat;
        UInt32 propertySize = sizeof (streamFormat);
        CheckError(AudioUnitGetProperty(self.effectUnit,
                                        kAudioUnitProperty_StreamFormat,
                                        kAudioUnitScope_Input,
                                        0,
                                        &streamFormat,
                                        &propertySize),
                   "Couldnt get effect unit stream format");
        CheckError(AudioUnitSetProperty(self.ioUnit,
                                        kAudioUnitProperty_StreamFormat,
                                        kAudioUnitScope_Output,
                                        busOne,
                                        &streamFormat,
                                        sizeof(streamFormat)),
                   "couldnt set stream format on iounit bus 1 output");
        CheckError(AudioUnitSetProperty(self.ioUnit,
                                        kAudioUnitProperty_StreamFormat,
                                        kAudioUnitScope_Input,
                                        busZero,
                                        &streamFormat,
                                        sizeof(streamFormat)),
                   "couldnt set stream format on iounit bus 0 input");
        CheckError(AUGraphConnectNodeInput(self.auGraph,
                                           ioNode, 1,
                                           effectNode, 0),
                   "couldnt connect remoteio bus 1 output to effect bus 0 input");
        CheckError(AUGraphConnectNodeInput(self.auGraph,
                                           effectNode, 0,
                                           ioNode, 0),
                   "couldnt connect effect bus 0 output to remoteio bus 0 input");
        CheckError(AUGraphInitialize(self.auGraph),
                   "Couldnt initialize AUGraph");
        CheckError(AUGraphStart(self.auGraph),
                   "Couldnt start AUGraph");
        NSLog (@"bottom of setUpAUGraph");
    }

    No, this isn’t supposed to be readable either.
  • 75. How Do I?
  • 76. Core MIDI
  • 77. Core MIDI
    • Handles MIDI events from external devices
    • Connect via Dock/Lightning connectors or Camera Connection Kit (USB)
    • Only provides callbacks on events (key up, key down, pitch bend, etc.); it’s up to you to do something with them
  • 78. MIDI Packets
    • A message is three bytes: STATUS, DATA 1, DATA 2
    • High nybble of status is command, low is channel number
    • Data 1 & 2 depend on command
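    The status-byte split slide 78 describes is a pair of bit operations; the next slide's read proc does exactly this with `midiStatus >> 4`. A tiny self-contained sketch (the struct and function names are mine, for illustration):

    ```c
    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Split a MIDI status byte: high nybble is the command
       (e.g., 0x9 = note-on, 0x8 = note-off), low nybble is the
       channel number (0-15). */
    typedef struct { uint8_t command, channel; } MIDIStatus;

    static MIDIStatus parseStatus(uint8_t status) {
        MIDIStatus s = { (uint8_t)(status >> 4), (uint8_t)(status & 0x0F) };
        return s;
    }

    int main(void) {
        /* 0x93 = note-on (command 0x9) on channel 3; for note-on,
           data 1 is the note number and data 2 is the velocity. */
        MIDIStatus s = parseStatus(0x93);
        assert(s.command == 0x9 && s.channel == 3);
        printf("command=%X channel=%u\n", s.command, s.channel);
        return 0;
    }
    ```
    
    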
  • 79. Handling MIDI packets

    static void MyMIDIReadProc(const MIDIPacketList *pktlist, void *refCon, void *connRefCon) {
        SNFMasterViewController *myVC = (__bridge SNFMasterViewController*) refCon;
        MIDIPacket *packet = (MIDIPacket *)pktlist->packet;
        for (int i=0; i < pktlist->numPackets; i++) {
            Byte midiStatus = packet->data[0];
            Byte midiCommand = midiStatus >> 4;
            // is it a note-on or note-off
            if ((midiCommand == 0x09) ||
                (midiCommand == 0x08)) {
                Byte note = packet->data[1] & 0x7F;
                Byte velocity = packet->data[2] & 0x7F;
                printf("midiCommand=%d. Note=%d, Velocity=%d\n",
                       midiCommand, note, velocity);
                // send to augraph
                CheckError(MusicDeviceMIDIEvent (myVC.auSampler,
                                                 midiStatus,
                                                 note,
                                                 velocity,
                                                 0),
                           "Couldnt send MIDI event");
            }
            packet = MIDIPacketNext(packet);
        }
    }
  • 80. How Do I?
  • 81. OpenAL
  • 82. OpenAL
    • Cross-platform C API to create positional sound
    • Designed to resemble OpenGL
    • Low latency, ideal for games
    • Calls set an error flag, must check with alGetError() after every AL call
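    Since OpenAL reports failures through a flag rather than return values, the check-after-every-call discipline from slide 82 is usually wrapped in a helper. A sketch of the pattern, with the AL functions stubbed so it runs without the OpenAL SDK (alGetError() both returns and clears the current error, which the stub reproduces; alSourcefStub and CheckAL are my names):

    ```c
    #include <assert.h>
    #include <stdio.h>

    /* Stubs mimicking OpenAL's error-flag behavior (assumption: no AL SDK).
       Real values come from al.h. */
    #define AL_NO_ERROR      0
    #define AL_INVALID_VALUE 0xA003
    static int gALError = AL_NO_ERROR;

    /* Like the real alGetError(): returns the latest error AND clears it. */
    static int alGetError(void) {
        int e = gALError;
        gALError = AL_NO_ERROR;
        return e;
    }

    /* Stand-in for an AL call that rejects negative gain values. */
    static void alSourcefStub(float gain) {
        if (gain < 0.0f) gALError = AL_INVALID_VALUE;
    }

    /* The pattern: poll the flag immediately after each AL call. */
    static void CheckAL(const char *op) {
        int err = alGetError();
        if (err != AL_NO_ERROR)
            printf("AL error 0x%X after %s\n", err, op);
    }

    int main(void) {
        alSourcefStub(1.0f);  CheckAL("set gain");      /* fine, no output */
        alSourcefStub(-1.0f); CheckAL("set bad gain");  /* reports 0xA003 */
        assert(alGetError() == AL_NO_ERROR);            /* flag was cleared */
        return 0;
    }
    ```

    Because a later success does not clear an earlier error, skipping a check means a stale error gets blamed on the wrong call, which is why the per-call discipline matters.
    
    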
  • 83. AL example

    // set up OpenAL source
    alGenSources(1, player.sources);
    alSourcei(player.sources[0], AL_LOOPING, AL_TRUE);
    alSourcef(player.sources[0], AL_GAIN, AL_MAX_GAIN);
    updateSourceLocation(player);
    // connect buffer to source
    alSourcei(player.sources[0], AL_BUFFER, buffers[0]);
    // set up listener
    alListener3f (AL_POSITION, 0.0, 0.0, 0.0);
    // start playing
    alSourcePlay(player.sources[0]);

    Note: error checking removed for clarity
  • 84. OpenAL Concepts
    • ALListener – The listener, who has an x,y,z position, orientation, etc.
    • ALSource – A sound-producing object in the space. Has position, orientation, motion, sound cone, much more
    • ALBuffer – Buffers of sound attached to a source. Source can loop one buffer or receive a stream of buffers
  • 85. How Did I?
  • 86. iOS Media APIs
    • There are a LOT of them!
    • Each fills a specific role, not a lot of overlap
    • Simple stuff is easy, complex stuff is possible
  • 87. Questions?
    • “Audio and Video Starting Point” in Apple developer documentation
    • http://devforums.apple.com
    • coreaudio-api@lists.apple.com
  • 88. iOS Media APIs
    Chris Adamson • @invalidname
    MobiDevDay • Detroit, MI • May 4, 2013
    Slides will be posted to slideshare.net/invalidname