
KKBOX WWDC17 AirPlay 2 - Dolphin


Shared by Dolphin, iOS engineer at KKBOX.
Video: https://youtu.be/a5QI5cEhiNg
Related sessions:
- [What's New in Audio](https://developer.apple.com/videos/wwdc2017/videos/play/wwdc2017/501/)
- [Introducing AirPlay 2](https://developer.apple.com/videos/wwdc2017/videos/play/wwdc2017/509/)


  1. #WWDC17 © 2017 Apple Inc. All rights reserved. Redistribution or public display not permitted without written permission from Apple. David Saracino, AirPlay Engineer. Introducing AirPlay 2: unlocking multi-room audio. Session 509, Media.
  2. AirPlay 2 Audio: wireless audio, plus multi-room playback, enhanced buffering, and multi-device control (all NEW).
  6. AirPlay 2 supported platforms: iOS, tvOS, macOS.
  7. AirPlay 2 supported speakers: HomePod, Apple TV*, and third-party speakers. (*Requires Apple TV 4th generation.)
  8. AirPlay 2 adoption: identify as long-form audio, add an AirPlay picker, integrate with the MediaPlayer framework, and adopt an AirPlay 2 playback API.
  13. Identify as long-form audio (NEW)
  14. Long-form audio routing (iOS), for music and phone-call coexistence (NEW): long-form audio (AirPlay 2) gets its own route with session arbitration; system audio keeps the system audio route with session arbitration and mixing.
  18. Long-form audio routing (iOS and tvOS) (NEW): applications that use the long-form audio route go through session arbitration to long-form audio (AirPlay 2); applications that use the system audio route go through session arbitration and mixing to system audio.
  19. // Long-form audio routing (iOS and tvOS), code example (NEW)
      let mySession = AVAudioSession.sharedInstance()
      do {
          try mySession.setCategory(AVAudioSessionCategoryPlayback,
                                    mode: AVAudioSessionModeDefault,
                                    routeSharingPolicy: .longForm)
      } catch {
          // handle errors
      }
  22. Long-form audio routing (macOS) (NEW): applications that use the long-form audio route go through arbitration to long-form audio (AirPlay 2); applications that use the system audio route are mixed into the default device.
  23. // Long-form audio routing (macOS), code example (NEW)
      let mySession = AVAudioSession.sharedInstance()
      do {
          try mySession.setRouteSharingPolicy(.longForm)
      } catch {
          // handle errors
      }
  25. Add an AirPlay picker: adopt AVRoutePickerView and AVRouteDetector (NEW). A sketch follows.
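The slides name the classes but show no picker code; here is a minimal sketch, assuming a UIKit app (the view controller, frame, and visibility logic are placeholders, not from the talk):

```swift
import UIKit
import AVKit

class PlayerViewController: UIViewController {
    // AVRoutePickerView draws the system AirPlay button;
    // AVRouteDetector reports whether any routes are reachable.
    private let routePickerView = AVRoutePickerView()
    private let routeDetector = AVRouteDetector()

    override func viewDidLoad() {
        super.viewDidLoad()
        routePickerView.frame = CGRect(x: 20, y: 40, width: 44, height: 44)
        view.addSubview(routePickerView)
        // Route detection costs power; enable it only while the picker UI is visible.
        routeDetector.isRouteDetectionEnabled = true
    }

    // Show the button only when there is at least one route to pick.
    func updatePickerVisibility() {
        routePickerView.isHidden = !routeDetector.multipleRoutesDetected
    }
}
```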
  29. Integrate with the Media Player framework: handle remote media commands with MPRemoteCommandCenter, and display current track info with MPNowPlayingInfoCenter. A sketch of both follows.
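A minimal sketch of both pieces, assuming an AVPlayer-based app (`player`, the title, and the timing values are placeholders):

```swift
import AVFoundation
import MediaPlayer

func setUpRemoteCommands(for player: AVPlayer) {
    let commandCenter = MPRemoteCommandCenter.shared()
    commandCenter.playCommand.addTarget { _ in
        player.play()
        return .success
    }
    commandCenter.pauseCommand.addTarget { _ in
        player.pause()
        return .success
    }
}

func updateNowPlayingInfo(title: String, duration: TimeInterval, elapsed: TimeInterval) {
    // Published info shows up on the lock screen and on AirPlay 2 devices.
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyPlaybackDuration: duration,
        MPNowPlayingInfoPropertyElapsedPlaybackTime: elapsed,
        MPNowPlayingInfoPropertyPlaybackRate: 1.0
    ]
}
```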
  32. AirPlay 2 playback APIs
  33. Audio buffering, existing AirPlay: a real-time audio stream; the speaker adds a small buffer before output. This works fine for streaming to a single speaker.
  34. Enhanced audio buffering, AirPlay 2 (NEW): large audio buffering capacity on speakers and faster-than-real-time streaming to them. Benefits: added robustness and more responsive playback.
  35. Enhanced audio buffering adoption: supported with specific playback APIs, namely AVPlayer / AVQueuePlayer, and AVSampleBufferAudioRenderer with AVSampleBufferRenderSynchronizer (NEW).
  38. Enhanced audio buffering with AVSampleBufferAudioRenderer / AVSampleBufferRenderSynchronizer (NEW): your app has additional responsibilities, namely sourcing and parsing the content, and providing raw audio buffers for rendering.
  41. Data flow: the renderer signals the client app when it wants more data; the app enqueues audio data through the synchronizer to the renderer, calls SetRate 1 to start playback, and keeps supplying audio data. A feeding-loop sketch follows.
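A sketch of that loop, assuming `makeNextSampleBuffer()` is your app's (hypothetical) parser that returns CMSampleBuffers:

```swift
import AVFoundation

let audioRenderer = AVSampleBufferAudioRenderer()
let renderSynchronizer = AVSampleBufferRenderSynchronizer()
let mySerializationQueue = DispatchQueue(label: "com.example.audio-serializer")

// App-specific source of parsed audio; a stub for this sketch.
func makeNextSampleBuffer() -> CMSampleBuffer? { return nil }

renderSynchronizer.addRenderer(audioRenderer)

audioRenderer.requestMediaDataWhenReady(on: mySerializationQueue) {
    // Keep the renderer topped up; it re-invokes this block as it drains.
    while audioRenderer.isReadyForMoreMediaData {
        guard let sampleBuffer = makeNextSampleBuffer() else { return }
        audioRenderer.enqueue(sampleBuffer)
    }
}

// Start playback once data is flowing.
renderSynchronizer.setRate(1.0, time: kCMTimeZero)
```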
  46. Audio buffer levels, AVSampleBufferAudioRenderer: the requested data amount varies by audio route. Local: seconds. Bluetooth: seconds. AirPlay: seconds. AirPlay 2: minutes.
  47. Seek, AVSampleBufferAudioRenderer: manually changing the playhead location.
  48. To seek, the source stops feeding the renderer, the renderer is flushed, and enqueueing restarts from the new playhead position.
  52. // Seek - AVSampleBufferAudioRenderer
      func seek(toMediaTime mediaTime: CMTime) {
          renderSynchronizer.setRate(0.0, time: kCMTimeZero)

          audioRenderer.stopRequestingMediaData()
          audioRenderer.flush()

          myPrepareSampleGenerationForMediaTime(mediaTime)
          audioRenderer.requestMediaDataWhenReady(on: mySerializationQueue) {
              // ...
          }

          renderSynchronizer.setRate(1.0, time: mediaTime)
      }
  58. Play queues, timelines: each queued item has its own timeline starting at zero, but the renderer plays the whole queue on one continuous timeline, so each item's samples must be offset by the combined duration of the items before it. A retiming sketch follows.
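A sketch of that retiming with Core Media calls (the offset bookkeeping is an assumption of this sketch, not code from the talk):

```swift
import CoreMedia

// Shift one buffer from item-relative time onto the continuous timeline.
// `offset` is the continuous-timeline start of the item this buffer belongs to.
func retimed(_ buffer: CMSampleBuffer, by offset: CMTime) -> CMSampleBuffer? {
    var timing = CMSampleTimingInfo()
    guard CMSampleBufferGetSampleTimingInfo(buffer, at: 0, timingInfoOut: &timing) == noErr else {
        return nil
    }
    timing.presentationTimeStamp = CMTimeAdd(timing.presentationTimeStamp, offset)

    var result: CMSampleBuffer?
    CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault,
                                          sampleBuffer: buffer,
                                          sampleTimingEntryCount: 1,
                                          sampleTimingArray: &timing,
                                          sampleBufferOut: &result)
    return result
}
```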
  59. Play queues, editing during playback: the renderer holds data up to the last enqueued sample, which can be well past the playhead. If Item 2 is removed from the queue after its data has been enqueued, the renderer still contains Item 2's data.
  62. Flushing from a source time: 1) stop enqueueing audio data; 2) call flush(fromSourceTime:); 3) wait for the callback.
  63. Play queues, replacing incorrect renderer data: flush from the source time, then re-enqueue; the renderer again holds correct data from the playhead to the last enqueued sample.
  65. Flushing from a source time, gotchas: the flush may fail, for example when the source time is too close to the playhead, so wait for the callback!
  69. // FlushFromSourceTime - AVSampleBufferAudioRenderer
      func performFlush(fromSourceTime sourceTime: CMTime) {
          audioRenderer.stopRequestingMediaData()
          // App-specific logic to ensure no more media data is enqueued

          audioRenderer.flush(fromSourceTime: sourceTime) { (flushSucceeded) in
              if flushSucceeded {
                  self.myPrepareSampleGenerationForMediaTime(sourceTime)
                  audioRenderer.requestMediaDataWhenReady(on: mySerializationQueue) { /* ... */ }
              } else {
                  // Flush and interrupt playback
              }
          }
      }
  73. Audio format support: AVSampleBufferAudioRenderer
  74. Supported audio formats, AVSampleBufferAudioRenderer: all platform-supported audio formats (e.g. LPCM, AAC, MP3, or ALAC; 44.1 kHz or 48 kHz; various bit depths). Mixed formats may be enqueued, e.g. AAC @ 44.1 kHz, MP3 @ 48 kHz 16-bit, and ALAC @ 48 kHz in one queue.
  76. Video synchronization: AVSampleBufferDisplayLayer
  77. Video synchronization with AVSampleBufferDisplayLayer (NEW): the client app feeds audio data to the AudioRenderer and video data to the DisplayLayer, both attached to the same synchronizer; SetRate 1 then starts both in sync. A sketch follows.
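A minimal sketch of the wiring (the feeding code is omitted; see the audio loop above):

```swift
import AVFoundation

let synchronizer = AVSampleBufferRenderSynchronizer()
let audioRenderer = AVSampleBufferAudioRenderer()
let displayLayer = AVSampleBufferDisplayLayer()

// Both conform to AVQueuedSampleBufferRendering, so one synchronizer
// can drive audio and video from the same timebase.
synchronizer.addRenderer(audioRenderer)
synchronizer.addRenderer(displayLayer)

// Enqueue audio on audioRenderer and video on displayLayer, then:
synchronizer.setRate(1.0, time: kCMTimeZero)
```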
  82. #WWDC17 © 2017 Apple Inc. All rights reserved. Redistribution or public display not permitted without written permission from Apple. Akshatha Nagesh, AudioEngine-eer; Béla Balázs, Audio Artisan; Torrey Holbrook Walker, Audio/MIDI Black Ops. What's New in Audio: Session 501, Media.
  83. Agenda: AVAudioEngine, AVAudioSession, watchOS, AUAudioUnit, other enhancements, Inter-Device Audio Mode (IDAM).
  84. AVAudioEngine
  85. Recap: a powerful, feature-rich Objective-C / Swift API set that simplifies real-time audio and is easier to use. Supports playback, recording, processing, mixing, and 3D spatialization. See AVAudioEngine in Practice (WWDC14) and What's New in Core Audio (WWDC15).
  86. What's new (NEW): AVAudioEngine gains manual rendering and auto shutdown; AVAudioPlayerNode gains completion callbacks.
  87. Sample engine setup, karaoke: InputNode feeds an EffectNode (EQ) with a NodeTapBlock for analysis; two PlayerNodes (backing track, sound effects) and the EQ feed a MixerNode into the OutputNode. In manual rendering mode (NEW), the application drives the same graph instead of an audio device. A setup sketch follows.
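Here is that graph in code (a sketch; the band count, buffer size, and analysis hook are placeholders):

```swift
import AVFoundation

let engine = AVAudioEngine()
let backingTrackPlayer = AVAudioPlayerNode()
let soundEffectsPlayer = AVAudioPlayerNode()
let eq = AVAudioUnitEQ(numberOfBands: 4)

engine.attach(backingTrackPlayer)
engine.attach(soundEffectsPlayer)
engine.attach(eq)

let micFormat = engine.inputNode.outputFormat(forBus: 0)
engine.connect(engine.inputNode, to: eq, format: micFormat)
engine.connect(eq, to: engine.mainMixerNode, format: micFormat)
engine.connect(backingTrackPlayer, to: engine.mainMixerNode, format: nil)
engine.connect(soundEffectsPlayer, to: engine.mainMixerNode, format: nil)

// Analysis tap on the vocal path (the "NodeTapBlock" in the diagram).
eq.installTap(onBus: 0, bufferSize: 4096, format: nil) { buffer, time in
    // analyze(buffer) - app-specific
}

do {
    try engine.start()
} catch {
    // handle errors
}
```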
  90. Manual rendering (NEW): the engine is not connected to any audio device and renders in response to requests from the client. Modes: offline and realtime.
  91. Offline manual rendering (NEW): the engine and nodes operate under no deadlines or realtime constraints. A node may choose to use a more expensive signal-processing algorithm, or to block on the render thread for more data if needed; for example, a player node may wait until its worker thread reads the data from disk.
  92. Offline manual rendering example (NEW): in an offline context, the application pulls from a source file through PlayerNode, EffectNode, and OutputNode into a destination file.
  93. Offline manual rendering applications (NEW): post-processing of audio files (e.g. applying reverb or other effects), mixing of audio files, offline processing with CPU-intensive (higher-quality) algorithms, and tuning, debugging, or testing the engine setup. A file-to-file sketch follows.
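The session's code example (below) covers the realtime mode; this is a file-to-file sketch of the offline mode under the same API (URLs and the frame count are placeholders):

```swift
import AVFoundation

func renderOffline(from sourceURL: URL, to outputURL: URL) throws {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    engine.attach(player)
    engine.connect(player, to: engine.mainMixerNode, format: nil)

    let sourceFile = try AVAudioFile(forReading: sourceURL)

    // Switch to offline manual rendering before starting.
    engine.stop()
    try engine.enableManualRenderingMode(.offline,
                                         format: sourceFile.processingFormat,
                                         maximumFrameCount: 4096)
    try engine.start()

    player.scheduleFile(sourceFile, at: nil)
    player.play()

    let outputFile = try AVAudioFile(forWriting: outputURL,
                                     settings: sourceFile.fileFormat.settings)
    let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                                  frameCapacity: engine.manualRenderingMaximumFrameCount)!

    // Pull rendered audio as fast as the CPU allows; no realtime deadlines.
    while engine.manualRenderingSampleTime < sourceFile.length {
        let framesLeft = AVAudioFrameCount(sourceFile.length - engine.manualRenderingSampleTime)
        let status = try engine.renderOffline(min(framesLeft, buffer.frameCapacity), to: buffer)
        if status == .success {
            try outputFile.write(from: buffer)
        }
    }
    engine.stop()
}
```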
  94. Realtime manual rendering (NEW): the engine and nodes operate under realtime constraints and do not make blocking calls (no blocking on a mutex, no libdispatch, etc.) on the render thread; a node may drop data that is not ready to be rendered in time.
  95. Realtime manual rendering applications (NEW): processing audio in an AUAudioUnit's internalRenderBlock, or processing audio data in a movie/video during streaming or playback.
  96. Realtime manual rendering example (NEW): in a realtime context, the application takes audio from an input movie stream through InputNode, EffectNode, and OutputNode into an output movie stream (e.g. to a TV).
  97. // Realtime manual rendering, code example (NEW)
      do {
          let engine = AVAudioEngine()
          // by default engine will render to/from the audio device
          // make connections, e.g. inputNode -> effectNode -> outputNode

          // switch to manual rendering mode
          engine.stop()
          try engine.enableManualRenderingMode(.realtime,
                                               format: outputPCMFormat,
                                               maximumFrameCount: frameCount) // e.g. 1024 @ 48 kHz = 21.33 ms

          let renderBlock = engine.manualRenderingBlock // cache the render block
  101. // set the block to provide input data to engine
          engine.inputNode.setManualRenderingInputPCMFormat(inputPCMFormat) {
              (inputFrameCount) -> UnsafePointer<AudioBufferList>? in
              guard haveData else { return nil }
              // fill and return the input audio buffer list
              return inputBufferList
          }

          // create output buffer, cache the buffer list
          let buffer = AVAudioPCMBuffer(pcmFormat: outputPCMFormat,
                                        frameCapacity: engine.manualRenderingMaximumFrameCount)!
          buffer.frameLength = buffer.frameCapacity
          let outputBufferList = buffer.mutableAudioBufferList

          try engine.start()
      } catch {
          // handle errors
      }
  105. // to render from realtime context
       OSStatus outputError = noErr;
       const auto status = renderBlock(framesToRender, outputBufferList, &outputError);

       switch (status) {
           case AVAudioEngineManualRenderingStatusSuccess:
               handleProcessedOutput(outputBufferList); // data rendered successfully
               break;
           case AVAudioEngineManualRenderingStatusInsufficientDataFromInputNode:
               handleProcessedOutput(outputBufferList); // input node did not provide data,
                                                        // but other sources may have rendered
               break;
           ..
           default:
               break;
       }
  107. Manual rendering, render calls (NEW): offline mode can use either the ObjC/Swift render method or the block-based render call; realtime mode must use the block-based render call.
  109. Auto shutdown (NEW, isAutoShutdownEnabled): the hardware is stopped if it runs idle for a certain duration and is started dynamically when needed. A safety net for conserving power; enforced on watchOS, optional on other platforms. A one-line sketch follows.
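Opting in where it is optional (a sketch):

```swift
import AVFoundation

let engine = AVAudioEngine()
// Enforced on watchOS; elsewhere, opt in to let the engine stop idle hardware.
engine.isAutoShutdownEnabled = true
```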
  111. Completion callbacks (NEW): the existing buffer/file completion handlers are called when the data has been consumed. New completion handler and callback types: AVAudioPlayerNodeCompletionCallbackType, with .dataConsumed, .dataRendered, and .dataPlayedBack.
  112. Completion callbacks, .dataPlayedBack (NEW): the buffer/file has finished playing. Applicable only when the engine is rendering to an audio device; accounts for both the (small) downstream signal-processing latency and the (possibly significant) audio playback device latency.

       player.scheduleFile(file, at: nil, completionCallbackType: .dataPlayedBack) { (callbackType) in
           // file has finished playing from listener's perspective
           // notify to stop the engine and update UI
       }
  114. Completion callbacks, .dataConsumed and .dataRendered (NEW): .dataConsumed is the same as the existing completion handlers; the data has been consumed, so the buffer can be recycled and more data can be scheduled. .dataRendered means the data has been output by the player; useful in manual rendering mode, and does not account for any downstream signal-processing latency.
  115. AVAudioEngine summary (NEW): manual rendering and auto shutdown for AVAudioEngine; completion callbacks for AVAudioPlayerNode. Deprecation coming soon (2018): AUGraph.
  116. AVAudioSession
  117. AirPlay 2 support (NEW): AirPlay 2 is a new technology on iOS, tvOS, and macOS for multi-room audio with AirPlay 2-capable devices. Long-form audio applications (content such as music, podcasts, etc.) get a separate, shared audio route to AirPlay 2 devices, and new AVAudioSession API lets an application identify itself as long-form. See Introducing AirPlay 2, Session 509, Thursday 4:10 PM.
  118. AUAudioUnit
  119. AU MIDI output (NEW, midiOutputNames / MIDIOutputEventBlock): an AU can emit MIDI output synchronized with its audio output. The host sets a block on the AU that is called every render cycle, and can record/edit both the MIDI performance and the audio output from the AU. A host-side sketch follows.
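A host-side sketch, assuming `au` is an already-instantiated AUAudioUnit and `myRecordMIDI` is a hypothetical recording helper:

```swift
import AudioToolbox

// Hypothetical recording helper for this sketch.
func myRecordMIDI(_ bytes: [UInt8], at time: AUEventSampleTime, cable: UInt8) { /* ... */ }

func attachMIDIRecorder(to au: AUAudioUnit) {
    print(au.midiOutputNames) // one name per virtual MIDI output

    // Called on the render thread whenever the AU emits MIDI:
    // copy the bytes out quickly, no heavy work here.
    au.midiOutputEventBlock = { eventSampleTime, cable, length, midiBytes in
        let bytes = Array(UnsafeBufferPointer(start: midiBytes, count: length))
        myRecordMIDI(bytes, at: eventSampleTime, cable: cable)
        return noErr
    }
}
```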
  120. AU view configuration: host applications decide how to display UI for AUs. Current limitations: no standard view sizes are defined, and AUs are supposed to adapt to any view size chosen by the host.
  121. AU preferred view configuration (NEW): the host passes an array of all possible view configurations to the audio unit extension via supportedViewConfigurations(availableViewConfigurations); the AU returns an IndexSet of the configurations it supports; the host then calls select(viewConfiguration) with the chosen one.
  124. // AU preferred view configuration, code example - host application (NEW)
       var smallConfigActive: Bool = false // true if the small view is the currently active one

       @IBAction func toggleViewModes(_ sender: AnyObject?) {
           guard let audioUnit = self.engine.audioUnit else { return }

           let largeConfig = AUAudioUnitViewConfiguration(width: 600, height: 400,
                                                          hostHasController: false)
           let smallConfig = AUAudioUnitViewConfiguration(width: 300, height: 200,
                                                          hostHasController: true)

           let supportedIndices = audioUnit.supportedViewConfigurations([smallConfig, largeConfig])
           if supportedIndices.count == 2 {
               audioUnit.select(self.smallConfigActive ? largeConfig : smallConfig)
               self.smallConfigActive = !self.smallConfigActive
           }
       }
  128. // AU preferred view configuration, code example - AU extension (NEW)
       override public func supportedViewConfigurations(
           _ availableViewConfigurations: [AUAudioUnitViewConfiguration]) -> IndexSet {
           let result = NSMutableIndexSet()
           for (index, config) in availableViewConfigurations.enumerated() {
               // check if the config (width, height, hostHasController) is supported
               // a config of 0x0 (default full size) must always be supported
               if isConfigurationSupported(config) {
                   result.add(index)
               }
           }
           return result as IndexSet
       }
  133. // AU preferred view configuration, code example - AU extension (NEW)
       override public func select(_ viewConfiguration: AUAudioUnitViewConfiguration) {
           // configuration selected by host, used by view controller to re-arrange its view
           self.currentViewConfiguration = viewConfiguration
           self.viewController?.selectViewConfig(self.currentViewConfiguration)
       }
  136. Other enhancements
  137. Audio formats (NEW): FLAC (Free Lossless Audio Codec), with codec, file, and streaming support, for content distribution and streaming applications; Opus, with codec support and file I/O using the .caf container, for VoIP applications. A FLAC-writing sketch follows.
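A sketch of writing FLAC through AVAudioFile (the URL, buffer, and settings values are placeholders; the FLAC format ID is the new piece the slide announces):

```swift
import AVFoundation

func writeFLAC(buffer: AVAudioPCMBuffer, to outputURL: URL) throws {
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatFLAC, // new in iOS 11 / macOS 10.13
        AVSampleRateKey: 44_100.0,
        AVNumberOfChannelsKey: 2
    ]
    let file = try AVAudioFile(forWriting: outputURL, settings: settings)
    try file.write(from: buffer) // PCM in, FLAC out
}
```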
  138. Spatial audio formats, B-format (NEW, kAudioChannelLayoutTag_Ambisonic_B_Format): the audio stream is regular PCM in a .caf file container, with B-format channels W, X, Y, Z (first-order ambisonics). A sketch follows.
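Describing such a stream with AVFoundation types (the sample rate is a placeholder):

```swift
import AVFoundation

// Four PCM channels tagged as first-order ambisonics (W, X, Y, Z).
let layout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_Ambisonic_B_Format)!
let format = AVAudioFormat(standardFormatWithSampleRate: 48_000, channelLayout: layout)
// `format` now describes a 4-channel B-format stream, suitable for a .caf file.
```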
  140. Torrey Holbrook Walker, Audio/MIDI Black Ops: Inter-Device Audio Mode (IDAM)
  141. Inter-Device Audio Mode (IDAM): record audio digitally via a Lightning-to-USB cable, using a USB 2.0 audio class-compliant implementation. Available since El Capitan and iOS 9. IDAM now adds MIDI.
  144. Inter-Device Audio + MIDI (NEW): send and receive MIDI via the Lightning-to-USB cable, with a class-compliant USB MIDI implementation. Requires iOS 11 and macOS El Capitan or later; auto-enabled in the IDAM configuration.
  145. Inter-Device Audio + MIDI (NEW): the device can charge and sync in the IDAM configuration; photo import and tethering are temporarily disabled; audio device aggregation is OK. Use your iOS devices as MIDI controllers, destinations, or both.
  146. Summary: AVAudioEngine - manual rendering; AVAudioSession - AirPlay 2 support; watchOS - recording; AUAudioUnit - preferred view size, MIDI output; other enhancements - audio formats (FLAC, Opus, HOA); Inter-Device Audio and MIDI.
