Core Audio in iOS 6 (CocoaConf Chicago, March 2013)

Core Audio gets a bunch of neat new tricks in iOS 6, particularly for developers working with Audio Units. New effect units include an improved ability to vary pitch and playback speed, a digital delay unit, and OS X's powerful matrix mixer. There's a new place to use units too: the Audio Queue now offers developers a way to "tap" into the data being queued up for playback. To top it all off, a new "multi-route" system allows us to play out of multiple multi-channel output devices at the same time.

Want to see, and hear, how all this stuff works? This session is the place to find out.



  1. Core Audio in iOS 6. Chris Adamson • @invalidname. CocoaConf Chicago, March 9, 2013. Slides and code available on my blog: http://www.subfurther.com/blog
  2. Plug!
  3. The Reviews Are In!
  4. Legitimate copies! • Amazon (paper or Kindle) • Barnes & Noble (paper or Nook) • Apple (iBooks) • Direct from InformIT (paper, eBook [.epub + .mobi + .pdf], or Bundle) • 35% off with code COREAUDIO3174
  5. What You’ll Learn • What Core Audio does and doesn’t do • When to use and not use it • What’s new in Core Audio for iOS 6
  6. (Diagram) “Simple things should be simple, complex things should be possible.” –Alan Kay. AV Foundation and Media Player are the simple side; Core Audio is the complex side.
  7. Core Audio • Low-level C framework for processing audio • Capture, play-out, real-time or off-line processing • The “complex things should be possible” part of audio on OS X and iOS
  8. Chris’ CA Taxonomy • Engines: process streams of audio • Capture, play-out, mixing, effects processing • Helpers: deal with formats, encodings, etc. • File I/O, stream I/O, format conversion, iOS “session” management
  9. Helpers: Audio File • Read from / write to multiple audio file types (.aiff, .wav, .caf, .m4a, .mp3) in a content-agnostic way • Get metadata (data format, duration, iTunes/ID3 info)
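      Not from the slides, but a minimal sketch in their spirit: open a file and ask for its estimated duration. The file path is hypothetical, and CheckError() is the error-checking convenience used throughout this deck.

      #include <AudioToolbox/AudioToolbox.h>

      CFURLRef fileURL = CFURLCreateWithFileSystemPath(kCFAllocatorDefault,
                                                       CFSTR("/tmp/song.m4a"), // hypothetical path
                                                       kCFURLPOSIXPathStyle,
                                                       false);
      AudioFileID audioFile;
      CheckError(AudioFileOpenURL(fileURL,
                                  kAudioFileReadPermission,
                                  0,                       // no file-type hint
                                  &audioFile),
                 "couldnt open audio file");

      Float64 duration = 0;
      UInt32 propSize = sizeof(duration);
      CheckError(AudioFileGetProperty(audioFile,
                                      kAudioFilePropertyEstimatedDuration,
                                      &propSize,
                                      &duration),
                 "couldnt get estimated duration");
      AudioFileClose(audioFile);
      CFRelease(fileURL);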
  10. Helpers: Audio File Stream • Read audio from non-random-access source like a network stream • Discover encoding and encapsulation on the fly, then deliver audio packets to client application
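      A hedged sketch of how that looks in code (MyPropertyProc and MyPacketsProc are hypothetical names): open a parser with two callbacks, then feed it whatever bytes arrive off the network.

      static void MyPropertyProc (void *inClientData,
                                  AudioFileStreamID inAudioFileStream,
                                  AudioFileStreamPropertyID inPropertyID,
                                  UInt32 *ioFlags) {
          // e.g., read kAudioFileStreamProperty_DataFormat once it's discovered
      }

      static void MyPacketsProc (void *inClientData,
                                 UInt32 inNumberBytes,
                                 UInt32 inNumberPackets,
                                 const void *inInputData,
                                 AudioStreamPacketDescription *inPacketDescriptions) {
          // hand the parsed packets to an Audio Queue, converter, etc.
      }

      AudioFileStreamID streamID;
      CheckError(AudioFileStreamOpen(NULL,            // client data
                                     MyPropertyProc,
                                     MyPacketsProc,
                                     0,               // no file-type hint
                                     &streamID),
                 "couldnt open audio file stream");
      // then, for each NSData *chunk delivered by the connection:
      // AudioFileStreamParseBytes(streamID, (UInt32)[chunk length], [chunk bytes], 0);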
  11. Helpers: Converters • Convert buffers of audio to and from different encodings • One side must be in an uncompressed format (i.e., Linear PCM)
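      A minimal sketch (not from the slides) of creating one such converter, PCM on one side and AAC on the other:

      // 44.1 kHz, 16-bit, stereo, interleaved PCM
      AudioStreamBasicDescription pcmFormat = {0};
      pcmFormat.mSampleRate = 44100.0;
      pcmFormat.mFormatID = kAudioFormatLinearPCM;
      pcmFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
      pcmFormat.mBitsPerChannel = 16;
      pcmFormat.mChannelsPerFrame = 2;
      pcmFormat.mBytesPerFrame = 4;
      pcmFormat.mFramesPerPacket = 1;
      pcmFormat.mBytesPerPacket = 4;

      // AAC side: set the basics, let Core Audio fill in the rest
      AudioStreamBasicDescription aacFormat = {0};
      aacFormat.mSampleRate = 44100.0;
      aacFormat.mFormatID = kAudioFormatMPEG4AAC;
      aacFormat.mChannelsPerFrame = 2;
      UInt32 size = sizeof(aacFormat);
      CheckError(AudioFormatGetProperty(kAudioFormatProperty_FormatInfo,
                                        0, NULL, &size, &aacFormat),
                 "couldnt fill out AAC format");

      AudioConverterRef converter;
      CheckError(AudioConverterNew(&pcmFormat, &aacFormat, &converter),
                 "couldnt create converter");
      // drive it with AudioConverterFillComplexBuffer() and a data-supply callback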
  12. Helpers: ExtAudioFile • Combine file I/O and format conversion • Read a compressed file into PCM buffers • Write PCM buffers into a compressed file
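      A hedged sketch of the read-compressed-into-PCM case, reusing the hypothetical fileURL from the Audio File sketch above:

      ExtAudioFileRef extFile;
      CheckError(ExtAudioFileOpenURL(fileURL, &extFile),
                 "couldnt open ext audio file");

      // Ask for 16-bit stereo PCM regardless of what's in the file
      AudioStreamBasicDescription clientFormat = {0};
      clientFormat.mSampleRate = 44100.0;
      clientFormat.mFormatID = kAudioFormatLinearPCM;
      clientFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
      clientFormat.mBitsPerChannel = 16;
      clientFormat.mChannelsPerFrame = 2;
      clientFormat.mBytesPerFrame = 4;
      clientFormat.mFramesPerPacket = 1;
      clientFormat.mBytesPerPacket = 4;
      CheckError(ExtAudioFileSetProperty(extFile,
                                         kExtAudioFileProperty_ClientDataFormat,
                                         sizeof(clientFormat), &clientFormat),
                 "couldnt set client format");
      // subsequent ExtAudioFileRead() calls now deliver converted PCM buffers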
  13. Helpers: Audio Session • iOS-only API to negotiate use of audio resources with the rest of the system • Determine whether your app mixes with other apps’ audio, honors ring/silent switch, can play in background, etc. • Get notified of audio interruptions • See also AVAudioSession
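      For example (iOS 6-era AVAudioSession flavor; a sketch, not the talk's code), claiming the Playback category:

      #import <AVFoundation/AVFoundation.h>

      NSError *error = nil;
      AVAudioSession *session = [AVAudioSession sharedInstance];
      [session setCategory:AVAudioSessionCategoryPlayback error:&error]; // keep playing despite ring/silent switch
      [session setActive:YES error:&error];                              // negotiate with the system now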
  14. Engines: Audio Units • Low-latency (~10ms) processing of capture/play-out audio data • Effects, mixing, etc. • Connect units manually or via an AUGraph • Much more on this topic momentarily…
  15. Engines: Audio Queue • Convenience API for recording or play-out, built atop audio units • Rather than processing on-demand and on Core Audio’s thread, your callback provides or receives buffers of audio (at whatever size is convenient to you) • Higher latency, naturally • Supports compressed formats (MP3, AAC)
  16. Engines: OpenAL • API for 3D spatialized audio, implemented atop audio units • Set a source’s properties (x/y/z coordinates, orientation, audio buffer, etc.), OpenAL renders what it sounds like to the listener from that location
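      A minimal sketch of that idea (assumes a source was already generated with alGenSources() and given a buffer):

      #include <OpenAL/al.h>

      ALuint source;   // assumption: created earlier with alGenSources(1, &source)
      alSource3f(source, AL_POSITION, 2.0f, 0.0f, -1.0f); // to the listener's right, slightly ahead
      alSourcef(source, AL_GAIN, 1.0f);                   // full volume
      alSourcePlay(source);                               // OpenAL renders the spatialized result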
  17. Engines and Helpers • Engines: Audio Units, Audio Queue, OpenAL • Helpers: Audio File, Audio File Stream, Audio Converter, ExtAudioFile, Audio Session
  18. Audio Units
  19. Audio Unit (diagram: a single AUSomething unit)
  20. Types of Audio Units • Output (which also do input) • Generator • Converter • Effect • Mixer • Music
  21. Pull Model (diagram: AudioUnitRender() pulls from AUSomething)
  22. Pull Model (diagram: AUSomething in turn pulls from AUSomethingElse upstream)
  23. Buses (aka, Elements) (diagram: two AUSomethingElse units feeding separate buses of AUSomething)
  24. AUGraph (diagram: the same two-units-into-one connections, managed by an AUGraph)
  25. Render Callbacks (diagram: a render callback function supplying AUSomething, which feeds AUSomethingElse)

      OSStatus converterInputRenderCallback (void *inRefCon,
                                             AudioUnitRenderActionFlags *ioActionFlags,
                                             const AudioTimeStamp *inTimeStamp,
                                             UInt32 inBusNumber,
                                             UInt32 inNumberFrames,
                                             AudioBufferList *ioData) {
          CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon;
          // read from buffer
          ioData->mBuffers[0].mData = player.preRenderData;
          return noErr;
      }
  26. AURemoteIO • Output unit used for play-out, capture • A Core Audio thread repeatedly and automatically calls AudioUnitRender() • Must set EnableIO property to explicitly enable capture and/or play-out • Capture requires setting appropriate AudioSession category
  27. Create AURemoteIO

      CheckError(NewAUGraph(&_auGraph),
                 "couldnt create au graph");
      CheckError(AUGraphOpen(_auGraph),
                 "couldnt open au graph");

      AudioComponentDescription componentDesc;
      componentDesc.componentType = kAudioUnitType_Output;
      componentDesc.componentSubType = kAudioUnitSubType_RemoteIO;
      componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

      AUNode remoteIONode;
      CheckError(AUGraphAddNode(_auGraph, &componentDesc, &remoteIONode),
                 "couldnt add remote io node");
  28. Getting an AudioUnit from AUNode

      CheckError(AUGraphNodeInfo(self.auGraph, remoteIONode, NULL, &_remoteIOUnit),
                 "couldnt get remote io unit from node");
  29. AURemoteIO Buses (diagram: bus 1 carries audio from input H/W to the app; bus 0 carries audio from the app to output H/W)
  30. EnableIO

      UInt32 oneFlag = 1;
      UInt32 busZero = 0;
      CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                      kAudioOutputUnitProperty_EnableIO,
                                      kAudioUnitScope_Output,
                                      busZero,
                                      &oneFlag,
                                      sizeof(oneFlag)),
                 "couldnt enable remote io output");
      UInt32 busOne = 1;
      CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                      kAudioOutputUnitProperty_EnableIO,
                                      kAudioUnitScope_Input,
                                      busOne,
                                      &oneFlag,
                                      sizeof(oneFlag)),
                 "couldnt enable remote io input");
  31. Pass Through (diagram: bus 1 from input H/W into AURemoteIO, bus 0 out to output H/W)
  32. Connect In to Out

      UInt32 busZero = 0;
      UInt32 busOne = 1;
      CheckError(AUGraphConnectNodeInput(self.auGraph,
                                         remoteIONode, busOne,
                                         remoteIONode, busZero),
                 "couldnt connect remote io bus 1 to 0");
  33. Pass-Through with Effect (diagram: input H/W into AURemoteIO bus 1, through AUEffect, back into AURemoteIO bus 0, out to output H/W)
  34. Demo: Delay Effect (New in iOS 6!)
  35. Creating the AUDelay

      componentDesc.componentType = kAudioUnitType_Effect;
      componentDesc.componentSubType = kAudioUnitSubType_Delay;
      componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

      AUNode effectNode;
      CheckError(AUGraphAddNode(self.auGraph, &componentDesc, &effectNode),
                 "couldnt create effect node");
      AudioUnit effectUnit;
      CheckError(AUGraphNodeInfo(self.auGraph, effectNode, NULL, &effectUnit),
                 "couldnt get effect unit from node");
  36. The problem with effect units • Audio Units available since iPhone OS 2.0 prefer int formats • Effect units arrived with iOS 5 (armv7 era) and only work with float format • Have to set the AUEffect unit’s format on AURemoteIO
  37. Setting formats

      AudioStreamBasicDescription effectDataFormat;
      UInt32 propSize = sizeof(effectDataFormat);
      CheckError(AudioUnitGetProperty(effectUnit,
                                      kAudioUnitProperty_StreamFormat,
                                      kAudioUnitScope_Output,
                                      busZero,
                                      &effectDataFormat,
                                      &propSize),
                 "couldnt read effect format");
      CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                      kAudioUnitProperty_StreamFormat,
                                      kAudioUnitScope_Output,
                                      busOne,
                                      &effectDataFormat,
                                      propSize),
                 "couldnt set bus one output format");

      Then repeat AudioUnitSetProperty() for input scope / bus 0
  38. AUNewTimePitch • New in iOS 6! • Allows you to change pitch independent of time, or time independent of pitch • How do you use it?
  39. AUTimePitch

      AudioComponentDescription effectcd = {0};
      effectcd.componentType = kAudioUnitType_FormatConverter;
      effectcd.componentSubType = kAudioUnitSubType_NewTimePitch;
      effectcd.componentManufacturer = kAudioUnitManufacturer_Apple;

      AUNode effectNode;
      CheckError(AUGraphAddNode(self.auGraph, &effectcd, &effectNode),
                 "couldnt get effect node [time/pitch]");

      Notice the type is AUFormatConverter, not AUEffect
  40. AudioUnitParameters.h

      // Parameters for AUNewTimePitch
      enum {
          // Global, rate, 1/32 -> 32.0, 1.0
          kNewTimePitchParam_Rate = 0,
          // Global, Cents, -2400 -> 2400, 1.0
          kNewTimePitchParam_Pitch = 1,
          // Global, generic, 3.0 -> 32.0, 8.0
          kNewTimePitchParam_Overlap = 4,
          // Global, Boolean, 0->1, 1
          kNewTimePitchParam_EnablePeakLocking = 6
      };

      This is the entire documentation for the AUNewTimePitch parameters
  41. AUNewTimePitch parameters • Rate: kNewTimePitchParam_Rate takes a Float32 rate from 1/32 speed to 32x speed • Use powers of 2: 1/32, 1/16, …, 2, 4, 8… • Pitch: kNewTimePitchParam_Pitch takes a Float32 representing cents, meaning 1/100 of a musical semitone
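      Setting one of these is a single AudioUnitSetParameter() call. A sketch, assuming effectUnit was fetched from the time/pitch node with AUGraphNodeInfo() as on slide 35:

      // shift pitch up one octave (1200 cents)
      CheckError(AudioUnitSetParameter(effectUnit,
                                       kNewTimePitchParam_Pitch,
                                       kAudioUnitScope_Global,
                                       0,        // element 0 for global scope
                                       1200.0,   // value, in cents
                                       0),       // buffer offset frames
                 "couldnt set pitch parameter");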
  42. Pitch shifting • Pitch can vary, time does not • Suitable for real-time sources, such as audio capture
  43. Demo: Pitch Shift (New in iOS 6!)
  44. Rate shifting • Rate can vary, pitch does not • Think of 1.5x and 2x speed modes in Podcasts app • Not suitable for real-time sources, as data will be consumed faster; files work well • Sources must be able to map time systems with kAudioUnitProperty_InputSamplesInOutput
  45. Demo: Rate Shift (New in iOS 6!)
  46. AUSplitter (diagram: one AUSplitter feeding two AUSomethingElse units) (New in iOS 6!)
  47. AUMatrixMixer (diagram: several AUSomethingElse units feeding an AUMatrixMixer, which feeds several AUSomethingElse units) (New in iOS 6!)
  48. Audio Queues (and the APIs that help them)
  49. AudioQueue • Easier than AURemoteIO - provide data when you want to, less time pressure, can accept or provide compressed formats (MP3, AAC) • Recording queue - receive buffers of captured audio in a callback • Play-out queue - enqueue buffers of audio to play, optionally refill in a callback
  50. AudioQueue (diagram: a play-out queue holding buffers 2, 1, 0)
  51. Common AQ scenarios • File player - Read from file and “prime” queue buffers, start queue, when called back with used buffer, refill from next part of file • Synthesis - Maintain state in your own code, write raw samples into buffers during callbacks
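      A hedged skeleton of the file-player shape (callback and variable names are hypothetical; dataFormat stands in for the file's ASBD):

      static void MyAQOutputCallback (void *inUserData,
                                      AudioQueueRef inAQ,
                                      AudioQueueBufferRef inBuffer) {
          // refill inBuffer from the next part of the file, then:
          // AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
      }

      AudioQueueRef queue;
      CheckError(AudioQueueNewOutput(&dataFormat,           // ASBD of the audio
                                     MyAQOutputCallback,
                                     (__bridge void *)self, // user data for the callback
                                     NULL, NULL, 0,         // default run loop, no flags
                                     &queue),
                 "couldnt create queue");
      // allocate and prime ~3 buffers with AudioQueueAllocateBuffer(),
      // then AudioQueueStart(queue, NULL);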
  52. Web Radio • Project from CocoaConfs Columbus, Portland, and Raleigh • Use Audio File Stream Services to pick out audio data from a network stream • Enqueue these packets as new AQ buffers • Dispose used buffers in callback
  53. Parsing web radio (diagram): NSURLConnection delivers NSData buffers, containing audio and framing info; we pass them to Audio File Stream Services. Audio File Stream Services calls us back with parsed packets of audio data. We create an AudioQueueBuffer with those packets and enqueue it for play-out.
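      The "create an AudioQueueBuffer and enqueue it" step might look like this sketch (parameter names come from the Audio File Stream packets proc sketched earlier; queue is hypothetical):

      #include <string.h>   // memcpy

      AudioQueueBufferRef buffer;
      CheckError(AudioQueueAllocateBuffer(queue, inNumberBytes, &buffer),
                 "couldnt allocate AQ buffer");
      memcpy(buffer->mAudioData, inInputData, inNumberBytes);
      buffer->mAudioDataByteSize = inNumberBytes;
      CheckError(AudioQueueEnqueueBuffer(queue, buffer,
                                         inNumberPackets, inPacketDescriptions),
                 "couldnt enqueue AQ buffer");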
  54. A complex thing! • What if we want to see that data after it’s been decoded to PCM and is about to be played? • e.g., spectrum analysis, effects, visualizers • AudioQueue design is “fire-and-forget”
  55. AudioQueue Tap! http://www.last.fm/music/Spinal+Tap
  56. AudioQueueProcessingTap • Set as a property on the Audio Queue • Calls back to your function with decoded (PCM) audio data • Three types: pre- or post-effects (that the AQ performs), or siphon. First two can modify the data • Only documentation is in AudioQueue.h
  57. Creating an AQ Tap

      // create the tap
      UInt32 maxFrames = 0;
      AudioStreamBasicDescription tapFormat = {0};
      AudioQueueProcessingTapRef tapRef;
      CheckError(AudioQueueProcessingTapNew(audioQueue,
                                            tapProc,
                                            (__bridge void *)(player),
                                            kAudioQueueProcessingTap_PreEffects,
                                            &maxFrames,
                                            &tapFormat,
                                            &tapRef),
                 "couldnt create AQ tap");

      Notice that you receive maxFrames and tapFormat. These do not appear to be settable.
  58. AQ Tap Proc

      void tapProc (void *inClientData,
                    AudioQueueProcessingTapRef inAQTap,
                    UInt32 inNumberFrames,
                    AudioTimeStamp *ioTimeStamp,
                    UInt32 *ioFlags,
                    UInt32 *outNumberFrames,
                    AudioBufferList *ioData) {
          CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inClientData;
          UInt32 getSourceFlags = 0;
          UInt32 getSourceFrames = 0;
          AudioQueueProcessingTapGetSourceAudio(inAQTap,
                                                inNumberFrames,
                                                ioTimeStamp,
                                                &getSourceFlags,
                                                &getSourceFrames,
                                                ioData);
          // then do something with ioData
          // ...
  59. So what should we do with the audio? Let’s apply our pitch-shift effect
  60. Shouldn’t this work? (diagram: AudioUnitRender() called directly on an AUEffect)
  61. AudioUnitRender() • Last argument is an AudioBufferList, whose AudioBuffer members have mData pointers • If mData != NULL, audio unit does its thing with those samples • If mData == NULL, the audio unit pulls from whatever it’s connected to • So we just call with the AudioBufferList ioData we got from the tap callback, right?
  62. Psych! • AQ tap provides data as signed ints • Effect units only work with floating point • We need to do an on-the-spot format conversion
  63. invalidname’s convert-and-effect recipe (diagram: a render callback feeds AUConverter → AUEffect → AUConverter → AUGenericOutput; red arrows are float format, yellow arrows are int)

      OSStatus converterInputRenderCallback (void *inRefCon,
                                             AudioUnitRenderActionFlags *ioActionFlags,
                                             const AudioTimeStamp *inTimeStamp,
                                             UInt32 inBusNumber,
                                             UInt32 inNumberFrames,
                                             AudioBufferList *ioData) {
          CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon;
          // read from buffer
          ioData->mBuffers[0].mData = player.preRenderData;
          return noErr;
      }
  64. How it works • AUGraph: AUConverter → AUEffect → AUConverter → AUGenericOutput • Top AUConverter is connected to a render callback function
  65. The trick! • Copy mData pointer to a state variable and NULL it in ioData • Call AudioUnitRender() on the output unit. The NULL makes it pull from the graph • Top of the graph pulls on render callback, which gives it back the mData we copied off
  66. Yes, really

      This is the rest of tapProc():

      // copy off the ioData so the graph can read from it
      // in render callback
      player.preRenderData = ioData->mBuffers[0].mData;
      ioData->mBuffers[0].mData = NULL;

      OSStatus renderErr = noErr;
      AudioUnitRenderActionFlags actionFlags = 0;
      renderErr = AudioUnitRender(player.genericOutputUnit,
                                  &actionFlags,
                                  player.renderTimeStamp,
                                  0,
                                  inNumberFrames,
                                  ioData);
      NSLog (@"AudioUnitRender, renderErr = %ld", renderErr);
      }
  67. Yes, really

      This is the render callback that supplies data to the int→float converter:

      OSStatus converterInputRenderCallback (void *inRefCon,
                                             AudioUnitRenderActionFlags *ioActionFlags,
                                             const AudioTimeStamp *inTimeStamp,
                                             UInt32 inBusNumber,
                                             UInt32 inNumberFrames,
                                             AudioBufferList *ioData) {
          CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon;
          // read from buffer
          ioData->mBuffers[0].mData = player.preRenderData;
          return noErr;
      }
  68. Demo: AQ Tap + AUNewTimePitch (New in iOS 6!)
  69. Other new stuff
  70. Multi-Route • Ordinarily, one input or output is active: earpiece, speaker, headphones, dock-connected device • “Last in wins” • With the audio session “multi-route” category, you can use several at once • WWDC 2012 session 505
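      A minimal sketch of opting in (the category constant is real; everything past that is up to your app):

      NSError *error = nil;
      [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryMultiRoute
                                             error:&error];
      [[AVAudioSession sharedInstance] setActive:YES error:&error];
      // inspect [AVAudioSession sharedInstance].currentRoute to see every
      // active input/output, then address their channels from your units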
  71. Utility classes moved again • C++ utilities, including the CARingBuffer • < Xcode 4.3: installed into /Developer • Xcode 4.3-4.4: optional download from developer.apple.com • Xcode 4.5: sample code project “Core Audio Utility Classes”
  72. Takeaways • Core Audio fundamentals never change • New stuff is added as properties, typedefs, enums, etc. • Watch the SDK API diffs document to find the new stuff • Hope you like header files and experimentation
  73. Q&A • Slides will be posted to slideshare.net/invalidname • Code will be linked from there and my blog • Watch the CocoaConf glassboard, @invalidname on Twitter/ADN, or my blog for announcements • Thanks!
