Core Audio in iOS 6 (CocoaConf Portland, Oct. '12)

Core Audio gets a bunch of neat new tricks in iOS 6, particularly for developers working with Audio Units. New effect units include an improved ability to vary pitch and playback speed, a digital delay unit, and OS X's powerful matrix mixer. There's now a new place to use units too, as the Audio Queue now offers developers a way to "tap" into the data being queued up for playback. To top it all off, a new "multi-route" system allows us to play out of multiple, multi-channel output devices at the same time.

Want to see, and hear, how all this stuff works? This section is the place to find out.

Core Audio in iOS 6
Chris Adamson • @invalidname
CocoaConf PDX • October 27, 2012
Slides and sample code will be available later today

Plug!

The Reviews Are In!

Legitimate copies!
• Amazon (paper or Kindle)
• Barnes & Noble (paper or Nook)
• Apple (iBooks)
• Direct from InformIT (paper, eBook [.epub + .mobi + .pdf], or bundle)
  • 35% off with code COREAUDIO3174
What You’ll Learn
• What Core Audio does and doesn’t do
• When to use and not use it
• What’s new in Core Audio for iOS 6

“Simple things should be simple, complex things should be possible.” –Alan Kay

AV Foundation and Media Player are the “simple things should be simple” part; Core Audio is the “complex things should be possible” part.
Core Audio
• Low-level C framework for processing audio
  • Capture, play-out, real-time or off-line processing
• The “complex things should be possible” part of audio on OS X and iOS

Chris’ CA Taxonomy
• Engines: process streams of audio
  • Capture, play-out, mixing, effects processing
• Helpers: deal with formats, encodings, etc.
  • File I/O, stream I/O, format conversion, iOS “session” management
Helpers: Audio File
• Read from / write to multiple audio file types (.aiff, .wav, .caf, .m4a, .mp3) in a content-agnostic way
• Get metadata (data format, duration, iTunes/ID3 info)

Helpers: Audio File Stream
• Read audio from a non-random-access source, like a network stream
• Discover encoding and encapsulation on the fly, then deliver audio packets to the client application

Helpers: Converters
• Convert buffers of audio to and from different encodings
• One side must be in an uncompressed format (i.e., Linear PCM)
Helpers: ExtAudioFile
• Combines file I/O and format conversion
• Read a compressed file into PCM buffers
• Write PCM buffers into a compressed file

Helpers: Audio Session
• iOS-only API to negotiate use of audio resources with the rest of the system
• Determine whether your app mixes with other apps’ audio, honors the ring/silent switch, can play in background, etc.
• Get notified of audio interruptions
• See also AVAudioSession
Engines: Audio Units
• Low-latency (~10ms) processing of capture/play-out audio data
• Effects, mixing, etc.
• Connect units manually or via an AUGraph
• Much more on this topic momentarily…

Engines: Audio Queue
• Convenience API for recording or play-out, built atop audio units
• Rather than processing on-demand on Core Audio’s thread, your callback provides or receives buffers of audio (at whatever size is convenient to you)
• Higher latency, naturally
• Supports compressed formats (MP3, AAC)

Engines: OpenAL
• API for 3D spatialized audio, implemented atop audio units
• Set a source’s properties (x/y/z coordinates, orientation, audio buffer, etc.) and OpenAL renders what it sounds like to the listener from that location

Engines and Helpers
• Engines: Audio Units, Audio Queue, OpenAL
• Helpers: Audio File, Audio File Stream, Audio Converter, ExtAudioFile, Audio Session
Audio Units

An audio unit: AUSomething

Types of Audio Units
• Output (which also do input)
• Generator
• Converter
• Effect
• Mixer
• Music

Pull Model
  AUSomethingElse → AUSomething ← AudioUnitRender()
A caller invokes AudioUnitRender() on the downstream unit, which in turn pulls the audio it needs from the unit connected upstream.
Buses (aka Elements)
• A unit can have multiple buses (elements), each fed by a different upstream unit

AUGraph
• An AUGraph manages a set of connected units as a whole

Render Callbacks
• Instead of connecting another unit, a bus can pull from a render callback function you supply:

    OSStatus converterInputRenderCallback (void *inRefCon,
                                           AudioUnitRenderActionFlags *ioActionFlags,
                                           const AudioTimeStamp *inTimeStamp,
                                           UInt32 inBusNumber,
                                           UInt32 inNumberFrames,
                                           AudioBufferList *ioData) {
        CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon;
        // read from buffer
        ioData->mBuffers[0].mData = player.preRenderData;
        return noErr;
    }
AURemoteIO
• Output unit used for play-out and capture
• A Core Audio thread repeatedly and automatically calls AudioUnitRender()
• Must set the EnableIO property to explicitly enable capture and/or play-out
  • Capture requires setting an appropriate AudioSession category

Create AURemoteIO

    CheckError(NewAUGraph(&_auGraph),
               "couldn't create au graph");
    CheckError(AUGraphOpen(_auGraph),
               "couldn't open au graph");

    AudioComponentDescription componentDesc;
    componentDesc.componentType = kAudioUnitType_Output;
    componentDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode remoteIONode;
    CheckError(AUGraphAddNode(_auGraph,
                              &componentDesc,
                              &remoteIONode),
               "couldn't add remote io node");

Getting an AudioUnit from an AUNode

    CheckError(AUGraphNodeInfo(self.auGraph,
                               remoteIONode,
                               NULL,
                               &_remoteIOUnit),
               "couldn't get remote io unit from node");
AURemoteIO Buses
• Bus 1, input scope: from input hardware
• Bus 1, output scope: to app
• Bus 0, input scope: from app
• Bus 0, output scope: to output hardware

EnableIO

    UInt32 oneFlag = 1;
    UInt32 busZero = 0;
    CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                    kAudioOutputUnitProperty_EnableIO,
                                    kAudioUnitScope_Output,
                                    busZero,
                                    &oneFlag,
                                    sizeof(oneFlag)),
               "couldn't enable remote io output");

    UInt32 busOne = 1;
    CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                    kAudioOutputUnitProperty_EnableIO,
                                    kAudioUnitScope_Input,
                                    busOne,
                                    &oneFlag,
                                    sizeof(oneFlag)),
               "couldn't enable remote io input");
Pass Through
• Connect bus 1 (from input hardware) directly to bus 0 (to output hardware)

Connect In to Out

    UInt32 busZero = 0;
    UInt32 busOne = 1;
    CheckError(AUGraphConnectNodeInput(self.auGraph,
                                       remoteIONode,
                                       busOne,
                                       remoteIONode,
                                       busZero),
               "couldn't connect remote io bus 1 to 0");

Pass-Through with Effect
• Insert an AUEffect between bus 1 (from input hardware) and bus 0 (to output hardware)
Demo: Delay Effect (new in iOS 6!)

Creating the AUDelay

    componentDesc.componentType = kAudioUnitType_Effect;
    componentDesc.componentSubType = kAudioUnitSubType_Delay;
    componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode effectNode;
    CheckError(AUGraphAddNode(self.auGraph,
                              &componentDesc,
                              &effectNode),
               "couldn't create effect node");

    AudioUnit effectUnit;
    CheckError(AUGraphNodeInfo(self.auGraph,
                               effectNode,
                               NULL,
                               &effectUnit),
               "couldn't get effect unit from node");

The problem with effect units
• Audio units available since iPhone OS 2.0 prefer int formats
• Effect units arrived with iOS 5 (armv7 era) and only work with float formats
• Have to set the AUEffect unit’s format on AURemoteIO

Setting formats

    AudioStreamBasicDescription effectDataFormat;
    UInt32 propSize = sizeof (effectDataFormat);
    CheckError(AudioUnitGetProperty(effectUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Output,
                                    busZero,
                                    &effectDataFormat,
                                    &propSize),
               "couldn't read effect format");
    CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Output,
                                    busOne,
                                    &effectDataFormat,
                                    propSize),
               "couldn't set bus one output format");

Then repeat AudioUnitSetProperty() for input scope / bus 0.
AUNewTimePitch
• New in iOS 6!
• Allows you to change pitch independent of time, or time independent of pitch
• How do you use it?

Creating the AUNewTimePitch

    AudioComponentDescription effectcd = {0};
    effectcd.componentType = kAudioUnitType_FormatConverter;
    effectcd.componentSubType = kAudioUnitSubType_NewTimePitch;
    effectcd.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode effectNode;
    CheckError(AUGraphAddNode(self.auGraph,
                              &effectcd,
                              &effectNode),
               "couldn't get effect node [time/pitch]");

Notice the type is kAudioUnitType_FormatConverter, not kAudioUnitType_Effect.

AudioUnitParameters.h

    // Parameters for AUNewTimePitch
    enum {
        // Global, rate, 1/32 -> 32.0, 1.0
        kNewTimePitchParam_Rate = 0,
        // Global, Cents, -2400 -> 2400, 1.0
        kNewTimePitchParam_Pitch = 1,
        // Global, generic, 3.0 -> 32.0, 8.0
        kNewTimePitchParam_Overlap = 4,
        // Global, Boolean, 0->1, 1
        kNewTimePitchParam_EnablePeakLocking = 6
    };

This is the entire documentation for the AUNewTimePitch parameters.

AUNewTimePitch parameters
• Rate: kNewTimePitchParam_Rate takes a Float32 rate from 1/32 speed to 32x speed.
  • Use powers of 2: 1/32, 1/16, …, 2, 4, 8…
• Pitch: kNewTimePitchParam_Pitch takes a Float32 representing cents, meaning 1/100 of a musical semitone
Pitch shifting
• Pitch can vary, time does not
• Suitable for real-time sources, such as audio capture

Demo: Pitch Shift (new in iOS 6!)

Rate shifting
• Rate can vary, pitch does not
  • Think of the 1.5x and 2x speed modes in the Podcasts app
• Not suitable for real-time sources, as data will be consumed faster. Files work well.
  • Sources must be able to map time systems with kAudioUnitProperty_InputSamplesInOutput

Demo: Rate Shift (new in iOS 6!)

AUSplitter (new in iOS 6!)
• Splits one input stream into multiple output streams

    AUSplitter ─┬→ AUSomethingElse
                └→ AUSomethingElse

AUMatrixMixer (new in iOS 6!)
• Mixes multiple input streams to multiple output streams

    AUSomethingElse ─┐                 ┌→ AUSomethingElse
    AUSomethingElse ─┼→ AUMatrixMixer ─┼→ AUSomethingElse
    AUSomethingElse ─┘                 └→ AUSomethingElse
Audio Queues (and the APIs that help them)

AudioQueue
• Easier than AURemoteIO: provide data when you want to, less time pressure, can accept or provide compressed formats (MP3, AAC)
• Recording queue: receive buffers of captured audio in a callback
• Play-out queue: enqueue buffers of audio to play, optionally refill in a callback
• The queue plays its enqueued buffers in order (e.g., buffers 2, 1, 0)

Common AQ scenarios
• File player: read from the file and “prime” the queue buffers, start the queue; when called back with a used buffer, refill it from the next part of the file
• Synthesis: maintain state in your own code, write raw samples into buffers during callbacks
Web Radio
• Thursday class’ third project
• Use Audio File Stream Services to pick out audio data from a network stream
• Enqueue these packets as new AQ buffers
• Dispose of used buffers in the callback

Parsing web radio
• NSURLConnection delivers NSData buffers containing audio and framing info. We pass them to Audio File Stream Services.
• Audio File Stream Services calls us back with parsed packets of audio data.
• We create an AudioQueueBuffer with those packets and enqueue it for play-out.
A complex thing!
• What if we want to see that data after it’s been decoded to PCM and is about to be played?
  • e.g., spectrum analysis, effects, visualizers
• AudioQueue design is “fire-and-forget”

AudioQueue Tap!
http://www.last.fm/music/Spinal+Tap

AudioQueueProcessingTap
• Set as a property on the Audio Queue
• Calls back to your function with decoded (PCM) audio data
• Three types: pre-effects or post-effects (relative to the effects the AQ performs), or siphon. The first two can modify the data.
• Only documentation is in AudioQueue.h

Creating an AQ Tap

    // create the tap
    UInt32 maxFrames = 0;
    AudioStreamBasicDescription tapFormat = {0};
    AudioQueueProcessingTapRef tapRef;
    CheckError(AudioQueueProcessingTapNew(audioQueue,
                                          tapProc,
                                          (__bridge void *)(player),
                                          kAudioQueueProcessingTap_PreEffects,
                                          &maxFrames,
                                          &tapFormat,
                                          &tapRef),
               "couldn't create AQ tap");

Notice that you receive maxFrames and tapFormat. These do not appear to be settable.

AQ Tap Proc

    void tapProc (void *                     inClientData,
                  AudioQueueProcessingTapRef inAQTap,
                  UInt32                     inNumberFrames,
                  AudioTimeStamp             *ioTimeStamp,
                  UInt32                     *ioFlags,
                  UInt32                     *outNumberFrames,
                  AudioBufferList            *ioData) {
        CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inClientData;
        UInt32 getSourceFlags = 0;
        UInt32 getSourceFrames = 0;
        AudioQueueProcessingTapGetSourceAudio(inAQTap,
                                              inNumberFrames,
                                              ioTimeStamp,
                                              &getSourceFlags,
                                              &getSourceFrames,
                                              ioData);
        // then do something with ioData
        // ...
So what should we do with the audio?
Let’s apply our pitch-shift effect.

Shouldn’t this work?
  AUEffect ← AudioUnitRender()

AudioUnitRender()
• Last argument is an AudioBufferList, whose AudioBuffer members have mData pointers
  • If mData != NULL, the audio unit does its thing with those samples
  • If mData == NULL, the audio unit pulls from whatever it’s connected to
• So we just call with the AudioBufferList ioData we got from the tap callback, right?
Psych!
• The AQ tap provides data as signed ints
• Effect units only work with floating point
• We need to do an on-the-spot format conversion

invalidname’s convert-and-effect recipe
• AUGraph: AUConverter → AUEffect → AUConverter → AUGenericOutput
• In the slide’s diagram, red arrows carry float-format audio and yellow arrows carry int

How it works
• The top AUConverter is connected to a render callback function

The trick!
• Copy the mData pointer to a state variable and NULL it in ioData
• Call AudioUnitRender() on the output unit. The NULL makes it pull from the graph.
• The top of the graph pulls on the render callback, which gives it back the mData we copied off.

Yes, really
This is the rest of tapProc():

    // copy off the ioData so the graph can read from it
    // in render callback
    player.preRenderData = ioData->mBuffers[0].mData;
    ioData->mBuffers[0].mData = NULL;

    OSStatus renderErr = noErr;
    AudioUnitRenderActionFlags actionFlags = 0;
    renderErr = AudioUnitRender(player.genericOutputUnit,
                                &actionFlags,
                                player.renderTimeStamp,
                                0,
                                inNumberFrames,
                                ioData);
    NSLog (@"AudioUnitRender, renderErr = %ld", renderErr);
}

Yes, really
This is the render callback that supplies data to the int→float converter:

    OSStatus converterInputRenderCallback (void *inRefCon,
                                           AudioUnitRenderActionFlags *ioActionFlags,
                                           const AudioTimeStamp *inTimeStamp,
                                           UInt32 inBusNumber,
                                           UInt32 inNumberFrames,
                                           AudioBufferList *ioData) {
        CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon;

        // read from buffer
        ioData->mBuffers[0].mData = player.preRenderData;
        return noErr;
    }
Demo: AQ Tap + AUNewTimePitch (new in iOS 6!)

Other new stuff

Multi-Route
• Ordinarily, one input or output is active: earpiece, speaker, headphones, dock-connected device
  • “Last in wins”
• With the AV Session “multi-route” category, you can use several at once
• WWDC 2012 session 505

Utility classes moved again
• C++ utilities, including the CARingBuffer
  • Before Xcode 4.3: installed into /Developer
  • Xcode 4.3–4.4: optional download from developer.apple.com
  • Xcode 4.5 and later: sample code project “Core Audio Utility Classes”
Takeaways
• Core Audio fundamentals never change
• New stuff is added as properties, typedefs, enums, etc.
• Watch the SDK API diffs document to find the new stuff
• Hope you like header files and experimentation

Q&A
• Slides will be posted to slideshare.net/invalidname
• Code will be linked from there and my blog
• Watch the CocoaConf PDX glassboard, @invalidname on Twitter/ADN, or the [Time code]; blog for the announcement
• Thanks!
