Core Audio in iOS 6 (CocoaConf San Jose, April 2013)

Transcript

  • 1. Core Audio in iOS 6
    Chris Adamson • @invalidname
    CocoaConf San Jose, April 20, 2013
    Slides and code available on my blog: http://www.subfurther.com/blog
    (Monday, April 29, 13)
  • 2. Plug!
  • 3.–6. The Reviews Are In!
  • 7. Legitimate copies!
    • Amazon (paper or Kindle)
    • Barnes & Noble (paper or Nook)
    • Apple (iBooks)
    • Direct from InformIT (paper, eBook [.epub + .mobi + .pdf], or Bundle)
  • 8. What You'll Learn
    • What Core Audio does and doesn't do
    • When to use and not use it
    • What's new in Core Audio for iOS 6
  • 9. (no slide text)
  • 10.–12. "Simple things should be simple, complex things should be possible." –Alan Kay
    • Simple things: AV Foundation, Media Player
    • Complex things: Core Audio
  • 13. Core Audio
    • Low-level C framework for processing audio
    • Capture, play-out, real-time or off-line processing
    • The "complex things should be possible" part of audio on OS X and iOS
  • 14. Chris' CA Taxonomy
    • Engines: process streams of audio (capture, play-out, mixing, effects processing)
    • Helpers: deal with formats, encodings, etc. (file I/O, stream I/O, format conversion, iOS "session" management)
  • 15. Helpers: Audio File
    • Read from / write to multiple audio file types (.aiff, .wav, .caf, .m4a, .mp3) in a content-agnostic way
    • Get metadata (data format, duration, iTunes/ID3 info)
  • 16. Helpers: Audio File Stream
    • Read audio from a non-random-access source like a network stream
    • Discover encoding and encapsulation on the fly, then deliver audio packets to the client application
  • 17. Helpers: Converters
    • Convert buffers of audio to and from different encodings
    • One side must be in an uncompressed format (i.e., Linear PCM)
  • 18. Helpers: ExtAudioFile
    • Combines file I/O and format conversion
    • Read a compressed file into PCM buffers
    • Write PCM buffers into a compressed file
  • 19. Helpers: Audio Session
    • iOS-only API to negotiate use of audio resources with the rest of the system
    • Determine whether your app mixes with other apps' audio, honors the ring/silent switch, can play in the background, etc.
    • Get notified of audio interruptions
    • See also AVAudioSession
  • 20. Engines: Audio Units
    • Low-latency (~10 ms) processing of capture/play-out audio data
    • Effects, mixing, etc.
    • Connect units manually or via an AUGraph
    • Much more on this topic momentarily…
  • 21. Engines: Audio Queue
    • Convenience API for recording or play-out, built atop audio units
    • Rather than processing on demand on Core Audio's thread, your callback provides or receives buffers of audio (at whatever size is convenient to you)
    • Higher latency, naturally
    • Supports compressed formats (MP3, AAC)
  • 22. Engines: OpenAL
    • API for 3D spatialized audio, implemented atop audio units
    • Set a source's properties (x/y/z coordinates, orientation, audio buffer, etc.); OpenAL renders what it sounds like to the listener from that location
  • 23. Engines and Helpers
    • Engines: Audio Units, Audio Queue, OpenAL
    • Helpers: Audio File, Audio File Stream, Audio Converter, ExtAudioFile, Audio Session
  • 24. Audio Units
  • 25. Audio Unit (diagram: a single AUSomething)
  • 26. Types of Audio Units
    • Output (which also do input)
    • Generator
    • Converter
    • Effect
    • Mixer
    • Music
  • 27.–29. Pull Model (diagram build: AudioUnitRender() pulls from AUSomething, which in turn pulls from AUSomethingElse)
  • 30. Buses (aka Elements) (diagram: AUSomething fed by two AUSomethingElse units on separate buses)
  • 31. AUGraph (diagram: the same units connected as a graph)
  • 32. Render Callbacks

    OSStatus converterInputRenderCallback (void *inRefCon,
                                           AudioUnitRenderActionFlags *ioActionFlags,
                                           const AudioTimeStamp *inTimeStamp,
                                           UInt32 inBusNumber,
                                           UInt32 inNumberFrames,
                                           AudioBufferList *ioData) {
        CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer *) inRefCon;
        // read from buffer
        ioData->mBuffers[0].mData = player.preRenderData;
        return noErr;
    }
  • 33. AURemoteIO
    • Output unit used for play-out, capture
    • A Core Audio thread repeatedly and automatically calls AudioUnitRender()
    • Must set EnableIO property to explicitly enable capture and/or play-out
    • Capture requires setting appropriate AudioSession category
  • 34. Create AURemoteIO

    CheckError(NewAUGraph(&_auGraph),
               "couldnt create au graph");
    CheckError(AUGraphOpen(_auGraph),
               "couldnt open au graph");

    AudioComponentDescription componentDesc;
    componentDesc.componentType = kAudioUnitType_Output;
    componentDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode remoteIONode;
    CheckError(AUGraphAddNode(_auGraph,
                              &componentDesc,
                              &remoteIONode),
               "couldnt add remote io node");

  • 35. Getting an AudioUnit from AUNode

    CheckError(AUGraphNodeInfo(self.auGraph,
                               remoteIONode,
                               NULL,
                               &_remoteIOUnit),
               "couldnt get remote io unit from node");
  • 36.–40. AURemoteIO Buses (diagram build)
    • bus 0: from app → to output H/W
    • bus 1: from input H/W → to app
  • 41. EnableIO

    UInt32 oneFlag = 1;
    UInt32 busZero = 0;
    CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                    kAudioOutputUnitProperty_EnableIO,
                                    kAudioUnitScope_Output,
                                    busZero,
                                    &oneFlag,
                                    sizeof(oneFlag)),
               "couldnt enable remote io output");

    UInt32 busOne = 1;
    CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                    kAudioOutputUnitProperty_EnableIO,
                                    kAudioUnitScope_Input,
                                    busOne,
                                    &oneFlag,
                                    sizeof(oneFlag)),
               "couldnt enable remote io input");
  • 42. Pass Through (diagram: bus 1 from input H/W, bus 0 to output H/W)
  • 43. Connect In to Out

    UInt32 busZero = 0;
    UInt32 busOne = 1;
    CheckError(AUGraphConnectNodeInput(self.auGraph,
                                       remoteIONode,
                                       busOne,
                                       remoteIONode,
                                       busZero),
               "couldnt connect remote io bus 1 to 0");

  • 44. Pass-Through with Effect (diagram: input H/W bus 1 → AUEffect → AURemoteIO bus 0 → output H/W)
  • 45. Demo: Delay Effect (New in iOS 6!)
  • 46. Creating the AUDelay

    componentDesc.componentType = kAudioUnitType_Effect;
    componentDesc.componentSubType = kAudioUnitSubType_Delay;
    componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode effectNode;
    CheckError(AUGraphAddNode(self.auGraph,
                              &componentDesc,
                              &effectNode),
               "couldnt create effect node");

    AudioUnit effectUnit;
    CheckError(AUGraphNodeInfo(self.auGraph,
                               effectNode,
                               NULL,
                               &effectUnit),
               "couldnt get effect unit from node");
  • 47. The problem with effect units
    • Audio Units available since iPhone OS 2.0 prefer int formats
    • Effect units arrived with iOS 5 (armv7 era) and only work with float formats
    • Have to set the AUEffect unit's format on AURemoteIO
  • 48. Setting formats

    AudioStreamBasicDescription effectDataFormat;
    UInt32 propSize = sizeof (effectDataFormat);
    CheckError(AudioUnitGetProperty(effectUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Output,
                                    busZero,
                                    &effectDataFormat,
                                    &propSize),
               "couldnt read effect format");
    CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Output,
                                    busOne,
                                    &effectDataFormat,
                                    propSize),
               "couldnt set bus one output format");

    Then repeat AudioUnitSetProperty() for input scope / bus 0
  • 49. AUNewTimePitch
    • New in iOS 6!
    • Allows you to change pitch independent of time, or time independent of pitch
    • How do you use it?
  • 50. AUTimePitch

    AudioComponentDescription effectcd = {0};
    effectcd.componentType = kAudioUnitType_FormatConverter;
    effectcd.componentSubType = kAudioUnitSubType_NewTimePitch;
    effectcd.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode effectNode;
    CheckError(AUGraphAddNode(self.auGraph,
                              &effectcd,
                              &effectNode),
               "couldnt get effect node [time/pitch]");

    Notice the type is AUFormatConverter, not AUEffect
  • 51. AudioUnitParameters.h

    // Parameters for AUNewTimePitch
    enum {
        // Global, rate, 1/32 -> 32.0, 1.0
        kNewTimePitchParam_Rate = 0,
        // Global, Cents, -2400 -> 2400, 1.0
        kNewTimePitchParam_Pitch = 1,
        // Global, generic, 3.0 -> 32.0, 8.0
        kNewTimePitchParam_Overlap = 4,
        // Global, Boolean, 0->1, 1
        kNewTimePitchParam_EnablePeakLocking = 6
    };

    This is the entire documentation for the AUNewTimePitch parameters
  • 52. AUNewTimePitch parameters
    • Rate: kNewTimePitchParam_Rate takes a Float32 rate from 1/32 speed to 32x speed
    • Use powers of 2: 1/32, 1/16, …, 2, 4, 8…
    • Pitch: kNewTimePitchParam_Pitch takes a Float32 representing cents, meaning 1/100 of a musical semitone
  • 53. Pitch shifting
    • Pitch can vary, time does not
    • Suitable for real-time sources, such as audio capture
  • 54. Demo: Pitch Shift (New in iOS 6!)
  • 55. Rate shifting
    • Rate can vary, pitch does not
    • Think of 1.5x and 2x speed modes in the Podcasts app
    • Not suitable for real-time sources, as data will be consumed faster. Files work well.
    • Sources must be able to map time systems with kAudioUnitProperty_InputSamplesInOutput
  • 56. Demo: Rate Shift (New in iOS 6!)
  • 57. AUSplitter (New in iOS 6!) (diagram: AUSplitter feeding two AUSomethingElse units)
  • 58. AUMatrixMixer (New in iOS 6!) (diagram: AUMatrixMixer with multiple AUSomethingElse inputs and outputs)
  • 59. Audio Queues (and the APIs that help them)
  • 60. AudioQueue
    • Easier than AURemoteIO: provide data when you want to, less time pressure, can accept or provide compressed formats (MP3, AAC)
    • Recording queue: receive buffers of captured audio in a callback
    • Play-out queue: enqueue buffers of audio to play, optionally refill in a callback
  • 61.–64. Audio Queue (diagram build)
  • 65. Common AQ scenarios
    • File player: read from file and "prime" queue buffers, start queue; when called back with a used buffer, refill it from the next part of the file
    • Synthesis: maintain state in your own code, write raw samples into buffers during callbacks
  • 66. Web Radio
    • Project from Thursday's workshop
    • Use Audio File Stream Services to pick out audio data from a network stream
    • Enqueue these packets as new AQ buffers
    • Dispose used buffers in callback
  • 67.–70. Parsing web radio (diagram build)
    • NSURLConnection delivers NSData buffers, containing audio and framing info. We pass it to Audio File Services.
    • Audio File Services calls us back with parsed packets of audio data.
    • We create an AudioQueueBuffer with those packets and enqueue it for play-out.
  • 71. A complex thing!
    • What if we want to see that data after it's been decoded to PCM and is about to be played?
    • e.g., spectrum analysis, effects, visualizers
    • AudioQueue design is "fire-and-forget"
  • 72. AudioQueue Tap! (http://www.last.fm/music/Spinal+Tap)
  • 73. AudioQueueProcessingTap
    • Set as a property on the Audio Queue
    • Calls back to your function with decoded (PCM) audio data
    • Three types: pre- or post-effects (that the AQ performs), or siphon. First two can modify the data.
    • Only documentation is in AudioQueue.h
  • 74. Creating an AQ Tap

    // create the tap
    UInt32 maxFrames = 0;
    AudioStreamBasicDescription tapFormat = {0};
    AudioQueueProcessingTapRef tapRef;
    CheckError(AudioQueueProcessingTapNew(audioQueue,
                                          tapProc,
                                          (__bridge void *)(player),
                                          kAudioQueueProcessingTap_PreEffects,
                                          &maxFrames,
                                          &tapFormat,
                                          &tapRef),
               "couldnt create AQ tap");

    Notice that you receive maxFrames and tapFormat. These do not appear to be settable.
  • 75. AQ Tap Proc

    void tapProc (void *inClientData,
                  AudioQueueProcessingTapRef inAQTap,
                  UInt32 inNumberFrames,
                  AudioTimeStamp *ioTimeStamp,
                  UInt32 *ioFlags,
                  UInt32 *outNumberFrames,
                  AudioBufferList *ioData) {
        CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer *) inClientData;
        UInt32 getSourceFlags = 0;
        UInt32 getSourceFrames = 0;
        AudioQueueProcessingTapGetSourceAudio(inAQTap,
                                              inNumberFrames,
                                              ioTimeStamp,
                                              &getSourceFlags,
                                              &getSourceFrames,
                                              ioData);
        // then do something with ioData
        // ...
  • 76.–77. So what should we do with the audio? Let's apply our pitch-shift effect
  • 78.–79. Shouldn't this work? (diagram: AudioUnitRender() pulling from an AUEffect)
  • 80. AudioUnitRender()
    • Last argument is an AudioBufferList, whose AudioBuffer members have mData pointers
    • If mData != NULL, the audio unit does its thing with those samples
    • If mData == NULL, the audio unit pulls from whatever it's connected to
    • So we just call with the AudioBufferList ioData we got from the tap callback, right?
  • 81. Psych!
    • AQ tap provides data as signed ints
    • Effect units only work with floating point
    • We need to do an on-the-spot format conversion
  • 82. invalidname's convert-and-effect recipe
    (diagram: render callback → AUConverter → AUEffect → AUConverter → AUGenericOutput; red arrows are float format, yellow arrows are int)

    OSStatus converterInputRenderCallback (void *inRefCon,
                                           AudioUnitRenderActionFlags *ioActionFlags,
                                           const AudioTimeStamp *inTimeStamp,
                                           UInt32 inBusNumber,
                                           UInt32 inNumberFrames,
                                           AudioBufferList *ioData) {
        CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer *) inRefCon;
        // read from buffer
        ioData->mBuffers[0].mData = player.preRenderData;
        return noErr;
    }

  • 83. How it works
    • AUGraph: AUConverter → AUEffect → AUConverter → AUGenericOutput
    • Top AUConverter is connected to a render callback function
  • 84. The trick!
    • Copy the mData pointer to a state variable and NULL it in ioData
    • Call AudioUnitRender() on the output unit. The NULL makes it pull from the graph.
    • Top of the graph pulls on the render callback, which gives it back the mData we copied off.
  • 85. Yes, really
    This is the rest of tapProc():

    // copy off the ioData so the graph can read from it
    // in render callback
    player.preRenderData = ioData->mBuffers[0].mData;
    ioData->mBuffers[0].mData = NULL;

    OSStatus renderErr = noErr;
    AudioUnitRenderActionFlags actionFlags = 0;
    renderErr = AudioUnitRender(player.genericOutputUnit,
                                &actionFlags,
                                player.renderTimeStamp,
                                0,
                                inNumberFrames,
                                ioData);
    NSLog (@"AudioUnitRender, renderErr = %ld", renderErr);
    }

  • 86. Yes, really
    This is the render callback that supplies data to the int→float converter:

    OSStatus converterInputRenderCallback (void *inRefCon,
                                           AudioUnitRenderActionFlags *ioActionFlags,
                                           const AudioTimeStamp *inTimeStamp,
                                           UInt32 inBusNumber,
                                           UInt32 inNumberFrames,
                                           AudioBufferList *ioData) {
        CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer *) inRefCon;
        // read from buffer
        ioData->mBuffers[0].mData = player.preRenderData;
        return noErr;
    }
  • 87. Demo: AQ Tap + AUNewTimePitch (New in iOS 6!)
  • 88. (no slide text)
  • 89. Meanwhile in a mini-bus near Copenhagen…
  • 90. Audiobus
  • 91. Audiobus
    • Allows multiple audio apps to exchange data in realtime
    • Works by sending raw data in MIDI
    • Actually approved by Apple
    • Actually supported in GarageBand
  • 92.–93. (no slide text)
  • 94. Supporting Audiobus
    • Get the SDK from audiob.us
    • Enable background mode, add an audiobus-compatible URL scheme, get API key from audiob.us
    • Create and use ABAudiobusController, ABOutputPort/ABInputPort, and ABAudiobusAudioUnitWrapper
  • 95. Wrapping up…
  • 96. Takeaways
    • Core Audio fundamentals never change
    • New stuff is added as properties, typedefs, enums, etc.
    • Watch the SDK API diffs document to find the new stuff
    • Hope you like header files and experimentation
  • 97. Q&A
    • Slides will be posted to slideshare.net/invalidname
    • Code will be linked from there and my blog
    • Watch CocoaConf glassboard, @invalidname on Twitter/ADN, or [Timecode]; blog for announcement
    • Thanks!