Core Audio in iOS 6 (CocoaConf Portland, Oct. '12)

Core Audio gets a bunch of neat new tricks in iOS 6, particularly for developers working with Audio Units. New effect units include an improved ability to vary pitch and playback speed, a digital delay unit, and OS X's powerful matrix mixer. There's also a new place to use units: the Audio Queue now offers developers a way to "tap" into the data being queued up for playback. To top it all off, a new "multi-route" system allows us to play out of multiple, multi-channel output devices at the same time.

Want to see, and hear, how all this stuff works? This section is the place to find out.

1 Comment

  • Chris, I'm a PC software developer who is tinkering with iPad programming... I have a great idea, but I don't have the skills to do it. I thought I would give it to you since I think you could probably write it.

    What I am looking for is a wireless microphone using iOS devices. The ultimate app would be to enable an iPod to be able to connect to one (or more) bluetooth microphones (headsets) and output the audio from the headphone jack on the iPod... this way, I could plug the iPod into my video camera and have an inexpensive wireless microphone.

    In an ideal app, I would be able to connect more than one microphone and the app would include a mixer giving me a multi-channel capability. If you could write such an app, I would certainly buy it and I think you would have a huge potential market for people who want a wireless microphone for their home video cameras.

    Just a thought... not sure if this is up your alley, but thought I would share the idea with you.

    Thx!

    James

Core Audio in iOS 6 (CocoaConf Portland, Oct. '12): Presentation Transcript

  • Core Audio in iOS 6
    Chris Adamson • @invalidname
    CocoaConf PDX, October 27, 2012
    Slides and sample code will be available later today
  • Plug!
  • The Reviews Are In!
  • Legitimate copies!
    • Amazon (paper or Kindle)
    • Barnes & Noble (paper or Nook)
    • Apple (iBooks)
    • Direct from InformIT (paper, eBook [.epub + .mobi + .pdf], or Bundle)
    • 35% off with code COREAUDIO3174
  • What You’ll Learn
    • What Core Audio does and doesn’t do
    • When to use and not use it
    • What’s new in Core Audio for iOS 6
  • “Simple things should be simple, complex things should be possible.” –Alan Kay
    • AV Foundation and Media Player handle the “simple things should be simple” part
    • Core Audio is the “complex things should be possible” part
  • Core Audio
    • Low-level C framework for processing audio
      • Capture, play-out, real-time or off-line processing
    • The “complex things should be possible” part of audio on OS X and iOS
  • Chris’ CA Taxonomy
    • Engines: process streams of audio
      • Capture, play-out, mixing, effects processing
    • Helpers: deal with formats, encodings, etc.
      • File I/O, stream I/O, format conversion, iOS “session” management
  • Helpers: Audio File
    • Read from / write to multiple audio file types (.aiff, .wav, .caf, .m4a, .mp3) in a content-agnostic way
    • Get metadata (data format, duration, iTunes/ID3 info)
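  As a sketch of what “content-agnostic” looks like in practice: open a file and read its data format. The path here is a placeholder, and CheckError() is the error-check convention used throughout this talk’s code.

    #include <AudioToolbox/AudioToolbox.h>

    CFURLRef url = CFURLCreateWithFileSystemPath(kCFAllocatorDefault,
                                                 CFSTR("/tmp/example.m4a"), // placeholder path
                                                 kCFURLPOSIXPathStyle,
                                                 false);
    AudioFileID audioFile;
    CheckError(AudioFileOpenURL(url, kAudioFileReadPermission, 0, &audioFile),
               "couldnt open audio file");

    // The same call works for .aiff, .wav, .caf, .m4a, .mp3, etc.
    AudioStreamBasicDescription dataFormat = {0};
    UInt32 propSize = sizeof(dataFormat);
    CheckError(AudioFileGetProperty(audioFile,
                                    kAudioFilePropertyDataFormat,
                                    &propSize,
                                    &dataFormat),
               "couldnt get data format");

    AudioFileClose(audioFile);
    CFRelease(url);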
  • Helpers: Audio File Stream
    • Read audio from a non-random-access source like a network stream
    • Discover encoding and encapsulation on the fly, then deliver audio packets to the client application
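  The shape of the API, as a sketch (the two callback names are hypothetical): open a parser with a property listener and a packets proc, then feed it bytes as they arrive from the network.

    // Called when the parser discovers stream properties (e.g., the data format).
    void MyPropertyProc(void *inClientData, AudioFileStreamID inStream,
                        AudioFileStreamPropertyID inPropertyID, UInt32 *ioFlags) { /* ... */ }

    // Called with parsed packets of audio data, ready to hand to a play-out API.
    void MyPacketsProc(void *inClientData, UInt32 inNumberBytes, UInt32 inNumberPackets,
                       const void *inInputData,
                       AudioStreamPacketDescription *inPacketDescriptions) { /* ... */ }

    AudioFileStreamID stream;
    CheckError(AudioFileStreamOpen(myContext, MyPropertyProc, MyPacketsProc,
                                   0, // file type hint; 0 = let the parser discover it
                                   &stream),
               "couldnt open audio file stream");

    // Then, every time bytes arrive from the network:
    CheckError(AudioFileStreamParseBytes(stream, bytesReceived, byteBuffer, 0),
               "couldnt parse bytes");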
  • Helpers: Converters
    • Convert buffers of audio to and from different encodings
    • One side must be in an uncompressed format (i.e., Linear PCM)
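  Creating one is a single call once both sides are described; a minimal sketch, assuming sourceFormat and destFormat are filled-in AudioStreamBasicDescriptions and at least one of them is linear PCM:

    AudioConverterRef converter;
    CheckError(AudioConverterNew(&sourceFormat, // e.g., AAC
                                 &destFormat,   // e.g., linear PCM
                                 &converter),
               "couldnt create audio converter");
    // Converted data is then pulled with AudioConverterFillComplexBuffer(),
    // which calls back to your code whenever it needs more source packets.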
  • Helpers: ExtAudioFile
    • Combine file I/O and format conversion
    • Read a compressed file into PCM buffers
    • Write PCM buffers into a compressed file
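  A sketch of the read-as-PCM case, assuming url points at a compressed file, clientPCMFormat is a filled-in linear PCM AudioStreamBasicDescription, and bufferList is an AudioBufferList you have allocated:

    ExtAudioFileRef extFile;
    CheckError(ExtAudioFileOpenURL(url, &extFile),
               "couldnt open ext audio file");

    // Declare the format you want to *receive*; ExtAudioFile converts on the fly.
    CheckError(ExtAudioFileSetProperty(extFile,
                                       kExtAudioFileProperty_ClientDataFormat,
                                       sizeof(clientPCMFormat),
                                       &clientPCMFormat),
               "couldnt set client data format");

    UInt32 framesToRead = 4096;
    CheckError(ExtAudioFileRead(extFile, &framesToRead, &bufferList),
               "couldnt read from ext audio file");
    // framesToRead now holds the number of frames actually delivered.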
  • Helpers: Audio Session
    • iOS-only API to negotiate use of audio resources with the rest of the system
    • Determine whether your app mixes with other apps’ audio, honors the ring/silent switch, can play in the background, etc.
    • Gets notified of audio interruptions
    • See also AVAudioSession
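  A minimal sketch using the C Audio Session API as it stood in iOS 6 (MyInterruptionListener and myContext are hypothetical):

    // One-time setup, registering for interruption notifications.
    CheckError(AudioSessionInitialize(NULL, NULL, MyInterruptionListener, myContext),
               "couldnt initialize audio session");

    // Declare intent: this app both captures and plays audio.
    UInt32 category = kAudioSessionCategory_PlayAndRecord;
    CheckError(AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                                       sizeof(category),
                                       &category),
               "couldnt set audio category");

    CheckError(AudioSessionSetActive(true),
               "couldnt activate audio session");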
  • Engines: Audio Units
    • Low-latency (~10 ms) processing of capture/play-out audio data
    • Effects, mixing, etc.
    • Connect units manually or via an AUGraph
    • Much more on this topic momentarily…
  • Engines: Audio Queue
    • Convenience API for recording or play-out, built atop audio units
    • Rather than processing on-demand and on Core Audio’s thread, your callback provides or receives buffers of audio (at whatever size is convenient to you)
    • Higher latency, naturally
    • Supports compressed formats (MP3, AAC)
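  For a taste of the API, a sketch of creating a play-out queue (MyAQOutputCallback is a hypothetical refill callback, dataFormat a filled-in AudioStreamBasicDescription; queue scenarios are covered in more detail later):

    AudioQueueRef queue;
    CheckError(AudioQueueNewOutput(&dataFormat,
                                   MyAQOutputCallback, // called as each buffer finishes playing
                                   myContext,
                                   NULL, NULL,         // default run loop and mode
                                   0,                  // reserved, must be 0
                                   &queue),
               "couldnt create audio queue");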
  • Engines: OpenAL
    • API for 3D spatialized audio, implemented atop audio units
    • Set a source’s properties (x/y/z coordinates, orientation, audio buffer, etc.); OpenAL renders what it sounds like to the listener from that location
  • Engines and Helpers
    • Engines: Audio Units, Audio Queue, OpenAL
    • Helpers: Audio File, Audio File Stream, Audio Converter, ExtAudioFile, Audio Session
  • Audio Units
  • Audio Unit [diagram: a single audio unit, “AUSomething”]
  • Types of Audio Units
    • Output (which also do input)
    • Generator
    • Converter
    • Effect
    • Mixer
    • Music
  • Pull Model [diagram, built up across slides: AudioUnitRender() is called on AUSomething, which pulls data from an upstream AUSomethingElse]
  • Buses (aka, Elements) [diagram: AUSomething pulls from two AUSomethingElse units on separate buses]
  • AUGraph [diagram: the same units connected inside an AUGraph]
  • Render Callbacks [diagram: a render callback function feeding AUSomething]

    OSStatus converterInputRenderCallback (void *inRefCon,
                                           AudioUnitRenderActionFlags *ioActionFlags,
                                           const AudioTimeStamp *inTimeStamp,
                                           UInt32 inBusNumber,
                                           UInt32 inNumberFrames,
                                           AudioBufferList *ioData) {
        CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon;
        // read from buffer
        ioData->mBuffers[0].mData = player.preRenderData;
        return noErr;
    }
  • AURemoteIO
    • Output unit used for play-out, capture
    • A Core Audio thread repeatedly and automatically calls AudioUnitRender()
    • Must set the EnableIO property to explicitly enable capture and/or play-out
      • Capture requires setting an appropriate AudioSession category
  • Create AURemoteIO

    CheckError(NewAUGraph(&_auGraph),
               "couldnt create au graph");
    CheckError(AUGraphOpen(_auGraph),
               "couldnt open au graph");

    AudioComponentDescription componentDesc;
    componentDesc.componentType = kAudioUnitType_Output;
    componentDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode remoteIONode;
    CheckError(AUGraphAddNode(_auGraph,
                              &componentDesc,
                              &remoteIONode),
               "couldnt add remote io node");
  • Getting an AudioUnit from AUNode

    CheckError(AUGraphNodeInfo(self.auGraph,
                               remoteIONode,
                               NULL,
                               &_remoteIOUnit),
               "couldnt get remote io unit from node");
  • AURemoteIO Buses [diagram, built up across slides: bus 1 carries audio from input H/W to the app; bus 0 carries audio from the app to output H/W]
  • EnableIO

    UInt32 oneFlag = 1;
    UInt32 busZero = 0;
    CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                    kAudioOutputUnitProperty_EnableIO,
                                    kAudioUnitScope_Output,
                                    busZero,
                                    &oneFlag,
                                    sizeof(oneFlag)),
               "couldnt enable remote io output");
    UInt32 busOne = 1;
    CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                    kAudioOutputUnitProperty_EnableIO,
                                    kAudioUnitScope_Input,
                                    busOne,
                                    &oneFlag,
                                    sizeof(oneFlag)),
               "couldnt enable remote io input");
  • Pass Through [diagram: input H/W → bus 1 → AURemoteIO → bus 0 → output H/W]
  • Connect In to Out

    UInt32 busZero = 0;
    UInt32 busOne = 1;
    CheckError(AUGraphConnectNodeInput(self.auGraph,
                                       remoteIONode,
                                       busOne,
                                       remoteIONode,
                                       busZero),
               "couldnt connect remote io bus 1 to 0");
  • Pass-Through with Effect [diagram: input H/W → bus 1 → AUEffect → AURemoteIO → bus 0 → output H/W]
  • Demo: Delay Effect (New in iOS 6!)
  • Creating the AUDelay

    componentDesc.componentType = kAudioUnitType_Effect;
    componentDesc.componentSubType = kAudioUnitSubType_Delay;
    componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode effectNode;
    CheckError(AUGraphAddNode(self.auGraph,
                              &componentDesc,
                              &effectNode),
               "couldnt create effect node");

    AudioUnit effectUnit;
    CheckError(AUGraphNodeInfo(self.auGraph,
                               effectNode,
                               NULL,
                               &effectUnit),
               "couldnt get effect unit from node");
  • The problem with effect units
    • Audio Units available since iPhone OS 2.0 prefer int formats
    • Effect units arrived with iOS 5 (ARMv7 era) and only work with float formats
    • Have to set the AUEffect unit’s format on AURemoteIO
  • Setting formats

    AudioStreamBasicDescription effectDataFormat;
    UInt32 propSize = sizeof(effectDataFormat);
    CheckError(AudioUnitGetProperty(effectUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Output,
                                    busZero,
                                    &effectDataFormat,
                                    &propSize),
               "couldnt read effect format");
    CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Output,
                                    busOne,
                                    &effectDataFormat,
                                    propSize),
               "couldnt set bus one output format");

    Then repeat AudioUnitSetProperty() for input scope / bus 0
  • AUNewTimePitch
    • New in iOS 6!
    • Allows you to change pitch independent of time, or time independent of pitch
    • How do you use it?
  • AUTimePitch

    AudioComponentDescription effectcd = {0};
    effectcd.componentType = kAudioUnitType_FormatConverter;
    effectcd.componentSubType = kAudioUnitSubType_NewTimePitch;
    effectcd.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode effectNode;
    CheckError(AUGraphAddNode(self.auGraph,
                              &effectcd,
                              &effectNode),
               "couldnt get effect node [time/pitch]");

    Notice the type is AUFormatConverter, not AUEffect
  • AudioUnitParameters.h

    // Parameters for AUNewTimePitch
    enum {
        // Global, rate, 1/32 -> 32.0, 1.0
        kNewTimePitchParam_Rate                 = 0,
        // Global, Cents, -2400 -> 2400, 1.0
        kNewTimePitchParam_Pitch                = 1,
        // Global, generic, 3.0 -> 32.0, 8.0
        kNewTimePitchParam_Overlap              = 4,
        // Global, Boolean, 0->1, 1
        kNewTimePitchParam_EnablePeakLocking    = 6
    };

    This is the entire documentation for the AUNewTimePitch parameters
  • AUNewTimePitch parameters
    • Rate: kNewTimePitchParam_Rate takes a Float32 rate from 1/32 speed to 32x speed
      • Use powers of 2: 1/32, 1/16, …, 2, 4, 8…
    • Pitch: kNewTimePitchParam_Pitch takes a Float32 representing cents, meaning 1/100 of a musical semitone
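  Setting either parameter is one call; as a sketch, shifting pitch up 700 cents (a perfect fifth), assuming timePitchUnit is the AudioUnit pulled from the effect node above:

    CheckError(AudioUnitSetParameter(timePitchUnit,
                                     kNewTimePitchParam_Pitch,
                                     kAudioUnitScope_Global, // these parameters are all Global
                                     0,                      // element
                                     700.0,                  // cents: +700 = up a perfect fifth
                                     0),                     // buffer offset in frames
               "couldnt set pitch parameter");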
  • Pitch shifting
    • Pitch can vary, time does not
    • Suitable for real-time sources, such as audio capture
  • Demo: Pitch Shift (New in iOS 6!)
  • Rate shifting
    • Rate can vary, pitch does not
      • Think of the 1.5x and 2x speed modes in the Podcasts app
    • Not suitable for real-time sources, as data will be consumed faster; files work well
      • Sources must be able to map time systems with kAudioUnitProperty_InputSamplesInOutput
  • Demo: Rate Shift (New in iOS 6!)
  • AUSplitter (New in iOS 6!) [diagram: AUSplitter fanning one input out to two AUSomethingElse units]
  • AUMatrixMixer (New in iOS 6!) [diagram: AUMatrixMixer mixing several AUSomethingElse inputs into several AUSomethingElse outputs]
  • Audio Queues (and the APIs that help them)
  • AudioQueue
    • Easier than AURemoteIO: provide data when you want to, less time pressure, can accept or provide compressed formats (MP3, AAC)
    • Recording queue: receive buffers of captured audio in a callback
    • Play-out queue: enqueue buffers of audio to play, optionally refill in a callback
  • [Diagram: an audio queue with buffers 2, 1, 0 lined up for play-out]
  • Common AQ scenarios
    • File player: read from file and “prime” queue buffers, start queue; when called back with a used buffer, refill it from the next part of the file
    • Synthesis: maintain state in your own code, write raw samples into buffers during callbacks
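  The file-player shape, as a hypothetical sketch (MyAQOutputCallback and the surrounding names are assumptions; packet descriptions are omitted, as for a constant-bitrate format):

    static void MyAQOutputCallback(void *inUserData,
                                   AudioQueueRef inAQ,
                                   AudioQueueBufferRef inCompleteAQBuffer) {
        // Refill the just-played buffer from the next part of the file
        // (read audio into mAudioData, set mAudioDataByteSize)...
        // ...then put it back in line.
        AudioQueueEnqueueBuffer(inAQ, inCompleteAQBuffer, 0, NULL);
    }

    // "Prime" a few buffers up front, then start the queue.
    for (int i = 0; i < 3; i++) {
        AudioQueueBufferRef buffer;
        CheckError(AudioQueueAllocateBuffer(queue, bufferByteSize, &buffer),
                   "couldnt allocate queue buffer");
        MyAQOutputCallback(myContext, queue, buffer); // fill and enqueue
    }
    CheckError(AudioQueueStart(queue, NULL),
               "couldnt start queue");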
  • Web Radio
    • Thursday class’ third project
    • Use Audio File Stream Services to pick out audio data from a network stream
    • Enqueue these packets as new AQ buffers
    • Dispose of used buffers in the callback
  • Parsing web radio
    • NSURLConnection delivers NSData buffers, containing audio and framing info. We pass it to Audio File Stream Services.
    • Audio File Stream Services calls us back with parsed packets of audio data.
    • We create an AudioQueueBuffer with those packets and enqueue it for play-out.
  • A complex thing!
    • What if we want to see that data after it’s been decoded to PCM and is about to be played?
      • e.g., spectrum analysis, effects, visualizers
    • AudioQueue design is “fire-and-forget”
  • AudioQueue Tap! (image: http://www.last.fm/music/Spinal+Tap)
  • AudioQueueProcessingTap
    • Set as a property on the Audio Queue
    • Calls back to your function with decoded (PCM) audio data
    • Three types: pre-effects, post-effects (effects that the AQ performs), or siphon; the first two can modify the data
    • Only documentation is in AudioQueue.h
  • Creating an AQ Tap

    // create the tap
    UInt32 maxFrames = 0;
    AudioStreamBasicDescription tapFormat = {0};
    AudioQueueProcessingTapRef tapRef;
    CheckError(AudioQueueProcessingTapNew(audioQueue,
                                          tapProc,
                                          (__bridge void *)(player),
                                          kAudioQueueProcessingTap_PreEffects,
                                          &maxFrames,
                                          &tapFormat,
                                          &tapRef),
               "couldnt create AQ tap");

    Notice that you receive maxFrames and tapFormat. These do not appear to be settable.
  • AQ Tap Proc

    void tapProc (void *inClientData,
                  AudioQueueProcessingTapRef inAQTap,
                  UInt32 inNumberFrames,
                  AudioTimeStamp *ioTimeStamp,
                  UInt32 *ioFlags,
                  UInt32 *outNumberFrames,
                  AudioBufferList *ioData) {
        CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inClientData;
        UInt32 getSourceFlags = 0;
        UInt32 getSourceFrames = 0;
        AudioQueueProcessingTapGetSourceAudio(inAQTap,
                                              inNumberFrames,
                                              ioTimeStamp,
                                              &getSourceFlags,
                                              &getSourceFrames,
                                              ioData);
        // then do something with ioData
        // ...
  • So what should we do with the audio? Let’s apply our pitch-shift effect
  • Shouldn’t this work? [diagram: calling AudioUnitRender() directly on an AUEffect]
  • AudioUnitRender()
    • Last argument is an AudioBufferList, whose AudioBuffer members have mData pointers
      • If mData != NULL, the audio unit does its thing with those samples
      • If mData == NULL, the audio unit pulls from whatever it’s connected to
    • So we just call with the AudioBufferList ioData we got from the tap callback, right?
  • Psych!
    • AQ tap provides data as signed ints
    • Effect units only work with floating point
    • We need to do an on-the-spot format conversion
  • invalidname’s convert-and-effect recipe

    OSStatus converterInputRenderCallback (void *inRefCon,
                                           AudioUnitRenderActionFlags *ioActionFlags,
                                           const AudioTimeStamp *inTimeStamp,
                                           UInt32 inBusNumber,
                                           UInt32 inNumberFrames,
                                           AudioBufferList *ioData) {
        CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon;
        // read from buffer
        ioData->mBuffers[0].mData = player.preRenderData;
        return noErr;
    }

    [Diagram: AUConverter → AUEffect → AUConverter → AUGenericOutput. Note: red arrows are float format, yellow arrows are int.]
  • How it works
    • AUGraph: AUConverter → AUEffect → AUConverter → AUGenericOutput
    • Top AUConverter is connected to a render callback function
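  Attaching that render callback to the top of the graph looks something like this sketch (convertToFloatUnit is a hypothetical name for the first AUConverter):

    AURenderCallbackStruct renderCallbackStruct;
    renderCallbackStruct.inputProc = converterInputRenderCallback;
    renderCallbackStruct.inputProcRefCon = (__bridge void *)(player);
    CheckError(AudioUnitSetProperty(convertToFloatUnit,
                                    kAudioUnitProperty_SetRenderCallback,
                                    kAudioUnitScope_Input,
                                    0, // bus 0
                                    &renderCallbackStruct,
                                    sizeof(renderCallbackStruct)),
               "couldnt set render callback");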
  • The trick!
    • Copy the mData pointer to a state variable and NULL it in ioData
    • Call AudioUnitRender() on the output unit; the NULL makes it pull from the graph
    • Top of the graph pulls on the render callback, which gives it back the mData we copied off
  • Yes, really: this is the rest of tapProc()

        // copy off the ioData so the graph can read from it
        // in render callback
        player.preRenderData = ioData->mBuffers[0].mData;
        ioData->mBuffers[0].mData = NULL;

        OSStatus renderErr = noErr;
        AudioUnitRenderActionFlags actionFlags = 0;
        renderErr = AudioUnitRender(player.genericOutputUnit,
                                    &actionFlags,
                                    player.renderTimeStamp,
                                    0,
                                    inNumberFrames,
                                    ioData);
        NSLog (@"AudioUnitRender, renderErr = %ld", renderErr);
    }
  • Yes, really: this is the render callback that supplies data to the int→float converter

    OSStatus converterInputRenderCallback (void *inRefCon,
                                           AudioUnitRenderActionFlags *ioActionFlags,
                                           const AudioTimeStamp *inTimeStamp,
                                           UInt32 inBusNumber,
                                           UInt32 inNumberFrames,
                                           AudioBufferList *ioData) {
        CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon;

        // read from buffer
        ioData->mBuffers[0].mData = player.preRenderData;
        return noErr;
    }
  • Demo: AQ Tap + AUNewTimePitch (New in iOS 6!)
  • Other new stuff
  • Multi-Route
    • Ordinarily, one input or output is active: earpiece, speaker, headphones, dock-connected device
      • “Last in wins”
    • With the AV Session “multi-route” category, you can use several at once
    • WWDC 2012 session 505
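  A sketch of opting in via AVAudioSession (the Objective-C session API mentioned earlier):

    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    // Multi-route: play different audio to several outputs at once.
    [session setCategory:AVAudioSessionCategoryMultiRoute error:&error];
    [session setActive:YES error:&error];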
  • Utility classes moved again
    • C++ utilities, including the CARingBuffer
      • Before Xcode 4.3: installed into /Developer
      • Xcode 4.3–4.4: optional download from developer.apple.com
      • Xcode 4.5 and later: sample code project “Core Audio Utility Classes”
  • Takeaways
    • Core Audio fundamentals never change
    • New stuff is added as properties, typedefs, enums, etc.
    • Watch the SDK API diffs document to find the new stuff
    • Hope you like header files and experimentation
  • Q&A
    • Slides will be posted to slideshare.net/invalidname
    • Code will be linked from there and my blog
    • Watch the CocoaConf PDX glassboard, @invalidname on Twitter/ADN, or the [Time code]; blog for the announcement
    • Thanks!