Core Audio in iOS 6 (CocoaConf Raleigh, Dec. '12)
Presentation Transcript

  • Core Audio in iOS 6 • Chris Adamson • @invalidname • CocoaConf Raleigh, December 1, 2012 • Slides and code available on my blog: http://www.subfurther.com/blog (slide footer: Sunday, December 2, 12)
  • Plug!
  • The Reviews Are In!
  • Legitimate copies! • Amazon (paper or Kindle) • Barnes & Noble (paper or Nook) • Apple (iBooks) • Direct from InformIT (paper, eBook [.epub + .mobi + .pdf], or Bundle) • 35% off with code COREAUDIO3174
  • What You’ll Learn • What Core Audio does and doesn’t do • When to use and not use it • What’s new in Core Audio for iOS 6
  • Simple things should be simple, complex things should be possible. –Alan Kay • AV Foundation and Media Player cover the simple things; Core Audio covers the complex ones
  • Core Audio • Low-level C framework for processing audio • Capture, play-out, real-time or off-line processing • The “complex things should be possible” part of audio on OS X and iOS
  • Chris’ CA Taxonomy • Engines: process streams of audio • Capture, play-out, mixing, effects processing • Helpers: deal with formats, encodings, etc. • File I/O, stream I/O, format conversion, iOS “session” management
  • Helpers: Audio File • Read from / write to multiple audio file types (.aiff, .wav, .caf, .m4a, .mp3) in a content-agnostic way • Get metadata (data format, duration, iTunes/ID3 info)
  • Helpers: Audio File Stream • Read audio from a non-random-access source like a network stream • Discover encoding and encapsulation on the fly, then deliver audio packets to the client application
  • Helpers: Converters • Convert buffers of audio to and from different encodings • One side must be in an uncompressed format (i.e., Linear PCM)
  • Helpers: ExtAudioFile • Combines file I/O and format conversion • Read a compressed file into PCM buffers • Write PCM buffers into a compressed file
  • Helpers: Audio Session • iOS-only API to negotiate use of audio resources with the rest of the system • Determine whether your app mixes with other apps’ audio, honors the ring/silent switch, can play in the background, etc. • Get notified of audio interruptions • See also AVAudioSession
  • Engines: Audio Units • Low-latency (~10ms) processing of capture/play-out audio data • Effects, mixing, etc. • Connect units manually or via an AUGraph • Much more on this topic momentarily…
  • Engines: Audio Queue • Convenience API for recording or play-out, built atop audio units • Rather than processing on-demand on Core Audio’s thread, your callback provides or receives buffers of audio (at whatever size is convenient to you) • Higher latency, naturally • Supports compressed formats (MP3, AAC)
  • Engines: OpenAL • API for 3D spatialized audio, implemented atop audio units • Set a source’s properties (x/y/z coordinates, orientation, audio buffer, etc.) and OpenAL renders what it sounds like to the listener from that location
  • Engines and Helpers • Engines: Audio Units, Audio Queue, OpenAL • Helpers: Audio File, Audio File Stream, Audio Converter, ExtAudioFile, Audio Session
  • Audio Units
  • Audio Unit: AUSomething
  • Types of Audio Units • Output (which also do input) • Generator • Converter • Effect • Mixer • Music
  • Pull Model • A downstream unit (AUSomething) pulls audio from the unit upstream of it (AUSomethingElse) via AudioUnitRender()
  • Buses (aka Elements) • A unit can have multiple buses, e.g., two AUSomethingElse units feeding one AUSomething
  • AUGraph • Manages the connections: AUSomethingElse → AUSomething ← AUSomethingElse
  • Render Callbacks

    OSStatus converterInputRenderCallback (void *inRefCon,
                                           AudioUnitRenderActionFlags *ioActionFlags,
                                           const AudioTimeStamp *inTimeStamp,
                                           UInt32 inBusNumber,
                                           UInt32 inNumberFrames,
                                           AudioBufferList *ioData) {
        CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon;
        // read from buffer
        ioData->mBuffers[0].mData = player.preRenderData;
        return noErr;
    }
  • AURemoteIO • Output unit used for play-out, capture • A Core Audio thread repeatedly and automatically calls AudioUnitRender() • Must set the EnableIO property to explicitly enable capture and/or play-out • Capture requires setting an appropriate AudioSession category
  • Create AURemoteIO

    CheckError(NewAUGraph(&_auGraph),
               "couldnt create au graph");
    CheckError(AUGraphOpen(_auGraph),
               "couldnt open au graph");

    AudioComponentDescription componentDesc;
    componentDesc.componentType = kAudioUnitType_Output;
    componentDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode remoteIONode;
    CheckError(AUGraphAddNode(_auGraph,
                              &componentDesc,
                              &remoteIONode),
               "couldnt add remote io node");
  • Getting an AudioUnit from an AUNode

    CheckError(AUGraphNodeInfo(self.auGraph,
                               remoteIONode,
                               NULL,
                               &_remoteIOUnit),
               "couldnt get remote io unit from node");
  • AURemoteIO Buses • bus 1: from input H/W → to app • bus 0: from app → to output H/W
  • EnableIO

    UInt32 oneFlag = 1;
    UInt32 busZero = 0;
    CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                    kAudioOutputUnitProperty_EnableIO,
                                    kAudioUnitScope_Output,
                                    busZero,
                                    &oneFlag,
                                    sizeof(oneFlag)),
               "couldnt enable remote io output");

    UInt32 busOne = 1;
    CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                    kAudioOutputUnitProperty_EnableIO,
                                    kAudioUnitScope_Input,
                                    busOne,
                                    &oneFlag,
                                    sizeof(oneFlag)),
               "couldnt enable remote io input");
  • Pass Through • bus 1 (from input H/W) → AURemoteIO → bus 0 (to output H/W)
  • Connect In to Out

    UInt32 busZero = 0;
    UInt32 busOne = 1;
    CheckError(AUGraphConnectNodeInput(self.auGraph,
                                       remoteIONode,
                                       busOne,
                                       remoteIONode,
                                       busZero),
               "couldnt connect remote io bus 1 to 0");
  • Pass-Through with Effect • bus 1 (from input H/W) → AUEffect → AURemoteIO → bus 0 (to output H/W)
  • Demo: Delay Effect • New in iOS 6!
  • Creating the AUDelay

    componentDesc.componentType = kAudioUnitType_Effect;
    componentDesc.componentSubType = kAudioUnitSubType_Delay;
    componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode effectNode;
    CheckError(AUGraphAddNode(self.auGraph,
                              &componentDesc,
                              &effectNode),
               "couldnt create effect node");

    AudioUnit effectUnit;
    CheckError(AUGraphNodeInfo(self.auGraph,
                               effectNode,
                               NULL,
                               &effectUnit),
               "couldnt get effect unit from node");
  • The problem with effect units • Audio Units available since iPhone OS 2.0 prefer int formats • Effect units arrived with iOS 5 (armv7 era) and only work with float formats • You have to set the AUEffect unit’s format on AURemoteIO
  • Setting formats

    AudioStreamBasicDescription effectDataFormat;
    UInt32 propSize = sizeof (effectDataFormat);
    CheckError(AudioUnitGetProperty(effectUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Output,
                                    busZero,
                                    &effectDataFormat,
                                    &propSize),
               "couldnt read effect format");

    CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                    kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Output,
                                    busOne,
                                    &effectDataFormat,
                                    propSize),
               "couldnt set bus one output format");

    Then repeat AudioUnitSetProperty() for input scope / bus 0
  • AUNewTimePitch • New in iOS 6! • Allows you to change pitch independent of time, or time independent of pitch • How do you use it?
  • AUTimePitch

    AudioComponentDescription effectcd = {0};
    effectcd.componentType = kAudioUnitType_FormatConverter;
    effectcd.componentSubType = kAudioUnitSubType_NewTimePitch;
    effectcd.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode effectNode;
    CheckError(AUGraphAddNode(self.auGraph,
                              &effectcd,
                              &effectNode),
               "couldnt get effect node [time/pitch]");

    Notice the type is AUFormatConverter, not AUEffect
  • AudioUnitParameters.h

    // Parameters for AUNewTimePitch
    enum {
        // Global, rate, 1/32 -> 32.0, 1.0
        kNewTimePitchParam_Rate              = 0,
        // Global, Cents, -2400 -> 2400, 1.0
        kNewTimePitchParam_Pitch             = 1,
        // Global, generic, 3.0 -> 32.0, 8.0
        kNewTimePitchParam_Overlap           = 4,
        // Global, Boolean, 0->1, 1
        kNewTimePitchParam_EnablePeakLocking = 6
    };

    This is the entire documentation for the AUNewTimePitch parameters
  • AUNewTimePitch parameters • Rate: kNewTimePitchParam_Rate takes a Float32 rate from 1/32 speed to 32x speed • Use powers of 2: 1/32, 1/16, …, 2, 4, 8… • Pitch: kNewTimePitchParam_Pitch takes a Float32 representing cents, meaning 1/100 of a musical semitone
  • Pitch shifting • Pitch can vary, time does not • Suitable for real-time sources, such as audio capture
  • Demo: Pitch Shift • New in iOS 6!
  • Rate shifting • Rate can vary, pitch does not • Think of the 1.5x and 2x speed modes in the Podcasts app • Not suitable for real-time sources, as data will be consumed faster; files work well • Sources must be able to map time systems with kAudioUnitProperty_InputSamplesInOutput
  • Demo: Rate Shift • New in iOS 6!
  • AUSplitter • New in iOS 6! • Splits one input into multiple outputs (AUSplitter → AUSomethingElse, AUSomethingElse)
  • AUMatrixMixer • New in iOS 6! • Mixes any number of inputs to any number of outputs (e.g., three AUSomethingElse inputs → AUMatrixMixer → two AUSomethingElse outputs)
  • Audio Queues (and the APIs that help them)
  • AudioQueue • Easier than AURemoteIO: provide data when you want to, less time pressure, can accept or provide compressed formats (MP3, AAC) • Recording queue: receive buffers of captured audio in a callback • Play-out queue: enqueue buffers of audio to play, optionally refilling them in a callback
  • AudioQueue • A queue of buffers (2, 1, 0) cycling between your code and the hardware
  • Common AQ scenarios • File player: read from file and “prime” queue buffers, start queue; when called back with a used buffer, refill it from the next part of the file • Synthesis: maintain state in your own code, write raw samples into buffers during callbacks
  • Web Radio • Thursday class’ third project • Use Audio File Stream Services to pick out audio data from a network stream • Enqueue these packets as new AQ buffers • Dispose of used buffers in the callback
  • Parsing web radio
  • Parsing web radio • NSURLConnection delivers NSData buffers containing audio and framing info; we pass them to Audio File Stream Services • Audio File Stream Services calls us back with parsed packets of audio data • We create an AudioQueueBuffer with those packets and enqueue it for play-out
  • A complex thing! • What if we want to see that data after it’s been decoded to PCM and is about to be played? • e.g., spectrum analysis, effects, visualizers • AudioQueue design is “fire-and-forget”
  • AudioQueue Tap! http://www.last.fm/music/Spinal+Tap
  • AudioQueueProcessingTap • Set as a property on the Audio Queue • Calls back to your function with decoded (PCM) audio data • Three types: pre-effects, post-effects (effects that the AQ performs), or siphon; the first two can modify the data • Only documentation is in AudioQueue.h
  • Creating an AQ Tap

    // create the tap
    UInt32 maxFrames = 0;
    AudioStreamBasicDescription tapFormat = {0};
    AudioQueueProcessingTapRef tapRef;
    CheckError(AudioQueueProcessingTapNew(audioQueue,
                                          tapProc,
                                          (__bridge void *)(player),
                                          kAudioQueueProcessingTap_PreEffects,
                                          &maxFrames,
                                          &tapFormat,
                                          &tapRef),
               "couldnt create AQ tap");

    Notice that you receive maxFrames and tapFormat. These do not appear to be settable.
  • AQ Tap Proc

    void tapProc (void *inClientData,
                  AudioQueueProcessingTapRef inAQTap,
                  UInt32 inNumberFrames,
                  AudioTimeStamp *ioTimeStamp,
                  UInt32 *ioFlags,
                  UInt32 *outNumberFrames,
                  AudioBufferList *ioData) {
        CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inClientData;
        UInt32 getSourceFlags = 0;
        UInt32 getSourceFrames = 0;
        AudioQueueProcessingTapGetSourceAudio(inAQTap,
                                              inNumberFrames,
                                              ioTimeStamp,
                                              &getSourceFlags,
                                              &getSourceFrames,
                                              ioData);
        // then do something with ioData
        // ...
  • So what should we do with the audio?
  • Let’s apply our pitch-shift effect
  • Shouldn’t this work? • Just call AudioUnitRender() on an AUEffect
  • AudioUnitRender() • Last argument is an AudioBufferList, whose AudioBuffer members have mData pointers • If mData != NULL, the audio unit does its thing with those samples • If mData == NULL, the audio unit pulls from whatever it’s connected to • So we just call with the AudioBufferList ioData we got from the tap callback, right?
  • Psych! • AQ tap provides data as signed ints • Effect units only work with floating point • We need to do an on-the-spot format conversion
  • invalidname’s convert-and-effect recipe • AUConverter → AUEffect → AUConverter → AUGenericOutput • Note: red arrows are float format, yellow arrows are int
  • How it works • AUGraph: AUConverter → AUEffect → AUConverter → AUGenericOutput • Top AUConverter is connected to a render callback function
  • The trick! • Copy the mData pointer to a state variable and NULL it in ioData • Call AudioUnitRender() on the output unit; the NULL makes it pull from the graph • Top of the graph pulls on the render callback, which gives it back the mData we copied off
  • Yes, really • This is the rest of tapProc()

    // copy off the ioData so the graph can read from it
    // in render callback
    player.preRenderData = ioData->mBuffers[0].mData;
    ioData->mBuffers[0].mData = NULL;

    OSStatus renderErr = noErr;
    AudioUnitRenderActionFlags actionFlags = 0;
    renderErr = AudioUnitRender(player.genericOutputUnit,
                                &actionFlags,
                                player.renderTimeStamp,
                                0,
                                inNumberFrames,
                                ioData);
    NSLog (@"AudioUnitRender, renderErr = %ld", renderErr);
    }
  • Yes, really • This is the render callback that supplies data to the int→float converter

    OSStatus converterInputRenderCallback (void *inRefCon,
                                           AudioUnitRenderActionFlags *ioActionFlags,
                                           const AudioTimeStamp *inTimeStamp,
                                           UInt32 inBusNumber,
                                           UInt32 inNumberFrames,
                                           AudioBufferList *ioData) {
        CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon;
        // read from buffer
        ioData->mBuffers[0].mData = player.preRenderData;
        return noErr;
    }
  • Demo: AQ Tap + AUNewTimePitch • New in iOS 6!
  • Other new stuff
  • Multi-Route • Ordinarily, one input or output is active: earpiece, speaker, headphones, dock-connected device • “Last in wins” • With the AV Session “multi-route” category, you can use several at once • WWDC 2012 session 505
  • Utility classes moved again • C++ utilities, including the CARingBuffer • Before Xcode 4.3: installed into /Developer • Xcode 4.3–4.4: optional download from developer.apple.com • Xcode 4.5 and later: sample code project “Core Audio Utility Classes”
  • Takeaways • Core Audio fundamentals never change • New stuff is added as properties, typedefs, enums, etc. • Watch the SDK API diffs document to find the new stuff • Hope you like header files and experimentation
  • Q&A • Slides will be posted to slideshare.net/invalidname • Code will be linked from there and my blog • Watch the CocoaConf RDU glassboard, @invalidname on Twitter/ADN, or my blog for the announcement • Thanks!