Core Audio Cranks It Up
                                  Chris Adamson • @invalidname
                           Voices That Matter: iOS Developer Conference
                                    Nov. 13, 2011 • Boston, MA




What We'll Cover
               • Core Audio: what it is and why it's there

               • Audio Units and graphs

               • Fun with I/O

               • Special effects

               • Music with MIDI


Core Audio
               • Low-level, low-latency, professional audio
                 processing API, ported from OS X to iPhone
                 OS.

               • Legendarily difficult: see Mike Ash's "Why
                 Core Audio Is Hard" (Oct. 2006)

               • Consists of three audio engines and various
                 helpers


What's in the box
               • #import AudioToolbox, possibly also AudioUnit,
                 CoreMIDI

               • Helper APIs — File I/O, format conversion,
                 stream parsing, MIDI events, sessions

               • Audio Engine APIs — Audio Units, Audio
                 Queues, OpenAL

                    • Queues and OpenAL implemented atop units


Audio Units
               • Process audio in some way: generate, apply
                 effects, mix, hardware I/O, etc.

               • Arbitrary number of inputs and outputs (usually
                 one each)

               • Rare case: process a buffer of samples in
                 place

               • Common case: connect units in a "graph"


Audio Unit types

               • I/O: microphone input / headphone output

               • Music/generators: create sound

               • Mixers/effects: process sound




Pull
               • Audio flows through units via a "pull" metaphor

               • The pull is driven by the last unit in the
                 chain: typically the I/O unit, but it can be an
                 app-created unit

                    • Pull begins with a call to AudioUnitRender()

               • That last unit pulls audio from upstream unit(s),
                 or from "render callbacks" into application code
                 (see the sketch below)

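As a sketch of that callback side (names here are hypothetical, not
from the demo project), a render callback fills the buffers it is
handed; this one just writes silence, then attaches itself to a
mixer node's input bus:

   static OSStatus MyRenderCallback(void *inRefCon,
       AudioUnitRenderActionFlags *ioActionFlags,
       const AudioTimeStamp *inTimeStamp,
       UInt32 inBusNumber,
       UInt32 inNumberFrames,
       AudioBufferList *ioData) {
       // fill ioData with inNumberFrames worth of samples;
       // here, just silence
       for (UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
           memset(ioData->mBuffers[i].mData, 0,
               ioData->mBuffers[i].mDataByteSize);
       }
       return noErr;
   }

   // elsewhere: attach the callback to a node's input bus 0
   AURenderCallbackStruct callbackStruct = {0};
   callbackStruct.inputProc = MyRenderCallback;
   callbackStruct.inputProcRefCon = (__bridge void *)self;
   AUGraphSetNodeInputCallback(_auGraph, mixerNode, 0,
       &callbackStruct);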

Why pull?
               • Caller, often an I/O thread, is responsible for
                 knowing when it needs new buffers

               • Push may seem more sensible, but ends up
                 being burdensome

                    • OpenAL model: poll to see if sources need
                      new buffers and, if so, push them in



I/O units
               • Abstraction over audio hardware

               • AURemoteIO for local use, echo-cancelling
                 AUVoiceProcessingIO for VoIP

               • Can only have one per graph

               • Special meaning for buses: 0 is output to
                 headphones/speakers, 1 is input from
                 microphone

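One setup detail worth a sketch: AURemoteIO ships with input
disabled, so capture apps enable bus 1 on the input scope first
(_ioUnit here is a hypothetical ivar):

   UInt32 enableInput = 1;
   AudioUnitSetProperty(_ioUnit,
       kAudioOutputUnitProperty_EnableIO,
       kAudioUnitScope_Input,
       1,                        // bus 1: input from mic
       &enableInput,
       sizeof(enableInput));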

Demo
         VTM AU Input




  http://dl.dropbox.com/u/12216224/conferences/vtm11/VTMAUGraphDemo.zip

Adding IO Unit Node
      NewAUGraph(&_auGraph);

      AudioComponentDescription compDesc = {0};
      compDesc.componentType = kAudioUnitType_Output;
      compDesc.componentSubType = kAudioUnitSubType_RemoteIO;
      compDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

      // adds a node with above description to the graph
      AUNode ioNode;
      AUGraphAddNode(self.auGraph, &compDesc, &ioNode);

 A subsequent call to AUGraphNodeInfo() gets an AudioUnit from
                         the AUNode
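
For example (a sketch; the graph must be opened before the unit
exists):

   AUGraphOpen(self.auGraph);   // instantiates the units
   AudioUnit ioUnit;
   AUGraphNodeInfo(self.auGraph, ioNode, NULL, &ioUnit);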
Connect nodes and start graph


          AUGraphConnectNodeInput(_auGraph,
              ioNode, 1, mixerNode, 0);

          AUGraphInitialize(_auGraph);

          AUGraphStart(_auGraph);



The problem with AUGraphs
               • Prior to iOS 5, AUGraphs were boring:

                    • No generator units: only input was
                      AURemoteIO (or synthesized audio in a
                      render callback)

                    • Only one effect unit (AUiPodEQ)




iOS 5 cranks it up

               • Generators: AUFilePlayer and AUSampler

               • Effects: Reverb, distortion box, high/low pass
                 filters, high/low shelf filters, dynamics
                 processor, N-band EQ




AUFilePlayer
               • Generator unit that produces sound from a file

               • Prior to this, playing a file in an AUGraph was
                 unduly burdensome

                    • Render callbacks cannot perform blocking calls
                      like file I/O, so you'd read from the file on one
                      thread, put samples in a ring buffer, and have the
                      callback pull from that buffer

                    • We have an OS X example of this in the book. Ouch.


AUFilePlayer properties
               • kAudioUnitProperty_ScheduledFileIDs - indicates
                 which files to play

               • kAudioUnitProperty_ScheduledFileRegion - struct
                 describing what portion of file to play

               • kAudioUnitProperty_ScheduledFilePrime - optionally
                 preload audio frames for performance

               • kAudioUnitProperty_ScheduleStartTimeStamp -
                 when to start playing (-1 for next audio render cycle)

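Putting those four properties together, a sketch of scheduling an
entire file (_auFilePlayer and audioFile are hypothetical names; the
AudioFileID would come from AudioFileOpenURL()):

   AudioUnitSetProperty(_auFilePlayer,
       kAudioUnitProperty_ScheduledFileIDs,
       kAudioUnitScope_Global, 0,
       &audioFile, sizeof(audioFile));

   ScheduledAudioFileRegion region = {0};
   region.mTimeStamp.mFlags = kAudioTimeStampSampleTimeValid;
   region.mTimeStamp.mSampleTime = 0;
   region.mAudioFile = audioFile;
   region.mLoopCount = 0;
   region.mStartFrame = 0;
   region.mFramesToPlay = (UInt32)-1;  // to end of file
   AudioUnitSetProperty(_auFilePlayer,
       kAudioUnitProperty_ScheduledFileRegion,
       kAudioUnitScope_Global, 0,
       &region, sizeof(region));

   AudioTimeStamp startTime = {0};
   startTime.mFlags = kAudioTimeStampSampleTimeValid;
   startTime.mSampleTime = -1;  // next render cycle
   AudioUnitSetProperty(_auFilePlayer,
       kAudioUnitProperty_ScheduleStartTimeStamp,
       kAudioUnitScope_Global, 0,
       &startTime, sizeof(startTime));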

Getting some beats
                           Live Edgy Drum Kit 42.aiff




Mixing
               • AUMultichannelMixer - Mixes multiple input
                 buses into one output bus

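The input bus count is itself configured via a property; a sketch
(_mixerUnit is a hypothetical ivar):

   UInt32 busCount = 2;  // e.g., mic + file player
   AudioUnitSetProperty(_mixerUnit,
       kAudioUnitProperty_ElementCount,
       kAudioUnitScope_Input, 0,
       &busCount, sizeof(busCount));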



Demo
         VTM AU Mix




  http://dl.dropbox.com/u/12216224/conferences/vtm11/VTMAUGraphDemo.zip

More fun with mixers


               • Set volume for an input or output bus with
                 kMultiChannelMixerParam_Volume

                    • This is a parameter, not a property

               • Also read-only pre-/post- peak/average levels
                 for creating level meters (must be enabled first)

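A sketch of both, assuming a hypothetical _mixerUnit (metering
property name per AudioUnitProperties.h):

   // volume is a parameter: AudioUnitSetParameter, not -Property
   AudioUnitSetParameter(_mixerUnit,
       kMultiChannelMixerParam_Volume,
       kAudioUnitScope_Input,
       0,            // input bus 0
       0.5,          // half volume
       0);

   // switch metering on before reading levels
   UInt32 meteringOn = 1;
   AudioUnitSetProperty(_mixerUnit,
       kAudioUnitProperty_MeteringMode,
       kAudioUnitScope_Input, 0,
       &meteringOn, sizeof(meteringOn));

   // then read, e.g., the post-mix average power in dB
   Float32 levelDB = 0;
   AudioUnitGetParameter(_mixerUnit,
       kMultiChannelMixerParam_PostAveragePower,
       kAudioUnitScope_Output, 0, &levelDB);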
Also new: MIDI

               • Core MIDI added in iOS 4.2

               • Provides I/O with MIDI hardware, or
                 MIDI-over-WiFi

               • Hardware access is via Camera Connection
                 Kit and MIDI-to-USB adapter




MIDI packets




MIDI Messages
               • Channel Voice Messages — Note On, Note
                 Off, After-touch, Pitch wheel

               • High nybble of status is command, low nybble
                 is channel number

               • Channel Mode, System Messages

               • Various extensions to the spec over the years

             http://www.midi.org/techspecs/midimessages.php
MIDI hardware on iOS
               • MIDI adapters must be "MIDI class
                 compliant" (i.e., no drivers)

               • Dock connector / CCK provides minuscule
                 power, not enough to power some MIDI adapters

                    • Powered adapters generally work

                    • Otherwise check
                      http://www.iosmidi.com/devices


MIDI set-up
    MIDIClientCreate(CFSTR("VTM iOS Demo"), MyMIDINotifyProc,
        callbackContext, &client);
    MIDIInputPortCreate(client, CFSTR("Input port"),
        MyMIDIReadProc, callbackContext, &inPort);
    unsigned long sourceCount = MIDIGetNumberOfSources();
    for (int i = 0; i < sourceCount; ++i) {
        MIDIEndpointRef src = MIDIGetSource(i);
        CFStringRef endpointName = NULL;
        MIDIObjectGetStringProperty(src, kMIDIPropertyName,
            &endpointName);
        char endpointNameC[255];
        CFStringGetCString(endpointName, endpointNameC,
            255, kCFStringEncodingUTF8);
        printf("source %d: %s\n", i, endpointNameC);
        MIDIPortConnectSource(inPort, src, NULL);
    }
So what do we do in MyMIDIReadProc?
               • Prior to iOS 5, there was no API that actually
                 did anything with MIDI events

               • On OS X, Core Audio defines music units,
                 which play sounds in response to MIDI events

               • iOS 5 and Lion add the AUSampler




AUSampler

               • Music unit that starts with a source clip and
                 pitch-shifts to create different tones

               • Can use DLS or SoundFont 2 files as source

               • Can also create your own sample files with
                 AU Lab




http://developer.apple.com/library/mac/#technotes/tn2283/_index.html

Orchestral String Section 02-cropped.aif

/Developer/Applications/Audio/AU Lab


Creating an .aupreset




Loading .aupreset (1/3)
   NSString *filePath = [[NSBundle mainBundle]
       pathForResource:AU_SAMPLER_PRESET_FILE
       ofType:@"aupreset"];
   CFURLRef presetURL =
       (__bridge CFURLRef) [NSURL fileURLWithPath:filePath];

   // load preset file into a CFDataRef
   CFDataRef presetData = NULL;
   SInt32 errorCode = noErr;
   Boolean gotPresetData =
       CFURLCreateDataAndPropertiesFromResource(
           kCFAllocatorSystemDefault, presetURL,
           &presetData, NULL, NULL, &errorCode);


Loading .aupreset (2/3)

      // convert this into a property list
      CFPropertyListFormat presetPlistFormat = {0};
      CFErrorRef presetPlistError = NULL;
      CFPropertyListRef presetPlist =
          CFPropertyListCreateWithData(
              kCFAllocatorSystemDefault, presetData,
              kCFPropertyListImmutable, &presetPlistFormat,
              &presetPlistError);




Loading .aupreset (3/3)
          // set this plist as the
          // kAudioUnitProperty_ClassInfo on _auSampler
          if (presetPlist) {
              AudioUnitSetProperty(self.auSampler,
                  kAudioUnitProperty_ClassInfo,
                  kAudioUnitScope_Global,
                  0,
                  &presetPlist,
                  sizeof(presetPlist));
          }


Now what?
               • We have configured Core MIDI to call
                 MyMIDIReadProc on incoming MIDI events

               • We have configured the AUSampler audio
                 unit with a sampled sound

               • We need to deliver the MIDI events to the
                 AUSampler unit



MusicDevice.h
               • Small API to deliver MIDI events to instrument units

               • Not in Xcode documentation. Check out the
                 header file

               • Only 4 functions

               • MusicDeviceMIDIEvent() sends status, data1,
                 data2 to a MusicDeviceComponent (i.e., an
                 instrument Audio Unit)

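For instance, a one-line sketch to sound middle C on the sampler
(0x90 is note-on for channel 0):

   MusicDeviceMIDIEvent(self.auSampler, 0x90, 60, 100, 0);
   // ...and later, the matching note-off (0x80)
   MusicDeviceMIDIEvent(self.auSampler, 0x80, 60, 0, 0);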


Delivering MIDI event
    // MIDIReadProc signature; refCon is the callbackContext
    // passed to MIDIInputPortCreate()
    static void MyMIDIReadProc(const MIDIPacketList *pktlist,
            void *refCon, void *srcConnRefCon) {
        MIDIPacket *packet = (MIDIPacket *)pktlist->packet;
        for (int i = 0; i < pktlist->numPackets; i++) {
            Byte midiStatus = packet->data[0];
            Byte midiCommand = midiStatus >> 4;
            // is it a note-on or note-off?
            if ((midiCommand == 0x09) || (midiCommand == 0x08)) {
                Byte note = packet->data[1] & 0x7F;
                Byte velocity = packet->data[2] & 0x7F;
                // send to the sampler in the AUGraph
                // (myVC is recovered from refCon elsewhere)
                MusicDeviceMIDIEvent(myVC.auSampler, midiStatus,
                    note, velocity, 0);
            }
            packet = MIDIPacketNext(packet);
        }
    }


Demo
         VTM AU Mix (w/MIDI)




  http://dl.dropbox.com/u/12216224/conferences/vtm11/VTMAUGraphDemo.zip

Effects
               • iOS 5 is the first version to offer multiple
                 audio effects

               • Many effects use floating-point samples rather
                 than ints or fixed-point, which requires adjusting
                 the stream formats of the units connected to them
                 (or inserting in-line AUConverters); see the
                 sketch below

               • Adjust with parameters (see
                 AudioUnitParameters.h)

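A sketch of that format fix-up (unit names hypothetical): read the
effect's preferred input format and impose it on the upstream
unit's output:

   AudioStreamBasicDescription effectFormat = {0};
   UInt32 size = sizeof(effectFormat);
   AudioUnitGetProperty(_auDistortion,
       kAudioUnitProperty_StreamFormat,
       kAudioUnitScope_Input, 0,
       &effectFormat, &size);
   AudioUnitSetProperty(_mixerUnit,
       kAudioUnitProperty_StreamFormat,
       kAudioUnitScope_Output, 0,
       &effectFormat, sizeof(effectFormat));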

Two sample effects

               • Distortion effect - audio "fuzzing" effect, takes
                 no parameters

               • Low pass filter - attenuates frequencies above
                 a given cutoff




Configuring AULowPassFilter

           Float32 lowPassCutoffFrequency = 800.0;
           AudioUnitSetParameter(_auLowPassFilter,
               kLowPassParam_CutoffFrequency,
               kAudioUnitScope_Global,
               0,
               lowPassCutoffFrequency,
               0);




Demo
         VTM AU Effects




  http://dl.dropbox.com/u/12216224/conferences/vtm11/VTMAUGraphDemo.zip

Bypassing units


           UInt32 bypassed =
               self.effectEnabledSwitch.on ? 0 : 1;
           AudioUnitSetProperty(self.audioUnit,
               kAudioUnitProperty_BypassEffect,
               kAudioUnitScope_Global,
               0,
               &bypassed,
               sizeof(bypassed));

Recap

               • Audio Units are greatly enhanced in iOS 5

               • Real-time mixing and effects on mic input,
                 pitch-shifted samples, and file playback

               • Best choice for serious audio production




Resources
               • http://lists.apple.com/mailman/listinfo2/coreaudio-api

               • https://devforums.apple.com/community/ios/graphics/audio

               • Blogs:

                    • Mine: http://www.subfurther.com/blog

                    • Michael Tyson: http://atastypixel.com/blog/


Q&A




                   http://my.safaribooksonline.com/book/audio/9780321636973