Mac OS X and iOS Audio APIs

An introduction to AVFoundation and Core Audio

Transcript

  • 1. Mac OS X and iOS Audio APIs. 楊維中 (a.k.a. zonble), zonble@gmail.com. August 16, 2013
  • 2. Audio-related APIs
      • AVFoundation
      • Audio Service
      • OpenAL
      • Audio Queue
      • Audio Unit
      • Audio Session
      • MPNowPlayingCenter
      • Background Task
      • Remote Control Events
      • …
  • 3. The task: play an audio file hosted on a website, and start playing the data as soon as it arrives, instead of waiting for the whole download.
  • 4. These are not what we want
      • Audio Service: for short sound effects, such as the click you hear when tapping the virtual keyboard
      • OpenAL: for 3D sound effects in games, where you can specify which direction a sound comes from
  • 5. AVAudioPlayer
      • A high-level audio player
      • Available since iOS 2.2
      • Supports many formats
      • Can be created from a file URL or an NSData object, but can only play local files
      • Good for game background music
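A rough sketch of the AVAudioPlayer case above; the file name is a placeholder and error handling is omitted:

    // Play a bundled local file with AVAudioPlayer (iOS 2.2+). Keep a strong
    // reference to the player (e.g. in an ivar); a local variable alone gets
    // deallocated and playback stops.
    NSString *path = [[NSBundle mainBundle] pathForResource:@"song" ofType:@"mp3"];
    NSError *error = nil;
    AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:&error];
    [player prepareToPlay];
    [player play];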
  • 6. AVPlayer
      • A high-level audio player
      • Available since iOS 4.0
      • Supports both local and remote files
      • Good for internet radio
      • Cannot tell you the total length of the file
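The matching AVPlayer sketch for a remote stream; the URL is a placeholder:

    // Stream a remote URL with AVPlayer (iOS 4.0+).
    NSURL *streamURL = [NSURL URLWithString:@"http://example.com/stream.mp3"];
    AVPlayer *player = [AVPlayer playerWithURL:streamURL];
    [player play];
    // For a live stream, player.currentItem.duration is typically indefinite,
    // which is why the total length is not available.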
  • 7. What our service needs…
      • Play however much data has been downloaded, as it arrives
      • Cache while downloading, so the file can be played offline later
      • Know the loading progress
      • The files may be encrypted and must be decrypted first…
      • Yes, some companies really do that
  • 8. We need to understand the lower-level API: Core Audio
  • 9. Because it is so hard, let's start with the easy part…
  • 10. Audio Session
      • Declares what kind of audio your app is currently producing
      • Set the audio session category: ambient, media playback, etc.
      • Then activate the audio session
      • There are both C and Objective-C APIs
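A minimal sketch of the Objective-C route, assuming a playback app; the C AudioSession functions can express the same thing:

    // Declare the session category, then activate the session.
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback error:&error];
    [session setActive:YES error:&error];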
  • 11. Audio Session
      • You also have to handle interruptions and resuming
      • Implement the delegate methods and update the UI
      • Typical interruptions: another app starts playing audio; an incoming phone call; an alarm…
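With the delegate-based API of that era, the two callbacks look roughly like this sketch; pausePlayback and resumePlayback are hypothetical helpers on the player:

    // AVAudioSessionDelegate: called when another app, a phone call, or an
    // alarm interrupts us, and again when the interruption ends.
    - (void)beginInterruption
    {
        [self pausePlayback];   // hypothetical helper: pause and update the UI
    }

    - (void)endInterruptionWithFlags:(NSUInteger)flags
    {
        if (flags & AVAudioSessionInterruptionFlags_ShouldResume) {
            [self resumePlayback];   // hypothetical helper: resume playback
        }
    }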
  • 12. Back to Core Audio
  • 13. Honestly, once you have written this kind of code, you never want to write it a second time
  • 14. Forget OO and forget Objective-C: audio streaming is just processing a continuous run of binary data, quickly
  • 15. Basic concepts
      • Audio data is a continuous stream of binary data
      • It consists of consecutive samples/frames; at 44.1 kHz there are 44100 samples per second
      • Common compressed formats such as MP3 and AAC group a certain number of frames into a packet
      • In VBR (variable bit rate) files, the amount of data in each packet differs
  • 16. Some terminology
      • Sample rate: how many samples per second
      • Packet size: how many samples per packet
      • Bit rate: how many bits per second
  • 17. The complete flow
      • Open a network connection and read data in the connection callback
      • Decrypt the data and keep the decrypted bytes in memory
      • Create an Audio Queue or an Audio Unit graph, and receive the callback the system fires when it needs the next chunk of data
      • Provide the data
  • 18. We will focus on
      • Reading the data into memory
      • Parsing out packets and keeping them around
      • Feeding the data to the audio API
      • Registering a callback
      • Returning the next chunk of data from that callback
      • Converting the data into Linear PCM
  • 19. Audio Unit vs. Audio Queue: how do the two APIs differ?
      • For the flow on the previous slide, the biggest difference is that the Audio Queue API does not require you to convert the data to Linear PCM yourself
      • Audio Unit is harder to set up
      • but it lets you manipulate the Linear PCM data directly…
      • and an Audio Unit graph can include mixer and EQ effect nodes
  • 20. (Diagram: the layout of an MP3 stream, with ID3 data at the front followed by repeating packets, each consisting of an MP3 header plus MP3 data.)
  • 21. Recognizing an MP3 header
      • An MP3 header is 4 bytes long
      • Its first 11 bits are the sync word (all 1s); when you see the sync word, you know a packet starts here
      • The bits after the sync word describe the packet's format
      • http://www.mp3-tech.org/programmer/frame_header.html
  • 22. Sample packet parser

    def parse(content):
        # Walk a buffer of MP3 data frame by frame.
        i = 0
        foundFirstFrame = False
        while i + 2 < len(content):
            # The first 11 bits of a frame header are the sync word (all 1s).
            frameSync = (content[i] << 8) | (content[i + 1] & (0x80 | 0x40 | 0x20))
            if frameSync != 0xffe0:
                i += 1
                continue
            if not foundFirstFrame:
                foundFirstFrame = True
            audioVersion = (content[i + 1] >> 3) & 0x03
            layer = (content[i + 1] >> 1) & 0x03
            hasCRC = not (content[i + 1] & 0x01)
            bitrateIndex = content[i + 2] >> 4
            sampleRateIndex = (content[i + 2] >> 2) & 0x03
            bitrate = [0, 32000, 40000, 48000, 56000, 64000, 80000, 96000, 112000,
                       128000, 160000, 192000, 224000, 256000, 320000, 0][bitrateIndex]
            hasPadding = bool((content[i + 2] >> 1) & 0x01)
            # MPEG-1 Layer III frame length at 44100 Hz.
            frameLength = 144 * bitrate // 44100 + (1 if hasPadding else 0) + (2 if hasCRC else 0)
            if frameLength <= 0:
                i += 1
                continue
            i += frameLength
  • 23. (Diagram: an MPEG-4 audio/video file consists of a MOOV box (the metadata) and an MDAT box (the media data).)
  • 24. Core Audio gives us an audio parser
      • Its job is to find the packets and work out the file format
      • It is a C API
      • Available since iOS 2 / Mac OS X 10.5
      • For local files, call AudioFileOpenURL
      • For streamed data, call AudioFileStreamOpen
  • 25. AudioFileStreamOpen

    AudioFileStreamOpen(self,
        ZBAudioFileStreamPropertyListener,
        ZBAudioFileStreamPacketsCallback,
        kAudioFileMP3Type,
        &audioFileStreamID);

      • self: an object the callbacks can use (passed back as the client data)
      • ZBAudioFileStreamPropertyListener: the callback for file format information
      • ZBAudioFileStreamPacketsCallback: the callback that receives the parsed packets
      • kAudioFileMP3Type: a hint for the parser
      • audioFileStreamID: receives the newly created audio file stream ID
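The feeding side is not shown on the slide. As a sketch, every chunk of (decrypted) data that arrives from the network connection is pushed into the parser with AudioFileStreamParseBytes, which in turn fires the two callbacks registered above; data stands for whatever NSData the connection callback delivered:

    // Hand freshly downloaded bytes to the parser; the property listener and
    // the packets callback then get called with the results.
    OSStatus status = AudioFileStreamParseBytes(audioFileStreamID,
        (UInt32)[data length], [data bytes], 0);
    assert(status == noErr);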
  • 26. Keeping packets in memory?
      • A simple structure is all we need:

    typedef struct {
        size_t length;  // length of the packet
        void *data;     // pointer to the packet's data
    } ZBPacketData;     // the struct name is added here so the typedef compiles
  • 27. Saving the file format

    void ZBAudioFileStreamPropertyListener(void *inClientData,
        AudioFileStreamID inAudioFileStream,
        AudioFileStreamPropertyID inPropertyID,
        UInt32 *ioFlags)
    {
        ZBSimplePlayer *self = (ZBSimplePlayer *)inClientData;
        if (inPropertyID == kAudioFileStreamProperty_DataFormat) {
            UInt32 dataSize = 0;
            OSStatus status = 0;
            AudioStreamBasicDescription audioStreamDescription;
            Boolean writable = false;
            status = AudioFileStreamGetPropertyInfo(inAudioFileStream,
                kAudioFileStreamProperty_DataFormat, &dataSize, &writable);
            status = AudioFileStreamGetProperty(inAudioFileStream,
                kAudioFileStreamProperty_DataFormat, &dataSize, &audioStreamDescription);
            // ...then keep audioStreamDescription around for later use
        }
    }
  • 28. Saving the packets

    static void ZBAudioFileStreamPacketsCallback(void *inClientData,
        UInt32 inNumberBytes,
        UInt32 inNumberPackets,
        const void *inInputData,
        AudioStreamPacketDescription *inPacketDescriptions)
    {
        ZBSimplePlayer *self = (ZBSimplePlayer *)inClientData;
        for (int i = 0; i < inNumberPackets; ++i) {
            SInt64 packetStart = inPacketDescriptions[i].mStartOffset;
            UInt32 packetSize = inPacketDescriptions[i].mDataByteSize;
            assert(packetSize > 0);
            self->packetData[self->packetCount].length = (size_t)packetSize;
            self->packetData[self->packetCount].data = malloc(packetSize);
            memcpy(self->packetData[self->packetCount].data,
                   (const char *)inInputData + packetStart, packetSize);
            self->packetCount++;
        }
    }
  • 29. The next step
      • To play back smoothly, we wait until we have a certain number of packets before calling the audio API to start playing; starting with too little data produces glitches
      • Waiting for packets to accumulate is called buffering
      • Converting a packet count into time: packet count * frames per packet / sample rate
      • 100 * 1152 / 44100 = 2.61… seconds
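The same arithmetic as a small sketch in code, using the MP3 numbers from the slide (1152 frames per packet, 44100 Hz); packetCount stands for the number of packets parsed so far:

    // How much audio, in seconds, have we buffered so far?
    double framesPerPacket = 1152.0;   // MP3
    double sampleRate = 44100.0;
    double bufferedSeconds = packetCount * framesPerPacket / sampleRate;
    // The deck starts playback after roughly 100 packets, i.e. about 2.6 seconds.
    bool readyToPlay = (packetCount >= 100);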
  • 30. (Diagram: playback is driven by the audio API repeatedly invoking your callback for more data.)
  • 31. Playing with an Audio Queue
      • On every callback, provide a fresh buffer struct
      • The buffer carries the number of packets, a pointer to the packet data, and the packet format (packet descriptions)
      • Enqueue the buffer into the queue
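The deck registers a callback with this name on slide 35 but never shows its body, so the following is only a sketch of its shape; enqueueNextBufferToQueue is a hypothetical helper that does the copying shown on slide 36:

    // Audio Queue output callback: the queue is done with inBuffer, so
    // release it and enqueue the next batch of packets.
    static void ZBAudioQueueOutputCallback(void *inUserData,
        AudioQueueRef inAQ, AudioQueueBufferRef inBuffer)
    {
        ZBSimplePlayer *self = (ZBSimplePlayer *)inUserData;
        AudioQueueFreeBuffer(inAQ, inBuffer);
        [self enqueueNextBufferToQueue];   // hypothetical helper; see slide 36
    }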
  • 32. Playing with an Audio Unit
      • The Audio Unit API hands you a pointer to a struct called ioData, along with the number of frames it wants
      • You fill ioData with that amount of Linear PCM data
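Slide 42 installs a render callback named ZBPlayerAURenderCallback but never shows its body; its shape is the standard AURenderCallback, roughly like this sketch:

    // Render callback: the output unit wants inNumberFrames frames of Linear
    // PCM; fill ioData's buffers with that much converted data.
    OSStatus ZBPlayerAURenderCallback(void *inRefCon,
        AudioUnitRenderActionFlags *ioActionFlags,
        const AudioTimeStamp *inTimeStamp,
        UInt32 inBusNumber,
        UInt32 inNumberFrames,
        AudioBufferList *ioData)
    {
        ZBSimpleAUPlayer *self = (ZBSimpleAUPlayer *)inRefCon;
        // e.g. run AudioConverterFillComplexBuffer (slide 40) until the buffers
        // hold inNumberFrames frames, or zero-fill them on underrun.
        return noErr;
    }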
  • 33. Audio Unit on iOS
      • The number of frames requested differs between normal use and when the screen is locked
      • Normally 1024 frames are requested, but 4096 when the screen is locked
      • The main reason is power saving
      • If you cannot supply enough data, playback stops
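The deck does not show how to survive the 4096-frame requests; a common approach (an assumption here, not the author's code) is to raise the output unit's maximum frames per slice before initializing the graph:

    // Let the output unit request up to 4096 frames per render call, the size
    // iOS uses while the screen is locked.
    UInt32 maxFrames = 4096;
    AudioUnitSetProperty(outputUnit, kAudioUnitProperty_MaximumFramesPerSlice,
        kAudioUnitScope_Global, 0, &maxFrames, sizeof(maxFrames));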
  • 34. Audio Queue
  • 35. Creating an Audio Queue

    OSStatus status = AudioQueueNewOutput(&audioStreamBasicDescription,
        ZBAudioQueueOutputCallback, self,
        CFRunLoopGetCurrent(), kCFRunLoopCommonModes,
        0, &outputQueue);
    assert(status == noErr);
  • 36. Enqueuing data

    AudioQueueBufferRef buffer;
    status = AudioQueueAllocateBuffer(outputQueue, totalSize, &buffer);
    assert(status == noErr);
    buffer->mAudioDataByteSize = totalSize;
    buffer->mUserData = self;

    AudioStreamPacketDescription *packetDescs =
        calloc(inPacketCount, sizeof(AudioStreamPacketDescription));
    totalSize = 0;
    for (index = 0; index < inPacketCount; index++) {
        size_t readIndex = index + readHead;
        memcpy(buffer->mAudioData + totalSize,
               packetData[readIndex].data, packetData[readIndex].length);
        AudioStreamPacketDescription description;
        description.mStartOffset = totalSize;
        description.mDataByteSize = (UInt32)packetData[readIndex].length;
        description.mVariableFramesInPacket = 0;
        totalSize += packetData[readIndex].length;
        memcpy(&(packetDescs[index]), &description, sizeof(AudioStreamPacketDescription));
    }
    status = AudioQueueEnqueueBuffer(outputQueue, buffer, (UInt32)inPacketCount, packetDescs);
    free(packetDescs);
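Starting and stopping the queue is not shown on the slides; the corresponding Audio Queue calls are simply:

    // Start once enough buffers have been enqueued; stop immediately when done.
    status = AudioQueueStart(outputQueue, NULL);
    assert(status == noErr);
    // ...
    AudioQueueStop(outputQueue, true);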
  • 37. Audio Unit
  • 38. (Diagram: an Audio Unit Graph containing a mixer node, an effect node, and an output node, fed by a render callback; the underlying mixer unit, effect unit, and output unit are each obtained from their node via Get Info.)
  • 39. Creating an audio converter

    AudioConverterRef converter;
    AudioStreamBasicDescription fromFormat = …;
    AudioStreamBasicDescription destFormat = …;
    AudioConverterNew(&fromFormat, &destFormat, &converter);
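The two formats are elided on the slide. Purely as an illustration (these exact fields are an assumption, not the deck's code), a Linear PCM destination format for 16-bit interleaved stereo at 44.1 kHz could be filled in like this:

    // One possible Linear PCM destination format: 44.1 kHz, 16-bit, stereo, interleaved.
    AudioStreamBasicDescription destFormat;
    memset(&destFormat, 0, sizeof(destFormat));
    destFormat.mSampleRate = 44100.0;
    destFormat.mFormatID = kAudioFormatLinearPCM;
    destFormat.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked;
    destFormat.mFramesPerPacket = 1;
    destFormat.mChannelsPerFrame = 2;
    destFormat.mBitsPerChannel = 16;
    destFormat.mBytesPerFrame = 4;    // 2 channels * 2 bytes per sample
    destFormat.mBytesPerPacket = 4;   // 1 frame per packet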
  • 40. Using the audio converter

    AudioBufferList *list;
    UInt32 packetSize = 1024;
    AudioConverterFillComplexBuffer(converter, ZBPlayerConverterFiller,
        self, &packetSize, list, NULL);
  • 41. The audio converter callback

    OSStatus ZBPlayerConverterFiller(AudioConverterRef inAudioConverter,
        UInt32 *ioNumberDataPackets,
        AudioBufferList *ioData,
        AudioStreamPacketDescription **outDataPacketDescription,
        void *inUserData)
    {
        ZBSimpleAUPlayer *self = (ZBSimpleAUPlayer *)inUserData;
        static AudioStreamPacketDescription aspdesc;

        *ioNumberDataPackets = 1;
        ioData->mNumberBuffers = 1;
        void *data = self->packetData[self->readHead].data;
        UInt32 length = (UInt32)self->packetData[self->readHead].length;
        ioData->mBuffers[0].mData = data;
        ioData->mBuffers[0].mDataByteSize = length;

        *outDataPacketDescription = &aspdesc;
        aspdesc.mDataByteSize = length;
        aspdesc.mStartOffset = 0;
        aspdesc.mVariableFramesInPacket = 1;

        self->readHead++;
        return noErr;
    }
  • 42.
    NewAUGraph(&audioGraph);                           // create the audio graph

    AudioComponentDescription cdesc;
    bzero(&cdesc, sizeof(AudioComponentDescription));
    cdesc.componentType = kAudioUnitType_Output;
    cdesc.componentSubType = kAudioUnitSubType_DefaultOutput;
    cdesc.componentManufacturer = kAudioUnitManufacturer_Apple;
    cdesc.componentFlags = 0;
    cdesc.componentFlagsMask = 0;
    AUGraphAddNode(audioGraph, &cdesc, &outputNode);   // add the output node
    AUGraphOpen(audioGraph);
    AUGraphNodeInfo(audioGraph, outputNode, &cdesc, &outputUnit);  // get the output unit

    AudioStreamBasicDescription destFormat = LFPCMStreamDescription();  // set the output (LPCM) format
    AudioUnitSetProperty(outputUnit, kAudioUnitProperty_StreamFormat,
        kAudioUnitScope_Input, 0, &destFormat, sizeof(destFormat));

    AURenderCallbackStruct callbackStruct;
    callbackStruct.inputProc = ZBPlayerAURenderCallback;
    callbackStruct.inputProcRefCon = self;
    AudioUnitSetProperty(outputUnit, kAudioUnitProperty_SetRenderCallback,
        kAudioUnitScope_Input, 0, &callbackStruct, sizeof(callbackStruct));  // install the render callback

    AUGraphInitialize(audioGraph);                     // initialize the audio graph
    CAShow(audioGraph);                                // print the graph (for debugging)
  • 43. Playback
      • AUGraphStart(audioGraph);
      • AUGraphStop(audioGraph);
  • 44. Sample code: https://github.com/zonble/ZBSimplePlayer
  • 45. Because the Audio Unit API works directly with Linear PCM data, we can also modify that data on the fly to produce effects
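As a sketch of what that can mean, assuming the render callback has just filled ioData with signed 16-bit samples, a crude software volume control is a single loop:

    // Scale every 16-bit PCM sample to change the volume.
    float gain = 0.5f;
    SInt16 *samples = (SInt16 *)ioData->mBuffers[0].mData;
    UInt32 count = ioData->mBuffers[0].mDataByteSize / sizeof(SInt16);
    for (UInt32 i = 0; i < count; i++) {
        samples[i] = (SInt16)(samples[i] * gain);
    }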
  • 46. Thanks!