The document discusses Android's audio system and the AudioHardware class. It describes how AudioHardware initializes and manages audio devices, streams, and drivers. Key methods like setVoiceVolume and setVolume are analyzed in detail from the framework level down to the hardware abstraction layer. The initialization and roles of classes like AudioStreamOut and AudioStreamIn are also explained.
The document discusses the Android audio system initialization process and the creation of playback and recording threads. The audio HAL library is loaded based on the device properties, and the AudioFlinger service initializes and manages the audio streams. It creates a MixerThread for playback using the audio HAL output, and a RecordThread is generated for audio input using the HAL functions.
The document discusses the Android audio system architecture. It comprises an Audio Framework layer, with AudioTrack, AudioRecord and AudioPolicy classes that route audio between apps and the hardware. Below this, the Audio HAL interface provides read/write functions to the underlying Linux audio driver and hardware. AudioFlinger manages multiple threads that read and write audio data to the attached hardware devices without blocking. This layered design provides flexibility and meets real-time audio needs across different Android devices and usage scenarios.
1. The document discusses Android's AudioPolicyService which manages audio routing and device connections.
2. It describes the initialization process where AudioPolicyService is started and an AudioPolicyManager is created.
3. The AudioPolicyManager handles requests from AudioPolicyService like setting output devices and parameters which are passed to AudioFlinger for processing.
Android Audio HAL – Audio Architecture – Audio HAL interface – Audio Policy – Audio HAL compilation & verification – Overview of Tinyalsa
Android Video HAL – Camera Architecture – Overview of camera HAL interface – Overview of V4L2 – Enabling V4l2 in kernel – Camera HAL compilation and verification
Android booting sequence, setup and debugging — Utkarsh Mankad
The document summarizes key Android SDK components and concepts:
Android SDK components are organized by functionality and include Activities, Services, BroadcastReceivers, Views, Intents, Adapters, AlertDialogs, Notifications, ContentProviders, and data storage methods. Common data storage options include SharedPreferences, internal storage, external storage, and SQLite databases. The Android booting process involves 6 stages: power on and ROM code execution, boot loader loading, starting the Linux kernel, initiating the init process, launching the Zygote and Dalvik virtual machine, and system server initiation.
The presentation covers the range of features of the Linux sound subsystem, the Advanced Linux Sound Architecture (ALSA). It presented case studies on the differences in developing audio drivers for PCs versus embedded systems, and shared an overview of state-of-the-art trends in the development of audio drivers for embedded systems.
This presentation by Vadym Shovkoplias (Senior Software Engineer, GlobalLogic Kharkiv) was delivered at GlobalLogic Kharkiv Embedded TechTalk #1 on March 13, 2018.
The document discusses Android over-the-air (OTA) updates, including the update mechanism, update packages, and the recovery console. It describes how OTA updates work in Android, from downloading an update package to verifying and applying the updates. It also discusses the AOSP components involved like the updater binary, edify scripting language, and recovery console. Finally, it covers creating an OTA application to check and download updates and rebooting into recovery to apply them.
There is a surge in the number of sensors and devices being connected under the umbrella of the Internet of Things (IoT). These devices need to be integrated into the Android system and accessed via applications, which is covered in the course. Our weekend Android system development curriculum with practicals ensures you learn all the critical components to get started.
The Embedded Android system development workshop focuses on integrating a new device with the Android framework. Our hands-on approach makes Emertxe the best institute for Android system development training. The workshop deep-dives into Android porting, the Android Hardware Abstraction Layer (HAL), Android Services and the Linux device driver ecosystem. This workshop-based training program will enable you to efficiently integrate new hardware with the Android HAL and framework.
Embedded Android System Development - Part II covers the Hardware Abstraction Layer (HAL). The HAL is an interfacing layer through which an Android service can place a request to a device. It uses functions provided by the Linux system to service requests from the Android framework. It is a C/C++ layer with a purely vendor-specific implementation, packaged into modules (.so files) and loaded by the Android system at the appropriate time.
For new age touch-based embedded devices, Android is becoming a popular OS going beyond mobile phones. With its roots from Embedded Linux, Android framework offers benefits in terms of rich libraries, open-source and multi-device support. Emertxe’s hands-on Embedded Android Training Course is designed to customize, build and deploy custom Embedded OS on ARM target. Rich set of projects will make your learning complete.
The document discusses the Android media framework which aims to simplify application development, share resources efficiently in a multi-tasked environment with strong security, and allow for future growth. It describes the sandbox model, media framework components including the media server and services, and typical media function calls from applications through proxies and interfaces. Media playback and recording are outlined, showing how audio/video is decoded and composited for output. Key codecs are also mentioned.
The document discusses Linux audio drivers. It introduces the Linux audio subsystem, including the ALSA sound core in kernel space and its interfaces for user space applications. It describes the vertical components like the sound core and horizontal components like audio codec and controller drivers. It also covers porting an audio driver, which may involve changing pin assignments for standard codecs or implementing new codec drivers.
The document discusses audio and video support and playback in the Android platform. It covers built-in encoding/decoding, playing media from resources, files and streams. It also covers playing JET interactive content and capturing audio using the MediaRecorder class. Supported audio formats include AAC, AMR, MP3, MIDI, Ogg Vorbis and PCM. Supported video formats include H.263, H.264 and MPEG-4.
This document summarizes a presentation on enhancing Qualcomm Snapdragon audio using Android Audio APIs. The presentation discusses how Qualcomm incorporates a digital signal processor into Snapdragon, the Snapdragon Audio SDK, adding custom audio processing modules dynamically at runtime using open DSP programs, and demonstrates audio processing effects.
Outline:
a. MediaPlayer Subsystem
b. Related Files
c. MediaPlayer Playback Flow Framework
- StageFright and AwesomePlayer Relation
- AwesomePlayer Framework and Playback Flow
d. Simple Playback Implementation
The document discusses the Android booting process. It begins with the boot ROM and boot loader which initialize hardware and load the kernel image. The kernel then initializes drivers and loads init, which sets up the environment and mounts partitions. Init starts the zygote process, which preloads classes. System servers like the activity manager and power manager are then started via zygote. Once all servers are running, Android broadcasts an intent to indicate the boot process is complete. The boot sequence involves the bootloader, kernel, init, zygote and system servers working together to start the Android system.
Using and Customizing the Android Framework / part 4 of Embedded Android Work... — Opersys inc.
1) The document provides an overview of using and customizing the Android framework, covering topics like kickstarting the framework, utilities and commands, system services internals, and creating custom services.
2) It describes the core building blocks of the framework, like services, Dalvik, and the boot process. It also covers utilities like am, pm, and dumpsys.
3) The document discusses native daemons like servicemanager and installd. It explains how to observe the system server and interact with services programmatically.
This document discusses the Android multimedia framework on Jelly Bean. It provides an introduction to OpenMAX and describes the simple stack architecture including the developer API, event handler, surface holder, StageFright, OpenMAX interface, and software/hardware codecs. It explains the workflows and sequence flows for playing a media file, including setting the data source, preparing to play, and starting playback. Finally, it covers the synchronization architecture and flow of StageFright.
There are many books, articles and papers about Android and related applications, but only a few cover how the Android operating system works internally. In this talk we will see how Android boots up, get an overview of zygote, and see how the system server and package manager work. This talk will be extremely helpful in fostering understanding of Android internals among Android developers, as well as for anybody who desires a general understanding of the internal workings of Android-powered devices.
This document provides an overview of porting Android to new platforms. It discusses the Android software stack, the Android Open Source Project structure, the AOSP code structure, common Android hardware abstraction layers, device configuration files, the AOSP build process, the Android boot process, and Android debugging tools.
The second part of Linux Internals covers system calls, the process subsystem and inter-process communication mechanisms. Understanding these services provided by Linux is essential for an embedded systems engineer.
In order to understand HAL layers of Android Framework, having Linux device driver knowledge is important. Hence Day-2 of the workshop focuses on the same.
Camera / camcorder framework overview (Gingerbread) — fefe7270
1. The document discusses the camera and camcorder frameworks in Android (Gingerbread).
2. The camera framework uses a Binder client-server model with the Camera service as the server and Camera app as the client. The service communicates with the camera HAL through JNI.
3. The camcorder framework similarly uses a Binder model to connect the media recorder app with the media player service. The service records video via the Stagefright recorder and communicates with the camera through its client.
1. The document describes the initialization process of SurfaceFlingerService in Android.
2. SurfaceFlinger is instantiated which creates the main SurfaceFlinger instance. This triggers the initialization of various core Android services.
3. The main display is initialized by creating a DisplayHardware instance and a graphic plane for the display. Shared memory is allocated to share display information.
This document summarizes Android audio APIs and OpenSL ES, an open sound library for embedded systems like Android. It discusses APIs like MediaPlayer, SoundPool, AudioTrack/AudioRecord and their limitations. OpenSL ES provides low-level audio control and is device independent but Android's implementation supports only a subset of OpenSL features. It provides code examples for creating an OpenSL engine and implementing audio playback and recording in a loopback sample application using OpenSL objects like AudioPlayer and AudioRecorder across two threads.
[DevMentor video] Hyuncheol Park, BlueFish System CTO — part 2 (final)

Practical Windows Phone Mango app design & development

Codename 'Mango' update: Windows Phone app development A to Z

This video explains in accessible terms how multitasking support changed in Windows Phone, the FAS structure, and the multitasking architecture. In particular, it gives live demos of the Multitasking API examples needed when developing a Windows Phone Mango app.

Hyuncheol Park is active as a Windows Phone MVP and WinMoDev co-sysop, and runs the "루나네스의 이상한 연구소" (Lunanes' Strange Laboratory) blog. He is currently CTO of the mobile solution developer BlueFish System.
This is the API guideline provided for Raspberry Pi development using Circulus, an IoT EDU/MAKE platform.

The guide covers examples of controlling LEDs, ultrasonic sensors, 7-segment displays, temperature/humidity sensors, light sensors, SW/HW PWM, TTS, cameras, sound recognition, positioning, monitoring, optical character recognition (OCR) and more with just a few lines of JavaScript. More examples will be added on an ongoing basis.
2. 1. AudioPolicyManager: position and role (slide header: "안드로이드의 모든것 분석과 포팅 정리" — Everything Android: analysis and porting notes)

The slide positions AudioPolicyManager within the Android stack (Application / Framework / Native Framework / HAL / Kernel) and lists its roles:
1. Input/output device management
2. Audio volume control and management
3. Audio policy management
3. 2. Input/output device management

The initial input and output devices are assigned in the AudioPolicyManagerBase constructor:

AudioPolicyManagerBase::AudioPolicyManagerBase(AudioPolicyClientInterface *clientInterface)
{
    // output-capable devices
    mAvailableOutputDevices = AudioSystem::DEVICE_OUT_EARPIECE;
    mAvailableOutputDevices |= AudioSystem::DEVICE_OUT_SPEAKER;

    // input-capable devices
    mAvailableInputDevices = AudioSystem::DEVICE_IN_BUILTIN_MIC;
    ...
    outputDesc->mDevice = AudioSystem::DEVICE_OUT_SPEAKER;
    mHardwareOutput = mpClientInterface->openOutput(&outputDesc->mDevice,
                                                    &outputDesc->mSamplingRate,
                                                    &outputDesc->mFormat,
                                                    &outputDesc->mChannels,
                                                    &outputDesc->mLatency,
                                                    outputDesc->mFlags);
    // mHardwareOutput is the id of the playback thread.
    // Record in gOutputs that the playback thread is set to DEVICE_OUT_SPEAKER;
    // this refreshes AudioSystem's gOutputs information.
    setOutputDevice(mHardwareOutput, defaultDevice, true);
}

The device bits are defined in AudioSystem.h:

enum audio_devices {
    // output devices
    DEVICE_OUT_EARPIECE = 0x1,
    DEVICE_OUT_SPEAKER = 0x2,
    DEVICE_OUT_WIRED_HEADSET = 0x4,
    DEVICE_OUT_WIRED_HEADPHONE = 0x8,
    ...
    // input devices
    DEVICE_IN_COMMUNICATION = 0x10000,
    DEVICE_IN_AMBIENT = 0x20000,
    DEVICE_IN_BUILTIN_MIC = 0x40000,
    DEVICE_IN_BLUETOOTH_SCO_HEADSET = 0x80000,
    ...
};

gOutputs is consulted whenever functions such as the following need information about the current output stream:
getOutputSamplingRate
getOutputFrameCount
getOutputLatency
4. 2. Input/output device management (continued)

Input/output devices are changed via setDeviceConnectionState:

status_t AudioPolicyManagerBase::setDeviceConnectionState(...)
{
    // handle output devices
    if (AudioSystem::isOutputDevice(device)) {
        switch (state) {
        case AudioSystem::DEVICE_STATE_AVAILABLE:
            mAvailableOutputDevices |= device;
            break;
        case AudioSystem::DEVICE_STATE_UNAVAILABLE:
            mAvailableOutputDevices &= ~device;
            break;
        }
        // request routing change if necessary
        uint32_t newDevice = getNewDevice(mHardwareOutput, false);
        setOutputDevice(mHardwareOutput, newDevice);
    }
    // handle input devices
    if (AudioSystem::isInputDevice(device)) {
        ...
        mAvailableInputDevices |= device;
        mpClientInterface->setParameters(activeInput, param.toString());
    }
}
5. 2. Input/output device management (continued)

setForceUse(): force the use of a designated device only.
startOutput()/stopOutput(): at playback time, select and release the device matching the strategy.
startInput()/stopInput(): at record time, select and release the device matching the strategy.

void AudioPolicyManagerBase::setForceUse(AudioSystem::force_use usage, AudioSystem::forced_config config)
{
    ...
    case AudioSystem::FOR_MEDIA:
        mForceUse[usage] = config;
}

(Example pairing from the slide: usage AudioSystem::FOR_MEDIA forced to FORCE_SPEAKER.)

status_t AudioPolicyManagerBase::startOutput(...)
{
    // fetch the device to use via getNewDevice and set it
    setOutputDevice(output, getNewDevice(output));
}

status_t AudioPolicyManagerBase::stopOutput(...)
{
    // if the device just used for playback does not match the playback
    // thread's output, set it back to the playback thread's device
    if (output != mHardwareOutput) {
        setOutputDevice(mHardwareOutput, getNewDevice(mHardwareOutput), true);
    }
}
6. 3. Audio volume control and management

Per-stream volume settings (MIN, MAX):

AudioService.java
private int[] MAX_STREAM_VOLUME = new int[] {
    6,  // STREAM_VOICE_CALL
    7,  // STREAM_SYSTEM
    7,  // STREAM_RING
    15, // STREAM_MUSIC
    7,  // STREAM_ALARM
    7,  // STREAM_NOTIFICATION
    15, // STREAM_BLUETOOTH_SCO
    15, // STREAM_FM
    15, // STREAM_DTMF
    15, // STREAM_TTS
    7,  // STREAM_SYSTEM_ENFORCED
};

The VolumeStreamState constructor sets the volume level range for each stream:

public class VolumeStreamState {
    private VolumeStreamState(String settingName, int streamType) {
        mIndexMax = MAX_STREAM_VOLUME[streamType];
        AudioSystem.initStreamVolume(streamType, 0, mIndexMax);
    }
}

which records the min/max values for each stream:

void AudioPolicyManagerBase::initStreamVolume(...) {
    mStreams[stream].mIndexMin = indexMin;
    mStreams[stream].mIndexMax = indexMax;
}
7. 3. Audio volume control and management (continued)

Volume adjustment:

AudioService.java
public void handleMessage(Message msg) {
    switch (baseMsgWhat) {
    case MSG_SET_SYSTEM_VOLUME:
        setSystemVolume((VolumeStreamState) msg.obj);
        break;
    }
}

private void setSystemVolume(VolumeStreamState streamState) {
    // Adjust volume
    setStreamVolumeIndex(streamState.mStreamType, streamState.mIndex);
    // send MSG_PERSIST_VOLUME so the sound value is reflected in settings
    sendMsg(mAudioHandler, MSG_PERSIST_VOLUME, streamState.mStreamType,
            SENDMSG_REPLACE, 1, 1, streamState, PERSIST_DELAY);
}

private void setStreamVolumeIndex(int stream, int index) {
    AudioSystem.setStreamVolumeIndex(stream, (index + 5)/10);
}

status_t AudioPolicyManagerBase::setStreamVolumeIndex(AudioSystem::stream_type stream, int index)
{
    status_t volStatus = checkAndSetVolume(stream, index, mOutputs.keyAt(i), mOutputs.valueAt(i)->device());
}
8. 3. Audio volume control and management (continued)

status_t AudioPolicyManagerBase::checkAndSetVolume(int stream, int index, ...)
{
    // convert the incoming integer volume value to a float
    float volume = computeVolume(stream, index, output, device);
    mpClientInterface->setStreamVolume((AudioSystem::stream_type)stream, volume, output, delayMs);
}

The volume is then set on AudioFlinger's mStreamTypes variable:

status_t AudioFlinger::PlaybackThread::setStreamVolume(int stream, float value)
{
    mStreamTypes[stream].volume = value;
}

At playback time, the volume value stored in this structure is applied.