- **[Audio Driver Development Procedure](#section3200)**
    - [Development on the Existing Platform](#section3221)
    - [Development on a New Platform](#section3222)
- **[Audio Driver Development Examples](#section4000)**
    - [Codec Driver Development Example](#section4100)
        - [Filling in Codec Data Structures](#section4111)
        - [Initializing codecDevice and codecDai Devices](#section4112)
        - [Implementing the Codec Operation Function Set](#section4113)
        - [Registering and Binding Codec to HDF](#section4114)
        - [Configuring HCS](#section4115)
    - [Accessory Driver Development Example](#section4200)
        - [Filling in Accessory Data Structures](#section4221)
        - [Initializing accessoryDevice and accessoryDai Devices](#section4222)
        - [Implementing the Accessory Operation Function Set](#section4223)
        - [Registering and Binding Accessory to HDF](#section4224)
        - [Configuring HCS](#section4225)
    - [Platform Driver Development Example](#section4300)
        - [Filling in Platform Data Structures](#section4331)
        - [Initializing the dmaDevice Device](#section4332)
        - [Implementing the DMA Operation Function Set](#section4333)
        - [Registering and Binding Platform to HDF](#section4334)
        - [Configuring HCS](#section4335)
    - [DAI Driver Development Example](#section4400)
        - [Filling in DAI Data Structures](#section4441)
        - [Initializing the daiDevice Device](#section4442)
        - [Implementing the DAI Operation Function Set](#section4443)
        - [Registering and Binding DAI to HDF](#section4444)
        - [Configuring HCS](#section4445)
    - [Adding Compilation Configuration to Makefile](#section4500)
    - [Source Code Structure and Directory](#section4600)
- **[Development Procedure and Example Using HAL](#section5000)**
    - [Development Procedure](#section5100)
    - [Development Example](#section5200)
- **[Summary](#section9999)**
# Audio Driver Overview<a name="section1000"></a>
A multimedia system is an indispensable part of Internet of Things (IoT) devices. Audio is an important module of the multimedia system, and building an audio driver model is particularly important in development.
This document describes the audio driver architecture, its functional components, and how to develop audio drivers based on the Hardware Driver Foundation (HDF). Based on this architecture, chip vendors can develop their own drivers and invoke the hardware abstraction layer (HAL) APIs.
The audio driver architecture is implemented based on the [HDF](https://device.harmonyos.com/en/docs/documentation/guide/driver-hdf-overview-0000001051715456) and is organized as follows:

The driver architecture consists of the following:
- Hardware Device Interface (HDI) adapter: implements the audio HAL driver (HDI adaptation) and provides hardware driver capability interfaces for the audio service (frameworks). The HDI adapter provides interface objects such as Audio Manager, Audio Adapter, Audio Control, Audio Capture and Audio Render.
- Audio interface lib: works with the audio driver model in the kernel to control audio hardware, read recording data, and write playback data. It contains **Stream\_ctrl\_common**, which is used to interact with the audio HDI adapter layer.
- Audio Driver Model (ADM): serves the multimedia audio subsystem and enables system developers to develop applications based on scenarios. With ADM, codec and DSP device vendors can adapt their driver code based on the unified interfaces provided by the ADM and implement quick development and easy adaptation to the OpenHarmony system.
- Audio Control Dispatch: receives control instructions from the library layer and distributes the control instructions to the driver layer.
- Audio Stream Dispatch: receives data from the library layer and distributes the data to the driver layer.
- Card Manager: manages multiple audio adapters (sound cards). Each audio adapter consists of the digital audio interface (DAI), Platform, Codec, Accessory, DSP, and Smart Audio Power Manager (SAPM) modules.
- Platform Driver: serves as the driver adaptation layer.
- Smart Audio Power Manager (SAPM): optimizes the power consumption policy of the ADM.
The audio driver provides the **hdf\_audio\_render**, **hdf\_audio\_capture** and **hdf\_audio\_control** services for the HDI layer. The driver service nodes in the **dev** directory of the development board are as follows:
```c
# ls -l hdf_audio*
crw-rw---- 1 system system 248,  6 1970-01-01 00:00 hdf_audio_capture     // Audio data recording streaming service.
crw-rw---- 1 system system 248,  4 1970-01-01 00:00 hdf_audio_codec_dev0  // Name of audio adapter device 0.
crw-rw---- 1 system system 248,  4 1970-01-01 00:00 hdf_audio_codec_dev1  // Name of audio adapter device 1.
crw-rw---- 1 system system 248,  5 1970-01-01 00:00 hdf_audio_control     // Audio control streaming service.
crw-rw---- 1 system system 248,  7 1970-01-01 00:00 hdf_audio_render      // Audio data playback streaming service.
```
The audio adapter devices have the following driver services:
hdf\_audio\_codec\_dev0
- **dma\_service\_0**: DMA service
- **dai\_service**: CPU DAI service
- **codec\_service\_0**: codec service (built-in codec)
- **dsp\_service\_0**: DSP service (optional)

hdf\_audio\_codec\_dev1
- **dma\_service\_0**: DMA service
- **dai\_service**: CPU DAI service
- **codec\_service\_1**: accessory service (SmartPA)
- **dsp\_service\_0**: DSP service (optional)
### Startup Process<a name="section3111"></a>

1. When the system starts, the Platform, Codec, Accessory, DSP and DAI drivers of the audio module are loaded first. Each driver obtains the configuration information from its configuration file and saves the obtained information to the data structures.
2. Each driver module calls the ADM registration interface to add itself to the linked list of the driver module.
3. The ADM reads the hdf\_audio\_driver\_0 and hdf\_audio\_driver\_1 configuration and loads the devices of each module.
4. The ADM module initializes each module device by calling the initialization API of each module.
5. The initialized audio devices are added to the cardManager linked list.
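As a reference, the registration in step 2 is what each module driver performs in its HDF **Init** callback. Below is a minimal sketch for a codec driver, assuming the **AudioRegisterCodec** interface described later in this document; **g_codecData** and **g_codecDaiData** are hypothetical placeholders that would normally be filled from the HCS configuration.
```c
/* Sketch only: the structures are placeholders normally filled from the HCS configuration. */
static struct CodecData g_codecData;     /* codec operation set and configuration */
static struct DaiData g_codecDaiData;    /* codec DAI operation set and configuration */

static int32_t CodecDriverInit(struct HdfDeviceObject *device)
{
    if (device == NULL) {
        return HDF_ERR_INVALID_OBJECT;
    }
    /* Step 2: add this codec and its DAI to the ADM driver linked list. */
    return AudioRegisterCodec(device, &g_codecData, &g_codecDaiData);
}
```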
### Playback Process<a name="section3112"></a>

1. The Interface Lib dispatches the **Render Open** instruction through the service launched by the driver for handling the playback stream (referred to as the driver service hereinafter). Upon receiving the instruction, the Stream Dispatch service calls the API of each module to deliver the instruction.
2. The Interface Lib dispatches a path select instruction through the control service. Upon receiving the instruction, the Control Dispatch service calls the DAI API to set the path.
3. The Interface Lib dispatches hardware parameters through the driver service. Upon receiving the parameters, the Stream Dispatch service calls the API of each module to set hardware parameters.
4. The Interface Lib dispatches the start playing instruction through the driver service. Upon receiving the instruction, the Stream Dispatch service calls the API of each module to perform related settings for each module.
5. The Interface Lib dispatches audio data through the driver service. Upon receiving the data, the Stream Dispatch service calls the **Platform AudioPcmWrite** API to send the audio data to direct memory access (DMA).
6. The Interface Lib dispatches the stop playing instruction through the driver service. Upon receiving the instruction, the Stream Dispatch service calls the stop API of each module to perform related settings for each module.
7. The Interface Lib dispatches the **Render Close** instruction through the driver service. Upon receiving the instruction, the Stream Dispatch service calls the **Platform AudioRenderClose** API to release resources.
### Control Process<a name="section3113"></a>

1. When the volume needs to be adjusted, the Interface Lib dispatches the instruction for obtaining the volume range through the control service. Upon receiving the instruction, the Control Dispatch service parses the instruction and calls **Get()** of the Codec module to obtain the volume range.
2. The Interface Lib dispatches the instruction for setting the volume through the control service. Upon receiving the instruction, the Control Dispatch service parses the instruction and calls **Set()** of the Codec module to set the volume.
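To illustrate where **Get()** and **Set()** live, the sketch below shows a volume control entry in the style of the ADM control list (**g_audioControls** mentioned later in this document). The structure layout and the **AudioCodecGetVolume**/**AudioCodecSetVolume** callbacks are assumptions for illustration only; check the control definitions of your platform.
```c
/* Illustrative control entry; field names follow ADM conventions and are assumptions here. */
static struct AudioKcontrol g_volumeControls[] = {
    {
        .name = "Main Playback Volume",  /* matched against the instruction from the library layer */
        .iface = 0,                      /* 0: virtual dac device (see the iface table later in this document) */
        .Get = AudioCodecGetVolume,      /* hypothetical: reports the current value and range */
        .Set = AudioCodecSetVolume,      /* hypothetical: writes the new value to the codec register */
    },
};
```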
## Audio Driver Development Procedure<a name="section3200"></a>
### Development on the Existing Platform<a name="section3221"></a>
The following figure shows the driver development process for adapting the ADM to the codec or accessory (SmartPA) of the existing platform (Hi3516D V300).

- Add register information to the private HDF configuration source (HCS) file of the codec or SmartPA based on the chip description.
- If the workflow of the newly added codec or SmartPA is the same as that of the existing codec or SmartPA, you do not need to implement the operation function set or configure the compilation file for the newly added codec or SmartPA.
- Perform build, debugging, and testing.
### Development on a New Platform<a name="section3222"></a>
The following figure shows the driver development process of the ADM on a new platform.

The audio-related drivers codec (optional), DAI, DMA, DSP (optional), and SmartPA (optional) need to be adapted to the new platform.
- Add register information of each module driver to the private configuration file of each module according to the chip description.
- Implement the operation function set of each module.
- Modify the compilation file of the audio module.
- Perform build, debugging, and testing.
# Audio Driver Development Examples<a name="section4000"></a>
Code path: **drivers/peripheral/audio**
The following uses Hi3516D V300 as an example to describe how to develop the audio codec driver, accessory driver, DAI driver, and platform driver.
## Codec Driver Development Example<a name="section4100"></a>
### Initializing CodecDevice and CodecDai Devices<a name="section4112"></a>
**CODECDeviceInit** sets audio input/audio output (AIAO), initializes registers, inserts **g_audioControls** into the controller linked list, initializes the power management, and selects a path.
```c
...
    AUDIO_DRIVER_LOG_DEBUG("codec dai device name: %s\n", device->devDaiName);
    (void)card;
    return HDF_SUCCESS;
}
```
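For reference, a skeleton of the codecDevice side might look as follows. This is a sketch only: **CodecSetAiaoConfig** and **CodecRegDefaultInit** are hypothetical helpers standing in for the AIAO setup and register initialization, and the add-controls call should be checked against the ADM core interfaces on your platform.
```c
/* Skeleton only; the helper names are hypothetical. */
int32_t CodecDeviceInit(struct AudioCard *audioCard, const struct CodecDevice *codec)
{
    if (audioCard == NULL || codec == NULL) {
        return HDF_ERR_INVALID_PARAM;
    }
    CodecSetAiaoConfig();        /* configure the AIAO module */
    CodecRegDefaultInit();       /* apply the initSeqConfig register defaults */
    /* Insert g_audioControls into the controller linked list of the sound card. */
    if (AudioAddControls(audioCard, g_audioControls,
            sizeof(g_audioControls) / sizeof(g_audioControls[0])) != HDF_SUCCESS) {
        return HDF_FAILURE;
    }
    return HDF_SUCCESS;
}
```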
### Implementing the Codec Operation Function Set<a name="section4113"></a>
The codec module encapsulates the register **read()** and **write()** functions provided by the operating system abstraction layer (OSAL).
If the OSAL read and write functions cannot operate the registers on the new platform, implement custom **read()** and **write()** functions.
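As an illustration, a memory-mapped register access pair built on the OSAL I/O helpers might look as follows. The base address and size correspond to **chipIdRegister** and **chipIdSize** in the HCS **idInfo** node; **OsalIoRemap**, **OSAL_READL**, and **OSAL_WRITEL** are assumed to be available from **osal_io.h** on your platform.
```c
#include "osal_io.h"

#define CODEC_REG_BASE 0x113c0000  /* chipIdRegister in the HCS idInfo node */
#define CODEC_REG_SIZE 0x1000      /* chipIdSize in the HCS idInfo node */

static void *g_codecRegBase = NULL;

static int32_t CodecRegRead(uint32_t reg, uint32_t *val)
{
    if (val == NULL) {
        return HDF_ERR_INVALID_PARAM;
    }
    if (g_codecRegBase == NULL) {
        g_codecRegBase = OsalIoRemap(CODEC_REG_BASE, CODEC_REG_SIZE);
        if (g_codecRegBase == NULL) {
            return HDF_FAILURE;
        }
    }
    *val = OSAL_READL((uintptr_t)g_codecRegBase + reg);  /* read the register value */
    return HDF_SUCCESS;
}

static int32_t CodecRegWrite(uint32_t reg, uint32_t val)
{
    if (g_codecRegBase == NULL) {
        return HDF_FAILURE;
    }
    OSAL_WRITEL(val, (uintptr_t)g_codecRegBase + reg);   /* write the register value */
    return HDF_SUCCESS;
}
```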
### Registering and Binding Codec to HDF<a name="section4114"></a>
This process depends on the driver implementation mode of the HDF. For details, see [HDF](https://gitee.com/openharmony/docs/blob/master/en/device-dev/driver/driver-hdf.md).
Fill in the **g_codecDriverEntry** structure. Ensure that the value of **moduleName** is the same as that in **device_info.hcs**. Implement the pointers to the **Bind**, **Init**, and **Release** functions.
**CodecDriverInit** obtains the codec service name and private register configuration, and inserts them into the linked list by using **AudioRegisterCodec**.
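The registration boilerplate typically takes the following shape (**moduleName** must match the value configured in **device_info.hcs**; the **Bind**, **Init**, and **Release** implementations are the driver's own):
```c
/* HDF entry of the codec driver; moduleName must match device_info.hcs. */
struct HdfDriverEntry g_codecDriverEntry = {
    .moduleVersion = 1,
    .moduleName = "CODEC_HI3516",
    .Bind = CodecDriverBind,
    .Init = CodecDriverInit,
    .Release = CodecDriverRelease,
};
HDF_INIT(g_codecDriverEntry);
```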
### Configuring HCS<a name="section4115"></a>
Configure the driver node, loading sequence, and service name in the .hcs file. For details about the HCS syntax, see [Driver Configuration Management](https://gitee.com/openharmony/docs/blob/master/en/device-dev/driver/driver-hdf-manage.md) in the HDF.
**Configuring Codec Device Information in device_info.hcs**
Add the codec node configuration. Modify **moduleName** in the configuration file. The value must be the same as **moduleName** in the **HdfDriverEntry** structure and generally reflects the hardware platform, for example, **moduleName = "CODEC_HI3516"**.
The code snippet is as follows:
```c
audio :: host {
    device_codec :: device {
        device0 :: deviceNode {
            policy = 1;                            // The codec module provides services only for the kernel.
            priority = 50;                         // The codec module must be loaded before the HDF_AUDIO module.
            preload = 0;
            permission = 0666;
            moduleName = "CODEC_HI3516";           // The value must be the same as moduleName in HdfDriverEntry.
            serviceName = "codec_service_0";       // Name of the service provided externally.
            deviceMatchAttr = "hdf_codec_driver";  // Name of the private attribute, used to match the corresponding private data (including the register configuration).
        }
    }
}
```
**Configuring Dependencies in audio_config.hcs**
Configure dependencies between the codec, platform, DAI, DSP, and accessory for the audio_card device.
The code snippet is as follows:
```c
root {
    platform {
        ...
        controller_0x120c1001 :: card_controller {
            // Private data attribute name, which must be the same as deviceMatchAttr in device_info.hcs.
            match_attr = "hdf_audio_driver_1";
            serviceName = "hdf_audio_smartpa_dev0";  // Name of the service provided externally.
            accessoryName = "codec_service_1";       // External codec (accessory) service name.
            platformName = "dma_service_0";          // DMA service name.
            cpuDaiName = "dai_service";              // CPU DAI service name.
            accessoryDaiName = "accessory_dai";      // External (accessory) DAI name.
            dspName = "dsp_service_0";               // DSP service name.
            dspDaiName = "dsp_dai";                  // DSP DAI name.
        }
    }
}
```
**Configuring Private Registers in codec_config.hcs**
The configuration matches **deviceMatchAttr** of the codec configured in **device_info.hcs** and includes the register configuration.
To bind the control functions, configure each control function and its register parameters in the .hcs file according to the unified structure specifications. The driver obtains and parses the configuration and adds the controls to the controller linked list.
- **regConfig**: register and control function configuration
- **ctrlParamsSeqConfig**: control function register configuration
- **daiStartupSeqConfig**: DAI startup configuration
- **resetSeqConfig**: reset process register configuration
- **initSeqConfig**: initialization process register configuration
- **controlsConfig**: control function configuration. The **array index** (specific service scenario) and **iface** (same as the HAL) are fixed values.
```
array index
0: Main Playback Volume
1: Main Capture Volume
2: Playback Mute
3: Capture Mute
4: Mic Left Gain
5: Mic Right Gain
6: External Codec Enable
7: Internally Codec Enable
8: Render Channel Mode
9: Capture Channel Mode
iface
0: virtual dac device
1: virtual adc device
2: virtual adc device
3: virtual mixer device
4: Codec device
5: PGA device
6: AIAO device
```
**ctrlParamsSeqConfig**: control function register configuration. The **item** sequence corresponds to the **item** sequence in **controlsConfig**, indicating the register configuration corresponding to a function.
```c
root {
    platform {
        template codec_controller {
            match_attr = "";
            serviceName = "";
            codecDaiName = "";
        }
        controller_0x120c1030 :: codec_controller {
            match_attr = "hdf_codec_driver";
            serviceName = "codec_service_0";
            codecDaiName = "codec_dai";
            /* Base address information of the Hi3516 codec registers */
            idInfo {
                chipName = "hi3516";          // Codec name
                chipIdRegister = 0x113c0000;  // Codec base address
                chipIdSize = 0x1000;          // Codec address offset
            }
            /* Register configuration */
            regConfig {
                /* reg: register address
                   rreg: register address
                   shift: shift bits
                   rshift: rshift bits
                   min: minimum value
                   max: maximum value
                   mask: mask of the value
                   invert: enum InvertVal 0-uninvert 1-invert
                   value: value
                */
                /* reg, value */
                initSeqConfig = [
                    0x14, 0x04000002,
                    0x18, 0xFD200004,
                    0x1C, 0x00180018,
                    0x20, 0x83830028,
                    0x24, 0x00005C5C,
                    0x28, 0x00130000,
                    0x30, 0xFF035A00,
                    0x34, 0x08000001,
                    0x38, 0x06062424,
                    0x3C, 0x1E1EC001,
                    0x14, 0x04000002
                ];
                /* control function config
                   array index, iface, enable */
                controlsConfig = [
                    0, 0, 0,
                    1, 1, 1,
                    2, 0, 1,
                    3, 1, 1,
                    4, 2, 1,
                    5, 2, 1,
                    8, 6, 0,
                    9, 6, 0,
                ];
                /* control function register config
                   reg, rreg, shift, rshift, min, max, mask, invert, value */
                ctrlParamsSeqConfig = [
                    ...
                ];
            }
        }
    }
}
```
When the codec is registered, the input parameter **device** contains the controller_0x120c1030 node information. You only need to parse the node to obtain the configuration information.
Obtain and parse the configuration of the **regConfig** node; after the configuration files are parsed, the register information in the code can be updated directly.
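For reference, reading the **idInfo** values with the HDF device resource interface could look like the sketch below; the exact parsing helpers used by the audio driver may differ.
```c
#include "device_resource_if.h"

/* Sketch: read chipIdRegister and chipIdSize from the idInfo child node. */
static int32_t CodecParseIdInfo(const struct DeviceResourceNode *codecNode)
{
    uint32_t regBase = 0;
    uint32_t regSize = 0;
    struct DeviceResourceIface *drsOps = DeviceResourceGetIfaceInstance(HDF_CONFIG_SOURCE);
    if (codecNode == NULL || drsOps == NULL || drsOps->GetChildNode == NULL || drsOps->GetUint32 == NULL) {
        return HDF_FAILURE;
    }
    const struct DeviceResourceNode *idNode = drsOps->GetChildNode(codecNode, "idInfo");
    if (idNode == NULL ||
        drsOps->GetUint32(idNode, "chipIdRegister", &regBase, 0) != HDF_SUCCESS ||
        drsOps->GetUint32(idNode, "chipIdSize", &regSize, 0) != HDF_SUCCESS) {
        return HDF_FAILURE;
    }
    AUDIO_DRIVER_LOG_DEBUG("codec reg base: 0x%x, size: 0x%x", regBase, regSize);
    return HDF_SUCCESS;
}
```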
## Accessory Driver Development Example<a name="section4200"></a>
SmartPA is a type of accessory driver. Its development procedure is similar to that of the codec driver:
1. Define and fill in an accessory instance.
2. Implement callbacks for the accessory instance.
3. Register and bind the accessory instance to the HDF.
4. Configure the HCS and makefile.
### Filling in Accessory Data Structures<a name="section4221"></a>
Fill in the following data structures for the accessory module:
- **g_tfa9879Data**: operation function set of the accessory device. It contains the configuration in the .hcs file and defines and maps the methods for initializing the accessory device and reading and writing registers.
- **g_tfa9879DaiDeviceOps**: operation function set of the DAI of the accessory device. It defines and maps the DAI operations, such as startup and hardware parameter setting.
- **g_tfa9879DaiData**: data set of the DAI of the accessory device. It defines and maps the driver name, initialization, and operation set of the data access interface of the accessory device.
```c
struct AccessoryData g_tfa9879Data = {
    .Init = Tfa9879DeviceInit,
    .Read = AccessoryDeviceRegRead,
    .Write = AccessoryDeviceRegWrite,
};

struct AudioDaiOps g_tfa9879DaiDeviceOps = {
    .Startup = Tfa9879DaiStartup,
    .HwParams = Tfa9879DaiHwParams,
};

struct DaiData g_tfa9879DaiData = {
    .drvDaiName = "accessory_dai",
    .DaiInit = Tfa9879DaiDeviceInit,
    .ops = &g_tfa9879DaiDeviceOps,
};
```
### Initializing accessoryDevice and accessoryDai Devices<a name="section4222"></a>
As the entry function for device initialization, **Tfa9879DeviceInit** sets the address of the SmartPA I2C device, obtains configuration data, initializes (including resets) the device registers, and adds the control function to the controller linked list. The current demo also includes the initialization of the registers related to the Hi3516D V300 device, such as initialization of GPIO pins.
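For instance, pulling a SmartPA enable/reset pin high through the HDF GPIO interface could look as follows; the pin number is a made-up placeholder and must be taken from your board schematic.
```c
#include "gpio_if.h"

#define SMARTPA_ENABLE_GPIO 6  /* placeholder pin number */

/* Drive the SmartPA enable pin high during device initialization. */
static int32_t SmartPaPinInit(void)
{
    if (GpioSetDir(SMARTPA_ENABLE_GPIO, GPIO_DIR_OUT) != HDF_SUCCESS) {
        return HDF_FAILURE;
    }
    return GpioWrite(SMARTPA_ENABLE_GPIO, GPIO_VAL_HIGH);
}
```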
### Implementing the Accessory Operation Function Set<a name="section4223"></a>
The callbacks **AccessoryDeviceRegRead** and **AccessoryDeviceRegWrite** invoke **AccessoryI2cReadWrite** to read and write the control register values.
```c
...
    daiParamsVal.channelVal = param->channels;  // Set the audio channel.
    ret = AccessoryDaiParamsUpdate(daiParamsVal);
    ...
    return HDF_SUCCESS;
}
```
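A sketch of the underlying I2C transfer using the HDF platform I2C interface is shown below; the bus number and device address are placeholders, and the two-byte register layout is used only for illustration.
```c
#include "i2c_if.h"

#define SMARTPA_I2C_BUS  6     /* placeholder I2C bus number */
#define SMARTPA_I2C_ADDR 0x6D  /* placeholder 7-bit device address */

/* Write a register address followed by a 16-bit value to the SmartPA over I2C. */
static int32_t SmartPaI2cWrite(uint8_t regAddr, uint16_t value)
{
    uint8_t buf[3] = { regAddr, (uint8_t)(value >> 8), (uint8_t)(value & 0xFF) };
    struct I2cMsg msg = {
        .addr = SMARTPA_I2C_ADDR,
        .buf = buf,
        .len = sizeof(buf),
        .flags = 0,            /* 0 indicates a write transfer */
    };
    DevHandle handle = I2cOpen(SMARTPA_I2C_BUS);
    if (handle == NULL) {
        return HDF_FAILURE;
    }
    int32_t ret = (I2cTransfer(handle, &msg, 1) == 1) ? HDF_SUCCESS : HDF_FAILURE;
    I2cClose(handle);
    return ret;
}
```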
### Registering and Binding Accessory to HDF<a name="section4224"></a>
This process depends on the driver implementation mode of the HDF. For details, see [HDF](https://gitee.com/openharmony/docs/blob/master/en/device-dev/driver/driver-hdf.md).
Fill in the **g_tfa9879DriverEntry** structure. Ensure that the value of **moduleName** is the same as that in **device_info.hcs**. Implement the pointers to the **Bind**, **Init**, and **Release** functions.
## Platform Driver Development Example<a name="section4300"></a>
### Implementing the DMA Operation Function Set<a name="section4333"></a>
The DMA device operation function set includes the encapsulation of DMA common APIs. If the common APIs cannot meet development requirements, you can implement new DMA callbacks.
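The operation set is filled the same way as for the other modules. The member and callback names below follow the pattern of the other examples in this document but are assumptions; verify them against the audio platform interface header on your platform.
```c
/* Sketch: DMA operation set of the platform driver; member and function names are assumptions. */
static struct AudioDmaOps g_dmaDeviceOps = {
    .DmaBufAlloc = Hi3516DmaBufAlloc,              /* allocate the DMA buffer */
    .DmaBufFree = Hi3516DmaBufFree,                /* release the DMA buffer */
    .DmaRequestChannel = Hi3516DmaRequestChannel,  /* request a DMA channel */
    .DmaConfigChannel = Hi3516DmaConfigChannel,    /* configure the DMA channel */
    .DmaSubmit = Hi3516DmaSubmit,                  /* submit a transfer */
    .DmaPending = Hi3516DmaPending,                /* start the pending transfer */
    .DmaPause = Hi3516DmaPause,                    /* pause the transfer */
    .DmaResume = Hi3516DmaResume,                  /* resume the transfer */
    .DmaPointer = Hi3516DmaPointer,                /* report the current hardware pointer */
};
```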
### Registering and Binding Platform to HDF<a name="section4334"></a>
This process depends on the driver implementation mode of the HDF. For details, see [HDF](https://gitee.com/openharmony/docs/blob/master/en/device-dev/driver/driver-hdf.md).
- Fill in the **g_platformDriverEntry** structure.
- Ensure that the value of **moduleName** is the same as that in **device_info.hcs**.
- Implement the pointers to the **Bind**, **Init**, and **Release** functions.
## DAI Driver Development Example<a name="section4400"></a>
The major steps for developing the DAI driver are as follows:
1. Define and fill in a DAI instance.
2. Implement callbacks for the DAI instance.
3. Register and bind the DAI instance to the HDF.
4. Configure the HCS and makefile.
### Filling in DAI Data Structures<a name="section4441"></a>
Fill in the following structures for the DAI module:
- **g_daiData**: private configuration of the DAI device, including the initialization of the DAI device, read/write of registers, and operation functions.
- **g_daiDeviceOps**: DAI device operation function set, including setting DAI parameters and triggering and starting the DAI device.
```c
struct AudioDaiOps g_daiDeviceOps = {
    .HwParams = DaiHwParams,
    .Trigger = DaiTrigger,
    .Startup = DaiStartup,
};

struct DaiData g_daiData = {
    .DaiInit = DaiDeviceInit,
    .Read = AudioDeviceReadReg,
    .Write = AudioDeviceWriteReg,
    .ops = &g_daiDeviceOps,
};
```
### Initializing the daiDevice Device<a name="section4442"></a>
**DaiDeviceInit** initializes DAI configuration and adds the information to the controller linked list.
### Registering and Binding DAI to HDF<a name="section4444"></a>
This process depends on the driver implementation mode of the HDF. For details, see [HDF](https://gitee.com/openharmony/docs/blob/master/en/device-dev/driver/driver-hdf.md).
- Fill in the **g_daiDriverEntry** structure.
- Ensure that the value of **moduleName** is the same as that in **device_info.hcs**.
- Implement the pointers to the **Bind**, **Init**, and **Release** functions.
## Source Code Structure and Directory<a name="section4600"></a>
The development example implements the functions declared in the driver interface header files. The following uses Hi3516 as an example to describe the directory structure.
Path of the driver implementation sample code: **drivers/peripheral/audio/chipsets**
# Development Procedure and Example Using HAL<a name="section5000"></a>
Code path: **drivers/peripheral/audio/hal**
## Development Procedure<a name="section5100"></a>

1. Call **GetAudioManagerFuncs()** to obtain functions.
2. Call **GetAllAdapters()** to obtain information about the supported audio adapters and call **LoadAdapter()** to load the corresponding audio adapter.
3. Create an audio player class by calling **CreateRender()** or create a recorder class, and deliver the audio attributes.
4. Use the created audio player class to call **render->control.Start()** and **render->RenderFrame()**, which dispatch the start instruction and deliver audio data cyclically.
5. During the playback, call **render->control.Pause()**, **render->control.Resume()**, or **render->volume.SetVolume()** to control the audio player service, for example, to pause or resume the playback or adjust the volume.
6. After the audio player service is complete, stop the playback, destroy the audio player class, and unload the audio adapter.
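The snippet below is a hedged sketch of steps 1–3, assuming the HDI function names listed above (error handling trimmed); the original example that follows continues from the start instruction in step 4.
```c
/* Sketch: obtain the manager, load an adapter, and create a render. */
static int32_t RenderSetupSketch(struct AudioAdapter **adapter, struct AudioRender **render)
{
    struct AudioAdapterDescriptor *descs = NULL;
    int size = 0;
    struct AudioDeviceDescriptor devDesc = {0};  /* fill in the output device/port information */
    struct AudioSampleAttributes attrs = {0};    /* fill in sampling rate, format, channel count */

    struct AudioManager *manager = GetAudioManagerFuncs();  /* Step 1: obtain the function set. */
    if (manager == NULL || manager->GetAllAdapters(manager, &descs, &size) != 0 || size == 0) {
        return -1;
    }
    if (manager->LoadAdapter(manager, &descs[0], adapter) != 0) {  /* Step 2: load an audio adapter. */
        return -1;
    }
    return (*adapter)->CreateRender(*adapter, &devDesc, &attrs, render);  /* Step 3: create the player class. */
}
```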
```c
...
    /* Deliver the number of the audio to be played. */
    render->control.Start((AudioHandle)render);                      // Dispatch the start instruction and prepare for the action.
    pthread_create(&g_tids, NULL, (void *)(&FrameStart), &g_str);    // Start the thread to play the audio clip.
    /* Control instructions */
    render->control.Pause((AudioHandle)render);                      // Pause the playback.
    render->control.Resume((AudioHandle)render);                     // Resume the playback.
    render->volume.SetVolume((AudioHandle)render, 0.5);              // Set the volume.
    /* Stop the playback and destroy the audio player class. */
    render->control.Stop((AudioHandle)render);
    adapter->DestroyRender(adapter, render);
    /* Unload the audio adapter. */
    manager->UnloadAdapter(manager, adapter);
}
```
# Summary<a name="section9999"></a>
This document describes the key adaptation points involved in driver development based on the audio driver framework and elaborates on how to adapt the audio driver and use the HDI APIs. By referring to this document, you can develop audio drivers for the chip you use.