Commit 43e2896e authored by Geevarghese V K

JS API usage doc update based on new d.ts

Signed-off-by: Geevarghese V K <geevarghese.v.k1@huawei.com>
Parent 0d8c9861
......@@ -146,7 +146,13 @@ You use audio management APIs to set and obtain volume, and get information abou
</th>
</tr>
</thead>
<tbody><tr id="row188162012454"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p764215288462"><a name="p764215288462"></a><a name="p764215288462"></a>MEDIA = 1</p>
<tbody>
<tr id="row188162012454"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p764215288462"><a name="p764215288462"></a><a name="p764215288462"></a>VOICE_CALL = 0</p>
</td>
<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p1596200459"><a name="p1596200459"></a><a name="p1596200459"></a>Audio streams for voice calls</p>
</td>
</tr>
<tr id="row188162012454"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p764215288462"><a name="p764215288462"></a><a name="p764215288462"></a>MEDIA = 1</p>
</td>
<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p1596200459"><a name="p1596200459"></a><a name="p1596200459"></a>Audio streams for media purpose</p>
</td>
......@@ -154,6 +160,11 @@ You use audio management APIs to set and obtain volume, and get information abou
<tr id="row1288915367468"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p51611346194614"><a name="p51611346194614"></a><a name="p51611346194614"></a>RINGTONE = 2</p>
</td>
<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p9333131144712"><a name="p9333131144712"></a><a name="p9333131144712"></a>Audio streams for ring tones</p>
<tr id="row188162012454"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p764215288462"><a name="p764215288462"></a><a name="p764215288462"></a>VOICE_ASSISTANT = 9</p>
</td>
<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p1596200459"><a name="p1596200459"></a><a name="p1596200459"></a>Audio streams for voice assistant</p>
</td>
</tr>
</td>
</tr>
</tbody>
......@@ -222,27 +233,27 @@ You use audio management APIs to set and obtain volume, and get information abou
<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p17389145016497"><a name="p17389145016497"></a><a name="p17389145016497"></a>Invalid device</p>
</td>
</tr>
<tr id="row938915016493"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p538925044916"><a name="p538925044916"></a><a name="p538925044916"></a>SPEAKER = 1</p>
<tr id="row938915016493"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p538925044916"><a name="p538925044916"></a><a name="p538925044916"></a>SPEAKER = 2</p>
</td>
<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p16724165865017"><a name="p16724165865017"></a><a name="p16724165865017"></a>Speaker</p>
</td>
</tr>
<tr id="row12389105084916"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p538914502497"><a name="p538914502497"></a><a name="p538914502497"></a>WIRED_HEADSET = 2</p>
<tr id="row12389105084916"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p538914502497"><a name="p538914502497"></a><a name="p538914502497"></a>WIRED_HEADSET = 3</p>
</td>
<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p63891850144911"><a name="p63891850144911"></a><a name="p63891850144911"></a>Wired headset</p>
</td>
</tr>
<tr id="row2389205074915"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p10389175054919"><a name="p10389175054919"></a><a name="p10389175054919"></a>BLUETOOTH_SCO = 3</p>
<tr id="row2389205074915"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p10389175054919"><a name="p10389175054919"></a><a name="p10389175054919"></a>BLUETOOTH_SCO = 7</p>
</td>
<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p538905016496"><a name="p538905016496"></a><a name="p538905016496"></a>Bluetooth device using the synchronous connection oriented link (SCO)</p>
<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p538905016496"><a name="p538905016496"></a><a name="p538905016496"></a>Bluetooth device using the synchronous connection oriented (SCO) link</p>
</td>
</tr>
<tr id="row83891502499"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p1938975015494"><a name="p1938975015494"></a><a name="p1938975015494"></a>BLUETOOTH_A2DP = 4</p>
<tr id="row83891502499"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p1938975015494"><a name="p1938975015494"></a><a name="p1938975015494"></a>BLUETOOTH_A2DP = 8</p>
</td>
<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p193891550134912"><a name="p193891550134912"></a><a name="p193891550134912"></a>Bluetooth device using advanced audio distribution profile (A2DP)</p>
</td>
</tr>
<tr id="row11389175014916"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p1738955018497"><a name="p1738955018497"></a><a name="p1738955018497"></a>MIC = 5</p>
<tr id="row11389175014916"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p1738955018497"><a name="p1738955018497"></a><a name="p1738955018497"></a>MIC = 15</p>
</td>
<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p73891250174914"><a name="p73891250174914"></a><a name="p73891250174914"></a>Microphone</p>
</td>
......@@ -269,5 +280,3 @@ You use audio management APIs to set and obtain volume, and get information abou
console.log(`Media getVolume ${value}`);
});
```
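The snippet above covers the obtain side; for completeness, a minimal sketch of setting the volume is shown below. It assumes the promise form of setVolume(volumeType, volume) described in js-apis-audio.md, and the volume level 10 is purely illustrative.
```
const audioManager = audio.getAudioManager();
// Set the media stream volume to an illustrative level of 10, then read it back.
audioManager.setVolume(audio.AudioVolumeType.MEDIA, 10).then(() => {
    return audioManager.getVolume(audio.AudioVolumeType.MEDIA);
}).then((value) => {
    console.log(`Media volume after set: ${value}`);
});
```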
......@@ -3,8 +3,6 @@
---
## ***Note***:
1. This document applies to JavaScript.
2. Changes to the AudioRenderer interface have been proposed.
When the updated APIs have been integrated, the document will be revised, and apps must adapt to it.
---
## **Summary**
This guide will show you how to use AudioRenderer to create an audio player app.
......@@ -24,11 +22,28 @@ Please see [**js-apis-audio.md**](https://gitee.com/openharmony/docs/blob/master
## **Usage**
Here's an example of how to use AudioRenderer to play a raw audio file.
1. Use **createAudioRenderer** to create an AudioRenderer instance for the **AudioVolumeType**.\
1. Use **createAudioRenderer** to create an AudioRenderer instance. Renderer parameters can be set in **audioRendererOptions**.\
This object can be used to play, control, and obtain the status of the playback, as well as receive callback notifications.
```
const volType = audio.AudioVolumeType.MEDIA; // For music
const audioRenderer = audio.createAudioRenderer(volType);
var audioStreamInfo = {
samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_44100,
channels: audio.AudioChannel.CHANNEL_1,
sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE,
encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW
}
var audioRendererInfo = {
content: audio.ContentType.CONTENT_TYPE_SPEECH,
usage: audio.StreamUsage.STREAM_USAGE_VOICE_COMMUNICATION,
rendererFlags: 1
}
var audioRendererOptions = {
streamInfo: audioStreamInfo,
rendererInfo: audioRendererInfo
}
let audioRenderer = await audio.createAudioRenderer(audioRendererOptions);
```
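Because the snippet above uses await, it has to run inside an async function in practice; a minimal wrapper is sketched below (the function name initRenderer is illustrative, not part of the API).
```
// Illustrative wrapper: createAudioRenderer is awaited inside an async function.
async function initRenderer() {
    let audioRenderer = await audio.createAudioRenderer(audioRendererOptions);
    console.info('AudioRenderer created, state: ' + audioRenderer.state);
    return audioRenderer;
}
```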
2. Subscribe to audio interruption events using the **on** API.\
......@@ -118,27 +133,8 @@ Here's an example of how to use AudioRenderer to play a raw audio file.
});
```
3. Prepare the renderer. Call **SetParams** on the instance. You need to set the renderer parameters based on the audio playback specification.
```
async function prepareRenderer() {
// file_example_WAV_2MG.wav
var audioParams = {
format: audio.AudioSampleFormat.SAMPLE_S16LE,
channels: audio.AudioChannel.STEREO,
samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_16000,
encoding: audio.AudioEncodingType.ENCODING_PCM,
};
let response = await audioRenderer.setParams(audioParams);
var state = audioRenderer.state;
if (state != audio.AudioState.STATE_PREPARED) {
console.info('Prepare renderer failed');
return;
}
}
```
4. Call the **start()** function on the AudioRenderer instance to start/resume the playback task.\
The renderer state will be STATE _RUNNING once the start is complete. You can then begin writing buffers.
The renderer state will be STATE_RUNNING once the start is complete. You can then begin writing buffers.
```
async function startRenderer() {
var state = audioRenderer.state;
......@@ -148,13 +144,14 @@ Here's an example of how to use AudioRenderer to play a raw audio file.
console.info('Renderer is not in a correct state to start');
return;
}
var started = await audioRenderer.start();
if (started) {
isPlay = true;
await audioRenderer.start();
state = audioRenderer.state;
if (state == audio.AudioState.STATE_RUNNING) {
console.info('Renderer started');
} else {
console.error('Renderer start failed');
return;
}
}
......@@ -212,8 +209,11 @@ Here's an example of how to use AudioRenderer to play a raw audio file.
console.info('Renderer is not running');
return;
}
var paused = await audioRenderer.pause();
if (paused) {
await audioRenderer.pause();
state = audioRenderer.state;
if (state == audio.AudioState.STATE_PAUSED) {
console.info('Renderer paused');
} else {
console.error('Renderer pause failed');
......@@ -226,8 +226,11 @@ Here's an example of how to use AudioRenderer to play a raw audio file.
console.info('Renderer is not running or paused');
return;
}
var stopped = await audioRenderer.stop();
if (stopped) {
await audioRenderer.stop();
state = audioRenderer.state;
if (state == audio.AudioState.STATE_STOPPED) {
console.info('Renderer stopped');
} else {
console.error('Renderer stop failed');
......@@ -243,8 +246,11 @@ Here's an example of how to use AudioRenderer to play a raw audio file.
console.info('Resources already released');
return;
}
var released = await audioRenderer.release();
if (released) {
await audioRenderer.release();
state = audioRenderer.state;
if (state == audio.AudioState.STATE_RELEASED) {
console.info('Renderer released');
} else {
console.error('Renderer release failed');
......@@ -257,7 +263,6 @@ Here's an example of how to use AudioRenderer to play a raw audio file.
You should also keep in mind that an AudioRenderer is state-based.
That is, the AudioRenderer has an internal state that you must always check when calling playback control APIs, because some operations are only acceptable while the renderer is in a given state.\
The system may throw an error/exception or generate other undefined behaviour if you perform an operation while in the improper state.\
Before each necessary operation, the example code performs a state check.
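As an illustration of such a check, the sketch below guards a pause call with the state property used throughout the examples above; the helper name pauseIfRunning is not part of the API.
```
// Hypothetical helper: only pause when the renderer is actually running.
async function pauseIfRunning() {
    if (audioRenderer.state != audio.AudioState.STATE_RUNNING) {
        console.info('Renderer is not running, nothing to pause');
        return;
    }
    await audioRenderer.pause();
}
```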
## **Asynchronous Operations:**
Most of the AudioRenderer calls are asynchronous. As a result, the UI thread will not be blocked.\
......@@ -267,4 +272,3 @@ provides reference for both callback and promise.
## **Other APIs:**
See [**js-apis-audio.md**](https://gitee.com/openharmony/docs/blob/master/en/application-dev/reference/apis/js-apis-audio.md) for more useful APIs like getAudioTime, drain, and getBufferSize.
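As a quick illustration of the callback and promise styles mentioned above, the sketch below queries the internal buffer size in both forms; see js-apis-audio.md for the exact getBufferSize signatures.
```
// Callback form (AsyncCallback<number> is assumed, as documented in js-apis-audio.md).
audioRenderer.getBufferSize((err, bufferSize) => {
    if (err) {
        console.error('getBufferSize failed');
    } else {
        console.log('Buffer size (callback): ' + bufferSize);
    }
});

// Promise form, awaited inside an async function.
let bufferSize = await audioRenderer.getBufferSize();
console.log('Buffer size (promise): ' + bufferSize);
```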
......@@ -2,15 +2,6 @@
This module provides the following functions: audio management, audio rendering and system sound management.
---
## ***Note:***
Changes to the AudioRenderer interface have been proposed.
When the updated APIs have been integrated, the document will be revised, and apps must adapt to it.
---
## Modules to Import<a name="s56d19203690d4782bfc74069abb6bd71"></a>
```
......@@ -71,8 +62,25 @@ Obtains an **AudioRenderer** instance.
**Example**
```
const volType = audio.AudioVolumeType.MEDIA;
const audioRenderer = audio.createAudioRenderer(volType);
var audioStreamInfo = {
samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_44100,
channels: audio.AudioChannel.CHANNEL_1,
sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE,
encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW
}
var audioRendererInfo = {
content: audio.ContentType.CONTENT_TYPE_SPEECH,
usage: audio.StreamUsage.STREAM_USAGE_VOICE_COMMUNICATION,
rendererFlags: 1
}
var audioRendererOptions = {
streamInfo: audioStreamInfo,
rendererInfo: audioRendererInfo
}
let audioRenderer = await audio.createAudioRenderer(audioRendererOptions);
```
......@@ -112,7 +120,15 @@ Enumerates audio stream types.
</th>
</tr>
</thead>
<tbody><tr id="row1389215612395"><td class="cellrowborder" valign="top" width="30.380000000000003%" headers="mcps1.1.4.1.1 "><p id="p52851329122117"><a name="p52851329122117"></a><a name="p52851329122117"></a>RINGTONE</p>
<tbody>
<tr id="row1389215612395"><td class="cellrowborder" valign="top" width="30.380000000000003%" headers="mcps1.1.4.1.1 "><p id="p52851329122117"><a name="p52851329122117"></a><a name="p52851329122117"></a>VOICE_CALL</p>
</td>
<td class="cellrowborder" valign="top" width="9.950000000000001%" headers="mcps1.1.4.1.2 "><p id="p2282152962115"><a name="p2282152962115"></a><a name="p2282152962115"></a>0</p>
</td>
<td class="cellrowborder" valign="top" width="59.67%" headers="mcps1.1.4.1.3 "><p id="p328012293211"><a name="p328012293211"></a><a name="p328012293211"></a>Audio stream for voice calls.</p>
</td>
</tr>
<tr id="row1389215612395"><td class="cellrowborder" valign="top" width="30.380000000000003%" headers="mcps1.1.4.1.1 "><p id="p52851329122117"><a name="p52851329122117"></a><a name="p52851329122117"></a>RINGTONE</p>
</td>
<td class="cellrowborder" valign="top" width="9.950000000000001%" headers="mcps1.1.4.1.2 "><p id="p2282152962115"><a name="p2282152962115"></a><a name="p2282152962115"></a>2</p>
</td>
......@@ -126,6 +142,13 @@ Enumerates audio stream types.
<td class="cellrowborder" valign="top" width="59.67%" headers="mcps1.1.4.1.3 "><p id="p182452299212"><a name="p182452299212"></a><a name="p182452299212"></a>Audio stream for media purpose.</p>
</td>
</tr>
<tr id="row6892145616397"><td class="cellrowborder" valign="top" width="30.380000000000003%" headers="mcps1.1.4.1.1 "><p id="p027662952110"><a name="p027662952110"></a><a name="p027662952110"></a>VOICE_ASSISTANT</p>
</td>
<td class="cellrowborder" valign="top" width="9.950000000000001%" headers="mcps1.1.4.1.2 "><p id="p17273229192113"><a name="p17273229192113"></a><a name="p17273229192113"></a>9</p>
</td>
<td class="cellrowborder" valign="top" width="59.67%" headers="mcps1.1.4.1.3 "><p id="p182452299212"><a name="p182452299212"></a><a name="p182452299212"></a>Audio stream for voice assistant.</p>
</td>
</tr>
</tbody>
</table>
......@@ -221,35 +244,35 @@ Enumerates audio device types.
</tr>
<tr id="row16728520192714"><td class="cellrowborder" valign="top" width="30.380000000000003%" headers="mcps1.1.4.1.1 "><p id="p4753161132815"><a name="p4753161132815"></a><a name="p4753161132815"></a>SPEAKER</p>
</td>
<td class="cellrowborder" valign="top" width="9.950000000000001%" headers="mcps1.1.4.1.2 "><p id="p3728920162713"><a name="p3728920162713"></a><a name="p3728920162713"></a>1</p>
<td class="cellrowborder" valign="top" width="9.950000000000001%" headers="mcps1.1.4.1.2 "><p id="p3728920162713"><a name="p3728920162713"></a><a name="p3728920162713"></a>2</p>
</td>
<td class="cellrowborder" valign="top" width="59.67%" headers="mcps1.1.4.1.3 "><p id="p17728112062715"><a name="p17728112062715"></a><a name="p17728112062715"></a>Speaker.</p>
</td>
</tr>
<tr id="row1758117472814"><td class="cellrowborder" valign="top" width="30.380000000000003%" headers="mcps1.1.4.1.1 "><p id="p74802011112815"><a name="p74802011112815"></a><a name="p74802011112815"></a>WIRED_HEADSET</p>
</td>
<td class="cellrowborder" valign="top" width="9.950000000000001%" headers="mcps1.1.4.1.2 "><p id="p35820462819"><a name="p35820462819"></a><a name="p35820462819"></a>2</p>
<td class="cellrowborder" valign="top" width="9.950000000000001%" headers="mcps1.1.4.1.2 "><p id="p35820462819"><a name="p35820462819"></a><a name="p35820462819"></a>3</p>
</td>
<td class="cellrowborder" valign="top" width="59.67%" headers="mcps1.1.4.1.3 "><p id="p155821548285"><a name="p155821548285"></a><a name="p155821548285"></a>Wired headset.</p>
</td>
</tr>
<tr id="row1335108192818"><td class="cellrowborder" valign="top" width="30.380000000000003%" headers="mcps1.1.4.1.1 "><p id="p107521514142811"><a name="p107521514142811"></a><a name="p107521514142811"></a>BLUETOOTH_SCO</p>
</td>
<td class="cellrowborder" valign="top" width="9.950000000000001%" headers="mcps1.1.4.1.2 "><p id="p18335108112819"><a name="p18335108112819"></a><a name="p18335108112819"></a>3</p>
<td class="cellrowborder" valign="top" width="9.950000000000001%" headers="mcps1.1.4.1.2 "><p id="p18335108112819"><a name="p18335108112819"></a><a name="p18335108112819"></a>7</p>
</td>
<td class="cellrowborder" valign="top" width="59.67%" headers="mcps1.1.4.1.3 "><p id="p193351683289"><a name="p193351683289"></a><a name="p193351683289"></a>Bluetooth device using the synchronous connection oriented (SCO) link.</p>
</td>
</tr>
<tr id="row1649111617286"><td class="cellrowborder" valign="top" width="30.380000000000003%" headers="mcps1.1.4.1.1 "><p id="p10784017102818"><a name="p10784017102818"></a><a name="p10784017102818"></a>BLUETOOTH_A2DP</p>
</td>
<td class="cellrowborder" valign="top" width="9.950000000000001%" headers="mcps1.1.4.1.2 "><p id="p849110610286"><a name="p849110610286"></a><a name="p849110610286"></a>4</p>
<td class="cellrowborder" valign="top" width="9.950000000000001%" headers="mcps1.1.4.1.2 "><p id="p849110610286"><a name="p849110610286"></a><a name="p849110610286"></a>8</p>
</td>
<td class="cellrowborder" valign="top" width="59.67%" headers="mcps1.1.4.1.3 "><p id="p549117620284"><a name="p549117620284"></a><a name="p549117620284"></a>Bluetooth device using the advanced audio distribution profile (A2DP).</p>
</td>
</tr>
<tr id="row81701220112812"><td class="cellrowborder" valign="top" width="30.380000000000003%" headers="mcps1.1.4.1.1 "><p id="p168642028152812"><a name="p168642028152812"></a><a name="p168642028152812"></a>MIC</p>
</td>
<td class="cellrowborder" valign="top" width="9.950000000000001%" headers="mcps1.1.4.1.2 "><p id="p517062012812"><a name="p517062012812"></a><a name="p517062012812"></a>5</p>
<td class="cellrowborder" valign="top" width="9.950000000000001%" headers="mcps1.1.4.1.2 "><p id="p517062012812"><a name="p517062012812"></a><a name="p517062012812"></a>15</p>
</td>
<td class="cellrowborder" valign="top" width="59.67%" headers="mcps1.1.4.1.3 "><p id="p5170520112813"><a name="p5170520112813"></a><a name="p5170520112813"></a>Microphone.</p>
</td>
......@@ -302,19 +325,24 @@ Enumerates the audio sample formats.
| Name | Default Value | Description |
| :------------ | :------------ | :------------------------------------ |
| INVALID_WIDTH | -1 | Invalid format. |
| SAMPLE_U8 | 1 | Unsigned 8 bit integer. |
| SAMPLE_S16LE | 0 | Signed 16 bit integer, little endian. |
| SAMPLE_S24LE | 1 | Signed 24 bit integer, little endian. |
| SAMPLE_S32LE | 2 | Signed 32 bit integer, little endian. |
| SAMPLE_U8 | 0 | Unsigned 8 bit integer. |
| SAMPLE_S16LE | 1 | Signed 16 bit integer, little endian. |
| SAMPLE_S24LE | 2 | Signed 24 bit integer, little endian. |
| SAMPLE_S32LE | 3 | Signed 32 bit integer, little endian. |
## AudioChannel<sup>8+</sup><a name="audiochannel"></a>
Enumerates the audio channels.
| Name | Default Value | Description |
| :----- | :------------ | :--------------- |
| MONO | 1 | Channel count 1. |
| STEREO | 2 | Channel count 2. |
| CHANNEL_1 | 0x1 << 0 | Channel count 1. |
| CHANNEL_2 | 0x1 << 1 | Channel count 2. |
| CHANNEL_3 | 0x1 << 2 | Channel count 3. |
| CHANNEL_4 | 0x1 << 3 | Channel count 4. |
| CHANNEL_5 | 0x1 << 4 | Channel count 5. |
| CHANNEL_6 | 0x1 << 5 | Channel count 6. |
| CHANNEL_7 | 0x1 << 6 | Channel count 7. |
| CHANNEL_8 | 0x1 << 7 | Channel count 8. |
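
These channel constants are the values expected by the channels field of AudioStreamInfo. For example, a two-channel (stereo) stream configuration might look as follows; the remaining fields are reused from the createAudioRenderer example above and are purely illustrative.
```
// Illustrative stereo stream configuration using CHANNEL_2.
var stereoStreamInfo = {
    samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_44100,
    channels: audio.AudioChannel.CHANNEL_2,
    sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE,
    encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW
}
```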
## AudioSamplingRate<sup>8+</sup><a name="audiosamplingrate"></a>
......@@ -339,9 +367,9 @@ Enumerates the audio sampling rates.
Enumerates the audio encoding types.
| Name | Default Value | Description |
| :--------------- | :------------ | :---------- |
| ENCODING_PCM | 0 | PCM. |
| ENCODING_INVALID | 1 | Invalid. |
| :-------------------- | :------------ | :---------------- |
| ENCODING_TYPE_INVALID | -1 | Invalid. |
| ENCODING_TYPE_RAW | 0 | PCM encoding. |
## ContentType<sup>8+</sup><a name="contentype"></a>
......@@ -431,23 +459,17 @@ Enumerates the ringtone types.
| RINGTONE_TYPE_DEFAULT | 0 | Default type. |
| RINGTONE_TYPE_MULTISIM | 1 | Multi-SIM type. |
## AudioParameters<sup>8+</sup><a name="audioparameters"></a>
Describes audio parameters of playback files.
## AudioStreamInfo<sup>8+</sup><a name="audiorstreaminfo"></a>
Describes audio stream information.
**Parameters**
| Name | Type | Mandatory | Description |
| :----------- | :---------------- | :-------- | :-------------------------------------------- |
| format | AudioSampleFormat | Yes | Sample format of the audio file to be played. |
| channels | AudioChannel | Yes | Channel count of the audio file to be played. |
| samplingRate | AudioSamplingRate | Yes | Sample rate of the audio file to be played. |
| encoding | AudioEncodingType | Yes | Encoding type of the audio file to be played. |
| contentType | ContentType | Yes | Content type. |
| usage | StreamUsage | Yes | Stream usage. |
| deviceRole | DeviceRole | Yes | Device role. |
| deviceType | DeviceType | Yes | Device type. |
| :------------ | :-------------------- | :-------- | :-------------------- |
| samplingRate | AudioSamplingRate | Yes | Sampling rate. |
| channels | AudioChannel | Yes | Audio channels. |
| sampleFormat | AudioSampleFormat | Yes | Audio sample format. |
| encodingType | AudioEncodingType | Yes | Audio encoding type. |
## AudioRendererInfo<sup>8+</sup><a name="audiorendererinfo"></a>
Describes audio renderer information.
......@@ -2461,19 +2483,17 @@ Defines the current render state.
var state = audioRenderer.state;
```
## audioRenderer.getRendererInfo
## audioRenderer.setParams
setParams(params: AudioParameters, callback: AsyncCallback<void\>): void<sup>8+</sup><a name="setparams-asynccallback"></a>
getRendererInfo(callback: AsyncCallback<AudioRendererInfo\>): void<sup>8+</sup><a name="getrendererinfo-asynccallback"></a>
Sets audio parameters for rendering. This method uses an asynchronous callback to return the result.
Gets the renderer information provided while creating a renderer instance. This method uses an asynchronous callback to return the result.
**Parameters**
| Name | Type | Mandatory | Description |
| :------- | :------------------- | :-------- | :-------------------------------------- |
| params | AudioParameters | Yes | Audio parameters of the file to be set. |
| callback | AsyncCallback<void\> | Yes | Callback used to return the result. |
| :------- | :--------------------------------- | :-------- | :------------------------------------------------ |
| callback | AsyncCallback<AudioRendererInfo\> | Yes | Callback used to return the renderer information. |
| | | | |
**Return value**
......@@ -2483,66 +2503,52 @@ None
**Example**
```
var audioParams = {
format: audio.AudioSampleFormat.SAMPLE_S16LE,
channels: audio.AudioChannel.STEREO,
samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_16000,
encoding: audio.AudioEncodingType.ENCODING_PCM,
};
audioRenderer.setParams(audioParams, (err)=>{
if (err) {
console.error('Failed to set params. ${err.message}');
return;
}
console.log('Callback invoked to indicate a successful params setting.');
audioRenderer.getRendererInfo((err, rendererInfo)=>{
console.log('Renderer GetRendererInfo:');
console.log('Renderer content:' + rendererInfo.content);
console.log('Renderer usage:' + rendererInfo.usage);
console.log('Renderer flags:' + rendererInfo.rendererFlags);
})
```
## audioRenderer.setParams
## audioRenderer.getRendererInfo
setParams(params: AudioParameters): Promise<void\><sup>8+</sup><a name="setparams-promise"></a>
getRendererInfo(): Promise<AudioRendererInfo\><sup>8+</sup><a name="getrendererinfo-promise"></a>
Sets audio parameters for rendering. This method uses a promise to return the result.
Gets the renderer information provided while creating a renderer instance. This method uses a promise to return the result.
**Parameters**
| Name | Type | Mandatory | Description |
| :----- | :-------------- | :-------- | :-------------------------------------- |
| params | AudioParameters | Yes | Audio parameters of the file to be set. |
None
**Return value**
| Type | Description |
| :------------- | :--------------------------------- |
| Promise<void\> | Promise used to return the result. |
| :---------------------------- | :----------------------------------------------- |
| Promise<AudioRendererInfo\> | Promise used to return the renderer information. |
**Example**
```
var audioParams = {
format: audio.AudioSampleFormat.SAMPLE_S16LE,
channels: audio.AudioChannel.STEREO,
samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_16000,
encoding: audio.AudioEncodingType.ENCODING_PCM,
};
await audioRenderer.setParams(audioParams);
let rendererInfo = await audioRenderer.getRendererInfo();
console.log('Renderer GetRendererInfo:');
console.log('Renderer content:' + rendererInfo.content);
console.log('Renderer usage:' + rendererInfo.usage);
console.log('Renderer flags:' + rendererInfo.rendererFlags);
```
## audioRenderer.getStreamInfo
## audioRenderer.getParams
getParams(callback: AsyncCallback<AudioParameters\>): void<sup>8+</sup><a name="getparams-asynccallback"></a>
getStreamInfo(callback: AsyncCallback<AudioStreamInfo\>): void<sup>8+</sup><a name="getstreaminfo-asynccallback"></a>
Gets audio parameters of the renderer. This method uses an asynchronous callback to return the result.
Gets the renderer stream information. This method uses an asynchronous callback to return the result.
**Parameters**
| Name | Type | Mandatory | Description |
| :------- | :------------------------------ | :-------- | :-------------------------------------------- |
| callback | AsyncCallback<AudioParameters\> | Yes | Callback used to return the audio parameters. |
| :------- | :--------------------------------- | :-------- | :---------------------------------------------- |
| callback | AsyncCallback<AudioStreamInfo\> | Yes | Callback used to return the stream information. |
| | | | |
**Return value**
......@@ -2552,21 +2558,20 @@ None
**Example**
```
audioRenderer.getParams((err, audioParams)=>{
console.log('Renderer GetParams:');
console.log('Renderer format:' + audioParams.format);
console.log('Renderer samplingRate:' + audioParams.samplingRate);
console.log('Renderer channels:' + audioParams.channels);
console.log('Renderer encoding:' + audioParams.encoding);
audioRenderer.getStreamInfo((err, streamInfo)=>{
console.log('Renderer GetStreamInfo:');
console.log('Renderer sampling rate:' + streamInfo.samplingRate);
console.log('Renderer channel:' + streamInfo.channels);
console.log('Renderer format:' + streamInfo.sampleFormat);
console.log('Renderer encoding type:' + streamInfo.encodingType);
})
```
## audioRenderer.getStreamInfo
## audioRenderer.getParams
getParams(): Promise<AudioParameters\><sup>8+</sup><a name="getparams-promise"></a>
getStreamInfo(): Promise<AudioStreamInfo\><sup>8+</sup><a name="getstreaminfo-promise"></a>
Gets audio parameters of the renderer. This method uses a promise to return the result.
Gets the renderer stream information. This method uses a promise to return the result.
**Parameters**
......@@ -2575,32 +2580,31 @@ None
**Return value**
| Type | Description |
| :------------------------ | :------------------------------------------- |
| Promise<AudioParameters\> | Promise used to return the audio parameters. |
| :---------------------------- | :----------------------------------------------- |
| Promise<AudioStreamInfo\> | Promise used to return the stream information. |
**Example**
```
let audioParams = await audioRenderer.getParams();
console.log('Renderer GetParams:');
console.log('Renderer format:' + audioParams.format);
console.log('Renderer samplingRate:' + audioParams.samplingRate);
console.log('Renderer channels:' + audioParams.channels);
console.log('Renderer encoding:' + audioParams.encoding);
let streamInfo = await audioRenderer.getStreamInfo();
console.log('Renderer GetStreamInfo:');
console.log('Renderer sampling rate:' + streamInfo.samplingRate);
console.log('Renderer channel:' + streamInfo.channels);
console.log('Renderer format:' + streamInfo.sampleFormat);
console.log('Renderer encoding type:' + streamInfo.encodingType);
```
## audioRenderer.start
start(callback: AsyncCallback<boolean\>): void<sup>8+</sup><a name="start-asynccallback"></a>
start(callback: AsyncCallback<void\>): void<sup>8+</sup><a name="start-asynccallback"></a>
Starts the renderer. This method uses an asynchronous callback to return the result.
**Parameters**
| Name | Type | Mandatory | Description |
| :------- | :---------------------- | :-------- | :----------------------------------------------------------------------------- |
| callback | AsyncCallback<boolean\> | Yes | Returns true if the renderer is started successfully; returns false otherwise. |
| :------- | :---------------------- | :-------- | :-------------------------------------- |
| callback | AsyncCallback<void\> | Yes | Callback used to return the result. |
| | | | |
**Return value**
......@@ -2610,11 +2614,11 @@ None
**Example**
```
audioRenderer.start((err, started)=>{
if (started) {
console.log('Renderer started.');
audioRenderer.start((err)=>{
if (err) {
console.error('Renderer start failed.');
} else {
console.error('Renderer start rejected.');
console.info('Renderer start success.');
}
})
```
......@@ -2622,7 +2626,7 @@ audioRenderer.start((err, started)=>{
## audioRenderer.start
start(): Promise<boolean\><a name="start-promise"><sup>8+</sup></a>
start(): Promise<void\><a name="start-promise"><sup>8+</sup></a>
Starts the renderer. This method uses a promise to return the result.
......@@ -2633,32 +2637,27 @@ None
**Return value**
| Type | Description |
| :---------------- | :----------------------------------------------------------------------------- |
| Promise<boolean\> | Returns true if the renderer is started successfully; returns false otherwise. |
| :------------- | :--------------------------------- |
| Promise<void\> | Promise used to return the result. |
**Example**
```
var started = await audioRenderer.start();
if (started) {
console.log('Renderer started');
} else {
console.error('Renderer start rejected');
}
await audioRenderer.start();
```
## audioRenderer.pause
pause(callback: AsyncCallback<boolean\>): void<sup>8+</sup><a name="pause-asynccallback"></a>
pause(callback: AsyncCallback<void\>): void<sup>8+</sup><a name="pause-asynccallback"></a>
Pauses rendering. This method uses an asynchronous callback to return the result.
**Parameters**
| Name | Type | Mandatory | Description |
| :------- | :---------------------- | :-------- | :---------------------------------------------------------------------------- |
| callback | AsyncCallback<boolean\> | Yes | Returns true if the renderer is paused successfully; returns false otherwise. |
| :------- | :---------------------- | :-------- | :------------------------------------ |
| callback | AsyncCallback<void\> | Yes | Callback used to return the result. |
| | | | |
**Return value**
......@@ -2668,11 +2667,11 @@ None
**Example**
```
audioRenderer.pause((err, paused)=>{
if (paused) {
console.log('Renderer paused.');
} else {
audioRenderer.pause((err)=>{
if (err) {
console.error('Renderer pause failed');
} else {
console.log('Renderer paused.');
}
})
```
......@@ -2681,7 +2680,7 @@ audioRenderer.pause((err, paused)=>{
## audioRenderer.pause
pause(): Promise<boolean\><sup>8+</sup><a name="pause-promise"></a>
pause(): Promise<void\><sup>8+</sup><a name="pause-promise"></a>
Pauses rendering. This method uses a promise to return the result.
......@@ -2692,33 +2691,28 @@ None
**Return value**
| Type | Description |
| :---------------- | :---------------------------------------------------------------------------- |
| Promise<boolean\> | Returns true if the renderer is paused successfully; returns false otherwise. |
| :------------- | :--------------------------------- |
| Promise<void\> | Promise used to return the result. |
**Example**
```
var paused = await audioRenderer.pause();
if (paused) {
console.log('Renderer paused');
} else {
console.error('Renderer pause failed');
}
await audioRenderer.pause();
```
## audioRenderer.drain
drain(callback: AsyncCallback<boolean\>): void<sup>8+</sup><a name="drain-asynccallback"></a>
drain(callback: AsyncCallback<void\>): void<sup>8+</sup><a name="drain-asynccallback"></a>
Drains the playback buffer. This method uses an asynchronous callback to return the result.
**Parameters**
| Name | Type | Mandatory | Description |
| :------- | :---------------------- | :-------- | :--------------------------------------------------------------------------- |
| callback | AsyncCallback<boolean\> | Yes | Returns true if the buffer is drained successfully; returns false otherwise. |
| :------- | :---------------------- | :-------- | :---------------------------------------|
| callback | AsyncCallback<void\> | Yes | Callback used to return the result. |
| | | | |
**Return value**
......@@ -2728,11 +2722,11 @@ None
**Example**
```
audioRenderer.drain((err, drained)=>{
if (drained) {
console.log('Renderer drained.');
} else {
audioRenderer.drain((err)=>{
if (err) {
console.error('Renderer drain failed');
} else {
console.log('Renderer drained.');
}
})
```
......@@ -2740,7 +2734,7 @@ audioRenderer.drain((err, drained)=>{
## audioRenderer.drain
drain(): Promise<boolean\><sup>8+</sup><a name="drain-promise"></a>
drain(): Promise<void\><sup>8+</sup><a name="drain-promise"></a>
Drains the playback buffer. This method uses a promise to return the result.
......@@ -2751,32 +2745,27 @@ None
**Return value**
| Type | Description |
| :---------------- | :--------------------------------------------------------------------------- |
| Promise<boolean\> | Returns true if the buffer is drained successfully; returns false otherwise. |
| :------------- | :--------------------------------- |
| Promise<void\> | Promise used to return the result. |
**Example**
```
var drained = await audioRenderer.drain();
if (drained) {
console.log('Renderer drained');
} else {
console.error('Renderer drain failed');
}
await audioRenderer.drain();
```
## audioRenderer.stop
stop(callback: AsyncCallback<boolean\>): void<sup>8+</sup><a name="stop-asynccallback"></a>
stop(callback: AsyncCallback<void\>): void<sup>8+</sup><a name="stop-asynccallback"></a>
Stops rendering. This method uses an asynchronous callback to return the result.
**Parameters**
| Name | Type | Mandatory | Description |
| :------- | :---------------------- | :-------- | :------------------------------------------------------------------------------ |
| callback | AsyncCallback<boolean\> | Yes | Returns true if the rendering is stopped successfully; returns false otherwise. |
| :------- | :---------------------- | :-------- | :------------------------------------- |
| callback | AsyncCallback<void\> | Yes | Callback used to return the result. |
| | | | |
**Return value**
......@@ -2786,11 +2775,11 @@ None
**Example**
```
audioRenderer.stop((err, stopped)=>{
if (stopped) {
console.log('Renderer stopped.');
} else {
audioRenderer.stop((err)=>{
if (err) {
console.error('Renderer stop failed');
} else {
console.log('Renderer stopped.');
}
})
```
......@@ -2798,7 +2787,7 @@ audioRenderer.stop((err, stopped)=>{
## audioRenderer.stop
stop(): Promise<boolean\><sup>8+</sup><a name="stop-promise"></a>
stop(): Promise<void\><sup>8+</sup><a name="stop-promise"></a>
Stops rendering. This method uses a promise to return the result.
......@@ -2809,32 +2798,27 @@ None
**Return value**
| Type | Description |
| :---------------- | :------------------------------------------------------------------------------ |
| Promise<boolean\> | Returns true if the rendering is stopped successfully; returns false otherwise. |
| :------------- | :--------------------------------- |
| Promise<void\> | Promise used to return the result. |
**Example**
```
var stopped = await audioRenderer.stop();
if (stopped) {
console.log('Renderer stopped');
} else {
console.error('Renderer stop failed');
}
await audioRenderer.stop();
```
## audioRenderer.release
release(callback: AsyncCallback<boolean\>): void<sup>8+</sup><a name="release-asynccallback"></a>
release(callback: AsyncCallback<void\>): void<sup>8+</sup><a name="release-asynccallback"></a>
Releases the renderer. This method uses an asynchronous callback to return the result.
**Parameters**
| Name | Type | Mandatory | Description |
| :------- | :---------------------- | :-------- | :------------------------------------------------------------------------------ |
| callback | AsyncCallback<boolean\> | Yes | Returns true if the renderer is released successfully; returns false otherwise. |
| :------- | :---------------------- | :-------- | :------------------------------------- |
| callback | AsyncCallback<void\> | Yes | Callback used to return the result. |
| | | | |
**Return value**
......@@ -2844,11 +2828,11 @@ None
**Example**
```
audioRenderer.release((err, released)=>{
if (released) {
console.log('Renderer released.');
} else {
audioRenderer.release((err)=>{
if (err) {
console.error('Renderer release failed');
} else {
console.log('Renderer released.');
}
})
```
......@@ -2857,7 +2841,7 @@ audioRenderer.release((err, released)=>{
## audioRenderer.release
release(): Promise<boolean\><sup>8+</sup><a name="release-promise"></a>
release(): Promise<void\><sup>8+</sup><a name="release-promise"></a>
Releases the renderer. This method uses a promise to return the result.
......@@ -2868,18 +2852,13 @@ None
**Return value**
| Type | Description |
| :---------------- | :------------------------------------------------------------------------------ |
| Promise<boolean\> | Returns true if the renderer is released successfully; returns false otherwise. |
| :------------- | :--------------------------------- |
| Promise<void\> | Promise used to return the result. |
**Example**
```
var released = await audioRenderer.release();
if (released) {
console.log('Renderer released');
} else {
console.error('Renderer release failed');
}
await audioRenderer.release();
```
......