<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p1596200459"><a name="p1596200459"></a><a name="p1596200459"></a>Audio streams for media purposes</p>
</td>
...
...
@@ -154,6 +160,11 @@ You use audio management APIs to set and obtain volume, and get information abou
<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p9333131144712"><a name="p9333131144712"></a><a name="p9333131144712"></a>Audio streams for ring tones</p>
<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p538905016496"><a name="p538905016496"></a><a name="p538905016496"></a>Bluetooth device using the synchronous connection oriented link (SCO)</p>
<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p538905016496"><a name="p538905016496"></a><a name="p538905016496"></a>Bluetooth device using the synchronous connection oriented (SCO) link</p>
<td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p193891550134912"><a name="p193891550134912"></a><a name="p193891550134912"></a>Bluetooth device using advanced audio distribution profile (A2DP)</p>
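Taken together, these rows describe the stream types and device types that the audio management APIs work with. The following is a minimal sketch, not part of this patch, of how an application might read the media-stream volume and look for Bluetooth SCO/A2DP output devices; it assumes the promise-style **getVolume()** and **getDevices()** signatures and the enum names (`AudioVolumeType.MEDIA`, `DeviceFlag.OUTPUT_DEVICES_FLAG`, `DeviceType.BLUETOOTH_SCO`, `DeviceType.BLUETOOTH_A2DP`) listed in js-apis-audio.md.
```
import audio from '@ohos.multimedia.audio';

async function queryMediaVolumeAndDevices() {
    var audioManager = audio.getAudioManager();

    // Volume of the media stream (AudioVolumeType.MEDIA).
    var volume = await audioManager.getVolume(audio.AudioVolumeType.MEDIA);
    console.info('Media stream volume: ' + volume);

    // Output devices currently available; check for Bluetooth SCO/A2DP entries.
    var devices = await audioManager.getDevices(audio.DeviceFlag.OUTPUT_DEVICES_FLAG);
    for (var i = 0; i < devices.length; i++) {
        if (devices[i].deviceType == audio.DeviceType.BLUETOOTH_SCO ||
            devices[i].deviceType == audio.DeviceType.BLUETOOTH_A2DP) {
            console.info('Bluetooth output device found');
        }
    }
}
```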
4. Call the **start()** function on the AudioRenderer instance to start or resume the playback task.\
The renderer state will be STATE_RUNNING once the start is complete. You can then begin writing buffers, as sketched after the start example below.
```
async function startRenderer() {
var state = audioRenderer.state;
...
...
@@ -148,13 +144,14 @@ Here's an example of how to use AudioRenderer to play a raw audio file.
console.info('Renderer is not in a correct state to start');
return;
}
var started = await audioRenderer.start();
if (started) {
isPlay = true;
await audioRenderer.start();
state = audioRenderer.state;
if (state == audio.AudioState.STATE_RUNNING) {
console.info('Renderer started');
} else {
console.error('Renderer start failed');
return;
}
}
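
// Illustrative sketch, not part of the original patch: once start() succeeds,
// obtain the minimum buffer size and write audio data to the renderer in chunks.
// A zero-filled (silent) buffer stands in for PCM data read from the raw audio file;
// write() is assumed to follow the promise-style signature in js-apis-audio.md.
async function writeOneBuffer() {
    var bufferSize = await audioRenderer.getBufferSize();
    var silence = new ArrayBuffer(bufferSize);
    var bytesWritten = await audioRenderer.write(silence);
    if (bytesWritten <= 0) {
        console.error('Renderer write failed');
    }
}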
...
...
@@ -212,8 +209,11 @@ Here's an example of how to use AudioRenderer to play a raw audio file.
console.info('Renderer is not running');
return;
}
var paused = await audioRenderer.pause();
if (paused) {
await audioRenderer.pause();
state = audioRenderer.state;
if (state == audio.AudioState.STATE_PAUSED) {
console.info('Renderer paused');
} else {
console.error('Renderer pause failed');
...
...
@@ -226,8 +226,11 @@ Here's an example of how to use AudioRenderer to play a raw audio file.
console.info('Renderer is not running or paused');
return;
}
var stopped = await audioRenderer.stop();
if (stopped) {
await audioRenderer.stop();
state = audioRenderer.state;
if (state == audio.AudioState.STATE_STOPPED) {
console.info('Renderer stopped');
} else {
console.error('Renderer stop failed');
...
...
@@ -243,8 +246,11 @@ Here's an example of how to use AudioRenderer to play a raw audio file.
console.info('Resource already released');
return;
}
var released = await audioRenderer.release();
if (released) {
await audioRenderer.release();
state = audioRenderer.state;
if (state == audio.AudioState.STATE_RELEASED) {
console.info('Renderer released');
} else {
console.info('Renderer release failed');
...
...
@@ -257,7 +263,6 @@ Here's an example of how to use AudioRenderer to play a raw audio file.
You should also keep in mind that an AudioRenderer is state-based.
That is, the AudioRenderer has an internal state that you must always check before calling playback control APIs, because some operations are valid only while the renderer is in a given state.\
The system may throw an error or exception, or produce other undefined behavior, if you perform an operation in the wrong state.\
Before each such operation, the example code checks the renderer state, as in the sketch below.
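For instance, a state guard around **pause()** could look like the following rough sketch. It is illustrative only and not part of this patch: the helper name `pauseIfRunning` and the `renderer` parameter are assumptions, and `audio` is the imported `@ohos.multimedia.audio` module.
```
import audio from '@ohos.multimedia.audio';

// Pause the renderer only when it is actually running, then verify the new state.
async function pauseIfRunning(renderer) {
    if (renderer.state != audio.AudioState.STATE_RUNNING) {
        console.info('Renderer is not running; nothing to pause');
        return;
    }
    await renderer.pause();
    if (renderer.state == audio.AudioState.STATE_PAUSED) {
        console.info('Renderer paused');
    } else {
        console.error('Renderer pause failed');
    }
}
```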
## **Asynchronous Operations:**
Most of the AudioRenderer calls are asynchronous. As a result, the UI thread will not be blocked.\
...
...
@@ -267,4 +272,3 @@ provides reference for both callback and promise.
## **Other APIs:**
See [**js-apis-audio.md**](https://gitee.com/openharmony/docs/blob/master/en/application-dev/reference/apis/js-apis-audio.md) for more useful APIs like getAudioTime, drain, and getBufferSize.
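As a rough, illustrative sketch of those calls (not taken from this document, and assuming the promise-style signatures in js-apis-audio.md), querying buffer size and playback position on an existing renderer could look like:
```
// Illustrative only: 'renderer' is an already created AudioRenderer instance.
async function logRendererInfo(renderer) {
    // Minimum buffer length to use when writing audio data.
    var bufferSize = await renderer.getBufferSize();
    console.info('Buffer size: ' + bufferSize);

    // Current playback timestamp.
    var audioTime = await renderer.getAudioTime();
    console.info('Audio time: ' + audioTime);

    // Wait until the queued buffers have been played out.
    await renderer.drain();
    console.info('Renderer drained');
}
```
Each of these also has a callback flavor; the reference linked above covers both styles.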