# Audio Rendering Development

## When to Use

**AudioRenderer** provides APIs for rendering audio files and controlling playback. It also supports audio interruption. You can use the APIs provided by **AudioRenderer** to play audio files through output devices and manage playback tasks.

### Audio Interruption

When an audio stream with a higher priority needs to be played, the audio renderer interrupts the stream with a lower priority. For example, if a call comes in when the user is listening to music, the music playback, which is the lower priority stream, is paused. For details, see [How to Develop](#how-to-develop).

### State Check

During application development, you are advised to use **on('stateChange')** to subscribe to state changes of the **AudioRenderer** instance, because some operations can be performed only when the audio renderer is in a given state. If the application performs an operation while the audio renderer is in the wrong state, the system may throw an exception or cause other undefined behavior.

**Figure 1** Audio renderer state

![](figures/audio-renderer-state.png)
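
For example, the following minimal sketch subscribes to state changes and logs them; it assumes an **AudioRenderer** instance named `audioRenderer`, created as described in [How to Develop](#how-to-develop):

```js
audioRenderer.on('stateChange', (state) => {
    console.info('Renderer state changed to: ' + state);
    // For example, write() should be issued only while the renderer is running.
    if (state == audio.AudioState.STATE_RUNNING) {
        console.info('Renderer is running; writing is allowed');
    }
});
```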

### Asynchronous Operations

To ensure that the UI thread is not blocked, most **AudioRenderer** calls are asynchronous. Each API is available in both callback and promise forms; the examples in this document use the promise form. For more information, see [AudioRenderer in Audio Management](../reference/apis/js-apis-audio.md#audiorenderer8).
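
For example, **createAudioRenderer()** can be invoked in either form. The following sketch assumes `audioRendererOptions` is defined as in step 1 of [How to Develop](#how-to-develop):

```js
// Promise form: await the result.
let audioRenderer = await audio.createAudioRenderer(audioRendererOptions);

// Equivalent callback form: the instance is delivered asynchronously.
audio.createAudioRenderer(audioRendererOptions, (err, renderer) => {
    if (err) {
        console.error('createAudioRenderer failed: ' + err.message);
        return;
    }
    audioRenderer = renderer;
});
```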



## How to Develop

1. Use **createAudioRenderer()** to create an **AudioRenderer** instance.
   Set the parameters of the audio renderer in **audioRendererOptions**. This instance is used to render audio, control and obtain the rendering status, and register a callback for notification.

   ```js
   import audio from '@ohos.multimedia.audio'; // Import the audio module.

   var audioStreamInfo = {
       samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_44100,
       channels: audio.AudioChannel.CHANNEL_1,
       sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE,
       encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW
   }

   var audioRendererInfo = {
       content: audio.ContentType.CONTENT_TYPE_SPEECH,
       usage: audio.StreamUsage.STREAM_USAGE_VOICE_COMMUNICATION,
       rendererFlags: 1
   }

   var audioRendererOptions = {
       streamInfo: audioStreamInfo,
       rendererInfo: audioRendererInfo
   }

   let audioRenderer = await audio.createAudioRenderer(audioRendererOptions);
   ```

2. Use **on('interrupt')** to subscribe to audio interruption events.

   Stream-A is interrupted when Stream-B with a higher or equal priority requests to become active and use the output device.

   In some cases, the audio renderer performs forcible operations such as pausing and ducking, and notifies the application through **InterruptEvent**. In other cases, the application can choose to act on the **InterruptEvent** or ignore it.

   In the case of audio interruption, the application may encounter write failures. To avoid such failures, interruption-unaware applications can check **audioRenderer.state** before writing audio data, as the **writeBuffer()** example in step 4 does. The application can obtain more details by subscribing to the audio interruption events. For details, see [InterruptEvent](../reference/apis/js-apis-audio.md#interruptevent9).
   
   ```js
   audioRenderer.on('interrupt', (interruptEvent) => {
       console.info('InterruptEvent Received');
       console.info('InterruptType: ' + interruptEvent.eventType);
       console.info('InterruptForceType: ' + interruptEvent.forceType);
       console.info('InterruptHint: ' + interruptEvent.hintType);

       if (interruptEvent.forceType == audio.InterruptForceType.INTERRUPT_FORCE) {
           switch (interruptEvent.hintType) {
               // Force Pause: the framework has already paused the stream.
               // Halt the write calls to avoid data loss.
               case audio.InterruptHint.INTERRUPT_HINT_PAUSE:
                   isPlay = false;
                   break;
               // Force Stop: the framework has already stopped the stream.
               // Halt the write calls to avoid data loss.
               case audio.InterruptHint.INTERRUPT_HINT_STOP:
                   isPlay = false;
                   break;
               // Force Duck: the framework has reduced the volume;
               // this is only a notification to the app.
               case audio.InterruptHint.INTERRUPT_HINT_DUCK:
                   break;
               // Force Unduck: the framework has restored the volume;
               // this is only a notification to the app.
               case audio.InterruptHint.INTERRUPT_HINT_UNDUCK:
                   break;
           }
       } else if (interruptEvent.forceType == audio.InterruptForceType.INTERRUPT_SHARE) {
           switch (interruptEvent.hintType) {
               // Share Resume: the action is left to the app.
               // Resume the force-paused stream if required.
               case audio.InterruptHint.INTERRUPT_HINT_RESUME:
                   startRenderer();
                   break;
               // Share Pause: the stream has been interrupted.
               // The app can choose to pause or to keep playing concurrently.
               case audio.InterruptHint.INTERRUPT_HINT_PAUSE:
                   isPlay = false;
                   pauseRenderer();
                   break;
           }
       }
   });
   ```

3. Use **start()** to start audio rendering.

   The renderer state changes to **STATE_RUNNING** once the audio renderer is started. The application can then begin writing audio data.

   ```js
   async function startRenderer() {
       var state = audioRenderer.state;
       // The state must be prepared, paused, or stopped before starting.
       if (state != audio.AudioState.STATE_PREPARED && state != audio.AudioState.STATE_PAUSED &&
           state != audio.AudioState.STATE_STOPPED) {
           console.info('Renderer is not in a correct state to start');
           return;
       }

       await audioRenderer.start();

       state = audioRenderer.state;
       if (state == audio.AudioState.STATE_RUNNING) {
           console.info('Renderer started');
       } else {
           console.error('Renderer start failed');
       }
   }
   ```

4. Call **write()** to write data to the buffer.

   Read the audio data to be played into a buffer, and call **write()** repeatedly to write that data to the renderer.

   ```js
   import fileio from '@ohos.fileio'; // Import the file I/O module.

   async function writeBuffer(buf) {
       var state = audioRenderer.state;
       if (state != audio.AudioState.STATE_RUNNING) {
           console.error('Renderer is not running, do not write');
           isPlay = false;
           return;
       }
       let writtenbytes = await audioRenderer.write(buf);

       console.info('Actual written bytes: ' + writtenbytes);
       if (writtenbytes < 0) {
           console.error('Write buffer failed. Check the state of the renderer.');
       }
   }

   // Reasonable minimum buffer size for the renderer. However, the renderer can accept other sizes as well.
   const bufferSize = await audioRenderer.getBufferSize();
   const path = '/data/file_example_WAV_2MG.wav';
   let ss = fileio.createStreamSync(path, 'r');
   const totalSize = 2146166; // Size in bytes of file_example_WAV_2MG.wav.
   let rlen = 0;
   let discardHeader = new ArrayBuffer(44); // Skip the 44-byte WAV header; write() expects raw PCM data.
   ss.readSync(discardHeader);
   rlen += 44;

   var id = setInterval(() => {
       if (isPlay || isRelease) {
           if (rlen >= totalSize || isRelease) {
               ss.closeSync();
               stopRenderer();
               clearInterval(id);
               return;
           }
           let buf = new ArrayBuffer(bufferSize);
           rlen += ss.readSync(buf);
           console.info('Total bytes read from file: ' + rlen);
           writeBuffer(buf);
       } else {
           console.info('check after next interval');
       }
   }, 30); // Set the interval based on the audio file format.
   ```

5. (Optional) Call **pause()** or **stop()** to pause or stop rendering.

   ```js
   async function pauseRenderer() {
       var state = audioRenderer.state;
       if (state != audio.AudioState.STATE_RUNNING) {
           console.info('Renderer is not running');
           return;
       }

       await audioRenderer.pause();

       state = audioRenderer.state;
       if (state == audio.AudioState.STATE_PAUSED) {
           console.info('Renderer paused');
       } else {
           console.error('Renderer pause failed');
       }
   }

   async function stopRenderer() {
       var state = audioRenderer.state;
       if (state != audio.AudioState.STATE_RUNNING && state != audio.AudioState.STATE_PAUSED) {
           console.info('Renderer is not running or paused');
           return;
       }

       await audioRenderer.stop();

       state = audioRenderer.state;
       if (state == audio.AudioState.STATE_STOPPED) {
           console.info('Renderer stopped');
       } else {
           console.error('Renderer stop failed');
       }
   }
   ```

6. After the task is complete, call **release()** to release related resources.

   **AudioRenderer** consumes a considerable amount of system resources. Therefore, ensure that the resources are released after the task is complete.

   ```js
   async function releaseRenderer() {
       let state = audioRenderer.state;
       if (state == audio.AudioState.STATE_RELEASED || state == audio.AudioState.STATE_NEW) {
           console.info('Resources already released or renderer not yet prepared');
           return;
       }

       await audioRenderer.release();

       state = audioRenderer.state;
       if (state == audio.AudioState.STATE_RELEASED) {
           console.info('Renderer released');
       } else {
           console.error('Renderer release failed');
       }
   }
   ```