Commit 2b08d7e2 authored by Gloria

Update docs against 15756+15924+15757+15843+15912

Signed-off-by: wusongqing<wusongqing@huawei.com>
Parent commit: 5cc343cf
The following figure shows the audio capturer state transitions.
## Constraints
Before developing the audio data collection feature, configure the **ohos.permission.MICROPHONE** permission for your application. For details, see [Permission Application Guide](../security/accesstoken-guidelines.md#declaring-permissions-in-the-configuration-file).
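For reference, a microphone permission declaration in the stage-model **module.json5** typically looks like the sketch below. The `requestPermissions` structure is standard, but the resource name and ability name here are illustrative placeholders, not values from this guide:

```json5
// module.json5 (fragment): declaring the microphone permission.
{
  "module": {
    "requestPermissions": [
      {
        "name": "ohos.permission.MICROPHONE",
        "reason": "$string:microphone_reason", // Hypothetical string resource explaining the request to the user.
        "usedScene": {
          "abilities": ["EntryAbility"],       // Hypothetical ability name.
          "when": "inuse"
        }
      }
    ]
  }
}
```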
## How to Develop
For details about the APIs, see [AudioCapturer in Audio Management](../reference/apis/js-apis-audio.md#audiocapturer8).
1. Use **createAudioCapturer()** to create a global **AudioCapturer** instance.
Set parameters of the **AudioCapturer** instance in **audioCapturerOptions**. This instance is used to capture audio, control and obtain the recording state, and register a callback for notification.
```js
import audio from '@ohos.multimedia.audio';
import fs from '@ohos.file.fs'; // It will be used by the read function call in step 3.

// Perform a self-test on APIs related to audio capturing.
@Entry
@Component
struct AudioRenderer {
  @State message: string = 'Hello World'
  private audioCapturer: audio.AudioCapturer; // It will be used globally.

  async initAudioCapturer(){
    let audioStreamInfo = {
      samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_44100,
      channels: audio.AudioChannel.CHANNEL_1,
      sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE,
      encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW
    }
    let audioCapturerInfo = {
      source: audio.SourceType.SOURCE_TYPE_MIC,
      capturerFlags: 0 // 0 is the extended flag bit of the audio capturer. The default value is 0.
    }
    let audioCapturerOptions = {
      streamInfo: audioStreamInfo,
      capturerInfo: audioCapturerInfo
    }
    this.audioCapturer = await audio.createAudioCapturer(audioCapturerOptions);
    console.log('AudioRecLog: Create audio capturer success.');
  }
```
2. Use **start()** to start audio recording.
The capturer state will be **STATE_RUNNING** once the audio capturer is started. The application can then begin reading buffers.
```js
async startCapturer() {
  let state = this.audioCapturer.state;
  // The audio capturer can be started only when it is in the STATE_PREPARED, STATE_PAUSED, or STATE_STOPPED state.
  if (state == audio.AudioState.STATE_PREPARED || state == audio.AudioState.STATE_PAUSED ||
    state == audio.AudioState.STATE_STOPPED) {
    await this.audioCapturer.start();
    state = this.audioCapturer.state;
    if (state == audio.AudioState.STATE_RUNNING) {
      console.info('AudioRecLog: Capturer started');
    } else {
      console.error('AudioRecLog: Capturer start failed');
    }
  }
}
```
The following example shows how to write recorded data into a file.
```js
async readData(){
  let state = this.audioCapturer.state;
  // The read operation can be performed only when the state is STATE_RUNNING.
  if (state != audio.AudioState.STATE_RUNNING) {
    console.info('Capturer is not in a correct state to read');
    return;
  }
  const path = '/data/data/.pulse_dir/capture_js.wav'; // Path for storing the collected audio file.
  let file = fs.openSync(path, 0o2);
  let fd = file.fd;
  if (file !== null) {
    console.info('AudioRecLog: file created');
  } else {
    console.info('AudioRecLog: file create : FAILED');
    return;
  }
  if (fd !== null) {
    console.info('AudioRecLog: file fd opened in append mode');
  }
  let numBuffersToCapture = 150; // Write data for 150 times.
  let count = 0;
  while (numBuffersToCapture) {
    this.bufferSize = await this.audioCapturer.getBufferSize();
    let buffer = await this.audioCapturer.read(this.bufferSize, true);
    let options = {
      offset: count * this.bufferSize,
      length: this.bufferSize
    }
    if (buffer === undefined) { // read() returned no data.
      console.info('AudioRecLog: read buffer failed');
    } else {
      let number = fs.writeSync(fd, buffer, options);
      console.info(`AudioRecLog: data written: ${number}`);
    }
    numBuffersToCapture--;
    count++;
  }
}
```
4. Once the recording is complete, call **stop()** to stop the recording.
```js
async StopCapturer() {
  let state = this.audioCapturer.state;
  // The audio capturer can be stopped only when it is in STATE_RUNNING or STATE_PAUSED state.
  if (state != audio.AudioState.STATE_RUNNING && state != audio.AudioState.STATE_PAUSED) {
    console.info('AudioRecLog: Capturer is not running or paused');
    return;
  }
  await this.audioCapturer.stop();
  state = this.audioCapturer.state;
  if (state == audio.AudioState.STATE_STOPPED) {
    console.info('AudioRecLog: Capturer stopped');
  } else {
    console.error('AudioRecLog: Capturer stop failed');
  }
}
```
5. After the task is complete, call **release()** to release related resources.
```js
async releaseCapturer() {
  let state = this.audioCapturer.state;
  // The audio capturer can be released only when it is not in the STATE_RELEASED or STATE_NEW state.
  if (state == audio.AudioState.STATE_RELEASED || state == audio.AudioState.STATE_NEW) {
    console.info('AudioRecLog: Capturer already released');
    return;
  }
  await this.audioCapturer.release();
  state = this.audioCapturer.state;
  if (state == audio.AudioState.STATE_RELEASED) {
    console.info('AudioRecLog: Capturer released');
  } else {
    console.info('AudioRecLog: Capturer release failed');
  }
}
```
6. (Optional) Obtain the audio capturer information.
You can use the following code to obtain the audio capturer information:
```js
async getAudioCapturerInfo(){
  // Obtain the audio capturer state.
  let state = this.audioCapturer.state;
  // Obtain the audio capturer information.
  let audioCapturerInfo : audio.AudioCapturerInfo = await this.audioCapturer.getCapturerInfo();
  // Obtain the audio stream information.
  let audioStreamInfo : audio.AudioStreamInfo = await this.audioCapturer.getStreamInfo();
  // Obtain the audio stream ID.
  let audioStreamId : number = await this.audioCapturer.getAudioStreamId();
  // Obtain the Unix timestamp, in nanoseconds.
  let audioTime : number = await this.audioCapturer.getAudioTime();
  // Obtain a proper minimum buffer size.
  let bufferSize : number = await this.audioCapturer.getBufferSize();
}
```
7. (Optional) Use **on('markReach')** to subscribe to the mark reached event, and use **off('markReach')** to unsubscribe from the event.
After the mark reached event is subscribed to, when the number of frames collected by the audio capturer reaches the specified value, a callback is triggered and the specified value is returned.
```js
async markReach(){
  this.audioCapturer.on('markReach', 10, (reachNumber) => {
    console.info('Mark reach event Received');
    console.info(`The Capturer reached frame: ${reachNumber}`);
  });
  this.audioCapturer.off('markReach'); // Unsubscribe from the mark reached event. This event will no longer be listened for.
}
```
8. (Optional) Use **on('periodReach')** to subscribe to the period reached event, and use **off('periodReach')** to unsubscribe from the event.
After the period reached event is subscribed to, each time the number of frames collected by the audio capturer reaches the specified value, a callback is triggered and the specified value is returned.
```js
async periodReach(){
  this.audioCapturer.on('periodReach', 10, (reachNumber) => {
    console.info('Period reach event Received');
    console.info(`In this period, the Capturer reached frame: ${reachNumber}`);
  });
  this.audioCapturer.off('periodReach'); // Unsubscribe from the period reached event. This event will no longer be listened for.
}
```
9. If your application needs to perform some operations when the audio capturer state is updated, it can subscribe to the state change event. When the audio capturer state is updated, the application receives a callback containing the event type.
```js
async stateChange(){
  this.audioCapturer.on('stateChange', (state) => {
    console.info(`AudioCapturerLog: Changed State to : ${state}`)
    switch (state) {
      case audio.AudioState.STATE_PREPARED:
        console.info('--------CHANGE IN AUDIO STATE----------PREPARED--------------');
        console.info('Audio State is : Prepared');
        break;
      case audio.AudioState.STATE_RUNNING:
        console.info('--------CHANGE IN AUDIO STATE----------RUNNING--------------');
        console.info('Audio State is : Running');
        break;
      case audio.AudioState.STATE_STOPPED:
        console.info('--------CHANGE IN AUDIO STATE----------STOPPED--------------');
        console.info('Audio State is : stopped');
        break;
      case audio.AudioState.STATE_RELEASED:
        console.info('--------CHANGE IN AUDIO STATE----------RELEASED--------------');
        console.info('Audio State is : released');
        break;
      default:
        console.info('--------CHANGE IN AUDIO STATE----------INVALID--------------');
        console.info('Audio State is : invalid');
        break;
    }
  });
}
```
The following figure shows the audio renderer state transitions.
![audio-renderer-state](figures/audio-renderer-state.png)
- **PREPARED**: The audio renderer enters this state by calling **create()**.
- **RUNNING**: The audio renderer enters this state by calling **start()** when it is in the **PREPARED** state or by calling **start()** when it is in the **STOPPED** state.
- **PAUSED**: The audio renderer enters this state by calling **pause()** when it is in the **RUNNING** state. When the audio playback is paused, it can call **start()** to resume the playback.
- **STOPPED**: The audio renderer enters this state by calling **stop()** when it is in the **PAUSED** or **RUNNING** state.
- **RELEASED**: The audio renderer enters this state by calling **release()** when it is in the **PREPARED**, **PAUSED**, or **STOPPED** state. In this state, the audio renderer releases all occupied hardware and software resources and will not transit to any other state.
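As a quick mental model, the transition rules above can be expressed as a lookup table. This is only an illustrative sketch: the state names are plain strings, not the real **audio.AudioState** enum values.

```javascript
// Which renderer states each operation accepts, per the transition list above.
const allowedStates = {
  start:   ['PREPARED', 'PAUSED', 'STOPPED'],
  pause:   ['RUNNING'],
  stop:    ['RUNNING', 'PAUSED'],
  release: ['PREPARED', 'PAUSED', 'STOPPED']
};

// Returns true if calling `op` is valid in the given state.
function canCall(op, state) {
  return (allowedStates[op] || []).includes(state);
}

console.log(canCall('start', 'STOPPED'));  // true: a stopped renderer can be restarted
console.log(canCall('pause', 'PREPARED')); // false: only a running renderer can pause
```

Checking the current state this way before each call mirrors the guard clauses used in the code samples below.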
## How to Develop
For details about the APIs, see [AudioRenderer in Audio Management](../reference/apis/js-apis-audio.md#audiorenderer8).
1. Use **createAudioRenderer()** to create a global **AudioRenderer** instance.
Set parameters of the **AudioRenderer** instance in **audioRendererOptions**. This instance is used to render audio, control and obtain the rendering status, and register a callback for notification.
```js
import audio from '@ohos.multimedia.audio';
import fs from '@ohos.file.fs';

// Perform a self-test on APIs related to audio rendering.
@Entry
@Component
struct AudioRenderer1129 {
  private audioRenderer: audio.AudioRenderer;
  private bufferSize; // It will be used by the write function call in step 3.
  private audioRenderer1: audio.AudioRenderer; // It will be used in the complete example in step 14.
  private audioRenderer2: audio.AudioRenderer; // It will be used in the complete example in step 14.

  async initAudioRender(){
    let audioStreamInfo = {
      samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_44100,
      channels: audio.AudioChannel.CHANNEL_1,
      sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE,
      encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW
    }
    let audioRendererInfo = {
      content: audio.ContentType.CONTENT_TYPE_SPEECH,
      usage: audio.StreamUsage.STREAM_USAGE_VOICE_COMMUNICATION,
      rendererFlags: 0 // 0 is the extended flag bit of the audio renderer. The default value is 0.
    }
    let audioRendererOptions = {
      streamInfo: audioStreamInfo,
      rendererInfo: audioRendererInfo
    }
    this.audioRenderer = await audio.createAudioRenderer(audioRendererOptions);
    console.log("Create audio renderer success.");
  }
```
2. Use **start()** to start audio rendering.
```js
async startRenderer() {
  let state = this.audioRenderer.state;
  // The audio renderer should be in the STATE_PREPARED, STATE_PAUSED, or STATE_STOPPED state when start() is called.
  if (state != audio.AudioState.STATE_PREPARED && state != audio.AudioState.STATE_PAUSED &&
    state != audio.AudioState.STATE_STOPPED) {
    console.info('Renderer is not in a correct state to start');
    return;
  }
  await this.audioRenderer.start();
  state = this.audioRenderer.state;
  if (state == audio.AudioState.STATE_RUNNING) {
    console.info('Renderer started');
  } else {
    console.error('Renderer start failed');
  }
}
```
The renderer state will be **STATE_RUNNING** once the audio renderer is started. The application can then begin writing buffers.
3. Call **write()** to write data to the buffer.
Read the audio data to be played into the buffer, and call **write()** repeatedly to write the data. The **fs** module used here is imported in step 1.
```js
async writeData(){
  // Set a proper buffer size for the audio renderer. You can also select a buffer of another size.
  this.bufferSize = await this.audioRenderer.getBufferSize();
  let dir = globalThis.fileDir; // You must use the sandbox path.
  const filePath = dir + '/file_example_WAV_2MG.wav'; // The file to render is in the following path: /data/storage/el2/base/haps/entry/files/file_example_WAV_2MG.wav
  console.info(`file filePath: ${ filePath}`);

  let file = fs.openSync(filePath, fs.OpenMode.READ_ONLY);
  let stat = await fs.stat(filePath); // Music file information.
  let buf = new ArrayBuffer(this.bufferSize);
  let len = stat.size % this.bufferSize == 0 ? Math.floor(stat.size / this.bufferSize) : Math.floor(stat.size / this.bufferSize + 1);
  for (let i = 0; i < len; i++) {
    let options = {
      offset: i * this.bufferSize,
      length: this.bufferSize
    }
    let readsize = await fs.read(file.fd, buf, options)
    let writeSize = await new Promise((resolve, reject) => {
      this.audioRenderer.write(buf, (err, writeSize) => {
        if (err) {
          reject(err)
        } else {
          resolve(writeSize)
        }
      })
    })
  }
  fs.close(file)
  await this.audioRenderer.stop(); // Stop rendering.
  await this.audioRenderer.release(); // Release the resources.
}
```
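The loop length in **writeData()** above is a ceiling division of the file size by the buffer size. Pulled out on its own (with a hypothetical helper name) the arithmetic looks like this:

```javascript
// Number of buffers needed to cover `size` bytes in chunks of `bufferSize`,
// using the same expression as the writeData() loop above.
function bufferCount(size, bufferSize) {
  return size % bufferSize == 0
    ? Math.floor(size / bufferSize)
    : Math.floor(size / bufferSize + 1);
}

console.log(bufferCount(2048, 1024)); // 2: the file splits evenly into buffers
console.log(bufferCount(2049, 1024)); // 3: a final partial buffer is still written
```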
4. (Optional) Call **pause()** or **stop()** to pause or stop rendering.
```js
async pauseRenderer() {
  let state = this.audioRenderer.state;
  // The audio renderer can be paused only when it is in the STATE_RUNNING state.
  if (state != audio.AudioState.STATE_RUNNING) {
    console.info('Renderer is not running');
    return;
  }
  await this.audioRenderer.pause();
  state = this.audioRenderer.state;
  if (state == audio.AudioState.STATE_PAUSED) {
    console.info('Renderer paused');
  } else {
    console.error('Renderer pause failed');
  }
}

async stopRenderer() {
  let state = this.audioRenderer.state;
  // The audio renderer can be stopped only when it is in STATE_RUNNING or STATE_PAUSED state.
  if (state != audio.AudioState.STATE_RUNNING && state != audio.AudioState.STATE_PAUSED) {
    console.info('Renderer is not running or paused');
    return;
  }
  await this.audioRenderer.stop();
  state = this.audioRenderer.state;
  if (state == audio.AudioState.STATE_STOPPED) {
    console.info('Renderer stopped');
  } else {
    console.error('Renderer stop failed');
  }
}
```
5. (Optional) Call **drain()** to clear the buffer.
```js
async drainRenderer() {
  let state = this.audioRenderer.state;
  // drain() can be used only when the audio renderer is in the STATE_RUNNING state.
  if (state != audio.AudioState.STATE_RUNNING) {
    console.info('Renderer is not running');
    return;
  }
  await this.audioRenderer.drain();
  state = this.audioRenderer.state;
}
```
6. After the task is complete, call **release()** to release related resources.
**AudioRenderer** uses a large number of system resources. Therefore, ensure that the resources are released after the task is complete.
```js
async releaseRenderer() {
  let state = this.audioRenderer.state;
  // The audio renderer can be released only when it is not in the STATE_RELEASED or STATE_NEW state.
  if (state == audio.AudioState.STATE_RELEASED || state == audio.AudioState.STATE_NEW) {
    console.info('Renderer already released');
    return;
  }
  await this.audioRenderer.release();
  state = this.audioRenderer.state;
  if (state == audio.AudioState.STATE_RELEASED) {
    console.info('Renderer released');
  } else {
    console.info('Renderer release failed');
  }
}
```
7. (Optional) Obtain the audio renderer information.
You can use the following code to obtain the audio renderer information:
```js ```js
// Obtain the audio renderer state. async getRenderInfo(){
let state = audioRenderer.state; // Obtain the audio renderer state.
let state = this.audioRenderer.state;
// Obtain the audio renderer information. // Obtain the audio renderer information.
let audioRendererInfo : audio.AudioRendererInfo = await audioRenderer.getRendererInfo(); let audioRendererInfo : audio.AudioRendererInfo = await this.audioRenderer.getRendererInfo();
// Obtain the audio stream information.
// Obtain the audio stream information. let audioStreamInfo : audio.AudioStreamInfo = await this.audioRenderer.getStreamInfo();
let audioStreamInfo : audio.AudioStreamInfo = await audioRenderer.getStreamInfo(); // Obtain the audio stream ID.
let audioStreamId : number = await this.audioRenderer.getAudioStreamId();
// Obtain the audio stream ID. // Obtain the Unix timestamp, in nanoseconds.
let audioStreamId : number = await audioRenderer.getAudioStreamId(); let audioTime : number = await this.audioRenderer.getAudioTime();
// Obtain a proper minimum buffer size.
// Obtain the Unix timestamp, in nanoseconds. let bufferSize : number = await this.audioRenderer.getBufferSize();
let audioTime : number = await audioRenderer.getAudioTime(); // Obtain the audio renderer rate.
let renderRate : audio.AudioRendererRate = await this.audioRenderer.getRenderRate();
// Obtain a proper minimum buffer size. }
let bufferSize : number = await audioRenderer.getBufferSize();
// Obtain the audio renderer rate.
let renderRate : audio.AudioRendererRate = await audioRenderer.getRenderRate();
``` ```
8. (Optional) Set the audio renderer information.

   You can use the following code to set the audio renderer information:

   ```js
   async setAudioRenderInfo(){
     // Set the audio renderer rate to RENDER_RATE_NORMAL.
     let renderRate : audio.AudioRendererRate = audio.AudioRendererRate.RENDER_RATE_NORMAL;
     await this.audioRenderer.setRenderRate(renderRate);
     // Set the interruption mode of the audio renderer to SHARE_MODE.
     let interruptMode : audio.InterruptMode = audio.InterruptMode.SHARE_MODE;
     await this.audioRenderer.setInterruptMode(interruptMode);
     // Set the volume of the stream to 0.5.
     let volume : number = 0.5;
     await this.audioRenderer.setVolume(volume);
   }
   ```
9. (Optional) Use **on('audioInterrupt')** to subscribe to the audio interruption event, and use **off('audioInterrupt')** to unsubscribe from the event.

   In the case of audio interruption, the application may encounter write failures. To avoid such failures, interruption-unaware applications can use **audioRenderer.state** to check the audio renderer state before writing audio data. The applications can obtain more details by subscribing to the audio interruption events. For details, see [InterruptEvent](../reference/apis/js-apis-audio.md#interruptevent9).

   It should be noted that the audio interruption event subscription of the **AudioRenderer** module is slightly different from **on('interrupt')** in [AudioManager](../reference/apis/js-apis-audio.md#audiomanager). The **on('interrupt')** and **off('interrupt')** APIs are deprecated since API version 9. In the **AudioRenderer** module, you only need to call **on('audioInterrupt')** to listen for focus change events. When the **AudioRenderer** instance created by the application performs actions such as start, stop, and pause, it requests the focus, which triggers focus transfer and in return enables the related **AudioRenderer** instance to receive a notification through the callback. For instances other than **AudioRenderer**, such as frequency modulation (FM) and voice wakeup, the application does not create an instance. In this case, the application can call **on('interrupt')** in **AudioManager** to receive a focus change notification.
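   Before each write, an interruption-unaware application can gate on the renderer state as described above. The following plain-JavaScript sketch only models that check: the numeric state constants are illustrative stand-ins, and a real application should compare against **audio.AudioState.STATE_RUNNING** directly.

   ```js
   // Minimal sketch of the pre-write state check. STATE_RUNNING and
   // STATE_PAUSED are illustrative constants, not values imported from
   // the audio module.
   const STATE_RUNNING = 2; // assumed value, for illustration only
   const STATE_PAUSED = 3;  // assumed value, for illustration only

   function canWrite(rendererState) {
     // Write audio data only while the renderer is actively running.
     return rendererState === STATE_RUNNING;
   }

   console.log(canWrite(STATE_RUNNING)); // true
   console.log(canWrite(STATE_PAUSED));  // false
   ```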
   ```js
   async subscribeAudioRender(){
     this.audioRenderer.on('audioInterrupt', (interruptEvent) => {
       console.info('InterruptEvent Received');
       console.info(`InterruptType: ${interruptEvent.eventType}`);
       console.info(`InterruptForceType: ${interruptEvent.forceType}`);
       console.info(`InterruptHint: ${interruptEvent.hintType}`);

       if (interruptEvent.forceType == audio.InterruptForceType.INTERRUPT_FORCE) {
         switch (interruptEvent.hintType) {
           // Forcible pausing initiated by the audio framework. To prevent data loss, stop the write operation.
           case audio.InterruptHint.INTERRUPT_HINT_PAUSE:
             console.info('isPlay is false');
             break;
           // Forcible stopping initiated by the audio framework. To prevent data loss, stop the write operation.
           case audio.InterruptHint.INTERRUPT_HINT_STOP:
             console.info('isPlay is false');
             break;
           // Forcible ducking initiated by the audio framework.
           case audio.InterruptHint.INTERRUPT_HINT_DUCK:
             break;
           // Unducking initiated by the audio framework.
           case audio.InterruptHint.INTERRUPT_HINT_UNDUCK:
             break;
         }
       } else if (interruptEvent.forceType == audio.InterruptForceType.INTERRUPT_SHARE) {
         switch (interruptEvent.hintType) {
           // Notify the application that the rendering starts.
           case audio.InterruptHint.INTERRUPT_HINT_RESUME:
             this.startRenderer();
             break;
           // Notify the application that the audio stream is interrupted. The application then determines whether to continue. (In this example, the application pauses the rendering.)
           case audio.InterruptHint.INTERRUPT_HINT_PAUSE:
             console.info('isPlay is false');
             this.pauseRenderer();
             break;
         }
       }
     });
   }
   ```
10. (Optional) Use **on('markReach')** to subscribe to the mark reached event, and use **off('markReach')** to unsubscribe from the event.

    After the mark reached event is subscribed to, when the number of frames rendered by the audio renderer reaches the specified value, a callback is triggered and the specified value is returned.

    ```js
    async markReach(){
      this.audioRenderer.on('markReach', 50, (position) => {
        if (position == 50) {
          console.info('ON Triggered successfully');
        }
      });
      this.audioRenderer.off('markReach'); // Unsubscribe from the mark reached event. This event will no longer be listened for.
    }
    ```
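   The mark reached event is one-shot: the callback fires once, the first time the cumulative rendered-frame count reaches the mark. The plain-JavaScript model below only illustrates that semantics; **makeMarkWatcher** and **feedFrames** are hypothetical names, not AudioRenderer APIs.

   ```js
   // Hypothetical model of the mark reached semantics: the callback fires
   // exactly once, the first time the cumulative frame count reaches the mark.
   function makeMarkWatcher(markFrame, onMark) {
     let total = 0;
     let fired = false;
     return function feedFrames(count) {
       total += count;
       if (!fired && total >= markFrame) {
         fired = true;
         onMark(markFrame); // Reports the mark that was reached, as on('markReach') does.
       }
     };
   }

   let reached = [];
   const feed = makeMarkWatcher(50, (mark) => reached.push(mark));
   feed(30); // 30 frames rendered: below the mark, no callback.
   feed(30); // 60 frames total: mark 50 reached, callback fires once.
   feed(30); // Further frames do not retrigger the callback.
   console.log(reached); // [ 50 ]
   ```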
11. (Optional) Use **on('periodReach')** to subscribe to the period reached event, and use **off('periodReach')** to unsubscribe from the event.

    After the period reached event is subscribed to, each time the number of frames rendered by the audio renderer reaches the specified value, a callback is triggered and the specified value is returned.

    ```js
    async periodReach(){
      this.audioRenderer.on('periodReach', 10, (reachNumber) => {
        console.info(`In this period, the renderer reached frame: ${reachNumber} `);
      });
      this.audioRenderer.off('periodReach'); // Unsubscribe from the period reached event. This event will no longer be listened for.
    }
    ```
12. (Optional) Use **on('stateChange')** to subscribe to audio renderer state changes.

    After the **stateChange** event is subscribed to, when the audio renderer state changes, a callback is triggered and the audio renderer state is returned.

    ```js
    async stateChange(){
      this.audioRenderer.on('stateChange', (audioState) => {
        console.info('State change event Received');
        console.info(`Current renderer state is: ${audioState}`);
      });
    }
    ```
13. (Optional) Handle exceptions of **on()**.

    If the string or the parameter type passed in **on()** is incorrect, the application throws an exception. In this case, you can use **try catch** to capture the exception.

    ```js
    async errorCall(){
      try {
        this.audioRenderer.on('invalidInput', () => { // The string is invalid.
        })
      } catch (err) {
        console.info(`Call on function error, ${err}`); // The application throws exception 401.
      }
      try {
        this.audioRenderer.on(1, () => { // The type of the input parameter is incorrect.
        })
      } catch (err) {
        console.info(`Call on function error, ${err}`); // The application throws exception 6800101.
      }
    }
    ```
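   To make the two failure modes concrete, here is a toy validator in plain JavaScript that mirrors the documented behavior (error code 401 for an unsupported event string, 6800101 for a wrong parameter type). **validateOn** and **SUPPORTED** are illustrative names only, not part of the audio module.

   ```js
   // Toy model of the parameter checks: the real AudioRenderer.on() throws
   // 401 for an unsupported event string and 6800101 for a wrong parameter
   // type; this sketch only mirrors those documented codes.
   const SUPPORTED = ['audioInterrupt', 'markReach', 'periodReach', 'stateChange'];

   function validateOn(type, callback) {
     if (typeof type !== 'string' || typeof callback !== 'function') {
       throw { code: 6800101, message: 'invalid parameter type' };
     }
     if (!SUPPORTED.includes(type)) {
       throw { code: 401, message: 'unsupported event: ' + type };
     }
     return true;
   }

   try {
     validateOn('invalidInput', () => {});
   } catch (err) {
     console.log(err.code); // 401
   }
   try {
     validateOn(1, () => {});
   } catch (err) {
     console.log(err.code); // 6800101
   }
   ```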
14. (Optional) Refer to the complete example of **on('audioInterrupt')**.

    Declare **audioRenderer1** and **audioRenderer2** first. For details, see step 1.

    Create **AudioRender1** and **AudioRender2** in an application, configure the independent interruption mode, and call **on('audioInterrupt')** to subscribe to audio interruption events. At the beginning, **AudioRender1** has the focus. When **AudioRender2** attempts to obtain the focus, **AudioRender1** receives a focus transfer notification and the related log information is printed. If the shared mode is used, the log information will not be printed during application running.

    ```js
    async runningAudioRender1(){
      let audioStreamInfo = {
        samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_48000,
      }
      let audioRendererOptions = {
        streamInfo: audioStreamInfo,
        rendererInfo: audioRendererInfo
      }
      // 1.1 Create an instance.
      this.audioRenderer1 = await audio.createAudioRenderer(audioRendererOptions);
      console.info("Create audio renderer 1 success.");
      // 1.2 Set the independent mode.
      this.audioRenderer1.setInterruptMode(1).then( data => {
        console.info('audioRenderer1 setInterruptMode Success!');
      }).catch((err) => {
        console.error(`audioRenderer1 setInterruptMode Fail: ${err}`);
      });
      // 1.3 Set the listener.
      this.audioRenderer1.on('audioInterrupt', async(interruptEvent) => {
        console.info(`audioRenderer1 on audioInterrupt : ${JSON.stringify(interruptEvent)}`)
      });
      // 1.4 Start rendering.
      await this.audioRenderer1.start();
      console.info('startAudioRender1 success');
      // 1.5 Obtain the buffer size, which is the proper minimum buffer size of the audio renderer. You can also select a buffer of another size.
      const bufferSize = await this.audioRenderer1.getBufferSize();
      console.info(`audio bufferSize: ${bufferSize}`);
      // 1.6 Obtain the original audio data file.
      let dir = globalThis.fileDir; // You must use the sandbox path.
      const path1 = dir + '/music001_48000_32_1.wav'; // The file to render is in the following path: /data/storage/el2/base/haps/entry/files/music001_48000_32_1.wav
      let stat = await fs.stat(path1); // Music file information.
      let buf = new ArrayBuffer(bufferSize);
      let len = stat.size % this.bufferSize == 0 ? Math.floor(stat.size / this.bufferSize) : Math.floor(stat.size / this.bufferSize + 1);
      // 1.7 Render the original audio data in the buffer by using audioRender.
      for (let i = 0;i < len; i++) {
        let options = {
          offset: i * this.bufferSize,
          length: this.bufferSize
        }
        let readsize = await fs.read(file1.fd, buf, options)
        let writeSize = await new Promise((resolve,reject)=>{
          this.audioRenderer1.write(buf,(err,writeSize)=>{
            if(err){
              reject(err)
            }else{
              resolve(writeSize)
            }
          })
        })
      }
      fs.close(file1)
      await this.audioRenderer1.stop(); // Stop rendering.
      await this.audioRenderer1.release(); // Release the resources.
    }
    async runningAudioRender2(){
      let audioStreamInfo = {
        samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_48000,
      }
      let audioRendererOptions = {
        streamInfo: audioStreamInfo,
        rendererInfo: audioRendererInfo
      }
      // 2.1 Create another instance.
      this.audioRenderer2 = await audio.createAudioRenderer(audioRendererOptions);
      console.info("Create audio renderer 2 success.");
      // 2.2 Set the independent mode.
      this.audioRenderer2.setInterruptMode(1).then( data => {
        console.info('audioRenderer2 setInterruptMode Success!');
      }).catch((err) => {
        console.error(`audioRenderer2 setInterruptMode Fail: ${err}`);
      });
      // 2.3 Set the listener.
      this.audioRenderer2.on('audioInterrupt', async(interruptEvent) => {
        console.info(`audioRenderer2 on audioInterrupt : ${JSON.stringify(interruptEvent)}`)
      });
      // 2.4 Start rendering.
      await this.audioRenderer2.start();
      console.info('startAudioRender2 success');
      // 2.5 Obtain the buffer size.
      const bufferSize = await this.audioRenderer2.getBufferSize();
      console.info(`audio bufferSize: ${bufferSize}`);
      // 2.6 Read the original audio data file.
      let dir = globalThis.fileDir; // You must use the sandbox path.
      const path2 = dir + '/music002_48000_32_1.wav'; // The file to render is in the following path: /data/storage/el2/base/haps/entry/files/music002_48000_32_1.wav
      let stat = await fs.stat(path2); // Music file information.
      let buf = new ArrayBuffer(bufferSize);
      let len = stat.size % this.bufferSize == 0 ? Math.floor(stat.size / this.bufferSize) : Math.floor(stat.size / this.bufferSize + 1);
      // 2.7 Render the original audio data in the buffer by using audioRender.
      for (let i = 0;i < len; i++) {
        let options = {
          offset: i * this.bufferSize,
          length: this.bufferSize
        }
        let readsize = await fs.read(file2.fd, buf, options)
        let writeSize = await new Promise((resolve,reject)=>{
          this.audioRenderer2.write(buf,(err,writeSize)=>{
            if(err){
              reject(err)
            }else{
              resolve(writeSize)
            }
          })
        })
      }
      fs.close(file2)
      await this.audioRenderer2.stop(); // Stop rendering.
      await this.audioRenderer2.release(); // Release the resources.
    }
    // Integrated invoking entry.
    async test(){
      await this.runningAudioRender1();
      await this.runningAudioRender2();
    }
    ```
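   The chunked read/write loop above is a ceiling division: the loop runs ceil(fileSize / bufferSize) times, and chunk i is read at offset i * bufferSize. A standalone sketch of that arithmetic (**chunkCount** and **chunkOptions** are illustrative helper names, not audio APIs):

   ```js
   // Sketch of the chunk arithmetic used above: the file is written in
   // bufferSize-byte chunks, so the number of write iterations is the
   // ceiling of fileSize / bufferSize, and chunk i starts at i * bufferSize.
   function chunkCount(fileSize, bufferSize) {
     return fileSize % bufferSize == 0
       ? Math.floor(fileSize / bufferSize)
       : Math.floor(fileSize / bufferSize + 1);
   }

   function chunkOptions(i, bufferSize) {
     return { offset: i * bufferSize, length: bufferSize };
   }

   console.log(chunkCount(1000, 250)); // 4: the file divides evenly.
   console.log(chunkCount(1001, 250)); // 5: a final partial chunk is read.
   console.log(chunkOptions(2, 250));  // { offset: 500, length: 250 }
   ```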
```js
export class AVPlayerDemo {
  async avPlayerDemo() {
    // Create an AVPlayer instance.
    this.avPlayer = await media.createAVPlayer()
    let fileDescriptor = undefined
    // Use getRawFileDescriptor of the resource management module to obtain the media assets in the application, and use the fdSrc attribute of the AVPlayer to initialize the media asset.
    // For details on the fd/offset/length parameters, see the Media API. The globalThis.abilityContext parameter is a system environment variable and is saved as a global variable on the main page during system boot.
    await globalThis.abilityContext.resourceManager.getRawFileDescriptor('H264_AAC.mp4').then((value) => {
      fileDescriptor = {fd: value.fd, offset: value.offset, length: value.length}
    })
    this.avPlayer.fdSrc = fileDescriptor
  }
}
```
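As a plain-JavaScript illustration of the fd/offset/length mapping used for **fdSrc**: **getRawFileDescriptor** resolves with an object carrying `fd`, `offset`, and `length`, and the descriptor assigned to **fdSrc** simply carries those three fields over. **toFdSrc** below is a hypothetical helper name, not a media or resource-manager API.

```js
// Hypothetical helper mirroring the mapping above: copy the fd, offset,
// and length fields of a raw-file descriptor into the object shape that
// AVPlayer.fdSrc expects.
function toFdSrc(rawFd) {
  return { fd: rawFd.fd, offset: rawFd.offset, length: rawFd.length };
}

const src = toFdSrc({ fd: 42, offset: 0, length: 1024 });
console.log(src); // { fd: 42, offset: 0, length: 1024 }
```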
# AVSession Development
> **NOTE**
>
> All APIs of the **AVSession** module are system APIs and can be called only by system applications.
## Development for the Session Access End

### Basic Concepts

Table 1 Common APIs for session access end development

### How to Develop

1. Import the modules.
   ```js
   import avSession from '@ohos.multimedia.avsession';
   import wantAgent from '@ohos.wantAgent';
   import featureAbility from '@ohos.ability.featureAbility';
   ```
2. Create and activate a session.

   ```js
   // Define global variables.
   let mediaFavorite = false;
   let currentSession = null;
   let context = featureAbility.getContext();

   // Create an audio session.
   avSession.createAVSession(context, "AudioAppSample", 'audio').then((session) => {
     currentSession = session;
     currentSession.activate(); // Activate the session.
   }).catch((err) => {
     console.info(`createAVSession : ERROR : ${err.message}`);
   });
   ```
3. Set the session information, including:

   - Session metadata. In addition to the current media asset ID (mandatory), you can set the title, album, author, duration, and previous/next media asset ID. For details about the session metadata, see **AVMetadata** in the API document.
   - Launcher ability, which is implemented by calling an API of [WantAgent](../reference/apis/js-apis-wantAgent.md). Generally, **WantAgent** is used to encapsulate want information.
   - Playback state information.
```js
// Set the session metadata. ```js
let metadata = { // Set the session metadata.
assetId: "121278", let metadata = {
title: "lose yourself", assetId: "121278",
artist: "Eminem", title: "lose yourself",
author: "ST", artist: "Eminem",
album: "Slim shady", author: "ST",
writer: "ST", album: "Slim shady",
composer: "ST", writer: "ST",
duration: 2222, composer: "ST",
mediaImage: "https://www.example.com/example.jpg", // Set it based on your project requirements. duration: 2222,
subtitle: "8 Mile", mediaImage: "https://www.example.com/example.jpg", // Set it based on your project requirements.
description: "Rap", subtitle: "8 Mile",
lyric: "https://www.example.com/example.lrc", // Set it based on your project requirements. description: "Rap",
previousAssetId: "121277", lyric: "https://www.example.com/example.lrc", // Set it based on your project requirements.
nextAssetId: "121279", previousAssetId: "121277",
}; nextAssetId: "121279",
currentSession.setAVMetadata(metadata).then(() => { };
console.info('setAVMetadata successfully'); currentSession.setAVMetadata(metadata).then(() => {
}).catch((err) => { console.info('setAVMetadata successfully');
console.info(`setAVMetadata : ERROR : ${err.message}`);
});
```
```js
// Set the launcher ability.
let wantAgentInfo = {
wants: [
{
bundleName: "com.neu.setResultOnAbilityResultTest1",
abilityName: "com.example.test.MainAbility",
}
],
operationType: wantAgent.OperationType.START_ABILITIES,
requestCode: 0,
wantAgentFlags:[wantAgent.WantAgentFlags.UPDATE_PRESENT_FLAG]
}
wantAgent.getWantAgent(wantAgentInfo).then((agent) => {
currentSession.setLaunchAbility(agent).then(() => {
console.info('setLaunchAbility successfully');
}).catch((err) => { }).catch((err) => {
console.info(`setLaunchAbility : ERROR : ${err.message}`); console.info(`setAVMetadata : ERROR : ${err.message}`);
}); });
}); ```

```js
// Set the launcher ability.
let wantAgentInfo = {
    wants: [
        {
            bundleName: "com.neu.setResultOnAbilityResultTest1",
            abilityName: "com.example.test.MainAbility",
        }
    ],
    operationType: wantAgent.OperationType.START_ABILITIES,
    requestCode: 0,
    wantAgentFlags: [wantAgent.WantAgentFlags.UPDATE_PRESENT_FLAG]
}

wantAgent.getWantAgent(wantAgentInfo).then((agent) => {
    currentSession.setLaunchAbility(agent).then(() => {
        console.info('setLaunchAbility successfully');
    }).catch((err) => {
        console.info(`setLaunchAbility : ERROR : ${err.message}`);
    });
});
```

```js
// Set the playback state information.
let PlaybackState = {
    state: avSession.PlaybackState.PLAYBACK_STATE_STOP,
    speed: 1.0,
    position: {elapsedTime: 0, updateTime: (new Date()).getTime()},
    bufferedTime: 1000,
    loopMode: avSession.LoopMode.LOOP_MODE_SEQUENCE,
    isFavorite: false,
};
currentSession.setAVPlaybackState(PlaybackState).then(() => {
    console.info('setAVPlaybackState successfully');
}).catch((err) => {
    console.info(`setAVPlaybackState : ERROR : ${err.message}`);
});
```
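The `position` field above pairs `elapsedTime` (the media position) with `updateTime` (the wall-clock time at which that position was reported), which lets a controller interpolate the current position without polling. A minimal sketch, assuming plain millisecond timestamps; `currentPosition` is a hypothetical helper, not part of the AVSession API:

```js
// Derive the current media position (ms) from the last reported
// {elapsedTime, updateTime} pair and the playback speed.
function currentPosition(position, speed, now) {
    // elapsedTime was valid at the wall-clock instant updateTime;
    // extrapolate forward by the elapsed wall-clock time, scaled by speed.
    return position.elapsedTime + (now - position.updateTime) * speed;
}

// Example: position 0 ms reported at t=1000, playing at 1x, queried at t=3000.
console.log(currentPosition({ elapsedTime: 0, updateTime: 1000 }, 1.0, 3000));
```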

```js
// Obtain the controller of this session.
currentSession.getController().then((selfController) => {
    console.info('getController successfully');
}).catch((err) => {
    console.info(`getController : ERROR : ${err.message}`);
});
```

```js
// Obtain the output device information.
currentSession.getOutputDevice().then((outputInfo) => {
    console.info(`getOutputDevice successfully, deviceName : ${outputInfo.deviceName}`);
}).catch((err) => {
    console.info(`getOutputDevice : ERROR : ${err.message}`);
});
```
4. Subscribe to control command events.

```js
// Subscribe to the 'play' command event.
currentSession.on('play', () => {
    console.log("Call AudioPlayer.play.");
    // Set the playback state information.
    currentSession.setAVPlaybackState({state: avSession.PlaybackState.PLAYBACK_STATE_PLAY}).then(() => {
        console.info('setAVPlaybackState successfully');
    }).catch((err) => {
        console.info(`setAVPlaybackState : ERROR : ${err.message}`);
    });
});

// Subscribe to the 'pause' command event.
currentSession.on('pause', () => {
    console.log("Call AudioPlayer.pause.");
    // Set the playback state information.
    currentSession.setAVPlaybackState({state: avSession.PlaybackState.PLAYBACK_STATE_PAUSE}).then(() => {
        console.info('setAVPlaybackState successfully');
    }).catch((err) => {
        console.info(`setAVPlaybackState : ERROR : ${err.message}`);
    });
});

// Subscribe to the 'stop' command event.
currentSession.on('stop', () => {
    console.log("Call AudioPlayer.stop.");
    // Set the playback state information.
    currentSession.setAVPlaybackState({state: avSession.PlaybackState.PLAYBACK_STATE_STOP}).then(() => {
        console.info('setAVPlaybackState successfully');
    }).catch((err) => {
        console.info(`setAVPlaybackState : ERROR : ${err.message}`);
    });
});

// Subscribe to the 'playNext' command event.
currentSession.on('playNext', () => {
    // When the media file is not ready, download and cache the media file, and set the 'PREPARE' state.
    currentSession.setAVPlaybackState({state: avSession.PlaybackState.PLAYBACK_STATE_PREPARE}).then(() => {
        console.info('setAVPlaybackState successfully');
    }).catch((err) => {
        console.info(`setAVPlaybackState : ERROR : ${err.message}`);
    });
    // The media file is obtained.
    currentSession.setAVMetadata({assetId: '58970105', title: 'See you tomorrow'}).then(() => {
        console.info('setAVMetadata successfully');
    }).catch((err) => {
        console.info(`setAVMetadata : ERROR : ${err.message}`);
    });
    console.log("Call AudioPlayer.play.");
    // Set the playback state information.
    let time = (new Date()).getTime();
    currentSession.setAVPlaybackState({state: avSession.PlaybackState.PLAYBACK_STATE_PLAY, position: {elapsedTime: 0, updateTime: time}, bufferedTime: 2000}).then(() => {
        console.info('setAVPlaybackState successfully');
    }).catch((err) => {
        console.info(`setAVPlaybackState : ERROR : ${err.message}`);
    });
});

// Subscribe to the 'fastForward' command event.
currentSession.on('fastForward', () => {
    console.log("Call AudioPlayer for fast forwarding.");
    // Set the playback state information.
    currentSession.setAVPlaybackState({speed: 2.0}).then(() => {
        console.info('setAVPlaybackState successfully');
    }).catch((err) => {
        console.info(`setAVPlaybackState : ERROR : ${err.message}`);
    });
});

// Subscribe to the 'seek' command event.
currentSession.on('seek', (time) => {
    console.log("Call AudioPlayer.seek.");
    // Set the playback state information.
    currentSession.setAVPlaybackState({position: {elapsedTime: time, updateTime: (new Date()).getTime()}}).then(() => {
        console.info('setAVPlaybackState successfully');
    }).catch((err) => {
        console.info(`setAVPlaybackState : ERROR : ${err.message}`);
    });
});

// Subscribe to the 'setSpeed' command event.
currentSession.on('setSpeed', (speed) => {
    console.log(`Call AudioPlayer to set the speed to ${speed}`);
    // Set the playback state information.
    currentSession.setAVPlaybackState({speed: speed}).then(() => {
        console.info('setAVPlaybackState successfully');
    }).catch((err) => {
        console.info(`setAVPlaybackState : ERROR : ${err.message}`);
    });
});

// Subscribe to the 'setLoopMode' command event.
currentSession.on('setLoopMode', (mode) => {
    console.log(`The application switches to the loop mode ${mode}`);
    // Set the playback state information.
    currentSession.setAVPlaybackState({loopMode: mode}).then(() => {
        console.info('setAVPlaybackState successfully');
    }).catch((err) => {
        console.info(`setAVPlaybackState : ERROR : ${err.message}`);
    });
});

// Subscribe to the 'toggleFavorite' command event.
currentSession.on('toggleFavorite', (assetId) => {
    console.log(`The application favorites ${assetId}.`);
    // Perform the switch based on the last status.
    let favorite = mediaFavorite == false ? true : false;
    currentSession.setAVPlaybackState({isFavorite: favorite}).then(() => {
        console.info('setAVPlaybackState successfully');
    }).catch((err) => {
        console.info(`setAVPlaybackState : ERROR : ${err.message}`);
    });
    mediaFavorite = favorite;
});

// Subscribe to the key event.
currentSession.on('handleKeyEvent', (event) => {
    console.log(`User presses the key ${event.keyCode}`);
});

// Subscribe to output device changes.
currentSession.on('outputDeviceChange', (device) => {
    console.log(`Output device changed to ${device.deviceName}`);
});
```
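The `toggleFavorite` handler above flips a single `mediaFavorite` flag, which is enough for one asset; a player with a playlist needs one flag per asset. A sketch of that bookkeeping, with the `favorites` map and helper as illustrative names rather than part of the AVSession API:

```js
// Track favorite state per assetId so the 'toggleFavorite' handler
// can flip the last known value for the asset it receives.
const favorites = new Map();

function toggleFavorite(assetId) {
    const next = !(favorites.get(assetId) ?? false);
    favorites.set(assetId, next);
    return next; // Pass this as {isFavorite: next} to setAVPlaybackState().
}
```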
5. Release resources.

```js
// Unsubscribe from the events.
currentSession.off('play');
currentSession.off('pause');
currentSession.off('stop');
currentSession.off('playNext');
currentSession.off('playPrevious');
currentSession.off('fastForward');
currentSession.off('rewind');
currentSession.off('seek');
currentSession.off('setSpeed');
currentSession.off('setLoopMode');
currentSession.off('toggleFavorite');
currentSession.off('handleKeyEvent');
currentSession.off('outputDeviceChange');

// Deactivate the session and destroy the object.
currentSession.deactivate().then(() => {
    currentSession.destroy();
});
```
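The repeated `off()` calls above can be driven from a list, so adding a new subscription later only means touching one array. A sketch, with the `session` parameter standing in for `currentSession`:

```js
// Event names registered in step 4; keep this list in sync with the on() calls.
const SESSION_EVENTS = [
    'play', 'pause', 'stop', 'playNext', 'playPrevious', 'fastForward',
    'rewind', 'seek', 'setSpeed', 'setLoopMode', 'toggleFavorite',
    'handleKeyEvent', 'outputDeviceChange'
];

// Unsubscribe from every registered event in one pass.
function unsubscribeAll(session) {
    for (const event of SESSION_EVENTS) {
        session.off(event);
    }
}
```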
### Verification

Touch the play, pause, or next button on the media application. Check whether the media playback state changes accordingly.
Table 2 Common APIs for session control end development
### How to Develop

1. Import the modules.

```js
import avSession from '@ohos.multimedia.avsession';
import {Action, KeyEvent} from '@ohos.multimodalInput.KeyEvent';
import wantAgent from '@ohos.wantAgent';
import audio from '@ohos.multimedia.audio';
```
2. Obtain the session descriptors and create a controller.

```js
// Define global variables.
let g_controller = new Array<avSession.AVSessionController>();
let g_centerSupportCmd:Set<avSession.AVControlCommandType> = new Set(['play', 'pause', 'playNext', 'playPrevious', 'fastForward', 'rewind', 'seek', 'setSpeed', 'setLoopMode', 'toggleFavorite']);
let g_validCmd:Set<avSession.AVControlCommandType>;

// Obtain the session descriptors and create a controller.
avSession.getAllSessionDescriptors().then((descriptors) => {
    descriptors.forEach((descriptor) => {
        avSession.createController(descriptor.sessionId).then((controller) => {
            g_controller.push(controller);
        }).catch((err) => {
            console.error('createController error');
        });
    });
}).catch((err) => {
    console.error('getAllSessionDescriptors error');
});

// Subscribe to the 'sessionCreate' event and create a controller.
avSession.on('sessionCreate', (session) => {
    // After a session is added, you must create a controller.
    avSession.createController(session.sessionId).then((controller) => {
        g_controller.push(controller);
    }).catch((err) => {
        console.info(`createController : ERROR : ${err.message}`);
    });
});
```
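Because the descriptor query and the `sessionCreate` subscription can both report the same session, a guard against duplicate controllers keeps `g_controller` consistent. A sketch, assuming only that each controller exposes `sessionId` as in the code above; `addController` is an illustrative helper, not an AVSession API:

```js
// Add a controller only if no controller for the same session is present.
function addController(list, controller) {
    const exists = list.some((c) => c.sessionId === controller.sessionId);
    if (!exists) {
        list.push(controller);
    }
    return !exists; // true if the controller was actually added
}
```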
3. Subscribe to the session state and service changes.

```js
// Subscribe to the 'activeStateChange' event.
controller.on('activeStateChange', (isActive) => {
    if (isActive) {
        console.log("The widget corresponding to the controller is highlighted.");
    } else {
        console.log("The widget corresponding to the controller is invalid.");
    }
});

// Subscribe to the 'sessionDestroy' event to enable Media Controller to get notified when the session dies.
controller.on('sessionDestroy', () => {
    console.info('on sessionDestroy : SUCCESS ');
    controller.destroy().then(() => {
        console.info('destroy : SUCCESS ');
    }).catch((err) => {
        console.info(`destroy : ERROR : ${err.message}`);
    });
});

// Subscribe to the 'sessionDestroy' event to enable the application to get notified when the session dies.
avSession.on('sessionDestroy', (session) => {
    let index = g_controller.findIndex((controller) => {
        return controller.sessionId == session.sessionId;
    });
    if (index != -1) { // findIndex() returns -1 when no matching controller exists.
        g_controller[index].destroy();
        g_controller.splice(index, 1);
    }
});

// Subscribe to the 'topSessionChange' event.
avSession.on('topSessionChange', (session) => {
    let index = g_controller.findIndex((controller) => {
        return controller.sessionId == session.sessionId;
    });
    // Place the session on the top.
    if (index != 0) {
        g_controller.sort((a, b) => {
            return a.sessionId == session.sessionId ? -1 : 0;
        });
    }
});

// Subscribe to the 'sessionServiceDie' event.
avSession.on('sessionServiceDie', () => {
    // The server is abnormal, and the application clears resources.
    console.log("Server exception");
})
```
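The `topSessionChange` handler above reorders `g_controller` with a comparator; an equivalent variant moves the matching entry to the front explicitly, which makes the intent easier to test. A sketch under the same assumed `sessionId` shape; `moveToTop` is an illustrative helper:

```js
// Move the controller for sessionId to index 0, preserving the
// relative order of the remaining entries.
function moveToTop(list, sessionId) {
    const index = list.findIndex((c) => c.sessionId === sessionId);
    if (index > 0) {
        const [top] = list.splice(index, 1);
        list.unshift(top);
    }
    return list;
}
```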
4. Subscribe to media session information changes.

```js
// Subscribe to metadata changes.
let metaFilter = ['assetId', 'title', 'description'];
controller.on('metadataChange', metaFilter, (metadata) => {
    console.info(`on metadataChange assetId : ${metadata.assetId}`);
});

// Subscribe to playback state changes.
let playbackFilter = ['state', 'speed', 'loopMode'];
controller.on('playbackStateChange', playbackFilter, (playbackState) => {
    console.info(`on playbackStateChange state : ${playbackState.state}`);
});

// Subscribe to supported command changes.
controller.on('validCommandChange', (cmds) => {
    console.info(`validCommandChange : SUCCESS : size : ${cmds.size}`);
    console.info(`validCommandChange : SUCCESS : cmds : ${cmds.values()}`);
    g_validCmd.clear();
    for (let c of g_centerSupportCmd) {
        if (cmds.has(c)) {
            g_validCmd.add(c);
        }
    }
});

// Subscribe to output device changes.
controller.on('outputDeviceChange', (device) => {
    console.info(`on outputDeviceChange device isRemote : ${device.isRemote}`);
});
```
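The `validCommandChange` handler intersects the commands this control end supports with those the session currently declares valid. The intersection can be expressed as a small pure function, which is easier to unit-test in isolation; `intersectCommands` is an illustrative helper, not part of the AVSession API:

```js
// Intersect the commands supported by the control end with those the
// session currently declares valid, mirroring the handler above.
function intersectCommands(supported, reported) {
    const valid = new Set();
    for (const cmd of supported) {
        if (reported.has(cmd)) {
            valid.add(cmd);
        }
    }
    return valid;
}
```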
5. Control the session behavior.

```js
// When the user touches the play button, the control command 'play' is sent to the session.
if (g_validCmd.has('play')) {
    controller.sendControlCommand({command: 'play'}).then(() => {
        console.info('sendControlCommand successfully');
    }).catch((err) => {
        console.info(`sendControlCommand : ERROR : ${err.message}`);
    });
}

// When the user selects the single loop mode, the corresponding control command is sent to the session.
if (g_validCmd.has('setLoopMode')) {
    controller.sendControlCommand({command: 'setLoopMode', parameter: avSession.LoopMode.LOOP_MODE_SINGLE}).then(() => {
        console.info('sendControlCommand successfully');
    }).catch((err) => {
        console.info(`sendControlCommand : ERROR : ${err.message}`);
    });
}

// Send a key event.
let keyItem = {code: 0x49, pressedTime: 123456789, deviceId: 0};
let event = {action: 2, key: keyItem, keys: [keyItem]};
controller.sendAVKeyEvent(event).then(() => {
    console.info('sendAVKeyEvent Successfully');
}).catch((err) => {
    console.info(`sendAVKeyEvent : ERROR : ${err.message}`);
});

// The user touches the blank area on the widget to start the application.
controller.getLaunchAbility().then((want) => {
    console.log("Starting the application in the foreground");
}).catch((err) => {
    console.info(`getLaunchAbility : ERROR : ${err.message}`);
});

// Send the system key event, reusing the key event defined above.
avSession.sendSystemAVKeyEvent(event).then(() => {
    console.info('sendSystemAVKeyEvent Successfully');
}).catch((err) => {
    console.info(`sendSystemAVKeyEvent : ERROR : ${err.message}`);
});

// Send a system control command to the top session.
let avcommand = {command: 'toggleFavorite', parameter: "false"};
avSession.sendSystemControlCommand(avcommand).then(() => {
    console.info('sendSystemControlCommand successfully');
}).catch((err) => {
    console.info(`sendSystemControlCommand : ERROR : ${err.message}`);
});

// Cast the session to another device.
let audioManager = audio.getAudioManager();
let audioDevices;
await audioManager.getDevices(audio.DeviceFlag.OUTPUT_DEVICES_FLAG).then((data) => {
    audioDevices = data;
    console.info('Promise returned to indicate that the device list is obtained.');
}).catch((err) => {
    console.info(`getDevices : ERROR : ${err.message}`);
});
avSession.castAudio('all', audioDevices).then(() => {
    console.info('castAudio : SUCCESS');
}).catch((err) => {
    console.info(`castAudio : ERROR : ${err.message}`);
});
```
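Every `sendControlCommand` call above is guarded by a `g_validCmd.has(...)` check; wrapping the guard and the dispatch together avoids repeating it per button. A sketch with an injected `send` function standing in for `controller.sendControlCommand`; `trySendCommand` is an illustrative helper, not an AVSession API:

```js
// Dispatch a command only when the session declared it valid;
// returns whether the command was actually sent.
function trySendCommand(validCmds, send, command, parameter) {
    if (!validCmds.has(command)) {
        return false;
    }
    send(parameter === undefined ? { command } : { command, parameter });
    return true;
}
```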
6. Release resources.

```js
// Unsubscribe from the events.
controller.off('metadataChange');
controller.off('playbackStateChange');
controller.off('sessionDestroy');
controller.off('activeStateChange');
controller.off('validCommandChange');
controller.off('outputDeviceChange');

// Destroy the controller.
controller.destroy().then(() => {
    console.info('destroy : SUCCESS ');
}).catch((err) => {
    console.info(`destroy : ERROR : ${err.message}`);
});
```
### Verification

When you touch the play, pause, or next button in Media Controller, the playback state of the application changes accordingly.
# AVSession Overview

> **NOTE**
>
> All APIs of the **AVSession** module are system APIs and can be called only by system applications.
## Overview

AVSession, short for audio and video session, is also known as media session.
The **AVSession** module provides two classes: **AVSession** and **AVSessionController**.
- AVSession can transmit media playback information and control commands. It does not display information or execute control commands.
- Do not develop Media Controller for common applications. For common audio and video applications running on OpenHarmony, the default control end is Media Controller, which is a system application. You do not need to carry out additional development for Media Controller.
- If you want to develop your own system running OpenHarmony, you can develop your own Media Controller.
- For better background management of audio and video applications, the **AVSession** module enforces background control for applications. Only applications that have accessed AVSession can play audio in the background. Otherwise, the system forcibly pauses the playback when an application switches to the background.
import audio from '@ohos.multimedia.audio';
| Name | Type | Readable| Writable| Description |
| --------------------------------------- | ----------| ---- | ---- | ------------------ |
| LOCAL_NETWORK_ID<sup>9+</sup> | string | Yes | No | Network ID of the local device.<br>This is a system API.<br>**System capability**: SystemCapability.Multimedia.Audio.Device |
| DEFAULT_VOLUME_GROUP_ID<sup>9+</sup> | number | Yes | No | Default volume group ID.<br>**System capability**: SystemCapability.Multimedia.Audio.Volume |
| DEFAULT_INTERRUPT_GROUP_ID<sup>9+</sup> | number | Yes | No | Default audio interruption group ID.<br>**System capability**: SystemCapability.Multimedia.Audio.Interrupt |

**Example**
Enumerates the audio stream types.

| Name | Value | Description |
| ---------------------------- | ---- | ---------- |
| VOICE_CALL<sup>8+</sup> | 0 | Audio stream for voice calls.|
| RINGTONE | 2 | Audio stream for ringtones. |
| MEDIA | 3 | Audio stream for media purpose. |
| ALARM<sup>10+</sup> | 4 | Audio stream for alarming. |
| ACCESSIBILITY<sup>10+</sup> | 5 | Audio stream for accessibility. |
| VOICE_ASSISTANT<sup>8+</sup> | 9 | Audio stream for voice assistant.|
| ULTRASONIC<sup>10+</sup> | 10 | Audio stream for ultrasonic.<br>This is a system API.|
| ALL<sup>9+</sup> | 100 | All public audio streams.<br>This is a system API.|
## InterruptRequestResultType<sup>9+</sup>
Enumerates the audio content types.

| CONTENT_TYPE_MOVIE | 3 | Movie. |
| CONTENT_TYPE_SONIFICATION | 4 | Notification tone. |
| CONTENT_TYPE_RINGTONE<sup>8+</sup> | 5 | Ringtone. |
| CONTENT_TYPE_ULTRASONIC<sup>10+</sup> | 9 | Ultrasonic.<br>This is a system API.|
## StreamUsage ## StreamUsage
Enumerates the audio stream usage. Enumerates the audio stream usage.
...@@ -544,7 +547,10 @@ Enumerates the audio stream usage. ...@@ -544,7 +547,10 @@ Enumerates the audio stream usage.
| STREAM_USAGE_MEDIA | 1 | Used for media. | | STREAM_USAGE_MEDIA | 1 | Used for media. |
| STREAM_USAGE_VOICE_COMMUNICATION | 2 | Used for voice communication.| | STREAM_USAGE_VOICE_COMMUNICATION | 2 | Used for voice communication.|
| STREAM_USAGE_VOICE_ASSISTANT<sup>9+</sup> | 3 | Used for voice assistant.| | STREAM_USAGE_VOICE_ASSISTANT<sup>9+</sup> | 3 | Used for voice assistant.|
| STREAM_USAGE_ALARM<sup>10+</sup> | 4 | Used for alarming. |
| STREAM_USAGE_NOTIFICATION_RINGTONE | 6 | Used for notification.| | STREAM_USAGE_NOTIFICATION_RINGTONE | 6 | Used for notification.|
| STREAM_USAGE_ACCESSIBILITY<sup>10+</sup> | 8 | Used for accessibility. |
| STREAM_USAGE_SYSTEM<sup>10+</sup> | 9 | System tone (such as screen lock or keypad tone).<br>This is a system API.|
## InterruptRequestType<sup>9+</sup> ## InterruptRequestType<sup>9+</sup>
@@ -1757,7 +1763,7 @@ Sets a device to the active state. This API uses an asynchronous callback to ret

| Name | Type | Mandatory| Description |
| ---------- | ------------------------------------- | ---- | ------------------------ |
| deviceType | [ActiveDeviceType](#activedevicetypedeprecated) | Yes | Active audio device type. |
| active | boolean | Yes | Active state to set. The value **true** means to set the device to the active state, and **false** means the opposite. |
| callback | AsyncCallback&lt;void&gt; | Yes | Callback used to return the result.|

@@ -1789,7 +1795,7 @@ Sets a device to the active state. This API uses a promise to return the result.

| Name | Type | Mandatory| Description |
| ---------- | ------------------------------------- | ---- | ------------------ |
| deviceType | [ActiveDeviceType](#activedevicetypedeprecated) | Yes | Active audio device type.|
| active | boolean | Yes | Active state to set. The value **true** means to set the device to the active state, and **false** means the opposite. |

**Return value**

@@ -1823,7 +1829,7 @@ Checks whether a device is active. This API uses an asynchronous callback to ret

| Name | Type | Mandatory| Description |
| ---------- | ------------------------------------- | ---- | ------------------------ |
| deviceType | [ActiveDeviceType](#activedevicetypedeprecated) | Yes | Active audio device type. |
| callback | AsyncCallback&lt;boolean&gt; | Yes | Callback used to return the active state of the device.|
**Example**
@@ -1854,7 +1860,7 @@ Checks whether a device is active. This API uses a promise to return the result.

| Name | Type | Mandatory| Description |
| ---------- | ------------------------------------- | ---- | ------------------ |
| deviceType | [ActiveDeviceType](#activedevicetypedeprecated) | Yes | Active audio device type.|

**Return value**
@@ -4568,15 +4574,15 @@ let filePath = path + '/StarWars10s-2C-48000-4SW.wav';
let file = fs.openSync(filePath, fs.OpenMode.READ_ONLY);
let stat = await fs.stat(filePath);
let buf = new ArrayBuffer(bufferSize);
let len = stat.size % bufferSize == 0 ? Math.floor(stat.size / bufferSize) : Math.floor(stat.size / bufferSize + 1);
for (let i = 0; i < len; i++) {
let options = {
offset: i * bufferSize,
length: bufferSize
}
let readsize = await fs.read(file.fd, buf, options)
let writeSize = await new Promise((resolve, reject) => {
audioRenderer.write(buf, (err, writeSize) => {
if (err) {
reject(err)
} else {
@@ -4585,6 +4591,7 @@ for (let i = 0;i < len; i++) {
})
})
}
```
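The `len` computed above is plain ceiling division over the file size: the number of `bufferSize`-byte chunks needed to cover the whole file, counting a partial tail chunk as one. A self-contained restatement of the same arithmetic (plain JavaScript, no OpenHarmony APIs) makes it easy to check:

```javascript
// Number of bufferSize-byte chunks needed to cover `size` bytes.
// Same expression as in the example above, factored into a function.
function chunkCount(size, bufferSize) {
  return size % bufferSize == 0 ? Math.floor(size / bufferSize) : Math.floor(size / bufferSize + 1);
}

console.log(chunkCount(4096, 1024)); // 4: the size divides evenly
console.log(chunkCount(4100, 1024)); // 5: the partial tail chunk counts too
```

For positive inputs this is equivalent to `Math.ceil(size / bufferSize)`.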
### write<sup>8+</sup>
@@ -4621,15 +4628,15 @@ let filePath = path + '/StarWars10s-2C-48000-4SW.wav';
let file = fs.openSync(filePath, fs.OpenMode.READ_ONLY);
let stat = await fs.stat(filePath);
let buf = new ArrayBuffer(bufferSize);
let len = stat.size % bufferSize == 0 ? Math.floor(stat.size / bufferSize) : Math.floor(stat.size / bufferSize + 1);
for (let i = 0; i < len; i++) {
let options = {
offset: i * bufferSize,
length: bufferSize
}
let readsize = await fs.read(file.fd, buf, options)
try {
let writeSize = await audioRenderer.write(buf);
} catch(err) {
console.error(`audioRenderer.write err: ${err}`);
}
@@ -4969,7 +4976,7 @@ For details about the error codes, see [Audio Error Codes](../errorcodes/errorco

| ID | Error Message |
| ------- | ------------------------------ |
| 6800101 | if input parameter value error |

**Example**
......
@@ -2,7 +2,10 @@

> **NOTE**
>
> - The APIs of this module are supported since API version 6. Updates will be marked with a superscript to indicate their earliest API version.
> - The APIs of this module are deprecated since API version 9 and will be retained until API version 13.
> - Certain functionalities have been changed to system APIs and can be used only by system applications. To use these functionalities, call [@ohos.filemanagement.userFileManager](js-apis-userFileManager.md).
> - The functionalities for selecting and storing media assets are still open to common applications. To use these functionalities, call [@ohos.file.picker](js-apis-file-picker.md).

## Modules to Import

```js
@@ -131,17 +134,12 @@ async function example() {
console.info('fileAsset.displayName ' + '0 : ' + fileAsset.displayName);
// Call getNextObject to obtain the next file until the last one.
for (let i = 1; i < count; i++) {
let fileAsset = await fetchFileResult.getNextObject();
console.info('fileAsset.displayName ' + i + ': ' + fileAsset.displayName);
}
// Release the FetchFileResult instance and invalidate it. Other APIs can no longer be called.
fetchFileResult.close();
});
});
}
```
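The `getFirstObject`/`getNextObject`/`close` calls above form a cursor-style traversal of a paged result set. A minimal in-memory mock (a hypothetical stand-in for `FetchFileResult`, not the real @ohos API) shows the same loop shape in isolation:

```javascript
// Hypothetical in-memory stand-in for FetchFileResult, used only to
// illustrate the cursor-style traversal pattern.
class MockFetchResult {
  constructor(items) { this.items = items; this.pos = -1; }
  getCount() { return this.items.length; }
  async getFirstObject() { this.pos = 0; return this.items[0]; }
  async getNextObject() { this.pos += 1; return this.items[this.pos]; }
  close() { this.items = null; } // Invalidate: no further calls allowed.
}

async function traverse(fetchResult) {
  const names = [];
  const count = fetchResult.getCount();
  let asset = await fetchResult.getFirstObject();
  names.push(asset.displayName);
  for (let i = 1; i < count; i++) {
    asset = await fetchResult.getNextObject();
    names.push(asset.displayName);
  }
  fetchResult.close(); // Release the result set once traversal is done.
  return names;
}
```

The key point the docs stress: `close()` must run after the traversal finishes, not before the pending callbacks or awaits complete.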
@@ -199,18 +197,15 @@ async function example() {
console.info('fileAsset.displayName ' + '0 : ' + fileAsset.displayName);
// Call getNextObject to obtain the next file until the last one.
for (let i = 1; i < count; i++) {
let fileAsset = await fetchFileResult.getNextObject();
console.info('fileAsset.displayName ' + i + ': ' + fileAsset.displayName);
}
// Release the FetchFileResult instance and invalidate it. Other APIs can no longer be called.
fetchFileResult.close();
}).catch((error) => {
// Calling getFirstObject fails.
console.error('get first object failed with error: ' + error);
});
}).catch((error) => {
// Calling getFileAssets fails.
console.error('get file assets failed with error: ' + error);
@@ -500,7 +495,7 @@ async function example() {

### getAlbums<sup>7+</sup>

getAlbums(options: MediaFetchOptions, callback: AsyncCallback&lt;Array&lt;Album&gt;&gt;): void

Obtains the albums. This API uses an asynchronous callback to return the result.

@@ -535,7 +530,7 @@ async function example() {

### getAlbums<sup>7+</sup>

getAlbums(options: MediaFetchOptions): Promise&lt;Array&lt;Album&gt;&gt;

Obtains the albums. This API uses a promise to return the result.

@@ -615,7 +610,7 @@ Call this API when you no longer need to use the APIs in the **MediaLibrary** in

media.release()
```
### storeMediaAsset

storeMediaAsset(option: MediaAssetOption, callback: AsyncCallback&lt;string&gt;): void

@@ -623,7 +618,7 @@ Stores a media asset. This API uses an asynchronous callback to return the URI t

> **NOTE**
>
> This API is supported since API version 6 and can be used only by the FA model.

**System capability**: SystemCapability.Multimedia.MediaLibrary.Core

@@ -653,7 +648,7 @@ mediaLibrary.getMediaLibrary().storeMediaAsset(option, (error, value) => {

```
### storeMediaAsset

storeMediaAsset(option: MediaAssetOption): Promise&lt;string&gt;

@@ -661,7 +656,7 @@ Stores a media asset. This API uses a promise to return the URI that stores the

> **NOTE**
>
> This API is supported since API version 6 and can be used only by the FA model.

**System capability**: SystemCapability.Multimedia.MediaLibrary.Core

@@ -694,15 +689,15 @@ mediaLibrary.getMediaLibrary().storeMediaAsset(option).then((value) => {

```
### startImagePreview

startImagePreview(images: Array&lt;string&gt;, index: number, callback: AsyncCallback&lt;void&gt;): void

Starts image preview, with the first image to preview specified. This API can be used to preview local images whose URIs start with **datashare://** or online images whose URIs start with **https://**. It uses an asynchronous callback to return the execution result.

> **NOTE**
> This API is supported since API version 6 and can be used only by the FA model.
> You are advised to use the **\<[Image](../arkui-ts/ts-basic-components-image.md)>** component instead. The **\<Image>** component can be used to render and display local and online images.

**System capability**: SystemCapability.Multimedia.MediaLibrary.Core

@@ -738,15 +733,15 @@ mediaLibrary.getMediaLibrary().startImagePreview(images, index, (error) => {

```
### startImagePreview

startImagePreview(images: Array&lt;string&gt;, callback: AsyncCallback&lt;void&gt;): void

Starts image preview. This API can be used to preview local images whose URIs start with **datashare://** or online images whose URIs start with **https://**. It uses an asynchronous callback to return the execution result.

> **NOTE**
> This API is supported since API version 6 and can be used only by the FA model.
> You are advised to use the **\<[Image](../arkui-ts/ts-basic-components-image.md)>** component instead. The **\<Image>** component can be used to render and display local and online images.

**System capability**: SystemCapability.Multimedia.MediaLibrary.Core

@@ -780,15 +775,15 @@ mediaLibrary.getMediaLibrary().startImagePreview(images, (error) => {

```
### startImagePreview

startImagePreview(images: Array&lt;string&gt;, index?: number): Promise&lt;void&gt;

Starts image preview, with the first image to preview specified. This API can be used to preview local images whose URIs start with **datashare://** or online images whose URIs start with **https://**. It uses a promise to return the execution result.

> **NOTE**
> This API is supported since API version 6 and can be used only by the FA model.
> You are advised to use the **\<[Image](../arkui-ts/ts-basic-components-image.md)>** component instead. The **\<Image>** component can be used to render and display local and online images.

**System capability**: SystemCapability.Multimedia.MediaLibrary.Core

@@ -827,15 +822,15 @@ mediaLibrary.getMediaLibrary().startImagePreview(images, index).then(() => {

```
### startMediaSelect

startMediaSelect(option: MediaSelectOption, callback: AsyncCallback&lt;Array&lt;string&gt;&gt;): void

Starts media selection. This API uses an asynchronous callback to return the list of URIs that store the selected media assets.

> **NOTE**
> This API is supported since API version 6 and can be used only by the FA model.
> You are advised to use the system app Gallery instead. Gallery is a built-in visual resource access application that provides features such as image and video management and browsing. For details about how to use Gallery, visit [OpenHarmony/applications_photos](https://gitee.com/openharmony/applications_photos).

**System capability**: SystemCapability.Multimedia.MediaLibrary.Core

@@ -843,7 +838,7 @@ Starts media selection. This API uses an asynchronous callback to return the lis

| Name | Type | Mandatory | Description |
| -------- | ---------------------------------------- | ---- | ------------------------------------ |
| option | [MediaSelectOption](#mediaselectoption) | Yes | Media selection option. |
| callback | AsyncCallback&lt;Array&lt;string&gt;&gt; | Yes | Callback used to return the list of URIs (starting with **datashare://**) that store the selected media assets.|

**Example**

@@ -864,15 +859,15 @@ mediaLibrary.getMediaLibrary().startMediaSelect(option, (error, value) => {

```
### startMediaSelect

startMediaSelect(option: MediaSelectOption): Promise&lt;Array&lt;string&gt;&gt;

Starts media selection. This API uses a promise to return the list of URIs that store the selected media assets.

> **NOTE**
> This API is supported since API version 6 and can be used only by the FA model.
> You are advised to use the system app Gallery instead. Gallery is a built-in visual resource access application that provides features such as image and video management and browsing. For details about how to use Gallery, visit [OpenHarmony/applications_photos](https://gitee.com/openharmony/applications_photos).

**System capability**: SystemCapability.Multimedia.MediaLibrary.Core

@@ -880,7 +875,7 @@ Starts media selection. This API uses a promise to return the list of URIs that

| Name | Type | Mandatory | Description |
| ------ | --------------------------------------- | ---- | ------- |
| option | [MediaSelectOption](#mediaselectoption) | Yes | Media selection option.|

**Return value**
@@ -1041,7 +1036,6 @@ async function example() {

Provides APIs for encapsulating file asset attributes.

> **NOTE**
> 1. The system attempts to parse the file content if the file is an audio or video file. The actual field values will be restored from the passed values during scanning on some devices.
> 2. Some devices may not support the modification of **orientation**. You are advised to use [ModifyImageProperty](js-apis-image.md#modifyimageproperty9) of the **image** module.
@@ -1923,9 +1917,9 @@ async function example() {
if(i == fetchCount - 1) {
var result = fetchFileResult.isAfterLast();
console.info('mediaLibrary fileAsset isAfterLast result: ' + result);
fetchFileResult.close();
}
}
}
```
@@ -1985,8 +1979,8 @@ async function example() {
return;
}
console.info('getFirstObject successfully, displayName : ' + fileAsset.displayName);
fetchFileResult.close();
})
}
```
@@ -2018,10 +2012,10 @@ async function example() {
let fetchFileResult = await media.getFileAssets(getImageOp);
fetchFileResult.getFirstObject().then((fileAsset) => {
console.info('getFirstObject successfully, displayName: ' + fileAsset.displayName);
fetchFileResult.close();
}).catch((error) => {
console.error('getFirstObject failed with error: ' + error);
});
}
```
@@ -2055,16 +2049,16 @@ async function example() {
};
let fetchFileResult = await media.getFileAssets(getImageOp);
let fileAsset = await fetchFileResult.getFirstObject();
if (!fetchFileResult.isAfterLast()) {
fetchFileResult.getNextObject((error, fileAsset) => {
if (error) {
console.error('fetchFileResult getNextObject failed with error: ' + error);
return;
}
console.log('fetchFileResult getNextObject successfully, displayName: ' + fileAsset.displayName);
fetchFileResult.close();
})
}
}
```
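The pattern above only advances the cursor when it is not already on the last record. The same guard, stripped down to a hypothetical plain-JavaScript cursor object (not the real FetchFileResult API), looks like this:

```javascript
// Guarded cursor advance: only fetch the next record when the cursor
// is not already on the last one (mirrors the isAfterLast check above).
function nextOrNull(cursor) {
  return cursor.isAfterLast() ? null : cursor.next();
}

const cursor = {
  pos: 0,
  data: ['first', 'second'],
  isAfterLast() { return this.pos >= this.data.length - 1; },
  next() { this.pos += 1; return this.data[this.pos]; },
};

console.log(nextOrNull(cursor)); // 'second'
console.log(nextOrNull(cursor)); // null: already past the last record
```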
@@ -2099,14 +2093,14 @@ async function example() {
};
let fetchFileResult = await media.getFileAssets(getImageOp);
let fileAsset = await fetchFileResult.getFirstObject();
if (!fetchFileResult.isAfterLast()) {
fetchFileResult.getNextObject().then((fileAsset) => {
console.info('fetchFileResult getNextObject successfully, displayName: ' + fileAsset.displayName);
fetchFileResult.close();
}).catch((error) => {
console.error('fetchFileResult getNextObject failed with error: ' + error);
})
}
}
```
@@ -2142,8 +2136,8 @@ async function example() {
return;
}
console.info('getLastObject successfully, displayName: ' + fileAsset.displayName);
fetchFileResult.close();
})
}
```
@@ -2175,10 +2169,10 @@ async function example() {
let fetchFileResult = await media.getFileAssets(getImageOp);
fetchFileResult.getLastObject().then((fileAsset) => {
console.info('getLastObject successfully, displayName: ' + fileAsset.displayName);
fetchFileResult.close();
}).catch((error) => {
console.error('getLastObject failed with error: ' + error);
});
}
```
@@ -2215,8 +2209,8 @@ async function example() {
return;
}
console.info('getPositionObject successfully, displayName: ' + fileAsset.displayName);
fetchFileResult.close();
})
}
```
@@ -2254,10 +2248,10 @@ async function example() {
let fetchFileResult = await media.getFileAssets(getImageOp);
fetchFileResult.getPositionObject(0).then((fileAsset) => {
console.info('getPositionObject successfully, displayName: ' + fileAsset.displayName);
fetchFileResult.close();
}).catch((error) => {
console.error('getPositionObject failed with error: ' + error);
});
}
```
@@ -2294,9 +2288,9 @@ async function example() {
}
for (let i = 0; i < fetchFileResult.getCount(); i++) {
console.info('getAllObject fileAssetList ' + i + ' displayName: ' + fileAssetList[i].displayName);
}
fetchFileResult.close();
})
}
```
@@ -2330,10 +2324,10 @@ async function example() {
for (let i = 0; i < fetchFileResult.getCount(); i++) {
console.info('getAllObject fileAssetList ' + i + ' displayName: ' + fileAssetList[i].displayName);
}
fetchFileResult.close();
}).catch((error) => {
console.error('getAllObject failed with error: ' + error);
});
}
```
...@@ -2465,10 +2459,10 @@ async function example() { ...@@ -2465,10 +2459,10 @@ async function example() {
console.error('album getFileAssets failed with error: ' + error); console.error('album getFileAssets failed with error: ' + error);
return; return;
} }
let count = fetchFileResult.getcount(); let count = fetchFileResult.getCount();
console.info('album getFileAssets successfully, count: ' + count); console.info('album getFileAssets successfully, count: ' + count);
fetchFileResult.close();
}); });
fetchFileResult.close();
} }
``` ```
...@@ -2502,7 +2496,7 @@ async function example() { ...@@ -2502,7 +2496,7 @@ async function example() {
selections: '', selections: '',
selectionArgs: [], selectionArgs: [],
}; };
let fileNoArgsfetchOp = { let fileNoArgsfetchOp = {
selections: '', selections: '',
selectionArgs: [], selectionArgs: [],
}; };
...@@ -2510,13 +2504,13 @@ async function example() { ...@@ -2510,13 +2504,13 @@ async function example() {
const albumList = await media.getAlbums(AlbumNoArgsfetchOp); const albumList = await media.getAlbums(AlbumNoArgsfetchOp);
const album = albumList[0]; const album = albumList[0];
// Obtain an album from the album list and obtain all media assets that meet the retrieval options in the album. // Obtain an album from the album list and obtain all media assets that meet the retrieval options in the album.
album.getFileAssets(fileNoArgsfetchOp).then((albumFetchFileResult) => { album.getFileAssets(fileNoArgsfetchOp).then((fetchFileResult) => {
let count = fetchFileResult.getcount(); let count = fetchFileResult.getCount();
console.info('album getFileAssets successfully, count: ' + count); console.info('album getFileAssets successfully, count: ' + count);
fetchFileResult.close();
}).catch((error) => { }).catch((error) => {
console.error('album getFileAssets failed with error: ' + error); console.error('album getFileAssets failed with error: ' + error);
}); });
fetchFileResult.close();
} }
``` ```
...@@ -2555,7 +2549,6 @@ Enumerates media types. ...@@ -2555,7 +2549,6 @@ Enumerates media types.
Enumerates key file information. Enumerates key file information.
> **NOTE** > **NOTE**
>
> The **bucket_id** field may change after file rename or movement. Therefore, you must obtain the field again before using it. > The **bucket_id** field may change after file rename or movement. Therefore, you must obtain the field again before using it.
**System capability**: SystemCapability.Multimedia.MediaLibrary.Core **System capability**: SystemCapability.Multimedia.MediaLibrary.Core
...@@ -2641,14 +2634,10 @@ Describes the image size. ...@@ -2641,14 +2634,10 @@ Describes the image size.
| width | number | Yes | Yes | Image width, in pixels.| | width | number | Yes | Yes | Image width, in pixels.|
| height | number | Yes | Yes | Image height, in pixels.| | height | number | Yes | Yes | Image height, in pixels.|
## MediaAssetOption<sup>(deprecated)</sup> ## MediaAssetOption
Implements the media asset option. Implements the media asset option.
> **NOTE**
>
> This API is deprecated since API version 9.
**System capability**: SystemCapability.Multimedia.MediaLibrary.Core **System capability**: SystemCapability.Multimedia.MediaLibrary.Core
...@@ -2658,17 +2647,13 @@ Implements the media asset option. ...@@ -2658,17 +2647,13 @@ Implements the media asset option.
| mimeType | string | Yes | Yes | Multipurpose Internet Mail Extensions (MIME) type of the media.<br>The value can be 'image/\*', 'video/\*', 'audio/\*' or 'file\*'.| | mimeType | string | Yes | Yes | Multipurpose Internet Mail Extensions (MIME) type of the media.<br>The value can be 'image/\*', 'video/\*', 'audio/\*' or 'file\*'.|
| relativePath | string | Yes | Yes | Custom path for storing media assets, for example, 'Pictures/'. If this parameter is unspecified, media assets are stored in the default path.<br> Default path of images: 'Pictures/'<br> Default path of videos: 'Videos/'<br> Default path of audios: 'Audios/'<br> Default path of files: 'Documents/'| | relativePath | string | Yes | Yes | Custom path for storing media assets, for example, 'Pictures/'. If this parameter is unspecified, media assets are stored in the default path.<br> Default path of images: 'Pictures/'<br> Default path of videos: 'Videos/'<br> Default path of audios: 'Audios/'<br> Default path of files: 'Documents/'|
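For illustration, a **MediaAssetOption**-shaped object built from the fields above might look as follows. This is a hedged sketch only: the `src` path and album name are hypothetical examples, not values from this document.

```javascript
// Illustrative only: a MediaAssetOption-shaped object for saving an image asset.
// The src path and the 'myAlbum' folder name are hypothetical examples.
const assetOption = {
  src: '/data/storage/el2/base/haps/entry/image.jpg', // application sandbox path of the local file
  mimeType: 'image/*',                                // MIME type of the media
  relativePath: 'Pictures/myAlbum/'                   // custom storage path; images default to 'Pictures/'
};
```

If `relativePath` is omitted, the asset falls back to the default path for its media type (for example, `Pictures/` for images).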
## MediaSelectOption<sup>(deprecated)</sup> ## MediaSelectOption
Describes media selection option. Describes media selection option.
> **NOTE**
>
> This API is deprecated since API version 9.
**System capability**: SystemCapability.Multimedia.MediaLibrary.Core **System capability**: SystemCapability.Multimedia.MediaLibrary.Core
| Name | Type | Readable| Writable| Description | | Name | Type | Readable| Writable| Description |
| ----- | ------ | ---- | ---- | -------------------- | | ----- | ------ | ---- | ---- | -------------------- |
| type | 'image' &#124; 'video' &#124; 'media' | Yes | Yes | Media type, which can be **image**, **media**, or **video**. Currently, only **media** is supported.| | type | 'image' &#124; 'video' &#124; 'media' | Yes | Yes | Media type, which can be **image**, **media**, or **video**. Currently, only **media** is supported.|
| count | number | Yes | Yes | Number of media assets selected. The value starts from 1, which indicates that one media asset can be selected. | | count | number | Yes | Yes | Maximum number of media assets that can be selected. The value starts from 1, which indicates that one media asset can be selected. |
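As a sketch of how the table above maps to code, a **MediaSelectOption**-shaped object could be written as follows. The field values are illustrative assumptions; per the table, only **media** is currently supported as the type.

```javascript
// Illustrative only: a MediaSelectOption-shaped object.
// 'media' is the only type currently supported; count = 5 is an example value.
const selectOption = {
  type: 'media', // media type: 'image' | 'video' | 'media'
  count: 5       // at most five media assets can be selected (minimum value is 1)
};
```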