Commit 2b08d7e2 authored by Gloria

Update docs against 15756+15924+15757+15843+15912

Signed-off-by: wusongqing<wusongqing@huawei.com>
Parent 5cc343cf
......@@ -21,19 +21,28 @@ The following figure shows the audio capturer state transitions.
## Constraints
Before developing the audio data collection feature, configure the **ohos.permission.MICROPHONE** permission for your application. For details, see [Permission Application Guide](../security/accesstoken-guidelines.md).
Before developing the audio data collection feature, configure the **ohos.permission.MICROPHONE** permission for your application. For details, see [Permission Application Guide](../security/accesstoken-guidelines.md#declaring-permissions-in-the-configuration-file).
## How to Develop
For details about the APIs, see [AudioCapturer in Audio Management](../reference/apis/js-apis-audio.md#audiocapturer8).
1. Use **createAudioCapturer()** to create an **AudioCapturer** instance.
1. Use **createAudioCapturer()** to create a global **AudioCapturer** instance.
Set parameters of the **AudioCapturer** instance in **audioCapturerOptions**. This instance is used to capture audio, control and obtain the recording state, and register a callback for notification.
```js
import audio from '@ohos.multimedia.audio';
import fs from '@ohos.file.fs'; // Used by the read() call in step 3.
// Perform a self-test on APIs related to audio capturing.
@Entry
@Component
struct AudioCapturerDemo {
@State message: string = 'Hello World'
private audioCapturer: audio.AudioCapturer; // It will be called globally.
async initAudioCapturer(){
let audioStreamInfo = {
samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_44100,
channels: audio.AudioChannel.CHANNEL_1,
......@@ -51,8 +60,10 @@ For details about the APIs, see [AudioCapturer in Audio Management](../reference
capturerInfo: audioCapturerInfo
}
let audioCapturer = await audio.createAudioCapturer(audioCapturerOptions);
this.audioCapturer = await audio.createAudioCapturer(audioCapturerOptions);
console.log('AudioRecLog: Create audio capturer success.');
}
```
2. Use **start()** to start audio recording.
......@@ -60,25 +71,20 @@ For details about the APIs, see [AudioCapturer in Audio Management](../reference
The capturer state will be **STATE_RUNNING** once the audio capturer is started. The application can then begin reading buffers.
```js
import audio from '@ohos.multimedia.audio';
async function startCapturer() {
let state = audioCapturer.state;
async startCapturer() {
let state = this.audioCapturer.state;
// start() can be called only when the audio capturer is in the STATE_PREPARED, STATE_PAUSED, or STATE_STOPPED state.
if (state != audio.AudioState.STATE_PREPARED || state != audio.AudioState.STATE_PAUSED ||
state != audio.AudioState.STATE_STOPPED) {
console.info('Capturer is not in a correct state to start');
return;
}
await audioCapturer.start();
state = audioCapturer.state;
if (state == audio.AudioState.STATE_PREPARED || state == audio.AudioState.STATE_PAUSED ||
state == audio.AudioState.STATE_STOPPED) {
await this.audioCapturer.start();
state = this.audioCapturer.state;
if (state == audio.AudioState.STATE_RUNNING) {
console.info('AudioRecLog: Capturer started');
} else {
console.error('AudioRecLog: Capturer start failed');
}
}
}
```
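The diff above replaces a `!=`/`||` guard with an `==`/`||` one, and the difference matters: by De Morgan's laws, `state != A || state != B` is true for every possible state, so the old guard always returned early. A minimal sketch in plain JavaScript (using hypothetical numeric constants in place of the **audio.AudioState** enum values) illustrates the bug and the fix:

```js
// Hypothetical numeric stand-ins for the audio.AudioState enum values.
const STATE_PREPARED = 1;
const STATE_RUNNING = 2;
const STATE_STOPPED = 3;
const STATE_PAUSED = 5;

// Buggy guard from the old code: `state != A || state != B || state != C`
// is true for ANY state, because no value can equal three constants at once.
function buggyGuardRejects(state) {
  return state != STATE_PREPARED || state != STATE_PAUSED || state != STATE_STOPPED;
}

// Correct guard: start() is allowed only from PREPARED, PAUSED, or STOPPED.
function canStart(state) {
  return state == STATE_PREPARED || state == STATE_PAUSED || state == STATE_STOPPED;
}
```

With the buggy guard, even a capturer in **STATE_PREPARED** would be rejected, so `start()` could never run.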
3. Read the captured audio data and convert it to a byte stream. Call **read()** repeatedly to read the data until the application stops the recording.
......@@ -86,17 +92,15 @@ For details about the APIs, see [AudioCapturer in Audio Management](../reference
The following example shows how to write recorded data into a file.
```js
import fs from '@ohos.file.fs';
let state = audioCapturer.state;
async readData(){
let state = this.audioCapturer.state;
// The read operation can be performed only when the state is STATE_RUNNING.
if (state != audio.AudioState.STATE_RUNNING) {
console.info('Capturer is not in a correct state to read');
return;
}
const path = '/data/data/.pulse_dir/capture_js.wav'; // Path for storing the collected audio file.
let file = fs.openSync(filePath, 0o2);
let file = fs.openSync(path, 0o2);
let fd = file.fd;
if (file !== null) {
console.info('AudioRecLog: file created');
......@@ -104,16 +108,14 @@ For details about the APIs, see [AudioCapturer in Audio Management](../reference
console.info('AudioRecLog: file create : FAILED');
return;
}
if (fd !== null) {
console.info('AudioRecLog: file fd opened in append mode');
}
let numBuffersToCapture = 150; // Write data 150 times.
let count = 0;
while (numBuffersToCapture) {
let bufferSize = await audioCapturer.getBufferSize();
let buffer = await audioCapturer.read(bufferSize, true);
this.bufferSize = await this.audioCapturer.getBufferSize();
let buffer = await this.audioCapturer.read(this.bufferSize, true);
let options = {
offset: count * this.bufferSize,
length: this.bufferSize
......@@ -127,22 +129,23 @@ For details about the APIs, see [AudioCapturer in Audio Management](../reference
numBuffersToCapture--;
count++;
}
}
```
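The loop above writes each captured buffer to the file at offset `count * bufferSize`, so consecutive reads land back to back. The bookkeeping can be sketched in plain JavaScript with an in-memory byte array standing in for the file written via **fs.write()**:

```js
// Sketch of the offset bookkeeping in the capture loop above, using an
// in-memory Uint8Array in place of the file and fs.write() options object.
function writeChunks(chunks, bufferSize) {
  const file = new Uint8Array(chunks.length * bufferSize);
  let count = 0;
  for (const chunk of chunks) {
    // Each chunk lands at offset count * bufferSize, mirroring
    // { offset: count * this.bufferSize, length: this.bufferSize }.
    file.set(chunk, count * bufferSize);
    count++;
  }
  return file;
}
```

If the offset were omitted, every write would overwrite the start of the file instead of appending the next buffer.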
4. Once the recording is complete, call **stop()** to stop the recording.
```js
async function StopCapturer() {
let state = audioCapturer.state;
async stopCapturer() {
let state = this.audioCapturer.state;
// The audio capturer can be stopped only when it is in STATE_RUNNING or STATE_PAUSED state.
if (state != audio.AudioState.STATE_RUNNING && state != audio.AudioState.STATE_PAUSED) {
console.info('AudioRecLog: Capturer is not running or paused');
return;
}
await audioCapturer.stop();
await this.audioCapturer.stop();
state = audioCapturer.state;
state = this.audioCapturer.state;
if (state == audio.AudioState.STATE_STOPPED) {
console.info('AudioRecLog: Capturer stopped');
} else {
......@@ -154,17 +157,17 @@ For details about the APIs, see [AudioCapturer in Audio Management](../reference
5. After the task is complete, call **release()** to release related resources.
```js
async function releaseCapturer() {
let state = audioCapturer.state;
async releaseCapturer() {
let state = this.audioCapturer.state;
// The audio capturer can be released only when it is not in the STATE_RELEASED or STATE_NEW state.
if (state == audio.AudioState.STATE_RELEASED || state == audio.AudioState.STATE_NEW) {
console.info('AudioRecLog: Capturer already released');
return;
}
await audioCapturer.release();
await this.audioCapturer.release();
state = audioCapturer.state;
state = this.audioCapturer.state;
if (state == audio.AudioState.STATE_RELEASED) {
console.info('AudioRecLog: Capturer released');
} else {
......@@ -178,23 +181,20 @@ For details about the APIs, see [AudioCapturer in Audio Management](../reference
You can use the following code to obtain the audio capturer information:
```js
async getAudioCapturerInfo(){
// Obtain the audio capturer state.
let state = audioCapturer.state;
let state = this.audioCapturer.state;
// Obtain the audio capturer information.
let audioCapturerInfo : audio.AuduioCapturerInfo = await audioCapturer.getCapturerInfo();
let audioCapturerInfo : audio.AudioCapturerInfo = await this.audioCapturer.getCapturerInfo();
// Obtain the audio stream information.
let audioStreamInfo : audio.AudioStreamInfo = await audioCapturer.getStreamInfo();
let audioStreamInfo : audio.AudioStreamInfo = await this.audioCapturer.getStreamInfo();
// Obtain the audio stream ID.
let audioStreamId : number = await audioCapturer.getAudioStreamId();
let audioStreamId : number = await this.audioCapturer.getAudioStreamId();
// Obtain the Unix timestamp, in nanoseconds.
let audioTime : number = await audioCapturer.getAudioTime();
let audioTime : number = await this.audioCapturer.getAudioTime();
// Obtain a proper minimum buffer size.
let bufferSize : number = await audioCapturer.getBufferSize();
let bufferSize : number = await this.audioCapturer.getBufferSize();
}
```
7. (Optional) Use **on('markReach')** to subscribe to the mark reached event, and use **off('markReach')** to unsubscribe from the event.
......@@ -202,12 +202,13 @@ For details about the APIs, see [AudioCapturer in Audio Management](../reference
After the mark reached event is subscribed to, when the number of frames collected by the audio capturer reaches the specified value, a callback is triggered and the specified value is returned.
```js
audioCapturer.on('markReach', (reachNumber) => {
async markReach(){
this.audioCapturer.on('markReach', 10, (reachNumber) => {
console.info('Mark reach event Received');
console.info(`The Capturer reached frame: ${reachNumber}`);
});
audioCapturer.off('markReach'); // Unsubscribe from the mark reached event. This event will no longer be listened for.
this.audioCapturer.off('markReach'); // Unsubscribe from the mark reached event. This event will no longer be listened for.
}
```
8. (Optional) Use **on('periodReach')** to subscribe to the period reached event, and use **off('periodReach')** to unsubscribe from the event.
......@@ -215,18 +216,20 @@ For details about the APIs, see [AudioCapturer in Audio Management](../reference
After the period reached event is subscribed to, each time the number of frames collected by the audio capturer reaches the specified value, a callback is triggered and the specified value is returned.
```js
audioCapturer.on('periodReach', (reachNumber) => {
async periodReach(){
this.audioCapturer.on('periodReach', 10, (reachNumber) => {
console.info('Period reach event Received');
console.info(`In this period, the Capturer reached frame: ${reachNumber}`);
});
audioCapturer.off('periodReach'); // Unsubscribe from the period reached event. This event will no longer be listened for.
this.audioCapturer.off('periodReach'); // Unsubscribe from the period reached event. This event will no longer be listened for.
}
```
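The two subscriptions above differ only in when they fire: **markReach** triggers once when the cumulative frame count first reaches the mark, while **periodReach** triggers every time another `period` frames have been collected. A plain-JavaScript stand-in (not the OHOS implementation) makes the distinction concrete:

```js
// Sketch of the two notification semantics: onMark fires once when the
// cumulative frame count first reaches `mark`; onPeriod fires for every
// `period` frames collected.
function makeFrameCounter(mark, period, onMark, onPeriod) {
  let total = 0;
  let markFired = false;
  return function addFrames(n) {
    const before = total;
    total += n;
    if (!markFired && total >= mark) {
      markFired = true;
      onMark(mark);
    }
    // Fire once for each period boundary crossed by this batch of frames.
    const crossed = Math.floor(total / period) - Math.floor(before / period);
    for (let i = 0; i < crossed; i++) onPeriod(period);
  };
}
```

For example, with a mark of 10 and a period of 5, collecting 7 frames twice fires the period callback at 5 and 10 frames but the mark callback only once.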
9. If your application needs to perform some operations when the audio capturer state is updated, it can subscribe to the state change event. When the audio capturer state is updated, the application receives a callback containing the event type.
```js
audioCapturer.on('stateChange', (state) => {
async stateChange(){
this.audioCapturer.on('stateChange', (state) => {
console.info(`AudioCapturerLog: Changed State to : ${state}`)
switch (state) {
case audio.AudioState.STATE_PREPARED:
......@@ -251,4 +254,5 @@ For details about the APIs, see [AudioCapturer in Audio Management](../reference
break;
}
});
}
```
......@@ -19,26 +19,32 @@ The following figure shows the audio renderer state transitions.
![audio-renderer-state](figures/audio-renderer-state.png)
- **PREPARED**: The audio renderer enters this state by calling **create()**.
- **RUNNING**: The audio renderer enters this state by calling **start()** when it is in the **PREPARED** or **STOPPED** state.
- **PAUSED**: The audio renderer enters this state by calling **pause()** when it is in the **RUNNING** state. When audio playback is paused, the application can call **start()** to resume it.
- **STOPPED**: The audio renderer enters this state by calling **stop()** when it is in the **PAUSED** or **RUNNING** state.
- **RELEASED**: The audio renderer enters this state by calling **release()** when it is in the **PREPARED**, **PAUSED**, or **STOPPED** state. In this state, the audio renderer releases all occupied hardware and software resources and will not transit to any other state.
## How to Develop
For details about the APIs, see [AudioRenderer in Audio Management](../reference/apis/js-apis-audio.md#audiorenderer8).
1. Use **createAudioRenderer()** to create an **AudioRenderer** instance.
1. Use **createAudioRenderer()** to create a global **AudioRenderer** instance.
Set parameters of the **AudioRenderer** instance in **audioRendererOptions**. This instance is used to render audio, control and obtain the rendering status, and register a callback for notification.
```js
import audio from '@ohos.multimedia.audio';
import fs from '@ohos.file.fs';
// Perform a self-test on APIs related to audio rendering.
@Entry
@Component
struct AudioRenderer1129 {
private audioRenderer: audio.AudioRenderer;
private bufferSize: number; // Used by the write() call in step 3.
private audioRenderer1: audio.AudioRenderer; // Used in the complete example in step 14.
private audioRenderer2: audio.AudioRenderer; // Used in the complete example in step 14.
async initAudioRender(){
let audioStreamInfo = {
samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_44100,
channels: audio.AudioChannel.CHANNEL_1,
......@@ -54,16 +60,17 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
streamInfo: audioStreamInfo,
rendererInfo: audioRendererInfo
}
let audioRenderer = await audio.createAudioRenderer(audioRendererOptions);
this.audioRenderer = await audio.createAudioRenderer(audioRendererOptions);
console.log("Create audio renderer success.");
}
}
```
2. Use **start()** to start audio rendering.
```js
async function startRenderer() {
let state = audioRenderer.state;
async startRenderer() {
let state = this.audioRenderer.state;
// The audio renderer should be in the STATE_PREPARED, STATE_PAUSED, or STATE_STOPPED state when start() is called.
if (state != audio.AudioState.STATE_PREPARED && state != audio.AudioState.STATE_PAUSED &&
state != audio.AudioState.STATE_STOPPED) {
......@@ -71,9 +78,9 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
return;
}
await audioRenderer.start();
await this.audioRenderer.start();
state = audioRenderer.state;
state = this.audioRenderer.state;
if (state == audio.AudioState.STATE_RUNNING) {
console.info('Renderer started');
} else {
......@@ -86,34 +93,19 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
3. Call **write()** to write data to the buffer.
Read the audio data to be played to the buffer. Call **write()** repeatedly to write the data to the buffer.
Read the audio data to be played into the buffer. Call **write()** repeatedly to write the data to the buffer. Import **fs** from '@ohos.file.fs' as in step 1.
```js
import fs from '@ohos.file.fs';
import audio from '@ohos.multimedia.audio';
async function writeBuffer(buf) {
// The write operation can be performed only when the state is STATE_RUNNING.
if (audioRenderer.state != audio.AudioState.STATE_RUNNING) {
console.error('Renderer is not running, do not write');
return;
}
let writtenbytes = await audioRenderer.write(buf);
console.info(`Actual written bytes: ${writtenbytes} `);
if (writtenbytes < 0) {
console.error('Write buffer failed. check the state of renderer');
}
}
async writeData(){
// Set a proper buffer size for the audio renderer. You can also select a buffer of another size.
const bufferSize = await audioRenderer.getBufferSize();
this.bufferSize = await this.audioRenderer.getBufferSize();
let dir = globalThis.fileDir; // You must use the sandbox path.
const filePath = dir + '/file_example_WAV_2MG.wav'; // The file to render is in the following path: /data/storage/el2/base/haps/entry/files/file_example_WAV_2MG.wav
console.info(`file filePath: ${filePath}`);
let file = fs.openSync(filePath, fs.OpenMode.READ_ONLY);
let stat = await fs.stat(filePath); // Music file information.
let buf = new ArrayBuffer(bufferSize);
let buf = new ArrayBuffer(this.bufferSize);
let len = stat.size % this.bufferSize == 0 ? Math.floor(stat.size / this.bufferSize) : Math.floor(stat.size / this.bufferSize + 1);
for (let i = 0;i < len; i++) {
let options = {
......@@ -133,24 +125,25 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
}
fs.close(file)
await audioRenderer.stop(); // Stop rendering.
await audioRenderer.release(); // Releases the resources.
await this.audioRenderer.stop(); // Stop rendering.
await this.audioRenderer.release(); // Release the resources.
}
```
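The loop count computed above, `stat.size % this.bufferSize == 0 ? Math.floor(stat.size / this.bufferSize) : Math.floor(stat.size / this.bufferSize + 1)`, is simply the number of buffer-sized chunks needed to cover the file, i.e. `Math.ceil(size / bufferSize)`. A small sketch confirms the equivalence:

```js
// Chunk count used by the render loop above: one extra chunk is needed
// whenever the file size is not an exact multiple of the buffer size.
function chunkCount(size, bufferSize) {
  return size % bufferSize == 0
    ? Math.floor(size / bufferSize)
    : Math.floor(size / bufferSize + 1);
}
```

The last chunk may be shorter than `bufferSize`, which is why the loop passes an explicit `length` in the read options.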
4. (Optional) Call **pause()** or **stop()** to pause or stop rendering.
```js
async function pauseRenderer() {
let state = audioRenderer.state;
async pauseRenderer() {
let state = this.audioRenderer.state;
// The audio renderer can be paused only when it is in the STATE_RUNNING state.
if (state != audio.AudioState.STATE_RUNNING) {
console.info('Renderer is not running');
return;
}
await audioRenderer.pause();
await this.audioRenderer.pause();
state = audioRenderer.state;
state = this.audioRenderer.state;
if (state == audio.AudioState.STATE_PAUSED) {
console.info('Renderer paused');
} else {
......@@ -158,17 +151,17 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
}
}
async function stopRenderer() {
let state = audioRenderer.state;
async stopRenderer() {
let state = this.audioRenderer.state;
// The audio renderer can be stopped only when it is in STATE_RUNNING or STATE_PAUSED state.
if (state != audio.AudioState.STATE_RUNNING && state != audio.AudioState.STATE_PAUSED) {
console.info('Renderer is not running or paused');
return;
}
await audioRenderer.stop();
await this.audioRenderer.stop();
state = audioRenderer.state;
state = this.audioRenderer.state;
if (state == audio.AudioState.STATE_STOPPED) {
console.info('Renderer stopped');
} else {
......@@ -180,17 +173,16 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
5. (Optional) Call **drain()** to clear the buffer.
```js
async function drainRenderer() {
let state = audioRenderer.state;
async drainRenderer() {
let state = this.audioRenderer.state;
// drain() can be used only when the audio renderer is in the STATE_RUNNING state.
if (state != audio.AudioState.STATE_RUNNING) {
console.info('Renderer is not running');
return;
}
await audioRenderer.drain();
state = audioRenderer.state;
await this.audioRenderer.drain();
state = this.audioRenderer.state;
}
```
......@@ -199,17 +191,16 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
**AudioRenderer** uses a large number of system resources. Therefore, ensure that the resources are released after the task is complete.
```js
async function releaseRenderer() {
let state = audioRenderer.state;
async releaseRenderer() {
let state = this.audioRenderer.state;
// The audio renderer can be released only when it is not in the STATE_RELEASED or STATE_NEW state.
if (state == audio.AudioState.STATE_RELEASED || state == audio.AudioState.STATE_NEW) {
console.info('Renderer already released');
return;
}
await this.audioRenderer.release();
await audioRenderer.release();
state = audioRenderer.state;
state = this.audioRenderer.state;
if (state == audio.AudioState.STATE_RELEASED) {
console.info('Renderer released');
} else {
......@@ -223,26 +214,22 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
You can use the following code to obtain the audio renderer information:
```js
async getRenderInfo(){
// Obtain the audio renderer state.
let state = audioRenderer.state;
let state = this.audioRenderer.state;
// Obtain the audio renderer information.
let audioRendererInfo : audio.AudioRendererInfo = await audioRenderer.getRendererInfo();
let audioRendererInfo : audio.AudioRendererInfo = await this.audioRenderer.getRendererInfo();
// Obtain the audio stream information.
let audioStreamInfo : audio.AudioStreamInfo = await audioRenderer.getStreamInfo();
let audioStreamInfo : audio.AudioStreamInfo = await this.audioRenderer.getStreamInfo();
// Obtain the audio stream ID.
let audioStreamId : number = await audioRenderer.getAudioStreamId();
let audioStreamId : number = await this.audioRenderer.getAudioStreamId();
// Obtain the Unix timestamp, in nanoseconds.
let audioTime : number = await audioRenderer.getAudioTime();
let audioTime : number = await this.audioRenderer.getAudioTime();
// Obtain a proper minimum buffer size.
let bufferSize : number = await audioRenderer.getBufferSize();
let bufferSize : number = await this.audioRenderer.getBufferSize();
// Obtain the audio renderer rate.
let renderRate : audio.AudioRendererRate = await audioRenderer.getRenderRate();
let renderRate : audio.AudioRendererRate = await this.audioRenderer.getRenderRate();
}
```
8. (Optional) Set the audio renderer information.
......@@ -250,17 +237,17 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
You can use the following code to set the audio renderer information:
```js
async setAudioRenderInfo(){
// Set the audio renderer rate to RENDER_RATE_NORMAL.
let renderRate : audio.AudioRendererRate = audio.AudioRendererRate.RENDER_RATE_NORMAL;
await audioRenderer.setRenderRate(renderRate);
await this.audioRenderer.setRenderRate(renderRate);
// Set the interruption mode of the audio renderer to SHARE_MODE.
let interruptMode : audio.InterruptMode = audio.InterruptMode.SHARE_MODE;
await audioRenderer.setInterruptMode(interruptMode);
await this.audioRenderer.setInterruptMode(interruptMode);
// Set the volume of the stream to 0.5.
let volume : number = 0.5;
await audioRenderer.setVolume(volume);
await this.audioRenderer.setVolume(volume);
}
```
9. (Optional) Use **on('audioInterrupt')** to subscribe to the audio interruption event, and use **off('audioInterrupt')** to unsubscribe from the event.
......@@ -274,7 +261,8 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
It should be noted that the audio interruption event subscription of the **AudioRenderer** module is slightly different from **on('interrupt')** in [AudioManager](../reference/apis/js-apis-audio.md#audiomanager). The **on('interrupt')** and **off('interrupt')** APIs are deprecated since API version 9. In the **AudioRenderer** module, you only need to call **on('audioInterrupt')** to listen for focus change events. When the **AudioRenderer** instance created by the application performs actions such as start, stop, and pause, it requests the focus, which triggers focus transfer and in return enables the related **AudioRenderer** instance to receive a notification through the callback. For instances other than **AudioRenderer**, such as frequency modulation (FM) and voice wakeup, the application does not create an instance. In this case, the application can call **on('interrupt')** in **AudioManager** to receive a focus change notification.
```js
audioRenderer.on('audioInterrupt', (interruptEvent) => {
async subscribeAudioRender(){
this.audioRenderer.on('audioInterrupt', (interruptEvent) => {
console.info('InterruptEvent Received');
console.info(`InterruptType: ${interruptEvent.eventType}`);
console.info(`InterruptForceType: ${interruptEvent.forceType}`);
......@@ -284,11 +272,11 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
switch (interruptEvent.hintType) {
// Forcible pausing initiated by the audio framework. To prevent data loss, stop the write operation.
case audio.InterruptHint.INTERRUPT_HINT_PAUSE:
isPlay = false;
console.info('isPlay is false');
break;
// Forcible stopping initiated by the audio framework. To prevent data loss, stop the write operation.
case audio.InterruptHint.INTERRUPT_HINT_STOP:
isPlay = false;
console.info('isPlay is false');
break;
// Forcible ducking initiated by the audio framework.
case audio.InterruptHint.INTERRUPT_HINT_DUCK:
......@@ -301,18 +289,17 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
switch (interruptEvent.hintType) {
// Notify the application that the rendering starts.
case audio.InterruptHint.INTERRUPT_HINT_RESUME:
startRenderer();
this.startRenderer();
break;
// Notify the application that the audio stream is interrupted. The application then determines whether to continue. (In this example, the application pauses the rendering.)
case audio.InterruptHint.INTERRUPT_HINT_PAUSE:
isPlay = false;
pauseRenderer();
console.info('isPlay is false');
this.pauseRenderer();
break;
}
}
});
audioRenderer.off('audioInterrupt'); // Unsubscribe from the audio interruption event. This event will no longer be listened for.
}
```
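The callback above branches first on **forceType** and then on **hintType**: forced events have already been applied by the audio framework, so the application only updates its own state, while shared events leave the decision to the application. That decision table can be sketched as a pure function in plain JavaScript (the numeric constants are hypothetical stand-ins for the **audio.InterruptForceType** and **audio.InterruptHint** enum values):

```js
// Hypothetical numeric stand-ins for the OHOS interrupt enums.
const INTERRUPT_FORCE = 0;
const INTERRUPT_SHARE = 1;
const HINT_RESUME = 1, HINT_PAUSE = 2, HINT_STOP = 3, HINT_DUCK = 4;

// Pure sketch of the decision logic in the 'audioInterrupt' callback.
function decideAction(forceType, hintType) {
  if (forceType === INTERRUPT_FORCE) {
    // Forcible action already taken by the framework: stop writing to avoid data loss.
    if (hintType === HINT_PAUSE || hintType === HINT_STOP) return 'stop-writing';
    if (hintType === HINT_DUCK) return 'lower-volume';
  } else if (forceType === INTERRUPT_SHARE) {
    // Shared action: the application chooses how to react.
    if (hintType === HINT_RESUME) return 'start-renderer';
    if (hintType === HINT_PAUSE) return 'pause-renderer';
  }
  return 'ignore';
}
```

Keeping the decision logic in a pure function like this also makes the interrupt handling easy to unit-test separately from the renderer.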
10. (Optional) Use **on('markReach')** to subscribe to the mark reached event, and use **off('markReach')** to unsubscribe from the event.
......@@ -320,12 +307,14 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
After the mark reached event is subscribed to, when the number of frames rendered by the audio renderer reaches the specified value, a callback is triggered and the specified value is returned.
```js
audioRenderer.on('markReach', (reachNumber) => {
console.info('Mark reach event Received');
console.info(`The renderer reached frame: ${reachNumber}`);
async markReach(){
this.audioRenderer.on('markReach', 50, (position) => {
if (position == 50) {
console.info('ON Triggered successfully');
}
});
audioRenderer.off('markReach'); // Unsubscribe from the mark reached event. This event will no longer be listened for.
this.audioRenderer.off('markReach'); // Unsubscribe from the mark reached event. This event will no longer be listened for.
}
```
11. (Optional) Use **on('periodReach')** to subscribe to the period reached event, and use **off('periodReach')** to unsubscribe from the event.
......@@ -333,12 +322,13 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
After the period reached event is subscribed to, each time the number of frames rendered by the audio renderer reaches the specified value, a callback is triggered and the specified value is returned.
```js
audioRenderer.on('periodReach', (reachNumber) => {
console.info('Period reach event Received');
async periodReach(){
this.audioRenderer.on('periodReach',10, (reachNumber) => {
console.info(`In this period, the renderer reached frame: ${reachNumber} `);
});
audioRenderer.off('periodReach'); // Unsubscribe from the period reached event. This event will no longer be listened for.
this.audioRenderer.off('periodReach'); // Unsubscribe from the period reached event. This event will no longer be listened for.
}
```
12. (Optional) Use **on('stateChange')** to subscribe to audio renderer state changes.
......@@ -346,10 +336,12 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
After the **stateChange** event is subscribed to, when the audio renderer state changes, a callback is triggered and the audio renderer state is returned.
```js
audioRenderer.on('stateChange', (audioState) => {
async stateChange(){
this.audioRenderer.on('stateChange', (audioState) => {
console.info('State change event Received');
console.info(`Current renderer state is: ${audioState}`);
});
}
```
13. (Optional) Handle exceptions of **on()**.
......@@ -357,24 +349,25 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
If the string or the parameter type passed to **on()** is incorrect, an exception is thrown. In this case, you can use **try catch** to capture the exception.
```js
async errorCall(){
try {
audioRenderer.on('invalidInput', () => { // The string is invalid.
this.audioRenderer.on('invalidInput', () => { // The string is invalid.
})
} catch (err) {
console.info(`Call on function error, ${err}`); // The application throws exception 401.
}
try {
audioRenderer.on(1, () => { // The type of the input parameter is incorrect.
this.audioRenderer.on(1, () => { // The type of the input parameter is incorrect.
})
} catch (err) {
console.info(`Call on function error, ${err}`); // The application throws exception 6800101.
}
}
```
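The pattern in step 13 — validate the argument, throw a coded error, catch at the call site — can be sketched without the OHOS API. This is an illustrative stand-in, not the real **on()** implementation; the error codes mirror the ones mentioned above (401 for an invalid string, 6800101 for a wrong parameter type) as labels only:

```js
// Sketch of the validation behind step 13: on() checks its event-name
// argument and throws a coded error; callers wrap the call in try/catch.
function on(eventName, callback) {
  if (typeof eventName !== 'string') {
    // Wrong parameter type.
    throw { code: 6800101, message: 'input parameter type error' };
  }
  const valid = ['audioInterrupt', 'markReach', 'periodReach', 'stateChange'];
  if (!valid.includes(eventName)) {
    // String is not a supported event name.
    throw { code: 401, message: 'invalid event name: ' + eventName };
  }
  // ... register the callback ...
}
```

Wrapping each subscription in its own try/catch, as the example above does, lets the application report which registration failed instead of aborting all of them.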
14. (Optional) Refer to the complete example of **on('audioInterrupt')**.
Declare audioRenderer1 and audioRenderer2 first. For details, see step 1.
Create **AudioRender1** and **AudioRender2** in an application, configure the independent interruption mode, and call **on('audioInterrupt')** to subscribe to audio interruption events. At the beginning, **AudioRender1** has the focus. When **AudioRender2** attempts to obtain the focus, **AudioRender1** receives a focus transfer notification and the related log information is printed. If the shared mode is used, the log information will not be printed during application running.
```js
async runningAudioRender1(){
let audioStreamInfo = {
......@@ -394,27 +387,27 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
}
// 1.1 Create an instance.
audioRenderer1 = await audio.createAudioRenderer(audioRendererOptions);
this.audioRenderer1 = await audio.createAudioRenderer(audioRendererOptions);
console.info("Create audio renderer 1 success.");
// 1.2 Set the independent mode.
audioRenderer1.setInterruptMode(1).then( data => {
this.audioRenderer1.setInterruptMode(1).then( data => {
console.info('audioRenderer1 setInterruptMode Success!');
}).catch((err) => {
console.error(`audioRenderer1 setInterruptMode Fail: ${err}`);
});
// 1.3 Set the listener.
audioRenderer1.on('audioInterrupt', async(interruptEvent) => {
this.audioRenderer1.on('audioInterrupt', async(interruptEvent) => {
console.info(`audioRenderer1 on audioInterrupt : ${JSON.stringify(interruptEvent)}`)
});
// 1.4 Start rendering.
await audioRenderer1.start();
await this.audioRenderer1.start();
console.info('startAudioRender1 success');
// 1.5 Obtain the buffer size, which is the proper minimum buffer size of the audio renderer. You can also select a buffer of another size.
const bufferSize = await audioRenderer1.getBufferSize();
const bufferSize = await this.audioRenderer1.getBufferSize();
console.info(`audio bufferSize: ${bufferSize}`);
// 1.6 Obtain the original audio data file.
......@@ -432,7 +425,7 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
offset: i * this.bufferSize,
length: this.bufferSize
}
let readsize = await fs.read(file.fd, buf, options)
let readsize = await fs.read(file1.fd, buf, options)
let writeSize = await new Promise((resolve,reject)=>{
this.audioRenderer1.write(buf,(err,writeSize)=>{
if(err){
......@@ -444,8 +437,8 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
})
}
fs.close(file1)
await audioRenderer1.stop(); // Stop rendering.
await audioRenderer1.release(); Releases the resources.
await this.audioRenderer1.stop(); // Stop rendering.
await this.audioRenderer1.release(); // Release the resources.
}
async runningAudioRender2(){
......@@ -466,27 +459,27 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
}
// 2.1 Create another instance.
audioRenderer2 = await audio.createAudioRenderer(audioRendererOptions);
this.audioRenderer2 = await audio.createAudioRenderer(audioRendererOptions);
console.info("Create audio renderer 2 success.");
// 2.2 Set the independent mode.
audioRenderer2.setInterruptMode(1).then( data => {
this.audioRenderer2.setInterruptMode(1).then( data => {
console.info('audioRenderer2 setInterruptMode Success!');
}).catch((err) => {
console.error(`audioRenderer2 setInterruptMode Fail: ${err}`);
});
// 2.3 Set the listener.
audioRenderer2.on('audioInterrupt', async(interruptEvent) => {
this.audioRenderer2.on('audioInterrupt', async(interruptEvent) => {
console.info(`audioRenderer2 on audioInterrupt : ${JSON.stringify(interruptEvent)}`)
});
// 2.4 Start rendering.
await audioRenderer2.start();
await this.audioRenderer2.start();
console.info('startAudioRender2 success');
// 2.5 Obtain the buffer size.
const bufferSize = await audioRenderer2.getBufferSize();
const bufferSize = await this.audioRenderer2.getBufferSize();
console.info(`audio bufferSize: ${bufferSize}`);
// 2.6 Read the original audio data file.
......@@ -504,7 +497,7 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
offset: i * this.bufferSize,
length: this.bufferSize
}
let readsize = await fs.read(file.fd, buf, options)
let readsize = await fs.read(file2.fd, buf, options)
let writeSize = await new Promise((resolve,reject)=>{
this.audioRenderer2.write(buf,(err,writeSize)=>{
if(err){
......@@ -516,25 +509,14 @@ For details about the APIs, see [AudioRenderer in Audio Management](../reference
})
}
fs.close(file2)
await audioRenderer2.stop(); // Stop rendering.
await audioRenderer2.release(); // Releases the resources.
}
async writeBuffer(buf, audioRender) {
let writtenbytes;
await audioRender.write(buf).then((value) => {
writtenbytes = value;
console.info(`Actual written bytes: ${writtenbytes} `);
});
if (typeof(writtenbytes) != 'number' || writtenbytes < 0) {
console.error('get Write buffer failed. check the state of renderer');
}
await this.audioRenderer2.stop(); // Stop rendering.
await this.audioRenderer2.release(); // Release the resources.
}
// Integrated invoking entry.
async test(){
await runningAudioRender1();
await runningAudioRender2();
await this.runningAudioRender1();
await this.runningAudioRender2();
}
```
\ No newline at end of file
......@@ -292,13 +292,13 @@ export class AVPlayerDemo {
async avPlayerDemo() {
// Create an AVPlayer instance.
this.avPlayer = await media.createAVPlayer()
let fdPath = 'fd://'
let pathDir = "/data/storage/el2/base/haps/entry/files" // The path used here is an example. Obtain the path based on project requirements.
// The stream in the path can be pushed to the device by running the "hdc file send D:\xxx\H264_AAC.mp4 /data/app/el2/100/base/ohos.acts.multimedia.media.avplayer/haps/entry/files" command.
let path = pathDir + '/H264_AAC.mp4'
let file = await fs.open(path)
fdPath = fdPath + '' + file.fd
this.avPlayer.url = fdPath
let fileDescriptor = undefined
// Use getRawFileDescriptor of the resource management module to obtain the media assets in the application, and use the fdSrc attribute of the AVPlayer to initialize the media asset.
// For details about the fd, offset, and length parameters, see the Media API. The globalThis.abilityContext parameter is a system environment variable and is saved as a global variable on the main page during system boot.
await globalThis.abilityContext.resourceManager.getRawFileDescriptor('H264_AAC.mp4').then((value) => {
fileDescriptor = {fd: value.fd, offset: value.offset, length: value.length}
})
this.avPlayer.fdSrc = fileDescriptor
}
}
```
......
# AVSession Development
> **NOTE**
>
> All APIs of the **AVSession** module are system APIs and can be called only by system applications.
## Development for the Session Access End
### Basic Concepts
......@@ -26,35 +30,38 @@ Table 1 Common APIs for session access end development
### How to Develop
1. Import the modules.
```js
import avSession from '@ohos.multimedia.avsession';
import wantAgent from '@ohos.wantAgent';
import featureAbility from '@ohos.ability.featureAbility';
```
```js
import avSession from '@ohos.multimedia.avsession';
import wantAgent from '@ohos.wantAgent';
import featureAbility from '@ohos.ability.featureAbility';
```
2. Create and activate a session.
```js
// Define global variables.
let mediaFavorite = false;
let currentSession = null;
let context = featureAbility.getContext();
// Create an audio session.
avSession.createAVSession(context, "AudioAppSample", 'audio').then((session) => {
```js
// Define global variables.
let mediaFavorite = false;
let currentSession = null;
let context = featureAbility.getContext();
// Create an audio session.
avSession.createAVSession(context, "AudioAppSample", 'audio').then((session) => {
currentSession = session;
currentSession.activate(); // Activate the session.
}).catch((err) => {
}).catch((err) => {
console.info(`createAVSession : ERROR : ${err.message}`);
});
```
});
```
3. Set the session information, including:
- Session metadata. In addition to the current media asset ID (mandatory), you can set the title, album, author, duration, and previous/next media asset ID. For details about the session metadata, see **AVMetadata** in the API document.
- Launcher ability, which is implemented by calling an API of **WantAgent**. Generally, **WantAgent** is used to encapsulate want information. For more information, see [wantAgent](../reference/apis/js-apis-wantAgent.md).
- Launcher ability, which is implemented by calling an API of [WantAgent](../reference/apis/js-apis-wantAgent.md). Generally, **WantAgent** is used to encapsulate want information.
- Playback state information.
```js
// Set the session metadata.
let metadata = {
```js
// Set the session metadata.
let metadata = {
assetId: "121278",
title: "lose yourself",
artist: "Eminem",
......@@ -69,17 +76,17 @@ let metadata = {
lyric: "https://www.example.com/example.lrc", // Set it based on your project requirements.
previousAssetId: "121277",
nextAssetId: "121279",
};
currentSession.setAVMetadata(metadata).then(() => {
};
currentSession.setAVMetadata(metadata).then(() => {
console.info('setAVMetadata successfully');
}).catch((err) => {
}).catch((err) => {
console.info(`setAVMetadata : ERROR : ${err.message}`);
});
```
});
```
```js
// Set the launcher ability.
let wantAgentInfo = {
```js
// Set the launcher ability.
let wantAgentInfo = {
wants: [
{
bundleName: "com.neu.setResultOnAbilityResultTest1",
......@@ -89,56 +96,57 @@ let wantAgentInfo = {
operationType: wantAgent.OperationType.START_ABILITIES,
requestCode: 0,
wantAgentFlags:[wantAgent.WantAgentFlags.UPDATE_PRESENT_FLAG]
}
}
wantAgent.getWantAgent(wantAgentInfo).then((agent) => {
wantAgent.getWantAgent(wantAgentInfo).then((agent) => {
currentSession.setLaunchAbility(agent).then(() => {
console.info('setLaunchAbility successfully');
}).catch((err) => {
console.info(`setLaunchAbility : ERROR : ${err.message}`);
});
});
```
});
```
```js
// Set the playback state information.
let PlaybackState = {
```js
// Set the playback state information.
let PlaybackState = {
state: avSession.PlaybackState.PLAYBACK_STATE_STOP,
speed: 1.0,
position:{elapsedTime: 0, updateTime: (new Date()).getTime()},
bufferedTime: 1000,
loopMode: avSession.LoopMode.LOOP_MODE_SEQUENCE,
isFavorite: false,
};
currentSession.setAVPlaybackState(PlaybackState).then(() => {
};
currentSession.setAVPlaybackState(PlaybackState).then(() => {
console.info('setAVPlaybackState successfully');
}).catch((err) => {
}).catch((err) => {
console.info(`setAVPlaybackState : ERROR : ${err.message}`);
});
```
});
```
```js
// Obtain the controller of this session.
currentSession.getController().then((selfController) => {
```js
// Obtain the controller of this session.
currentSession.getController().then((selfController) => {
console.info('getController successfully');
}).catch((err) => {
}).catch((err) => {
console.info(`getController : ERROR : ${err.message}`);
});
```
});
```
```js
// Obtain the output device information.
currentSession.getOutputDevice().then((outputInfo) => {
```js
// Obtain the output device information.
currentSession.getOutputDevice().then((outputInfo) => {
console.info(`getOutputDevice successfully, deviceName : ${outputInfo.deviceName}`);
}).catch((err) => {
}).catch((err) => {
console.info(`getOutputDevice : ERROR : ${err.message}`);
});
```
});
```
4. Subscribe to control command events.
```js
// Subscribe to the 'play' command event.
currentSession.on('play', () => {
```js
// Subscribe to the 'play' command event.
currentSession.on('play', () => {
console.log("Call AudioPlayer.play.");
// Set the playback state information.
currentSession.setAVPlaybackState({state: avSession.PlaybackState.PLAYBACK_STATE_PLAY}).then(() => {
......@@ -146,11 +154,11 @@ currentSession.on('play', () => {
}).catch((err) => {
console.info(`setAVPlaybackState : ERROR : ${err.message}`);
});
});
});
// Subscribe to the 'pause' command event.
currentSession.on('pause', () => {
// Subscribe to the 'pause' command event.
currentSession.on('pause', () => {
console.log("Call AudioPlayer.pause.");
// Set the playback state information.
currentSession.setAVPlaybackState({state: avSession.PlaybackState.PLAYBACK_STATE_PAUSE}).then(() => {
......@@ -158,10 +166,10 @@ currentSession.on('pause', () => {
}).catch((err) => {
console.info(`setAVPlaybackState : ERROR : ${err.message}`);
});
});
});
// Subscribe to the 'stop' command event.
currentSession.on('stop', () => {
// Subscribe to the 'stop' command event.
currentSession.on('stop', () => {
console.log("Call AudioPlayer.stop.");
// Set the playback state information.
currentSession.setAVPlaybackState({state: avSession.PlaybackState.PLAYBACK_STATE_STOP}).then(() => {
......@@ -169,10 +177,10 @@ currentSession.on('stop', () => {
}).catch((err) => {
console.info(`setAVPlaybackState : ERROR : ${err.message}`);
});
});
});
// Subscribe to the 'playNext' command event.
currentSession.on('playNext', () => {
// Subscribe to the 'playNext' command event.
currentSession.on('playNext', () => {
// When the media file is not ready, download and cache the media file, and set the 'PREPARE' state.
currentSession.setAVPlaybackState({state: avSession.PlaybackState.PLAYBACK_STATE_PREPARE}).then(() => {
console.info('setAVPlaybackState successfully');
......@@ -193,10 +201,10 @@ currentSession.on('playNext', () => {
}).catch((err) => {
console.info(`setAVPlaybackState : ERROR : ${err.message}`);
});
});
});
// Subscribe to the 'fastForward' command event.
currentSession.on('fastForward', () => {
// Subscribe to the 'fastForward' command event.
currentSession.on('fastForward', () => {
console.log("Call AudioPlayer for fast forwarding.");
// Set the playback state information.
currentSession.setAVPlaybackState({speed: 2.0}).then(() => {
......@@ -204,10 +212,10 @@ currentSession.on('fastForward', () => {
}).catch((err) => {
console.info(`setAVPlaybackState : ERROR : ${err.message}`);
});
});
});
// Subscribe to the 'seek' command event.
currentSession.on('seek', (time) => {
// Subscribe to the 'seek' command event.
currentSession.on('seek', (time) => {
console.log("Call AudioPlayer.seek.");
// Set the playback state information.
currentSession.setAVPlaybackState({position: {elapsedTime: time, updateTime: (new Date()).getTime()}}).then(() => {
......@@ -215,10 +223,10 @@ currentSession.on('seek', (time) => {
}).catch((err) => {
console.info(`setAVPlaybackState : ERROR : ${err.message}`);
});
});
});
// Subscribe to the 'setSpeed' command event.
currentSession.on('setSpeed', (speed) => {
// Subscribe to the 'setSpeed' command event.
currentSession.on('setSpeed', (speed) => {
console.log(`Call AudioPlayer to set the speed to ${speed}`);
// Set the playback state information.
currentSession.setAVPlaybackState({speed: speed}).then(() => {
......@@ -226,10 +234,10 @@ currentSession.on('setSpeed', (speed) => {
}).catch((err) => {
console.info(`setAVPlaybackState : ERROR : ${err.message}`);
});
});
});
// Subscribe to the 'setLoopMode' command event.
currentSession.on('setLoopMode', (mode) => {
// Subscribe to the 'setLoopMode' command event.
currentSession.on('setLoopMode', (mode) => {
console.log(`The application switches to the loop mode ${mode}`);
// Set the playback state information.
currentSession.setAVPlaybackState({loopMode: mode}).then(() => {
......@@ -237,10 +245,10 @@ currentSession.on('setLoopMode', (mode) => {
}).catch((err) => {
console.info(`setAVPlaybackState : ERROR : ${err.message}`);
});
});
});
// Subscribe to the 'toggleFavorite' command event.
currentSession.on('toggleFavorite', (assetId) => {
// Subscribe to the 'toggleFavorite' command event.
currentSession.on('toggleFavorite', (assetId) => {
console.log(`The application favorites ${assetId}.`);
// Perform the switch based on the last status.
let favorite = !mediaFavorite;
......@@ -250,41 +258,42 @@ currentSession.on('toggleFavorite', (assetId) => {
console.info(`setAVPlaybackState : ERROR : ${err.message}`);
});
mediaFavorite = favorite;
});
});
// Subscribe to the key event.
currentSession.on('handleKeyEvent', (event) => {
// Subscribe to the key event.
currentSession.on('handleKeyEvent', (event) => {
console.log(`User presses the key ${event.keyCode}`);
});
});
// Subscribe to output device changes.
currentSession.on('outputDeviceChange', (device) => {
// Subscribe to output device changes.
currentSession.on('outputDeviceChange', (device) => {
console.log(`Output device changed to ${device.deviceName}`);
});
```
});
```
5. Release resources.
```js
// Unsubscribe from the events.
currentSession.off('play');
currentSession.off('pause');
currentSession.off('stop');
currentSession.off('playNext');
currentSession.off('playPrevious');
currentSession.off('fastForward');
currentSession.off('rewind');
currentSession.off('seek');
currentSession.off('setSpeed');
currentSession.off('setLoopMode');
currentSession.off('toggleFavorite');
currentSession.off('handleKeyEvent');
currentSession.off('outputDeviceChange');
// Deactivate the session and destroy the object.
currentSession.deactivate().then(() => {
```js
// Unsubscribe from the events.
currentSession.off('play');
currentSession.off('pause');
currentSession.off('stop');
currentSession.off('playNext');
currentSession.off('playPrevious');
currentSession.off('fastForward');
currentSession.off('rewind');
currentSession.off('seek');
currentSession.off('setSpeed');
currentSession.off('setLoopMode');
currentSession.off('toggleFavorite');
currentSession.off('handleKeyEvent');
currentSession.off('outputDeviceChange');
// Deactivate the session and destroy the object.
currentSession.deactivate().then(() => {
currentSession.destroy();
});
```
});
```
### Verification
Touch the play, pause, or next button on the media application. Check whether the media playback state changes accordingly.
......@@ -362,22 +371,24 @@ Table 2 Common APIs for session control end development
### How to Develop
1. Import the modules.
```js
import avSession from '@ohos.multimedia.avsession';
import {Action, KeyEvent} from '@ohos.multimodalInput.KeyEvent';
import wantAgent from '@ohos.wantAgent';
import audio from '@ohos.multimedia.audio';
```
```js
import avSession from '@ohos.multimedia.avsession';
import {Action, KeyEvent} from '@ohos.multimodalInput.KeyEvent';
import wantAgent from '@ohos.wantAgent';
import audio from '@ohos.multimedia.audio';
```
2. Obtain the session descriptors and create a controller.
```js
// Define global variables.
let g_controller = new Array<avSession.AVSessionController>();
let g_centerSupportCmd:Set<avSession.AVControlCommandType> = new Set(['play', 'pause', 'playNext', 'playPrevious', 'fastForward', 'rewind', 'seek','setSpeed', 'setLoopMode', 'toggleFavorite']);
let g_validCmd:Set<avSession.AVControlCommandType>;
// Obtain the session descriptors and create a controller.
avSession.getAllSessionDescriptors().then((descriptors) => {
```js
// Define global variables.
let g_controller = new Array<avSession.AVSessionController>();
let g_centerSupportCmd:Set<avSession.AVControlCommandType> = new Set(['play', 'pause', 'playNext', 'playPrevious', 'fastForward', 'rewind', 'seek','setSpeed', 'setLoopMode', 'toggleFavorite']);
let g_validCmd:Set<avSession.AVControlCommandType>;
// Obtain the session descriptors and create a controller.
avSession.getAllSessionDescriptors().then((descriptors) => {
descriptors.forEach((descriptor) => {
avSession.createController(descriptor.sessionId).then((controller) => {
g_controller.push(controller);
......@@ -385,44 +396,45 @@ avSession.getAllSessionDescriptors().then((descriptors) => {
console.error('createController error');
});
});
}).catch((err) => {
}).catch((err) => {
console.error('getAllSessionDescriptors error');
});
});
// Subscribe to the 'sessionCreate' event and create a controller.
avSession.on('sessionCreate', (session) => {
// Subscribe to the 'sessionCreate' event and create a controller.
avSession.on('sessionCreate', (session) => {
// After a session is added, you must create a controller.
avSession.createController(session.sessionId).then((controller) => {
g_controller.push(controller);
}).catch((err) => {
console.info(`createController : ERROR : ${err.message}`);
});
});
```
});
```
3. Subscribe to the session state and service changes.
```js
// Subscribe to the 'activeStateChange' event.
controller.on('activeStateChange', (isActive) => {
```js
// Subscribe to the 'activeStateChange' event.
controller.on('activeStateChange', (isActive) => {
if (isActive) {
console.log ("The widget corresponding to the controller is highlighted.");
} else {
console.log("The widget corresponding to the controller is inactive.");
}
});
});
// Subscribe to the 'sessionDestroy' event to enable Media Controller to get notified when the session dies.
controller.on('sessionDestroy', () => {
// Subscribe to the 'sessionDestroy' event to enable Media Controller to get notified when the session dies.
controller.on('sessionDestroy', () => {
console.info('on sessionDestroy : SUCCESS ');
controller.destroy().then(() => {
console.info('destroy : SUCCESS ');
}).catch((err) => {
console.info(`destroy : ERROR :${err.message}`);
});
});
});
// Subscribe to the 'sessionDestroy' event to enable the application to get notified when the session dies.
avSession.on('sessionDestroy', (session) => {
// Subscribe to the 'sessionDestroy' event to enable the application to get notified when the session dies.
avSession.on('sessionDestroy', (session) => {
let index = g_controller.findIndex((controller) => {
return controller.sessionId == session.sessionId;
});
......@@ -430,10 +442,10 @@ avSession.on('sessionDestroy', (session) => {
g_controller[index].destroy();
g_controller.splice(index, 1);
}
});
});
// Subscribe to the 'topSessionChange' event.
avSession.on('topSessionChange', (session) => {
// Subscribe to the 'topSessionChange' event.
avSession.on('topSessionChange', (session) => {
let index = g_controller.findIndex((controller) => {
return controller.sessionId == session.sessionId;
});
......@@ -443,31 +455,32 @@ avSession.on('topSessionChange', (session) => {
return a.sessionId == session.sessionId ? -1 : 0;
});
}
});
});
// Subscribe to the 'sessionServiceDie' event.
avSession.on('sessionServiceDie', () => {
// Subscribe to the 'sessionServiceDie' event.
avSession.on('sessionServiceDie', () => {
// The server is abnormal, and the application clears resources.
console.log("Server exception");
})
```
})
```
4. Subscribe to media session information changes.
```js
// Subscribe to metadata changes.
let metaFilter = ['assetId', 'title', 'description'];
controller.on('metadataChange', metaFilter, (metadata) => {
```js
// Subscribe to metadata changes.
let metaFilter = ['assetId', 'title', 'description'];
controller.on('metadataChange', metaFilter, (metadata) => {
console.info(`on metadataChange assetId : ${metadata.assetId}`);
});
});
// Subscribe to playback state changes.
let playbackFilter = ['state', 'speed', 'loopMode'];
controller.on('playbackStateChange', playbackFilter, (playbackState) => {
// Subscribe to playback state changes.
let playbackFilter = ['state', 'speed', 'loopMode'];
controller.on('playbackStateChange', playbackFilter, (playbackState) => {
console.info(`on playbackStateChange state : ${playbackState.state}`);
});
});
// Subscribe to supported command changes.
controller.on('validCommandChange', (cmds) => {
// Subscribe to supported command changes.
controller.on('validCommandChange', (cmds) => {
console.info(`validCommandChange : SUCCESS : size : ${cmds.size}`);
console.info(`validCommandChange : SUCCESS : cmds : ${cmds.values()}`);
g_validCmd.clear();
......@@ -476,87 +489,89 @@ controller.on('validCommandChange', (cmds) => {
g_validCmd.add(c);
}
}
});
});
// Subscribe to output device changes.
controller.on('outputDeviceChange', (device) => {
// Subscribe to output device changes.
controller.on('outputDeviceChange', (device) => {
console.info(`on outputDeviceChange device isRemote : ${device.isRemote}`);
});
```
});
```
5. Control the session behavior.
```js
// When the user touches the play button, the control command 'play' is sent to the session.
if (g_validCmd.has('play')) {
```js
// When the user touches the play button, the control command 'play' is sent to the session.
if (g_validCmd.has('play')) {
controller.sendControlCommand({command:'play'}).then(() => {
console.info('sendControlCommand successfully');
}).catch((err) => {
console.info(`sendControlCommand : ERROR : ${err.message}`);
});
}
}
// When the user selects the single loop mode, the corresponding control command is sent to the session.
if (g_validCmd.has('setLoopMode')) {
// When the user selects the single loop mode, the corresponding control command is sent to the session.
if (g_validCmd.has('setLoopMode')) {
controller.sendControlCommand({command: 'setLoopMode', parameter: avSession.LoopMode.LOOP_MODE_SINGLE}).then(() => {
console.info('sendControlCommand successfully');
}).catch((err) => {
console.info(`sendControlCommand : ERROR : ${err.message}`);
});
}
}
// Send a key event.
let keyItem = {code: 0x49, pressedTime: 123456789, deviceId: 0};
let event = {action: 2, key: keyItem, keys: [keyItem]};
controller.sendAVKeyEvent(event).then(() => {
// Send a key event.
let keyItem = {code: 0x49, pressedTime: 123456789, deviceId: 0};
let event = {action: 2, key: keyItem, keys: [keyItem]};
controller.sendAVKeyEvent(event).then(() => {
console.info('sendAVKeyEvent Successfully');
}).catch((err) => {
}).catch((err) => {
console.info(`sendAVKeyEvent : ERROR : ${err.message}`);
});
});
// The user touches the blank area on the widget to start the application.
controller.getLaunchAbility().then((want) => {
// The user touches the blank area on the widget to start the application.
controller.getLaunchAbility().then((want) => {
console.log("Starting the application in the foreground");
}).catch((err) => {
}).catch((err) => {
console.info(`getLaunchAbility : ERROR : ${err.message}`);
});
});
// Send the system key event.
let keyItem = {code: 0x49, pressedTime: 123456789, deviceId: 0};
let event = {action: 2, key: keyItem, keys: [keyItem]};
avSession.sendSystemAVKeyEvent(event).then(() => {
// Send the system key event.
let keyItem = {code: 0x49, pressedTime: 123456789, deviceId: 0};
let event = {action: 2, key: keyItem, keys: [keyItem]};
avSession.sendSystemAVKeyEvent(event).then(() => {
console.info('sendSystemAVKeyEvent Successfully');
}).catch((err) => {
}).catch((err) => {
console.info(`sendSystemAVKeyEvent : ERROR : ${err.message}`);
});
});
// Send a system control command to the top session.
let avcommand = {command: 'toggleFavorite', parameter: "false"};
avSession.sendSystemControlCommand(avcommand).then(() => {
// Send a system control command to the top session.
let avcommand = {command: 'toggleFavorite', parameter: "false"};
avSession.sendSystemControlCommand(avcommand).then(() => {
console.info('sendSystemControlCommand successfully');
}).catch((err) => {
}).catch((err) => {
console.info(`sendSystemControlCommand : ERROR : ${err.message}`);
});
});
// Cast the session to another device.
let audioManager = audio.getAudioManager();
let audioDevices;
await audioManager.getDevices(audio.DeviceFlag.OUTPUT_DEVICES_FLAG).then((data) => {
// Cast the session to another device.
let audioManager = audio.getAudioManager();
let audioDevices;
await audioManager.getDevices(audio.DeviceFlag.OUTPUT_DEVICES_FLAG).then((data) => {
audioDevices = data;
console.info('Promise returned to indicate that the device list is obtained.');
}).catch((err) => {
}).catch((err) => {
console.info(`getDevices : ERROR : ${err.message}`);
});
});
avSession.castAudio('all', audioDevices).then(() => {
avSession.castAudio('all', audioDevices).then(() => {
console.info('castAudio : SUCCESS');
}).catch((err) => {
}).catch((err) => {
console.info(`castAudio : ERROR : ${err.message}`);
});
```
});
```
6. Release resources.
```js
// Unsubscribe from the events.
```js
// Unsubscribe from the events.
controller.off('metadataChange');
controller.off('playbackStateChange');
controller.off('sessionDestroy');
......@@ -570,7 +585,7 @@ avSession.castAudio('all', audioDevices).then(() => {
}).catch((err) => {
console.info(`destroy : ERROR : ${err.message}`);
});
```
```
### Verification
When you touch the play, pause, or next button in Media Controller, the playback state of the application changes accordingly.
......
# AVSession Overview
> **NOTE**
>
> All APIs of the **AVSession** module are system APIs and can be called only by system applications.
## Overview
AVSession, short for audio and video session, is also known as media session.
......@@ -49,4 +53,4 @@ The **AVSession** module provides two classes: **AVSession** and **AVSessionCont
- AVSession can transmit media playback information and control commands. It does not display information or execute control commands.
- There is no need to develop Media Controller for common applications. For common audio and video applications running on OpenHarmony, the default control end is Media Controller, a system application, so no additional development is required.
- If you want to develop your own system running OpenHarmony, you can develop your own Media Controller.
- For better background management of audio and video applications, the **AVSession** module enforces background control for third-party applications. Only third-party applications that have accessed AVSession can play audio in the background. Otherwise, the system forcibly pauses the playback when a third-party application switches to the background.
- For better background management of audio and video applications, the **AVSession** module enforces background control for applications. Only applications that have accessed AVSession can play audio in the background. Otherwise, the system forcibly pauses the playback when an application switches to the background.
......@@ -23,9 +23,9 @@ import audio from '@ohos.multimedia.audio';
| Name | Type | Readable | Writable| Description |
| --------------------------------------- | ----------| ---- | ---- | ------------------ |
| LOCAL_NETWORK_ID<sup>9+</sup> | string | Yes | No | Network ID of the local device.<br>This is a system API.<br>**System capability**: SystemCapability.Multimedia.Audio.Device |
| DEFAULT_VOLUME_GROUP_ID<sup>9+</sup> | number | Yes | No | Default volume group ID.<br>**System capability**: SystemCapability.Multimedia.Audio.Volume |
| DEFAULT_INTERRUPT_GROUP_ID<sup>9+</sup> | number | Yes | No | Default audio interruption group ID.<br>**System capability**: SystemCapability.Multimedia.Audio.Interrupt |
| LOCAL_NETWORK_ID<sup>9+</sup> | string | Yes | No | Network ID of the local device.<br>This is a system API.<br> **System capability**: SystemCapability.Multimedia.Audio.Device |
| DEFAULT_VOLUME_GROUP_ID<sup>9+</sup> | number | Yes | No | Default volume group ID.<br> **System capability**: SystemCapability.Multimedia.Audio.Volume |
| DEFAULT_INTERRUPT_GROUP_ID<sup>9+</sup> | number | Yes | No | Default audio interruption group ID.<br> **System capability**: SystemCapability.Multimedia.Audio.Interrupt |
**Example**
......@@ -349,7 +349,10 @@ Enumerates the audio stream types.
| VOICE_CALL<sup>8+</sup> | 0 | Audio stream for voice calls.|
| RINGTONE | 2 | Audio stream for ringtones. |
| MEDIA | 3 | Audio stream for media purpose. |
| ALARM<sup>10+</sup> | 4 | Audio stream for alarms. |
| ACCESSIBILITY<sup>10+</sup> | 5 | Audio stream for accessibility. |
| VOICE_ASSISTANT<sup>8+</sup> | 9 | Audio stream for voice assistant.|
| ULTRASONIC<sup>10+</sup> | 10 | Audio stream for ultrasound.<br>This is a system API.|
| ALL<sup>9+</sup> | 100 | All public audio streams.<br>This is a system API.|
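
The volume types above are passed to the volume query and control APIs of **AudioManager**. The following is a minimal sketch that queries the media stream volume using the callback-style **getVolume** API of this module (the value range is device-specific, and your application's context and permission requirements still apply):

```js
import audio from '@ohos.multimedia.audio';

let audioManager = audio.getAudioManager();
// Query the current volume of the media stream.
audioManager.getVolume(audio.AudioVolumeType.MEDIA, (err, value) => {
  if (err) {
    console.error(`Failed to obtain the volume. ${err.message}`);
    return;
  }
  console.info(`Media stream volume: ${value}`);
});
```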
## InterruptRequestResultType<sup>9+</sup>
......@@ -531,7 +534,7 @@ Enumerates the audio content types.
| CONTENT_TYPE_MOVIE | 3 | Movie. |
| CONTENT_TYPE_SONIFICATION | 4 | Notification tone. |
| CONTENT_TYPE_RINGTONE<sup>8+</sup> | 5 | Ringtone. |
| CONTENT_TYPE_ULTRASONIC<sup>10+</sup>| 9 | Ultrasonic.<br>This is a system API.|
## StreamUsage
Enumerates the audio stream usage.
......@@ -544,7 +547,10 @@ Enumerates the audio stream usage.
| STREAM_USAGE_MEDIA | 1 | Used for media. |
| STREAM_USAGE_VOICE_COMMUNICATION | 2 | Used for voice communication.|
| STREAM_USAGE_VOICE_ASSISTANT<sup>9+</sup> | 3 | Used for voice assistant.|
| STREAM_USAGE_ALARM<sup>10+</sup> | 4 | Used for alarms. |
| STREAM_USAGE_NOTIFICATION_RINGTONE | 6 | Used for notification.|
| STREAM_USAGE_ACCESSIBILITY<sup>10+</sup> | 8 | Used for accessibility. |
| STREAM_USAGE_SYSTEM<sup>10+</sup> | 9 | System tone (such as screen lock or keypad tone).<br>This is a system API.|
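
A **ContentType** value and a **StreamUsage** value are combined in **AudioRendererInfo** when a renderer stream is created. A minimal configuration sketch follows (the values are illustrative; choose the pair that matches your stream):

```js
import audio from '@ohos.multimedia.audio';

// Describe a music stream played for media purposes.
let audioRendererInfo = {
  content: audio.ContentType.CONTENT_TYPE_MUSIC,
  usage: audio.StreamUsage.STREAM_USAGE_MEDIA,
  rendererFlags: 0 // 0 indicates a normal audio renderer.
};
```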
## InterruptRequestType<sup>9+</sup>
......@@ -1789,7 +1795,7 @@ Sets a device to the active state. This API uses a promise to return the result.
| Name | Type | Mandatory| Description |
| ---------- | ------------------------------------- | ---- | ------------------ |
| deviceType | [ActiveDeviceType](#activedevicetypedeprecated) | Yes | Active audio device type. |
| deviceType | [ActiveDeviceType](#activedevicetypedeprecated) | Yes | Active audio device type.|
| active | boolean | Yes | Active state to set. The value **true** means to set the device to the active state, and **false** means the opposite. |
**Return value**
......@@ -1854,7 +1860,7 @@ Checks whether a device is active. This API uses a promise to return the result.
| Name | Type | Mandatory| Description |
| ---------- | ------------------------------------- | ---- | ------------------ |
| deviceType | [ActiveDeviceType](#activedevicetypedeprecated) | Yes | Active audio device type. |
| deviceType | [ActiveDeviceType](#activedevicetypedeprecated) | Yes | Active audio device type.|
**Return value**
......@@ -4568,15 +4574,15 @@ let filePath = path + '/StarWars10s-2C-48000-4SW.wav';
let file = fs.openSync(filePath, fs.OpenMode.READ_ONLY);
let stat = await fs.stat(path);
let buf = new ArrayBuffer(bufferSize);
let len = stat.size % this.bufferSize == 0 ? Math.floor(stat.size / this.bufferSize) : Math.floor(stat.size / this.bufferSize + 1);
let len = stat.size % bufferSize == 0 ? Math.floor(stat.size / bufferSize) : Math.floor(stat.size / bufferSize + 1);
for (let i = 0;i < len; i++) {
let options = {
offset: i * this.bufferSize,
length: this.bufferSize
offset: i * bufferSize,
length: bufferSize
}
let readsize = await fs.read(file.fd, buf, options)
let writeSize = await new Promise((resolve,reject)=>{
this.audioRenderer.write(buf,(err,writeSize)=>{
audioRenderer.write(buf,(err,writeSize)=>{
if(err){
reject(err)
}else{
......@@ -4585,6 +4591,7 @@ for (let i = 0;i < len; i++) {
})
})
}
```
### write<sup>8+</sup>
......@@ -4621,15 +4628,15 @@ let filePath = path + '/StarWars10s-2C-48000-4SW.wav';
let file = fs.openSync(filePath, fs.OpenMode.READ_ONLY);
let stat = await fs.stat(path);
let buf = new ArrayBuffer(bufferSize);
let len = stat.size % this.bufferSize == 0 ? Math.floor(stat.size / this.bufferSize) : Math.floor(stat.size / this.bufferSize + 1);
let len = stat.size % bufferSize == 0 ? Math.floor(stat.size / bufferSize) : Math.floor(stat.size / bufferSize + 1);
for (let i = 0;i < len; i++) {
let options = {
offset: i * this.bufferSize,
length: this.bufferSize
offset: i * bufferSize,
length: bufferSize
}
let readsize = await fs.read(file.fd, buf, options)
try{
let writeSize = await this.audioRenderer.write(buf);
let writeSize = await audioRenderer.write(buf);
} catch(err) {
console.error(`audioRenderer.write err: ${err}`);
}
......
......@@ -2,7 +2,10 @@
> **NOTE**
>
> The APIs of this module are supported since API version 6. Updates will be marked with a superscript to indicate their earliest API version.
> - The APIs of this module are supported since API version 6. Updates will be marked with a superscript to indicate their earliest API version.
> - The APIs of this module are deprecated since API version 9 and will be retained until API version 13.
> - Certain functionalities have been changed to system APIs and can be used only by system applications. To use these functionalities, call [@ohos.filemanagement.userFileManager](js-apis-userFileManager.md).
> - The functionalities for selecting and storing media assets are still open to common applications. To use these functionalities, call [@ohos.file.picker](js-apis-file-picker.md).
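
For the media asset selection functionality mentioned above, a minimal sketch based on **@ohos.file.picker** follows (the option values are illustrative; see the picker API reference for the full parameter list):

```js
import picker from '@ohos.file.picker';

async function selectImage() {
  let photoSelectOptions = new picker.PhotoSelectOptions();
  photoSelectOptions.MIMEType = picker.PhotoViewMIMETypes.IMAGE_TYPE; // Select images only.
  photoSelectOptions.maxSelectNumber = 1; // Allow at most one selection.
  let photoPicker = new picker.PhotoViewPicker();
  try {
    let photoSelectResult = await photoPicker.select(photoSelectOptions);
    console.info(`Selected URIs: ${photoSelectResult.photoUris}`);
  } catch (err) {
    console.error(`PhotoViewPicker.select failed: ${err.message}`);
  }
}
```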
## Modules to Import
```js
......@@ -131,18 +134,13 @@ async function example() {
console.info('fileAsset.displayName ' + '0 : ' + fileAsset.displayName);
// Call getNextObject to obtain the next file until the last one.
for (let i = 1; i < count; i++) {
fetchFileResult.getNextObject((error, fileAsset) => {
if (fileAsset == undefined) {
console.error('get next object failed with error: ' + error);
return;
}
let fileAsset = await fetchFileResult.getNextObject();
console.info('fileAsset.displayName ' + i + ': ' + fileAsset.displayName);
})
}
});
// Release the FetchFileResult instance and invalidate it. Other APIs can no longer be called.
fetchFileResult.close();
});
});
}
```
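The change above replaces the nested `getNextObject` callback with an `await` inside the `for` loop. The iteration pattern itself can be sketched in plain JavaScript with a mock result set; `FakeFetchFileResult`, `listAll`, and the sample names below are illustrative stand-ins, not part of the real API:

```js
// Mock of a FetchFileResult-like cursor, for illustrating the await-in-a-loop
// pattern only. The real object comes from media.getFileAssets().
class FakeFetchFileResult {
  constructor(names) {
    this.names = names;
    this.pos = 0;
  }
  getCount() {
    return this.names.length;
  }
  async getFirstObject() {
    this.pos = 1;
    return { displayName: this.names[0] };
  }
  async getNextObject() {
    return { displayName: this.names[this.pos++] };
  }
  close() {
    // Release the result set; no-op in this mock.
  }
}

async function listAll(fetchFileResult) {
  const seen = [];
  const count = fetchFileResult.getCount();
  let fileAsset = await fetchFileResult.getFirstObject();
  seen.push(fileAsset.displayName);
  // Sequential awaits keep the cursor position consistent, which the old
  // fire-and-forget callback loop did not guarantee.
  for (let i = 1; i < count; i++) {
    fileAsset = await fetchFileResult.getNextObject();
    seen.push(fileAsset.displayName);
  }
  fetchFileResult.close();
  return seen;
}
```

Awaiting each `getNextObject` before requesting the next item is the point of the migration: results arrive in cursor order, and `close()` runs only after the last read.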
@@ -199,18 +197,15 @@ async function example() {
console.info('fileAsset.displayName ' + '0 : ' + fileAsset.displayName);
// Call getNextObject to obtain the next file until the last one.
for (let i = 1; i < count; i++) {
fetchFileResult.getNextObject().then((fileAsset) => {
let fileAsset = await fetchFileResult.getNextObject();
console.info('fileAsset.displayName ' + i + ': ' + fileAsset.displayName);
}).catch((error) => {
console.error('get next object failed with error: ' + error);
})
}
// Release the FetchFileResult instance and invalidate it. Other APIs can no longer be called.
fetchFileResult.close();
}).catch((error) => {
// Calling getFirstObject fails.
console.error('get first object failed with error: ' + error);
});
// Release the FetchFileResult instance and invalidate it. Other APIs can no longer be called.
fetchFileResult.close();
}).catch((error) => {
// Calling getFileAssets fails.
console.error('get file assets failed with error: ' + error);
@@ -500,7 +495,7 @@ async function example() {
### getAlbums<sup>7+</sup>
getAlbums(options: MediaFetchOptions, callback: AsyncCallback<Array&lt;Album&gt;>): void
getAlbums(options: MediaFetchOptions, callback: AsyncCallback&lt;Array&lt;Album&gt;&gt;): void
Obtains the albums. This API uses an asynchronous callback to return the result.
@@ -535,7 +530,7 @@ async function example() {
### getAlbums<sup>7+</sup>
getAlbums(options: MediaFetchOptions): Promise<Array&lt;Album&gt;>
getAlbums(options: MediaFetchOptions): Promise&lt;Array&lt;Album&gt;&gt;
Obtains the albums. This API uses a promise to return the result.
@@ -615,7 +610,7 @@ Call this API when you no longer need to use the APIs in the **MediaLibrary** in
media.release()
```
### storeMediaAsset<sup>(deprecated)</sup>
### storeMediaAsset
storeMediaAsset(option: MediaAssetOption, callback: AsyncCallback&lt;string&gt;): void
@@ -623,7 +618,7 @@ Stores a media asset. This API uses an asynchronous callback to return the URI t
> **NOTE**
>
> This API is deprecated since API version 9.
> This API is supported since API version 6 and can be used only by the FA model.
**System capability**: SystemCapability.Multimedia.MediaLibrary.Core
@@ -653,7 +648,7 @@ mediaLibrary.getMediaLibrary().storeMediaAsset(option, (error, value) => {
```
### storeMediaAsset<sup>(deprecated)</sup>
### storeMediaAsset
storeMediaAsset(option: MediaAssetOption): Promise&lt;string&gt;
@@ -661,7 +656,7 @@ Stores a media asset. This API uses a promise to return the URI that stores the
> **NOTE**
>
> This API is deprecated since API version 9.
> This API is supported since API version 6 and can be used only by the FA model.
**System capability**: SystemCapability.Multimedia.MediaLibrary.Core
@@ -694,15 +689,15 @@ mediaLibrary.getMediaLibrary().storeMediaAsset(option).then((value) => {
```
### startImagePreview<sup>(deprecated)</sup>
### startImagePreview
startImagePreview(images: Array&lt;string&gt;, index: number, callback: AsyncCallback&lt;void&gt;): void
Starts image preview, with the first image to preview specified. This API can be used to preview local images whose URIs start with **datashare://** or online images whose URIs start with **https://**. It uses an asynchronous callback to return the execution result.
> **NOTE**
>
> This API is deprecated since API version 9. You are advised to use the **\<[Image](../arkui-ts/ts-basic-components-image.md)>** component instead. The **\<Image>** component can be used to render and display local and online images.
> This API is supported since API version 6 and can be used only by the FA model.
> You are advised to use the **\<[Image](../arkui-ts/ts-basic-components-image.md)>** component instead. The **\<Image>** component can be used to render and display local and online images.
**System capability**: SystemCapability.Multimedia.MediaLibrary.Core
@@ -738,15 +733,15 @@ mediaLibrary.getMediaLibrary().startImagePreview(images, index, (error) => {
```
### startImagePreview<sup>(deprecated)</sup>
### startImagePreview
startImagePreview(images: Array&lt;string&gt;, callback: AsyncCallback&lt;void&gt;): void
Starts image preview. This API can be used to preview local images whose URIs start with **datashare://** or online images whose URIs start with **https://**. It uses an asynchronous callback to return the execution result.
> **NOTE**
>
> This API is deprecated since API version 9. You are advised to use the **\<[Image](../arkui-ts/ts-basic-components-image.md)>** component instead. The **\<Image>** component can be used to render and display local and online images.
> This API is supported since API version 6 and can be used only by the FA model.
> You are advised to use the **\<[Image](../arkui-ts/ts-basic-components-image.md)>** component instead. The **\<Image>** component can be used to render and display local and online images.
**System capability**: SystemCapability.Multimedia.MediaLibrary.Core
@@ -780,15 +775,15 @@ mediaLibrary.getMediaLibrary().startImagePreview(images, (error) => {
```
### startImagePreview<sup>(deprecated)</sup>
### startImagePreview
startImagePreview(images: Array&lt;string&gt;, index?: number): Promise&lt;void&gt;
Starts image preview, with the first image to preview specified. This API can be used to preview local images whose URIs start with **datashare://** or online images whose URIs start with **https://**. It uses a promise to return the execution result.
> **NOTE**
>
> This API is deprecated since API version 9. You are advised to use the **\<[Image](../arkui-ts/ts-basic-components-image.md)>** component instead. The **\<Image>** component can be used to render and display local and online images.
> This API is supported since API version 6 and can be used only by the FA model.
> You are advised to use the **\<[Image](../arkui-ts/ts-basic-components-image.md)>** component instead. The **\<Image>** component can be used to render and display local and online images.
**System capability**: SystemCapability.Multimedia.MediaLibrary.Core
@@ -827,15 +822,15 @@ mediaLibrary.getMediaLibrary().startImagePreview(images, index).then(() => {
```
### startMediaSelect<sup>(deprecated)</sup>
### startMediaSelect
startMediaSelect(option: MediaSelectOption, callback: AsyncCallback&lt;Array&lt;string&gt;&gt;): void
Starts media selection. This API uses an asynchronous callback to return the list of URIs that store the selected media assets.
> **NOTE**
>
> This API is deprecated since API version 9. You are advised to use the system app Gallery instead. Gallery is a built-in visual resource access application that provides features such as image and video management and browsing. For details about how to use Gallery, visit [OpenHarmony/applications_photos](https://gitee.com/openharmony/applications_photos).
> This API is supported since API version 6 and can be used only by the FA model.
> You are advised to use the system app Gallery instead. Gallery is a built-in visual resource access application that provides features such as image and video management and browsing. For details about how to use Gallery, visit [OpenHarmony/applications_photos](https://gitee.com/openharmony/applications_photos).
**System capability**: SystemCapability.Multimedia.MediaLibrary.Core
@@ -843,7 +838,7 @@ Starts media selection. This API uses an asynchronous callback to return the lis
| Name | Type | Mandatory | Description |
| -------- | ---------------------------------------- | ---- | ------------------------------------ |
| option | [MediaSelectOption](#mediaselectoptiondeprecated) | Yes | Media selection option. |
| option | [MediaSelectOption](#mediaselectoption) | Yes | Media selection option. |
| callback | AsyncCallback&lt;Array&lt;string&gt;&gt; | Yes | Callback used to return the list of URIs (starting with **datashare://**) that store the selected media assets.|
**Example**
@@ -864,15 +859,15 @@ mediaLibrary.getMediaLibrary().startMediaSelect(option, (error, value) => {
```
### startMediaSelect<sup>(deprecated)</sup>
### startMediaSelect
startMediaSelect(option: MediaSelectOption): Promise&lt;Array&lt;string&gt;&gt;
Starts media selection. This API uses a promise to return the list of URIs that store the selected media assets.
> **NOTE**
>
> This API is deprecated since API version 9. You are advised to use the system app Gallery instead. Gallery is a built-in visual resource access application that provides features such as image and video management and browsing. For details about how to use Gallery, visit [OpenHarmony/applications_photos](https://gitee.com/openharmony/applications_photos).
> This API is supported since API version 6 and can be used only by the FA model.
> You are advised to use the system app Gallery instead. Gallery is a built-in visual resource access application that provides features such as image and video management and browsing. For details about how to use Gallery, visit [OpenHarmony/applications_photos](https://gitee.com/openharmony/applications_photos).
**System capability**: SystemCapability.Multimedia.MediaLibrary.Core
@@ -880,7 +875,7 @@ Starts media selection. This API uses a promise to return the list of URIs that
| Name | Type | Mandatory | Description |
| ------ | --------------------------------------- | ---- | ------- |
| option | [MediaSelectOption](#mediaselectoptiondeprecated) | Yes | Media selection option.|
| option | [MediaSelectOption](#mediaselectoption) | Yes | Media selection option.|
**Return value**
@@ -1041,7 +1036,6 @@ async function example() {
Provides APIs for encapsulating file asset attributes.
> **NOTE**
>
> 1. The system attempts to parse the file content if the file is an audio or video file. The actual field values will be restored from the passed values during scanning on some devices.
> 2. Some devices may not support the modification of **orientation**. You are advised to use [ModifyImageProperty](js-apis-image.md#modifyimageproperty9) of the **image** module.
@@ -1923,9 +1917,9 @@ async function example() {
if(i == fetchCount - 1) {
var result = fetchFileResult.isAfterLast();
console.info('mediaLibrary fileAsset isAfterLast result: ' + result);
fetchFileResult.close();
}
}
fetchFileResult.close();
}
```
@@ -1985,8 +1979,8 @@ async function example() {
return;
}
console.info('getFirstObject successfully, displayName : ' + fileAsset.displayName);
})
fetchFileResult.close();
})
}
```
@@ -2018,10 +2012,10 @@ async function example() {
let fetchFileResult = await media.getFileAssets(getImageOp);
fetchFileResult.getFirstObject().then((fileAsset) => {
console.info('getFirstObject successfully, displayName: ' + fileAsset.displayName);
fetchFileResult.close();
}).catch((error) => {
console.error('getFirstObject failed with error: ' + error);
});
fetchFileResult.close();
}
```
@@ -2055,16 +2049,16 @@ async function example() {
};
let fetchFileResult = await media.getFileAssets(getImageOp);
let fileAsset = await fetchFileResult.getFirstObject();
if (! fetchFileResult.isAfterLast) {
if (!fileAsset.isAfterLast) {
fetchFileResult.getNextObject((error, fileAsset) => {
if (error) {
console.error('fetchFileResult getNextObject failed with error: ' + error);
return;
}
console.log('fetchFileResult getNextObject successfully, displayName: ' + fileAsset.displayName);
fetchFileResult.close();
})
}
fetchFileResult.close();
}
```
@@ -2099,14 +2093,14 @@ async function example() {
};
let fetchFileResult = await media.getFileAssets(getImageOp);
let fileAsset = await fetchFileResult.getFirstObject();
if (! fetchFileResult.isAfterLast) {
if (!fileAsset.isAfterLast) {
fetchFileResult.getNextObject().then((fileAsset) => {
console.info('fetchFileResult getNextObject successfully, displayName: ' + fileAsset.displayName);
fetchFileResult.close();
}).catch((error) => {
console.error('fetchFileResult getNextObject failed with error: ' + error);
})
}
fetchFileResult.close();
}
```
@@ -2142,8 +2136,8 @@ async function example() {
return;
}
console.info('getLastObject successfully, displayName: ' + fileAsset.displayName);
})
fetchFileResult.close();
})
}
```
@@ -2175,10 +2169,10 @@ async function example() {
let fetchFileResult = await media.getFileAssets(getImageOp);
fetchFileResult.getLastObject().then((fileAsset) => {
console.info('getLastObject successfully, displayName: ' + fileAsset.displayName);
fetchFileResult.close();
}).catch((error) => {
console.error('getLastObject failed with error: ' + error);
});
fetchFileResult.close();
}
```
@@ -2215,8 +2209,8 @@ async function example() {
return;
}
console.info('getPositionObject successfully, displayName: ' + fileAsset.displayName);
})
fetchFileResult.close();
})
}
```
@@ -2254,10 +2248,10 @@ async function example() {
let fetchFileResult = await media.getFileAssets(getImageOp);
fetchFileResult.getPositionObject(0).then((fileAsset) => {
console.info('getPositionObject successfully, displayName: ' + fileAsset.displayName);
fetchFileResult.close();
}).catch((error) => {
console.error('getPositionObject failed with error: ' + error);
});
fetchFileResult.close();
}
```
@@ -2295,8 +2289,8 @@ async function example() {
for (let i = 0; i < fetchFileResult.getCount(); i++) {
console.info('getAllObject fileAssetList ' + i + ' displayName: ' + fileAssetList[i].displayName);
}
})
fetchFileResult.close();
})
}
```
@@ -2330,10 +2324,10 @@ async function example() {
for (let i = 0; i < fetchFileResult.getCount(); i++) {
console.info('getAllObject fileAssetList ' + i + ' displayName: ' + fileAssetList[i].displayName);
}
fetchFileResult.close();
}).catch((error) => {
console.error('getAllObject failed with error: ' + error);
});
fetchFileResult.close();
}
```
@@ -2465,10 +2459,10 @@ async function example() {
console.error('album getFileAssets failed with error: ' + error);
return;
}
let count = fetchFileResult.getcount();
let count = fetchFileResult.getCount();
console.info('album getFileAssets successfully, count: ' + count);
});
fetchFileResult.close();
});
}
```
@@ -2510,13 +2504,13 @@ async function example() {
const albumList = await media.getAlbums(AlbumNoArgsfetchOp);
const album = albumList[0];
// Obtain an album from the album list and obtain all media assets that meet the retrieval options in the album.
album.getFileAssets(fileNoArgsfetchOp).then((albumFetchFileResult) => {
let count = fetchFileResult.getcount();
album.getFileAssets(fileNoArgsfetchOp).then((fetchFileResult) => {
let count = fetchFileResult.getCount();
console.info('album getFileAssets successfully, count: ' + count);
fetchFileResult.close();
}).catch((error) => {
console.error('album getFileAssets failed with error: ' + error);
});
fetchFileResult.close();
}
```
@@ -2555,7 +2549,6 @@ Enumerates media types.
Enumerates key file information.
> **NOTE**
>
> The **bucket_id** field may change after file rename or movement. Therefore, you must obtain the field again before using it.
**System capability**: SystemCapability.Multimedia.MediaLibrary.Core
@@ -2641,14 +2634,10 @@ Describes the image size.
| width | number | Yes | Yes | Image width, in pixels.|
| height | number | Yes | Yes | Image height, in pixels.|
## MediaAssetOption<sup>(deprecated)</sup>
## MediaAssetOption
Implements the media asset option.
> **NOTE**
>
> This API is deprecated since API version 9.
**System capability**: SystemCapability.Multimedia.MediaLibrary.Core
@@ -2658,17 +2647,13 @@ Implements the media asset option.
| mimeType | string | Yes | Yes | Multipurpose Internet Mail Extensions (MIME) type of the media.<br>The value can be 'image/\*', 'video/\*', 'audio/\*', or 'file/\*'.|
| relativePath | string | Yes | Yes | Custom path for storing media assets, for example, 'Pictures/'. If this parameter is unspecified, media assets are stored in the default path.<br> Default path of images: 'Pictures/'<br> Default path of videos: 'Videos/'<br> Default path of audios: 'Audios/'<br> Default path of files: 'Documents/'|
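The two fields above can be assembled into a plain option object before it is passed to `storeMediaAsset`. The sketch below is illustrative only: the MIME value and relative path are example data, and the `isSupportedMimeType` helper is not part of the API — it merely checks against the families listed in the table:

```js
// Sanity-check a MIME type against the documented families:
// image/*, video/*, audio/*, file/*. Helper is illustrative, not an API.
function isSupportedMimeType(mimeType) {
  return /^(image|video|audio|file)\//.test(mimeType);
}

// Example MediaAssetOption using only the fields listed in the table above.
let option = {
  mimeType: 'video/mp4',
  relativePath: 'Videos/', // omit to fall back to the default path for the type
};
```

With the option built, it would be passed to `storeMediaAsset` as shown in the examples earlier in this reference.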
## MediaSelectOption<sup>(deprecated)</sup>
## MediaSelectOption
Describes media selection option.
> **NOTE**
>
> This API is deprecated since API version 9.
**System capability**: SystemCapability.Multimedia.MediaLibrary.Core
| Name | Type | Readable| Writable| Description |
| ----- | ------ | ---- | ---- | -------------------- |
| type | 'image' &#124; 'video' &#124; 'media' | Yes | Yes | Media type, which can be **image**, **media**, or **video**. Currently, only **media** is supported.|
| count | number | Yes | Yes | Number of media assets selected. The value starts from 1, which indicates that one media asset can be selected. |
| count | number | Yes | Yes | Maximum number of media assets that can be selected. The value starts from 1, which indicates that one media asset can be selected. |
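The two fields above combine into the selection option passed to `startMediaSelect`. The sketch below is illustrative: the `makeMediaSelectOption` helper and its clamping to the documented minimum of 1 are not part of the API, just a convenient way to build a well-formed option:

```js
// Build a MediaSelectOption. Only 'media' is currently supported as the type,
// and count must be at least 1 per the table above. Helper is illustrative.
function makeMediaSelectOption(count) {
  return {
    type: 'media',
    count: Math.max(1, count),
  };
}

let option = makeMediaSelectOption(5);
// The option would then be passed to
// mediaLibrary.getMediaLibrary().startMediaSelect(option, ...) as shown above.
```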