Commit a0238e88 authored by: G Gloria

Update docs against 17810+17946+18006+17887+17995+18024+18113+18030+18082+17808+18662+18378

Signed-off-by: wusongqing<wusongqing@huawei.com>
Parent: c1182727
......@@ -14,7 +14,7 @@ The audio interruption policy determines the operations (for example, pause, res
Two audio interruption modes, specified by [InterruptMode](../reference/apis/js-apis-audio.md#interruptmode9), are preset in the audio interruption policy:
- **SHARED_MODE**: Multiple audio streams created by an application share one audio focus. The concurrency rules between these audio streams are determined by the application, without the use of the audio interruption policy. However, if another application needs to play audio while one of these audio streams is being played, the audio interruption policy is triggered.
- **SHARE_MODE**: Multiple audio streams created by an application share one audio focus. The concurrency rules between these audio streams are determined by the application, without the use of the audio interruption policy. However, if another application needs to play audio while one of these audio streams is being played, the audio interruption policy is triggered.
- **INDEPENDENT_MODE**: Each audio stream created by an application has an independent audio focus. When multiple audio streams are played concurrently, the audio interruption policy is triggered.
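The mode is applied per audio stream. As a hedged minimal sketch (the `renderer` parameter is an assumption, created beforehand with `audio.createAudioRenderer()`), switching a stream to **INDEPENDENT_MODE** could look like this:

```ts
import audio from '@ohos.multimedia.audio';

// Minimal sketch: set the audio interruption mode on an existing AudioRenderer.
// The renderer argument is assumed to have been created with audio.createAudioRenderer().
async function useIndependentMode(renderer: audio.AudioRenderer) {
  // In INDEPENDENT_MODE, this stream holds its own audio focus, so concurrent playback
  // by other streams triggers the audio interruption policy.
  await renderer.setInterruptMode(audio.InterruptMode.INDEPENDENT_MODE);
}
```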
......
......@@ -8,7 +8,7 @@ OpenHarmony provides multiple classes for you to develop audio playback applicat
- [AudioRenderer](using-audiorenderer-for-playback.md): provides ArkTS and JS API to implement audio output. It supports only the PCM format and requires applications to continuously write audio data. The applications can perform data preprocessing, for example, setting the sampling rate and bit width of audio files, before audio input. This class can be used to develop more professional and diverse playback applications. To use this class, you must have basic audio processing knowledge.
- [OpenSLES](using-opensl-es-for-playback.md): provides a set of standard, cross-platform, yet unique native audio APIs. It supports audio output in PCM format and is applicable to playback applications that are ported from other embedded platforms or that implements audio output at the native layer.
- [OpenSL ES](using-opensl-es-for-playback.md): provides a set of standard, cross-platform, yet unique native audio APIs. It supports audio output in PCM format and is applicable to playback applications that are ported from other embedded platforms or that implement audio output at the native layer.
- [TonePlayer](using-toneplayer-for-playback.md): provides ArkTS and JS APIs to implement the playback of dialing tones and ringback tones. It can be used to play content selected from a fixed type range, without requiring the input of media assets or audio data. This class is applicable to specific scenarios where dialing tones and ringback tones are played. It is available only to system applications.
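Of the classes above, AVPlayer is the simplest end-to-end path. The sketch below is a hedged minimal example of playing a local file with it; the helper function name and the sandbox path passed in are illustrative assumptions.

```ts
import media from '@ohos.multimedia.media';
import fs from '@ohos.file.fs';

// Minimal sketch: play a complete local audio file with AVPlayer.
// The path argument (an application sandbox path) is an illustrative assumption.
async function playWithAVPlayer(path: string) {
  let avPlayer = await media.createAVPlayer();
  avPlayer.on('stateChange', async (state) => {
    if (state === 'initialized') {
      await avPlayer.prepare(); // Parse the media source.
    } else if (state === 'prepared') {
      avPlayer.play();          // AVPlayer handles demuxing, decoding, and output internally.
    }
  });
  let file = await fs.open(path);
  avPlayer.url = 'fd://' + file.fd; // Setting url triggers the reporting of the initialized state.
}
```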
......
......@@ -8,7 +8,7 @@ OpenHarmony provides multiple classes for you to develop audio recording applica
- [AudioCapturer](using-audiocapturer-for-recording.md): provides ArkTS and JS API to implement audio input. It supports only the PCM format and requires applications to continuously read audio data. The application can perform data processing after audio output. This class can be used to develop more professional and diverse recording applications. To use this class, you must have basic audio processing knowledge.
- [OpenSLES](using-opensl-es-for-recording.md): provides a set of standard, cross-platform, yet unique native audio APIs. It supports audio input in PCM format and is applicable to recording applications that are ported from other embedded platforms or that implements audio input at the native layer.
- [OpenSL ES](using-opensl-es-for-recording.md): provides a set of standard, cross-platform, yet unique native audio APIs. It supports audio input in PCM format and is applicable to recording applications that are ported from other embedded platforms or that implement audio input at the native layer.
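For orientation, the sketch below shows a hedged minimal AudioCapturer setup for PCM recording; the sampling rate, channel count, and sample format are example values, not requirements.

```ts
import audio from '@ohos.multimedia.audio';

// Minimal sketch: create an AudioCapturer for PCM recording (the stream parameters are example values).
async function createCapturer(): Promise<audio.AudioCapturer> {
  let options: audio.AudioCapturerOptions = {
    streamInfo: {
      samplingRate: audio.AudioSamplingRate.SAMPLE_RATE_44100,
      channels: audio.AudioChannel.CHANNEL_2,
      sampleFormat: audio.AudioSampleFormat.SAMPLE_FORMAT_S16LE,
      encodingType: audio.AudioEncodingType.ENCODING_TYPE_RAW
    },
    capturerInfo: {
      source: audio.SourceType.SOURCE_TYPE_MIC,
      capturerFlags: 0
    }
  };
  // After creation, the application calls start() and then reads PCM data in a loop.
  return await audio.createAudioCapturer(options);
}
```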
## Precautions for Developing Audio Recording Applications
......
......@@ -59,6 +59,7 @@ The table below lists the supported protocols.
| -------- | -------- |
| Local VOD| The file descriptor is supported, but the file path is not.|
| Network VOD| HTTP, HTTPS, and HLS are supported.|
| Live webcasting| HLS is supported.|
The table below lists the supported audio playback formats.
......
......@@ -2,7 +2,7 @@
## Multimedia Subsystem Architecture
The multimedia subsystem provides the capability of processing users' visual and auditory information. For example, it can be used to collect, compress, store, decompress, and play audio and video information. Based on the type of media information to process, the media system is usually divided into four modules: audio, media, camera, and image.
The multimedia subsystem provides the capability of processing users' visual and auditory information. For example, it can be used to collect, compress, store, decompress, and play audio and video information. Based on the type of media information to process, the multimedia subsystem is usually divided into four modules: audio, media, camera, and image.
As shown in the figure below, the multimedia subsystem provides APIs for developing audio/video, camera, and gallery applications, and provides adaptation and acceleration for different hardware chips. In the middle part, it provides core media functionalities and management mechanisms in the form of services.
......
......@@ -151,9 +151,6 @@ export default class AudioRendererDemo {
console.info(`${TAG}: creating AudioRenderer success`);
this.renderModel = renderer;
this.renderModel.on('stateChange', (state) => { // Set the events to listen for. A callback is invoked when the AudioRenderer is switched to the specified state.
if (state == 1) {
console.info('audio renderer state is: STATE_PREPARED');
}
if (state == 2) {
console.info('audio renderer state is: STATE_RUNNING');
}
......
......@@ -12,7 +12,7 @@ During application development, you can use the **state** attribute of the AVPla
**Figure 1** Playback state transition
![Playback state change](figures/playback-status-change.png)
![Playback status change](figures/playback-status-change.png)
For details about the state, see [AVPlayerState](../reference/apis/js-apis-media.md#avplayerstate9). When the AVPlayer is in the **prepared**, **playing**, **paused**, or **completed** state, the playback engine is working and a large amount of RAM is occupied. If your application does not need to use the AVPlayer, call **reset()** or **release()** to release the instance.
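As a hedged illustration of that advice (assuming the AVPlayer instance is stored in `this.avPlayer`, in the same class-method style as the demos below), releasing the instance could look like this:

```ts
// Minimal sketch: release the playback engine when the AVPlayer is no longer needed.
async releasePlayer() {
  if (this.avPlayer) {
    await this.avPlayer.release(); // Destroys the playback engine and frees the memory it occupies.
    this.avPlayer = undefined;
  }
}
```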
......@@ -68,7 +68,9 @@ import common from '@ohos.app.ability.common';
export class AVPlayerDemo {
private avPlayer;
private count: number = 0;
private isSeek: boolean = true; // Specify whether the seek operation is supported.
private fileSize: number = -1;
private fd: number = 0;
// Set AVPlayer callback functions.
setAVPlayerCallback() {
// Callback function for the seek operation.
......@@ -102,8 +104,13 @@ export class AVPlayerDemo {
case 'playing': // This state is reported upon a successful callback of play().
console.info('AVPlayer state playing called.');
if (this.count !== 0) {
if (this.isSeek) {
console.info('AVPlayer start to seek.');
this.avPlayer.seek(this.avPlayer.duration); // Call seek() to seek to the end of the audio clip.
} else {
// When the seek operation is not supported, the playback continues until it reaches the end.
console.info('AVPlayer wait to play end.');
}
} else {
this.avPlayer.pause(); // Call pause() to pause the playback.
}
......@@ -145,6 +152,7 @@ export class AVPlayerDemo {
// Open the corresponding file address to obtain the file descriptor and assign a value to the URL to trigger the reporting of the initialized state.
let file = await fs.open(path);
fdPath = fdPath + '' + file.fd;
this.isSeek = true; // The seek operation is supported.
this.avPlayer.url = fdPath;
}
......@@ -158,10 +166,85 @@ export class AVPlayerDemo {
// The return type is {fd,offset,length}, where fd indicates the file descriptor address of the HAP file, offset indicates the media asset offset, and length indicates the duration of the media asset to play.
let context = getContext(this) as common.UIAbilityContext;
let fileDescriptor = await context.resourceManager.getRawFd('01.mp3');
this.isSeek = true; // The seek operation is supported.
// Assign a value to fdSrc to trigger the reporting of the initialized state.
this.avPlayer.fdSrc = fileDescriptor;
}
// The following demo shows how to use the file system to open the sandbox address, obtain the media file address, and play the media file with the seek operation using the dataSrc attribute.
async avPlayerDataSrcSeekDemo() {
// Create an AVPlayer instance.
this.avPlayer = await media.createAVPlayer();
// Set a callback function for state changes.
this.setAVPlayerCallback();
// dataSrc indicates the playback source address. When the seek operation is supported, fileSize indicates the size of the file to be played. The following describes how to assign a value to fileSize.
let src = {
fileSize: -1,
callback: (buf, length, pos) => {
let num = 0;
if (buf == undefined || length == undefined || pos == undefined) {
return -1;
}
num = fs.readSync(this.fd, buf, { offset: pos, length: length });
if (num > 0 && (this.fileSize >= pos)) {
return num;
}
return -1;
}
}
let context = getContext(this) as common.UIAbilityContext;
// Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
let pathDir = context.filesDir;
let path = pathDir + '/01.mp3';
await fs.open(path).then((file) => {
this.fd = file.fd;
})
// Obtain the size of the file to be played.
this.fileSize = fs.statSync(path).size;
src.fileSize = this.fileSize;
this.isSeek = true; // The seek operation is supported.
this.avPlayer.dataSrc = src;
}
// The following demo shows how to use the file system to open the sandbox address, obtain the media file address, and play the media file without the seek operation using the dataSrc attribute.
async avPlayerDataSrcNoSeekDemo() {
// Create an AVPlayer instance.
this.avPlayer = await media.createAVPlayer();
// Set a callback function for state changes.
this.setAVPlayerCallback();
let context = getContext(this) as common.UIAbilityContext;
let src: object = {
fileSize: -1,
callback: (buf, length, pos) => {
let num = 0;
if (buf == undefined || length == undefined) {
return -1;
}
num = fs.readSync(this.fd, buf);
if (num > 0) {
return num;
}
return -1;
}
}
// Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
let pathDir = context.filesDir;
let path = pathDir + '/01.mp3';
await fs.open(path).then((file) => {
this.fd = file.fd;
})
this.isSeek = false; // The seek operation is not supported.
this.avPlayer.dataSrc = src;
}
// The following demo shows how to play live streams by setting the network address through the URL.
async avPlayerLiveDemo() {
// Create an AVPlayer instance.
this.avPlayer = await media.createAVPlayer();
// Set a callback function for state changes.
this.setAVPlayerCallback();
this.isSeek = false; // The seek operation is not supported.
this.avPlayer.url = 'http://xxx.xxx.xxx.xxx:xx/xx/index.m3u8';
}
}
```
<!--no_check-->
\ No newline at end of file
# AVSession Controller
Media Controller preset in OpenHarmony functions as the controller to interact with audio and video applications, for example, obtaining and displaying media information and delivering control commands.
Media Controller preset in OpenHarmony functions as the controller to interact with audio and video applications, for example, obtaining and displaying media information and delivering playback control commands.
You can develop a system application (for example, a new playback control center or voice assistant) as the controller to interact with audio and video applications in the system.
......@@ -8,24 +8,50 @@ You can develop a system application (for example, a new playback control center
- AVSessionDescriptor: session information, including the session ID, session type (audio/video), custom session name (**sessionTag**), information about the corresponding application (**elementName**), and whether the session is pinned on top (**isTopSession**).
- Top session: session with the highest priority in the system, for example, a session that is being played. Generally, the controller must hold an **AVSessionController** object to communicate with a session. However, the controller can directly communicate with the top session, for example, directly sending a control command or key event, without holding an **AVSessionController** object.
- Top session: session with the highest priority in the system, for example, a session that is being played. Generally, the controller must hold an **AVSessionController** object to communicate with a session. However, the controller can directly communicate with the top session, for example, directly sending a playback control command or key event, without holding an **AVSessionController** object.
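As a hedged sketch of this top-session shortcut, a system application could deliver a command without first creating a controller (the 'play' command here is only an example):

```ts
import AVSessionManager from '@ohos.multimedia.avsession';

// Minimal sketch: send a playback control command directly to the top session,
// without holding an AVSessionController (system applications only).
async function playTopSession() {
  let avCommand: AVSessionManager.AVControlCommand = { command: 'play' };
  AVSessionManager.sendSystemControlCommand(avCommand).then(() => {
    console.info(`sendSystemControlCommand to top session : SUCCESS`);
  }).catch((err) => {
    console.error(`Failed to send system control command. Code: ${err.code}, message: ${err.message}`);
  });
}
```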
## Available APIs
The table below lists the key APIs used by the controller. The APIs use either a callback or promise to return the result. The APIs listed below use a callback. They provide the same functions as their counterparts that use a promise.
The key APIs used by the controller are classified into the following types:
1. APIs called by the **AVSessionManager** object, which is obtained by means of import. An example API is **AVSessionManager.createController(sessionId)**.
2. APIs called by the **AVSessionController** object. An example API is **controller.getAVPlaybackState()**.
Asynchronous JavaScript APIs use either a callback or promise to return the result. The APIs listed below use a callback. They provide the same functions as their counterparts that use a promise.
For details, see [AVSession Management](../reference/apis/js-apis-avsession.md).
### APIs Called by the AVSessionManager Object
| API| Description|
| -------- | -------- |
| getAllSessionDescriptors(callback: AsyncCallback&lt;Array&lt;Readonly&lt;AVSessionDescriptor&gt;&gt;&gt;): void | Obtains the descriptors of all AVSessions in the system.|
| createController(sessionId: string, callback: AsyncCallback&lt;AVSessionController&gt;): void | Creates an AVSessionController.|
| getValidCommands(callback: AsyncCallback&lt;Array&lt;AVControlCommandType&gt;&gt;): void | Obtains valid commands supported by the AVSession.<br>Control commands listened by an audio and video application when it accesses the AVSession are considered as valid commands supported by the AVSession. For details, see [Provider of AVSession](using-avsession-developer.md).|
| getValidCommands(callback: AsyncCallback&lt;Array&lt;AVControlCommandType&gt;&gt;): void | Obtains valid commands supported by the AVSession.<br>Playback control commands listened for by an audio and video application when it accesses the AVSession are considered valid commands supported by the AVSession. For details, see [Provider of AVSession](using-avsession-developer.md).|
| getLaunchAbility(callback: AsyncCallback&lt;WantAgent&gt;): void | Obtains the UIAbility that is configured in the AVSession and can be started.<br>The UIAbility configured here is started when a user operates the UI of the controller, for example, clicking a widget in Media Controller.|
| sendAVKeyEvent(event: KeyEvent, callback: AsyncCallback&lt;void&gt;): void | Sends a key event to an AVSession through the AVSessionController object.|
| sendSystemAVKeyEvent(event: KeyEvent, callback: AsyncCallback&lt;void&gt;): void | Sends a key event to the top session.|
| sendControlCommand(command: AVControlCommand, callback: AsyncCallback&lt;void&gt;): void | Sends a control command to an AVSession through the AVSessionController object.|
| sendSystemControlCommand(command: AVControlCommand, callback: AsyncCallback&lt;void&gt;): void | Sends a control command to the top session.|
| sendControlCommand(command: AVControlCommand, callback: AsyncCallback&lt;void&gt;): void | Sends a playback control command to an AVSession through the AVSessionController object.|
| sendSystemControlCommand(command: AVControlCommand, callback: AsyncCallback&lt;void&gt;): void | Sends a playback control command to the top session.|
| getHistoricalSessionDescriptors(maxSize: number, callback: AsyncCallback\<Array\<Readonly\<AVSessionDescriptor>>>): void<sup>10+<sup> | Obtains the descriptors of historical sessions.|
### APIs Called by the AVSessionController Object
| API| Description|
| -------- | -------- |
| getAVPlaybackState(callback: AsyncCallback&lt;AVPlaybackState&gt;): void | Obtains the information related to the playback state.|
| getAVMetadata(callback: AsyncCallback&lt;AVMetadata&gt;): void | Obtains the session metadata.|
| getOutputDevice(callback: AsyncCallback&lt;OutputDeviceInfo&gt;): void | Obtains the output device information.|
| sendAVKeyEvent(event: KeyEvent, callback: AsyncCallback&lt;void&gt;): void | Sends a key event to the session corresponding to this controller.|
| getLaunchAbility(callback: AsyncCallback&lt;WantAgent&gt;): void | Obtains the **WantAgent** object saved by the application in the session.|
| isActive(callback: AsyncCallback&lt;boolean&gt;): void | Checks whether the session is activated.|
| destroy(callback: AsyncCallback&lt;void&gt;): void | Destroys this controller. A controller can no longer be used after being destroyed.|
| getValidCommands(callback: AsyncCallback&lt;Array&lt;AVControlCommandType&gt;&gt;): void | Obtains valid commands supported by the session.|
| sendControlCommand(command: AVControlCommand, callback: AsyncCallback&lt;void&gt;): void | Sends a playback control command to the session through the controller.|
| sendCommonCommand(command: string, args: {[key: string]: Object}, callback: AsyncCallback&lt;void&gt;): void<sup>10+<sup> | Sends a custom playback control command to the session through the controller.|
| getAVQueueItems(callback: AsyncCallback&lt;Array&lt;AVQueueItem&gt;&gt;): void<sup>10+<sup> | Obtains the information related to the items in the playlist.|
| getAVQueueTitle(callback: AsyncCallback&lt;string&gt;): void<sup>10+<sup> | Obtains the name of the playlist.|
| skipToQueueItem(itemId: number, callback: AsyncCallback&lt;void&gt;): void<sup>10+<sup> | Sends the ID of an item in the playlist to the session for processing, so that the session can play the specified item.|
| getExtras(callback: AsyncCallback&lt;{[key: string]: Object}&gt;): void<sup>10+<sup> | Obtains the custom media packet set by the provider.|
## How to Develop
......@@ -48,13 +74,26 @@ To enable a system application to access the AVSession service as a controller,
AVSessionManager.createController(descriptor.sessionId).then((controller) => {
g_controller.push(controller);
}).catch((err) => {
console.error(`createController : ERROR : ${err.message}`);
console.error(`Failed to create controller. Code: ${err.code}, message: ${err.message}`);
});
});
}).catch((err) => {
console.error(`getAllSessionDescriptors : ERROR : ${err.message}`);
console.error(`Failed to get all session descriptors. Code: ${err.code}, message: ${err.message}`);
});
// Obtain the descriptors of historical sessions.
AVSessionManager.getHistoricalSessionDescriptors().then((descriptors) => {
console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors.length : ${descriptors.length}`);
if (descriptors.length > 0){
console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors[0].isActive : ${descriptors[0].isActive}`);
console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors[0].type : ${descriptors[0].type}`);
console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors[0].sessionTag : ${descriptors[0].sessionTag}`);
console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors[0].sessionId : ${descriptors[0].sessionId}`);
console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors[0].elementName.bundleName : ${descriptors[0].elementName.bundleName}`);
}
}).catch((err) => {
console.error(`Failed to get historical session descriptors, error code: ${err.code}, error message: ${err.message}`);
});
```
2. Listen for the session state and service state events.
......@@ -74,7 +113,7 @@ To enable a system application to access the AVSession service as a controller,
AVSessionManager.createController(session.sessionId).then((controller) => {
g_controller.push(controller);
}).catch((err) => {
console.info(`createController : ERROR : ${err.message}`);
console.error(`Failed to create controller. Code: ${err.code}, message: ${err.message}`);
});
});
......@@ -103,7 +142,7 @@ To enable a system application to access the AVSession service as a controller,
// Subscribe to the 'sessionServiceDie' event.
AVSessionManager.on('sessionServiceDie', () => {
// The server is abnormal, and the application clears resources.
console.info("Server exception.");
console.info(`Server exception.`);
})
```
......@@ -117,6 +156,10 @@ To enable a system application to access the AVSession service as a controller,
- **validCommandChange**: triggered when the valid commands supported by the session change.
- **outputDeviceChange**: triggered when the output device changes.
- **sessionDestroy**: triggered when a session is destroyed.
- **sessionEvent**: triggered when the custom session event changes.
- **extrasChange**: triggered when the custom media packet of the session changes.
- **queueItemsChange**: triggered when one or more items in the custom playlist of the session change.
- **queueTitleChange**: triggered when the custom playlist name of the session changes.
The controller can listen for events as required.
......@@ -124,18 +167,18 @@ To enable a system application to access the AVSession service as a controller,
// Subscribe to the 'activeStateChange' event.
controller.on('activeStateChange', (isActive) => {
if (isActive) {
console.info("The widget corresponding to the controller is highlighted.");
console.info(`The widget corresponding to the controller is highlighted.`);
} else {
console.info("The widget corresponding to the controller is invalid.");
console.info(`The widget corresponding to the controller is invalid.`);
}
});
// Subscribe to the 'sessionDestroy' event to enable the controller to get notified when the session dies.
controller.on('sessionDestroy', () => {
console.info('on sessionDestroy : SUCCESS ');
console.info(`on sessionDestroy : SUCCESS`);
controller.destroy().then(() => {
console.info('destroy : SUCCESS ');
console.info(`destroy : SUCCESS`);
}).catch((err) => {
console.info(`destroy : ERROR :${err.message}`);
console.error(`Failed to destroy session. Code: ${err.code}, message: ${err.message}`);
});
});
......@@ -164,6 +207,22 @@ To enable a system application to access the AVSession service as a controller,
controller.on('outputDeviceChange', (device) => {
console.info(`on outputDeviceChange device isRemote : ${device.isRemote}`);
});
// Subscribe to custom session event changes.
controller.on('sessionEvent', (eventName, eventArgs) => {
console.info(`Received new session event, event name is ${eventName}, args are ${JSON.stringify(eventArgs)}`);
});
// Subscribe to custom media packet changes.
controller.on('extrasChange', (extras) => {
console.info(`Received custom media packet, packet data is ${JSON.stringify(extras)}`);
});
// Subscribe to custom playlist item changes.
controller.on('queueItemsChange', (items) => {
console.info(`Caught queue items change, items length is ${items.length}`);
});
// Subscribe to custom playlist name changes.
controller.on('queueTitleChange', (title) => {
console.info(`Caught queue title change, title is ${title}`);
});
```
4. Obtain the media information transferred by the provider for display on the UI, for example, displaying the track being played and the playback state in Media Controller.
......@@ -186,19 +245,36 @@ To enable a system application to access the AVSession service as a controller,
let avPlaybackState: AVSessionManager.AVPlaybackState = await controller.getAVPlaybackState();
console.info(`get playbackState by controller : ${avPlaybackState.state}`);
console.info(`get favoriteState by controller : ${avPlaybackState.isFavorite}`);
// Obtain the playlist items of the session.
let queueItems: Array<AVSessionManager.AVQueueItem> = await controller.getAVQueueItems();
console.info(`get queueItems length by controller : ${queueItems.length}`);
// Obtain the playlist name of the session.
let queueTitle: string = await controller.getAVQueueTitle();
console.info(`get queueTitle by controller : ${queueTitle}`);
// Obtain the custom media packet of the session.
let extras: any = await controller.getExtras();
console.info(`get custom media packets by controller : ${JSON.stringify(extras)}`);
// Obtain the ability information provided by the application corresponding to the session.
let agent: WantAgent = await controller.getLaunchAbility();
console.info(`get want agent info by controller : ${JSON.stringify(agent)}`);
// Obtain the current playback position of the session.
let currentTime: number = controller.getRealPlaybackPositionSync();
console.info(`get current playback time by controller : ${currentTime}`);
// Obtain valid commands supported by the session.
let validCommands: Array<AVSessionManager.AVControlCommandType> = await controller.getValidCommands();
console.info(`get valid commands by controller : ${JSON.stringify(validCommands)}`);
}
```
5. Control the playback behavior, for example, sending a command to operate (play/pause/previous/next) the item being played in Media Controller.
After listening for the control command event, the audio and video application serving as the provider needs to implement the corresponding operation.
After listening for the playback control command event, the audio and video application serving as the provider needs to implement the corresponding operation.
```ts
async sendCommandToSessionByController() {
// It is assumed that an AVSessionController object corresponding to the session already exists. For details about how to create an AVSessionController object, see the code snippet above.
let controller: AVSessionManager.AVSessionController = ALREADY_HAVE_A_CONTROLLER;
// Obtain the commands supported by the session.
// Obtain valid commands supported by the session.
let validCommandTypeArray: Array<AVSessionManager.AVControlCommandType> = await controller.getValidCommands();
console.info(`get validCommandArray by controller : length : ${validCommandTypeArray.length}`);
// Deliver the 'play' command.
......@@ -222,6 +298,23 @@ To enable a system application to access the AVSession service as a controller,
let avCommand: AVSessionManager.AVControlCommand = {command:'playNext'};
controller.sendControlCommand(avCommand);
}
// Deliver a custom playback control command.
let commandName: string = 'custom command';
let args = {
command : 'This is my custom command'
}
await controller.sendCommonCommand(commandName, args).then(() => {
console.info(`SendCommonCommand successfully`);
}).catch((err) => {
console.error(`Failed to send common command. Code: ${err.code}, message: ${err.message}`);
})
// Set the ID of an item in the specified playlist for the session to play.
let queueItemId: number = 0;
await controller.skipToQueueItem(queueItemId).then(() => {
console.info(`SkipToQueueItem successfully`);
}).catch((err) => {
console.error(`Failed to skip to queue item. Code: ${err.code}, message: ${err.message}`);
});
}
```
......@@ -235,9 +328,9 @@ To enable a system application to access the AVSession service as a controller,
// Destroy the AVSessionController object. After being destroyed, it is no longer available.
controller.destroy(function (err) {
if (err) {
console.info(`Destroy controller ERROR : code: ${err.code}, message: ${err.message}`);
console.error(`Failed to destroy controller. Code: ${err.code}, message: ${err.message}`);
} else {
console.info('Destroy controller SUCCESS');
console.info(`Destroy controller SUCCESS`);
}
});
}
......
# AVSession Provider
An audio and video application needs to access the AVSession service as a provider in order to display media information in the controller (for example, Media Controller) and respond to control commands delivered by the controller.
An audio and video application needs to access the AVSession service as a provider in order to display media information in the controller (for example, Media Controller) and respond to playback control commands delivered by the controller.
## Basic Concepts
- AVMetadata: media data related attributes, including the IDs of the current media asset (assetId), previous media asset (previousAssetId), and next media asset (nextAssetId), title, author, album, writer, and duration.
- AVPlaybackState: playback state attributes, including the playback state, position, speed, buffered time, loop mode, and whether the media asset is favorited (**isFavorite**).
- AVPlaybackState: playback state attributes, including the playback state, position, speed, buffered time, loop mode, media item being played (activeItemId), custom media data (extras), and whether the media asset is favorited (isFavorite).
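To make these field names concrete, the sketch below assembles an AVPlaybackState object with illustrative values (assuming the module has been imported as **AVSessionManager**, as in the steps that follow); which fields an application actually reports depends on its player logic.

```ts
// Minimal sketch: an AVPlaybackState object with illustrative values.
let playbackState: AVSessionManager.AVPlaybackState = {
  state: AVSessionManager.PlaybackState.PLAYBACK_STATE_PLAY,          // Playback state.
  speed: 1.0,                                                         // Playback speed.
  position: { elapsedTime: 10000, updateTime: new Date().getTime() }, // Position (ms) and when it was sampled.
  bufferedTime: 15000,                                                // Buffered time (ms).
  loopMode: AVSessionManager.LoopMode.LOOP_MODE_SEQUENCE,             // Loop mode.
  isFavorite: false                                                   // Whether the media asset is favorited.
};
```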
## Available APIs
......@@ -21,8 +21,14 @@ For details, see [AVSession Management](../reference/apis/js-apis-avsession.md).
| setAVPlaybackState(state: AVPlaybackState, callback: AsyncCallback&lt;void&gt;): void | Sets the AVSession playback state.|
| setLaunchAbility(ability: WantAgent, callback: AsyncCallback&lt;void&gt;): void | Starts a UIAbility.|
| getController(callback: AsyncCallback&lt;AVSessionController&gt;): void | Obtains the controller of the AVSession.|
| getOutputDevice(callback: AsyncCallback&lt;OutputDeviceInfo&gt;): void | Obtains the output device information.|
| activate(callback: AsyncCallback&lt;void&gt;): void | Activates the AVSession.|
| deactivate(callback: AsyncCallback&lt;void&gt;): void | Deactivates this session.|
| destroy(callback: AsyncCallback&lt;void&gt;): void | Destroys the AVSession.|
| setAVQueueItems(items: Array&lt;AVQueueItem&gt;, callback: AsyncCallback&lt;void&gt;): void <sup>10+<sup> | Sets a playlist.|
| setAVQueueTitle(title: string, callback: AsyncCallback&lt;void&gt;): void<sup>10+<sup> | Sets a name for the playlist.|
| dispatchSessionEvent(event: string, args: {[key: string]: Object}, callback: AsyncCallback&lt;void&gt;): void<sup>10+<sup> | Dispatches a custom session event.|
| setExtras(extras: {[key: string]: Object}, callback: AsyncCallback&lt;void&gt;): void<sup>10+<sup> | Sets a custom media packet in the form of a key-value pair.|
## How to Develop
......@@ -31,11 +37,20 @@ To enable an audio and video application to access the AVSession service as a pr
1. Call an API in the **AVSessionManager** class to create and activate an **AVSession** object.
```ts
// To create an AVSession object, you must first obtain the application context. You can set a global variable in the EntryAbility file of the application to store the application context.
export default class EntryAbility extends UIAbility {
onCreate(want, launchParam) {
globalThis.context = this.context; // Set the global variable globalThis.context to store the application context.
}
// Other APIs of the EntryAbility class.
}
// Start to create and activate an AVSession object.
import AVSessionManager from '@ohos.multimedia.avsession'; // Import the AVSessionManager module.
// Create an AVSession object.
async createSession() {
let session: AVSessionManager.AVSession = await AVSessionManager.createAVSession(this.context, 'SESSION_NAME', 'audio');
let session: AVSessionManager.AVSession = await AVSessionManager.createAVSession(globalThis.context, 'SESSION_NAME', 'audio');
session.activate();
console.info(`session create done : sessionId : ${session.sessionId}`);
}
......@@ -49,19 +64,19 @@ To enable an audio and video application to access the AVSession service as a pr
```ts
async setSessionInfo() {
// It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet above.
let session: AVSessionManager.AVSession = ALLREADY_CREATE_A_SESSION;
// It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet in step 1.
let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
// The player logic that triggers changes in the session metadata and playback state is omitted here.
// Set necessary session metadata.
let metadata: AVSessionManager.AVMetadata = {
assetId: "0",
title: "TITLE",
artist: "ARTIST"
assetId: '0',
title: 'TITLE',
artist: 'ARTIST'
};
session.setAVMetadata(metadata).then(() => {
console.info('SetAVMetadata successfully');
console.info(`SetAVMetadata successfully`);
}).catch((err) => {
console.info(`SetAVMetadata BusinessError: code: ${err.code}, message: ${err.message}`);
console.error(`Failed to set AVMetadata. Code: ${err.code}, message: ${err.message}`);
});
// Set the playback state to paused and set isFavorite to false.
let playbackState: AVSessionManager.AVPlaybackState = {
......@@ -70,11 +85,51 @@ To enable an audio and video application to access the AVSession service as a pr
};
session.setAVPlaybackState(playbackState, function (err) {
if (err) {
console.info(`SetAVPlaybackState BusinessError: code: ${err.code}, message: ${err.message}`);
console.error(`Failed to set AVPlaybackState. Code: ${err.code}, message: ${err.message}`);
} else {
console.info('SetAVPlaybackState successfully');
console.info(`SetAVPlaybackState successfully`);
}
});
// Set a playlist.
let queueItemDescription_1 = {
mediaId: '001',
title: 'music_name',
subtitle: 'music_sub_name',
description: 'music_description',
icon: PIXELMAP_OBJECT,
iconUri: 'http://www.xxx.com',
extras: {'extras':'any'}
};
let queueItem_1 = {
itemId: 1,
description: queueItemDescription_1
};
let queueItemDescription_2 = {
mediaId: '002',
title: 'music_name',
subtitle: 'music_sub_name',
description: 'music_description',
icon: PIXELMAP_OBJECT,
iconUri: 'http://www.xxx.com',
extras: {'extras':'any'}
};
let queueItem_2 = {
itemId: 2,
description: queueItemDescription_2
};
let queueItemsArray = [queueItem_1, queueItem_2];
session.setAVQueueItems(queueItemsArray).then(() => {
console.info(`SetAVQueueItems successfully`);
}).catch((err) => {
console.error(`Failed to set AVQueueItem, error code: ${err.code}, error message: ${err.message}`);
});
// Set a name for the playlist.
let queueTitle = 'QUEUE_TITLE';
session.setAVQueueTitle(queueTitle).then(() => {
console.info(`SetAVQueueTitle successfully`);
}).catch((err) => {
console.error(`Failed to set AVQueueTitle. Code: ${err.code}, message: ${err.message}`);
});
}
```
......@@ -82,74 +137,178 @@ To enable an audio and video application to access the AVSession service as a pr
The UIAbility is set through the **WantAgent** API. For details, see [WantAgent](../reference/apis/js-apis-app-ability-wantAgent.md).
```ts
import WantAgent from "@ohos.app.ability.wantAgent";
import wantAgent from "@ohos.app.ability.wantAgent";
```
```ts
// It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet above.
let session: AVSessionManager.AVSession = ALLREADY_CREATE_A_SESSION;
// It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet in step 1.
let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
let wantAgentInfo = {
wants: [
{
bundleName: "com.example.musicdemo",
abilityName: "com.example.musicdemo.MainAbility"
bundleName: 'com.example.musicdemo',
abilityName: 'com.example.musicdemo.MainAbility'
}
],
operationType: WantAgent.OperationType.START_ABILITIES,
operationType: wantAgent.OperationType.START_ABILITIES,
requestCode: 0,
wantAgentFlags: [WantAgent.WantAgentFlags.UPDATE_PRESENT_FLAG]
wantAgentFlags: [wantAgent.WantAgentFlags.UPDATE_PRESENT_FLAG]
}
WantAgent.getWantAgent(wantAgentInfo).then((agent) => {
session.setLaunchAbility(agent)
wantAgent.getWantAgent(wantAgentInfo).then((agent) => {
session.setLaunchAbility(agent);
})
```
4. Listen for control commands delivered by the controller, for example, Media Controller.
4. Set a custom session event. The controller performs an operation after receiving the event.
> **NOTE**
>
> After the provider registers a listener for the control command event, the event will be reflected in **getValidCommands()** of the controller. In other words, the controller determines that the command is valid and triggers the corresponding event as required. To ensure that the control commands delivered by the controller can be executed normally, the provider should not use a null implementation for listening.
> The data set through **dispatchSessionEvent** is not saved in the **AVSession** object or AVSession service.
```ts
// It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet in step 1.
let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
let eventName = 'dynamic_lyric';
let args = {
lyric : 'This is my lyric'
}
await session.dispatchSessionEvent(eventName, args).then(() => {
console.info(`Dispatch session event successfully`);
}).catch((err) => {
console.error(`Failed to dispatch session event. Code: ${err.code}, message: ${err.message}`);
})
```
5. Set a custom media packet. The controller performs an operation after receiving the event.
> **NOTE**
>
> The data set by using **setExtras** is stored in the AVSession service. The data lifecycle is the same as that of the **AVSession** object, and the controller corresponding to the object can use **getExtras** to obtain the data.
```ts
// It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet in step 1.
let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
let extras = {
extra : 'This is my custom media packet'
}
await session.setExtras(extras).then(() => {
console.info(`Set extras successfully`);
}).catch((err) => {
console.error(`Failed to set extras. Code: ${err.code}, message: ${err.message}`);
})
```
6. Listen for playback control commands or events delivered by the controller, for example, Media Controller.
Both fixed playback control commands and advanced playback control events can be listened for.
- Listening for Fixed Playback Control Commands
> **NOTE**
>
> After the provider registers a listener for fixed playback control commands, the commands will be reflected in **getValidCommands()** of the controller. In other words, the controller determines that the command is valid and triggers the corresponding event as required. To ensure that the playback control commands delivered by the controller can be executed normally, the provider should not use a null implementation for listening.
Fixed playback control commands on the session side include basic operation commands such as play, pause, previous, and next. For details, see [AVControlCommand](../reference/apis/js-apis-avsession.md).
```ts
async setListenerForMesFromController() {
// It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet above.
let session: AVSessionManager.AVSession = ALLREADY_CREATE_A_SESSION;
// It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet in step 1.
let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
// Generally, logic processing on the player is implemented in the listener.
// After the processing is complete, use the setter to synchronize the playback information. For details, see the code snippet above.
session.on('play', () => {
console.info('on play , do play task');
console.info(`on play , do play task`);
// do some tasks ···
});
session.on('pause', () => {
console.info('on pause , do pause task');
console.info(`on pause , do pause task`);
// do some tasks ···
});
session.on('stop', () => {
console.info('on stop , do stop task');
console.info(`on stop , do stop task`);
// do some tasks ···
});
session.on('playNext', () => {
console.info('on playNext , do playNext task');
console.info(`on playNext , do playNext task`);
// do some tasks ···
});
session.on('playPrevious', () => {
console.info('on playPrevious , do playPrevious task');
console.info(`on playPrevious , do playPrevious task`);
// do some tasks ···
});
session.on('fastForward', () => {
console.info(`on fastForward , do fastForward task`);
// do some tasks ···
});
session.on('rewind', () => {
console.info(`on rewind , do rewind task`);
// do some tasks ···
});
session.on('seek', (time) => {
console.info(`on seek , the seek time is ${time}`);
// do some tasks ···
});
session.on('setSpeed', (speed) => {
console.info(`on setSpeed , the speed is ${speed}`);
// do some tasks ···
});
session.on('setLoopMode', (mode) => {
console.info(`on setLoopMode , the loop mode is ${mode}`);
// do some tasks ···
});
session.on('toggleFavorite', (assetId) => {
console.info(`on toggleFavorite , the target asset Id is ${assetId}`);
// do some tasks ···
});
}
```
- Listening for Advanced Playback Control Events
The following advanced playback control events can be listened for:
- **skipToQueueItem**: triggered when an item in the playlist is selected.
- **handleKeyEvent**: triggered when a key is pressed.
- **outputDeviceChange**: triggered when the output device changes.
- **commonCommand**: triggered when a custom playback control command changes.
```ts
async setListenerForMesFromController() {
// It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet in step 1.
let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
// Generally, logic processing on the player is implemented in the listener.
// After the processing is complete, use the setter to synchronize the playback information. For details, see the code snippet above.
session.on('skipToQueueItem', (itemId) => {
console.info(`on skipToQueueItem , do skip task`);
// do some tasks ···
});
session.on('handleKeyEvent', (event) => {
console.info(`on handleKeyEvent , the event is ${JSON.stringify(event)}`);
// do some tasks ···
});
session.on('outputDeviceChange', (device) => {
console.info(`on outputDeviceChange , the device info is ${JSON.stringify(device)}`);
// do some tasks ···
});
session.on('commonCommand', (commandString, args) => {
console.info(`on commonCommand , command is ${commandString}, args are ${JSON.stringify(args)}`);
// do some tasks ···
});
}
```
5. Obtain an **AVSessionController** object for this **AVSession** object for interaction.
7. Obtain an **AVSessionController** object for this **AVSession** object for interaction.
```ts
async createControllerFromSession() {
// It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet above.
let session: AVSessionManager.AVSession = ALLREADY_CREATE_A_SESSION;
// It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet in step 1.
let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
// Obtain an AVSessionController object for this AVSession object.
let controller: AVSessionManager.AVSessionController = await session.getController();
// The AVSessionController object can interact with the AVSession object, for example, by delivering a control command.
// The AVSessionController object can interact with the AVSession object, for example, by delivering a playback control command.
let avCommand: AVSessionManager.AVControlCommand = {command:'play'};
controller.sendControlCommand(avCommand);
......@@ -163,13 +322,14 @@ To enable an audio and video application to access the AVSession service as a pr
}
```
6. When the audio and video application exits and does not need to continue playback, cancel the listener and destroy the **AVSession** object.
The code snippet below is used for canceling the listener for control commands:
8. When the audio and video application exits and does not need to continue playback, cancel the listener and destroy the **AVSession** object.
The code snippet below is used for canceling the listener for playback control commands:
```ts
async unregisterSessionListener() {
// It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet above.
let session: AVSessionManager.AVSession = ALLREADY_CREATE_A_SESSION;
// It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet in step 1.
let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
// Cancel the listener of the AVSession object.
session.off('play');
......@@ -177,21 +337,26 @@ To enable an audio and video application to access the AVSession service as a pr
session.off('stop');
session.off('playNext');
session.off('playPrevious');
session.off('skipToQueueItem');
session.off('handleKeyEvent');
session.off('outputDeviceChange');
session.off('commonCommand');
}
```
The code snippet below is used for destroying the AVSession object:
```ts
async destroySession() {
// It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet above.
let session: AVSessionManager.AVSession = ALLREADY_CREATE_A_SESSION;
// It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet in step 1.
let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
// Destroy the AVSession object.
session.destroy(function (err) {
if (err) {
console.info(`Destroy BusinessError: code: ${err.code}, message: ${err.message}`);
console.error(`Failed to destroy session. Code: ${err.code}, message: ${err.message}`);
} else {
console.info('Destroy : SUCCESS ');
console.info(`Destroy : SUCCESS `);
}
});
}
......
......@@ -36,15 +36,15 @@ To enable a system application that accesses the AVSession service as the contro
let audioDevices;
await audioRoutingManager.getDevices(audio.DeviceFlag.OUTPUT_DEVICES_FLAG).then((data) => {
audioDevices = data;
console.info('Promise returned to indicate that the device list is obtained.');
console.info(`Promise returned to indicate that the device list is obtained.`);
}).catch((err) => {
console.info(`getDevices : ERROR : ${err.message}`);
console.error(`Failed to get devices. Code: ${err.code}, message: ${err.message}`);
});
AVSessionManager.castAudio('all', audioDevices).then(() => {
console.info('createController : SUCCESS');
console.info(`createController : SUCCESS`);
}).catch((err) => {
console.info(`createController : ERROR : ${err.message}`);
console.error(`Failed to cast audio. Code: ${err.code}, message: ${err.message}`);
});
```
......
......@@ -78,7 +78,9 @@ export class AVPlayerDemo {
private avPlayer;
private count: number = 0;
private surfaceID: string; // The surfaceID parameter specifies the window used to display the video. Its value is obtained through the XComponent.
private isSeek: boolean = true; // Specify whether the seek operation is supported.
private fileSize: number = -1;
private fd: number = 0;
// Set AVPlayer callback functions.
setAVPlayerCallback() {
// Callback function for the seek operation.
......@@ -113,8 +115,13 @@ export class AVPlayerDemo {
case 'playing': // This state is reported upon a successful callback of play().
console.info('AVPlayer state playing called.');
if (this.count !== 0) {
if (this.isSeek) {
console.info('AVPlayer start to seek.');
this.avPlayer.seek(this.avPlayer.duration); // Call seek() to seek to the end of the video clip.
} else {
// When the seek operation is not supported, the playback continues until it reaches the end.
console.info('AVPlayer wait to play end.');
}
} else {
this.avPlayer.pause(); // Call pause() to pause the playback.
}
......@@ -156,6 +163,7 @@ export class AVPlayerDemo {
// Open the corresponding file address to obtain the file descriptor and assign a value to the URL to trigger the reporting of the initialized state.
let file = await fs.open(path);
fdPath = fdPath + '' + file.fd;
this.isSeek = true; // The seek operation is supported.
this.avPlayer.url = fdPath;
}
......@@ -169,9 +177,86 @@ export class AVPlayerDemo {
// The return type is {fd,offset,length}, where fd indicates the file descriptor address of the HAP file, offset indicates the media asset offset, and length indicates the duration of the media asset to play.
let context = getContext(this) as common.UIAbilityContext;
let fileDescriptor = await context.resourceManager.getRawFd('H264_AAC.mp4');
this.isSeek = true; // The seek operation is supported.
// Assign a value to fdSrc to trigger the reporting of the initialized state.
this.avPlayer.fdSrc = fileDescriptor;
}
// The following demo shows how to use the file system to open the sandbox address, obtain the media file address, and play the media file with the seek operation using the dataSrc attribute.
async avPlayerDataSrcSeekDemo() {
// Create an AVPlayer instance.
this.avPlayer = await media.createAVPlayer();
// Set a callback function for state changes.
this.setAVPlayerCallback();
// dataSrc indicates the playback source address. When the seek operation is supported, fileSize indicates the size of the file to be played. The following describes how to assign a value to fileSize.
let src = {
fileSize: -1,
callback: (buf, length, pos) => {
let num = 0;
if (buf == undefined || length == undefined || pos == undefined) {
return -1;
}
num = fs.readSync(this.fd, buf, { offset: pos, length: length });
if (num > 0 && (this.fileSize >= pos)) {
return num;
}
return -1;
}
}
let context = getContext(this) as common.UIAbilityContext;
// Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
let pathDir = context.filesDir;
let path = pathDir + '/H264_AAC.mp4';
await fs.open(path).then((file) => {
this.fd = file.fd;
})
// Obtain the size of the file to be played.
this.fileSize = fs.statSync(path).size;
src.fileSize = this.fileSize;
this.isSeek = true; // The seek operation is supported.
this.avPlayer.dataSrc = src;
}
// The following demo shows how to use the file system to open the sandbox address, obtain the media file address, and play the media file without the seek operation using the dataSrc attribute.
async avPlayerDataSrcNoSeekDemo() {
// Create an AVPlayer instance.
this.avPlayer = await media.createAVPlayer();
// Set a callback function for state changes.
this.setAVPlayerCallback();
let context = getContext(this) as common.UIAbilityContext;
let src: object = {
fileSize: -1,
callback: (buf, length, pos) => {
let num = 0;
if (buf == undefined || length == undefined) {
return -1;
}
num = fs.readSync(this.fd, buf);
if (num > 0) {
return num;
}
return -1;
}
}
// Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
let pathDir = context.filesDir;
let path = pathDir + '/H264_AAC.mp4';
await fs.open(path).then((file) => {
this.fd = file.fd;
})
this.isSeek = false; // The seek operation is not supported.
this.avPlayer.dataSrc = src;
}
// The following demo shows how to play live streams by setting the network address through the URL.
async avPlayerLiveDemo() {
// Create an AVPlayer instance.
this.avPlayer = await media.createAVPlayer();
// Set a callback function for state changes.
this.setAVPlayerCallback();
this.isSeek = false; // The seek operation is not supported.
this.avPlayer.url = 'http://xxx.xxx.xxx.xxx:xx/xx/index.m3u8'; // Play live webcasting streams using HLS.
}
}
```
......