Commit a0238e88 authored by: Gloria

Update docs against 17810+17946+18006+17887+17995+18024+18113+18030+18082+17808+18662+18378

Signed-off-by: wusongqing<wusongqing@huawei.com>
Parent c1182727
......@@ -14,7 +14,7 @@ The audio interruption policy determines the operations (for example, pause, res
Two audio interruption modes, specified by [InterruptMode](../reference/apis/js-apis-audio.md#interruptmode9), are preset in the audio interruption policy:
- **SHARE_MODE**: Multiple audio streams created by an application share one audio focus. The concurrency rules between these audio streams are determined by the application, without the use of the audio interruption policy. However, if another application needs to play audio while one of these audio streams is being played, the audio interruption policy is triggered.
- **INDEPENDENT_MODE**: Each audio stream created by an application has an independent audio focus. When multiple audio streams are played concurrently, the audio interruption policy is triggered.
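The difference between the two modes can be sketched in plain TypeScript (illustrative only, not the real AudioRenderer API; the helper and its names are assumptions):

```typescript
type InterruptMode = 'SHARE_MODE' | 'INDEPENDENT_MODE';

// Returns true when the audio interruption policy is triggered between two
// streams, per the rules above: in SHARE_MODE the policy applies only across
// applications (one app's streams share a focus and manage their own
// concurrency); in INDEPENDENT_MODE every stream has its own focus, so the
// policy also applies within a single application.
function policyTriggered(mode: InterruptMode, sameApplication: boolean): boolean {
  if (mode === 'SHARE_MODE') {
    return !sameApplication; // concurrency within one app is app-managed
  }
  return true; // INDEPENDENT_MODE: each stream holds an independent focus
}
```

For example, two streams of the same application never trigger the policy in `SHARE_MODE`, but always do in `INDEPENDENT_MODE`.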
......
......@@ -8,7 +8,7 @@ OpenHarmony provides multiple classes for you to develop audio playback applicat
- [AudioRenderer](using-audiorenderer-for-playback.md): provides ArkTS and JS API to implement audio output. It supports only the PCM format and requires applications to continuously write audio data. The applications can perform data preprocessing, for example, setting the sampling rate and bit width of audio files, before audio input. This class can be used to develop more professional and diverse playback applications. To use this class, you must have basic audio processing knowledge.
- [OpenSL ES](using-opensl-es-for-playback.md): provides a set of standard, cross-platform, yet unique native audio APIs. It supports audio output in PCM format and is applicable to playback applications that are ported from other embedded platforms or that implements audio output at the native layer.
- [TonePlayer](using-toneplayer-for-playback.md): provides ArkTS and JS API to implement the playback of dialing tones and ringback tones. It can be used to play the content selected from a fixed type range, without requiring the input of media assets or audio data. This class is applicable to specific scenarios where dialing tones and ringback tones are played. It is available only to system applications.
......
......@@ -8,7 +8,7 @@ OpenHarmony provides multiple classes for you to develop audio recording applica
- [AudioCapturer](using-audiocapturer-for-recording.md): provides ArkTS and JS API to implement audio input. It supports only the PCM format and requires applications to continuously read audio data. The application can perform data processing after audio output. This class can be used to develop more professional and diverse recording applications. To use this class, you must have basic audio processing knowledge.
- [OpenSL ES](using-opensl-es-for-recording.md): provides a set of standard, cross-platform, yet unique native audio APIs. It supports audio input in PCM format and is applicable to recording applications that are ported from other embedded platforms or that implements audio input at the native layer.
## Precautions for Developing Audio Recording Applications
......
......@@ -59,6 +59,7 @@ The table below lists the supported protocols.
| -------- | -------- |
| Local VOD| The file descriptor is supported, but the file path is not.|
| Network VOD| HTTP, HTTPS, and HLS are supported.|
| Live webcasting| HLS is supported.|
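A minimal sketch of how an application might classify a playback source against the protocol table above before handing it to the player (illustrative only; the scheme and suffix checks are assumptions, not an official API):

```typescript
type SourceKind = 'local-vod' | 'network-vod' | 'live' | 'unsupported';

// Classify a source per the table: local files go through a file descriptor
// (fd:// URL), HTTP/HTTPS/HLS cover network VOD, and live streams are
// expected to be HLS (.m3u8) URLs.
function classifySource(src: string, isLive: boolean = false): SourceKind {
  if (src.startsWith('fd://')) {
    return 'local-vod';
  }
  const isHttp = src.startsWith('http://') || src.startsWith('https://');
  const isHls = src.endsWith('.m3u8');
  if (isLive) {
    return isHttp && isHls ? 'live' : 'unsupported'; // live playback requires HLS
  }
  return isHttp ? 'network-vod' : 'unsupported';
}
```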
The table below lists the supported audio playback formats.
......
......@@ -2,7 +2,7 @@
## Multimedia Subsystem Architecture
The multimedia subsystem provides the capability of processing users' visual and auditory information. For example, it can be used to collect, compress, store, decompress, and play audio and video information. Based on the type of media information to process, the multimedia subsystem is usually divided into four modules: audio, media, camera, and image.
As shown in the figure below, the multimedia subsystem provides APIs for developing audio/video, camera, and gallery applications, and provides adaptation and acceleration for different hardware chips. In the middle part, it provides core media functionalities and management mechanisms in the form of services.
......
......@@ -151,9 +151,6 @@ export default class AudioRendererDemo {
console.info(`${TAG}: creating AudioRenderer success`);
this.renderModel = renderer;
this.renderModel.on('stateChange', (state) => { // Set the events to listen for. A callback is invoked when the AudioRenderer is switched to the specified state.
if (state == 2) {
console.info('audio renderer state is: STATE_RUNNING');
}
......
......@@ -12,7 +12,7 @@ During application development, you can use the **state** attribute of the AVPla
**Figure 1** Playback state transition
![Playback status change](figures/playback-status-change.png)
For details about the state, see [AVPlayerState](../reference/apis/js-apis-media.md#avplayerstate9). When the AVPlayer is in the **prepared**, **playing**, **paused**, or **completed** state, the playback engine is working and a large amount of RAM is occupied. If your application does not need to use the AVPlayer, call **reset()** or **release()** to release the instance.
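The "engine working" states above can be captured in a small helper that decides when an idle application should release the instance (plain TypeScript sketch; the state names follow AVPlayerState, but the helper itself is illustrative):

```typescript
type AVPlayerState =
  'idle' | 'initialized' | 'prepared' | 'playing' |
  'paused' | 'completed' | 'stopped' | 'released' | 'error';

// In these states the playback engine is active and holds a large amount of
// RAM, so an application that no longer needs the AVPlayer should call
// reset() or release().
const ENGINE_WORKING: ReadonlySet<AVPlayerState> =
  new Set(['prepared', 'playing', 'paused', 'completed']);

function shouldRelease(state: AVPlayerState, playerStillNeeded: boolean): boolean {
  return !playerStillNeeded && ENGINE_WORKING.has(state);
}
```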
......@@ -68,7 +68,9 @@ import common from '@ohos.app.ability.common';
export class AVPlayerDemo {
private avPlayer;
private count: number = 0;
private isSeek: boolean = true; // Specify whether the seek operation is supported.
private fileSize: number = -1;
private fd: number = 0;
// Set AVPlayer callback functions.
setAVPlayerCallback() {
// Callback function for the seek operation.
......@@ -102,8 +104,13 @@ export class AVPlayerDemo {
case 'playing': // This state is reported upon a successful callback of play().
console.info('AVPlayer state playing called.');
if (this.count !== 0) {
if (this.isSeek) {
console.info('AVPlayer start to seek.');
this.avPlayer.seek(this.avPlayer.duration); // Call seek() to seek to the end of the audio clip.
} else {
// When the seek operation is not supported, the playback continues until it reaches the end.
console.info('AVPlayer wait to play end.');
}
} else {
this.avPlayer.pause(); // Call pause() to pause the playback.
}
......@@ -145,6 +152,7 @@ export class AVPlayerDemo {
// Open the corresponding file address to obtain the file descriptor and assign a value to the URL to trigger the reporting of the initialized state.
let file = await fs.open(path);
fdPath = fdPath + '' + file.fd;
this.isSeek = true; // The seek operation is supported.
this.avPlayer.url = fdPath;
}
......@@ -158,10 +166,85 @@ export class AVPlayerDemo {
// The return type is {fd,offset,length}, where fd indicates the file descriptor address of the HAP file, offset indicates the media asset offset, and length indicates the duration of the media asset to play.
let context = getContext(this) as common.UIAbilityContext;
let fileDescriptor = await context.resourceManager.getRawFd('01.mp3');
this.isSeek = true; // The seek operation is supported.
// Assign a value to fdSrc to trigger the reporting of the initialized state.
this.avPlayer.fdSrc = fileDescriptor;
}
// The following demo shows how to use the file system to open the sandbox address, obtain the media file address, and play the media file with the seek operation using the dataSrc attribute.
async avPlayerDataSrcSeekDemo() {
// Create an AVPlayer instance.
this.avPlayer = await media.createAVPlayer();
// Set a callback function for state changes.
this.setAVPlayerCallback();
// dataSrc indicates the playback source address. When the seek operation is supported, fileSize indicates the size of the file to be played. The following describes how to assign a value to fileSize.
let src = {
fileSize: -1,
callback: (buf, length, pos) => {
let num = 0;
if (buf == undefined || length == undefined || pos == undefined) {
return -1;
}
num = fs.readSync(this.fd, buf, { offset: pos, length: length });
if (num > 0 && (this.fileSize >= pos)) {
return num;
}
return -1;
}
}
let context = getContext(this) as common.UIAbilityContext;
// Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
let pathDir = context.filesDir;
let path = pathDir + '/01.mp3';
await fs.open(path).then((file) => {
this.fd = file.fd;
})
// Obtain the size of the file to be played.
this.fileSize = fs.statSync(path).size;
src.fileSize = this.fileSize;
this.isSeek = true; // The seek operation is supported.
this.avPlayer.dataSrc = src;
}
// The following demo shows how to use the file system to open the sandbox address, obtain the media file address, and play the media file without the seek operation using the dataSrc attribute.
async avPlayerDataSrcNoSeekDemo() {
// Create an AVPlayer instance.
this.avPlayer = await media.createAVPlayer();
// Set a callback function for state changes.
this.setAVPlayerCallback();
let context = getContext(this) as common.UIAbilityContext;
let src: object = {
fileSize: -1,
callback: (buf, length, pos) => {
let num = 0;
if (buf == undefined || length == undefined) {
return -1;
}
num = fs.readSync(this.fd, buf);
if (num > 0) {
return num;
}
return -1;
}
}
// Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
let pathDir = context.filesDir;
let path = pathDir + '/01.mp3';
await fs.open(path).then((file) => {
this.fd = file.fd;
})
this.isSeek = false; // The seek operation is not supported.
this.avPlayer.dataSrc = src;
}
// The following demo shows how to play live streams by setting the network address through the URL.
async avPlayerLiveDemo() {
// Create an AVPlayer instance.
this.avPlayer = await media.createAVPlayer();
// Set a callback function for state changes.
this.setAVPlayerCallback();
this.isSeek = false; // The seek operation is not supported.
this.avPlayer.url = 'http://xxx.xxx.xxx.xxx:xx/xx/index.m3u8';
}
}
```
<!--no_check-->
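The `dataSrc` read callback used in the demos above follows a simple contract: fill the buffer starting at the requested media position, return the number of bytes written, or return -1 to signal end of stream or failure. A runnable simulation of that contract (the in-memory `media` array stands in for the opened file; this is an illustration, not the real file-system API):

```typescript
// Build a seekable read callback over an in-memory media buffer. The player
// passes a destination buffer, the number of bytes it wants, and (when seek
// is supported) the media position to read from.
function makeReadCallback(media: Uint8Array) {
  return (buf: Uint8Array, length: number, pos?: number): number => {
    const start = pos ?? 0;
    if (start >= media.length) {
      return -1; // nothing left to read: signal end of stream
    }
    const n = Math.min(length, buf.length, media.length - start);
    buf.set(media.subarray(start, start + n)); // copy n bytes into buf
    return n; // bytes actually delivered
  };
}
```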
\ No newline at end of file
# AVSession Controller
Media Controller preset in OpenHarmony functions as the controller to interact with audio and video applications, for example, obtaining and displaying media information and delivering playback control commands.
You can develop a system application (for example, a new playback control center or voice assistant) as the controller to interact with audio and video applications in the system.
......@@ -8,24 +8,50 @@ You can develop a system application (for example, a new playback control center
- AVSessionDescriptor: session information, including the session ID, session type (audio/video), custom session name (**sessionTag**), information about the corresponding application (**elementName**), and whether the session is pinned on top (**isTopSession**).
- Top session: session with the highest priority in the system, for example, a session that is being played. Generally, the controller must hold an **AVSessionController** object to communicate with a session. However, the controller can directly communicate with the top session, for example, directly sending a playback control command or key event, without holding an **AVSessionController** object.
## Available APIs
The key APIs used by the controller are classified into the following types:
1. APIs called by the **AVSessionManager** object, which is obtained by means of import. An example API is **AVSessionManager.createController(sessionId)**.
2. APIs called by the **AVSessionController** object. An example API is **controller.getAVPlaybackState()**.
Asynchronous JavaScript APIs use either a callback or promise to return the result. The APIs listed below use a callback. They provide the same functions as their counterparts that use a promise.
For details, see [AVSession Management](../reference/apis/js-apis-avsession.md).
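The relationship between the callback and promise flavors can be illustrated with a generic wrapper (plain TypeScript; OpenHarmony ships both flavors natively, so no wrapping is needed in real code, and the fake API below is purely illustrative):

```typescript
// The AsyncCallback shape used throughout the APIs: error first, data second.
type AsyncCallback<T> = (err: Error | null, data?: T) => void;

// Turn a callback-style call into its promise-style counterpart.
function promisify<T>(fn: (cb: AsyncCallback<T>) => void): Promise<T> {
  return new Promise((resolve, reject) => {
    fn((err, data) => (err ? reject(err) : resolve(data as T)));
  });
}

// Usage with a fake callback-style API standing in for getValidCommands():
const fakeGetValidCommands = (cb: AsyncCallback<string[]>) =>
  cb(null, ['play', 'pause']);

promisify(fakeGetValidCommands).then((cmds) => console.log(cmds));
```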
### APIs Called by the AVSessionManager Object
| API| Description|
| -------- | -------- |
| getAllSessionDescriptors(callback: AsyncCallback&lt;Array&lt;Readonly&lt;AVSessionDescriptor&gt;&gt;&gt;): void | Obtains the descriptors of all AVSessions in the system.|
| createController(sessionId: string, callback: AsyncCallback&lt;AVSessionController&gt;): void | Creates an AVSessionController.|
| sendSystemAVKeyEvent(event: KeyEvent, callback: AsyncCallback&lt;void&gt;): void | Sends a key event to the top session.|
| sendSystemControlCommand(command: AVControlCommand, callback: AsyncCallback&lt;void&gt;): void | Sends a playback control command to the top session.|
| getHistoricalSessionDescriptors(maxSize: number, callback: AsyncCallback\<Array\<Readonly\<AVSessionDescriptor>>>): void<sup>10+</sup> | Obtains the descriptors of historical sessions.|
### APIs Called by the AVSessionController Object
| API| Description|
| -------- | -------- |
| getAVPlaybackState(callback: AsyncCallback&lt;AVPlaybackState&gt;): void | Obtains the information related to the playback state.|
| getAVMetadata(callback: AsyncCallback&lt;AVMetadata&gt;): void | Obtains the session metadata.|
| getOutputDevice(callback: AsyncCallback&lt;OutputDeviceInfo&gt;): void | Obtains the output device information.|
| sendAVKeyEvent(event: KeyEvent, callback: AsyncCallback&lt;void&gt;): void | Sends a key event to the session corresponding to this controller.|
| getLaunchAbility(callback: AsyncCallback&lt;WantAgent&gt;): void | Obtains the **WantAgent** object saved by the application in the session.|
| isActive(callback: AsyncCallback&lt;boolean&gt;): void | Checks whether the session is activated.|
| destroy(callback: AsyncCallback&lt;void&gt;): void | Destroys this controller. A controller can no longer be used after being destroyed.|
| getValidCommands(callback: AsyncCallback&lt;Array&lt;AVControlCommandType&gt;&gt;): void | Obtains valid commands supported by the session.|
| sendControlCommand(command: AVControlCommand, callback: AsyncCallback&lt;void&gt;): void | Sends a playback control command to the session through the controller.|
| sendCommonCommand(command: string, args: {[key: string]: Object}, callback: AsyncCallback&lt;void&gt;): void<sup>10+</sup> | Sends a custom playback control command to the session through the controller.|
| getAVQueueItems(callback: AsyncCallback&lt;Array&lt;AVQueueItem&gt;&gt;): void<sup>10+</sup> | Obtains the information related to the items in the playlist.|
| getAVQueueTitle(callback: AsyncCallback&lt;string&gt;): void<sup>10+</sup> | Obtains the name of the playlist.|
| skipToQueueItem(itemId: number, callback: AsyncCallback&lt;void&gt;): void<sup>10+</sup> | Sends the ID of an item in the playlist to the session for processing, so that the session can play the corresponding item.|
| getExtras(callback: AsyncCallback&lt;{[key: string]: Object}&gt;): void<sup>10+</sup> | Obtains the custom media packet set by the provider.|
## How to Develop
......@@ -48,13 +74,26 @@ To enable a system application to access the AVSession service as a controller,
AVSessionManager.createController(descriptor.sessionId).then((controller) => {
g_controller.push(controller);
}).catch((err) => {
console.error(`Failed to create controller. Code: ${err.code}, message: ${err.message}`);
});
});
}).catch((err) => {
console.error(`Failed to get all session descriptors. Code: ${err.code}, message: ${err.message}`);
});
// Obtain the descriptors of historical sessions.
avSession.getHistoricalSessionDescriptors().then((descriptors) => {
console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors.length : ${descriptors.length}`);
if (descriptors.length > 0){
console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors[0].isActive : ${descriptors[0].isActive}`);
console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors[0].type : ${descriptors[0].type}`);
console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors[0].sessionTag : ${descriptors[0].sessionTag}`);
console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors[0].sessionId : ${descriptors[0].sessionId}`);
console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors[0].elementName.bundleName : ${descriptors[0].elementName.bundleName}`);
}
}).catch((err) => {
console.error(`Failed to get historical session descriptors, error code: ${err.code}, error message: ${err.message}`);
});
```
2. Listen for the session state and service state events.
......@@ -74,7 +113,7 @@ To enable a system application to access the AVSession service as a controller,
AVSessionManager.createController(session.sessionId).then((controller) => {
g_controller.push(controller);
}).catch((err) => {
console.error(`Failed to create controller. Code: ${err.code}, message: ${err.message}`);
});
});
......@@ -103,7 +142,7 @@ To enable a system application to access the AVSession service as a controller,
// Subscribe to the 'sessionServiceDie' event.
AVSessionManager.on('sessionServiceDie', () => {
// The server is abnormal, and the application clears resources.
console.info(`Server exception.`);
})
```
......@@ -117,6 +156,10 @@ To enable a system application to access the AVSession service as a controller,
- **validCommandChange**: triggered when the valid commands supported by the session change.
- **outputDeviceChange**: triggered when the output device changes.
- **sessionDestroy**: triggered when a session is destroyed.
- **sessionEvent**: triggered when the custom session event changes.
- **extrasChange**: triggered when the custom media packet of the session changes.
- **queueItemsChange**: triggered when one or more items in the custom playlist of the session change.
- **queueTitleChange**: triggered when the custom playlist name of the session changes.
The controller can listen for events as required.
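The on/off subscription pattern the controller relies on can be sketched with a minimal emitter (plain TypeScript; this is not the AVSessionController implementation, just an illustration of the listener lifecycle):

```typescript
type Listener = (...args: unknown[]) => void;

// Minimal on/off registry mirroring how a controller subscribes to events
// such as 'metadataChange' and unsubscribes before being destroyed.
class TinyEmitter {
  private listeners = new Map<string, Set<Listener>>();

  on(event: string, fn: Listener): void {
    if (!this.listeners.has(event)) {
      this.listeners.set(event, new Set());
    }
    this.listeners.get(event)!.add(fn);
  }

  off(event: string, fn?: Listener): void {
    if (!fn) {
      this.listeners.delete(event); // drop all listeners for this event
      return;
    }
    this.listeners.get(event)?.delete(fn);
  }

  emit(event: string, ...args: unknown[]): void {
    this.listeners.get(event)?.forEach((fn) => fn(...args));
  }
}
```

Canceling listeners before destroying the controller, as step 6 below does for real sessions, prevents callbacks from firing on a dead object.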
......@@ -124,18 +167,18 @@ To enable a system application to access the AVSession service as a controller,
// Subscribe to the 'activeStateChange' event.
controller.on('activeStateChange', (isActive) => {
if (isActive) {
console.info(`The widget corresponding to the controller is highlighted.`);
} else {
console.info(`The widget corresponding to the controller is invalid.`);
}
});
// Subscribe to the 'sessionDestroy' event to enable the controller to get notified when the session dies.
controller.on('sessionDestroy', () => {
console.info(`on sessionDestroy : SUCCESS`);
controller.destroy().then(() => {
console.info(`destroy : SUCCESS`);
}).catch((err) => {
console.error(`Failed to destroy session. Code: ${err.code}, message: ${err.message}`);
});
});
......@@ -164,10 +207,26 @@ To enable a system application to access the AVSession service as a controller,
controller.on('outputDeviceChange', (device) => {
console.info(`on outputDeviceChange device isRemote : ${device.isRemote}`);
});
// Subscribe to custom session event changes.
controller.on('sessionEvent', (eventName, eventArgs) => {
console.info(`Received new session event, event name is ${eventName}, args are ${JSON.stringify(eventArgs)}`);
});
// Subscribe to custom media packet changes.
controller.on('extrasChange', (extras) => {
console.info(`Received custom media packet, packet data is ${JSON.stringify(extras)}`);
});
// Subscribe to custom playlist item changes.
controller.on('queueItemsChange', (items) => {
console.info(`Caught queue items change, items length is ${items.length}`);
});
// Subscribe to custom playlist name changes.
controller.on('queueTitleChange', (title) => {
console.info(`Caught queue title change, title is ${title}`);
});
```
4. Obtain the media information transferred by the provider for display on the UI, for example, displaying the track being played and the playback state in Media Controller.
```ts
async getInfoFromSessionByController() {
// It is assumed that an AVSessionController object corresponding to the session already exists. For details about how to create an AVSessionController object, see the code snippet above.
......@@ -186,19 +245,36 @@ To enable a system application to access the AVSession service as a controller,
let avPlaybackState: AVSessionManager.AVPlaybackState = await controller.getAVPlaybackState();
console.info(`get playbackState by controller : ${avPlaybackState.state}`);
console.info(`get favoriteState by controller : ${avPlaybackState.isFavorite}`);
// Obtain the playlist items of the session.
let queueItems: Array<AVSessionManager.AVQueueItem> = await controller.getAVQueueItems();
console.info(`get queueItems length by controller : ${queueItems.length}`);
// Obtain the playlist name of the session.
let queueTitle: string = await controller.getAVQueueTitle();
console.info(`get queueTitle by controller : ${queueTitle}`);
// Obtain the custom media packet of the session.
let extras: any = await controller.getExtras();
console.info(`get custom media packets by controller : ${JSON.stringify(extras)}`);
// Obtain the ability information provided by the application corresponding to the session.
let agent: WantAgent = await controller.getLaunchAbility();
console.info(`get want agent info by controller : ${JSON.stringify(agent)}`);
// Obtain the current playback position of the session.
let currentTime: number = controller.getRealPlaybackPositionSync();
console.info(`get current playback time by controller : ${currentTime}`);
// Obtain valid commands supported by the session.
let validCommands: Array<AVSessionManager.AVControlCommandType> = await controller.getValidCommands();
console.info(`get valid commands by controller : ${JSON.stringify(validCommands)}`);
}
```
5. Control the playback behavior, for example, sending a command to operate (play/pause/previous/next) the item being played in Media Controller.
After listening for the playback control command event, the audio and video application serving as the provider needs to implement the corresponding operation.
```ts
async sendCommandToSessionByController() {
// It is assumed that an AVSessionController object corresponding to the session already exists. For details about how to create an AVSessionController object, see the code snippet above.
let controller: AVSessionManager.AVSessionController = ALLREADY_HAVE_A_CONTROLLER;
// Obtain valid commands supported by the session.
let validCommandTypeArray: Array<AVSessionManager.AVControlCommandType> = await controller.getValidCommands();
console.info(`get validCommandArray by controller : length : ${validCommandTypeArray.length}`);
// Deliver the 'play' command.
......@@ -222,11 +298,28 @@ To enable a system application to access the AVSession service as a controller,
let avCommand: AVSessionManager.AVControlCommand = {command:'playNext'};
controller.sendControlCommand(avCommand);
}
// Deliver a custom playback control command.
let commandName: string = 'custom command';
let args = {
command : 'This is my custom command'
}
await controller.sendCommonCommand(commandName, args).then(() => {
console.info(`SendCommonCommand successfully`);
}).catch((err) => {
console.error(`Failed to send common command. Code: ${err.code}, message: ${err.message}`);
})
// Set the ID of an item in the specified playlist for the session to play.
let queueItemId: number = 0;
await controller.skipToQueueItem(queueItemId).then(() => {
console.info(`SkipToQueueItem successfully`);
}).catch((err) => {
console.error(`Failed to skip to queue item. Code: ${err.code}, message: ${err.message}`);
});
}
```
6. When the audio and video application exits, cancel the listener and release the resources.
```ts
async destroyController() {
// It is assumed that an AVSessionController object corresponding to the session already exists. For details about how to create an AVSessionController object, see the code snippet above.
......@@ -235,9 +328,9 @@ To enable a system application to access the AVSession service as a controller,
// Destroy the AVSessionController object. After being destroyed, it is no longer available.
controller.destroy(function (err) {
if (err) {
console.error(`Failed to destroy controller. Code: ${err.code}, message: ${err.message}`);
} else {
console.info(`Destroy controller SUCCESS`);
}
});
}
......
......@@ -36,15 +36,15 @@ To enable a system application that accesses the AVSession service as the contro
let audioDevices;
await audioRoutingManager.getDevices(audio.DeviceFlag.OUTPUT_DEVICES_FLAG).then((data) => {
audioDevices = data;
console.info(`Promise returned to indicate that the device list is obtained.`);
}).catch((err) => {
console.error(`Failed to get devices. Code: ${err.code}, message: ${err.message}`);
});
AVSessionManager.castAudio('all', audioDevices).then(() => {
console.info(`castAudio : SUCCESS`);
}).catch((err) => {
console.error(`Failed to cast audio. Code: ${err.code}, message: ${err.message}`);
});
```
......
......@@ -78,7 +78,9 @@ export class AVPlayerDemo {
private avPlayer;
private count: number = 0;
private surfaceID: string; // The surfaceID parameter specifies the window used to display the video. Its value is obtained through the XComponent.
private isSeek: boolean = true; // Specify whether the seek operation is supported.
private fileSize: number = -1;
private fd: number = 0;
// Set AVPlayer callback functions.
setAVPlayerCallback() {
// Callback function for the seek operation.
......@@ -113,8 +115,13 @@ export class AVPlayerDemo {
case 'playing': // This state is reported upon a successful callback of play().
console.info('AVPlayer state playing called.');
if (this.count !== 0) {
if (this.isSeek) {
console.info('AVPlayer start to seek.');
this.avPlayer.seek(this.avPlayer.duration); // Call seek() to seek to the end of the video clip.
} else {
// When the seek operation is not supported, the playback continues until it reaches the end.
console.info('AVPlayer wait to play end.');
}
} else {
this.avPlayer.pause(); // Call pause() to pause the playback.
}
......@@ -152,10 +159,11 @@ export class AVPlayerDemo {
let context = getContext(this) as common.UIAbilityContext;
// Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
let pathDir = context.filesDir;
let path = pathDir + '/H264_AAC.mp4';
// Open the corresponding file address to obtain the file descriptor and assign a value to the URL to trigger the reporting of the initialized state.
let file = await fs.open(path);
fdPath = fdPath + '' + file.fd;
this.isSeek = true; // The seek operation is supported.
this.avPlayer.url = fdPath;
}
......@@ -169,9 +177,86 @@ export class AVPlayerDemo {
// The return type is {fd,offset,length}, where fd indicates the file descriptor address of the HAP file, offset indicates the media asset offset, and length indicates the duration of the media asset to play.
let context = getContext(this) as common.UIAbilityContext;
let fileDescriptor = await context.resourceManager.getRawFd('H264_AAC.mp4');
this.isSeek = true; // The seek operation is supported.
// Assign a value to fdSrc to trigger the reporting of the initialized state.
this.avPlayer.fdSrc = fileDescriptor;
}
// The following demo shows how to use the file system to open the sandbox address, obtain the media file address, and play the media file with the seek operation using the dataSrc attribute.
async avPlayerDataSrcSeekDemo() {
// Create an AVPlayer instance.
this.avPlayer = await media.createAVPlayer();
// Set a callback function for state changes.
this.setAVPlayerCallback();
// dataSrc indicates the playback source address. When the seek operation is supported, fileSize indicates the size of the file to be played. The following describes how to assign a value to fileSize.
let src = {
fileSize: -1,
callback: (buf, length, pos) => {
let num = 0;
if (buf == undefined || length == undefined || pos == undefined) {
return -1;
}
num = fs.readSync(this.fd, buf, { offset: pos, length: length });
if (num > 0 && (this.fileSize >= pos)) {
return num;
}
return -1;
}
}
let context = getContext(this) as common.UIAbilityContext;
// Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
let pathDir = context.filesDir;
let path = pathDir + '/H264_AAC.mp4';
await fs.open(path).then((file) => {
this.fd = file.fd;
})
// Obtain the size of the file to be played.
this.fileSize = fs.statSync(path).size;
src.fileSize = this.fileSize;
this.isSeek = true; // The seek operation is supported.
this.avPlayer.dataSrc = src;
}
// The following demo shows how to use the file system to open the sandbox address, obtain the media file address, and play the media file without the seek operation using the dataSrc attribute.
async avPlayerDataSrcNoSeekDemo() {
// Create an AVPlayer instance.
this.avPlayer = await media.createAVPlayer();
// Set a callback function for state changes.
this.setAVPlayerCallback();
let context = getContext(this) as common.UIAbilityContext;
let src: object = {
fileSize: -1,
callback: (buf, length, pos) => {
let num = 0;
if (buf == undefined || length == undefined) {
return -1;
}
num = fs.readSync(this.fd, buf);
if (num > 0) {
return num;
}
return -1;
}
}
// Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
let pathDir = context.filesDir;
let path = pathDir + '/H264_AAC.mp4';
await fs.open(path).then((file) => {
this.fd = file.fd;
})
this.isSeek = false; // The seek operation is not supported.
this.avPlayer.dataSrc = src;
}
// The following demo shows how to play live streams by setting the network address through the URL.
async avPlayerLiveDemo() {
// Create an AVPlayer instance.
this.avPlayer = await media.createAVPlayer();
// Set a callback function for state changes.
this.setAVPlayerCallback();
this.isSeek = false; // The seek operation is not supported.
this.avPlayer.url = 'http://xxx.xxx.xxx.xxx:xx/xx/index.m3u8'; // Play live webcasting streams using HLS.
}
}
```
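The `dataSrc` callback contract shown above (copy up to `length` bytes into `buf`, return the number of bytes copied, or -1 at end of stream or on error) can be illustrated without the AVPlayer API at all. The sketch below simulates the data source with an in-memory `Uint8Array` instead of a file descriptor; `makeDataSource` and its driver loop are illustrative helpers, not part of the media API.

```typescript
// Minimal sketch of the dataSrc callback contract, assuming an in-memory
// source instead of a real file. Returns bytes copied, or -1 when exhausted.
function makeDataSource(data: Uint8Array) {
  return {
    fileSize: data.length,
    callback: (buf: Uint8Array, length: number, pos?: number): number => {
      const start = pos ?? 0;      // pos is omitted when seek is not supported
      if (start >= data.length) {
        return -1;                 // no more data: signal end of stream
      }
      const n = Math.min(length, data.length - start);
      buf.set(data.subarray(start, start + n));
      return n;                    // number of bytes actually copied
    }
  };
}

// Usage: drain the source in 4-byte chunks until the callback returns -1,
// the same way the player repeatedly invokes the callback during playback.
const src = makeDataSource(new Uint8Array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10]));
const chunk = new Uint8Array(4);
let pos = 0;
let total = 0;
for (;;) {
  const n = src.callback(chunk, chunk.length, pos);
  if (n < 0) break;
  pos += n;
  total += n;
}
```

Note that in the no-seek demo the callback ignores `pos` and reads sequentially, which is why live streams set `fileSize` to -1 and disable seeking.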
......