diff --git a/en/application-dev/application-models/Readme-EN.md b/en/application-dev/application-models/Readme-EN.md
index 65f2b4c16ea42ecdf37082a5a9f8e26eb20dd6e6..65d5dc91d92ae7622d7a5824759798230f3a4806 100644
--- a/en/application-dev/application-models/Readme-EN.md
+++ b/en/application-dev/application-models/Readme-EN.md
@@ -17,7 +17,6 @@
- ExtensionAbility Component
- [ExtensionAbility Component Overview](extensionability-overview.md)
- [ServiceExtensionAbility](serviceextensionability.md)
- - [DataShareExtensionAbility (for System Applications Only)](datashareextensionability.md)
- [AccessibilityExtensionAbility](accessibilityextensionability.md)
- [EnterpriseAdminExtensionAbility](enterprise-extensionAbility.md)
- [InputMethodExtensionAbility](inputmethodextentionability.md)
@@ -37,9 +36,9 @@
- [Applying Custom Drawing in the Widget](arkts-ui-widget-page-custom-drawing.md)
- Widget Event Development
- [Widget Event Capability Overview](arkts-ui-widget-event-overview.md)
+ - [Redirecting to a Specified Page Through the Router Event](arkts-ui-widget-event-router.md)
- [Updating Widget Content Through FormExtensionAbility](arkts-ui-widget-event-formextensionability.md)
- [Updating Widget Content Through UIAbility](arkts-ui-widget-event-uiability.md)
- - [Redirecting to a Specified Page Through the Router Event](arkts-ui-widget-event-router.md)
- Widget Data Interaction
- [Widget Data Interaction Overview](arkts-ui-widget-interaction-overview.md)
- [Configuring a Widget to Update Periodically](arkts-ui-widget-update-by-time.md)
@@ -62,8 +61,8 @@
- [Cross-Device Migration (for System Applications Only)](hop-cross-device-migration.md)
- [Multi-device Collaboration (for System Applications Only)](hop-multi-device-collaboration.md)
- [Subscribing to System Environment Variable Changes](subscribe-system-environment-variable-changes.md)
- - IPC
- - [Process Model](process-model-stage.md)
+ - Process Model
+ - [Process Model Overview](process-model-stage.md)
- Common Events
- [Introduction to Common Events](common-event-overview.md)
- Common Event Subscription
@@ -74,13 +73,13 @@
- [Publishing Common Events](common-event-publish.md)
- [Removing Sticky Common Events](common-event-remove-sticky.md)
- [Background Services](background-services.md)
- - Inter-Thread Communication
- - [Thread Model](thread-model-stage.md)
+ - Thread Model
+ - [Thread Model Overview](thread-model-stage.md)
- [Using Emitter for Inter-Thread Communication](itc-with-emitter.md)
- [Using Worker for Inter-Thread Communication](itc-with-worker.md)
- Mission Management
- [Mission Management Scenarios](mission-management-overview.md)
- - [Mission Management and Launch Type](mission-management-launch-type.md)
+ - [Mission and Launch Type](mission-management-launch-type.md)
- [Page Stack and MissionList](page-mission-stack.md)
- [Setting the Icon and Name of a Mission Snapshot](mission-set-icon-name-for-task-snapshot.md)
- [Application Configuration File](config-file-stage.md)
@@ -120,12 +119,12 @@
- [Context](application-context-fa.md)
- [Want](want-fa.md)
- [Component Startup Rules](component-startup-rules-fa.md)
- - IPC
- - [Process Model](process-model-fa.md)
+ - Process Model
+ - [Process Model Overview](process-model-fa.md)
- [Common Events](common-event-fa.md)
- [Background Services](rpc.md)
- - Inter-Thread Communication
- - [Thread Model](thread-model-fa.md)
+ - Thread Model
+ - [Thread Model Overview](thread-model-fa.md)
- [Inter-Thread Communication](itc-fa-overview.md)
- [Mission Management](mission-management-fa.md)
- [Application Configuration File](config-file-fa.md)
diff --git a/en/application-dev/application-models/arkts-ui-widget-image-update.md b/en/application-dev/application-models/arkts-ui-widget-image-update.md
index 00c00a744afd8422274617005a50583fef5d92ee..4862fbf747c0275d179eb4a2f988280379f2d262 100644
--- a/en/application-dev/application-models/arkts-ui-widget-image-update.md
+++ b/en/application-dev/application-models/arkts-ui-widget-image-update.md
@@ -4,7 +4,7 @@
Generally, local images or online images downloaded from the network need to be displayed on a widget. To obtain local and online images, use the FormExtensionAbility. The following exemplifies how to show local and online images on a widget.
-1. Internet access is required for downloading online images. Therefore, you need to apply for the **ohos.permission.INTERNET** permission. For details, see[Declaring Permissions in the Configuration File](../security/accesstoken-guidelines.md).
+1. Internet access is required for downloading online images. Therefore, you need to apply for the **ohos.permission.INTERNET** permission. For details, see [Declaring Permissions in the Configuration File](../security/accesstoken-guidelines.md).
2. Update local files in the **onAddForm** lifecycle callback of the EntryFormAbility.
diff --git a/en/application-dev/application-models/datashareextensionability.md b/en/application-dev/application-models/datashareextensionability.md
deleted file mode 100644
index bea3de69c6d7ad375206fb1d53bcc36c2624989d..0000000000000000000000000000000000000000
--- a/en/application-dev/application-models/datashareextensionability.md
+++ /dev/null
@@ -1,4 +0,0 @@
-# DataShareExtensionAbility (for System Applications Only)
-
-
-DataShareExtensionAbility provides the data sharing capability. System applications can implement a DataShareExtensionAbility or access an existing DataShareExtensionAbility in the system. Third-party applications can only access an existing DataShareExtensionAbility. For details, see [Cross-Application Data Sharing Overview](../database/share-device-data-across-apps-overview.md).
diff --git a/en/application-dev/application-models/mission-management-launch-type.md b/en/application-dev/application-models/mission-management-launch-type.md
index 199de6eefead9fc056adf8d08c49f792a54a4a83..56a389cc52e093008491f75e01144bd7635b94eb 100644
--- a/en/application-dev/application-models/mission-management-launch-type.md
+++ b/en/application-dev/application-models/mission-management-launch-type.md
@@ -1,4 +1,4 @@
-# Mission Management and Launch Type
+# Mission and Launch Type
One UIAbility instance corresponds to one mission. The number of UIAbility instances is related to the UIAbility launch type, specified by **launchType**, which is configured in the **config.json** file in the FA model and the [module.json5](../quick-start/module-configuration-file.md) file in the stage model.
@@ -11,13 +11,13 @@ The following describes how the mission list manager manages the UIAbility insta

-- **multiton**: Each time [startAbility()](../reference/apis/js-apis-inner-application-uiAbilityContext.md#uiabilitycontextstartability) is called, a **UIAbility** instance is created in the application process.
+- **multiton**: Each time [startAbility()](../reference/apis/js-apis-inner-application-uiAbilityContext.md#uiabilitycontextstartability) is called, a UIAbility instance is created in the application process.
**Figure 2** Missions and multiton mode

-- **specified**: The ([onAcceptWant()](../reference/apis/js-apis-app-ability-abilityStage.md#abilitystageonacceptwant)) method of [AbilityStage](abilitystage.md) determines whether to create an instance.
+- **specified**: The [onAcceptWant()](../reference/apis/js-apis-app-ability-abilityStage.md#abilitystageonacceptwant) method of [AbilityStage](abilitystage.md) determines whether to create a UIAbility instance.
**Figure 3** Missions and specified mode
diff --git a/en/application-dev/application-models/mission-management-overview.md b/en/application-dev/application-models/mission-management-overview.md
index ba55ebb136ebffca0294bf69013f2f2ab4392e7f..785a9f8291ea43e756ebed07843ceef23570160d 100644
--- a/en/application-dev/application-models/mission-management-overview.md
+++ b/en/application-dev/application-models/mission-management-overview.md
@@ -4,7 +4,7 @@
Before getting started with the development of mission management, be familiar with the following concepts related to mission management:
-- AbilityRecord: minimum unit for the system service to manage a UIAbility instance. It corresponds to a UIAbility component instance of an application.
+- AbilityRecord: minimum unit for the system service to manage a UIAbility instance. It corresponds to a UIAbility component instance of an application. A maximum of 512 UIAbility instances can be managed on the system service side.
- MissionRecord: minimum unit for mission management. One MissionRecord has only one AbilityRecord. In other words, a UIAbility component instance corresponds to a mission.
@@ -30,42 +30,42 @@ Missions are managed by system applications (such as home screen), rather than t
A UIAbility instance corresponds to an independent mission. Therefore, when an application calls [startAbility()](../reference/apis/js-apis-inner-application-uiAbilityContext.md#uiabilitycontextstartability) to start a UIAbility, a mission is created.
-To call [missionManager](../reference/apis/js-apis-application-missionManager.md) to manage missions, the home screen application must request the **ohos.permission.MANAGE_MISSIONS** permission. For details about the configuration, see [Declaring Permissions in the Configuration File](../security/accesstoken-guidelines.md#declaring-permissions-in-the-configuration-file).
+1. To call [missionManager](../reference/apis/js-apis-application-missionManager.md) to manage missions, the home screen application must request the **ohos.permission.MANAGE_MISSIONS** permission. For details about the configuration, see [Declaring Permissions in the Configuration File](../security/accesstoken-guidelines.md#declaring-permissions-in-the-configuration-file).
-You can use **missionManager** to manage missions, for example, listening for mission changes, obtaining mission information or snapshots, and clearing, locking, or unlocking missions.
+2. You can use **missionManager** to manage missions, for example, listening for mission changes, obtaining mission information or snapshots, and clearing, locking, or unlocking missions.
```ts
import missionManager from '@ohos.app.ability.missionManager'
let listener = {
- // Listen for mission creation.
- onMissionCreated: function (mission) {
- console.info("--------onMissionCreated-------")
- },
- // Listen for mission destruction.
- onMissionDestroyed: function (mission) {
- console.info("--------onMissionDestroyed-------")
- },
- // Listen for mission snapshot changes.
- onMissionSnapshotChanged: function (mission) {
- console.info("--------onMissionSnapshotChanged-------")
- },
- // Listen for switching the mission to the foreground.
- onMissionMovedToFront: function (mission) {
- console.info("--------onMissionMovedToFront-------")
- },
- // Listen for mission icon changes.
- onMissionIconUpdated: function (mission, icon) {
- console.info("--------onMissionIconUpdated-------")
- },
- // Listen for mission name changes.
- onMissionLabelUpdated: function (mission) {
- console.info("--------onMissionLabelUpdated-------")
- },
- // Listen for mission closure events.
- onMissionClosed: function (mission) {
- console.info("--------onMissionClosed-------")
- }
+ // Listen for mission creation.
+ onMissionCreated: function (mission) {
+ console.info("--------onMissionCreated-------")
+ },
+ // Listen for mission destruction.
+ onMissionDestroyed: function (mission) {
+ console.info("--------onMissionDestroyed-------")
+ },
+ // Listen for mission snapshot changes.
+ onMissionSnapshotChanged: function (mission) {
+ console.info("--------onMissionSnapshotChanged-------")
+ },
+ // Listen for switching the mission to the foreground.
+ onMissionMovedToFront: function (mission) {
+ console.info("--------onMissionMovedToFront-------")
+ },
+ // Listen for mission icon changes.
+ onMissionIconUpdated: function (mission, icon) {
+ console.info("--------onMissionIconUpdated-------")
+ },
+ // Listen for mission name changes.
+ onMissionLabelUpdated: function (mission) {
+ console.info("--------onMissionLabelUpdated-------")
+ },
+ // Listen for mission closure events.
+ onMissionClosed: function (mission) {
+ console.info("--------onMissionClosed-------")
+ }
};
// 1. Register a mission change listener.
@@ -73,56 +73,56 @@ You can use **missionManager** to manage missions, for example, listening for mi
// 2. Obtain the latest 20 missions in the system.
missionManager.getMissionInfos("", 20, (error, missions) => {
- console.info("getMissionInfos is called, error.code = " + error.code);
- console.info("size = " + missions.length);
- console.info("missions = " + JSON.stringify(missions));
+ console.info("getMissionInfos is called, error.code = " + error.code);
+ console.info("size = " + missions.length);
+ console.info("missions = " + JSON.stringify(missions));
});
// 3. Obtain the detailed information about a mission.
let missionId = 11; // The mission ID 11 is only an example.
let mission = missionManager.getMissionInfo("", missionId).catch(function (err) {
- console.info(err);
+ console.info(err);
});
// 4. Obtain the mission snapshot.
missionManager.getMissionSnapShot("", missionId, (error, snapshot) => {
- console.info("getMissionSnapShot is called, error.code = " + error.code);
- console.info("bundleName = " + snapshot.ability.bundleName);
+ console.info("getMissionSnapShot is called, error.code = " + error.code);
+ console.info("bundleName = " + snapshot.ability.bundleName);
})
// 5. Obtain the low-resolution mission snapshot.
missionManager.getLowResolutionMissionSnapShot("", missionId, (error, snapshot) => {
- console.info("getLowResolutionMissionSnapShot is called, error.code = " + error.code);
- console.info("bundleName = " + snapshot.ability.bundleName);
+ console.info("getLowResolutionMissionSnapShot is called, error.code = " + error.code);
+ console.info("bundleName = " + snapshot.ability.bundleName);
})
// 6. Lock or unlock the mission.
missionManager.lockMission(missionId).then(() => {
- console.info("lockMission is called ");
+ console.info("lockMission is called ");
});
missionManager.unlockMission(missionId).then(() => {
- console.info("unlockMission is called ");
+ console.info("unlockMission is called ");
});
// 7. Switch the mission to the foreground.
missionManager.moveMissionToFront(missionId).then(() => {
- console.info("moveMissionToFront is called ");
+ console.info("moveMissionToFront is called ");
});
// 8. Clear a single mission.
missionManager.clearMission(missionId).then(() => {
- console.info("clearMission is called ");
+ console.info("clearMission is called ");
});
// 9. Clear all missions.
missionManager.clearAllMissions().catch(function (err) {
- console.info(err);
+ console.info(err);
});
// 10. Deregister the mission change listener.
missionManager.off('mission', listenerId, (error) => {
- console.info("unregisterMissionListener");
+ console.info("unregisterMissionListener");
})
```
diff --git a/en/application-dev/application-models/process-model-fa.md b/en/application-dev/application-models/process-model-fa.md
index 699643031121521fbf95d26a949df906fa175a18..ce4c9778d3bf678c7ecb8094477050a42eebb7d7 100644
--- a/en/application-dev/application-models/process-model-fa.md
+++ b/en/application-dev/application-models/process-model-fa.md
@@ -1,4 +1,4 @@
-# Process Model (FA Model)
+# Process Model Overview (FA Model)
The OpenHarmony process model is shown below.
diff --git a/en/application-dev/application-models/process-model-stage.md b/en/application-dev/application-models/process-model-stage.md
index 03da480722de124a1ede58da52e74cd48c5f23f0..cf758d94636773dfd190366d0e215de655902abd 100644
--- a/en/application-dev/application-models/process-model-stage.md
+++ b/en/application-dev/application-models/process-model-stage.md
@@ -1,4 +1,4 @@
-# Process Model (Stage Model)
+# Process Model Overview (Stage Model)
The OpenHarmony process model is shown below.
diff --git a/en/application-dev/application-models/thread-model-fa.md b/en/application-dev/application-models/thread-model-fa.md
index 75401be69cba994ac631b6da997fb6ce2ea35a2f..f6b335f8932ee1ebd5bb9bdf11db99ff354a1470 100644
--- a/en/application-dev/application-models/thread-model-fa.md
+++ b/en/application-dev/application-models/thread-model-fa.md
@@ -1,13 +1,11 @@
-# Thread Model (FA Model)
-
+# Thread Model Overview (FA Model)
There are three types of threads in the FA model:
-
- Main thread
-
-Manages other threads.
-
+
+ Manages other threads.
+
- Ability thread
- One ability thread for each ability.
- Distributes input events.
@@ -19,10 +17,8 @@ Manages other threads.
Performs time-consuming operations
-
Based on the OpenHarmony thread model, different services run on different threads. Service interaction requires inter-thread communication. Threads can communicate with each other in Emitter or Worker mode. Emitter is mainly used for event synchronization between threads, and Worker is mainly used to execute time-consuming tasks.
-
> **NOTE**
>
> The FA model provides an independent thread for each ability. Emitter is mainly used for event synchronization within the ability thread, between a pair of ability threads, or between the ability thread and worker thread.
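+
+A minimal sketch of the Emitter pattern described above is shown below; the event ID and payload are only examples.
+
+```ts
+import emitter from '@ohos.events.emitter';
+
+// Subscribe to an event on the receiving thread. The event ID 1 is only an example.
+let event = { eventId: 1 };
+emitter.on(event, (eventData) => {
+  console.info('event received: ' + JSON.stringify(eventData.data));
+});
+
+// Emit the event from another thread of the same process, for example, a worker thread.
+emitter.emit(event, { data: { content: 'hello' } });
+```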
diff --git a/en/application-dev/application-models/thread-model-stage.md b/en/application-dev/application-models/thread-model-stage.md
index 4ca9fb3ed369f78cf12054c7b6da085b8640b1db..7343b9b619a5d68354e65e254a22a2b078ca44ee 100644
--- a/en/application-dev/application-models/thread-model-stage.md
+++ b/en/application-dev/application-models/thread-model-stage.md
@@ -1,4 +1,4 @@
-# Thread Model (Stage Model)
+# Thread Model Overview (Stage Model)
For an OpenHarmony application, each process has a main thread to provide the following functionalities:
diff --git a/en/application-dev/database/share-data-by-silent-access.md b/en/application-dev/database/share-data-by-silent-access.md
index 142642f98646003c675fcbd15d9369b6664948a6..50ff03f084c889a807c6caf4d7c369bfbe0d2a51 100644
--- a/en/application-dev/database/share-data-by-silent-access.md
+++ b/en/application-dev/database/share-data-by-silent-access.md
@@ -3,7 +3,7 @@
## When to Use
-According to big data statistics, in a typical cross-application data access scenario, applications are started nearly 83 times on average in a day.
+In a typical cross-application data access scenario, an application may be started multiple times.
To reduce the number of application startup times and improve the access speed, OpenHarmony provides the silent access feature, which allows direct access to the database without starting the data provider.
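+
+For illustration, a rough sketch of a consumer-side query through silent access is shown below. The proxy URI, table, and column names are placeholders, and the **isProxy** option is assumed to be available in the API version you target; check the **@ohos.data.dataShare** reference for the exact signature.
+
+```ts
+import dataShare from '@ohos.data.dataShare';
+import dataSharePredicates from '@ohos.data.dataSharePredicates';
+
+// Placeholder proxy URI; it must match the URI declared by the data provider.
+let dseUri = 'datashareproxy://com.example.provider/entry/DB00/TBL00';
+
+async function queryBySilentAccess(context) {
+  // Request silent access so that the data provider process is not started.
+  let dsHelper = await dataShare.createDataShareHelper(context, dseUri, { isProxy: true });
+  let predicates = new dataSharePredicates.DataSharePredicates();
+  let resultSet = await dsHelper.query(dseUri, predicates, ['name']); // 'name' is a hypothetical column.
+  console.info(`query succeeded, row count: ${resultSet.rowCount}`);
+}
+```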
diff --git a/en/application-dev/file-management/select-user-file.md b/en/application-dev/file-management/select-user-file.md
index 77fc2dd23c080c357d1749df4bb3ca551cba3a0d..d339f27e9c1e09cdc77094610619e933816a55f8 100644
--- a/en/application-dev/file-management/select-user-file.md
+++ b/en/application-dev/file-management/select-user-file.md
@@ -37,15 +37,28 @@ The **FilePicker** provides the following interfaces by file type:
Use [PhotoSelectResult](../reference/apis/js-apis-file-picker.md#photoselectresult) to return a result set. Further operations on the selected files can be performed based on the file URIs in the result set.
```ts
+ let uri = null;
const photoPicker = new picker.PhotoViewPicker();
- photoPicker.select(photoSelectOptions)
- .then(async (photoSelectResult) => {
- let uri = photoSelectResult.photoUris[0];
- // Perform operations on the files based on the file URIs obtained.
- })
- .catch((err) => {
- console.error(`Invoke documentPicker.select failed, code is ${err.code}, message is ${err.message}`);
- })
+ photoPicker.select(photoSelectOptions).then((photoSelectResult) => {
+ uri = photoSelectResult.photoUris[0];
+ }).catch((err) => {
+ console.error(`Invoke photoPicker.select failed, code is ${err.code}, message is ${err.message}`);
+ })
+ ```
+
+5. After control returns from the FilePicker page, use [**fs.openSync**](../reference/apis/js-apis-file-fs.md#fsopensync) to open the file based on the URI and obtain the FD.
+
+ ```ts
+ let file = fs.openSync(uri, fs.OpenMode.READ_WRITE);
+ console.info('file fd: ' + file.fd);
+ ```
+
+6. Use [fs.writeSync](../reference/apis/js-apis-file-fs.md#writesync) to write data to the file based on the FD, and then close the FD.
+
+ ```ts
+ let writeLen = fs.writeSync(file.fd, 'hello, world');
+ console.info('write data to file succeed and size is:' + writeLen);
+ fs.closeSync(file);
```
## Selecting Documents
@@ -63,23 +76,69 @@ The **FilePicker** provides the following interfaces by file type:
```
3. Create a **documentViewPicker** instance, and call [**select()**](../reference/apis/js-apis-file-picker.md#select-3) to open the **FilePicker** page for the user to select documents.
+
After the documents are selected successfully, a result set containing the file URIs is returned. Further operations can be performed on the documents based on the file URIs.
+
+ For example, you can use [file management APIs](../reference/apis/js-apis-file-fs.md) to obtain file attribute information, such as the file size, access time, and last modification time, based on the URI. If you need to obtain the file name, use [startAbilityForResult](../../application-dev/application-models/uiability-intra-device-interaction.md).
+
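+ For illustration, a minimal sketch of reading such attributes with the [fs APIs](../reference/apis/js-apis-file-fs.md) is shown below, assuming that **uri** has been returned by the picker call in the next snippet:
+
+ ```ts
+ // fs is imported from '@ohos.file.fs'.
+ let file = fs.openSync(uri, fs.OpenMode.READ_ONLY);
+ let stat = fs.statSync(file.fd);
+ console.info(`size: ${stat.size}, last access time: ${stat.atime}, last modification time: ${stat.mtime}`);
+ fs.closeSync(file);
+ ```
+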
> **NOTE**
>
> Currently, **DocumentSelectOptions** is not configurable. By default, all types of user files are selected.
```ts
+ let uri = null;
const documentViewPicker = new picker.DocumentViewPicker(); // Create a documentViewPicker instance.
- documentViewPicker.select(documentSelectOptions)
- .then((documentSelectResult) => {
- let uri = documentSelectResult[0];
- // Perform operations on the documents based on the file URIs.
- })
- .catch((err) => {
- console.error(`Invoke documentPicker.select failed, code is ${err.code}, message is ${err.message}`);
- })
+ documentViewPicker.select(documentSelectOptions).then((documentSelectResult) => {
+ uri = documentSelectResult[0];
+ }).catch((err) => {
+ console.error(`Invoke documentPicker.select failed, code is ${err.code}, message is ${err.message}`);
+ })
+ ```
+
+ > **NOTE**
+ >
+ > Currently, **DocumentSelectOptions** does not provide the method for obtaining the file name. To obtain the file name, use **startAbilityForResult()**.
+
+ ```ts
+ let config = {
+ action: 'ohos.want.action.OPEN_FILE',
+ parameters: {
+ startMode: 'choose',
+ }
+ }
+ try {
+ let result = await context.startAbilityForResult(config, {windowMode: 1});
+ if (result.resultCode !== 0) {
+ console.error(`DocumentPicker.select failed, code is ${result.resultCode}, message is ${result.want.parameters.message}`);
+ return;
+ }
+ // Obtain the URI of the document.
+ let select_item_list = result.want.parameters.select_item_list;
+ // Obtain the name of the document.
+ let file_name_list = result.want.parameters.file_name_list;
+ } catch (err) {
+ console.error(`Invoke documentPicker.select failed, code is ${err.code}, message is ${err.message}`);
+ }
```
+4. After control returns from the FilePicker page, use [**fs.openSync**](../reference/apis/js-apis-file-fs.md#fsopensync) to open the file based on the URI and obtain the FD.
+
+ ```ts
+ let file = fs.openSync(uri, fs.OpenMode.READ_WRITE);
+ console.info('file fd: ' + file.fd);
+ ```
+
+5. Use [fs.readSync](../reference/apis/js-apis-file-fs.md#readsync) to read data from the file based on the FD, and then close the FD.
+
+ ```ts
+ let file = fs.openSync(uri, fs.OpenMode.READ_WRITE);
+ let buf = new ArrayBuffer(4096);
+ let num = fs.readSync(file.fd, buf);
+ console.info('read data to file succeed and size is:' + num);
+ fs.closeSync(file);
+ ```
+
+
## Selecting an Audio File
1. Import the **FilePicker** module.
@@ -105,13 +164,26 @@ The **FilePicker** provides the following interfaces by file type:
> Currently, **AudioSelectOptions** is not configurable. By default, all types of user files are selected.
```ts
+ let uri = null;
const audioViewPicker = new picker.AudioViewPicker();
- audioViewPicker.select(audioSelectOptions)
- .then(audioSelectResult => {
- let uri = audioSelectOptions[0];
- // Perform operations on the audio files based on the file URIs.
- })
- .catch((err) => {
- console.error(`Invoke audioPicker.select failed, code is ${err.code}, message is ${err.message}`);
- })
+ audioViewPicker.select(audioSelectOptions).then(audioSelectResult => {
+ uri = audioSelectResult[0];
+ }).catch((err) => {
+ console.error(`Invoke audioPicker.select failed, code is ${err.code}, message is ${err.message}`);
+ })
+ ```
+
+4. After control returns from the FilePicker page, use [**fs.openSync**](../reference/apis/js-apis-file-fs.md#fsopensync) to open the file based on the URI and obtain the FD.
+
+ ```ts
+ let file = fs.openSync(uri, fs.OpenMode.READ_WRITE);
+ console.info('file fd: ' + file.fd);
+ ```
+
+5. Use [fs.writeSync](../reference/apis/js-apis-file-fs.md#writesync) to write data to the file based on the FD, and then close the FD.
+
+ ```ts
+ let writeLen = fs.writeSync(file.fd, 'hello, world');
+ console.info('write data to file succeed and size is:' + writeLen);
+ fs.closeSync(file);
```
diff --git a/en/application-dev/file-management/share-app-file.md b/en/application-dev/file-management/share-app-file.md
index d9ee1d90904f5cdb43cd1987a66b09668200bc81..c2f8f8d12f5ff056e043fb632cff9752c95256ce 100644
--- a/en/application-dev/file-management/share-app-file.md
+++ b/en/application-dev/file-management/share-app-file.md
@@ -12,7 +12,7 @@ You can use the related APIs to [share a file with another application](#sharing
The file URIs are in the following format:
- file://<bundleName>/<path>/\#networkid=<networkid>
+ file://<bundleName>/<path>
- **file**: indicates a file URI.
@@ -20,8 +20,6 @@ The file URIs are in the following format:
- *path*: specifies the application sandbox path of the file.
-- *networkid* (optional): specifies the device to which the file belongs in a distributed file system. Leave this parameter unspecified if the file location does not need to be set.
-
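+
+For illustration, a sketch of obtaining a URI in this format from an application sandbox path is shown below, assuming the **@ohos.file.fileuri** module; the file name is only an example.
+
+```ts
+import fileUri from '@ohos.file.fileuri';
+import common from '@ohos.app.ability.common';
+
+let context = getContext(this) as common.UIAbilityContext;
+let filePath = context.filesDir + '/test.txt'; // Application sandbox path of the file.
+let uri = fileUri.getUriFromPath(filePath);    // Expected form: file://<bundleName>/<path>
+console.info(`file uri: ${uri}`);
+```
+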
## Sharing a File with Another Application
Before sharing application files, you need to [obtain the application file path](../application-models/application-context-stage.md#obtaining-the-application-development-path).
diff --git a/en/application-dev/media/audio-playback-concurrency.md b/en/application-dev/media/audio-playback-concurrency.md
index 0b36594f6bef62c7ba7588bc8977af67609a6c9d..fee7e776d68914ad376e01b8fe40ee84e3da4224 100644
--- a/en/application-dev/media/audio-playback-concurrency.md
+++ b/en/application-dev/media/audio-playback-concurrency.md
@@ -14,7 +14,7 @@ The audio interruption policy determines the operations (for example, pause, res
Two audio interruption modes, specified by [InterruptMode](../reference/apis/js-apis-audio.md#interruptmode9), are preset in the audio interruption policy:
-- **SHARED_MODE**: Multiple audio streams created by an application share one audio focus. The concurrency rules between these audio streams are determined by the application, without the use of the audio interruption policy. However, if another application needs to play audio while one of these audio streams is being played, the audio interruption policy is triggered.
+- **SHARE_MODE**: Multiple audio streams created by an application share one audio focus. The concurrency rules between these audio streams are determined by the application, without the use of the audio interruption policy. However, if another application needs to play audio while one of these audio streams is being played, the audio interruption policy is triggered.
- **INDEPENDENT_MODE**: Each audio stream created by an application has an independent audio focus. When multiple audio streams are played concurrently, the audio interruption policy is triggered.
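+
+A minimal sketch of selecting the mode on an existing **AudioRenderer** instance is shown below; creating the renderer is assumed to be done elsewhere.
+
+```ts
+import audio from '@ohos.multimedia.audio';
+
+// audioRenderer is an AudioRenderer instance created beforehand.
+// Let each stream of this application compete for the audio focus independently.
+audioRenderer.setInterruptMode(audio.InterruptMode.INDEPENDENT_MODE).then(() => {
+  console.info('setInterruptMode INDEPENDENT_MODE succeeded');
+}).catch((err) => {
+  console.error(`setInterruptMode failed, code is ${err.code}, message is ${err.message}`);
+});
+```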
diff --git a/en/application-dev/media/audio-playback-overview.md b/en/application-dev/media/audio-playback-overview.md
index d17970d6de9b8b238db74d971ad5f58c605462eb..5ef7a6f9c4d08719a71c9e07f34fa1104802fcc6 100644
--- a/en/application-dev/media/audio-playback-overview.md
+++ b/en/application-dev/media/audio-playback-overview.md
@@ -8,7 +8,7 @@ OpenHarmony provides multiple classes for you to develop audio playback applicat
- [AudioRenderer](using-audiorenderer-for-playback.md): provides ArkTS and JS API to implement audio output. It supports only the PCM format and requires applications to continuously write audio data. The applications can perform data preprocessing, for example, setting the sampling rate and bit width of audio files, before audio input. This class can be used to develop more professional and diverse playback applications. To use this class, you must have basic audio processing knowledge.
-- [OpenSLES](using-opensl-es-for-playback.md): provides a set of standard, cross-platform, yet unique native audio APIs. It supports audio output in PCM format and is applicable to playback applications that are ported from other embedded platforms or that implements audio output at the native layer.
+- [OpenSL ES](using-opensl-es-for-playback.md): provides a set of standard, cross-platform, yet unique native audio APIs. It supports audio output in PCM format and is applicable to playback applications that are ported from other embedded platforms or that implement audio output at the native layer.
- [TonePlayer](using-toneplayer-for-playback.md): provides ArkTS and JS API to implement the playback of dialing tones and ringback tones. It can be used to play the content selected from a fixed type range, without requiring the input of media assets or audio data. This class is applicable to specific scenarios where dialing tones and ringback tones are played and is available only to system applications.
diff --git a/en/application-dev/media/audio-recording-overview.md b/en/application-dev/media/audio-recording-overview.md
index 698255fddd78d98f9e635b16b3db94e6980bd4a0..2c6fb6fe5b8ffd0e82478d450e64bfc0e10257c6 100644
--- a/en/application-dev/media/audio-recording-overview.md
+++ b/en/application-dev/media/audio-recording-overview.md
@@ -8,7 +8,7 @@ OpenHarmony provides multiple classes for you to develop audio recording applica
- [AudioCapturer](using-audiocapturer-for-recording.md): provides ArkTS and JS API to implement audio input. It supports only the PCM format and requires applications to continuously read audio data. The application can perform data processing after audio output. This class can be used to develop more professional and diverse recording applications. To use this class, you must have basic audio processing knowledge.
-- [OpenSLES](using-opensl-es-for-recording.md): provides a set of standard, cross-platform, yet unique native audio APIs. It supports audio input in PCM format and is applicable to recording applications that are ported from other embedded platforms or that implements audio input at the native layer.
+- [OpenSL ES](using-opensl-es-for-recording.md): provides a set of standard, cross-platform, yet unique native audio APIs. It supports audio input in PCM format and is applicable to recording applications that are ported from other embedded platforms or that implement audio input at the native layer.
## Precautions for Developing Audio Recording Applications
diff --git a/en/application-dev/media/avplayer-avrecorder-overview.md b/en/application-dev/media/avplayer-avrecorder-overview.md
index 051ca3b66ce1839046a2e783a8c274c304625045..3bf9b785b93a32b60b73c902449b9b019e651a2b 100644
--- a/en/application-dev/media/avplayer-avrecorder-overview.md
+++ b/en/application-dev/media/avplayer-avrecorder-overview.md
@@ -59,6 +59,7 @@ The table below lists the supported protocols.
| -------- | -------- |
| Local VOD| The file descriptor is supported, but the file path is not.|
| Network VoD| HTTP, HTTPS, and HLS are supported.|
+| Live webcasting| HLS is supported.|
The table below lists the supported audio playback formats.
diff --git a/en/application-dev/media/figures/audiocapturer-status-change.png b/en/application-dev/media/figures/audiocapturer-status-change.png
index aadbc4fb6470b7cdc0f399ee5954a96c01a7f7c3..ff76a8414f7a254af7d2796e44f2c2555dc9185f 100644
Binary files a/en/application-dev/media/figures/audiocapturer-status-change.png and b/en/application-dev/media/figures/audiocapturer-status-change.png differ
diff --git a/en/application-dev/media/media-application-overview.md b/en/application-dev/media/media-application-overview.md
index d350482e61e7bc9659054b0426c10ce07da88045..6ca7bbd61bccab668076da76841e785d1918abe9 100644
--- a/en/application-dev/media/media-application-overview.md
+++ b/en/application-dev/media/media-application-overview.md
@@ -2,7 +2,7 @@
## Multimedia Subsystem Architecture
-The multimedia subsystem provides the capability of processing users' visual and auditory information. For example, it can be used to collect, compress, store, decompress, and play audio and video information. Based on the type of media information to process, the media system is usually divided into four modules: audio, media, camera, and image.
+The multimedia subsystem provides the capability of processing users' visual and auditory information. For example, it can be used to collect, compress, store, decompress, and play audio and video information. Based on the type of media information to process, the multimedia subsystem is usually divided into four modules: audio, media, camera, and image.
As shown in the figure below, the multimedia subsystem provides APIs for developing audio/video, camera, and gallery applications, and provides adaptation and acceleration for different hardware chips. In the middle part, it provides core media functionalities and management mechanisms in the form of services.
diff --git a/en/application-dev/media/using-audiorenderer-for-playback.md b/en/application-dev/media/using-audiorenderer-for-playback.md
index 11934e669813fa7a89ceef43bd2c3795db6bad75..d72637819259e3752a33b37d6f645786793cfc38 100644
--- a/en/application-dev/media/using-audiorenderer-for-playback.md
+++ b/en/application-dev/media/using-audiorenderer-for-playback.md
@@ -151,9 +151,6 @@ export default class AudioRendererDemo {
console.info(`${TAG}: creating AudioRenderer success`);
this.renderModel = renderer;
this.renderModel.on('stateChange', (state) => { // Set the events to listen for. A callback is invoked when the AudioRenderer is switched to the specified state.
- if (state == 1) {
- console.info('audio renderer state is: STATE_PREPARED');
- }
if (state == 2) {
console.info('audio renderer state is: STATE_RUNNING');
}
diff --git a/en/application-dev/media/using-avplayer-for-playback.md b/en/application-dev/media/using-avplayer-for-playback.md
index 6cb6ab1e67ef0ae8a44e04fa915ad87bcc9ed024..9af55fb71b489f7b3ece342969c72d18ed90eaab 100644
--- a/en/application-dev/media/using-avplayer-for-playback.md
+++ b/en/application-dev/media/using-avplayer-for-playback.md
@@ -12,7 +12,7 @@ During application development, you can use the **state** attribute of the AVPla
**Figure 1** Playback state transition
-
+
For details about the state, see [AVPlayerState](../reference/apis/js-apis-media.md#avplayerstate9). When the AVPlayer is in the **prepared**, **playing**, **paused**, or **completed** state, the playback engine is working and a large amount of RAM is occupied. If your application does not need to use the AVPlayer, call **reset()** or **release()** to release the instance.
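+
+For example, a sketch of releasing the instance when it is no longer needed, assuming that **this.avPlayer** holds the instance created by **media.createAVPlayer()**:
+
+```ts
+// Call this, for example, when the page or ability is destroyed.
+async releasePlayer() {
+  if (this.avPlayer) {
+    await this.avPlayer.release(); // Use this.avPlayer.reset() instead if the instance will be reused.
+    this.avPlayer = undefined;
+  }
+}
+```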
@@ -68,7 +68,9 @@ import common from '@ohos.app.ability.common';
export class AVPlayerDemo {
private avPlayer;
private count: number = 0;
-
+ private isSeek: boolean = true; // Specify whether the seek operation is supported.
+ private fileSize: number = -1;
+ private fd: number = 0;
// Set AVPlayer callback functions.
setAVPlayerCallback() {
// Callback function for the seek operation.
@@ -102,8 +104,13 @@ export class AVPlayerDemo {
case 'playing': // This state is reported upon a successful callback of play().
console.info('AVPlayer state playing called.');
if (this.count !== 0) {
- console.info('AVPlayer start to seek.');
- this.avPlayer.seek (this.avPlayer.duration); // Call seek() to seek to the end of the audio clip.
+ if (this.isSeek) {
+ console.info('AVPlayer start to seek.');
+ this.avPlayer.seek (this.avPlayer.duration); // Call seek() to seek to the end of the audio clip.
+ } else {
+ // When the seek operation is not supported, the playback continues until it reaches the end.
+ console.info('AVPlayer wait to play end.');
+ }
} else {
this.avPlayer.pause(); // Call pause() to pause the playback.
}
@@ -145,6 +152,7 @@ export class AVPlayerDemo {
// Open the corresponding file address to obtain the file descriptor and assign a value to the URL to trigger the reporting of the initialized state.
let file = await fs.open(path);
fdPath = fdPath + '' + file.fd;
+ this.isSeek = true; // The seek operation is supported.
this.avPlayer.url = fdPath;
}
@@ -158,10 +166,85 @@ export class AVPlayerDemo {
// The return type is {fd,offset,length}, where fd indicates the file descriptor address of the HAP file, offset indicates the media asset offset, and length indicates the duration of the media asset to play.
let context = getContext(this) as common.UIAbilityContext;
let fileDescriptor = await context.resourceManager.getRawFd('01.mp3');
+ this.isSeek = true; // The seek operation is supported.
// Assign a value to fdSrc to trigger the reporting of the initialized state.
this.avPlayer.fdSrc = fileDescriptor;
}
+
+ // The following demo shows how to use the file system to open the sandbox address, obtain the media file address, and play the media file with the seek operation using the dataSrc attribute.
+ async avPlayerDataSrcSeekDemo() {
+ // Create an AVPlayer instance.
+ this.avPlayer = await media.createAVPlayer();
+ // Set a callback function for state changes.
+ this.setAVPlayerCallback();
+ // dataSrc indicates the playback source address. When the seek operation is supported, fileSize indicates the size of the file to be played. The following describes how to assign a value to fileSize.
+ let src = {
+ fileSize: -1,
+ callback: (buf, length, pos) => {
+ let num = 0;
+ if (buf == undefined || length == undefined || pos == undefined) {
+ return -1;
+ }
+ num = fs.readSync(this.fd, buf, { offset: pos, length: length });
+ if (num > 0 && (this.fileSize >= pos)) {
+ return num;
+ }
+ return -1;
+ }
+ }
+ let context = getContext(this) as common.UIAbilityContext;
+ // Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
+ let pathDir = context.filesDir;
+ let path = pathDir + '/01.mp3';
+ await fs.open(path).then((file) => {
+ this.fd = file.fd;
+ })
+ // Obtain the size of the file to be played.
+ this.fileSize = fs.statSync(path).size;
+ src.fileSize = this.fileSize;
+ this.isSeek = true; // The seek operation is supported.
+ this.avPlayer.dataSrc = src;
+ }
+
+ // The following demo shows how to use the file system to open the sandbox address, obtain the media file address, and play the media file without the seek operation using the dataSrc attribute.
+ async avPlayerDataSrcNoSeekDemo() {
+ // Create an AVPlayer instance.
+ this.avPlayer = await media.createAVPlayer();
+ // Set a callback function for state changes.
+ this.setAVPlayerCallback();
+ let context = getContext(this) as common.UIAbilityContext;
+ let src: object = {
+ fileSize: -1,
+ callback: (buf, length, pos) => {
+ let num = 0;
+ if (buf == undefined || length == undefined) {
+ return -1;
+ }
+ num = fs.readSync(this.fd, buf);
+ if (num > 0) {
+ return num;
+ }
+ return -1;
+ }
+ }
+ // Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
+ let pathDir = context.filesDir;
+ let path = pathDir + '/01.mp3';
+ await fs.open(path).then((file) => {
+ this.fd = file.fd;
+ })
+ this.isSeek = false; // The seek operation is not supported.
+ this.avPlayer.dataSrc = src;
+ }
+
+ // The following demo shows how to play live streams by setting the network address through the URL.
+ async avPlayerLiveDemo() {
+ // Create an AVPlayer instance.
+ this.avPlayer = await media.createAVPlayer();
+ // Set a callback function for state changes.
+ this.setAVPlayerCallback();
+ this.isSeek = false; // The seek operation is not supported.
+ this.avPlayer.url = 'http://xxx.xxx.xxx.xxx:xx/xx/index.m3u8';
+ }
}
```
-
-
\ No newline at end of file
diff --git a/en/application-dev/media/using-avsession-controller.md b/en/application-dev/media/using-avsession-controller.md
index 5e4b69d8b48f5acad64f120892062e66d67c6b12..958661d90cec031cbbca1bb7c11ccb80e4dc0e66 100644
--- a/en/application-dev/media/using-avsession-controller.md
+++ b/en/application-dev/media/using-avsession-controller.md
@@ -1,6 +1,6 @@
# AVSession Controller
-Media Controller preset in OpenHarmony functions as the controller to interact with audio and video applications, for example, obtaining and displaying media information and delivering control commands.
+Media Controller preset in OpenHarmony functions as the controller to interact with audio and video applications, for example, obtaining and displaying media information and delivering playback control commands.
You can develop a system application (for example, a new playback control center or voice assistant) as the controller to interact with audio and video applications in the system.
@@ -8,24 +8,50 @@ You can develop a system application (for example, a new playback control center
- AVSessionDescriptor: session information, including the session ID, session type (audio/video), custom session name (**sessionTag**), information about the corresponding application (**elementName**), and whether the session is pinned on top (isTopSession).
-- Top session: session with the highest priority in the system, for example, a session that is being played. Generally, the controller must hold an **AVSessionController** object to communicate with a session. However, the controller can directly communicate with the top session, for example, directly sending a control command or key event, without holding an **AVSessionController** object.
+- Top session: session with the highest priority in the system, for example, a session that is being played. Generally, the controller must hold an **AVSessionController** object to communicate with a session. However, the controller can directly communicate with the top session, for example, directly sending a playback control command or key event, without holding an **AVSessionController** object.
## Available APIs
-The table below lists the key APIs used by the controller. The APIs use either a callback or promise to return the result. The APIs listed below use a callback. They provide the same functions as their counterparts that use a promise.
+The key APIs used by the controller are classified into the following types:
+1. APIs called by the **AVSessionManager** object, which is obtained by means of import. An example API is **AVSessionManager.createController(sessionId)**.
+2. APIs called by the **AVSessionController** object. An example API is **controller.getAVPlaybackState()**.
+
+Asynchronous JavaScript APIs use either a callback or promise to return the result. The APIs listed below use a callback. They provide the same functions as their counterparts that use a promise.
For details, see [AVSession Management](../reference/apis/js-apis-avsession.md).
-| API| Description|
+### APIs Called by the AVSessionManager Object
+
+| API| Description|
+| -------- | -------- |
+| getAllSessionDescriptors(callback: AsyncCallback<Array<Readonly<AVSessionDescriptor>>>): void | Obtains the descriptors of all AVSessions in the system.|
+| createController(sessionId: string, callback: AsyncCallback<AVSessionController>): void | Creates an AVSessionController.|
+| getValidCommands(callback: AsyncCallback<Array<AVControlCommandType>>): void | Obtains valid commands supported by the AVSession. Playback control commands that an audio and video application listens for when it accesses the AVSession are considered valid commands supported by the AVSession. For details, see [Provider of AVSession](using-avsession-developer.md).|
+| getLaunchAbility(callback: AsyncCallback<WantAgent>): void | Obtains the UIAbility that is configured in the AVSession and can be started. The UIAbility configured here is started when a user operates the UI of the controller, for example, clicking a widget in Media Controller.|
+| sendAVKeyEvent(event: KeyEvent, callback: AsyncCallback<void>): void | Sends a key event to an AVSession through the AVSessionController object.|
+| sendSystemAVKeyEvent(event: KeyEvent, callback: AsyncCallback<void>): void | Sends a key event to the top session.|
+| sendControlCommand(command: AVControlCommand, callback: AsyncCallback<void>): void | Sends a playback control command to an AVSession through the AVSessionController object.|
+| sendSystemControlCommand(command: AVControlCommand, callback: AsyncCallback<void>): void | Sends a playback control command to the top session.|
+| getHistoricalSessionDescriptors(maxSize: number, callback: AsyncCallback<Array<Readonly<AVSessionDescriptor>>>): void10+ | Obtains the descriptors of historical sessions.|
+
+### APIs Called by the AVSessionController Object
+
+| API| Description|
| -------- | -------- |
-| getAllSessionDescriptors(callback: AsyncCallback<Array<Readonly<AVSessionDescriptor>>>): void | Obtains the descriptors of all AVSessions in the system.|
-| createController(sessionId: string, callback: AsyncCallback<AVSessionController>): void | Creates an AVSessionController.|
-| getValidCommands(callback: AsyncCallback<Array<AVControlCommandType>>): void | Obtains valid commands supported by the AVSession. Control commands listened by an audio and video application when it accesses the AVSession are considered as valid commands supported by the AVSession. For details, see [Provider of AVSession](using-avsession-developer.md).|
-| getLaunchAbility(callback: AsyncCallback<WantAgent>): void | Obtains the UIAbility that is configured in the AVSession and can be started. The UIAbility configured here is started when a user operates the UI of the controller, for example, clicking a widget in Media Controller.|
-| sendAVKeyEvent(event: KeyEvent, callback: AsyncCallback<void>): void | Sends a key event to an AVSession through the AVSessionController object.|
-| sendSystemAVKeyEvent(event: KeyEvent, callback: AsyncCallback<void>): void | Sends a key event to the top session.|
-| sendControlCommand(command: AVControlCommand, callback: AsyncCallback<void>): void | Sends a control command to an AVSession through the AVSessionController object.|
-| sendSystemControlCommand(command: AVControlCommand, callback: AsyncCallback<void>): void | Sends a control command to the top session.|
+| getAVPlaybackState(callback: AsyncCallback<AVPlaybackState>): void | Obtains the information related to the playback state.|
+| getAVMetadata(callback: AsyncCallback<AVMetadata>): void | Obtains the session metadata.|
+| getOutputDevice(callback: AsyncCallback<OutputDeviceInfo>): void | Obtains the output device information.|
+| sendAVKeyEvent(event: KeyEvent, callback: AsyncCallback<void>): void | Sends a key event to the session corresponding to this controller.|
+| getLaunchAbility(callback: AsyncCallback<WantAgent>): void | Obtains the **WantAgent** object saved by the application in the session.|
+| isActive(callback: AsyncCallback<boolean>): void | Checks whether the session is activated.|
+| destroy(callback: AsyncCallback<void>): void | Destroys this controller. A controller can no longer be used after being destroyed.|
+| getValidCommands(callback: AsyncCallback<Array<AVControlCommandType>>): void | Obtains valid commands supported by the session.|
+| sendControlCommand(command: AVControlCommand, callback: AsyncCallback<void>): void | Sends a playback control command to the session through the controller.|
+| sendCommonCommand(command: string, args: {[key: string]: Object}, callback: AsyncCallback<void>): void10+ | Sends a custom playback control command to the session through the controller.|
+| getAVQueueItems(callback: AsyncCallback<Array<AVQueueItem>>): void10+ | Obtains the information related to the items in the playlist.|
+| getAVQueueTitle(callback: AsyncCallback<string>): void10+ | Obtains the name of the playlist.|
+| skipToQueueItem(itemId: number, callback: AsyncCallback<void>): void10+ | Sends the ID of an item in the playlist to the session for processing so that the session can play the specified item.|
+| getExtras(callback: AsyncCallback<{[key: string]: Object}>): void10+ | Obtains the custom media packet set by the provider.|
## How to Develop
@@ -48,13 +74,26 @@ To enable a system application to access the AVSession service as a controller,
AVSessionManager.createController(descriptor.sessionId).then((controller) => {
g_controller.push(controller);
}).catch((err) => {
- console.error(`createController : ERROR : ${err.message}`);
+ console.error(`Failed to create controller. Code: ${err.code}, message: ${err.message}`);
});
});
}).catch((err) => {
- console.error(`getAllSessionDescriptors : ERROR : ${err.message}`);
+ console.error(`Failed to get all session descriptors. Code: ${err.code}, message: ${err.message}`);
});
+ // Obtain the descriptors of historical sessions.
+ AVSessionManager.getHistoricalSessionDescriptors().then((descriptors) => {
+ console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors.length : ${descriptors.length}`);
+ if (descriptors.length > 0){
+ console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors[0].isActive : ${descriptors[0].isActive}`);
+ console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors[0].type : ${descriptors[0].type}`);
+ console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors[0].sessionTag : ${descriptors[0].sessionTag}`);
+ console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors[0].sessionId : ${descriptors[0].sessionId}`);
+ console.info(`getHistoricalSessionDescriptors : SUCCESS : descriptors[0].elementName.bundleName : ${descriptors[0].elementName.bundleName}`);
+ }
+ }).catch((err) => {
+ console.error(`Failed to get historical session descriptors, error code: ${err.code}, error message: ${err.message}`);
+ });
```
2. Listen for the session state and service state events.
@@ -74,7 +113,7 @@ To enable a system application to access the AVSession service as a controller,
AVSessionManager.createController(session.sessionId).then((controller) => {
g_controller.push(controller);
}).catch((err) => {
- console.info(`createController : ERROR : ${err.message}`);
+ console.error(`Failed to create controller. Code: ${err.code}, message: ${err.message}`);
});
});
@@ -103,7 +142,7 @@ To enable a system application to access the AVSession service as a controller,
// Subscribe to the 'sessionServiceDie' event.
AVSessionManager.on('sessionServiceDie', () => {
// The server is abnormal, and the application clears resources.
- console.info("Server exception.");
+ console.info(`Server exception.`);
})
```
@@ -117,6 +156,10 @@ To enable a system application to access the AVSession service as a controller,
- **validCommandChange**: triggered when the valid commands supported by the session change.
- **outputDeviceChange**: triggered when the output device changes.
- **sessionDestroy**: triggered when a session is destroyed.
+ - **sessionEvent**: triggered when the custom session event changes.
+ - **extrasChange**: triggered when the custom media packet of the session changes.
+ - **queueItemsChange**: triggered when one or more items in the custom playlist of the session change.
+ - **queueTitleChange**: triggered when the custom playlist name of the session changes.
The controller can listen for events as required.
@@ -124,18 +167,18 @@ To enable a system application to access the AVSession service as a controller,
// Subscribe to the 'activeStateChange' event.
controller.on('activeStateChange', (isActive) => {
if (isActive) {
- console.info("The widget corresponding to the controller is highlighted.");
+ console.info(`The widget corresponding to the controller is highlighted.`);
} else {
- console.info("The widget corresponding to the controller is invalid.");
+ console.info(`The widget corresponding to the controller is invalid.`);
}
});
// Subscribe to the 'sessionDestroy' event to enable the controller to get notified when the session dies.
controller.on('sessionDestroy', () => {
- console.info('on sessionDestroy : SUCCESS ');
+ console.info(`on sessionDestroy : SUCCESS`);
controller.destroy().then(() => {
- console.info('destroy : SUCCESS ');
+ console.info(`destroy : SUCCESS`);
}).catch((err) => {
- console.info(`destroy : ERROR :${err.message}`);
+ console.error(`Failed to destroy session. Code: ${err.code}, message: ${err.message}`);
});
});
@@ -164,10 +207,26 @@ To enable a system application to access the AVSession service as a controller,
controller.on('outputDeviceChange', (device) => {
console.info(`on outputDeviceChange device isRemote : ${device.isRemote}`);
});
+ // Subscribe to custom session event changes.
+ controller.on('sessionEvent', (eventName, eventArgs) => {
+ console.info(`Received new session event, event name is ${eventName}, args are ${JSON.stringify(eventArgs)}`);
+ });
+ // Subscribe to custom media packet changes.
+ controller.on('extrasChange', (extras) => {
+ console.info(`Received custom media packet, packet data is ${JSON.stringify(extras)}`);
+ });
+ // Subscribe to custom playlist item changes.
+ controller.on('queueItemsChange', (items) => {
+ console.info(`Caught queue items change, items length is ${items.length}`);
+ });
+ // Subscribe to custom playback name changes.
+ controller.on('queueTitleChange', (title) => {
+ console.info(`Caught queue title change, title is ${title}`);
+ });
```
4. Obtain the media information transferred by the provider for display on the UI, for example, displaying the track being played and the playback state in Media Controller.
-
+
```ts
async getInfoFromSessionByController() {
// It is assumed that an AVSessionController object corresponding to the session already exists. For details about how to create an AVSessionController object, see the code snippet above.
@@ -186,19 +245,36 @@ To enable a system application to access the AVSession service as a controller,
let avPlaybackState: AVSessionManager.AVPlaybackState = await controller.getAVPlaybackState();
console.info(`get playbackState by controller : ${avPlaybackState.state}`);
console.info(`get favoriteState by controller : ${avPlaybackState.isFavorite}`);
+ // Obtain the playlist items of the session.
+ let queueItems: Array = await controller.getAVQueueItems();
+ console.info(`get queueItems length by controller : ${queueItems.length}`);
+ // Obtain the playlist name of the session.
+ let queueTitle: string = await controller.getAVQueueTitle();
+ console.info(`get queueTitle by controller : ${queueTitle}`);
+ // Obtain the custom media packet of the session.
+ let extras: any = await controller.getExtras();
+ console.info(`get custom media packets by controller : ${JSON.stringify(extras)}`);
+ // Obtain the ability information provided by the application corresponding to the session.
+ let agent: WantAgent = await controller.getLaunchAbility();
+ console.info(`get want agent info by controller : ${JSON.stringify(agent)}`);
+ // Obtain the current playback position of the session.
+ let currentTime: number = controller.getRealPlaybackPositionSync();
+ console.info(`get current playback time by controller : ${currentTime}`);
+ // Obtain valid commands supported by the session.
+ let validCommands: Array = await controller.getValidCommands();
+ console.info(`get valid commands by controller : ${JSON.stringify(validCommands)}`);
}
```
5. Control the playback behavior, for example, sending a command to operate (play/pause/previous/next) the item being played in Media Controller.
- After listening for the control command event, the audio and video application serving as the provider needs to implement the corresponding operation.
+ After listening for the playback control command event, the audio and video application serving as the provider needs to implement the corresponding operation.
-
```ts
async sendCommandToSessionByController() {
// It is assumed that an AVSessionController object corresponding to the session already exists. For details about how to create an AVSessionController object, see the code snippet above.
let controller: AVSessionManager.AVSessionController = ALLREADY_HAVE_A_CONTROLLER;
- // Obtain the commands supported by the session.
+ // Obtain valid commands supported by the session.
let validCommandTypeArray: Array = await controller.getValidCommands();
console.info(`get validCommandArray by controller : length : ${validCommandTypeArray.length}`);
// Deliver the 'play' command.
@@ -222,11 +298,28 @@ To enable a system application to access the AVSession service as a controller,
let avCommand: AVSessionManager.AVControlCommand = {command:'playNext'};
controller.sendControlCommand(avCommand);
}
+ // Deliver a custom playback control command.
+ let commandName: string = 'custom command';
+ let args = {
+ command : 'This is my custom command'
+ }
+ await controller.sendCommonCommand(commandName, args).then(() => {
+ console.info(`SendCommonCommand successfully`);
+ }).catch((err) => {
+ console.error(`Failed to send common command. Code: ${err.code}, message: ${err.message}`);
+ })
+ // Set the ID of an item in the specified playlist for the session to play.
+ let queueItemId: number = 0;
+ await controller.skipToQueueItem(queueItemId).then(() => {
+ console.info(`SkipToQueueItem successfully`);
+ }).catch((err) => {
+ console.error(`Failed to skip to queue item. Code: ${err.code}, message: ${err.message}`);
+ });
}
```
6. When the audio and video application exits, cancel the listener and release the resources.
-
+
```ts
async destroyController() {
// It is assumed that an AVSessionController object corresponding to the session already exists. For details about how to create an AVSessionController object, see the code snippet above.
@@ -235,9 +328,9 @@ To enable a system application to access the AVSession service as a controller,
// Destroy the AVSessionController object. After being destroyed, it is no longer available.
controller.destroy(function (err) {
if (err) {
- console.info(`Destroy controller ERROR : code: ${err.code}, message: ${err.message}`);
+ console.error(`Failed to destroy controller. Code: ${err.code}, message: ${err.message}`);
} else {
- console.info('Destroy controller SUCCESS');
+ console.info(`Destroy controller SUCCESS`);
}
});
}
diff --git a/en/application-dev/media/using-avsession-developer.md b/en/application-dev/media/using-avsession-developer.md
index 077f0b956a5fb6abaf26c647132bdbb81e78fc63..bf0b914647b9c364bea0ac86a30def88fe3c0f52 100644
--- a/en/application-dev/media/using-avsession-developer.md
+++ b/en/application-dev/media/using-avsession-developer.md
@@ -1,12 +1,12 @@
# AVSession Provider
-An audio and video application needs to access the AVSession service as a provider in order to display media information in the controller (for example, Media Controller) and respond to control commands delivered by the controller.
+An audio and video application needs to access the AVSession service as a provider in order to display media information in the controller (for example, Media Controller) and respond to playback control commands delivered by the controller.
## Basic Concepts
- AVMetadata: media data related attributes, including the IDs of the current media asset (assetId), previous media asset (previousAssetId), and next media asset (nextAssetId), title, author, album, writer, and duration.
-- AVPlaybackState: playback state attributes, including the playback state, position, speed, buffered time, loop mode, and whether the media asset is favorited (**isFavorite**).
+- AVPlaybackState: playback state attributes, including the playback state, position, speed, buffered time, loop mode, media item being played (activeItemId), custom media data (extras), and whether the media asset is favorited (isFavorite).
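For reference, the following is a minimal sketch of how these two structures might be populated by a provider. It is illustrative only: the values, the state and loop-mode enum members shown, and the custom key inside **extras** are assumptions for this sketch (**AVSessionManager** is the module alias imported in the development steps below).

```ts
// Illustrative only: assumed example values for AVMetadata and AVPlaybackState.
let metadata: AVSessionManager.AVMetadata = {
  assetId: '0',            // ID of the current media asset
  previousAssetId: '-1',   // ID of the previous media asset
  nextAssetId: '1',        // ID of the next media asset
  title: 'TITLE',
  author: 'AUTHOR',
  album: 'ALBUM',
  writer: 'WRITER',
  duration: 240000         // duration, assumed to be in milliseconds
};
let playbackState: AVSessionManager.AVPlaybackState = {
  state: AVSessionManager.PlaybackState.PLAYBACK_STATE_PAUSE, // playback state (assumed enum value)
  speed: 1.0,                                                 // playback speed
  loopMode: AVSessionManager.LoopMode.LOOP_MODE_SEQUENCE,     // loop mode (assumed enum value)
  extras: {myKey: 'my custom data'},                          // custom media data; key name made up here
  isFavorite: false                                           // whether the media asset is favorited
};
```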
## Available APIs
@@ -14,28 +14,43 @@ The table below lists the key APIs used by the provider. The APIs use either a c
For details, see [AVSession Management](../reference/apis/js-apis-avsession.md).
-| API| Description|
+| API| Description|
| -------- | -------- |
-| createAVSession(context: Context, tag: string, type: AVSessionType, callback: AsyncCallback<AVSession>): void | Creates an AVSession. Only one AVSession can be created for a UIAbility.|
-| setAVMetadata(data: AVMetadata, callback: AsyncCallback<void>): void | Sets AVSession metadata.|
-| setAVPlaybackState(state: AVPlaybackState, callback: AsyncCallback<void>): void | Sets the AVSession playback state.|
-| setLaunchAbility(ability: WantAgent, callback: AsyncCallback<void>): void | Starts a UIAbility.|
-| getController(callback: AsyncCallback<AVSessionController>): void | Obtains the controller of the AVSession.|
-| activate(callback: AsyncCallback<void>): void | Activates the AVSession.|
-| destroy(callback: AsyncCallback<void>): void | Destroys the AVSession.|
+| createAVSession(context: Context, tag: string, type: AVSessionType, callback: AsyncCallback<AVSession>): void | Creates an AVSession. Only one AVSession can be created for a UIAbility.|
+| setAVMetadata(data: AVMetadata, callback: AsyncCallback<void>): void | Sets AVSession metadata.|
+| setAVPlaybackState(state: AVPlaybackState, callback: AsyncCallback<void>): void | Sets the AVSession playback state.|
+| setLaunchAbility(ability: WantAgent, callback: AsyncCallback<void>): void | Starts a UIAbility.|
+| getController(callback: AsyncCallback<AVSessionController>): void | Obtains the controller of the AVSession.|
+| getOutputDevice(callback: AsyncCallback<OutputDeviceInfo>): void | Obtains the output device information.|
+| activate(callback: AsyncCallback<void>): void | Activates the AVSession.|
+| deactivate(callback: AsyncCallback<void>): void | Deactivates this session.|
+| destroy(callback: AsyncCallback<void>): void | Destroys the AVSession.|
+| setAVQueueItems(items: Array<AVQueueItem>, callback: AsyncCallback<void>): void<sup>10+</sup> | Sets a playlist.|
+| setAVQueueTitle(title: string, callback: AsyncCallback<void>): void<sup>10+</sup> | Sets a name for the playlist.|
+| dispatchSessionEvent(event: string, args: {[key: string]: Object}, callback: AsyncCallback<void>): void<sup>10+</sup> | Dispatches a custom session event.|
+| setExtras(extras: {[key: string]: Object}, callback: AsyncCallback<void>): void<sup>10+</sup> | Sets a custom media packet in the form of a key-value pair.|
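Most of these APIs are exercised in the development steps below; **getOutputDevice()** and **deactivate()** are not. The snippet below is a brief promise-style sketch of how they might be called, assuming the **AVSession** object created in step 1.

```ts
async getDeviceAndDeactivate() {
  // It is assumed that an AVSession object has been created. For details, see the code snippet in step 1.
  let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
  // Obtain information about the device to which the session output is routed.
  let outputDeviceInfo: AVSessionManager.OutputDeviceInfo = await session.getOutputDevice();
  console.info(`get output device info : ${JSON.stringify(outputDeviceInfo)}`);
  // Deactivate the session, for example, when the application temporarily cannot respond to playback control commands.
  await session.deactivate();
  console.info(`session deactivate done : sessionId : ${session.sessionId}`);
}
```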
## How to Develop
To enable an audio and video application to access the AVSession service as a provider, proceed as follows:
1. Call an API in the **AVSessionManager** class to create and activate an **AVSession** object.
-
+
```ts
+ // To create an AVSession object, you must first obtain the application context. You can set a global variable in the EntryAbility file of the application to store the application context.
+ export default class EntryAbility extends UIAbility {
+ onCreate(want, launchParam) {
+ globalThis.context = this.context; // Set the global variable globalThis.context to store the application context.
+ }
+ // Other APIs of the EntryAbility class.
+ }
+
+ // Start to create and activate an AVSession object.
import AVSessionManager from '@ohos.multimedia.avsession'; // Import the AVSessionManager module.
// Create an AVSession object.
async createSession() {
- let session: AVSessionManager.AVSession = await AVSessionManager.createAVSession(this.context, 'SESSION_NAME', 'audio');
+ let session: AVSessionManager.AVSession = await AVSessionManager.createAVSession(globalThis.context, 'SESSION_NAME', 'audio');
session.activate();
console.info(`session create done : sessionId : ${session.sessionId}`);
}
@@ -46,22 +61,22 @@ To enable an audio and video application to access the AVSession service as a pr
- AVPlaybackState
The controller will call an API in the **AVSessionController** class to obtain the information and display or process the information.
-
+
```ts
async setSessionInfo() {
- // It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet above.
- let session: AVSessionManager.AVSession = ALLREADY_CREATE_A_SESSION;
+ // It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the code snippet in step 1.
+ let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
// The player logic that triggers changes in the session metadata and playback state is omitted here.
// Set necessary session metadata.
let metadata: AVSessionManager.AVMetadata = {
- assetId: "0",
- title: "TITLE",
- artist: "ARTIST"
+ assetId: '0',
+ title: 'TITLE',
+ artist: 'ARTIST'
};
session.setAVMetadata(metadata).then(() => {
- console.info('SetAVMetadata successfully');
+ console.info(`SetAVMetadata successfully`);
}).catch((err) => {
- console.info(`SetAVMetadata BusinessError: code: ${err.code}, message: ${err.message}`);
+ console.error(`Failed to set AVMetadata. Code: ${err.code}, message: ${err.message}`);
});
// Set the playback state to paused and set isFavorite to false.
let playbackState: AVSessionManager.AVPlaybackState = {
@@ -70,86 +85,230 @@ To enable an audio and video application to access the AVSession service as a pr
};
session.setAVPlaybackState(playbackState, function (err) {
if (err) {
- console.info(`SetAVPlaybackState BusinessError: code: ${err.code}, message: ${err.message}`);
+ console.error(`Failed to set AVPlaybackState. Code: ${err.code}, message: ${err.message}`);
} else {
- console.info('SetAVPlaybackState successfully');
+ console.info(`SetAVPlaybackState successfully`);
}
});
+ // Set a playlist.
+ let queueItemDescription_1 = {
+ mediaId: '001',
+ title: 'music_name',
+ subtitle: 'music_sub_name',
+ description: 'music_description',
+ icon: PIXELMAP_OBJECT,
+ iconUri: 'http://www.xxx.com',
+ extras: {'extras':'any'}
+ };
+ let queueItem_1 = {
+ itemId: 1,
+ description: queueItemDescription_1
+ };
+ let queueItemDescription_2 = {
+ mediaId: '002',
+ title: 'music_name',
+ subtitle: 'music_sub_name',
+ description: 'music_description',
+ icon: PIXELMAP_OBJECT,
+ iconUri: 'http://www.xxx.com',
+ extras: {'extras':'any'}
+ };
+ let queueItem_2 = {
+ itemId: 2,
+ description: queueItemDescription_2
+ };
+ let queueItemsArray = [queueItem_1, queueItem_2];
+ session.setAVQueueItems(queueItemsArray).then(() => {
+ console.info(`SetAVQueueItems successfully`);
+ }).catch((err) => {
+ console.error(`Failed to set AVQueueItems. Code: ${err.code}, message: ${err.message}`);
+ });
+ // Set a name for the playlist.
+ let queueTitle = 'QUEUE_TITLE';
+ session.setAVQueueTitle(queueTitle).then(() => {
+ console.info(`SetAVQueueTitle successfully`);
+ }).catch((err) => {
+ console.error(`Failed to set AVQueueTitle. Code: ${err.code}, message: ${err.message}`);
+ });
}
```
3. Set the UIAbility to be started by the controller. The UIAbility configured here is started when a user operates the UI of the controller, for example, clicking a widget in Media Controller.
The UIAbility is set through the **WantAgent** API. For details, see [WantAgent](../reference/apis/js-apis-app-ability-wantAgent.md).
-
+
```ts
- import WantAgent from "@ohos.app.ability.wantAgent";
+ import wantAgent from "@ohos.app.ability.wantAgent";
```
```ts
- // It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet above.
- let session: AVSessionManager.AVSession = ALLREADY_CREATE_A_SESSION;
+ // It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the code snippet in step 1.
+ let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
let wantAgentInfo = {
wants: [
{
- bundleName: "com.example.musicdemo",
- abilityName: "com.example.musicdemo.MainAbility"
+ bundleName: 'com.example.musicdemo',
+ abilityName: 'com.example.musicdemo.MainAbility'
}
],
- operationType: WantAgent.OperationType.START_ABILITIES,
+ operationType: wantAgent.OperationType.START_ABILITIES,
requestCode: 0,
- wantAgentFlags: [WantAgent.WantAgentFlags.UPDATE_PRESENT_FLAG]
+ wantAgentFlags: [wantAgent.WantAgentFlags.UPDATE_PRESENT_FLAG]
}
- WantAgent.getWantAgent(wantAgentInfo).then((agent) => {
- session.setLaunchAbility(agent)
+ wantAgent.getWantAgent(wantAgentInfo).then((agent) => {
+ session.setLaunchAbility(agent);
})
```
-4. Listen for control commands delivered by the controller, for example, Media Controller.
+4. Set a custom session event. The controller performs an operation after receiving the event.
+
> **NOTE**
>
- > After the provider registers a listener for the control command event, the event will be reflected in **getValidCommands()** of the controller. In other words, the controller determines that the command is valid and triggers the corresponding event as required. To ensure that the control commands delivered by the controller can be executed normally, the provider should not use a null implementation for listening.
-
+ > The data set through **dispatchSessionEvent** is not saved in the **AVSession** object or AVSession service.
+
```ts
- async setListenerForMesFromController() {
- // It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet above.
- let session: AVSessionManager.AVSession = ALLREADY_CREATE_A_SESSION;
- // Generally, logic processing on the player is implemented in the listener.
- // After the processing is complete, use the setter to synchronize the playback information. For details, see the code snippet above.
- session.on('play', () => {
- console.info('on play , do play task');
-
- // do some tasks ···
- });
- session.on('pause', () => {
- console.info('on pause , do pause task');
- // do some tasks ···
- });
- session.on('stop', () => {
- console.info('on stop , do stop task');
- // do some tasks ···
- });
- session.on('playNext', () => {
- console.info('on playNext , do playNext task');
- // do some tasks ···
- });
- session.on('playPrevious', () => {
- console.info('on playPrevious , do playPrevious task');
- // do some tasks ···
- });
+ // It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the code snippet in step 1.
+ let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
+ let eventName = 'dynamic_lyric';
+ let args = {
+ lyric : 'This is my lyric'
}
+ await session.dispatchSessionEvent(eventName, args).then(() => {
+ console.info(`Dispatch session event successfully`);
+ }).catch((err) => {
+ console.error(`Failed to dispatch session event. Code: ${err.code}, message: ${err.message}`);
+ })
```
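On the controller side, a listener can be registered so that the custom event is processed when it arrives. The snippet below is a sketch only; it assumes an existing **AVSessionController** object and the **sessionEvent** callback described in the AVSession API reference.

```ts
// It is assumed that an AVSessionController object corresponding to the session already exists.
let controller: AVSessionManager.AVSessionController = ALLREADY_HAVE_A_CONTROLLER;
// Subscribe to custom session events dispatched by the provider, such as the lyric event above.
controller.on('sessionEvent', (eventName, args) => {
  console.info(`Received session event, name is ${eventName}, args are ${JSON.stringify(args)}`);
});
```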
-5. Obtain an **AVSessionController** object for this **AVSession** object for interaction.
-
+5. Set a custom media packet. The controller performs an operation after receiving the event.
+
+ > **NOTE**
+ >
+ > The data set by using **setExtras** is stored in the AVSession service. The data lifecycle is the same as that of the **AVSession** object, and the controller corresponding to the object can use **getExtras** to obtain the data.
+
+ ```ts
+ // It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the code snippet in step 1.
+ let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
+ let extras = {
+ extra : 'This is my custom media packet'
+ }
+ await session.setExtras(extras).then(() => {
+ console.info(`Set extras successfully`);
+ }).catch((err) => {
+ console.error(`Failed to set extras. Code: ${err.code}, message: ${err.message}`);
+ })
+ ```
+
+6. Listen for playback control commands or events delivered by the controller, for example, Media Controller.
+
+ Both fixed playback control commands and advanced playback control events can be listened for.
+
+ - Listening for Fixed Playback Control Commands
+
+ > **NOTE**
+ >
+ > After the provider registers a listener for fixed playback control commands, the commands will be reflected in **getValidCommands()** of the controller. In other words, the controller determines that the command is valid and triggers the corresponding event as required. To ensure that the playback control commands delivered by the controller can be executed normally, the provider should not use a null implementation for listening.
+
+ Fixed playback control commands on the session side include basic operation commands such as play, pause, previous, and next. For details, see [AVControlCommand](../reference/apis/js-apis-avsession.md).
+
+ ```ts
+ async setListenerForMesFromController() {
+ // It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the code snippet in step 1.
+ let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
+ // Generally, logic processing on the player is implemented in the listener.
+ // After the processing is complete, use the setter to synchronize the playback information. For details, see the code snippet above.
+ session.on('play', () => {
+ console.info(`on play , do play task`);
+ // do some tasks ···
+ });
+ session.on('pause', () => {
+ console.info(`on pause , do pause task`);
+ // do some tasks ···
+ });
+ session.on('stop', () => {
+ console.info(`on stop , do stop task`);
+ // do some tasks ···
+ });
+ session.on('playNext', () => {
+ console.info(`on playNext , do playNext task`);
+ // do some tasks ···
+ });
+ session.on('playPrevious', () => {
+ console.info(`on playPrevious , do playPrevious task`);
+ // do some tasks ···
+ });
+ session.on('fastForward', () => {
+ console.info(`on fastForward , do fastForward task`);
+ // do some tasks ···
+ });
+ session.on('rewind', () => {
+ console.info(`on rewind , do rewind task`);
+ // do some tasks ···
+ });
+
+ session.on('seek', (time) => {
+ console.info(`on seek , the seek time is ${time}`);
+ // do some tasks ···
+ });
+ session.on('setSpeed', (speed) => {
+ console.info(`on setSpeed , the speed is ${speed}`);
+ // do some tasks ···
+ });
+ session.on('setLoopMode', (mode) => {
+ console.info(`on setLoopMode , the loop mode is ${mode}`);
+ // do some tasks ···
+ });
+ session.on('toggleFavorite', (assetId) => {
+ console.info(`on toggleFavorite , the target asset Id is ${assetId}`);
+ // do some tasks ···
+ });
+ }
+ ```
+
+ - Listening for Advanced Playback Control Events
+
+ The following advanced playback control events can be listened for:
+
+ - **skipToQueueItem**: triggered when an item in the playlist is selected.
+ - **handleKeyEvent**: triggered when a key is pressed.
+ - **outputDeviceChange**: triggered when the output device changes.
+ - **commonCommand**: triggered when a custom playback control command is received.
+
+ ```ts
+ async setListenerForMesFromController() {
+ // It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the code snippet in step 1.
+ let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
+ // Generally, logic processing on the player is implemented in the listener.
+ // After the processing is complete, use the setter to synchronize the playback information. For details, see the code snippet above.
+ session.on('skipToQueueItem', (itemId) => {
+ console.info(`on skipToQueueItem , do skip task`);
+ // do some tasks ···
+ });
+ session.on('handleKeyEvent', (event) => {
+ console.info(`on handleKeyEvent , the event is ${JSON.stringify(event)}`);
+ // do some tasks ···
+ });
+ session.on('outputDeviceChange', (device) => {
+ console.info(`on outputDeviceChange , the device info is ${JSON.stringify(device)}`);
+ // do some tasks ···
+ });
+ session.on('commonCommand', (commandString, args) => {
+ console.info(`on commonCommand , command is ${commandString}, args are ${JSON.stringify(args)}`);
+ // do some tasks ···
+ });
+ }
+ ```
+
+7. Obtain an **AVSessionController** object for this **AVSession** object for interaction.
+
```ts
async createControllerFromSession() {
- // It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet above.
- let session: AVSessionManager.AVSession = ALLREADY_CREATE_A_SESSION;
+ // It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the code snippet in step 1.
+ let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
// Obtain an AVSessionController object for this AVSession object.
let controller: AVSessionManager.AVSessionController = await session.getController();
- // The AVSessionController object can interact with the AVSession object, for example, by delivering a control command.
+ // The AVSessionController object can interact with the AVSession object, for example, by delivering a playback control command.
let avCommand: AVSessionManager.AVControlCommand = {command:'play'};
controller.sendControlCommand(avCommand);
@@ -163,13 +322,14 @@ To enable an audio and video application to access the AVSession service as a pr
}
```
-6. When the audio and video application exits and does not need to continue playback, cancel the listener and destroy the **AVSession** object.
- The code snippet below is used for canceling the listener for control commands:
+8. When the audio and video application exits and does not need to continue playback, cancel the listener and destroy the **AVSession** object.
+
+ The code snippet below is used for canceling the listener for playback control commands:
```ts
async unregisterSessionListener() {
- // It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet above.
- let session: AVSessionManager.AVSession = ALLREADY_CREATE_A_SESSION;
+ // It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the code snippet in step 1.
+ let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
// Cancel the listener of the AVSession object.
session.off('play');
@@ -177,22 +337,27 @@ To enable an audio and video application to access the AVSession service as a pr
session.off('stop');
session.off('playNext');
session.off('playPrevious');
+ session.off('skipToQueueItem');
+ session.off('handleKeyEvent');
+ session.off('outputDeviceChange');
+ session.off('commonCommand');
}
```
-
- The code snippet below is used for destroying the AVSession object:
-
- ```ts
+
+
+ The code snippet below is used for destroying the AVSession object:
+
+ ```ts
async destroySession() {
- // It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the node snippet above.
- let session: AVSessionManager.AVSession = ALLREADY_CREATE_A_SESSION;
- // Destroy the AVSession object.
- session.destroy(function (err) {
- if (err) {
- console.info(`Destroy BusinessError: code: ${err.code}, message: ${err.message}`);
- } else {
- console.info('Destroy : SUCCESS ');
- }
- });
+ // It is assumed that an AVSession object has been created. For details about how to create an AVSession object, see the code snippet in step 1.
+ let session: AVSessionManager.AVSession = ALREADY_CREATE_A_SESSION;
+ // Destroy the AVSession object.
+ session.destroy(function (err) {
+ if (err) {
+ console.error(`Failed to destroy session. Code: ${err.code}, message: ${err.message}`);
+ } else {
+ console.info(`Destroy : SUCCESS `);
+ }
+ });
}
- ```
+ ```
\ No newline at end of file
diff --git a/en/application-dev/media/using-distributed-avsession.md b/en/application-dev/media/using-distributed-avsession.md
index c1835d661fdd2b57b7dce0f2507dbea748eaea7e..fc5c49b9804b67750228a010c1c53d58a2086bbf 100644
--- a/en/application-dev/media/using-distributed-avsession.md
+++ b/en/application-dev/media/using-distributed-avsession.md
@@ -36,15 +36,15 @@ To enable a system application that accesses the AVSession service as the contro
let audioDevices;
await audioRoutingManager.getDevices(audio.DeviceFlag.OUTPUT_DEVICES_FLAG).then((data) => {
audioDevices = data;
- console.info('Promise returned to indicate that the device list is obtained.');
+ console.info(`Promise returned to indicate that the device list is obtained.`);
}).catch((err) => {
- console.info(`getDevices : ERROR : ${err.message}`);
+ console.error(`Failed to get devices. Code: ${err.code}, message: ${err.message}`);
});
AVSessionManager.castAudio('all', audioDevices).then(() => {
- console.info('createController : SUCCESS');
+ console.info(`createController : SUCCESS`);
}).catch((err) => {
- console.info(`createController : ERROR : ${err.message}`);
+ console.error(`Failed to cast audio. Code: ${err.code}, message: ${err.message}`);
});
```
diff --git a/en/application-dev/media/video-playback.md b/en/application-dev/media/video-playback.md
index fff4aa830ddc45e7d20e0fd06655adfdc5243fe5..f745f2f25120f7068de6634af56aa1f443c5b5d9 100644
--- a/en/application-dev/media/video-playback.md
+++ b/en/application-dev/media/video-playback.md
@@ -78,7 +78,9 @@ export class AVPlayerDemo {
private avPlayer;
private count: number = 0;
private surfaceID: string; // The surfaceID parameter specifies the window used to display the video. Its value is obtained through the XComponent.
-
+ private isSeek: boolean = true; // Whether the seek operation is supported.
+ private fileSize: number = -1;
+ private fd: number = 0;
// Set AVPlayer callback functions.
setAVPlayerCallback() {
// Callback function for the seek operation.
@@ -113,8 +115,13 @@ export class AVPlayerDemo {
case 'playing': // This state is reported upon a successful callback of play().
console.info('AVPlayer state playing called.');
if (this.count !== 0) {
- console.info('AVPlayer start to seek.');
- this.avPlayer.seek (this.avPlayer.duration); // Call seek() to seek to the end of the video clip.
+ if (this.isSeek) {
+ console.info('AVPlayer start to seek.');
+ this.avPlayer.seek(this.avPlayer.duration); // Call seek() to seek to the end of the video clip.
+ } else {
+ // When the seek operation is not supported, the playback continues until it reaches the end.
+ console.info('AVPlayer wait to play end.');
+ }
} else {
this.avPlayer.pause(); // Call pause() to pause the playback.
}
@@ -152,10 +159,11 @@ export class AVPlayerDemo {
let context = getContext(this) as common.UIAbilityContext;
// Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
let pathDir = context.filesDir;
- let path = pathDir + '/H264_AAC.mp4';
+ let path = pathDir + '/H264_AAC.mp4';
// Open the corresponding file address to obtain the file descriptor and assign a value to the URL to trigger the reporting of the initialized state.
let file = await fs.open(path);
fdPath = fdPath + '' + file.fd;
+ this.isSeek = true; // The seek operation is supported.
this.avPlayer.url = fdPath;
}
@@ -169,9 +177,86 @@ export class AVPlayerDemo {
// The return type is {fd,offset,length}, where fd indicates the file descriptor address of the HAP file, offset indicates the media asset offset, and length indicates the duration of the media asset to play.
let context = getContext(this) as common.UIAbilityContext;
let fileDescriptor = await context.resourceManager.getRawFd('H264_AAC.mp4');
+ this.isSeek = true; // The seek operation is supported.
// Assign a value to fdSrc to trigger the reporting of the initialized state.
this.avPlayer.fdSrc = fileDescriptor;
}
+
+ // The following demo shows how to use the file system to open the sandbox address, obtain the media file address, and play the media file with the seek operation using the dataSrc attribute.
+ async avPlayerDataSrcSeekDemo() {
+ // Create an AVPlayer instance.
+ this.avPlayer = await media.createAVPlayer();
+ // Set a callback function for state changes.
+ this.setAVPlayerCallback();
+ // dataSrc indicates the playback source address. When the seek operation is supported, fileSize indicates the size of the file to be played. The following describes how to assign a value to fileSize.
+ let src = {
+ fileSize: -1,
+ callback: (buf, length, pos) => {
+ let num = 0;
+ if (buf == undefined || length == undefined || pos == undefined) {
+ return -1;
+ }
+ num = fs.readSync(this.fd, buf, { offset: pos, length: length });
+ if (num > 0 && (this.fileSize >= pos)) {
+ return num;
+ }
+ return -1;
+ }
+ }
+ let context = getContext(this) as common.UIAbilityContext;
+ // Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
+ let pathDir = context.filesDir;
+ let path = pathDir + '/H264_AAC.mp4';
+ await fs.open(path).then((file) => {
+ this.fd = file.fd;
+ })
+ // Obtain the size of the file to be played.
+ this.fileSize = fs.statSync(path).size;
+ src.fileSize = this.fileSize;
+ this.isSeek = true; // The seek operation is supported.
+ this.avPlayer.dataSrc = src;
+ }
+
+ // The following demo shows how to use the file system to open the sandbox address, obtain the media file address, and play the media file without the seek operation using the dataSrc attribute.
+ async avPlayerDataSrcNoSeekDemo() {
+ // Create an AVPlayer instance.
+ this.avPlayer = await media.createAVPlayer();
+ // Set a callback function for state changes.
+ this.setAVPlayerCallback();
+ let context = getContext(this) as common.UIAbilityContext;
+ let src: object = {
+ fileSize: -1,
+ callback: (buf, length, pos) => {
+ let num = 0;
+ if (buf == undefined || length == undefined) {
+ return -1;
+ }
+ num = fs.readSync(this.fd, buf);
+ if (num > 0) {
+ return num;
+ }
+ return -1;
+ }
+ }
+ // Obtain the sandbox address filesDir through UIAbilityContext. The stage model is used as an example.
+ let pathDir = context.filesDir;
+ let path = pathDir + '/H264_AAC.mp4';
+ await fs.open(path).then((file) => {
+ this.fd = file.fd;
+ })
+ this.isSeek = false; // The seek operation is not supported.
+ this.avPlayer.dataSrc = src;
+ }
+
+ // The following demo shows how to play live streams by setting the network address through the URL.
+ async avPlayerLiveDemo() {
+ // Create an AVPlayer instance.
+ this.avPlayer = await media.createAVPlayer();
+ // Set a callback function for state changes.
+ this.setAVPlayerCallback();
+ this.isSeek = false; // The seek operation is not supported.
+ this.avPlayer.url = 'http://xxx.xxx.xxx.xxx:xx/xx/index.m3u8'; // Play live webcasting streams using HLS.
+ }
}
```
diff --git a/en/application-dev/quick-start/arkts-create-custom-components.md b/en/application-dev/quick-start/arkts-create-custom-components.md
index a747ba48ccf234f2ece8f7bd41bfe5a6f6799f86..b094ff46868af53f56627b1cfe2c5f297284f938 100644
--- a/en/application-dev/quick-start/arkts-create-custom-components.md
+++ b/en/application-dev/quick-start/arkts-create-custom-components.md
@@ -114,15 +114,28 @@ To fully understand the preceding example, a knowledge of the following concepts
}
```
+- \@Recycle: A custom component decorated with \@Recycle can be reused.
+
+ > **NOTE**
+ >
+ > Since API version 10, this decorator is supported in ArkTS widgets.
+
+ ```ts
+ @Recycle
+ @Component
+ struct MyComponent {
+ }
+ ```
+
## Member Functions/Variables
-In addition to the mandatory** build()** function, a custom component may implement other member functions with the following restrictions:
+In addition to the mandatory **build()** function, a custom component may implement other member functions with the following restrictions:
- Static functions are not supported.
-- Access to the member functions is always private. Defining **private** access is optional. Defining access other than **private** is a syntax error.
+- Access to the member functions is always private.
A custom component can also implement member variables with the following restrictions:
@@ -130,7 +143,7 @@ A custom component can also implement member variables with the following restri
- Static member variables are not supported.
-- Access to the member variables is always private.The access rules of member variables are the same as those of member functions.
+- Access to the member variables is always private. The access rules of member variables are the same as those of member functions.
- Local initialization is optional for some member variables and mandatory for others. For details about whether local initialization or initialization from the parent component is required, see [State Management](arkts-state-management-overview.md).
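A minimal sketch that stays within these restrictions might look as follows; the component, variable, and function names are made up for illustration.

```ts
@Component
struct MemberExample {
  // Member variables: no static members; access is always private; local initialization shown here.
  @State message: string = 'Hello';
  private suffix: string = '!';

  // Member function: no static functions; access is always private.
  private decoratedMessage(): string {
    return this.message + this.suffix;
  }

  build() {
    Text(this.decoratedMessage())
  }
}
```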
@@ -378,4 +391,5 @@ In the preceding example:
2. The **StyleExample** parent component holds a **Controller** instance and with which calls the **changeText** API of **Controller**. That is, the value of the state variable **value** of **MyComponent** is changed through the **this** pointer of the **MyComponent** child component held by the controller.
Through the encapsulation of the controller, **MyComponent** exposes the **changeText** API. All instances that hold the controller can call the **changeText** API to change the value of the **MyComponent** state variable **value**.
+
diff --git a/en/application-dev/quick-start/arkts-observed-and-objectlink.md b/en/application-dev/quick-start/arkts-observed-and-objectlink.md
index a84802480d1fbd3eb4b6942d5b8f1a2f5f642aa1..1282545cc1d65a1acdc90da3ac5a2828cf611ca9 100644
--- a/en/application-dev/quick-start/arkts-observed-and-objectlink.md
+++ b/en/application-dev/quick-start/arkts-observed-and-objectlink.md
@@ -316,7 +316,7 @@ class StringArray extends Array {
-Declare a class that extends from** Array**: **class StringArray extends Array\ {}** and create an instance of **StringArray**. The use of the **new** operator is required for the \@Observed class decorator to work properly.
+Declare a class that extends from **Array**: **class StringArray extends Array\<string\> {}** and create an instance of **StringArray**. The use of the **new** operator is required for the \@Observed class decorator to work properly.
```ts
diff --git a/en/application-dev/quick-start/arkts-page-custom-components-lifecycle.md b/en/application-dev/quick-start/arkts-page-custom-components-lifecycle.md
index 2b8f2293918bef081b6f6377a99d6fdb2e35a03e..90a06cc468f5dc383ec3cf15a9f2d8a894f63239 100644
--- a/en/application-dev/quick-start/arkts-page-custom-components-lifecycle.md
+++ b/en/application-dev/quick-start/arkts-page-custom-components-lifecycle.md
@@ -19,7 +19,7 @@ The following lifecycle callbacks are provided for the lifecycle of a page, that
- [onBackPress](../reference/arkui-ts/ts-custom-component-lifecycle.md#onbackpress): Invoked when the user clicks the Back button.
-The following lifecycle callbacks are provided for the lifecycle of a custom component, which is one decorated with \@Component:
+The following lifecycle callbacks are provided for the lifecycle of a component, that is, the lifecycle of a custom component decorated with \@Component:
- [aboutToAppear](../reference/arkui-ts/ts-custom-component-lifecycle.md#abouttoappear): Invoked when the custom component is about to appear. Specifically, it is invoked after a new instance of the custom component is created and before its **build** function is executed.
@@ -134,7 +134,7 @@ struct MyComponent {
Child()
}
// When this.showChild is false, the Child child component is deleted, and Child aboutToDisappear is invoked.
- Button('create or delete Child').onClick(() => {
+ Button('delete Child').onClick(() => {
this.showChild = false;
})
// Because of the pushing from the current page to Page2, onPageHide is invoked.
diff --git a/en/application-dev/quick-start/arkts-rendering-control-lazyforeach.md b/en/application-dev/quick-start/arkts-rendering-control-lazyforeach.md
index 7086569f912a0c447228643958c38b3bd16e0045..95b6665c834a4bdf281b4712717e71f51659c112 100644
--- a/en/application-dev/quick-start/arkts-rendering-control-lazyforeach.md
+++ b/en/application-dev/quick-start/arkts-rendering-control-lazyforeach.md
@@ -75,15 +75,17 @@ interface DataChangeListener {
}
```
-| Declaration | Parameter Type | Description |
-| ---------------------------------------- | -------------------------------------- | ---------------------------------------- |
-| onDataReloaded(): void | - | Invoked when all data is reloaded. |
-| onDataAdded(index: number):void | number | Invoked when data is added to the position indicated by the specified index. **index**: index of the position where data is added. |
-| onDataMoved(from: number, to: number): void | from: number, to: number | Invoked when data is moved. **from**: original position of data; **to**: target position of data. **NOTE** The ID must remain unchanged before and after data movement. If the ID changes, APIs for deleting and adding data must be called.|
-| onDataChanged(index: number): void | number | Invoked when data in the position indicated by the specified index is changed. **index**: listener for data changes. |
-| onDataAdd(index: number): void | number | Invoked when data is added to the position indicated by the specified index. **index**: index of the position where data is added. |
-| onDataMove(from: number, to: number): void | from: number, to: number | Invoked when data is moved. **from**: original position of data; **to**: target position of data. **NOTE** The ID must remain unchanged before and after data movement. If the ID changes, APIs for deleting and adding data must be called.|
-| onDataChanged(index: number): void | number | Invoked when data in the position indicated by the specified index is changed. **index**: index of the position where data is changed.|
+| Declaration | Parameter Type | Description |
+| ------------------------------------------------------------ | -------------------------------------- | ------------------------------------------------------------ |
+| onDataReloaded(): void | - | Invoked when all data is reloaded. |
+| onDataAdded(index: number):void<sup>(deprecated)</sup> | number | Invoked when data is added to the position indicated by the specified index. This API is deprecated since API version 8. You are advised to use **onDataAdd** instead. **index**: index of the position where data is added.|
+| onDataMoved(from: number, to: number): void<sup>(deprecated)</sup> | from: number, to: number | Invoked when data is moved. This API is deprecated since API version 8. You are advised to use **onDataMove** instead. **from**: original position of data; **to**: target position of data. **NOTE** The ID must remain unchanged before and after data movement. If the ID changes, APIs for deleting and adding data must be called.|
+| onDataDeleted(index: number):void<sup>(deprecated)</sup> | number | Invoked when data is deleted from the position indicated by the specified index. LazyForEach will update the displayed content accordingly. This API is deprecated since API version 8. You are advised to use **onDataDelete** instead. **index**: index of the position where data is deleted.|
+| onDataChanged(index: number): void<sup>(deprecated)</sup> | number | Invoked when data in the position indicated by the specified index is changed. This API is deprecated since API version 8. You are advised to use **onDataChange** instead. **index**: index of the position where data is changed.|
+| onDataAdd(index: number): void<sup>8+</sup> | number | Invoked when data is added to the position indicated by the specified index. **index**: index of the position where data is added.|
+| onDataMove(from: number, to: number): void<sup>8+</sup> | from: number, to: number | Invoked when data is moved. **from**: original position of data; **to**: target position of data. **NOTE** The ID must remain unchanged before and after data movement. If the ID changes, APIs for deleting and adding data must be called.|
+| onDataDelete(index: number):void<sup>8+</sup> | number | Invoked when data is deleted from the position indicated by the specified index. LazyForEach will update the displayed content accordingly. **index**: index of the position where data is deleted. **NOTE** Before **onDataDelete** is called, ensure that the corresponding data in **dataSource** has been deleted. Otherwise, undefined behavior will occur during page rendering.|
+| onDataChange(index: number): void<sup>8+</sup> | number | Invoked when data in the position indicated by the specified index is changed. **index**: index of the position where data is changed.|
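As a rough sketch of how these callbacks are typically driven, a data source can notify every registered listener after mutating its backing array. The class below is illustrative only and assumes the **IDataSource** contract used with **LazyForEach** (**totalCount**, **getData**, **registerDataChangeListener**, **unregisterDataChangeListener**); the item type and helper method are made up here.

```ts
class MyDataSource {
  private data: string[] = [];
  private listeners: DataChangeListener[] = [];

  totalCount(): number {
    return this.data.length;
  }

  getData(index: number): string {
    return this.data[index];
  }

  registerDataChangeListener(listener: DataChangeListener): void {
    if (this.listeners.indexOf(listener) < 0) {
      this.listeners.push(listener);
    }
  }

  unregisterDataChangeListener(listener: DataChangeListener): void {
    const pos = this.listeners.indexOf(listener);
    if (pos >= 0) {
      this.listeners.splice(pos, 1);
    }
  }

  // Append an item, then notify listeners so that LazyForEach renders the newly added position.
  pushData(item: string): void {
    this.data.push(item);
    this.listeners.forEach((listener) => listener.onDataAdd(this.data.length - 1));
  }
}
```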
## Restrictions
diff --git a/en/application-dev/quick-start/arkts-state-management-overview.md b/en/application-dev/quick-start/arkts-state-management-overview.md
index 3009306fa7b6ae61c9c9730635e5dc00fbe32dd2..09db3f3183cd8156d532be741698fcc2fbfd26ef 100644
--- a/en/application-dev/quick-start/arkts-state-management-overview.md
+++ b/en/application-dev/quick-start/arkts-state-management-overview.md
@@ -82,7 +82,7 @@ According to the data transfer mode and synchronization type, decorators can als
- Decorators that allow for two-way (mutable) transfer
-The following figure illustrates the decorators. For details, see [Managing State by a Component](arkts-state.md) and [Managing State by an Application](arkts-application-state-management-overview.md). You can use these decorators at your disposal to implement linkage between data and the UI.
+The following figure illustrates the decorators. For details, see [Component State Management](arkts-state.md) and [Application State Management](arkts-application-state-management-overview.md). You can use these decorators at your disposal to implement linkage between data and the UI.

@@ -106,6 +106,10 @@ Decorators for [managing the state owned by a component](arkts-state.md):
- \@ObjectLink: An \@ObjectLink decorated variable, when used with an \@Observed decorated class of the parent component, is for two-way data synchronization in scenarios involving multiple levels of nested objects or arrays in the class.
+> **NOTE**
+>
+> Only [\@Observed/\@ObjectLink](arkts-observed-and-objectlink.md) can observe changes of nested attributes. Other decorators can only observe changes of attributes at the first layer. For details, see the "Observed Changes and Behavior" part in each decorator section.
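
For example, a nested (second-layer) property change such as the one sketched below is observed only through this decorator pair; the class and component names here are made up for illustration.

```ts
@Observed
class ChildData {
  count: number = 0;
}

@Component
struct ChildView {
  // Two-way synchronization with the @Observed decorated instance held by the parent.
  @ObjectLink child: ChildData;

  build() {
    Text(`${this.child.count}`)
  }
}

@Entry
@Component
struct ParentView {
  @State child: ChildData = new ChildData();

  build() {
    Column() {
      ChildView({ child: this.child })
      Button('increase count').onClick(() => {
        // A nested change: @State alone cannot observe it, but ChildView updates
        // because ChildData is @Observed decorated and child is received through @ObjectLink.
        this.child.count += 1;
      })
    }
  }
}
```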
+
Decorators for [managing the state owned by an application](arkts-state.md):
diff --git a/en/application-dev/quick-start/arkts-state.md b/en/application-dev/quick-start/arkts-state.md
index d9cc3c750e3b2d5ad6a1adb539b177f797c62402..6ac35e388df8471d78245fd52fa82aa8a5a7f580 100644
--- a/en/application-dev/quick-start/arkts-state.md
+++ b/en/application-dev/quick-start/arkts-state.md
@@ -231,11 +231,11 @@ struct MyComponent {
Text(`${this.title.value}`)
Button(`Click to change title`).onClick(() => {
// The update of the @State decorated variable triggers the update of the component.
- this.title.value = this.title.value === 'Hello ArkUI' ? 'Hello World' : 'HelloArkUI';
+ this.title.value = this.title.value === 'Hello ArkUI' ? 'Hello World' : 'Hello ArkUI';
})
Button(`Click to increase count=${this.count}`).onClick(() => {
- // The update of the @State decorated variable triggers the update of the component.
+ // The update of the @State decorated variable triggers the update of the