Commit df0ec2b1 authored by: G Gloria

Update docs against 10615+11266

Signed-off-by: wusongqing<wusongqing@huawei.com>
Parent d6eada90
# Audio Playback Development
## Introduction
You can use audio playback APIs to convert audio data into audible analog signals and play the signals using output devices. You can also manage playback tasks. For example, you can start, suspend, or stop playback, release resources, set the volume, seek to a playback position, and obtain track information.
## Working Principles
The following figures show the audio playback status changes and the interaction with external modules for audio playback.
**Figure 1** Audio playback state transition
![en-us_image_audio_state_machine](figures/en-us_image_audio_state_machine.png)
**NOTE**: If the status is **Idle**, setting the **src** attribute does not change the status. In addition, after the **src** attribute is set successfully, you must call **reset()** before setting it to another value.
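For example, a minimal sketch of switching the playback source is shown below; the **fd://** values are placeholders, and in real code each step would be driven by the corresponding callback events.

```js
// Minimal sketch: changing the playback source (the fd values are placeholders).
import media from '@ohos.multimedia.media'

let audioPlayer = media.createAudioPlayer();
audioPlayer.src = 'fd://1';   // First source; triggers the 'dataLoad' event when ready.
// ... play(), pause(), and other operations ...
audioPlayer.reset();          // Return to the Idle state; triggers the 'reset' event.
audioPlayer.src = 'fd://2';   // Setting src again is allowed only after reset().
```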
**Figure 2** Interaction with external modules for audio playback
![en-us_image_audio_player](figures/en-us_image_audio_player.png)
**NOTE**: When a third-party application calls the JS interface provided by the JS interface layer to implement a feature, the framework layer invokes the audio component through the media service of the native framework and outputs the audio data decoded by the software to the audio HDI of the hardware interface layer to implement audio playback.
## How to Develop
For details about the APIs, see [AudioPlayer in the Media API](../reference/apis/js-apis-media.md#audioplayer).
> **NOTE**
>
> The method for obtaining the path in the FA model is different from that in the stage model. **pathDir** used in the sample code below is an example. You need to obtain the path based on project requirements. For details about how to obtain the path, see [Application Sandbox Path Guidelines](../reference/apis/js-apis-fileio.md#guidelines).
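The following sketch illustrates one way **pathDir** might be obtained; the context APIs used here (**featureAbility.getContext().getFilesDir()** for the FA model and **this.context.filesDir** for the stage model) are assumptions to verify against your ability type.

```js
// Sketch only: obtaining pathDir in the FA model (assumed context API; verify before use).
import featureAbility from '@ohos.ability.featureAbility'

async function getPathDir() {
  let context = featureAbility.getContext();  // FA-model ability context.
  let pathDir = await context.getFilesDir();  // Application files directory.
  return pathDir;
}
// Stage model: inside a UIAbility, the equivalent is typically this.context.filesDir.
```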
### Full-Process Scenario
The full audio playback process includes creating an instance, setting the URI, playing audio, seeking to the playback position, setting the volume, pausing playback, obtaining track information, stopping playback, resetting the player, and releasing resources.
......@@ -99,8 +109,9 @@ async function audioPlayerDemo() {
setCallBack(audioPlayer); // Set the event callbacks.
// 2. Set the URI of the audio file.
let fdPath = 'fd://'
let pathDir = "/data/storage/el2/base/haps/entry/files" // The method for obtaining pathDir in the FA model is different from that in the stage model. For details, see NOTE just below How to Develop. You need to obtain pathDir based on project requirements.
// The stream in the path can be pushed to the device by running the "hdc file send D:\xxx\01.mp3 /data/app/el2/100/base/ohos.acts.multimedia.audio.audioplayer/haps/entry/files" command.
let path = pathDir + '/01.mp3'
await fileIO.open(path).then((fdNumber) => {
fdPath = fdPath + '' + fdNumber;
console.info('open fd success fd is' + fdPath);
......@@ -118,6 +129,7 @@ async function audioPlayerDemo() {
```js
import media from '@ohos.multimedia.media'
import fileIO from '@ohos.fileio'
export class AudioDemo {
// Set the player callbacks.
setCallBack(audioPlayer) {
......@@ -139,8 +151,9 @@ export class AudioDemo {
let audioPlayer = media.createAudioPlayer(); // Create an AudioPlayer instance.
this.setCallBack(audioPlayer); // Set the event callbacks.
let fdPath = 'fd://'
let pathDir = "/data/storage/el2/base/haps/entry/files" // The method for obtaining pathDir in the FA model is different from that in the stage model. For details, see NOTE just below How to Develop. You need to obtain pathDir based on project requirements.
// The stream in the path can be pushed to the device by running the "hdc file send D:\xxx\01.mp3 /data/app/el2/100/base/ohos.acts.multimedia.audio.audioplayer/haps/entry/files" command.
let path = pathDir + '/01.mp3'
await fileIO.open(path).then((fdNumber) => {
fdPath = fdPath + '' + fdNumber;
console.info('open fd success fd is' + fdPath);
......@@ -159,6 +172,7 @@ export class AudioDemo {
```js
import media from '@ohos.multimedia.media'
import fileIO from '@ohos.fileio'
export class AudioDemo {
// Set the player callbacks.
private isNextMusic = false;
......@@ -185,8 +199,9 @@ export class AudioDemo {
async nextMusic(audioPlayer) {
this.isNextMusic = true;
let nextFdPath = 'fd://'
let pathDir = "/data/storage/el2/base/haps/entry/files" // The method for obtaining pathDir in the FA model is different from that in the stage model. For details, see NOTE just below How to Develop. You need to obtain pathDir based on project requirements.
// The stream in the path can be pushed to the device by running the "hdc file send D:\xxx\02.mp3 /data/app/el2/100/base/ohos.acts.multimedia.audio.audioplayer/haps/entry/files" command.
let nextpath = pathDir + '/02.mp3'
await fileIO.open(nextpath).then((fdNumber) => {
nextFdPath = nextFdPath + '' + fdNumber;
console.info('open fd success fd is' + nextFdPath);
......@@ -202,8 +217,9 @@ export class AudioDemo {
let audioPlayer = media.createAudioPlayer(); // Create an AudioPlayer instance.
this.setCallBack(audioPlayer); // Set the event callbacks.
let fdPath = 'fd://'
let pathDir = "/data/storage/el2/base/haps/entry/files" // The method for obtaining pathDir in the FA model is different from that in the stage model. For details, see NOTE just below How to Develop. You need to obtain pathDir based on project requirements.
// The stream in the path can be pushed to the device by running the "hdc file send D:\xxx\01.mp3 /data/app/el2/100/base/ohos.acts.multimedia.audio.audioplayer/haps/entry/files" command.
let path = pathDir + '/01.mp3'
await fileIO.open(path).then((fdNumber) => {
fdPath = fdPath + '' + fdNumber;
console.info('open fd success fd is' + fdPath);
......@@ -222,6 +238,7 @@ export class AudioDemo {
```js
import media from '@ohos.multimedia.media'
import fileIO from '@ohos.fileio'
export class AudioDemo {
// Set the player callbacks.
setCallBack(audioPlayer) {
......@@ -239,8 +256,9 @@ export class AudioDemo {
let audioPlayer = media.createAudioPlayer(); // Create an AudioPlayer instance.
this.setCallBack(audioPlayer); // Set the event callbacks.
let fdPath = 'fd://'
let pathDir = "/data/storage/el2/base/haps/entry/files" // The method for obtaining pathDir in the FA model is different from that in the stage model. For details, see NOTE just below How to Develop. You need to obtain pathDir based on project requirements.
// The stream in the path can be pushed to the device by running the "hdc file send D:\xxx\01.mp3 /data/app/el2/100/base/ohos.acts.multimedia.audio.audioplayer/haps/entry/files" command.
let path = pathDir + '/01.mp3'
await fileIO.open(path).then((fdNumber) => {
fdPath = fdPath + '' + fdNumber;
console.info('open fd success fd is' + fdPath);
......
# Audio Recording Development
## Introduction
During audio recording, audio signals are captured, encoded, and saved to files. You can specify parameters such as the sampling rate, number of audio channels, encoding format, encapsulation format, and output file path for audio recording.
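As an illustration of these parameters, the sketch below builds a possible recorder configuration; the values, the **fd://xx** URI, and the enum choices are placeholders to adapt to your project.

```js
// Sketch of an audio recording configuration (all values are placeholders).
import media from '@ohos.multimedia.media'

let audioRecorder = media.createAudioRecorder();     // Create an AudioRecorder instance.
let audioRecorderConfig = {
  audioEncodeBitRate: 22050,                         // Encoding bit rate.
  audioSampleRate: 22050,                            // Sampling rate.
  numberOfChannels: 2,                               // Number of audio channels.
  audioEncoderMime: media.CodecMimeType.AUDIO_AAC,   // Encoding format.
  fileFormat: media.ContainerFormatType.CFT_MPEG_4A, // Encapsulation (container) format.
  uri: 'fd://xx',                                    // Output file, passed as a file descriptor.
};
audioRecorder.on('prepare', () => {                  // Reported when the configuration takes effect.
  audioRecorder.start();                             // Start recording only after 'prepare' is received.
});
audioRecorder.prepare(audioRecorderConfig);          // Apply the recording parameters.
```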
## Working Principles
The following figures show the audio recording state transition and the interaction with external modules for audio recording.
**Figure 1** Audio recording state transition
......@@ -10,10 +14,16 @@ During audio recording, audio signals are captured, encoded, and saved to files.
**Figure 2** Interaction with external modules for audio recording
![en-us_image_audio_recorder_zero](figures/en-us_image_audio_recorder_zero.png)
**NOTE**: When a third-party recording application or recorder calls the JS interface provided by the JS interface layer to implement a feature, the framework layer invokes the audio component through the media service of the native framework to obtain the audio data captured through the audio HDI. The framework layer then encodes the audio data through software and saves the encoded and encapsulated audio data to a file to implement audio recording.
## Constraints
Before developing audio recording, configure the **ohos.permission.MICROPHONE** permission for your application. For details about the configuration, see [Permission Application Guide](../security/accesstoken-guidelines.md).
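Because **ohos.permission.MICROPHONE** is a user-granted permission, it usually also has to be requested at runtime. A hedged sketch for the stage model is shown below; **requestMicrophonePermission** and its **context** parameter are illustrative names, and the authoritative flow is described in the guide linked above.

```js
// Sketch only: requesting the microphone permission at runtime (stage model).
import abilityAccessCtrl from '@ohos.abilityAccessCtrl'

async function requestMicrophonePermission(context) { // context: a valid UIAbility context.
  let atManager = abilityAccessCtrl.createAtManager();
  let result = await atManager.requestPermissionsFromUser(context, ['ohos.permission.MICROPHONE']);
  console.info('permission request results: ' + JSON.stringify(result.authResults));
}
```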
## How to Develop
For details about the APIs, see [AudioRecorder in the Media API](../reference/apis/js-apis-media.md#audiorecorder).
......
# Video Playback Development
## Introduction
You can use video playback APIs to convert video data into visible signals and play the signals using output devices. You can also manage playback tasks. For example, you can start, suspend, or stop playback, release resources, set the volume, seek to a playback position, set the playback speed, and obtain track information. This document describes development for the following video playback scenarios: full-process, normal playback, video switching, and loop playback.
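For orientation, a condensed sketch of the typical call sequence is shown below; the **surfaceID** and **fdUrl** arguments are placeholders, and the complete runnable samples follow in this guide.

```js
// Condensed sketch of the video playback call sequence (surfaceID and fdUrl are placeholders).
import media from '@ohos.multimedia.media'

async function videoPlaybackSketch(surfaceID, fdUrl) { // surfaceID from an XComponent, fdUrl like 'fd://xx'.
  let videoPlayer = await media.createVideoPlayer();   // Create a VideoPlayer instance.
  videoPlayer.url = fdUrl;                             // Set the playback source.
  await videoPlayer.setDisplaySurface(surfaceID);      // Bind the display surface.
  await videoPlayer.prepare();                         // Prepare for playback.
  await videoPlayer.play();                            // Start playback.
  await videoPlayer.setSpeed(media.PlaybackSpeed.SPEED_FORWARD_2_00_X); // Optional: change the playback speed.
  await videoPlayer.release();                         // Release resources when playback is no longer needed.
}
```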
## Working Principles
The following figures show the video playback state transition and the interaction with external modules for video playback.
**Figure 1** Video playback state transition
![en-us_image_video_state_machine](figures/en-us_image_video_state_machine.png)
**Figure 2** Interaction with external modules for video playback
![en-us_image_video_player](figures/en-us_image_video_player.png)
**NOTE**: When a third-party application calls a JS interface provided by the JS interface layer, the framework layer invokes the audio component through the media service of the native framework to output the audio data decoded by the software to the audio HDI. The graphics subsystem outputs the image data decoded by the codec HDI at the hardware interface layer to the display HDI. In this way, video playback is implemented.
*Note: Video playback requires hardware capabilities such as display, audio, and codec.*
1. A third-party application obtains a surface ID from the XComponent.
......
# Video Recording Development
## Introduction
You can use video recording APIs to capture audio and video signals, encode them, and save them to files. You can start, suspend, resume, and stop recording, and release resources. You can also specify parameters such as the encoding format, encapsulation format, and file path for video recording.
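To illustrate the recording life cycle, a hedged sketch follows; the configuration values and the **fdUrl** output are placeholders, and the full samples appear later in this guide.

```js
// Condensed sketch of the video recording life cycle (configuration values are placeholders).
import media from '@ohos.multimedia.media'

async function videoRecordingSketch(fdUrl) {             // fdUrl like 'fd://xx' for the output file.
  let videoRecorder = await media.createVideoRecorder(); // Create a VideoRecorder instance.
  let videoConfig = {
    audioSourceType: 1,                                  // Audio source: microphone.
    videoSourceType: 0,                                  // Video source: surface-based camera input.
    profile: {
      audioBitrate: 48000, audioChannels: 2, audioCodec: 'audio/mp4a-latm', audioSampleRate: 48000,
      fileFormat: 'mp4', videoBitrate: 2000000, videoCodec: 'video/mp4v-es',
      videoFrameWidth: 640, videoFrameHeight: 480, videoFrameRate: 30
    },
    url: fdUrl,
  };
  await videoRecorder.prepare(videoConfig);              // Apply the recording parameters.
  let surfaceID = await videoRecorder.getInputSurface(); // Pass this surface ID to the camera module.
  await videoRecorder.start();                           // Start recording.
  await videoRecorder.pause();                           // Suspend recording.
  await videoRecorder.resume();                          // Resume recording.
  await videoRecorder.stop();                            // Stop recording.
  await videoRecorder.release();                         // Release resources.
}
```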
## Working Principles
The following figures show the video recording state transition and the interaction with external modules for video recording.
**Figure 1** Video recording state transition
![en-us_image_video_recorder_state_machine](figures/en-us_image_video_recorder_state_machine.png)
**Figure 2** Interaction with external modules for video recording
![en-us_image_video_recorder_zero](figures/en-us_image_video_recorder_zero.png)
**NOTE**: When a third-party camera application or system camera calls a JS interface provided by the JS interface layer, the framework layer uses the media service of the native framework to invoke the audio component. Through the audio HDI, the audio component captures audio data, encodes the audio data through software, and saves the encoded audio data to a file. The graphics subsystem captures image data through the video HDI, encodes the image data through the video codec HDI, and saves the encoded image data to a file. In this way, video recording is implemented.
## Constraints
Before developing video recording, configure the permissions **ohos.permission.MICROPHONE** and **ohos.permission.CAMERA** for your application. For details about the configuration, see [Permission Application Guide](../security/accesstoken-guidelines.md).
## How to Develop
For details about the APIs, see [VideoRecorder in the Media API](../reference/apis/js-apis-media.md#videorecorder9).
......@@ -147,3 +157,4 @@ export class VideoRecorderDemo {
}
}
```
# Media
> **NOTE**
>
> The initial APIs of this module are supported since API version 6. Newly added APIs will be marked with a superscript to indicate their earliest API version.
The multimedia subsystem provides a set of simple and easy-to-use APIs for you to access the system and use media resources.
......@@ -53,7 +52,7 @@ Creates a **VideoPlayer** instance. This API uses an asynchronous callback to re
| Name | Type | Mandatory| Description |
| -------- | ------------------------------------------- | ---- | ------------------------------ |
| callback | AsyncCallback<[VideoPlayer](#videoplayer8)> | Yes | Callback used to return the result. If the operation is successful, the **VideoPlayer** instance is returned; otherwise, **null** is returned. The instance can be used to manage and play video media.|
**Example**
......@@ -80,9 +79,9 @@ Creates a **VideoPlayer** instance. This API uses a promise to return the result
**Return value**
| Type | Description |
| ------------------------------------- | ------------------------------------------------------------ |
| Promise<[VideoPlayer](#videoplayer8)> | Promise used to return the result. If the operation is successful, the **VideoPlayer** instance is returned; otherwise, **null** is returned. The instance can be used to manage and play video media.|
**Example**
......@@ -112,9 +111,9 @@ Only one **AudioRecorder** instance can be created per device.
**Return value**
| Type | Description |
| ------------------------------- | ------------------------------------------------------------ |
| [AudioRecorder](#audiorecorder) | Returns the **AudioRecorder** instance if the operation is successful; returns **null** otherwise. The instance can be used to record audio media.|
**Example**
......@@ -135,7 +134,7 @@ Only one **AudioRecorder** instance can be created per device.
| Name | Type | Mandatory| Description |
| -------- | ----------------------------------------------- | ---- | ------------------------------ |
| callback | AsyncCallback<[VideoRecorder](#videorecorder9)> | Yes | Callback used to return the result. If the operation is successful, the **VideoRecorder** instance is returned; otherwise, **null** is returned. The instance can be used to record video media.|
**Example**
......@@ -163,9 +162,9 @@ Only one **AudioRecorder** instance can be created per device.
**Return value**
| Type | Description |
| ----------------------------------------- | ------------------------------------------------------------ |
| Promise<[VideoRecorder](#videorecorder9)> | Promise used to return the result. If the operation is successful, the **VideoRecorder** instance is returned; otherwise, **null** is returned. The instance can be used to record video media.|
**Example**
......@@ -263,7 +262,7 @@ Enumerates the buffering event types.
| BUFFERING_START | 1 | Buffering starts. |
| BUFFERING_END | 2 | Buffering ends. |
| BUFFERING_PERCENT | 3 | Buffering progress, in percent. |
| CACHED_DURATION | 4 | Cache duration, in ms.|
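A minimal sketch of reacting to these types in a **'bufferingUpdate'** callback (assuming an existing **AudioPlayer** or **VideoPlayer** instance named **player**):

```js
// Illustrative only: handling BufferingInfoType values in a 'bufferingUpdate' callback.
// Assumes: import media from '@ohos.multimedia.media' and an existing player instance.
player.on('bufferingUpdate', (infoType, value) => {
  if (infoType == media.BufferingInfoType.BUFFERING_PERCENT) {
    console.info('buffering progress: ' + value + '%');  // value carries the buffering percentage.
  } else if (infoType == media.BufferingInfoType.CACHED_DURATION) {
    console.info('cached duration: ' + value + ' ms');   // value carries the cached duration.
  }
});
```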
## AudioPlayer
......@@ -362,9 +361,9 @@ Seeks to the specified playback position.
**Parameters**
| Name| Type | Mandatory| Description |
| ------ | ------ | ---- | ----------------------------------------------------------- |
| timeMs | number | Yes | Position to seek to, in ms. The value range is [0, duration].|
**Example**
......@@ -427,9 +426,9 @@ Obtains the audio track information. This API uses an asynchronous callback to r
**Parameters**
| Name | Type | Mandatory| Description |
| -------- | ------------------------------------------------------------ | ---- | ------------------------------------------ |
| callback | AsyncCallback<Array<[MediaDescription](#mediadescription8)>> | Yes | Callback used to return a **MediaDescription** array, which records the audio track information.|
**Example**
......@@ -463,9 +462,9 @@ Obtains the audio track information. This API uses a promise to return the resul
**Return value**
| Type | Description |
| ------------------------------------------------------ | ----------------------------------------------- |
| Promise<Array<[MediaDescription](#mediadescription8)>> | Promise used to return a **MediaDescription** array, which records the audio track information.|
**Example**
......@@ -497,7 +496,7 @@ for (let i = 0; i < arrayDescription.length; i++) {
on(type: 'bufferingUpdate', callback: (infoType: [BufferingInfoType](#bufferinginfotype8), value: number) => void): void
Subscribes to the audio buffering update event. Only network playback supports this subscription.
**System capability**: SystemCapability.Multimedia.Media.AudioPlayer
......@@ -594,7 +593,7 @@ audioPlayer.src = fdPath; // Set the src attribute and trigger the 'dataLoad' e
on(type: 'timeUpdate', callback: Callback\<number>): void
Subscribes to the **'timeUpdate'** event. This event is reported every second when the audio playback is in progress.
**System capability**: SystemCapability.Multimedia.Media.AudioPlayer
......@@ -1014,10 +1013,10 @@ Seeks to the specified playback position. The next key frame at the specified po
**Parameters**
| Name | Type | Mandatory| Description |
| -------- | -------- | ---- | ------------------------------------------------------------ |
| timeMs | number | Yes | Position to seek to, in ms. The value range is [0, duration].|
| callback | function | Yes | Callback used to return the result. |
**Example**
......@@ -1042,11 +1041,11 @@ Seeks to the specified playback position. This API uses an asynchronous callback
**Parameters**
| Name | Type | Mandatory| Description |
| -------- | ---------------------- | ---- | ------------------------------------------------------------ |
| timeMs | number | Yes | Position to seek to, in ms. The value range is [0, duration].|
| mode | [SeekMode](#seekmode8) | Yes | Seek mode. |
| callback | function | Yes | Callback used to return the result. |
**Example**
......@@ -1072,16 +1071,16 @@ Seeks to the specified playback position. If **mode** is not specified, the next
**Parameters**
| Name| Type | Mandatory| Description |
| ------ | ---------------------- | ---- | ------------------------------------------------------------ |
| timeMs | number | Yes | Position to seek to, in ms. The value range is [0, duration].|
| mode | [SeekMode](#seekmode8) | No | Seek mode. |
**Return value**
| Type | Description |
| -------------- | ------------------------------------------- |
| Promise\<number> | Promise used to return the playback position, in ms.|
**Example**
......@@ -1220,9 +1219,9 @@ Obtains the video track information. This API uses an asynchronous callback to r
**Parameters**
| Name | Type | Mandatory| Description |
| -------- | ------------------------------------------------------------ | ---- | ------------------------------------------ |
| callback | AsyncCallback<Array<[MediaDescription](#mediadescription8)>> | Yes | Callback used to return a **MediaDescription** array, which records the video track information.|
**Example**
......@@ -1256,9 +1255,9 @@ Obtains the video track information. This API uses a promise to return the resul
**Return value**
| Type | Description |
| ------------------------------------------------------ | ----------------------------------------------- |
| Promise<Array<[MediaDescription](#mediadescription8)>> | Promise used to return a **MediaDescription** array, which records the video track information.|
**Example**
......@@ -1332,9 +1331,9 @@ Sets the video playback speed. This API uses a promise to return the result.
**Return value**
| Type | Description |
| ---------------- | ------------------------------------------------------------ |
| Promise\<number> | Promise used to return the playback speed. For details, see [PlaybackSpeed](#playbackspeed8).|
**Example**
......@@ -1389,13 +1388,13 @@ Selects a bit rate from available ones, which can be obtained by calling [availa
| Name | Type | Mandatory| Description |
| ------- | ------ | ---- | -------------------------------------------- |
| bitrate | number | Yes | Bit rate, which is used in the HLS multi-bit rate scenario. The unit is bit/s.|
**Return value**
| Type | Description |
| ---------------- | --------------------------- |
| Promise\<number> | Promise used to return the bit rate.|
**Example**
......@@ -1435,7 +1434,7 @@ videoPlayer.on('playbackCompleted', () => {
on(type: 'bufferingUpdate', callback: (infoType: BufferingInfoType, value: number) => void): void
Subscribes to the video buffering update event. Only network playback supports this subscription.
**System capability**: SystemCapability.Multimedia.Media.VideoPlayer
......