This repository stores source code information of the multimedia subsystem. It provides unified interfaces for you to develop media applications. With this repository, you can easily obtain media resources and focus on service development. The following two figures show the framework and service flow of the multimedia subsystem, respectively.
The device configuration file is stored in **test\\lite\\devini**. To use it, place it in the **/data** directory of the development board in use. The configuration file is used to adapt to the sensor, resolution, frame rate, and more.
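For illustration only, such a configuration file might group sensor and stream settings into sections as sketched below. Every key name here is hypothetical; see the files shipped in **test\\lite\\devini** for the actual format:

```ini
; Hypothetical sketch -- key names are NOT the real ones.
; Consult the configuration files in test\lite\devini for the actual format.
[sensor]
type = IMX335          ; sensor model fitted on the board

[stream]
width = 1920           ; output resolution
height = 1080
framerate = 30         ; frames per second
```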
As shown in Figure 1, the multimedia framework supports the camera, recording, and playback functions. These functions underpin the development of HarmonyOS JavaScript applications and the kit modules that use media capabilities. The multimedia framework consists of the framework layer and the core service layer. The framework layer provides native APIs and the corresponding service implementation for applications. It implements audio/video input and output, audio/video encoding and decoding, and video file packing and demultiplexing for the camera, recording, and playback services. The core service layer leverages the capabilities provided by the hardware platform to drive the underlying hardware and related drivers. In addition, the core service layer implements file management, storage management, and log management.
![](figures/en-us_image_0000001055193837.png)
**Figure 2** Multimedia service flow<a name="fig931392183713"></a>
As shown in Figure 2, the multimedia subsystem consists of camera, recorder, and player modules. The camera module provides YUV or RGB, JPEG, and H.264 or H.265 data, which is stored in the surface \(shared memory\); the recorder module packs H.264 or H.265 and AAC data in the surface into MP4 files; the player module demultiplexes the MP4 files into audio and video data, sends the data to corresponding decoders, and then plays the audio and video.
<td class="cellrowborder" valign="top" width="59.29%" headers="mcps1.2.3.1.2 "><p id="p879375920132"><a name="p879375920132"></a><a name="p879375920132"></a>Internal framework implementation, including <strong id="b1696433143010"><a name="b1696433143010"></a><a name="b1696433143010"></a>Audio</strong>, <strong id="b19075193012"><a name="b19075193012"></a><a name="b19075193012"></a>Camera</strong>, <strong id="b19889207193019"><a name="b19889207193019"></a><a name="b19889207193019"></a>Player</strong>, and <strong id="b166613312593"><a name="b166613312593"></a><a name="b166613312593"></a>Recorder</strong>.</p>
<td class="cellrowborder" valign="top" width="59.29%" headers="mcps1.2.3.1.2 "><p id="p182076317465"><a name="p182076317465"></a><a name="p182076317465"></a>Underlying service implementation</p>
<td class="cellrowborder" valign="top" width="59.29%" headers="mcps1.2.3.1.2 "><p id="p1279144754611"><a name="p1279144754611"></a><a name="p1279144754611"></a>Header files of multimedia adaptation APIs related to the hardware platform</p>
</td>
</tr>
</tbody>
...
## Constraints<a name="section722512541395"></a>
- C++11 or later
- Currently, Hi3516DV300 and Hi3518EV300 are supported, and only Hi3516DV300 supports the playback function.
- By default, Hi3516DV300 supports the Sony IMX335 sensor, and Hi3518EV300 supports the SOI \(Silicon Optronics\) JXF23 sensor.
## Installation<a name="section11914418405"></a>
- Load the kernel and related drivers before installing this repository. For details, see the README files of the kernel and driver subsystems.
- Configure a proper configuration file. For details, see the configuration files in the **applications/sample/camera/media** directory. To adapt to other sensors, seek help from the open source community. Ensure that the configuration file is stored in the **/storage/data** directory of the development board in use. The configuration file is used to adapt to the sensor, resolution, and frame rate.
## Usage<a name="section1467220266400"></a>
For details about how to call native APIs, see the demo in the **applications/sample/camera/media** directory.
You can use the media APIs to record, preview, and play audio and video. Before using these resources, create a **CameraKit** object and register callbacks to respond to events in the media module. Then, create a **Camera** object to operate camera resources, for example, to start preview, recording, or stream capturing, and to set related parameters.
For details about how to call the multimedia APIs to implement video recording, preview, and playback, see the _Multimedia Development Guide_.