Camera is one of the services provided by the OpenHarmony multimedia subsystem. The camera module provides recording, preview, and photographing features and supports concurrent stream reading by multiple users.
...
A video frame is the data of a single image in a video. A video data stream consists of a series of such image frames arranged at a fixed time interval.
- Frames per second (FPS)
FPS represents the frame rate at which images are refreshed during video playback, that is, the number of frames displayed per second. A higher frame rate means smoother video playback.
- Resolution
Each image frame consists of pixels, and the number of pixels in an image is expressed as its resolution. For example, 1080p (1920 x 1080) indicates that the image width is 1920 pixels and the image height is 1080 pixels.
## Working Principles
- Multimedia services
Multimedia services are started by the **Init** process upon system startup, and media hardware resources (such as memory, display hardware, image sensors, and codecs) are initialized and allocated. During the initialization, the configuration file is parsed, which determines the upper limit of capabilities and resources of each service. Generally, the upper limit is configured by original equipment manufacturers (OEMs) in the configuration file. The following configuration items are available for the camera service during multimedia service initialization:
- Memory pool: Memory blocks in the memory pool are accessed and released continuously by all multimedia services.
- Image sensor: sensor type, resolution, ISP, and more.
- Image processor: resolution, bit rate, image inversion, and more.
- Image encoder: encoding format, bit rate, resolution, and more.
- Major classes
You can use the **Camera** class and its asynchronous callback classes to configure and access the camera functionalities. The three callback classes correspond to different asynchronous processing scenarios, as described in the table below.

| Class | Description | Asynchronous Processing Scenario |
| -------- | -------- | -------- |
| Camera | Configures the static camera capability through the configuration class to use basic camera functionalities.| Photographing, video recording, and previewing|
| FrameStateCallback | Handles frame status changes.| Start and end of photographing, and frame rate changes|
- Stream transfer
...
Similarly, you can create a surface, implement consumer logic for it, and transfer it to the **Camera** instance. For example, you can transmit video streams over the network or save captured frame data as an image file.
The graphics module also obtains stream resources from the camera module through surfaces. For details, see [Overview of Small-System Graphics](../subsystems/subsys-graphics-overview.md).
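
As a minimal illustration of this surface-based transfer, the sketch below creates a surface, binds it to a capture frame configuration, and triggers a single capture so that the frame data is delivered to the surface consumer. It assumes the camera lite C++ API shown in the camera development guidelines (**Surface::CreateSurface**, **FrameConfig**, **TriggerSingleCapture**); exact names and signatures may differ between versions.

```cpp
// Sketch only: assumes the camera lite C++ API; verify class and method names against your SDK.
#include "camera_kit.h"
#include "surface.h"

using namespace OHOS;
using namespace OHOS::Media;

void CaptureToSurface(Camera &camera)
{
    // Create a surface to consume the captured frame data
    // (its consumer logic could, for example, save the data as an image file).
    Surface *surface = Surface::CreateSurface();
    if (surface == nullptr) {
        return;
    }
    surface->SetWidthAndHeight(1920, 1080); // Use a resolution the image sensor supports.

    // Bind the surface to a capture frame configuration and trigger a single capture.
    FrameConfig *fc = new FrameConfig(FRAME_CONFIG_CAPTURE);
    fc->AddSurface(*surface);
    camera.TriggerSingleCapture(*fc);
}
```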
- Camera running process
1. Creating a camera
This process creates a **Camera** instance by **CameraManager**, binds the camera device to the server, and asynchronously notifies you of the successful creation. The following figure shows the time sequence between classes.
**Figure 1** Sequence diagram for creating a camera
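
The following sketch outlines this creation step in code. It assumes the **CameraKit**-based camera lite C++ API from the camera development guidelines (**GetCameraIds**, **CreateCamera**, and a **CameraStateCallback** subclass); exact names and signatures may differ between versions.

```cpp
// Sketch only: assumes the camera lite C++ API; verify class and method names against your SDK.
#include <list>
#include <string>
#include "camera_kit.h"

using namespace OHOS::Media;

// Creation is asynchronous: the result is reported through a state callback.
class SampleCameraStateCallback : public CameraStateCallback {
public:
    void OnCreated(Camera &camera) override
    {
        // The camera device is bound to the server and ready for configuration.
    }
    void OnCreateFailed(const std::string cameraId, int32_t errorCode) override {}
};

void CreateCameraSample()
{
    CameraKit *cameraKit = CameraKit::GetInstance();
    std::list<std::string> cameraIds = cameraKit->GetCameraIds();
    if (cameraIds.empty()) {
        return;
    }
    EventHandler handler;               // Thread on which the callbacks are dispatched.
    SampleCameraStateCallback callback; // Must outlive the camera it monitors.
    cameraKit->CreateCamera(cameraIds.front(), callback, handler);
}
```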
2. Recording/previewing

This process creates a **Camera** instance via **CameraKit**, and configures frame attributes via **FrameConfig** for recording or previewing. The following figure shows the time sequence.
**Figure 2** Sequence diagram for recording/previewing
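
Continuing the sketch from the creation step, once **OnCreated** delivers a usable **Camera** instance, previewing or recording is started by binding a surface to a **FrameConfig** and triggering a looping capture. Again this assumes the camera lite API; **FRAME_CONFIG_PREVIEW**, **TriggerLoopingCapture**, and **StopLoopingCapture** are taken from the development guidelines and may differ between versions.

```cpp
// Sketch only: assumes the camera lite C++ API; verify class and method names against your SDK.
void StartPreview(Camera &camera, Surface &previewSurface)
{
    // Describe the preview stream through a frame configuration and attach the target surface.
    FrameConfig *fc = new FrameConfig(FRAME_CONFIG_PREVIEW);
    fc->AddSurface(previewSurface);

    // Frames keep flowing to the surface until the looping capture is stopped.
    camera.TriggerLoopingCapture(*fc);
}

void StopPreview(Camera &camera)
{
    camera.StopLoopingCapture();
}
```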
OpenHarmony multimedia services help you to develop audio and video playback and recording.
- The media playback module facilitates the development of audio and video playback, including media file and stream playback, volume control, and playback progress control.
- The media recording module supports the development of audio and video recording and provides functions to set the size of captured video, encoding bit rate, encoder type, video frame rate, audio sampling rate, and output file format.
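
As a playback-side illustration, the sketch below uses the lite **Player** class described in the media playback development guidelines (**Source**, **SetSource**, **Prepare**, **Play**). The file path is hypothetical, and the exact API may differ between versions.

```cpp
// Sketch only: assumes the lite media playback C++ API (player.h / source.h); verify against your SDK.
#include <map>
#include <memory>
#include <string>
#include "player.h"
#include "source.h"

using namespace OHOS::Media;

void PlayLocalFile()
{
    auto player = std::make_shared<Player>();

    std::string uri = "/data/media/sample.mp4";  // Hypothetical path to a local media file.
    std::map<std::string, std::string> header;   // Extra headers, only needed for network sources.
    Source source(uri, header);

    player->SetSource(source);
    player->Prepare();  // Parse the file and set up the decoders.
    player->Play();     // Start playback; volume and progress control are also available.

    // ... when playback is no longer needed ...
    player->Stop();
    player->Release();
}
```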
## Basic Concepts
It is considered good practice that you understand the following concepts before starting development:
...
Streaming media technology refers to the process of encoding continuous video and audio data and storing the data on a network server, so that a viewer can watch and listen while the content is being downloaded, without waiting for the download to complete.
- Video frame rate
The frame rate measures the number of frames displayed per second, that is, the number of images transmitted per second. The more frames per second (FPS), the smoother the video.
- Bit rate
...
- Sampling rate
The sampling rate is the number of samples per second taken from continuous signals to form discrete signals. The unit is hertz (Hz).
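
For example, one second of uncompressed stereo audio sampled at 44,100 Hz with 16-bit samples contains 44,100 x 2 channels x 16 bits = 1,411,200 bits, which corresponds to an uncompressed bit rate of about 1.4 Mbit/s.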
## Codec Specifications

Available audio and video codecs vary depending on device types. The following table lists supported specifications for available development boards.

**Table 1** Codec specifications for different development boards

| Device Type| Development Board| Decoding| Encoding|
| -------- | -------- | -------- | -------- |
| Cameras with a screen| Hi3516 | - Audio: MPEG-4 AAC Profile (AAC LC), mono and dual channels, and the MPEG-4 (.mp4 and .m4a) container formats are supported.<br>- Video: H.265 (HEVC) and H.264 (AVC) (for streams encoded using a chip of the same type) and the MPEG-4 (.mp4) container format are supported.| - Audio: AAC-LC encoding, mono and dual channels, and the MPEG-4 (.mp4) container format are supported.<br>- Video: H.264 and H.265 encoding and the MPEG-4 (.mp4) container format are supported.|
| Cameras without a screen| Hi3518 | - Audio: MPEG-4 AAC Profile (AAC LC), mono and dual channels, and the MPEG-4 (.mp4 and .m4a) container formats are supported.<br>- Video: none| - Audio: AAC-LC encoding, mono and dual channels, and the MPEG-4 (.mp4) container format are supported.<br>- Video: H.264 and H.265 encoding and the MPEG-4 (.mp4) container format are supported.|
| WLAN connecting devices| Hi3861 | N/A| N/A|

For details about the codec specifications of Hi3516 and Hi3518, refer to their documentation.