- HDI implementation layer: implements standard southbound interfaces of OpenHarmony cameras.
- Framework layer: connects to the HDI implementation layer for control instruction and stream transfer, establishes data channels, and manages camera devices.
- Adaptation layer: shields the differences between bottom-layer chips and OSs for multi-platform adaptation.

**<a name="fig3672817152110"></a>**
**Figure 1** Camera driver model based on the HDF driver framework

![](figures/logic-view-of-camera-hal-zh.png)
1. The CameraDeviceHost process is created during system startup. The process enumerates underlying devices, creates a **DeviceManager** instance that manages the device tree, an object for each underlying device, and a **CameraHost** instance, and registers the **CameraHost** instance with the UHDF service. Through this service, the upper layer can obtain the CameraDeviceHost process at the bottom layer to operate the underlying devices. Note that the **DeviceManager** instance can also be created by using the configuration table.
2. The Camera Service obtains the **CameraHost** instance through the CameraDeviceHost service. The **CameraHost** instance can be used to obtain the bottom-layer camera capabilities, turn on the flashlight, call the **Open\(\)** interface to start the camera and create a connection, create a **DeviceManager** instance \(powering on the bottom-layer hardware modules\), and create a **CameraDevice** instance \(providing the device control interface for the upper layer\). When the **CameraDevice** instance is created, each submodule of PipelineCore is instantiated. Among the submodules, StreamPipelineCore is responsible for creating pipelines, and MetaQueueManager is responsible for reporting metadata.
3. The Camera Service configures the stream and creates a **Stream** instance through the **CameraDevice** instance at the bottom layer. The StreamPipelineStrategy module creates the node connection mode of the corresponding stream by using the mode issued by the upper layer and querying the configuration table. The StreamPipelineBuilder module creates a node and returns the pipeline to the StreamPipelineDispatcher module through the connection. The StreamPipelineDispatcher module provides unified pipeline invoking management.
4. The Camera Service controls the stream operations through the **Stream** instance. The **AttachBufferQueue\(\)** interface is used to deliver the buffer queue requested from the display module to the bottom layer, and the CameraDeviceDriverModel manages the buffers on its own. After the **Capture\(\)** interface is called to deliver commands, the bottom layer starts to transfer buffers to the upper layer. The IspNode of the pipeline obtains a specified number of buffers from the buffer queue and delivers them to the bottom-layer Image Signal Processor \(ISP\) hardware. After filling the buffers, the ISP hardware transfers them to the CameraDeviceDriverModel, which fills the created pipeline with the buffers by using a loop thread. Each node processes the buffers and transfers them to the upper layer through a callback. At the same time, the buffers are returned to the buffer queue to wait for the next delivery.
5. The Camera Service delivers the photographing command through the **Capture\(\)** interface. The **ChangeToOfflineStream\(\)** interface is used to query the position of the photographing buffer. If the ISP hardware has output an image and sent the image data to the IPP node, the common photographing streams can be converted into offline streams. Otherwise, the close process is executed. The **ChangeToOfflineStream\(\)** interface transfers **StreamInfo** to enable the offline stream to obtain the stream information of the common stream, confirms the node connection mode of the offline stream based on the configuration table, and creates the node connection of the offline stream. If the node connection has been created, the interface releases the node required by the non-offline stream through **CloseCamera\(\)**. It then waits for the buffer to return from the bottom-layer pipeline to the upper layer and then releases the pipeline resources.
6. The Camera Service sends the **CaptureSetting** parameter to the CameraDeviceDriverModel through the **UpdateSettings\(\)** interface of the **CameraDevice** instance. The CameraDeviceDriverModel forwards the parameter to each node through the StreamPipelineDispatcher module. The **CaptureSetting** parameter carried in the **StartStreamingCapture\(\)** and **Capture\(\)** interfaces is forwarded to the node to which the stream belongs through the StreamPipelineDispatcher module.
7. The Camera Service controls underlying metadata reporting through the **EnableResult\(\)** and **DisableResult\(\)** interfaces. If the bottom-layer metadata needs to be reported, the pipeline creates a buffer queue in the CameraDeviceDriverModel to collect and transfer metadata, queries the configuration table based on the StreamPipelineStrategy module, and creates and connects to the specified node through the StreamPipelineBuilder module. The MetaQueueManager module delivers the buffer to the bottom layer, and the bottom-layer node fills in data. The MetaQueueManager module then invokes the upper-layer callback to transfer the data to the upper layer.
8. The Camera Service calls the **Close\(\)** interface of the **CameraDevice** class, and the **CameraDevice** instance calls the corresponding DeviceManager module to power off each hardware. If an offline stream exists in the subpipeline of the IPP node, the offline stream must be reserved until the execution is complete.
9. To implement dynamic frame control, a CollectBuffer thread is started in the StreamOperator. The CollectBuffer thread obtains a buffer from the buffer queue of each stream. If the frame rate of a stream needs to be controlled to 1/n of the sensor output frame rate, the CollectBuffer thread can decide, frame by frame, whether to collect a buffer for that stream. For example, if the sensor outputs 120 fps and the preview stream runs at 30 fps, the CollectBuffer thread collects a preview-stream buffer once every four sensor frames, as sketched below.
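The frame-control decision described in step 9 can be illustrated with a small stand-alone sketch. The function and variable names below are written only for this example and do not exist in the driver source:

```
#include <cstdint>
#include <iostream>

// Illustrative helper: decides whether a CollectBuffer-style loop should collect
// a buffer for a stream on a given sensor frame. All names here are made up.
static bool ShouldCollectBuffer(uint32_t sensorFrameIndex, uint32_t sensorFps, uint32_t streamFps)
{
    if (streamFps == 0 || streamFps >= sensorFps) {
        return true;                               // No decimation needed.
    }
    uint32_t interval = sensorFps / streamFps;     // Collect one frame out of every "interval" frames.
    return (sensorFrameIndex % interval) == 0;
}

int main()
{
    const uint32_t sensorFps = 120;                // Sensor output frame rate.
    const uint32_t previewFps = 30;                // Target preview stream frame rate.
    for (uint32_t frame = 0; frame < 12; ++frame) {
        if (ShouldCollectBuffer(frame, sensorFps, previewFps)) {
            std::cout << "frame " << frame << ": collect preview buffer" << std::endl;
        }
    }
    return 0;                                      // Collects frames 0, 4, and 8: one out of every four.
}
```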
## Development Guidelines<a name="section161941989596"></a>

For details about the HDI functionalities and how parameters are passed, see "Available APIs" in [Camera](https://gitee.com/openharmony/drivers_peripheral/blob/master/camera/README_zh.md).
### How to Develop
The following describes the main APIs of the camera driver model, including the APIs for registering and detecting cameras, creating, capturing, and destroying streams, and enabling and disabling devices. \(To clearly describe the implementation of main functionalities, some error judgment and log source code are not described here.\)
1. Register a **CameraHost**.
Define the **HdfDriverEntry** structure to specify the methods for initializing a **CameraHost**.
```
struct HdfDriverEntry g_cameraHostDriverEntry = {
    ...
};
```
**HdfCameraHostDriverBind** defined in the **HdfDriverEntry** structure provides the registration of **CameraServiceDispatch\(\)** and **CameraHostStubInstance\(\)**. **CameraServiceDispatch\(\)** is used to remotely call a method of the **CameraHost**, such as **OpenCamera\(\)** and **SetFlashlight\(\)**. **CameraHostStubInstance\(\)** is used to initialize the camera device, which is called during system startup.
```
int HdfCameraHostDriverBind(HdfDeviceObject *deviceObject)
{
    ...
}
```
**CameraHostStubInstance\(\)** finally calls **CameraHostImpl::Init\(\)** to obtain the physical camera and initialize the DeviceManager and PipelineCore modules.
```
sptr<CameraHostProxy> hostSptr = iface_cast<CameraHostProxy>(remote); // Return the CameraHostProxy object that contains methods such as OpenCamera() to the caller.
```
The **CameraHostProxy** class provides five interfaces: **SetCallback\(\)**, **GetCameraIds\(\)**, **GetCameraAbility\(\)**, **OpenCamera\(\)**, and **SetFlashlight\(\)**. The following describes **OpenCamera\(\)**.
The **OpenCamera\(\)** interface calls the remote **CameraHostStubOpenCamera\(\)** interface through the **CMD\_CAMERA\_HOST\_OPEN\_CAMERA** command ID to obtain an **ICameraDevice** object.
**Remote\(\)-\>SendRequest** calls **CameraHostServiceStubOnRemoteRequest\(\)**, enters the **CameraHostStubOpenCamera\(\)** interface based on **cmdId**, and finally calls **CameraHostImpl::OpenCamera\(\)** to obtain a **CameraDevice** and power on the camera hardware.
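To make this call path more concrete, the following is a rough sketch of what the proxy-side **OpenCamera\(\)** might look like. **MessageParcel**, **MessageOption**, and **SendRequest\(\)** are standard OpenHarmony IPC facilities; the marshalling details, error mapping, and exact signature are simplified assumptions rather than the actual source:

```
// Sketch only: parcel contents, error handling, and the signature are simplified assumptions.
CamRetCode CameraHostProxy::OpenCamera(const std::string &cameraId,
    const sptr<ICameraDeviceCallback> &callback, sptr<ICameraDevice> &device)
{
    MessageParcel data;
    MessageParcel reply;
    MessageOption option;

    data.WriteString(cameraId);                      // Marshal the camera ID.
    data.WriteRemoteObject(callback->AsObject());    // Marshal the upper-layer callback.

    // Send the command to the remote stub. Based on the command ID, the stub side enters
    // CameraHostStubOpenCamera() and finally calls CameraHostImpl::OpenCamera().
    int32_t ret = Remote()->SendRequest(CMD_CAMERA_HOST_OPEN_CAMERA, data, reply, option);
    if (ret != 0) {
        return DEVICE_ERROR;                         // Assumed error mapping for an IPC failure.
    }

    CamRetCode retCode = static_cast<CamRetCode>(reply.ReadInt32());  // Read the call result.
    if (retCode == NO_ERROR) {
        device = iface_cast<ICameraDevice>(reply.ReadRemoteObject()); // Read the ICameraDevice object.
    }
    return retCode;
}
```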
**CameraDeviceImpl** defines interfaces such as **GetStreamOperator\(\)**, **UpdateSettings\(\)**, **SetResultMode\(\)**, and **GetEnabledResult\(\)**. The following is an example of implementing the **GetStreamOperator\(\)** interface:
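The original example is not reproduced in full here; the following minimal sketch shows one possible shape of the implementation. The member names \(**spStreamOperator\_**, **spCameraDeviceCallback\_**\), the error codes, and the exact signature are illustrative assumptions:

```
// Sketch only: member names, error codes, and the signature are assumptions, not the exact source.
CamRetCode CameraDeviceImpl::GetStreamOperator(const sptr<IStreamOperatorCallback> &callback,
    sptr<IStreamOperator> &streamOperator)
{
    if (callback == nullptr) {
        return INVALID_ARGUMENT;                     // The upper-layer callback is mandatory.
    }
    spCameraDeviceCallback_ = callback;              // Cache the callback for result reporting.
    if (spStreamOperator_ == nullptr) {
        // Create the stream operator on first use and bind it to this device.
        spStreamOperator_ = new (std::nothrow) StreamOperatorImpl(spCameraDeviceCallback_, shared_from_this());
        if (spStreamOperator_ == nullptr) {
            return INSUFFICIENT_RESOURCES;
        }
        spStreamOperator_->Init();                   // Initialize buffer pools and other internal resources.
    }
    streamOperator = spStreamOperator_;              // Return the stream operator to the caller.
    return NO_ERROR;
}
```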
The **CreateStreams\(\)** interface in the **StreamOperatorImpl** class creates a **StreamBase** instance, whose **init\(\)** method then initializes operations such as **CreateBufferPool\(\)**.
Use the **CommitStreams\(\)** method to configure the stream, including PipelineCore initialization and creation. It must be called after the stream is created.
The capture parameters carried in a capture request include, for example, the following fields:

```
std::vector<int> streamIds_;                                     // IDs of the streams to be captured
std::shared_ptr<CameraStandard::CameraMetadata> captureSetting_; // The camera ability can be obtained through the GetCameraAbility() interface of CameraHost.
```
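A minimal usage sketch follows, assuming these two fields belong to the **CaptureInfo** structure passed to **Capture\(\)**; the capture ID, the streaming flag, and the exact **Capture\(\)** signature are assumptions for illustration:

```
// Sketch only: the Capture() signature, the IDs, and the streaming flag are assumptions.
const int previewStreamId = 1001;                  // ID of the preview stream created earlier.
const int captureId = 2001;                        // Identifier of this capture request.

CaptureInfo captureInfo = {};
captureInfo.streamIds_ = {previewStreamId};        // Capture the preview stream only.
captureInfo.captureSetting_ = ability;             // Ability obtained through GetCameraAbility() of CameraHost.

streamOperator->Capture(captureId, captureInfo, true); // true: continuous (streaming) capture.
```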
Use the **ReleaseStreams\(\)** interface in the **StreamOperatorImpl** class to release the streams created by **CreateStreams\(\)** and **CommitStreams\(\)** and to destroy the pipeline.
Use the **Close\(\)** interface in the **CameraDeviceImpl** class to close the camera device. This interface calls **PowerDown\(\)** in the **DeviceManager** to power off the device.
There is a camera demo in the **/drivers/peripheral/camera/hal/init** directory. After system startup, the executable file **ohos\_camera\_demo** is generated in the **/system/bin** directory. This demo can implement basic camera capabilities such as preview and photographing. The following uses the demo as an example to describe how to use the HDI to implement the **PreviewOn\(\)** and **CaptureON\(\)** interfaces.
1. Construct a Hos3516Demo object in the **main** function. This object contains methods for initializing the camera and starting, stopping, and releasing streams. The **mainDemo-\>InitSensors\(\)** function is used to initialize the **CameraHost**, and the **mainDemo-\>InitCameraDevice\(\)** function is used to initialize the **CameraDevice**.
```
int main(int argc, char** argv)
{
    RetCode rc = RC_OK;
    auto mainDemo = std::make_shared<Hos3516Demo>();
    rc = mainDemo->InitSensors();      // Initialize the CameraHost.
    rc = mainDemo->InitCameraDevice(); // Initialize the CameraDevice.
    ...
}
```
The function used to initialize the **CameraHost** is implemented as follows, where the HDI **ICameraHost::Get\(\)** is called to obtain the **demoCameraHost** and set the callback:
```
RetCode Hos3516Demo::InitSensors()
{
    ...
}
```
The implementation of the function for initializing the **CameraDevice** is as follows, where **GetCameraIds\(cameraIds\_\)**, **GetCameraAbility\(cameraId, ability\_\)**, and **OpenCamera\(cameraIds\_.front\(\), callback, demoCameraDevice\_\)** of the **demoCameraHost** are called to obtain the **demoCameraDevice\_** object.
```
RetCode Hos3516Demo::InitCameraDevice()
{
    ...
}
```
2. Implement the **PreviewOn\(\)** interface to configure streams, enable preview streams, and start stream capture. After this interface is called, the camera preview channel starts running. Two streams are enabled: preview stream and capture or video stream. Only the preview stream will be captured.
The **StartCaptureStream\(\)**, **StartVideoStream\(\)**, and **StartPreviewStream\(\)** interfaces all call the **CreateStreams\(\)** interface, but with different input parameters.
```
RetCode Hos3516Demo::StartVideoStream()
{
    ...
}
```
The **CreateStreams\(\)** interface calls the HDI to configure and create a stream. Specifically, it first calls the HDI to obtain a **StreamOperator** object, then constructs a **StreamInfo** object, and finally calls the HDI **CreateStreams\(\)** and **CommitStreams\(\)** interfaces to create and configure the stream.
```
RetCode Hos3516Demo::CreateStreams(const int streamIdSecond, StreamIntent intent)
{
    ...
}
```
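The core call sequence inside such a function might look like the sketch below. The **StreamInfo** fields, container types, and enum values shown are assumptions for illustration and cover only a subset of the real structure:

```
// Sketch only: field names, types, and enum values are illustrative assumptions.
StreamInfo streamInfo = {};
streamInfo.streamId_ = streamIdSecond;             // Stream ID passed in by the caller.
streamInfo.width_ = 1920;                          // Example resolution.
streamInfo.height_ = 1080;
streamInfo.intent_ = intent;                       // PREVIEW, STILL_CAPTURE, VIDEO, and so on.

std::vector<StreamInfo> streamInfos;
streamInfos.push_back(streamInfo);

streamOperator_->CreateStreams(streamInfos);       // Create the stream.
streamOperator_->CommitStreams(NORMAL, ability_);  // Configure the stream and build the pipeline.
```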
The **CaptureON\(\)** interface calls the **Capture\(\)** method of **StreamOperator** to obtain camera data, rotate the buffer, and start a thread to receive data of the corresponding type.
```
RetCode Hos3516Demo::CaptureON(const int streamId, const int captureId, CaptureMode mode)
{
    ...
    streamCustomerCapture_->ReceiveFrameOn([this](void* addr, const uint32_t size) { // Create a capture thread to receive the passed buffers through the StoreImage callback.
        StoreImage(addr, size);
    });
    ...
    streamCustomerVideo_->ReceiveFrameOn([this](void* addr, const uint32_t size) { // Create a video thread to receive the passed buffers through the StoreVideo callback.
        StoreVideo(addr, size);
    });
    ...
}
```
3. Implement the **ManuList\(\)** function to obtain characters from the console through the **fgets\(\)** interface. Different characters correspond to different capabilities provided by the demo, and the functionality menu is printed.
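A stand-alone sketch of such a console menu loop is shown below; the option letters, printed text, and dispatch targets are made up for illustration and are not the demo's actual menu:

```
#include <cstdio>

// Illustrative console loop in the spirit of ManuList(): read one command per line
// with fgets() and dispatch it. The options below are invented for this example.
int main()
{
    char input[32] = {0};
    printf("Options:\n  p: start preview\n  c: capture one frame\n  q: quit\n");
    while (fgets(input, sizeof(input), stdin) != nullptr) {
        switch (input[0]) {
            case 'p':
                printf("preview on\n");   // A real demo would call its PreviewOn()-style method here.
                break;
            case 'c':
                printf("capture\n");      // A real demo would trigger its CaptureON()-style method here.
                break;
            case 'q':
                return 0;
            default:
                printf("unknown option\n");
                break;
        }
    }
    return 0;
}
```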
Other capabilities in the demo are implemented by calling different HDIs, which are similar to **PreviewOn\(\)**. For details, see [ohos\_camera\_demo](https://gitee.com/openharmony/drivers_peripheral/tree/master/camera/hal/init).