The OpenHarmony camera driver model provides the camera hardware device interface (HDI) and the camera pipeline model to manage camera devices.
The camera driver model is divided into three layers:
+ HDI implementation layer: implements standard OpenHarmony (OHOS) camera APIs.
+ Framework layer: interacts with the HDI implementation layer to set up data channels and operate camera devices.
+ Device adaptation layer: supports different platforms by shielding the differences in underlying chips and operating systems.
### Working Principles<a name="3"></a>
The camera module is used to initialize services and devices, set up data channels, and configure, create, deliver, and capture streams. The following figure shows the camera driver model.
**Figure 1** HDF-based camera driver model

1. When the system starts, the camera_host process is created. The process enumerates underlying devices, creates a **DeviceManager** instance (to manage the device tree), an object for each underlying device, and a **CameraHost** instance, and registers the **CameraHost** instance with the user-mode HDF (UHDF) service. Through the UHDF service, the camera service can obtain the underlying **CameraDeviceHost** services to operate the hardware devices. The **DeviceManager** instance can also be created by using the configuration table.
2. The Camera Service obtains the **CameraHost** instance through the CameraDeviceHost service.
The **CameraHost** instance can be used to obtain the underlying camera capabilities, turn on the flashlight, call **Open()** to start a camera and set up a connection with the camera, create a **DeviceManager** instance (to power on the hardware modules), and create a **CameraDevice** instance (to provide the device control interface for the upper layer).
When the **CameraDevice** instance is created, the PipelineCore modules will be instantiated. The StreamPipelineCore module creates pipelines, and the MetaQueueManager module reports metadata.
3. The Camera Service configures streams and creates a **Stream** instance through the CameraDevice module. The StreamPipelineStrategy module determines the node connection mode of the corresponding stream based on the mode issued by the upper layer and the configuration table. The StreamPipelineBuilder module creates nodes and returns the pipeline to the StreamPipelineDispatcher module through the connection. The StreamPipelineDispatcher module dispatches pipelines.
4. The Camera Service controls the stream operations through the **Stream** instance.
**AttachBufferQueue()** delivers the buffer queue requested from the display module to the bottom layer. The CameraDeviceDriverModel manages the buffer. After **Capture()** is called to deliver commands, the bottom layer transfers the buffer to the upper layer. The Image Signal Processor (ISP) node obtains a specified number of buffers from the buffer queue and delivers the buffers to the bottom-layer ISP hardware. After filling the buffers, the ISP hardware transfers the buffers to the CameraDeviceDriverModel. The CameraDeviceDriverModel fills the created pipeline with the received buffers by using a loop thread. Each node processes the pipeline data and transfers the data to the upper layer in a callback. At the same time, the buffers are freed to the buffer queue for reuse.
5. The Camera Service delivers the photographing command through **Capture()**. **ChangeToOfflineStream()** is used to query the position of the photographing buffer. If the ISP hardware has output an image and sent the image data to the IPP node, the common photographing stream can be converted into an offline stream. Otherwise, the close process is executed. **ChangeToOfflineStream()** passes **StreamInfo** to enable the offline stream to obtain the stream information of the common stream, determines the node connection mode of the offline stream based on the configuration table, and creates the node connection for the offline stream (if the node connection has been created, the node required by the non-offline stream will be closed by **CloseCamera**.) When the buffer is transferred from the pipeline to the upper layer, the pipeline resources are released.
6. The Camera Service sends the **CaptureSetting** parameter to the CameraDeviceDriverModel through **UpdateSettings()** of the **CameraDevice** instance. The CameraDeviceDriverModel forwards the parameter to each node through the StreamPipelineDispatcher module. The **CaptureSetting** parameter carried in **StartStreamingCapture()** and **Capture()** is forwarded to the node to which the stream belongs through the StreamPipelineDispatcher module.
7. The Camera Service uses **EnableResult()** and **DisableResult()** to control the reporting of underlying metadata. If the underlying metadata needs to be reported, the pipeline creates a buffer queue in the CameraDeviceDriverModel to collect and transfer metadata, queries the configuration table based on the StreamPipelineStrategy module, and creates and connects to the specified node through the StreamPipelineBuilder module. The MetaQueueManager module delivers the buffer to the bottom layer, and the bottom-layer node fills in data. The MetaQueueManager module then invokes the upper-layer callback to transfer the data to the upper layer.
8. The Camera Service calls **Close()** of the **CameraDevice** class, and the **CameraDevice** instance calls the corresponding DeviceManager module to power off each hardware. If an offline stream exists in the subpipeline of the IPP node, the offline stream must be reserved until the execution is complete.
9. To implement dynamic frame control, a CollectBuffer thread is started in the StreamOperator. The CollectBuffer thread obtains a buffer from the buffer queue of each stream. If the frame rate of a stream needs to be controlled (to 1/n of the sensor output frame rate), the CollectBuffer thread can control the buffer packaging of each frame as required and determine whether to collect the buffer of the stream. For example, if the output frame rate of the sensor is 120 fps and the preview stream frame rate is 30 fps, the CollectBuffer thread collects the buffer of the preview stream once every four frames.
## Development Guidelines<a name="4"></a>
### When to Use<a name="5"></a>
The camera module encapsulates camera operations in camera preview, photographing, and video streams to implement camera hardware operations and improve development efficiency.
### Available APIs<a name="6"></a>
The following table describes the C++ APIs generated from the Interface Definition Language (IDL) interface description. For details about the interface declaration, see the .idl file in **/drivers/interface/camera/v1_0/**.
The parameters passed in the HDI cannot exceed the capability range obtained by **GetCameraAbility**. Even if the parameters beyond the capability range can be passed in APIs such as **UpdateSettings**, **CommitStreams**, and **Capture** with no error returned, unexpected behavior may be caused.
| API | Description |
| ---- | ---- |
| int32_t OnError(ErrorType type, int32_t errorCode) | Called when an error occurs on the camera device. The caller needs to implement this API. |
| int32_t OnResult(uint64_t timestamp, const std::vector&lt;uint8_t&gt;&amp; result) | Called to report metadata related to the camera device. |
| int32_t OnCaptureStarted(int32_t captureId, const std::vector&lt;int32_t&gt;&amp; streamIds) | Called when a capture starts. |
| int32_t OnCaptureEnded(int32_t captureId, const std::vector&lt;CaptureEndedInfo&gt;&amp; infos) | Called when a capture ends. |
| int32_t OnCaptureError(int32_t captureId, const std::vector&lt;CaptureErrorInfo&gt;&amp; infos) | Called when an error occurs during the capture. |
| int32_t OnFrameShutter(int32_t captureId, const std::vector&lt;int32_t&gt;&amp; streamIds, uint64_t timestamp) | Called when a frame is captured. |
### How to Develop<a name="7"></a>
The camera driver development procedure is as follows:
1. Register a **CameraHost** instance.
Define the **HdfDriverEntry** structure to define the method for initializing **CameraHost**. For details about the code, see **drivers/peripheral/camera/interfaces/hdi_ipc/camera_host_driver.cpp**.
```c++
struct HdfDriverEntry g_cameraHostDriverEntry = {
    .moduleVersion = 1,
    .moduleName = "camera_service",
    ...
};
HDF_INIT(g_cameraHostDriverEntry); // Register the HdfDriverEntry structure with the HDF.
```
2. Initialize the **CameraHost** service.
The **HdfCameraHostDriverBind()** method defined in the **HdfDriverEntry** structure registers **CameraServiceDispatch()** and **CameraHostStubInstance()**. **CameraServiceDispatch()** is used to remotely call the **CameraHost** methods, such as **OpenCamera()** and **SetFlashlight()**. **CameraHostStubInstance()** is called during the system startup to initialize the camera.
```c++
static int HdfCameraHostDriverBind(struct HdfDeviceObject *deviceObject)
{
    HDF_LOGI("HdfCameraHostDriverBind enter");
    auto *hdfCameraHostHost = new (std::nothrow) HdfCameraHostHost;
    if (hdfCameraHostHost == nullptr) {
        HDF_LOGE("%{public}s: failed to create HdfCameraHostHost object", __func__);
        return HDF_FAILURE;
    }
    hdfCameraHostHost->ioService.Dispatch = CameraHostDriverDispatch; // Provide a method to remotely call a CameraHost method.
    hdfCameraHostHost->ioService.Open = NULL;
    hdfCameraHostHost->ioService.Release = NULL;
    auto serviceImpl = ICameraHost::Get(true);
    if (serviceImpl == nullptr) {
        HDF_LOGE("%{public}s: failed to get of implement service", __func__);
        ...
    }
    ...
}
```
**CameraHostStubInstance()** finally calls **CameraHostImpl::Init()** to obtain the physical camera and initialize the DeviceManager and PipelineCore modules.
3. Obtain the **CameraHost** service.
Use **Get()** to obtain the **CameraHost** from the **CameraService**. The **Get()** method is as follows:
```c++
...
}
```
4. Open a camera device.
The **CameraHostProxy** class provides **SetCallback()**, **GetCameraIds()**, **GetCameraAbility()**, **OpenCamera()**, and **SetFlashlight()**.
Use **OpenCamera()** to call the remote **CameraHostStubOpenCamera()** through the **CMD_CAMERA_HOST_OPEN_CAMERA** to obtain an **ICameraDevice** object.
**Remote()->SendRequest** calls **CameraHostServiceStubOnRemoteRequest()**, locates **CameraHostStubOpenCamera()** based on **cmdId**, and finally calls **CameraHostImpl::OpenCamera()** to obtain a **CameraDevice** and power on the camera hardware.
```c++
...
        CameraPowerDown(phyCameraIds);
        return DEVICE_ERROR;
    }
    auto sptrDevice = deviceBackup_.find(cameraId);
    if (sptrDevice == deviceBackup_.end()) {
#ifdef CAMERA_BUILT_ON_OHOS_LITE
        deviceBackup_[cameraId] = cameraDevice;
#else
        deviceBackup_[cameraId] = cameraDevice.get();
#endif
    }
    device = deviceBackup_[cameraId];
    cameraDevice->SetStatus(true);
    CAMERA_LOGD("open camera success.");
    DFX_LOCAL_HITRACE_END;
    return HDI::Camera::V1_0::NO_ERROR;
}
```
5. Obtain streams.
**CameraDeviceImpl** defines **GetStreamOperator()**, **UpdateSettings()**, **SetResultMode()**, and **GetEnabledResult()**. Use **GetStreamOperator()** to obtain streams.
Fill in the **StreamInfo** structure before creating streams by calling **CreateStreams()**.
```c++
using StreamInfo = struct _StreamInfo {
    int streamId_;
    int width_;       // Stream width
    int height_;      // Stream height
    int format_;      // Stream format, for example, PIXEL_FMT_YCRCB_420_SP
    int dataSpace_;
    StreamIntent intent_;    // StreamIntent, for example, PREVIEW
    bool tunneledMode_;
    BufferProducerSequenceable bufferQueue_;  // Use streamCustomer->CreateProducer() to create a buffer queue for streams.
    int minFrameDuration_;
    EncodeType encodeType_;
};
```
6. Create streams.
**CreateStreams()** is a method in the **StreamOperator** class (**StreamOperatorImpl** is the base class of **StreamOperator**). Use **CreateStreams()** to create a **StreamBase** object, which initializes operations such as **CreateBufferPool** through its **Init()** method.
```c++
...
    std::shared_ptr<StreamBase> stream = StreamFactory::Instance().CreateShared(itr->second); // Create a StreamBase instance.
    RetCode rc = stream->Init(streamInfo);
    ...
    DFX_LOCAL_HITRACE_END;
    return HDI::Camera::V1_0::NO_ERROR;
}
```
7. Configure streams.
Use **CommitStreams()** to configure streams, including initializing and creating **PipelineCore**. **CommitStreams()** must be called after streams are created.
```c++
...
    RetCode rc = itr->second->Cancel(); // Call Cancel() in CameraCapture to cancel the stream capture.
    if (rc != RC_OK) {
        return DEVICE_ERROR;
    }
    requestMap_.erase(itr); // Erase the CameraCapture object.
    DFX_LOCAL_HITRACE_END;
    return HDI::Camera::V1_0::NO_ERROR;
}
```
Use **ReleaseStreams()** in the **StreamOperatorImpl** class to release the streams created by using **CreateStream()** and **CommitStreams()** and destroy the pipeline.
Use **Close()** in the **CameraDeviceImpl** class to close the camera device. **PowerDown()** in **DeviceManager** is called to power off the device.
### Development Example<a name="8"></a>
There is a camera demo in the **/drivers/peripheral/camera/hal/init** directory. After the system is started, the executable file **ohos_camera_demo** is generated in the **/vendor/bin** directory. This demo implements basic camera capabilities such as preview and photographing. The following uses the demo to describe how to use the HDI to implement **PreviewOn()** and **CaptureON()**. For details, see [ohos_camera_demo](https://gitee.com/openharmony/drivers_peripheral/tree/master/camera/hal/init).
1. Construct a **CameraDemo** object in the **main** function. This object contains methods for initializing the camera and starting, stopping, and releasing streams. The **mainDemo->InitSensors()** function is used to initialize the **CameraHost**, and the **mainDemo->InitCameraDevice()** function is used to initialize the **CameraDevice**.
```c++
int main(int argc, char** argv)
{
    RetCode rc = RC_OK;
    auto mainDemo = std::make_shared<CameraDemo>();
    rc = mainDemo->InitSensors();   // Initialize the CameraHost.
    ...
    ManuList(mainDemo, argc, argv); // Print the menu to the console.
    return RC_OK;
}
```
The function for initializing the **CameraHost** calls the HDI **ICameraHost::Get()** to obtain the **demoCameraHost** and set the callback.
The function for initializing the **CameraDevice** is implemented as follows. The **GetCameraIds(cameraIds_)**, **GetCameraAbility(cameraId, ability_)**, and **OpenCamera(cameraIds\_.front(), callback, demoCameraDevice_)** methods of the **demoCameraHost** are called to obtain the **demoCameraDevice_**.
2. Implement **PreviewOn()** to configure streams, enable preview streams, and start stream capture. After **PreviewOn()** is called, the camera preview channel starts running. Two streams are enabled: preview stream and capture or video stream. Only the preview stream will be captured.
```c++
...
    rc = mainDemo->StartPreviewStream(); // Configure the preview stream.
    if (mode == 0) {
        rc = mainDemo->StartCaptureStream(); // Configure the capture stream.
    } else {
        rc = mainDemo->StartVideoStream(); // Configure the video stream.
    }
    rc = mainDemo->CaptureON(STREAM_ID_PREVIEW, CAPTURE_ID_PREVIEW, CAPTURE_PREVIEW); // Capture the preview stream.
    return RC_OK;
}
```
The **StartCaptureStream()**, **StartVideoStream()**, and **StartPreviewStream()** methods call **CreateStream()** with different input parameters.
Use **CreateStream()** to call an HDI API to configure and create streams. Specifically, **CreateStream()** calls the HDI to obtain a **StreamOperator** object and then creates a **StreamInfo** object. Call **CreateStreams()** and **CommitStreams()** to create and configure streams.
```c++
RetCode OhosCameraDemo::CreateStream(const int streamId, std::shared_ptr<StreamCustomer> &streamCustomer,
    ...
}
```
Use **CaptureON()** to call the **Capture()** method of **StreamOperator** to obtain camera data, flip the buffer, and start a thread to receive data of the corresponding type.
```c++
RetCode OhosCameraDemo::CaptureON(const int streamId,
    const int captureId, CaptureMode mode)
{
    CAMERA_LOGI("demo test: CaptureON enter streamId == %{public}d and captureId == %{public}d and mode == %{public}d",
        streamId, captureId, mode);
    std::lock_guard<std::mutex> l(metaDatalock_);
    if (mode == CAPTURE_SNAPSHOT) {
        constexpr double latitude = 27.987500; // Dummy data: Qomolangma latitude.
        ...
    }
    std::shared_ptr<Camera::CaptureInfo> captureInfo = std::make_shared<Camera::CaptureInfo>(); // Create and fill in CaptureInfo.
    captureInfo->streamIds_ = {streamId};
    captureInfo->captureSetting_ = ability_;
    captureInfo->enableShutterCallback_ = false;
    int rc = streamOperator_->Capture(captureId, captureInfo, true); // The stream capture starts, and buffer recycling starts.
    if (mode == CAPTURE_PREVIEW) {
        streamCustomerPreview_->ReceiveFrameOn(nullptr); // Create a preview thread to receive the passed buffer.
    } else if (mode == CAPTURE_SNAPSHOT) {
        streamCustomerCapture_->ReceiveFrameOn([this](void* addr, const uint32_t size) { // Create a capture thread to receive the passed buffer through the StoreImage callback.
            StoreImage(addr, size);
        });
    } else if (mode == CAPTURE_VIDEO) {
        OpenVideoFile();
        streamCustomerVideo_->ReceiveFrameOn([this](void* addr, const uint32_t size) { // Create a video thread to receive the passed buffer through the StoreVideo callback.
            StoreVideo(addr, size);
        });
    }
    CAMERA_LOGD("demo test: CaptureON exit");
    return RC_OK;
}
```
3. Implement **ManuList()** to obtain characters from the console through **fgets()**. Different characters correspond to different capabilities provided by the demo, and the functionality menu is printed.
```c++
...
}
```
Use **PutMenuAndGetChr()** to print the menu of the demo and call **fgets()** to wait for commands from the console.
```c++
static int PutMenuAndGetChr(void)
{
    constexpr uint32_t inputCount = 50;
    ...
}
```
The console outputs the menu details as follows:
```c++
"Options:\n"
"-h | --help Print this message\n"
"-o | --offline stream offline test\n"
...
"-w | --set WB Set white balance Cloudy\n"
"-v | --video capture Video of 10s\n"
"-a | --Set AE Set Auto exposure\n"
"-e | --Set Metadeta Set Metadata\n"
"-f | --Set Flashlight Set flashlight ON 5s OFF\n"
"-q | --quit stop preview and quit this app\n");
```
4. Compile and build the **ohos_camera_demo**.
Add **init:ohos_camera_demo** to **deps** in the **drivers/peripheral/camera/hal/BUILD.gn** file.
The sample code is as follows:
```
deps = [
"buffer_manager:camera_buffer_manager",
"device_manager:camera_device_manager",
"hdi_impl:camera_host_service_1.0",
"pipeline_core:camera_pipeline_core",
"utils:camera_utils",
"init:ohos_camera_demo",
]
```
The following uses the RK3568 development board as an example.
1. Run the **./build.sh --product-name rk3568 --ccache** command to generate the executable binary file **ohos_camera_demo** in **out/rk3568/packages/phone/vendor/bin/**.
2. Import the executable file **ohos_camera_demo** to the development board, grant the execute permission, and run the file.