Commit 7ae6a711 authored by gineshidalgo99

Undistort works for all producer types

Parent 72d1992a
......@@ -77,7 +77,7 @@ For further details, check [all released features](doc/released_features.md) and
</p>
### Runtime Analysis
Inference time comparison between the 3 available pose estimation libraries: OpenPose, Mask R-CNN, and Alpha-Pose (fast Pytorch version):
Inference time comparison between the 3 available pose estimation libraries: OpenPose, Alpha-Pose (fast Pytorch version), and Mask R-CNN:
<p align="center">
<img src="doc/media/openpose_vs_competition.png" width="360">
</p>
......
......@@ -150,8 +150,8 @@ Each flag is divided into flag name, default value, and description.
- DEFINE_int32(frame_rotate, 0, "Rotate each frame, 4 possible values: 0, 90, 180, 270.");
- DEFINE_bool(frames_repeat, false, "Repeat frames when finished.");
- DEFINE_bool(process_real_time, false, "Enable to keep the original source frame rate (e.g., for video). If the processing time is too long, it will skip frames. If it is too fast, it will slow it down.");
- DEFINE_string(camera_parameter_folder, "models/cameraParameters/flir/", "String with the folder where the camera parameters are located.");
- DEFINE_bool(frame_keep_distortion, false, "If false (default), it will undistortionate the image based on the `camera_parameter_folder` camera parameters; if true, it will not undistortionate, i.e., it will leave it as it is.");
- DEFINE_string(camera_parameter_path, "models/cameraParameters/flir", "String with the folder where the camera parameters are located. If there is only 1 XML file (for single video, webcam, or images from the same camera), you must specify the whole XML file path (ending in .xml).");
- DEFINE_bool(frame_undistort, false, "If false (default), it will not undistort the image; if true, it will undistort it based on the camera parameters found in `camera_parameter_path`.");
3. OpenPose
- DEFINE_string(model_folder, "models/", "Folder path (absolute or relative) where the models (pose, face, ...) are located.");
......@@ -191,7 +191,7 @@ Each flag is divided into flag name, default value, and description.
8. OpenPose 3-D Reconstruction
- DEFINE_bool(3d, false, "Running OpenPose 3-D reconstruction demo: 1) Reading from a stereo camera system. 2) Performing 3-D reconstruction from the multiple views. 3) Displaying 3-D reconstruction results. Note that it will only display 1 person. If multiple people are present, it will fail.");
- DEFINE_int32(3d_min_views, -1, "Minimum number of views required to reconstruct each keypoint. By default (-1), it will require all the cameras to see the keypoint in order to reconstruct it.");
- DEFINE_int32(3d_views, 1, "Complementary option to `--image_dir` or `--video`. OpenPose will read as many images per iteration, allowing tasks such as stereo camera processing (`--3d`). Note that `--camera_parameters_folder` must be set. OpenPose must find as many `xml` files in the parameter folder as this number indicates.");
- DEFINE_int32(3d_views, -1, "Complementary option for `--image_dir` or `--video`. OpenPose will read as many images per iteration, allowing tasks such as stereo camera processing (`--3d`). Note that `--camera_parameter_path` must be set. OpenPose must find as many `xml` files in the parameter folder as this number indicates.");
9. Extra algorithms
- DEFINE_bool(identification, false, "Experimental, not available yet. Whether to enable people identification across frames.");
......
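The `--camera_parameter_path` flag described above accepts either a folder (one XML per camera serial number) or a single `.xml` file for a lone video/webcam/image source. A minimal sketch of how such path resolution could work (hypothetical helper for illustration, not OpenPose's actual implementation):

```python
import os


def resolve_camera_parameter_files(camera_parameter_path, serial_numbers=None):
    """Hypothetical sketch: return the XML parameter file(s) to load.

    If the path ends in `.xml`, it is treated as a single-camera file;
    otherwise it is treated as a folder containing one XML file per
    camera serial number.
    """
    if camera_parameter_path.endswith(".xml"):
        # Single video, webcam, or images from the same camera:
        # the user specified the whole XML file path.
        return [camera_parameter_path]
    # Folder mode: one XML file per camera serial number.
    serial_numbers = serial_numbers or []
    return [os.path.join(camera_parameter_path, serial + ".xml")
            for serial in serial_numbers]
```

For example, a Flir stereo rig would resolve `models/cameraParameters/flir` plus the serial numbers into one XML path per camera, while a webcam setup would pass the full `.xml` path directly.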
......@@ -289,6 +289,13 @@ Note: Check the differences between these models in [doc/faq.md#difference-betwe
#### Python API
To install the Python API, ensure that the `BUILD_PYTHON` flag is turned on while running CMake GUI and follow the standard installation steps. After the installation, check [doc/modules/python_module.md](./modules/python_module.md) for further details.
Note: If you are on Windows and you fail to install the required third-party Python libraries, it might print an error similar to: `Exception: Error: OpenPose library could not be found. Did you enable BUILD_PYTHON in CMake and have this Python script in the right folder?`. From GitHub issue #941:
```
I had a similar issue with Visual Studio (VS). I am pretty sure that the issue is that while you are compiling OpenPose in VS, it tries to import cv2 (python-opencv) and it fails. So make sure that if you open cmd.exe and run python, you can actually import cv2 without errors. I could not, but I had cv2 installed in an IPython environment (Anaconda), so I activated that environment, and then ran (change this to adapt it to your VS version and location of OpenPose.sln):
C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\MSBuild.exe C:\path\to\OpenPose.sln
```
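The quoted workaround boils down to verifying that `cv2` imports cleanly in the same Python environment the build uses. A small stdlib-only sketch to check this up front (illustrative; any module name can be passed):

```python
import importlib.util


def module_available(name):
    """Return True if `name` can be found by the current interpreter,
    without actually importing it (so a broken install cannot crash us)."""
    return importlib.util.find_spec(name) is not None


# Example: check the third-party module the OpenPose Python build needs.
if not module_available("cv2"):
    print("cv2 (python-opencv) is not importable: activate the right "
          "environment (e.g., your Anaconda env) before building OpenPose.")
```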
#### CPU Version
......
......@@ -66,7 +66,7 @@ This demo assumes n arbitrary stereo cameras from the FLIR company (formerly Poi
## Camera Calibration
The user must manually get the intrinsic and extrinsic parameters of the stereo-cameras. Note, we will assume `Flir` cameras, which is specified by default with the flag `--camera_parameter_folder "models/cameraParameters/flir/"`. Otherwise, change the path to your camera name accordingly.
The user must manually get the intrinsic and extrinsic parameters of the stereo-cameras. Note, we will assume `Flir` cameras, which is specified by default with the flag `--camera_parameter_path "models/cameraParameters/flir/"`. Otherwise, change the path to your camera name accordingly.
There are 2 alternatives to calibrate the cameras:
......
......@@ -37,7 +37,7 @@ Note: In order to maximize calibration quality, **do not reuse the same video se
### Step 1 - Distortion and Intrinsic Parameter Calibration
1. Run OpenPose and save images for your desired camera. Use a grid (chessboard) pattern and move around all the image area. Depending on the images source:
1. Webcam calibration: `./build/examples/openpose/openpose.bin --num_gpu 0 --frame_keep_distortion --write_images {intrinsic_images_folder_path}`.
1. Webcam calibration: `./build/examples/openpose/openpose.bin --num_gpu 0 --write_images {intrinsic_images_folder_path}`.
2. Flir camera calibration: Add the flags `--flir_camera --flir_camera_index 0` (or the desired flir camera index) to the webcam command.
3. Calibration from video sequence: Add the flag `--video {video_path}` to the webcam command.
4. Any other camera brand: Simply save your images in `{intrinsic_images_folder_path}`, file names are not relevant.
......@@ -51,21 +51,43 @@ Note: In order to maximize calibration quality, **do not reuse the same video se
```
4. In this case, the intrinsic parameters would have been generated as `{intrinsic_images_folder_path}/18079958.xml`.
5. Run steps 1-4 for each one of your cameras.
6. After you calibrate the camera intrinsics, when you run OpenPose with those cameras, you should see the lines in real-life to be (almost) perfect lines in the image. Otherwise, the calibration was not good. Try checking straight patterns such us wall corners or ceilings:
6. After you calibrate the camera intrinsics, when you run OpenPose with those cameras, straight lines in the real world should look (almost) perfectly straight in the image. Otherwise, the calibration was not good. Try checking straight patterns such as wall or ceiling edges:
```sh
# With distortion (lines might seem with a more circular shape)
./build/examples/openpose/openpose.bin --num_gpu 0 --flir_camera --flir_camera_index 0 --frame_keep_distortion
# Without distortion (lines should look as lines)
# With distortion (straight lines might not look as straight lines but rather with a more circular shape)
./build/examples/openpose/openpose.bin --num_gpu 0 --flir_camera --flir_camera_index 0
# Without distortion (straight lines should look as straight lines)
./build/examples/openpose/openpose.bin --num_gpu 0 --flir_camera --flir_camera_index 0 --frame_undistort
```
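Why the distorted view bends straight lines: radial lens distortion displaces pixels as a function of their distance from the image center. A minimal sketch of the usual one-coefficient radial model (illustrative only; OpenPose relies on OpenCV's full distortion model):

```python
def distort_point(x, y, k1):
    """Apply one-coefficient radial distortion to a normalized image
    point (x, y): points further from the center move more, which is
    what bends straight lines into arcs."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2
    return x * scale, y * scale


# Points on a vertical straight line x = 0.5:
line = [(0.5, y / 10.0) for y in range(-5, 6)]
distorted = [distort_point(x, y, k1=-0.2) for x, y in line]
# After distortion the x coordinates are no longer constant,
# i.e., the straight line has become a curve.
xs = [x for x, _ in distorted]
```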
7. Full example for 4 Flir/Point Grey cameras:
Examples:
1. Full example for a folder of images, a video, webcam streaming, etc.:
```sh
# Ubuntu and Mac
# Get images for calibration (only if target is not `--image_dir`)
# If video
./build/examples/openpose/openpose.bin --num_gpu 0 --video examples/media/video_chessboard.avi --write_images ~/Desktop/Calib_intrinsics
# If webcam
./build/examples/openpose/openpose.bin --num_gpu 0 --webcam --write_images ~/Desktop/Calib_intrinsics
# Run calibration
./build/examples/calibration/calibration.bin --mode 1 --grid_square_size_mm 30.0 --grid_number_inner_corners "8x6" --calibration_image_dir ~/Desktop/Calib_intrinsics/ --camera_parameter_folder models/cameraParameters/ --camera_serial_number frame_intrinsics
# Output: {OpenPose path}/models/cameraParameters/frame_intrinsics.xml
# Visualize undistorted images
./build/examples/openpose/openpose.bin --num_gpu 0 --image_dir ~/Desktop/Calib_intrinsics/ --frame_undistort --camera_parameter_path "models/cameraParameters/frame_intrinsics.xml"
# If video
./build/examples/openpose/openpose.bin --num_gpu 0 --video examples/media/video_chessboard.avi --frame_undistort --camera_parameter_path "models/cameraParameters/frame_intrinsics.xml"
# If webcam
./build/examples/openpose/openpose.bin --num_gpu 0 --webcam --frame_undistort --camera_parameter_path "models/cameraParameters/frame_intrinsics.xml"
```
2. Full example for 4-view Flir/Point Grey camera system:
```sh
# Ubuntu and Mac
# Get images for calibration
./build/examples/openpose/openpose.bin --num_gpu 0 --frame_keep_distortion --flir_camera --flir_camera_index 0 --write_images ~/Desktop/intrinsics_0
./build/examples/openpose/openpose.bin --num_gpu 0 --frame_keep_distortion --flir_camera --flir_camera_index 1 --write_images ~/Desktop/intrinsics_1
./build/examples/openpose/openpose.bin --num_gpu 0 --frame_keep_distortion --flir_camera --flir_camera_index 2 --write_images ~/Desktop/intrinsics_2
./build/examples/openpose/openpose.bin --num_gpu 0 --frame_keep_distortion --flir_camera --flir_camera_index 3 --write_images ~/Desktop/intrinsics_3
./build/examples/openpose/openpose.bin --num_gpu 0 --flir_camera --flir_camera_index 0 --write_images ~/Desktop/intrinsics_0
./build/examples/openpose/openpose.bin --num_gpu 0 --flir_camera --flir_camera_index 1 --write_images ~/Desktop/intrinsics_1
./build/examples/openpose/openpose.bin --num_gpu 0 --flir_camera --flir_camera_index 2 --write_images ~/Desktop/intrinsics_2
./build/examples/openpose/openpose.bin --num_gpu 0 --flir_camera --flir_camera_index 3 --write_images ~/Desktop/intrinsics_3
# Run calibration
# - Note: If your computer has enough RAM, you can run all of them at the same time to speed up the process (they are not internally multi-threaded).
./build/examples/calibration/calibration.bin --mode 1 --grid_square_size_mm 127.0 --grid_number_inner_corners "9x6" --camera_serial_number 17012332 --calibration_image_dir ~/Desktop/intrinsics_0
......@@ -74,13 +96,11 @@ Note: In order to maximize calibration quality, **do not reuse the same video se
./build/examples/calibration/calibration.bin --mode 1 --grid_square_size_mm 127.0 --grid_number_inner_corners "9x6" --camera_serial_number 18079957 --calibration_image_dir ~/Desktop/intrinsics_3
# Visualize undistorted images
# - Camera parameters will be saved on their respective serial number files, so OpenPose will automatically find them
./build/examples/openpose/openpose.bin --num_gpu 0 --flir_camera
```
:: Windows
:: build\x64\Release\calibration.exe with the same flags as above
```
./build/examples/openpose/openpose.bin --num_gpu 0 --flir_camera --frame_undistort
```
3. For Windows, simply run `build\x64\Release\calibration.exe` (or the one from the binary portable demo) with the same flags as above.
### Step 2 - Extrinsic Parameter Calibration
......@@ -107,7 +127,7 @@ Note: In order to maximize calibration quality, **do not reuse the same video se
```
4. Hint to verify extrinsic calibration is successful:
1. Translation vector - Global distance:
1. Manually open each one of the generated XML files from the folder indicated by the flag `--camera_parameter_folder` (or the default one indicated by the `--help` flag if the former was not used).
1. Manually open each one of the generated XML files from the folder indicated by the flag `--camera_parameter_path` (or the default one indicated by the `--help` flag if the former was not used).
2. The field `CameraMatrix` is a 3 x 4 matrix (you can see that the subfield `rows` in that file is 3 and `cols` is 4).
3. Order the matrix in that 3 x 4 shape (e.g., by copying in a different text file with the shape of 3 rows and 4 columns).
4. The 3 first components of the last column of the `CameraMatrix` field define the global `translation` (in meters) with respect to the global origin (in our case camera 1).
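The translation check above can be scripted: take the first 3 components of the last column of the 3 x 4 `CameraMatrix` and compare their Euclidean norm against the tape-measured distance to the global origin (camera 1). A sketch assuming the 12 matrix values have already been parsed out of the XML file in row-major order (as OpenCV writes them):

```python
import math


def global_translation_and_distance(camera_matrix_values):
    """camera_matrix_values: the 12 numbers of the 3 x 4 CameraMatrix,
    row-major. Returns (translation, distance): the first 3 components
    of the last column, and their Euclidean norm (in meters) to the
    global origin (camera 1)."""
    rows = [camera_matrix_values[4 * i:4 * i + 4] for i in range(3)]
    translation = [row[3] for row in rows]
    distance = math.sqrt(sum(t * t for t in translation))
    return translation, distance


# Camera 1 is the global origin, so its extrinsics are [I | 0]:
identity_extrinsics = [1, 0, 0, 0,
                       0, 1, 0, 0,
                       0, 0, 1, 0]
```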
......
......@@ -292,6 +292,7 @@ OpenPose Library - Release Notes
22. Examples no longer end in a core dump if an OpenPose exception occurs during initialization; instead, they close returning -1. However, they will still result in a core dump if the exception occurs during multi-threaded execution.
23. Video (`--write_video`) can be generated from images (`--image_dir`), as long as they maintain the same resolution.
24. Added `--fps_max` flag to limit the maximum processing frame rate of OpenPose (useful to display results at a maximum desired speed).
25. Frame undistortion can be applied not only to FLIR cameras, but also to all other input sources (image, webcam, video, etc.).
2. Functions or parameters renamed:
1. By default, python example `tutorial_developer/python_2_pose_from_heatmaps.py` was using 2 scales starting at -1x736, changed to 1 scale at -1x368.
2. WrapperStructPose default parameters changed to match those of the OpenPose demo binary.
......@@ -301,6 +302,8 @@ OpenPose Library - Release Notes
6. Removed old COCO 2014 validation scripts.
7. WrapperStructOutput split into WrapperStructOutput and WrapperStructGui.
8. Replaced `--camera_fps` flag by `--write_video_fps`, given that it was a confusing name: It did not affect the webcam FPS, but only the FPS of the output video. In addition, default value changed from 30 to -1.
9. Renamed `--frame_keep_distortion` as `--frame_undistort`, which performs the opposite operation (the default value has also been changed to the opposite).
10. Renamed `--camera_parameter_folder` as `--camera_parameter_path` because it could also take a whole XML file path rather than its parent folder.
3. Main bugs fixed:
1. CMake-GUI was forcing to Release mode, allowed Debug modes too.
2. NMS returns in index 0 the number of found peaks. However, while the number of peaks was truncated to a maximum of 127, this index 0 was saving the real number instead of the truncated one.
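Bug fix 2 above is easy to illustrate: if the peak buffer is capped at 127 entries, index 0 must store the truncated count, not the raw number of peaks found, or downstream code will read past the stored data. A minimal sketch (hypothetical flat buffer layout, for illustration only):

```python
MAX_PEAKS = 127


def store_peaks(peaks):
    """Store peaks in a flat buffer whose index 0 holds the peak count.

    The fixed behavior: the stored count matches the number of peaks
    actually kept, never the raw (possibly larger) number found."""
    kept = peaks[:MAX_PEAKS]
    return [len(kept)] + kept  # index 0: truncated count, then the peaks


buffer = store_peaks(list(range(200)))  # 200 peaks found, only 127 kept
```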
......
......@@ -57,10 +57,11 @@ int openPoseDemo()
// const auto saveImagesWithCorners = false;
const auto saveImagesWithCorners = true;
// Run camera calibration code
op::estimateAndSaveIntrinsics(gridInnerCorners, FLAGS_grid_square_size_mm, flags,
op::formatAsDirectory(FLAGS_camera_parameter_folder),
op::formatAsDirectory(FLAGS_calibration_image_dir),
FLAGS_camera_serial_number, saveImagesWithCorners);
op::estimateAndSaveIntrinsics(
gridInnerCorners, FLAGS_grid_square_size_mm, flags,
op::formatAsDirectory(FLAGS_camera_parameter_folder),
op::formatAsDirectory(FLAGS_calibration_image_dir),
FLAGS_camera_serial_number, saveImagesWithCorners);
op::log("Intrinsic calibration completed!", op::Priority::High);
}
......@@ -69,15 +70,10 @@ int openPoseDemo()
{
op::log("Running calibration (extrinsic parameters)...", op::Priority::High);
// Parameters
op::estimateAndSaveExtrinsics(FLAGS_camera_parameter_folder,
op::formatAsDirectory(FLAGS_calibration_image_dir),
op::flagsToPoint(FLAGS_grid_number_inner_corners, "12x7"),
FLAGS_grid_square_size_mm,
FLAGS_cam0,
FLAGS_cam1,
FLAGS_omit_distortion,
FLAGS_combine_cam0_extrinsics);
op::estimateAndSaveExtrinsics(
FLAGS_camera_parameter_folder, op::formatAsDirectory(FLAGS_calibration_image_dir),
op::flagsToPoint(FLAGS_grid_number_inner_corners, "12x7"), FLAGS_grid_square_size_mm, FLAGS_cam0,
FLAGS_cam1, FLAGS_omit_distortion, FLAGS_combine_cam0_extrinsics);
op::log("Extrinsic calibration completed!", op::Priority::High);
}
......
......@@ -94,7 +94,7 @@ int openPoseDemo()
const op::WrapperStructInput wrapperStructInput{
producerType, producerString, FLAGS_frame_first, FLAGS_frame_step, FLAGS_frame_last,
FLAGS_process_real_time, FLAGS_frame_flip, FLAGS_frame_rotate, FLAGS_frames_repeat,
cameraSize, FLAGS_camera_parameter_folder, !FLAGS_frame_keep_distortion, (unsigned int) FLAGS_3d_views};
cameraSize, FLAGS_camera_parameter_path, FLAGS_frame_undistort, FLAGS_3d_views};
opWrapper.configure(wrapperStructInput);
// Output (comment or use default argument to disable any output)
const op::WrapperStructOutput wrapperStructOutput{
......
......@@ -12,15 +12,13 @@ clear && clear
# Parameters
IMAGE_FOLDER=/media/posefs3b/Users/gines/openpose_train/dataset/COCO/cocoapi/images/test2017_dev/
JSON_FOLDER=../evaluation/coco_val_jsons/
# JSON_FOLDER=/media/posefs3b/Users/gines/openpose_train/training_results/2_23_51/best_702k/
OP_BIN=./build/examples/openpose/openpose.bin
# # 1 scale
# $OP_BIN --image_dir $IMAGE_FOLDER --display 0 --render_pose 0 --cli_verbose 0.2 --write_coco_json ${JSON_FOLDER}1_test.json
# $OP_BIN --image_dir $IMAGE_FOLDER --display 0 --render_pose 0 --cli_verbose 0.2 --write_coco_json ${JSON_FOLDER}1_test_max.json \
# --maximize_positives
# 4 scales
# $OP_BIN --image_dir $IMAGE_FOLDER --display 0 --render_pose 0 --cli_verbose 0.2 --write_coco_json ${JSON_FOLDER}1_4_test.json \
# --scale_number 4 --scale_gap 0.25 --net_resolution "1312x736"
$OP_BIN --image_dir $IMAGE_FOLDER --display 0 --render_pose 0 --cli_verbose 0.2 --write_coco_json ${JSON_FOLDER}1_4_test_max.json \
--scale_number 4 --scale_gap 0.25 --net_resolution "1312x736" --maximize_positives
# \
# --model_pose BODY_23 --model_folder ${JSON_FOLDER}
......@@ -12,6 +12,7 @@ clear && clear
# Parameters
IMAGE_FOLDER=~/devel/images/val2017/
JSON_FOLDER=../evaluation/coco_val_jsons/
# JSON_FOLDER=/media/posefs3b/Users/gines/openpose_train/training_results/2_23_51/best_702k/
OP_BIN=./build/examples/openpose/openpose.bin
# 1 scale
......@@ -19,12 +20,10 @@ $OP_BIN --image_dir $IMAGE_FOLDER --display 0 --render_pose 0 --cli_verbose 0.2
# $OP_BIN --image_dir $IMAGE_FOLDER --display 0 --render_pose 0 --cli_verbose 0.2 --write_coco_json ${JSON_FOLDER}1_max.json --write_coco_foot_json ${JSON_FOLDER}1_foot_max.json \
# --maximize_positives
# # 3 scales
# $OP_BIN --image_dir $IMAGE_FOLDER --display 0 --render_pose 0 --cli_verbose 0.2 --write_coco_json ${JSON_FOLDER}1_3.json --write_coco_foot_json ${JSON_FOLDER}3_foot.json \
# --scale_number 3 --scale_gap 0.25
# # 4 scales
# $OP_BIN --image_dir $IMAGE_FOLDER --display 0 --render_pose 0 --cli_verbose 0.2 --write_coco_json ${JSON_FOLDER}1_4.json --write_coco_foot_json ${JSON_FOLDER}4_foot.json \
# --scale_number 4 --scale_gap 0.25 --net_resolution "1312x736"
# $OP_BIN --image_dir $IMAGE_FOLDER --display 0 --render_pose 0 --cli_verbose 0.2 --write_coco_json ${JSON_FOLDER}1_4_max.json --write_coco_foot_json ${JSON_FOLDER}4_foot_max.json \
# --scale_number 4 --scale_gap 0.25 --net_resolution "1312x736" --maximize_positives
# # \
# # --model_pose BODY_23 --model_folder ${JSON_FOLDER}
......@@ -107,7 +107,7 @@ int tutorialAddModule1()
const op::WrapperStructInput wrapperStructInput{
producerType, producerString, FLAGS_frame_first, FLAGS_frame_step, FLAGS_frame_last,
FLAGS_process_real_time, FLAGS_frame_flip, FLAGS_frame_rotate, FLAGS_frames_repeat,
cameraSize, FLAGS_camera_parameter_folder, !FLAGS_frame_keep_distortion, (unsigned int) FLAGS_3d_views};
cameraSize, FLAGS_camera_parameter_path, FLAGS_frame_undistort, FLAGS_3d_views};
opWrapperT.configure(wrapperStructInput);
// Output (comment or use default argument to disable any output)
const op::WrapperStructOutput wrapperStructOutput{
......
......@@ -189,7 +189,7 @@ int tutorialApiCpp5()
const op::WrapperStructInput wrapperStructInput{
producerType, producerString, FLAGS_frame_first, FLAGS_frame_step, FLAGS_frame_last,
FLAGS_process_real_time, FLAGS_frame_flip, FLAGS_frame_rotate, FLAGS_frames_repeat,
cameraSize, FLAGS_camera_parameter_folder, !FLAGS_frame_keep_distortion, (unsigned int) FLAGS_3d_views};
cameraSize, FLAGS_camera_parameter_path, FLAGS_frame_undistort, FLAGS_3d_views};
opWrapperT.configure(wrapperStructInput);
// Output (comment or use default argument to disable any output)
const op::WrapperStructOutput wrapperStructOutput{
......
......@@ -157,7 +157,7 @@ int tutorialApiCpp6()
const op::WrapperStructInput wrapperStructInput{
producerType, producerString, FLAGS_frame_first, FLAGS_frame_step, FLAGS_frame_last,
FLAGS_process_real_time, FLAGS_frame_flip, FLAGS_frame_rotate, FLAGS_frames_repeat,
cameraSize, FLAGS_camera_parameter_folder, !FLAGS_frame_keep_distortion, (unsigned int) FLAGS_3d_views};
cameraSize, FLAGS_camera_parameter_path, FLAGS_frame_undistort, FLAGS_3d_views};
opWrapperT.configure(wrapperStructInput);
// Output (comment or use default argument to disable any output)
const op::WrapperStructOutput wrapperStructOutput{
......
......@@ -204,7 +204,7 @@ int tutorialApiCpp8()
const op::WrapperStructInput wrapperStructInput{
producerType, producerString, FLAGS_frame_first, FLAGS_frame_step, FLAGS_frame_last,
FLAGS_process_real_time, FLAGS_frame_flip, FLAGS_frame_rotate, FLAGS_frames_repeat,
cameraSize, FLAGS_camera_parameter_folder, !FLAGS_frame_keep_distortion, (unsigned int) FLAGS_3d_views};
cameraSize, FLAGS_camera_parameter_path, FLAGS_frame_undistort, FLAGS_3d_views};
opWrapperT.configure(wrapperStructInput);
// Output (comment or use default argument to disable any output)
const op::WrapperStructOutput wrapperStructOutput{
......
......@@ -45,16 +45,17 @@ DEFINE_int32(flir_camera_index, -1, "Select -1 (default) to
DEFINE_string(ip_camera, "", "String with the IP camera URL. It supports protocols like RTSP and HTTP.");
DEFINE_bool(process_real_time, false, "Enable to keep the original source frame rate (e.g., for video). If the processing time is"
" too long, it will skip frames. If it is too fast, it will slow it down.");
DEFINE_string(camera_parameter_folder, "models/cameraParameters/flir/", "String with the folder where the camera parameters are located.");
DEFINE_bool(frame_keep_distortion, false, "If false (default), it will undistortionate the image based on the"
" `camera_parameter_folder` camera parameters; if true, it will not undistortionate, i.e.,"
" it will leave it as it is.");
DEFINE_string(camera_parameter_path, "models/cameraParameters/flir/", "String with the folder where the camera parameters are located. If there"
" is only 1 XML file (for single video, webcam, or images from the same camera), you must"
" specify the whole XML file path (ending in .xml).");
DEFINE_bool(frame_undistort, false, "If false (default), it will not undistort the image; if true, it will undistort it"
" based on the camera parameters found in `camera_parameter_path`.");
// OpenPose
DEFINE_string(output_resolution, "-1x-1", "The image resolution (display and output). Use \"-1x-1\" to force the program to use the"
" input image resolution.");
DEFINE_int32(3d_views, 1, "Complementary option to `--image_dir` or `--video`. OpenPose will read as many images per"
" iteration, allowing tasks such as stereo camera processing (`--3d`). Note that"
" `--camera_parameters_folder` must be set. OpenPose must find as many `xml` files in the"
" `--camera_parameter_path` must be set. OpenPose must find as many `xml` files in the"
" parameter folder as this number indicates.");
// Consumer
DEFINE_bool(fullscreen, false, "Run in full-screen mode (press f during runtime to toggle).");
......@@ -86,8 +87,8 @@ int tutorialDeveloperThread1()
const auto displayProducerFpsMode = (FLAGS_process_real_time
? op::ProducerFpsMode::OriginalFps : op::ProducerFpsMode::RetrievalFps);
auto producerSharedPtr = createProducer(
producerType, producerString, cameraSize, FLAGS_camera_parameter_folder, !FLAGS_frame_keep_distortion,
(unsigned int) FLAGS_3d_views);
producerType, producerString, cameraSize, FLAGS_camera_parameter_path, FLAGS_frame_undistort,
FLAGS_3d_views);
producerSharedPtr->setProducerFpsMode(displayProducerFpsMode);
op::log("", op::Priority::Low, __LINE__, __FUNCTION__, __FILE__);
// Step 3 - Setting producer
......
......@@ -46,16 +46,17 @@ DEFINE_int32(flir_camera_index, -1, "Select -1 (default) to
DEFINE_string(ip_camera, "", "String with the IP camera URL. It supports protocols like RTSP and HTTP.");
DEFINE_bool(process_real_time, false, "Enable to keep the original source frame rate (e.g., for video). If the processing time is"
" too long, it will skip frames. If it is too fast, it will slow it down.");
DEFINE_string(camera_parameter_folder, "models/cameraParameters/flir/", "String with the folder where the camera parameters are located.");
DEFINE_bool(frame_keep_distortion, false, "If false (default), it will undistortionate the image based on the"
" `camera_parameter_folder` camera parameters; if true, it will not undistortionate, i.e.,"
" it will leave it as it is.");
DEFINE_string(camera_parameter_path, "models/cameraParameters/flir/", "String with the folder where the camera parameters are located. If there"
" is only 1 XML file (for single video, webcam, or images from the same camera), you must"
" specify the whole XML file path (ending in .xml).");
DEFINE_bool(frame_undistort, false, "If false (default), it will not undistort the image; if true, it will undistort it"
" based on the camera parameters found in `camera_parameter_path`.");
// OpenPose
DEFINE_string(output_resolution, "-1x-1", "The image resolution (display and output). Use \"-1x-1\" to force the program to use the"
" input image resolution.");
DEFINE_int32(3d_views, 1, "Complementary option to `--image_dir` or `--video`. OpenPose will read as many images per"
" iteration, allowing tasks such as stereo camera processing (`--3d`). Note that"
" `--camera_parameters_folder` must be set. OpenPose must find as many `xml` files in the"
" `--camera_parameter_path` must be set. OpenPose must find as many `xml` files in the"
" parameter folder as this number indicates.");
// Consumer
DEFINE_bool(fullscreen, false, "Run in full-screen mode (press f during runtime to toggle).");
......@@ -120,8 +121,8 @@ int tutorialDeveloperThread2()
const auto displayProducerFpsMode = (FLAGS_process_real_time
? op::ProducerFpsMode::OriginalFps : op::ProducerFpsMode::RetrievalFps);
auto producerSharedPtr = createProducer(
producerType, producerString, cameraSize, FLAGS_camera_parameter_folder, !FLAGS_frame_keep_distortion,
(unsigned int) FLAGS_3d_views);
producerType, producerString, cameraSize, FLAGS_camera_parameter_path, FLAGS_frame_undistort,
FLAGS_3d_views);
producerSharedPtr->setProducerFpsMode(displayProducerFpsMode);
op::log("", op::Priority::Low, __LINE__, __FUNCTION__, __FILE__);
// Step 3 - Setting producer
......
......@@ -24,6 +24,10 @@ namespace op
void readParameters(const std::string& cameraParameterPath,
const std::vector<std::string>& serialNumbers = {});
// It simply calls the previous readParameters with a single element
void readParameters(const std::string& cameraParameterPath,
const std::string& serialNumber);
void writeParameters(const std::string& cameraParameterPath) const;
unsigned long long getNumberCameras() const;
......@@ -38,14 +42,24 @@ namespace op
const std::vector<cv::Mat>& getCameraDistortions() const;
bool getUndistortImage() const;
void setUndistortImage(const bool undistortImage);
void undistort(cv::Mat& frame, const unsigned int cameraIndex = 0u);
private:
std::vector<std::string> mSerialNumbers;
unsigned long long mNumberCameras;
std::vector<cv::Mat> mCameraMatrices;
std::vector<cv::Mat> mCameraExtrinsics;
std::vector<cv::Mat> mCameraIntrinsics;
std::vector<cv::Mat> mCameraDistortions;
// Undistortion (optional)
bool mUndistortImage;
std::vector<cv::Mat> mRemoveDistortionMaps1;
std::vector<cv::Mat> mRemoveDistortionMaps2;
DELETE_COPY(CameraParameterReader);
};
}
......
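The new `mRemoveDistortionMaps1`/`mRemoveDistortionMaps2` members suggest the usual optimization: compute the per-pixel undistortion lookup maps once, then reuse them for every frame instead of re-deriving the distortion model each time. A pure-Python sketch of that caching pattern (OpenPose itself would use `cv::initUndistortRectifyMap` plus `cv::remap`; this only shows the idea):

```python
class UndistortionCache:
    """Build the source-pixel lookup map once, reuse it for every frame."""

    def __init__(self, width, height, distort_fn):
        # distort_fn maps an (x, y) output pixel to the (x, y) source
        # pixel to sample; with no distortion it is the identity.
        self.map = [[distort_fn(x, y) for x in range(width)]
                    for y in range(height)]

    def remap(self, frame):
        """frame: 2D list of pixel values. Returns the undistorted frame
        by sampling each output pixel from its cached source location
        (nearest-neighbor, clamped to the frame bounds)."""
        height, width = len(self.map), len(self.map[0])
        out = []
        for y in range(height):
            row = []
            for x in range(width):
                sx, sy = self.map[y][x]
                sx = min(max(int(round(sx)), 0), width - 1)
                sy = min(max(int(round(sy)), 0), height - 1)
                row.append(frame[sy][sx])
            out.append(row)
        return out


# Identity distortion: the cached map is a no-op, so remap is too.
cache = UndistortionCache(4, 3, lambda x, y: (x, y))
frame = [[10 * y + x for x in range(4)] for y in range(3)]
```

The point of the cache is that `remap` does only table lookups per frame; the (comparatively expensive) distortion model is evaluated once, at construction time.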
......@@ -13,27 +13,19 @@ namespace op
* CALIB_THIN_PRISM_MODEL, or CALIB_TILTED_MODEL)
* @param outputFilePath String with the name of the file where to write
*/
OP_API void estimateAndSaveIntrinsics(const Point<int>& gridInnerCorners,
const float gridSquareSizeMm,
const int flags,
const std::string& outputParameterFolder,
const std::string& imagesFolder,
const std::string& serialNumber,
const bool saveImagesWithCorners = false);
OP_API void estimateAndSaveIntrinsics(
const Point<int>& gridInnerCorners, const float gridSquareSizeMm, const int flags,
const std::string& outputParameterFolder, const std::string& imagesFolder, const std::string& serialNumber,
const bool saveImagesWithCorners = false);
OP_API void estimateAndSaveExtrinsics(const std::string& intrinsicsFolder,
const std::string& extrinsicsImagesFolder,
const Point<int>& gridInnerCorners,
const float gridSquareSizeMm,
const int index0,
const int index1,
const bool imagesAreUndistorted,
const bool combineCam0Extrinsics);
OP_API void estimateAndSaveExtrinsics(
const std::string& intrinsicsFolder, const std::string& extrinsicsImagesFolder,
const Point<int>& gridInnerCorners, const float gridSquareSizeMm, const int index0, const int index1,
const bool imagesAreUndistorted, const bool combineCam0Extrinsics);
OP_API void estimateAndSaveSiftFile(const Point<int>& gridInnerCorners,
const std::string& imagesFolder,
const int numberCameras,
const bool saveImagesWithCorners = false);
OP_API void estimateAndSaveSiftFile(
const Point<int>& gridInnerCorners, const std::string& imagesFolder, const int numberCameras,
const bool saveImagesWithCorners = false);
}
#endif // OPENPOSE_CALIBRATION_CAMERA_PARAMETER_ESTIMATION_HPP
......@@ -14,26 +14,24 @@ namespace op
BottomRight
};
OP_API std::pair<bool, std::vector<cv::Point2f>> findAccurateGridCorners(const cv::Mat& image,
const cv::Size& gridInnerCorners);
OP_API std::pair<bool, std::vector<cv::Point2f>> findAccurateGridCorners(
const cv::Mat& image, const cv::Size& gridInnerCorners);
OP_API std::vector<cv::Point3f> getObjects3DVector(const cv::Size& gridInnerCorners,
const float gridSquareSizeMm);
OP_API std::vector<cv::Point3f> getObjects3DVector(
const cv::Size& gridInnerCorners, const float gridSquareSizeMm);
OP_API void drawGridCorners(cv::Mat& image, const cv::Size& gridInnerCorners,
const std::vector<cv::Point2f>& points2DVector);
OP_API void drawGridCorners(
cv::Mat& image, const cv::Size& gridInnerCorners, const std::vector<cv::Point2f>& points2DVector);
OP_API std::array<unsigned int, 4> getOutterCornerIndices(const std::vector<cv::Point2f>& points2DVector,
const cv::Size& gridInnerCorners);
OP_API std::array<unsigned int, 4> getOutterCornerIndices(
const std::vector<cv::Point2f>& points2DVector, const cv::Size& gridInnerCorners);
OP_API void reorderPoints(std::vector<cv::Point2f>& points2DVector,
const cv::Size& gridInnerCorners,
OP_API void reorderPoints(std::vector<cv::Point2f>& points2DVector, const cv::Size& gridInnerCorners,
const Points2DOrigin points2DOriginDesired);
OP_API void plotGridCorners(const cv::Size& gridInnerCorners,
const std::vector<cv::Point2f>& points2DVector,
const std::string& imagePath,
const cv::Mat& image);
OP_API void plotGridCorners(
const cv::Size& gridInnerCorners, const std::vector<cv::Point2f>& points2DVector,
const std::string& imagePath, const cv::Mat& image);
}
#endif // OPENPOSE_CALIBRATION_GRID_PATTERN_FUNCTIONS_HPP
......@@ -52,10 +52,11 @@ DEFINE_int32(frame_rotate, 0, "Rotate each frame, 4 po
DEFINE_bool(frames_repeat, false, "Repeat frames when finished.");
DEFINE_bool(process_real_time, false, "Enable to keep the original source frame rate (e.g., for video). If the processing time is"
" too long, it will skip frames. If it is too fast, it will slow it down.");
DEFINE_string(camera_parameter_folder, "models/cameraParameters/flir/", "String with the folder where the camera parameters are located.");
DEFINE_bool(frame_keep_distortion, false, "If false (default), it will undistortionate the image based on the"
" `camera_parameter_folder` camera parameters; if true, it will not undistortionate, i.e.,"
" it will leave it as it is.");
DEFINE_string(camera_parameter_path, "models/cameraParameters/flir/", "String with the folder where the camera parameters are located. If there"
" is only 1 XML file (for single video, webcam, or images from the same camera), you must"
" specify the whole XML file path (ending in .xml).");
DEFINE_bool(frame_undistort, false, "If false (default), it will not undistort the image; if true, it will undistort it"
" based on the camera parameters found in `camera_parameter_path`.");
#endif // OPENPOSE_FLAGS_DISABLE_PRODUCER
// OpenPose
DEFINE_string(model_folder, "models/", "Folder path (absolute or relative) where the models (pose, face, ...) are located.");
......@@ -148,9 +149,9 @@ DEFINE_bool(3d, false, "Running OpenPose 3-D re
" fail.");
DEFINE_int32(3d_min_views, -1, "Minimum number of views required to reconstruct each keypoint. By default (-1), it will"
" require all the cameras to see the keypoint in order to reconstruct it.");
DEFINE_int32(3d_views, 1, "Complementary option to `--image_dir` or `--video`. OpenPose will read as many images per"
DEFINE_int32(3d_views, -1, "Complementary option for `--image_dir` or `--video`. OpenPose will read that many images per"
" iteration, allowing tasks such as stereo camera processing (`--3d`). Note that"
" `--camera_parameters_folder` must be set. OpenPose must find as many `xml` files in the"
" `--camera_parameter_path` must be set. OpenPose must find as many `xml` files in the"
" parameter folder as this number indicates.");
// Extra algorithms
DEFINE_bool(identification, false, "Experimental, not available yet. Whether to enable people identification across frames.");
......
......@@ -17,6 +17,7 @@ namespace op
Flip,
Rotation,
FrameStep,
NumberViews,
Size,
};
......
#ifndef OPENPOSE_PRODUCER_IMAGE_DIRECTORY_READER_HPP
#define OPENPOSE_PRODUCER_IMAGE_DIRECTORY_READER_HPP
#include <openpose/3d/cameraParameterReader.hpp>
#include <openpose/core/common.hpp>
#include <openpose/producer/producer.hpp>
......@@ -19,22 +18,17 @@ namespace op
* Constructor of ImageDirectoryReader. It sets the image directory path from which the images will be loaded
* and generates a std::vector<std::string> with the list of images on that directory.
* @param imageDirectoryPath const std::string parameter with the folder path containing the images.
* @param imageDirectoryStereo const int parameter with the number of images per iteration (>1 would represent
* stereo processing).
* @param cameraParameterPath const std::string parameter with the folder path containing the camera
* parameters (only required if numberViews > 1).
* @param numberViews const int parameter with the number of images per iteration (>1 would represent
* stereo processing).
*/
explicit ImageDirectoryReader(const std::string& imageDirectoryPath, const unsigned int imageDirectoryStereo = 1,
const std::string& cameraParameterPath = "");
explicit ImageDirectoryReader(
const std::string& imageDirectoryPath, const std::string& cameraParameterPath = "",
const bool undistortImage = false, const int numberViews = -1);
virtual ~ImageDirectoryReader();
std::vector<cv::Mat> getCameraMatrices();
std::vector<cv::Mat> getCameraExtrinsics();
std::vector<cv::Mat> getCameraIntrinsics();
std::string getNextFrameName();
inline bool isOpened() const
......@@ -53,9 +47,7 @@ namespace op
private:
const std::string mImageDirectoryPath;
const unsigned int mImageDirectoryStereo;
const std::vector<std::string> mFilePaths;
CameraParameterReader mCameraParameterReader;
Point<int> mResolution;
long long mFrameNameCounter;
......
......@@ -16,16 +16,11 @@ namespace op
* Constructor of IpCameraReader. It opens the IP camera as a wrapper of cv::VideoCapture.
* @param cameraPath const std::string parameter with the full camera IP link.
*/
explicit IpCameraReader(const std::string& cameraPath);
explicit IpCameraReader(const std::string& cameraPath, const std::string& cameraParameterPath = "",
const bool undistortImage = false);
virtual ~IpCameraReader();
std::vector<cv::Mat> getCameraMatrices();
std::vector<cv::Mat> getCameraExtrinsics();
std::vector<cv::Mat> getCameraIntrinsics();
std::string getNextFrameName();
inline bool isOpened() const
......
......@@ -3,6 +3,7 @@
#include <opencv2/core/core.hpp> // cv::Mat
#include <opencv2/highgui/highgui.hpp> // capProperties of OpenCV
#include <openpose/3d/cameraParameterReader.hpp>
#include <openpose/core/common.hpp>
#include <openpose/producer/enumClasses.hpp>
......@@ -17,10 +18,9 @@ namespace op
public:
/**
* Constructor of Producer.
* @param type ProducerType indicating the kind of frames source.
* @param cameraParameterPath const std::string with the folder path (or single XML file path) of the
* camera parameters.
* @param undistortImage bool indicating whether to undistort the frames with the camera parameters.
* @param numberViews const int with the number of camera views.
*/
explicit Producer(const ProducerType type);
explicit Producer(const ProducerType type, const std::string& cameraParameterPath, const bool undistortImage,
const int numberViews);
/**
* Destructor of Producer. It is virtual so that any children class can implement
......@@ -42,21 +42,24 @@ namespace op
/**
* It retrieves and returns the camera matrices from the frames producer.
* Virtual method because FlirReader implements its own.
* @return std::vector<cv::Mat> with the camera matrices.
*/
virtual std::vector<cv::Mat> getCameraMatrices() = 0;
virtual std::vector<cv::Mat> getCameraMatrices();
/**
* It retrieves and returns the camera extrinsic parameters from the frames producer.
* Virtual method because FlirReader implements its own.
* @return std::vector<cv::Mat> with the camera extrinsic parameters.
*/
virtual std::vector<cv::Mat> getCameraExtrinsics() = 0;
virtual std::vector<cv::Mat> getCameraExtrinsics();
/**
* It retrieves and returns the camera intrinsic parameters from the frames producer.
* Virtual method because FlirReader implements its own.
* @return std::vector<cv::Mat> with the camera intrinsic parameters.
*/
virtual std::vector<cv::Mat> getCameraIntrinsics() = 0;
virtual std::vector<cv::Mat> getCameraIntrinsics();
/**
* This function returns a unique frame name (e.g., the frame number for video, the
......@@ -175,6 +178,8 @@ namespace op
unsigned long long mNumberFramesTrackingFps;
unsigned int mNumberSetPositionTrackingFps;
std::chrono::high_resolution_clock::time_point mClockTrackingFps;
// Camera parameters
CameraParameterReader mCameraParameterReader;
DELETE_COPY(Producer);
};
......@@ -186,7 +191,7 @@ namespace op
const ProducerType producerType = ProducerType::None, const std::string& producerString = "",
const Point<int>& cameraResolution = Point<int>{-1,-1},
const std::string& cameraParameterPath = "models/cameraParameters/", const bool undistortImage = true,
const unsigned int imageDirectoryStereo = -1);
const int numberViews = -1);
}
#endif // OPENPOSE_PRODUCER_PRODUCER_HPP
......@@ -19,14 +19,18 @@ namespace op
* This constructor of VideoCaptureReader wraps cv::VideoCapture(const int).
* @param index const int indicating the cv::VideoCapture constructor int argument, in the range [0, 9].
*/
explicit VideoCaptureReader(const int index, const bool throwExceptionIfNoOpened);
explicit VideoCaptureReader(const int index, const bool throwExceptionIfNoOpened,
const std::string& cameraParameterPath, const bool undistortImage,
const int numberViews);
/**
* This constructor of VideoCaptureReader wraps cv::VideoCapture(const std::string).
* @param path const std::string indicating the cv::VideoCapture constructor string argument.
* @param producerType const ProducerType indicating whether the frame source is an IP camera or video.
*/
explicit VideoCaptureReader(const std::string& path, const ProducerType producerType);
explicit VideoCaptureReader(const std::string& path, const ProducerType producerType,
const std::string& cameraParameterPath, const bool undistortImage,
const int numberViews);
/**
* Destructor of VideoCaptureReader. It releases the cv::VideoCapture member. It is virtual so that
......
#ifndef OPENPOSE_PRODUCER_VIDEO_READER_HPP
#define OPENPOSE_PRODUCER_VIDEO_READER_HPP
#include <openpose/3d/cameraParameterReader.hpp>
#include <openpose/core/common.hpp>
#include <openpose/producer/videoCaptureReader.hpp>
......@@ -18,22 +17,17 @@ namespace op
* Constructor of VideoReader. It opens the video as a wrapper of cv::VideoCapture. It includes a flag to
* indicate whether the video should be repeated once it is completely read.
* @param videoPath const std::string parameter with the full video path location.
* @param imageDirectoryStereo const int parameter with the number of images per iteration (>1 would represent
* stereo processing).
* @param cameraParameterPath const std::string parameter with the folder path containing the camera
* parameters (only required if numberViews > 1).
* @param numberViews const int parameter with the number of images per iteration (>1 would represent
* stereo processing).
*/
explicit VideoReader(const std::string& videoPath, const unsigned int imageDirectoryStereo = 1,
const std::string& cameraParameterPath = "");
explicit VideoReader(
const std::string& videoPath, const std::string& cameraParameterPath = "",
const bool undistortImage = false, const int numberViews = -1);
virtual ~VideoReader();
std::vector<cv::Mat> getCameraMatrices();
std::vector<cv::Mat> getCameraExtrinsics();
std::vector<cv::Mat> getCameraIntrinsics();
std::string getNextFrameName();
inline bool isOpened() const
......@@ -46,9 +40,7 @@ namespace op
void set(const int capProperty, const double value);
private:
const unsigned int mImageDirectoryStereo;
const std::string mPathName;
CameraParameterReader mCameraParameterReader;
cv::Mat getRawFrame();
......
......@@ -25,16 +25,11 @@ namespace op
* cannot be opened.
*/
explicit WebcamReader(const int webcamIndex = 0, const Point<int>& webcamResolution = Point<int>{},
const bool throwExceptionIfNoOpened = true);
const bool throwExceptionIfNoOpened = true, const std::string& cameraParameterPath = "",
const bool undistortImage = false);
virtual ~WebcamReader();
std::vector<cv::Mat> getCameraMatrices();
std::vector<cv::Mat> getCameraExtrinsics();
std::vector<cv::Mat> getCameraIntrinsics();
std::string getNextFrameName();
bool isOpened() const;
......
......@@ -99,7 +99,7 @@ namespace op
auto producerSharedPtr = createProducer(
wrapperStructInput.producerType, wrapperStructInput.producerString,
wrapperStructInput.cameraResolution, wrapperStructInput.cameraParameterPath,
wrapperStructInput.undistortImage, wrapperStructInput.imageDirectoryStereo);
wrapperStructInput.undistortImage, wrapperStructInput.numberViews);
// Editable arguments
auto wrapperStructPose = wrapperStructPoseTemp;
......@@ -708,13 +708,17 @@ namespace op
for (const auto& poseGpuRenderer : poseGpuRenderers)
renderers.emplace_back(std::static_pointer_cast<Renderer>(poseGpuRenderer));
// Display
const auto numberViews = (intRound(producerSharedPtr->get(ProducerProperty::NumberViews)));
auto finalOutputSizeGui = finalOutputSize;
if (numberViews > 1 && finalOutputSizeGui.x > 0)
finalOutputSizeGui.x *= numberViews;
// Adam (+3-D/2-D) display
if (displayAdam)
{
#ifdef USE_3D_ADAM_MODEL
// Gui
const auto gui = std::make_shared<GuiAdam>(
finalOutputSize, wrapperStructGui.fullScreen, threadManager.getIsRunningSharedPtr(),
finalOutputSizeGui, wrapperStructGui.fullScreen, threadManager.getIsRunningSharedPtr(),
spVideoSeek, poseExtractorNets, faceExtractorNets, handExtractorNets, renderers,
wrapperStructGui.displayMode, JointAngleEstimation::getTotalModel(),
wrapperStructOutput.writeVideoAdam
......@@ -729,7 +733,7 @@ namespace op
{
// Gui
const auto gui = std::make_shared<Gui3D>(
finalOutputSize, wrapperStructGui.fullScreen, threadManager.getIsRunningSharedPtr(),
finalOutputSizeGui, wrapperStructGui.fullScreen, threadManager.getIsRunningSharedPtr(),
spVideoSeek, poseExtractorNets, faceExtractorNets, handExtractorNets, renderers,
wrapperStructPose.poseModel, wrapperStructGui.displayMode
);
......@@ -741,7 +745,7 @@ namespace op
{
// Gui
const auto gui = std::make_shared<Gui>(
finalOutputSize, wrapperStructGui.fullScreen, threadManager.getIsRunningSharedPtr(),
finalOutputSizeGui, wrapperStructGui.fullScreen, threadManager.getIsRunningSharedPtr(),
spVideoSeek, poseExtractorNets, faceExtractorNets, handExtractorNets, renderers
);
// WGui
......
......@@ -72,7 +72,8 @@ namespace op
Point<int> cameraResolution;
/**
* Directory path for the camera parameters (intrinsic and extrinsic parameters).
* Directory path for the camera parameters (intrinsic and extrinsic parameters) or, optionally, the
* full XML file path (if there is only 1 view).
*/
std::string cameraParameterPath;
......@@ -82,9 +83,13 @@ namespace op
bool undistortImage;
/**
* Number of camera views recorded (only for prerecorded produced sources, such as video and image directory).
* Number of camera views.
* Complementary option for `--image_dir` or `--video`.
* It is -1 for Flir cameras (the number of cameras is detected at runtime), as well as for any other
* frame source (-1 and 1 are equivalent for those). It can be greater than 1 only for prerecorded
* sources, such as video and image directory.
*/
unsigned int imageDirectoryStereo;
int numberViews;
/**
* Constructor of the struct.
......@@ -98,7 +103,7 @@ namespace op
const bool realTimeProcessing = false, const bool frameFlip = false, const int frameRotate = 0,
const bool framesRepeat = false, const Point<int>& cameraResolution = Point<int>{-1,-1},
const std::string& cameraParameterPath = "models/cameraParameters/",
const bool undistortImage = true, const unsigned int imageDirectoryStereo = 1);
const bool undistortImage = false, const int numberViews = -1);
};
}
......
#include <openpose/filestream/fileStream.hpp>
#include <opencv2/imgproc/imgproc.hpp> // cv::undistort, cv::initUndistortRectifyMap
#include <openpose/filestream/fileStream.hpp>
#include <openpose/utilities/fileSystem.hpp>
#include <openpose/3d/cameraParameterReader.hpp>
namespace op
{
CameraParameterReader::CameraParameterReader()
CameraParameterReader::CameraParameterReader() :
mUndistortImage{false}
{
}
......@@ -15,7 +17,8 @@ namespace op
CameraParameterReader::CameraParameterReader(const std::string& serialNumber,
const cv::Mat& cameraIntrinsics,
const cv::Mat& cameraDistortion,
const cv::Mat& cameraExtrinsics)
const cv::Mat& cameraExtrinsics) :
mUndistortImage{false}
{
try
{
......@@ -34,6 +37,9 @@ namespace op
else
mCameraExtrinsics.emplace_back(cv::Mat::eye(3, 4, cameraIntrinsics.type()));
mCameraMatrices.emplace_back(mCameraIntrinsics.back() * mCameraExtrinsics.back());
// Undistortion cv::Mats
mRemoveDistortionMaps1.resize(getNumberCameras());
mRemoveDistortionMaps2.resize(getNumberCameras());
}
catch (const std::exception& e)
{
......@@ -99,6 +105,9 @@ namespace op
mCameraMatrices.emplace_back(mCameraIntrinsics.back() * mCameraExtrinsics.back());
// log(cameraParameters.at(0));
}
// Undistortion cv::Mats
mRemoveDistortionMaps1.resize(getNumberCameras());
mRemoveDistortionMaps2.resize(getNumberCameras());
// // mCameraMatrices
// log("\nFull camera matrices:");
// for (const auto& cvMat : mCameraMatrices)
......@@ -118,6 +127,19 @@ namespace op
}
}
void CameraParameterReader::readParameters(const std::string& cameraParameterPath,
const std::string& serialNumber)
{
try
{
readParameters(cameraParameterPath, std::vector<std::string>{serialNumber});
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
}
}
void CameraParameterReader::writeParameters(const std::string& cameraParameterPath) const
{
try
......@@ -225,4 +247,87 @@ namespace op
return mCameraDistortions;
}
}
bool CameraParameterReader::getUndistortImage() const
{
try
{
return mUndistortImage;
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return false;
}
}
void CameraParameterReader::setUndistortImage(const bool undistortImage)
{
try
{
mUndistortImage = undistortImage;
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
}
}
void CameraParameterReader::undistort(cv::Mat& frame, const unsigned int cameraIndex)
{
try
{
if (mUndistortImage)
{
// Sanity check
if (mRemoveDistortionMaps1.size() <= cameraIndex || mRemoveDistortionMaps2.size() <= cameraIndex)
{
error("Variable cameraIndex is out of bounds; it should be smaller than the size of mRemoveDistortionMaps1/2.",
__LINE__, __FUNCTION__, __FILE__);
}
// Only first time
if (mRemoveDistortionMaps1[cameraIndex].empty() || mRemoveDistortionMaps2[cameraIndex].empty())
{
                    const auto cameraIntrinsics = mCameraIntrinsics.at(cameraIndex);
                    const auto cameraDistorsions = mCameraDistortions.at(cameraIndex);
const auto imageSize = frame.size();
// // Option a - 80 ms / 3 images
// // http://docs.opencv.org/2.4/modules/imgproc/doc/geometric_transformations.html#undistort
// cv::undistort(cvMatDistorted, mCvMats[i], cameraIntrinsics, cameraDistorsions);
                    // // In OpenCV 2.4, cv::undistort is exactly equal to cv::initUndistortRectifyMap
                    // (with CV_16SC2) + cv::remap (with LINEAR). I.e., log(cv::norm(cvMatMethod1-cvMatMethod2)) = 0.
                    // Option b - 15 ms / 3 images (LINEAR) or 25 ms (CUBIC)
                    // Distortion removal - not required and more expensive (applied to the whole image instead of
                    // only to our interest points)
cv::initUndistortRectifyMap(
cameraIntrinsics, cameraDistorsions, cv::Mat(),
// cameraIntrinsics instead of cv::getOptimalNewCameraMatrix to
// avoid black borders
cameraIntrinsics,
// #include <opencv2/calib3d/calib3d.hpp> for next line
// cv::getOptimalNewCameraMatrix(cameraIntrinsics,
// cameraDistorsions,
// imageSize, 1,
// imageSize, 0),
imageSize,
CV_16SC2, // Faster, less memory
// CV_32FC1, // More accurate
mRemoveDistortionMaps1[cameraIndex],
mRemoveDistortionMaps2[cameraIndex]);
}
cv::Mat undistortedCvMat;
cv::remap(frame, undistortedCvMat,
mRemoveDistortionMaps1[cameraIndex], mRemoveDistortionMaps2[cameraIndex],
// cv::INTER_NEAREST);
cv::INTER_LINEAR);
// cv::INTER_CUBIC);
                // cv::INTER_LANCZOS4); // Smoother, but we do not need this quality & it is too expensive
std::swap(undistortedCvMat, frame);
}
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
}
}
}
......@@ -108,11 +108,10 @@ namespace op
}
}
std::pair<double, std::vector<double>> calcReprojectionErrors(const std::vector<std::vector<cv::Point3f>>& objects3DVectors,
const std::vector<std::vector<cv::Point2f>>& points2DVectors,
const std::vector<cv::Mat>& rVecs,
const std::vector<cv::Mat>& tVecs,
const Intrinsics& intrinsics)
std::pair<double, std::vector<double>> calcReprojectionErrors(
const std::vector<std::vector<cv::Point3f>>& objects3DVectors,
const std::vector<std::vector<cv::Point2f>>& points2DVectors, const std::vector<cv::Mat>& rVecs,
const std::vector<cv::Mat>& tVecs, const Intrinsics& intrinsics)
{
try
{
......@@ -125,12 +124,9 @@ namespace op
for (auto i = 0ull; i < objects3DVectors.size(); ++i )
{
cv::projectPoints(cv::Mat(objects3DVectors.at(i)),
rVecs.at(i),
tVecs.at(i),
intrinsics.cameraMatrix,
intrinsics.distortionCoefficients,
points2DVectors2);
cv::projectPoints(
cv::Mat(objects3DVectors.at(i)), rVecs.at(i), tVecs.at(i), intrinsics.cameraMatrix,
intrinsics.distortionCoefficients, points2DVectors2);
const auto err = cv::norm(cv::Mat(points2DVectors.at(i)), cv::Mat(points2DVectors2), CV_L2);
const auto n = objects3DVectors.at(i).size();
......@@ -150,10 +146,9 @@ namespace op
}
}
Intrinsics calcIntrinsicParameters(const cv::Size& imageSize,
const std::vector<std::vector<cv::Point2f>>& points2DVectors,
const std::vector<std::vector<cv::Point3f>>& objects3DVectors,
const int calibrateCameraFlags)
Intrinsics calcIntrinsicParameters(
const cv::Size& imageSize, const std::vector<std::vector<cv::Point2f>>& points2DVectors,
const std::vector<std::vector<cv::Point3f>>& objects3DVectors, const int calibrateCameraFlags)
{
try
{
......@@ -164,9 +159,9 @@ namespace op
Intrinsics intrinsics;
std::vector<cv::Mat> rVecs;
std::vector<cv::Mat> tVecs;
const auto rms = cv::calibrateCamera(objects3DVectors, points2DVectors, imageSize, intrinsics.cameraMatrix,
intrinsics.distortionCoefficients, rVecs, tVecs,
calibrateCameraFlags);
const auto rms = cv::calibrateCamera(
objects3DVectors, points2DVectors, imageSize, intrinsics.cameraMatrix,
intrinsics.distortionCoefficients, rVecs, tVecs, calibrateCameraFlags);
// cv::checkRange checks that every array element is neither NaN nor infinite
const auto calibrationIsCorrect = cv::checkRange(intrinsics.cameraMatrix)
......@@ -176,8 +171,8 @@ namespace op
double totalAvgErr;
std::vector<double> reprojectionErrors;
std::tie(totalAvgErr, reprojectionErrors) = calcReprojectionErrors(objects3DVectors, points2DVectors,
rVecs, tVecs, intrinsics);
std::tie(totalAvgErr, reprojectionErrors) = calcReprojectionErrors(
objects3DVectors, points2DVectors, rVecs, tVecs, intrinsics);
log("\nIntrinsics:", Priority::High);
log("Re-projection error - cv::calibrateCamera vs. calcReprojectionErrors:\t" + std::to_string(rms)
......@@ -441,10 +436,9 @@ namespace op
return Eigen::Matrix4d{};
}
}
std::pair<cv::Mat, cv::Mat> solveCorrespondences2D3D(const cv::Mat& cameraMatrix,
const cv::Mat& distortionCoefficients,
const std::vector<cv::Point3f>& objects3DVector,
const std::vector<cv::Point2f>& points2DVector)
std::pair<cv::Mat, cv::Mat> solveCorrespondences2D3D(
const cv::Mat& cameraMatrix, const cv::Mat& distortionCoefficients,
const std::vector<cv::Point3f>& objects3DVector, const std::vector<cv::Point2f>& points2DVector)
{
try
{
......@@ -487,11 +481,9 @@ namespace op
}
}
std::tuple<cv::Mat, cv::Mat, std::vector<cv::Point2f>, std::vector<cv::Point3f>> calcExtrinsicParametersOpenCV(const cv::Mat& image,
const cv::Mat& cameraMatrix,
const cv::Mat& distortionCoefficients,
const cv::Size& gridInnerCorners,
const float gridSquareSizeMm)
std::tuple<cv::Mat, cv::Mat, std::vector<cv::Point2f>, std::vector<cv::Point3f>> calcExtrinsicParametersOpenCV(
const cv::Mat& image, const cv::Mat& cameraMatrix, const cv::Mat& distortionCoefficients,
const cv::Size& gridInnerCorners, const float gridSquareSizeMm)
{
try
{
......@@ -556,12 +548,9 @@ namespace op
}
}
std::tuple<bool, Eigen::Matrix3d, Eigen::Vector3d, Eigen::Matrix3d, Eigen::Vector3d> getExtrinsicParameters(const std::vector<std::string>& cameraPaths,
const cv::Size& gridInnerCorners,
const float gridSquareSizeMm,
const bool coutAndPlotGridCorners,
const std::vector<cv::Mat>& intrinsics,
const std::vector<cv::Mat>& distortions)
std::tuple<bool, Eigen::Matrix3d, Eigen::Vector3d, Eigen::Matrix3d, Eigen::Vector3d> getExtrinsicParameters(
const std::vector<std::string>& cameraPaths, const cv::Size& gridInnerCorners, const float gridSquareSizeMm,
const bool coutAndPlotGridCorners, const std::vector<cv::Mat>& intrinsics, const std::vector<cv::Mat>& distortions)
{
try
{
......@@ -857,13 +846,10 @@ namespace op
// Public functions
void estimateAndSaveIntrinsics(const Point<int>& gridInnerCorners,
const float gridSquareSizeMm,
const int flags,
const std::string& outputParameterFolder,
const std::string& imagesFolder,
const std::string& serialNumber,
const bool saveImagesWithCorners)
void estimateAndSaveIntrinsics(
const Point<int>& gridInnerCorners, const float gridSquareSizeMm, const int flags,
const std::string& outputParameterFolder, const std::string& imagesFolder, const std::string& serialNumber,
const bool saveImagesWithCorners)
{
try
{
......@@ -873,11 +859,11 @@ namespace op
// Read images in folder
log("", Priority::Low, __LINE__, __FUNCTION__, __FILE__);
std::vector<std::vector<cv::Point2f>> points2DVectors;
const auto imageAndPaths = getImageAndPaths(imagesFolder);
// Get 2D grid corners of each image
log("", Priority::Low, __LINE__, __FUNCTION__, __FILE__);
std::vector<std::vector<cv::Point2f>> points2DVectors;
std::vector<cv::Mat> imagesWithCorners;
const auto imageSize = imageAndPaths.at(0).first.size();
for (auto i = 0u ; i < imageAndPaths.size() ; i++)
......@@ -902,9 +888,9 @@ namespace op
points2DVectors.emplace_back(points2DVector);
}
else
std::cerr << "Chessboard not found in this image." << std::endl;
log("Chessboard not found in image " + imageAndPaths.at(i).second + ".", Priority::High);
// Show image (with chessboard corners if found)
// Debugging (optional) - Show image (with chessboard corners if found)
if (saveImagesWithCorners)
{
cv::Mat imageToPlot = image.clone();
......@@ -920,18 +906,17 @@ namespace op
// Run calibration
log("", Priority::Low, __LINE__, __FUNCTION__, __FILE__);
// objects3DVector is the same one for each image
const std::vector<std::vector<cv::Point3f>> objects3DVectors(points2DVectors.size(),
getObjects3DVector(gridInnerCornersCvSize,
gridSquareSizeMm));
const std::vector<std::vector<cv::Point3f>> objects3DVectors(
points2DVectors.size(), getObjects3DVector(gridInnerCornersCvSize, gridSquareSizeMm));
const auto intrinsics = calcIntrinsicParameters(imageSize, points2DVectors, objects3DVectors, flags);
// Save intrinsics/results
log("", Priority::Low, __LINE__, __FUNCTION__, __FILE__);
CameraParameterReader cameraParameterReader{serialNumber, intrinsics.cameraMatrix,
intrinsics.distortionCoefficients};
CameraParameterReader cameraParameterReader{
serialNumber, intrinsics.cameraMatrix, intrinsics.distortionCoefficients};
cameraParameterReader.writeParameters(outputParameterFolder);
// Save images with corners
// Debugging (optional) - Save images with corners
log("", Priority::Low, __LINE__, __FUNCTION__, __FILE__);
if (saveImagesWithCorners)
{
......
......@@ -14,13 +14,13 @@ namespace op
// cv::cvtColor(image, imageGray, CV_BGR2GRAY);
const auto winSize = std::max(5,
(int)std::round(cv::norm(cv::Mat(points2DVector.at(0) - points2DVector.at(1)), cv::NORM_INF) / 4));
cv::cornerSubPix(image,
points2DVector,
cv::Size{winSize, winSize}, // Depending on the chessboard size;
// cv::Size{11,11}, // Default in code I got, used above one
cv::Size{-1,-1},
cv::TermCriteria{ CV_TERMCRIT_EPS+CV_TERMCRIT_ITER, 1000, 1e-9 });
// cv::TermCriteria{ CV_TERMCRIT_EPS+CV_TERMCRIT_ITER, 30, 0.1 }); // Default
cv::cornerSubPix(
image, points2DVector,
cv::Size{winSize, winSize}, // Depending on the chessboard size;
// cv::Size{11,11}, // Default in code I got, used above one
cv::Size{-1,-1},
cv::TermCriteria{ CV_TERMCRIT_EPS+CV_TERMCRIT_ITER, 1000, 1e-9 });
// cv::TermCriteria{ CV_TERMCRIT_EPS+CV_TERMCRIT_ITER, 30, 0.1 }); // Default
}
}
catch (const std::exception& e)
......@@ -29,8 +29,8 @@ namespace op
}
}
std::pair<bool, std::vector<cv::Point2f>> lightlyTryToFindGridCorners(const cv::Mat& image,
const cv::Size& gridInnerCorners)
std::pair<bool, std::vector<cv::Point2f>> tryToFindGridCorners(const cv::Mat& image,
const cv::Size& gridInnerCorners)
{
try
{
......@@ -59,21 +59,20 @@ namespace op
if (!image.empty())
{
std::tie(chessboardFound, points2DVector) = lightlyTryToFindGridCorners(image, gridInnerCorners);
std::tie(chessboardFound, points2DVector) = tryToFindGridCorners(image, gridInnerCorners);
if (!chessboardFound)
{
std::tie(chessboardFound, points2DVector) = lightlyTryToFindGridCorners(image, gridInnerCorners);
std::tie(chessboardFound, points2DVector) = tryToFindGridCorners(image, gridInnerCorners);
if (!chessboardFound)
{
// If not chessboardFound -> try sharpening the image
// std::cerr << "Grid not found, trying sharpening" << std::endl;
cv::Mat sharperedImage;
// hardcoded filter size, to be tested on 50 mm lens
cv::GaussianBlur(image, sharperedImage, cv::Size{0,0}, 105);
// hardcoded weight, to be tested.
cv::addWeighted(image, 1.8, sharperedImage, -0.8, 0, sharperedImage);
std::tie(chessboardFound, points2DVector) = lightlyTryToFindGridCorners(
std::tie(chessboardFound, points2DVector) = tryToFindGridCorners(
sharperedImage, gridInnerCorners);
}
}
......@@ -104,14 +103,13 @@ namespace op
if (!chessboardFound)
{
cv::Mat tempImage;
while (!chessboardFound) // 71 x 71 > 5000
auto counter = 0;
while (!chessboardFound && counter <= 2) // 3 pyrdown max
{
if (!tempImage.empty())
cv::pyrDown(tempImage, tempImage);
else
cv::pyrDown(image, tempImage);
cv::pyrDown((!tempImage.empty() ? tempImage : image), tempImage);
std::tie(chessboardFound, points2DVector) = mediumlyTryToFindGridCorners(
tempImage, gridInnerCorners);
counter++;
                        // Stop if the next pyrDown would leave an image smaller than ~71 x 71 px (area <= 20e3)
if (tempImage.size().area() <= 20e3)
......@@ -119,7 +117,8 @@ namespace op
}
if (chessboardFound && image.size().width != tempImage.size().width)
{
std::cerr << "Chessboard found at lower resolution: " << tempImage.size() << "px" << std::endl;
                    log("Chessboard found at lower resolution (" + std::to_string(tempImage.cols) + "x"
                        + std::to_string(tempImage.rows) + ").", Priority::High);
for (auto& point : points2DVector)
point *= (image.size().width / tempImage.size().width);
}
......@@ -207,8 +206,8 @@ namespace op
// Public functions
std::pair<bool, std::vector<cv::Point2f>> findAccurateGridCorners(const cv::Mat& image,
const cv::Size& gridInnerCorners)
std::pair<bool, std::vector<cv::Point2f>> findAccurateGridCorners(
const cv::Mat& image, const cv::Size& gridInnerCorners)
{
try
{
......@@ -233,8 +232,8 @@ namespace op
}
}
std::vector<cv::Point3f> getObjects3DVector(const cv::Size& gridInnerCorners,
const float gridSquareSizeMm)
std::vector<cv::Point3f> getObjects3DVector(
const cv::Size& gridInnerCorners, const float gridSquareSizeMm)
{
try
{
......@@ -284,8 +283,8 @@ namespace op
}
}
std::array<unsigned int, 4> getOutterCornerIndices(const std::vector<cv::Point2f>& points2DVector,
const cv::Size& gridInnerCorners)
std::array<unsigned int, 4> getOutterCornerIndices(
const std::vector<cv::Point2f>& points2DVector, const cv::Size& gridInnerCorners)
{
try
{
......@@ -310,8 +309,7 @@ namespace op
}
}
void reorderPoints(std::vector<cv::Point2f>& points2DVector,
const cv::Size& gridInnerCorners,
void reorderPoints(std::vector<cv::Point2f>& points2DVector, const cv::Size& gridInnerCorners,
const Points2DOrigin points2DOriginDesired)
{
try
......@@ -352,10 +350,9 @@ namespace op
}
}
void plotGridCorners(const cv::Size& gridInnerCorners,
const std::vector<cv::Point2f>& points2DVector,
const std::string& imagePath,
const cv::Mat& image)
void plotGridCorners(
const cv::Size& gridInnerCorners, const std::vector<cv::Point2f>& points2DVector,
const std::string& imagePath, const cv::Mat& image)
{
try
{
......
// #include <opencv2/opencv.hpp> // cv::imshow, cv::waitKey, cv::namedWindow, cv::setWindowProperty
#include <opencv2/highgui/highgui.hpp> // cv::imshow, cv::waitKey, cv::namedWindow, cv::setWindowProperty
#include <openpose/gui/frameDisplayer.hpp>
......
......@@ -315,9 +315,9 @@ namespace op
// BODY_23
std::vector<unsigned int>{
// Minimum spanning tree
0,1, 2,3, 4,5, 6,7, 8,9, 10,11, 12,13, 14,15, 16,17, 18,19, 20,21, 22,23, 24,25, 26,27, 28,29, 30,31, 32,33, 34,35, 36,37, 38,39, 40,41, 42,43,
0,1, 2,3, 4,5, 6,7, 8,9, 10,11, 12,13, 14,15, 16,17, 18,19, 20,21, 22,23, 24,25, 26,27, 28,29, 30,31, 32,33, 34,35, 36,37, 38,39, 40,41, 42,43,
// Redundant ones
44,45, 46,47, 48,49, 50,51, 52,53, 54,55, 56,57, 58,59, 60,61, 62,63, 64,65, 66,67, 68,69, 70,71, 72,73
44,45,46,47, 48,49,50,51, 52,53,54,55, 56,57,58,59, 60,61, 62,63, 64,65,66,67, 68,69, 70,71
},
// CAR_22
std::vector<unsigned int>{
......@@ -444,9 +444,10 @@ namespace op
// BODY_23
std::vector<unsigned int>{
// Minimum spanning tree
0,1, 1,2, 2,3, 0,4, 4,5, 5,6, 7,8, 8,9, 10,11, 11,12, 0,13, 13,15, 0,14, 14,16, 12,17, 17,18, 12,19, 9,20, 20,21, 9,22, 1,7, 4,10,
0,1, 1,2, 2,3, 0,4, 4,5, 5,6, 7,8, 8,9, 10,11, 11,12, 0,13, 13,15, 0,14, 14,16, 12,17, 17,18, 12,19, 9,20, 20,21, 9,22, 1,7, 4,10,
// Redundant ones
// Ears-shoulders, ears, hips, shoulders-wrists, hips-ankles, wrists, ankles, wrists-hips, small toes-ankles)
// +2.4% +0.2% +0.3% +0.1% -0.1% (bad!) +0.2% +0.2% +0.1% -0.1% (bad!)
1,15, 4,16, 15,16, 7,10, 1,3, 4,6, 7,9, 10,12, 3,6, 9,12, 3,7, 6,10, 9,21, 12,18
},
// CAR_22
......
......@@ -15,6 +15,10 @@ namespace op
const auto& mapIdx = getPoseMapIndex(poseModel);
const auto numberBodyParts = getPoseNumberBodyParts(poseModel);
const auto numberBodyPartsPlusBkg = numberBodyParts+1;
// Sanity check
if (bodyPartPairs.size() != mapIdx.size())
error("The variables bodyPartPairs and mapIdx should have the same size.",
__LINE__, __FUNCTION__, __FILE__);
// PAFs
for (auto bodyPart = 0u; bodyPart < bodyPartPairs.size(); bodyPart+=2)
......
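The sanity check added above exists because the PAF loop consumes `bodyPartPairs` two indices at a time and looks up the matching entry in `mapIdx`, so the two tables must stay the same length. A minimal sketch of that access pattern with hypothetical pair data (not the real model tables):

```cpp
#include <cstddef>
#include <stdexcept>
#include <vector>

// Walk a flat pair list two indices at a time, as the PAF loop above does.
// Throws if the two parallel tables diverge in size (the new sanity check).
std::size_t countPafPairs(const std::vector<unsigned int>& bodyPartPairs,
                          const std::vector<unsigned int>& mapIdx)
{
    if (bodyPartPairs.size() != mapIdx.size())
        throw std::runtime_error("bodyPartPairs and mapIdx must have the same size.");
    std::size_t pairs = 0;
    for (auto bodyPart = 0u; bodyPart < bodyPartPairs.size(); bodyPart += 2)
        ++pairs; // each iteration consumes one (partA, partB) pair
    return pairs;
}
```

A mismatched pair of tables fails fast here instead of producing an out-of-bounds read later in the PAF accumulation.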
......@@ -4,10 +4,10 @@
namespace op
{
FlirReader::FlirReader(const std::string& cameraParametersPath, const Point<int>& cameraResolution,
FlirReader::FlirReader(const std::string& cameraParameterPath, const Point<int>& cameraResolution,
const bool undistortImage, const int cameraIndex) :
Producer{ProducerType::FlirCamera},
mSpinnakerWrapper{cameraParametersPath, cameraResolution, undistortImage, cameraIndex},
Producer{ProducerType::FlirCamera, cameraParameterPath, undistortImage, -1},
mSpinnakerWrapper{cameraParameterPath, cameraResolution, undistortImage, cameraIndex},
mFrameNameCounter{0}
{
try
......
......@@ -31,85 +31,20 @@ namespace op
}
ImageDirectoryReader::ImageDirectoryReader(const std::string& imageDirectoryPath,
const unsigned int imageDirectoryStereo,
const std::string& cameraParameterPath) :
Producer{ProducerType::ImageDirectory},
const std::string& cameraParameterPath,
const bool undistortImage,
const int numberViews) :
Producer{ProducerType::ImageDirectory, cameraParameterPath, undistortImage, numberViews},
mImageDirectoryPath{imageDirectoryPath},
mImageDirectoryStereo{imageDirectoryStereo},
mFilePaths{getImagePathsOnDirectory(imageDirectoryPath)},
mFrameNameCounter{0ll}
{
try
{
// If stereo setting --> load camera parameters
if (imageDirectoryStereo > 1)
{
// Read camera parameters from SN
auto serialNumbers = getFilesOnDirectory(cameraParameterPath, ".xml");
// Sanity check
if (serialNumbers.size() != mImageDirectoryStereo && mImageDirectoryStereo > 1)
error("Found different number of camera parameter files than the number indicated by"
" `--3d_views` ("
+ std::to_string(serialNumbers.size()) + " vs. "
+ std::to_string(mImageDirectoryStereo) + "). Make them equal or add"
+ " `--3d_views 1`",
__LINE__, __FUNCTION__, __FILE__);
// Get serial numbers
for (auto& serialNumber : serialNumbers)
serialNumber = getFileNameNoExtension(serialNumber);
// Get camera paremeters
mCameraParameterReader.readParameters(cameraParameterPath, serialNumbers);
}
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
}
}
ImageDirectoryReader::~ImageDirectoryReader()
{
}
std::vector<cv::Mat> ImageDirectoryReader::getCameraMatrices()
{
try
{
return mCameraParameterReader.getCameraMatrices();
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return {};
}
}
std::vector<cv::Mat> ImageDirectoryReader::getCameraExtrinsics()
{
try
{
return mCameraParameterReader.getCameraExtrinsics();
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return {};
}
}
std::vector<cv::Mat> ImageDirectoryReader::getCameraIntrinsics()
{
try
{
return mCameraParameterReader.getCameraIntrinsics();
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return {};
}
}
std::string ImageDirectoryReader::getNextFrameName()
{
try
......@@ -138,6 +73,7 @@ namespace op
checkFrameIntegrity(frame);
// Update size, since images might have different size between each one of them
mResolution = Point<int>{frame.cols, frame.rows};
// Return final frame
return frame;
}
catch (const std::exception& e)
......@@ -152,7 +88,7 @@ namespace op
try
{
std::vector<cv::Mat> rawFrames;
for (auto i = 0u ; i < mImageDirectoryStereo ; i++)
for (auto i = 0 ; i < intRound(Producer::get(ProducerProperty::NumberViews)) ; i++)
rawFrames.emplace_back(getRawFrame());
return rawFrames;
}
......
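Producer properties such as `NumberViews` are stored as `double`, so the loops above read them back through a rounding helper. A minimal stand-in for the `intRound` used here (the real one lives in `openpose/utilities/fastMath.hpp`; this version is an illustrative sketch):

```cpp
#include <cmath>

// Round a double-typed producer property back to int, as the view loops above do.
int intRound(const double value)
{
    return static_cast<int>(std::lround(value));
}
```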
......@@ -6,8 +6,9 @@ namespace op
// http://iris.not.iac.es/axis-cgi/mjpg/video.cgi?resolution=320x240?x.mjpeg
// http://www.webcamxp.com/publicipcams.aspx
IpCameraReader::IpCameraReader(const std::string & cameraPath) :
VideoCaptureReader{cameraPath, ProducerType::IPCamera},
IpCameraReader::IpCameraReader(const std::string & cameraPath, const std::string& cameraParameterPath,
const bool undistortImage) :
VideoCaptureReader{cameraPath, ProducerType::IPCamera, cameraParameterPath, undistortImage, 1},
mPathName{cameraPath}
{
}
......@@ -16,45 +17,6 @@ namespace op
{
}
std::vector<cv::Mat> IpCameraReader::getCameraMatrices()
{
try
{
return {};
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return {};
}
}
std::vector<cv::Mat> IpCameraReader::getCameraExtrinsics()
{
try
{
return {};
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return {};
}
}
std::vector<cv::Mat> IpCameraReader::getCameraIntrinsics()
{
try
{
return {};
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return {};
}
}
std::string IpCameraReader::getNextFrameName()
{
try
......
#include <openpose/producer/headers.hpp>
#include <openpose/utilities/check.hpp>
#include <openpose/utilities/fastMath.hpp>
#include <openpose/producer/headers.hpp>
#include <openpose/utilities/fileSystem.hpp>
#include <openpose/producer/producer.hpp>
namespace op
......@@ -20,15 +21,54 @@ namespace op
}
}
Producer::Producer(const ProducerType type) :
Producer::Producer(const ProducerType type, const std::string& cameraParameterPath, const bool undistortImage,
const int numberViews) :
mType{type},
mProducerFpsMode{ProducerFpsMode::RetrievalFps},
mNumberEmptyFrames{0},
mTrackingFps{false}
{
mProperties[(unsigned char)ProducerProperty::AutoRepeat] = (double) false;
mProperties[(unsigned char)ProducerProperty::Flip] = (double) false;
mProperties[(unsigned char)ProducerProperty::Rotation] = 0.;
// Basic properties
mProperties[(unsigned int)ProducerProperty::AutoRepeat] = (double) false;
mProperties[(unsigned int)ProducerProperty::Flip] = (double) false;
mProperties[(unsigned int)ProducerProperty::Rotation] = 0.;
mProperties[(unsigned int)ProducerProperty::NumberViews] = numberViews;
auto& mNumberViews = mProperties[(unsigned int)ProducerProperty::NumberViews];
// Camera (distortion, intrinsic, and extrinsic) parameters
if (mType != ProducerType::FlirCamera)
{
// Undistort image?
mCameraParameterReader.setUndistortImage(undistortImage);
// If no stereo --> Set to 1
if (mNumberViews <= 0)
mNumberViews = 1;
// Get camera parameters
if (mNumberViews > 1 || undistortImage)
{
const auto extension = getFileExtension(cameraParameterPath);
// Get camera parameters
if (extension == "xml" || extension == "XML")
mCameraParameterReader.readParameters(
getFileParentFolderPath(cameraParameterPath), getFileNameNoExtension(cameraParameterPath));
else // if (mNumberViews > 1)
{
// Read camera parameters from SN
auto serialNumbers = getFilesOnDirectory(cameraParameterPath, ".xml");
// Get serial numbers
for (auto& serialNumber : serialNumbers)
serialNumber = getFileNameNoExtension(serialNumber);
// Get camera parameters
mCameraParameterReader.readParameters(cameraParameterPath, serialNumbers);
}
// Sanity check
if ((int)mCameraParameterReader.getNumberCameras() != mNumberViews)
error("Found a different number of camera parameter files than the number indicated by"
" `--3d_views` ("
+ std::to_string(mCameraParameterReader.getNumberCameras()) + " vs. "
+ std::to_string(mNumberViews) + "). Make sure both numbers match.",
__LINE__, __FUNCTION__, __FILE__);
}
}
}
Producer::~Producer(){}
......@@ -61,6 +101,12 @@ namespace op
keepDesiredFrameRate();
// Get frame
frames = getRawFrames();
// Undistort frames
// TODO: Multi-thread if > 1 frame
for (auto i = 0u ; i < frames.size() ; i++)
if (!frames[i].empty() && mCameraParameterReader.getUndistortImage())
mCameraParameterReader.undistort(frames[i], i);
// Post-process frames
for (auto& frame : frames)
{
// Flip + rotate frame
......@@ -86,6 +132,45 @@ namespace op
}
}
std::vector<cv::Mat> Producer::getCameraMatrices()
{
try
{
return mCameraParameterReader.getCameraMatrices();
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return {};
}
}
std::vector<cv::Mat> Producer::getCameraExtrinsics()
{
try
{
return mCameraParameterReader.getCameraExtrinsics();
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return {};
}
}
std::vector<cv::Mat> Producer::getCameraIntrinsics()
{
try
{
return mCameraParameterReader.getCameraIntrinsics();
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return {};
}
}
void Producer::setProducerFpsMode(const ProducerFpsMode fpsMode)
{
try
......@@ -197,8 +282,8 @@ namespace op
|| (frame.rows != get(CV_CAP_PROP_FRAME_HEIGHT) && get(CV_CAP_PROP_FRAME_HEIGHT) > 0)))
{
log("Frame size changed. Returning empty frame.\nExpected vs. received sizes: "
+ std::to_string(get(CV_CAP_PROP_FRAME_WIDTH))
+ "x" + std::to_string(get(CV_CAP_PROP_FRAME_HEIGHT))
+ std::to_string(intRound(get(CV_CAP_PROP_FRAME_WIDTH)))
+ "x" + std::to_string(intRound(get(CV_CAP_PROP_FRAME_HEIGHT)))
+ " vs. " + std::to_string(frame.cols) + "x" + std::to_string(frame.rows),
Priority::Max, __LINE__, __FUNCTION__, __FILE__);
frame = cv::Mat();
......@@ -354,7 +439,7 @@ namespace op
std::shared_ptr<Producer> createProducer(const ProducerType producerType, const std::string& producerString,
const Point<int>& cameraResolution,
const std::string& cameraParameterPath, const bool undistortImage,
const unsigned int imageDirectoryStereo)
const int numberViews)
{
try
{
......@@ -363,14 +448,14 @@ namespace op
// Directory of images
if (producerType == ProducerType::ImageDirectory)
return std::make_shared<ImageDirectoryReader>(
producerString, imageDirectoryStereo, cameraParameterPath);
producerString, cameraParameterPath, undistortImage, numberViews);
// Video
else if (producerType == ProducerType::Video)
return std::make_shared<VideoReader>(
producerString, imageDirectoryStereo, cameraParameterPath);
producerString, cameraParameterPath, undistortImage, numberViews);
// IP camera
else if (producerType == ProducerType::IPCamera)
return std::make_shared<IpCameraReader>(producerString);
return std::make_shared<IpCameraReader>(producerString, cameraParameterPath, undistortImage);
// Flir camera
else if (producerType == ProducerType::FlirCamera)
return std::make_shared<FlirReader>(
......@@ -386,7 +471,8 @@ namespace op
{
const auto throwExceptionIfNoOpened = true;
return std::make_shared<WebcamReader>(
webcamIndex, cameraResolutionFinal, throwExceptionIfNoOpened);
webcamIndex, cameraResolutionFinal, throwExceptionIfNoOpened, cameraParameterPath,
undistortImage);
}
else
{
......@@ -395,7 +481,8 @@ namespace op
for (auto index = 0 ; index < 10 ; index++)
{
webcamReader = std::make_shared<WebcamReader>(
index, cameraResolutionFinal, throwExceptionIfNoOpened);
index, cameraResolutionFinal, throwExceptionIfNoOpened, cameraParameterPath,
undistortImage);
if (webcamReader->isOpened())
{
log("Auto-detecting camera index... Detected and opened camera " + std::to_string(index)
......
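The `Producer` constructor above branches on whether `camera_parameter_path` names a single XML file (one video, webcam, or image camera) or a folder holding one XML per camera serial number. A hedged sketch of that dispatch, using only the extension test (helper name is illustrative, not part of the OpenPose API):

```cpp
#include <string>

// True when the path names one XML file directly, false when it names a
// folder of per-serial-number XML files, mirroring the branch above.
bool isSingleCameraXml(const std::string& cameraParameterPath)
{
    const auto slashPos = cameraParameterPath.find_last_of("/\\");
    const auto dotPos = cameraParameterPath.find_last_of('.');
    // No extension after the last path separator --> treat as a folder
    if (dotPos == std::string::npos
        || (slashPos != std::string::npos && dotPos < slashPos))
        return false;
    const auto extension = cameraParameterPath.substr(dotPos + 1);
    return extension == "xml" || extension == "XML";
}
```

This matches the flag documentation: `--camera_parameter_path` must end in `.xml` for the single-camera case and point at a folder otherwise.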
......@@ -5,8 +5,10 @@
namespace op
{
VideoCaptureReader::VideoCaptureReader(const int index, const bool throwExceptionIfNoOpened) :
Producer{ProducerType::Webcam}
VideoCaptureReader::VideoCaptureReader(const int index, const bool throwExceptionIfNoOpened,
const std::string& cameraParameterPath, const bool undistortImage,
const int numberViews) :
Producer{ProducerType::Webcam, cameraParameterPath, undistortImage, numberViews}
{
try
{
......@@ -18,8 +20,10 @@ namespace op
}
}
VideoCaptureReader::VideoCaptureReader(const std::string& path, const ProducerType producerType) :
Producer{producerType},
VideoCaptureReader::VideoCaptureReader(const std::string& path, const ProducerType producerType,
const std::string& cameraParameterPath, const bool undistortImage,
const int numberViews) :
Producer{producerType, cameraParameterPath, undistortImage, numberViews},
mVideoCapture{path}
{
try
......
#include <openpose/utilities/fastMath.hpp>
#include <openpose/utilities/fileSystem.hpp>
#include <openpose/producer/videoReader.hpp>
namespace op
{
VideoReader::VideoReader(const std::string & videoPath, const unsigned int imageDirectoryStereo,
const std::string& cameraParameterPath) :
VideoCaptureReader{videoPath, ProducerType::Video},
mImageDirectoryStereo{imageDirectoryStereo},
VideoReader::VideoReader(const std::string& videoPath, const std::string& cameraParameterPath,
const bool undistortImage, const int numberViews) :
VideoCaptureReader{videoPath, ProducerType::Video, cameraParameterPath, undistortImage, numberViews},
mPathName{getFileNameNoExtension(videoPath)}
{
try
{
// If stereo setting --> load camera parameters
if (imageDirectoryStereo > 1)
{
// Read camera parameters from SN
auto serialNumbers = getFilesOnDirectory(cameraParameterPath, ".xml");
// Sanity check
if (serialNumbers.size() != mImageDirectoryStereo && mImageDirectoryStereo > 1)
error("Found different number of camera parameter files than the number indicated by"
" `--3d_views` ("
+ std::to_string(serialNumbers.size()) + " vs. "
+ std::to_string(mImageDirectoryStereo) + "). Make them equal or add"
+ " `--3d_views 1`",
__LINE__, __FUNCTION__, __FILE__);
// Get serial numbers
for (auto& serialNumber : serialNumbers)
serialNumber = getFileNameNoExtension(serialNumber);
// Get camera paremeters
mCameraParameterReader.readParameters(cameraParameterPath, serialNumbers);
// Set video size
set(CV_CAP_PROP_FRAME_WIDTH, get(CV_CAP_PROP_FRAME_WIDTH)/mImageDirectoryStereo);
}
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
}
}
VideoReader::~VideoReader()
{
}
std::vector<cv::Mat> VideoReader::getCameraMatrices()
{
try
{
return mCameraParameterReader.getCameraMatrices();
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return {};
}
}
std::vector<cv::Mat> VideoReader::getCameraExtrinsics()
{
try
{
return mCameraParameterReader.getCameraExtrinsics();
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return {};
}
}
std::vector<cv::Mat> VideoReader::getCameraIntrinsics()
{
try
{
return mCameraParameterReader.getCameraIntrinsics();
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return {};
}
}
std::string VideoReader::getNextFrameName()
{
try
......@@ -100,7 +33,7 @@ namespace op
try
{
if (capProperty == CV_CAP_PROP_FRAME_WIDTH)
return VideoCaptureReader::get(capProperty) / mImageDirectoryStereo;
return VideoCaptureReader::get(capProperty) / intRound(Producer::get(ProducerProperty::NumberViews));
else
return VideoCaptureReader::get(capProperty);
}
......@@ -115,10 +48,7 @@ namespace op
{
try
{
if (capProperty == CV_CAP_PROP_FRAME_WIDTH)
return VideoCaptureReader::set(capProperty, value * mImageDirectoryStereo);
else
VideoCaptureReader::set(capProperty, value);
VideoCaptureReader::set(capProperty, value);
}
catch (const std::exception& e)
{
......@@ -143,14 +73,15 @@ namespace op
{
try
{
const auto numberViews = intRound(Producer::get(ProducerProperty::NumberViews));
auto cvMats = VideoCaptureReader::getRawFrames();
// Split image
if (cvMats.size() == 1 && mImageDirectoryStereo > 1)
if (cvMats.size() == 1 && numberViews > 1)
{
cv::Mat cvMatConcatenated = cvMats.at(0);
cvMats.clear();
const auto individualWidth = cvMatConcatenated.cols/mImageDirectoryStereo;
for (auto i = 0u ; i < mImageDirectoryStereo ; i++)
const auto individualWidth = cvMatConcatenated.cols/numberViews;
for (auto i = 0 ; i < numberViews ; i++)
cvMats.emplace_back(
cv::Mat(cvMatConcatenated,
cv::Rect{(int)(i*individualWidth), 0,
......@@ -158,9 +89,9 @@ namespace op
(int)cvMatConcatenated.rows}));
}
// Sanity check
else if (cvMats.size() != 1 && mImageDirectoryStereo > 1)
error("Unexpected error. Notify us (" + std::to_string(mImageDirectoryStereo) + " vs. "
+ std::to_string(mImageDirectoryStereo) + ").", __LINE__, __FUNCTION__, __FILE__);
else if (cvMats.size() != 1u && numberViews > 1)
error("Unexpected error. Notify us (" + std::to_string(cvMats.size()) + " vs. "
+ std::to_string(numberViews) + ").", __LINE__, __FUNCTION__, __FILE__);
// Return images
return cvMats;
}
......
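`VideoReader::getRawFrames` above splits one horizontally concatenated multi-view frame into `numberViews` side-by-side crops. The rectangle arithmetic behind those `cv::Rect`s can be sketched without OpenCV (struct and function names here are illustrative):

```cpp
#include <vector>

struct ViewRect { int x; int width; };

// Compute the x-offset and width of each per-view crop for a frame of
// totalWidth pixels, mirroring the cv::Rect computation above.
std::vector<ViewRect> splitViews(const int totalWidth, const int numberViews)
{
    std::vector<ViewRect> rects;
    const auto individualWidth = totalWidth / numberViews;
    for (auto i = 0; i < numberViews; i++)
        rects.push_back(ViewRect{i * individualWidth, individualWidth});
    return rects;
}
```

E.g., a 3-view recording stored as one 1920-pixel-wide video yields three 640-pixel crops at x = 0, 640, and 1280, which is also why `get(CV_CAP_PROP_FRAME_WIDTH)` above divides by `NumberViews`.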
......@@ -6,8 +6,9 @@
namespace op
{
WebcamReader::WebcamReader(const int webcamIndex, const Point<int>& webcamResolution,
const bool throwExceptionIfNoOpened) :
VideoCaptureReader{webcamIndex, throwExceptionIfNoOpened},
const bool throwExceptionIfNoOpened, const std::string& cameraParameterPath,
const bool undistortImage) :
VideoCaptureReader{webcamIndex, throwExceptionIfNoOpened, cameraParameterPath, undistortImage, 1},
mIndex{webcamIndex},
mFrameNameCounter{-1},
mThreadOpened{std::atomic<bool>{false}},
......@@ -65,45 +66,6 @@ namespace op
}
}
std::vector<cv::Mat> WebcamReader::getCameraMatrices()
{
try
{
return {};
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return {};
}
}
std::vector<cv::Mat> WebcamReader::getCameraExtrinsics()
{
try
{
return {};
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return {};
}
}
std::vector<cv::Mat> WebcamReader::getCameraIntrinsics()
{
try
{
return {};
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return {};
}
}
std::string WebcamReader::getNextFrameName()
{
try
......
......@@ -6,8 +6,7 @@ namespace op
const ProducerType producerType_, const std::string& producerString_, const unsigned long long frameFirst_,
const unsigned long long frameStep_, const unsigned long long frameLast_, const bool realTimeProcessing_,
const bool frameFlip_, const int frameRotate_, const bool framesRepeat_, const Point<int>& cameraResolution_,
const std::string& cameraParameterPath_, const bool undistortImage_,
const unsigned int imageDirectoryStereo_) :
const std::string& cameraParameterPath_, const bool undistortImage_, const int numberViews_) :
producerType{producerType_},
producerString{producerString_},
frameFirst{frameFirst_},
......@@ -20,7 +19,7 @@ namespace op
cameraResolution{cameraResolution_},
cameraParameterPath{cameraParameterPath_},
undistortImage{undistortImage_},
imageDirectoryStereo{imageDirectoryStereo_}
numberViews{numberViews_}
{
}
}