Commit 4d278f96 authored by: G gineshidalgo99

Extrinsics calib added

Parent e9754167
......@@ -5,8 +5,9 @@ OpenPose Calibration Module and Demo
1. [Introduction](#introduction)
2. [Installing the Calibration Module](#installing-the-calibration-module)
3. [Running Calibration](#running-calibration)
1. [Step 1 - Distortion and Intrinsic Parameter Calibration](#step-1---distortion-and-intrinsic-parameter-calibration)
2. [Step 2 - Extrinsic Parameter Calibration](#step-2---extrinsic-parameter-calibration)
1. [General Quality Tips](#general-quality-tips)
2. [Step 1 - Distortion and Intrinsic Parameter Calibration](#step-1---distortion-and-intrinsic-parameter-calibration)
3. [Step 2 - Extrinsic Parameter Calibration](#step-2---extrinsic-parameter-calibration)
4. [Using a Different Camera Brand](#using-a-different-camera-brand)
......@@ -26,13 +27,16 @@ Check [doc/installation.md#calibration-module](./installation.md#calibration-mod
## Running Calibration
Note: In order to maximize calibration quality, **do not reuse the same video sequence for both intrinsic and extrinsic parameter estimation**. Intrinsic parameter calibration should be run camera by camera; each recorded video sequence should focus on covering all regions of the camera view, repeated from several distances. The extrinsic video sequence should instead ensure that the chessboard is visible from at least 2 cameras at a time. So for a 3-camera calibration, you need 1 video sequence per camera as well as a final sequence for the extrinsic parameter calibration.
### General Quality Tips
1. Keep the same orientation of the chessboard, i.e., do not rotate it circularly more than ~15-30 degrees. Our algorithm assumes that the origin is the top-left corner, so rotating the chessboard circularly changes this origin across frames, resulting in many frames being rejected for the final calibration, i.e., lower calibration accuracy.
2. Cover several distances, and within each distance, cover all parts of the image view (all corners and center).
3. Save the images in PNG format (default behavior) in order to improve calibration quality. PNG images are bigger than their JPG equivalents, but they do not lose information to compression.
4. Use a chessboard as big as possible, ideally one with at least 8x6 squares and a square size of at least 100 millimeters. It especially affects the extrinsic calibration quality.
5. Intrinsics: about 400 image views are recommended for a high-quality calibration. You should gather at least 150 images for a good calibration, but no more than 500. Calibrating a camera takes about 3 minutes with about 100 images, about 1.5 h with 200 images, and about 9.5 h with 450 images. The required RAM also grows exponentially with the number of images.
6. Extrinsics: at least 250 images per camera are recommended for a high-quality calibration.
### Step 1 - Distortion and Intrinsic Parameter Calibration
1. Run OpenPose and save images for your desired camera. Use a grid (chessboard) pattern and move it around the whole image area.
1. Quality tips:
1. Keep the same orientation of the chessboard, i.e., do not rotate it circularly more than ~15-30 degrees. Our algorithm assumes that the origin is the top-left corner, so rotating the chessboard circularly changes this origin across frames, resulting in many frames being rejected for the final calibration, i.e., lower calibration accuracy.
2. Cover several distances, and within each distance, cover all parts of the image view (all corners and center).
3. Save the images in PNG format (default behavior) in order to improve calibration quality. PNG images are bigger than their JPG equivalents, but they do not lose information to compression.
4. About 400 image views are recommended for a high-quality calibration. You should gather at least 150 images for a good calibration, but no more than 500. Calibrating a camera takes about 3 minutes with about 100 images, about 1.5 h with 200 images, and about 9.5 h with 450 images. The required RAM also grows exponentially with the number of images.
2. Changing image source:
1. Webcam calibration: `./build/examples/openpose/openpose.bin --num_gpu 0 --frame_keep_distortion --write_images {intrinsic_images_folder_path}`.
2. Flir camera calibration: Add the flags `--flir_camera --flir_camera_index 0` (or the desired flir camera index) to the webcam command.
......@@ -44,7 +48,7 @@ Note: In order to maximize calibration quality, **do not reuse the same video se
```
3. Extract and save the intrinsic parameters:
```
./build/examples/calibration/calibration.bin --mode 1 --grid_square_size_mm 40.0 --grid_number_inner_corners "9x5" --camera_serial_number 18079958 --intrinsics_image_dir {intrinsic_images_folder_path}
./build/examples/calibration/calibration.bin --mode 1 --grid_square_size_mm 40.0 --grid_number_inner_corners "9x5" --camera_serial_number 18079958 --calibration_image_dir {intrinsic_images_folder_path}
```
4. In this case, the intrinsic parameters would have been generated as {intrinsic_images_folder_path}/18079958.xml.
5. Run steps 1-4 for each one of your cameras.
......@@ -64,10 +68,10 @@ Note: In order to maximize calibration quality, **do not reuse the same video se
./build/examples/openpose/openpose.bin --num_gpu 0 --frame_keep_distortion --flir_camera --flir_camera_index 3 --write_images ~/Desktop/intrinsics_3
# Run calibration
# - Note: If your computer has enough RAM, you can run all of the calibrations at the same time in order to speed up the overall process (they are not internally multi-threaded).
./build/examples/calibration/calibration.bin --mode 1 --grid_square_size_mm 127.0 --grid_number_inner_corners "9x6" --camera_serial_number 17012332 --intrinsics_image_dir ~/Desktop/intrinsics_0
./build/examples/calibration/calibration.bin --mode 1 --grid_square_size_mm 127.0 --grid_number_inner_corners "9x6" --camera_serial_number 17092861 --intrinsics_image_dir ~/Desktop/intrinsics_1
./build/examples/calibration/calibration.bin --mode 1 --grid_square_size_mm 127.0 --grid_number_inner_corners "9x6" --camera_serial_number 17092865 --intrinsics_image_dir ~/Desktop/intrinsics_2
./build/examples/calibration/calibration.bin --mode 1 --grid_square_size_mm 127.0 --grid_number_inner_corners "9x6" --camera_serial_number 18079957 --intrinsics_image_dir ~/Desktop/intrinsics_3
./build/examples/calibration/calibration.bin --mode 1 --grid_square_size_mm 127.0 --grid_number_inner_corners "9x6" --camera_serial_number 17012332 --calibration_image_dir ~/Desktop/intrinsics_0
./build/examples/calibration/calibration.bin --mode 1 --grid_square_size_mm 127.0 --grid_number_inner_corners "9x6" --camera_serial_number 17092861 --calibration_image_dir ~/Desktop/intrinsics_1
./build/examples/calibration/calibration.bin --mode 1 --grid_square_size_mm 127.0 --grid_number_inner_corners "9x6" --camera_serial_number 17092865 --calibration_image_dir ~/Desktop/intrinsics_2
./build/examples/calibration/calibration.bin --mode 1 --grid_square_size_mm 127.0 --grid_number_inner_corners "9x6" --camera_serial_number 18079957 --calibration_image_dir ~/Desktop/intrinsics_3
# Visualize undistorted images
# - Camera parameters will be saved on their respective serial number files, so OpenPose will automatically find them
./build/examples/openpose/openpose.bin --num_gpu 0 --flir_camera
......@@ -76,7 +80,30 @@ Note: In order to maximize calibration quality, **do not reuse the same video se
### Step 2 - Extrinsic Parameter Calibration
1. We are still implementing this part. Documentation will be available after completing it.
1. After intrinsic calibration, save undistorted images for all the camera views:
```
./build/examples/openpose/openpose.bin --num_gpu 0 --flir_camera --write_images ~/Desktop/extrinsics
```
2. Run the extrinsic calibration tool between each pair of close cameras. In this example:
- We assume camera 0 on the right, 1 in the middle-right, 2 in the middle-left, and 3 on the left.
- We assume camera 1 as the coordinate origin.
```
./build/examples/calibration/calibration.bin --mode 2 --grid_square_size_mm 127.0 --grid_number_inner_corners 9x6 --omit_distortion --calibration_image_dir ~/Desktop/extrinsics/ --cam0 1 --cam1 0
./build/examples/calibration/calibration.bin --mode 2 --grid_square_size_mm 127.0 --grid_number_inner_corners 9x6 --omit_distortion --calibration_image_dir ~/Desktop/extrinsics/ --cam0 1 --cam1 2
./build/examples/calibration/calibration.bin --mode 2 --grid_square_size_mm 127.0 --grid_number_inner_corners 9x6 --omit_distortion --calibration_image_dir ~/Desktop/extrinsics/ --cam0 1 --cam1 3
# Potentially more accurate alternative for the calibration between cameras 1 and 3: if cameras 1 and 3 are too far apart and the calibration chessboard is not visible from both of them at the same time often enough, the calibration can instead be run between cameras 2 and 3 (camera 2 being closer to camera 3). In that case, the `combine_cam0_extrinsics` flag is required; it tells the calibration toolbox that cam0 is not the global origin (here the global origin is camera 1).
# Note: Wait until calibration of camera index 2 with respect to 1 is completed, as information from camera 2 XML calibration file will be used:
./build/examples/calibration/calibration.bin --mode 2 --grid_square_size_mm 127.0 --grid_number_inner_corners 9x6 --omit_distortion --calibration_image_dir ~/Desktop/extrinsics/ --cam0 2 --cam1 3 --combine_cam0_extrinsics
```
3. Hints to verify that the extrinsic calibration was successful:
1. Translation vector - Global distance:
1. Manually open each one of the generated XML files from the folder indicated by the flag `--camera_parameter_folder` (or the default one indicated by the `--help` flag if the former was not used).
2. The field `CameraMatrix` is a 3 x 4 matrix (you can see that the subfield `rows` in that file is 3 and `cols` is 4).
3. Order the matrix in that 3 x 4 shape (e.g., by copying it into a separate text file with 3 rows and 4 columns).
4. The first 3 components of the last column of the `CameraMatrix` field define the global `translation` (in meters) with respect to the global origin (in our case, camera 1).
5. Thus, the distance between that camera and the origin camera 1 should be approximately equal to the L2-norm of the `translation` vector.
2. Translation vector - Relative x-y-z distances:
1. The 3x1 `translation` vector represents the `x`, `y`, and `z` distances to the origin camera, respectively. The camera looks along the positive `z` axis, the `y` axis points down, and the `x` axis points right. These distances should match the real ones between the two cameras.
......
......@@ -25,11 +25,23 @@ DEFINE_int32(logging_level, 3, "The logging level. Inte
" low priority messages and 4 for important ones.");
// Calibration
DEFINE_int32(mode, 1, "Select 1 for intrinsic camera parameter calibration, 2 for extrinsic ones.");
DEFINE_string(intrinsics_image_dir, "images/intrinsics_18079958/", "Directory where the images for intrinsic calibration are placed.");
DEFINE_double(grid_square_size_mm, 40.0, "Chessboard square length (in millimeters).");
DEFINE_string(grid_number_inner_corners,"9x5", "Number of inner squares in width and height, i.e., number of total squares in width"
DEFINE_string(calibration_image_dir, "images/intrinsics/", "Directory where the images for camera parameter calibration are placed.");
DEFINE_double(grid_square_size_mm, 127.0, "Chessboard square length (in millimeters).");
DEFINE_string(grid_number_inner_corners,"9x6", "Number of inner corners in width and height, i.e., number of total squares in width"
" and height minus 1.");
// Mode 1 - Intrinsics
DEFINE_string(camera_serial_number, "18079958", "Camera serial number.");
// Mode 2 - Extrinsics
DEFINE_bool(omit_distortion, false, "Set to true if image views are already undistorted (e.g., if recorded from OpenPose"
" after intrinsic parameter calibration).");
DEFINE_bool(combine_cam0_extrinsics, false, "Set to true if cam0 extrinsics are not [R=I, t=0]. It will have no effect if cam0 is"
 " already the origin. See doc/calibration_demo.md for an example.");
DEFINE_int32(cam0, 1, "Baseline camera for extrinsic calibration; cam1 will be calibrated assuming cam0 as the"
 " world coordinate origin.");
DEFINE_int32(cam1, 0, "Target camera whose extrinsic parameters will be estimated; it will be calibrated assuming"
 " cam0 as the world coordinate origin.");
// // Mode 3
// DEFINE_int32(number_cameras, 4, "Number of cameras.");
// Producer
DEFINE_string(camera_parameter_folder, "models/cameraParameters/flir/", "String with the folder where the camera parameters are or will be"
" located.");
......@@ -51,23 +63,53 @@ int openPoseDemo()
{
op::log("Running calibration (intrinsic parameters)...", op::Priority::High);
// Obtain & save intrinsics
const auto gridInnerCorners = op::flagsToPoint(FLAGS_grid_number_inner_corners, "9x5");
// const auto flags = 0; // 5 parameters
const auto flags = cv::CALIB_RATIONAL_MODEL; // 8 parameters
// const auto flags = cv::CALIB_RATIONAL_MODEL | cv::CALIB_THIN_PRISM_MODEL; // 12 parameters
// const auto flags = cv::CALIB_RATIONAL_MODEL | cv::CALIB_THIN_PRISM_MODEL | cv::CALIB_TILTED_MODEL; // 14 parameters
const auto saveImagesWithCorners = false;
const auto gridInnerCorners = op::flagsToPoint(FLAGS_grid_number_inner_corners, "12x7");
// const auto flags = 0; // 5 parameters
const auto flags = cv::CALIB_RATIONAL_MODEL; // 8 parameters
// const auto flags = cv::CALIB_RATIONAL_MODEL | cv::CALIB_THIN_PRISM_MODEL; // 12 parameters
// const auto flags = cv::CALIB_RATIONAL_MODEL | cv::CALIB_THIN_PRISM_MODEL | cv::CALIB_TILTED_MODEL; // 14
// const auto saveImagesWithCorners = false;
const auto saveImagesWithCorners = true;
// Run camera calibration code
op::estimateAndSaveIntrinsics(gridInnerCorners, FLAGS_grid_square_size_mm, flags,
op::formatAsDirectory(FLAGS_camera_parameter_folder),
op::formatAsDirectory(FLAGS_intrinsics_image_dir),
op::formatAsDirectory(FLAGS_calibration_image_dir),
FLAGS_camera_serial_number, saveImagesWithCorners);
op::log("Intrinsic calibration completed!", op::Priority::High);
}
// Calibration - Extrinsics
else if (FLAGS_mode == 2)
op::error("Unimplemented yet.", __LINE__, __FUNCTION__, __FILE__);
{
op::log("Running calibration (extrinsic parameters)...", op::Priority::High);
// Parameters
op::estimateAndSaveExtrinsics(FLAGS_camera_parameter_folder,
op::formatAsDirectory(FLAGS_calibration_image_dir),
op::flagsToPoint(FLAGS_grid_number_inner_corners, "12x7"),
FLAGS_grid_square_size_mm,
FLAGS_cam0,
FLAGS_cam1,
FLAGS_omit_distortion,
FLAGS_combine_cam0_extrinsics);
op::log("Extrinsic calibration completed!", op::Priority::High);
}
// // Calibration - Extrinsics for Visual SFM
// else if (FLAGS_mode == 3)
// {
// op::log("Running calibration (intrinsic parameters)...", op::Priority::High);
// // Obtain & save intrinsics
// const auto gridInnerCorners = op::flagsToPoint(FLAGS_grid_number_inner_corners, "12x7");
// const auto saveImagesWithCorners = false;
// // const auto saveImagesWithCorners = true;
// // Run camera calibration code
// op::estimateAndSaveSiftFile(gridInnerCorners,
// op::formatAsDirectory(FLAGS_camera_parameter_folder),
// FLAGS_number_cameras,
// saveImagesWithCorners);
// op::log("Intrinsic calibration completed!", op::Priority::High);
// }
else
op::error("Unknown `--mode " + std::to_string(FLAGS_mode) + "`.", __LINE__, __FUNCTION__, __FILE__);
......
......@@ -17,13 +17,17 @@ namespace op
const cv::Mat& cameraDistortion,
const cv::Mat& cameraExtrinsics = cv::Mat());
// serialNumbers is optional. If empty, it will load all the XML files available in the
// cameraParameterPath folder
void readParameters(const std::string& cameraParameterPath,
const std::vector<std::string>& serialNumbers);
const std::vector<std::string>& serialNumbers = {});
void writeParameters(const std::string& cameraParameterPath) const;
unsigned long long getNumberCameras() const;
const std::vector<std::string>& getCameraSerialNumbers() const;
const std::vector<cv::Mat>& getCameraMatrices() const;
const std::vector<cv::Mat>& getCameraExtrinsics() const;
......
......@@ -20,6 +20,20 @@ namespace op
const std::string& imagesFolder,
const std::string& serialNumber,
const bool saveImagesWithCorners = false);
OP_API void estimateAndSaveExtrinsics(const std::string& intrinsicsFolder,
const std::string& extrinsicsImagesFolder,
const Point<int>& gridInnerCorners,
const float gridSquareSizeMm,
const int index0,
const int index1,
const bool imagesAreUndistorted,
const bool combineCam0Extrinsics);
OP_API void estimateAndSaveSiftFile(const Point<int>& gridInnerCorners,
const std::string& imagesFolder,
const int numberCameras,
const bool saveImagesWithCorners = false);
}
#endif // OPENPOSE_CALIBRATION_CAMERA_PARAMETER_ESTIMATION_HPP
......@@ -32,7 +32,6 @@ namespace op
OP_API void plotGridCorners(const cv::Size& gridInnerCorners,
const std::vector<cv::Point2f>& points2DVector,
const bool gridIsMirrored,
const std::string& imagePath,
const cv::Mat& image);
}
......
#include <openpose/filestream/fileStream.hpp>
#include <openpose/utilities/fileSystem.hpp>
#include <openpose/3d/cameraParameterReader.hpp>
namespace op
{
// // User configurable parameters
// // Camera matrices (extrinsic parameters) - rotation and pose orientation between cameras
// // Camera 1
// const cv::Mat M_1_1 = (cv::Mat_<double>(3, 4) << 1, 0, 0, 0,
// 0, 1, 0, 0,
// 0, 0, 1, 0);
// // Not working on Windows
// // const cv::Mat M_1_1 = cv::Mat::eye(3, 4, CV_64F);
// // From camera 1 to 2
// const cv::Mat M_1_2 = (cv::Mat_<double>(3, 4)
// << 0.999962504862692, -0.00165862051503619, 0.00849928507093793, -238.301309354482,
// 0.00176155163779584, 0.999925029704659, -0.0121174215889211, 4.75863886121558,
// -0.00847854967298925, 0.0121319391740716, 0.999890459124058, 15.9219925821916);
// // From camera 1 to 3
// const cv::Mat M_1_3 = (cv::Mat_<double>(3, 4)
// << 0.995809442124071, -0.000473104796892308, 0.0914512501193800, -461.301274485705,
// 0.00165046455210419, 0.999916727562850, -0.0127989806923977, 6.22648121362088,
// -0.0914375794917412, 0.0128962828696210, 0.995727299487585, 63.4911132860733);
// // From camera 2 to 3
// const cv::Mat M_2_3 = (cv::Mat_<double>(3, 4)
// << 0.999644115423621, -0.00194501088674130, -0.0266056278177532, -235.236375502202,
// 0.00201646110733780, 0.999994431880356, 0.00265896462686206, 9.52238656728889,
// 0.0266003079592876, -0.00271166755609303, 0.999642471324391, -4.23534963077479);
// // Intrinsic and distortion parameters
// // Camera 1 parameters
// const cv::Mat INTRINSIC_1 = (cv::Mat_<double>(3, 3) << 817.93481631740565, 0, 600.70689997785121,
// 0, 816.51774059837908, 517.84529566329593,
// 0, 0, 1);
// const cv::Mat DISTORTION_1 = (cv::Mat_<double>(8, 1) <<
// -1.8102158829399091, 9.1966147162623262, -0.00044293900343777355, 0.0013638377686816653,
// 1.3303863414979364, -1.418905163635487, 8.4725535468475819, 4.7911023525901033);
// // Camera 2 parameters
// const cv::Mat INTRINSIC_2 = (cv::Mat_<double>(3, 3) << 816.20921132436638, 0, 612.67087968681585,
// 0, 816.18292222910486, 530.47901782670431,
// 0, 0, 1);
// const cv::Mat DISTORTION_2 = (cv::Mat_<double>(8, 1) <<
// -5.1088507540294881, 133.63995617304997, -0.0010048069080912836, 0.00018825291386406282,
// 20.688286893903879, -4.7604289550474768, 132.42412342224557, 70.01195364029752);
// const cv::Mat INTRINSIC_3 = (cv::Mat_<double>(3, 3) << 798.42980806905666, 0, 646.48130011561727,
// 0, 798.46535448393979, 523.91590563194586,
// 0, 0, 1);
// // Camera 3
// const cv::Mat DISTORTION_3 = (cv::Mat_<double>(8, 1) <<
// -0.57530495294002304, -0.54721992620722582, -0.00037614702677289967, -0.00081995658363481598,
// -0.020321660897680775, -0.18040544059116842, -0.87724444571603022, -0.13136636671099691);
// // Do not modify this code
// const std::vector<cv::Mat> INTRINSICS{ INTRINSIC_1, INTRINSIC_2, INTRINSIC_3 };
// const std::vector<cv::Mat> DISTORTIONS{ DISTORTION_1, DISTORTION_2, DISTORTION_3 };
// const std::vector<cv::Mat> M{ M_1_1, M_1_2, M_1_3 };
// // Not working on Windows
// // const std::vector<cv::Mat> M_EACH_CAMERA{
// // INTRINSIC_1 * M_1_1,
// // INTRINSIC_2 * M_1_2,
// // INTRINSIC_3 * M_1_3
// // };
CameraParameterReader::CameraParameterReader()
{
}
......@@ -100,10 +43,16 @@ namespace op
try
{
// Serial numbers
mSerialNumbers = serialNumbers;
if (serialNumbers.empty())
{
mSerialNumbers = getFilesOnDirectory(cameraParameterPath, "xml");
for (auto& serialNumber : mSerialNumbers)
serialNumber = getFileNameNoExtension(serialNumber);
}
else
mSerialNumbers = serialNumbers;
// Common saving/loading
const std::string fileNameNoExtension{cameraParameterPath};
const auto dataFormat = DataFormat::Xml;
const std::vector<std::string> cvMatNames {
"CameraMatrix", "Intrinsics", "Distortion"
......@@ -115,9 +64,9 @@ namespace op
mCameraIntrinsics.clear();
mCameraDistortions.clear();
// log("Camera matrices:");
for (auto i = 0ull ; i < serialNumbers.size() ; i++)
for (auto i = 0ull ; i < mSerialNumbers.size() ; i++)
{
const auto parameterPath = fileNameNoExtension + mSerialNumbers.at(i);
const auto parameterPath = cameraParameterPath + mSerialNumbers.at(i);
const auto cameraParameters = loadData(cvMatNames, parameterPath, dataFormat);
// Error if empty element
if (cameraParameters.empty() || cameraParameters.at(0).empty()
......@@ -208,6 +157,19 @@ namespace op
}
}
const std::vector<std::string>& CameraParameterReader::getCameraSerialNumbers() const
{
try
{
return mSerialNumbers;
}
catch (const std::exception& e)
{
error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return mSerialNumbers;
}
}
const std::vector<cv::Mat>& CameraParameterReader::getCameraMatrices() const
{
try
......
......@@ -132,18 +132,17 @@ namespace op
if (cameraMatrices.empty())
error("cameraMatrices.empty()",
__LINE__, __FUNCTION__, __FILE__);
// Create and fill A
// Create and fill A for homogeneous equation system Ax = 0
const auto numberCameras = (int)cameraMatrices.size();
cv::Mat A = cv::Mat::zeros(numberCameras*2, 4, CV_64F);
for (auto i = 0 ; i < numberCameras ; i++)
{
cv::Mat temp = pointsOnEachCamera[i].x*cameraMatrices[i].rowRange(2,3)
- cameraMatrices[i].rowRange(0,1);
temp.copyTo(A.rowRange(i*2, i*2+1));
temp = pointsOnEachCamera[i].y*cameraMatrices[i].rowRange(2,3) - cameraMatrices[i].rowRange(1,2);
temp.copyTo(A.rowRange(i*2+1, i*2+2));
A.rowRange(i*2, i*2+1) = pointsOnEachCamera[i].x*cameraMatrices[i].rowRange(2,3)
- cameraMatrices[i].rowRange(0,1);
A.rowRange(i*2+1, i*2+2) = pointsOnEachCamera[i].y*cameraMatrices[i].rowRange(2,3)
- cameraMatrices[i].rowRange(1,2);
}
// SVD on A
// Solve x for Ax = 0 --> SVD on A
cv::SVD svd{A};
svd.solveZ(A,reconstructedPoint);
reconstructedPoint /= reconstructedPoint.at<double>(3);
......@@ -175,7 +174,8 @@ namespace op
ceres::Problem problem;
for (auto i = 0u; i < cameraMatrices.size(); ++i)
{
// Slow equivalent: double camParam[12]; memcpy(camParam, cameraMatrices[i].data, sizeof(double)*12);
// Slow copy equivalent:
// double camParam[12]; memcpy(camParam, cameraMatrices[i].data, sizeof(double)*12);
const double* const camParam = (double*)cameraMatrices[i].data;
// Each Residual block takes a point and a camera as input and outputs a 2
// dimensional residual. Internally, the cost function stores the observed
......@@ -201,11 +201,26 @@ namespace op
// if (summary.initial_cost > summary.final_cost)
// std::cout << summary.FullReport() << "\n";
// const auto reprojectionErrorDecrease = std::sqrt((summary.initial_cost - summary.final_cost)/double(cameraMatrices.size()));
// const auto reprojectionErrorDecrease = std::sqrt((summary.initial_cost - summary.final_cost)
// / double(cameraMatrices.size()));
// return reprojectionErrorDecrease;
#endif
// assert(reconstructedPoint.at<double>(3) == 1.);
// // Check that our implementation gives similar result than OpenCV
// // But we apply bundle adjustment + >2 views, so it should be better (and slower) than OpenCV one
// if (cameraMatrices.size() == 4)
// {
// cv::Mat triangCoords4D;
// cv::triangulatePoints(cameraMatrices.at(0), cameraMatrices.at(3),
// std::vector<cv::Point2d>{pointsOnEachCamera.at(0)},
// std::vector<cv::Point2d>{pointsOnEachCamera.at(3)}, triangCoords4D);
// triangCoords4D /= triangCoords4D.at<double>(3);
// std::cout << reconstructedPoint << "\n"
// << triangCoords4D << "\n"
// << cv::norm(reconstructedPoint-triangCoords4D) << "\n" << std::endl;
// }
return calcReprojectionError(reconstructedPoint, cameraMatrices, pointsOnEachCamera);
}
catch (const std::exception& e)
......
......@@ -357,7 +357,6 @@ namespace op
void plotGridCorners(const cv::Size& gridInnerCorners,
const std::vector<cv::Point2f>& points2DVector,
const bool gridIsMirrored,
const std::string& imagePath,
const cv::Mat& image)
{
......@@ -369,8 +368,7 @@ namespace op
drawGridCorners(imageToPlot, gridInnerCorners, points2DVector);
// Plotting results
const std::string windowName{
getFileNameAndExtension(imagePath) + " - " + (gridIsMirrored ? "" : "no") + " mirrored grid"};
const std::string windowName{(gridIsMirrored ? "" : "no") + std::string{" mirrored grid"}};
cv::pyrDown(imageToPlot, imageToPlot);
cv::imshow(windowName, imageToPlot);
}
......
......@@ -53,7 +53,7 @@ namespace op
auto gXClick = 0.f;
auto gYClick = 0.f;
auto gGViewDistance = -250.f; // -82.3994f; //-45;
auto gMouseXRotateDeg = 0.f; // -63.2f; //0;
auto gMouseXRotateDeg = 180.f; // -63.2f; //0;
auto gMouseYRotateDeg = -5.f; // 7.f; //60;
auto gMouseXPan = -70.f; // 0;
auto gMouseYPan = -30.f; // 0;
......@@ -99,9 +99,9 @@ namespace op
const auto numberPeople = keypoints.getSize(0);
const auto numberBodyParts = keypoints.getSize(1);
const auto numberColors = colors.size();
const auto xOffset = 3000; // 640.f;
const auto yOffset = 360.f;
const auto zOffset = 1000; // 360.f;
const auto xOffset = -3000.f; // 640.f;
const auto yOffset = 1000.f; // 360.f;
const auto zOffset = 1000.f; // 360.f;
const auto xScale = 43.f;
const auto yScale = 24.f;
const auto zScale = 24.f;
......@@ -127,7 +127,7 @@ namespace op
if (keypoints[baseIndex + 3] > 0)
{
cv::Point3f keypoint{
(keypoints[baseIndex] - xOffset) / xScale,
-(keypoints[baseIndex] - xOffset) / xScale,
-(keypoints[baseIndex + 1] - yOffset) / yScale,
(keypoints[baseIndex + 2] - zOffset) / zScale
};
......@@ -159,12 +159,12 @@ namespace op
if (keypoints[baseIndexPairA + 3] > 0 && keypoints[baseIndexPairB + 3] > 0)
{
cv::Point3f pairKeypointA{
(keypoints[baseIndexPairA] - xOffset) / xScale,
-(keypoints[baseIndexPairA] - xOffset) / xScale,
-(keypoints[baseIndexPairA + 1] - yOffset) / yScale,
(keypoints[baseIndexPairA + 2] - zOffset) / zScale
};
cv::Point3f pairKeypointB{
(keypoints[baseIndexPairB] - xOffset) / xScale,
-(keypoints[baseIndexPairB] - xOffset) / xScale,
-(keypoints[baseIndexPairB + 1] - yOffset) / yScale,
(keypoints[baseIndexPairB + 2] - zOffset) / zScale
};
......
......@@ -57,6 +57,7 @@ CAFFE_DIR := 3rdparty/caffe/distribute
# WITH_CERES := 1
# WITH_FLIR_CAMERA := 1
# Eigen directory (Ceres)
# WITH_EIGEN := 1
EIGEN_DIR := /usr/include/eigen3/
# Spinnaker directory
SPINNAKER_DIR := /usr/include/spinnaker
......
......@@ -63,6 +63,7 @@ CAFFE_DIR := 3rdparty/caffe/distribute
# WITH_CERES := 1
# WITH_FLIR_CAMERA := 1
# Eigen directory (Ceres)
# WITH_EIGEN := 1
EIGEN_DIR := /usr/include/eigen3/
# Spinnaker directory
SPINNAKER_DIR := /usr/include/spinnaker
......
......@@ -57,6 +57,7 @@ CAFFE_DIR := 3rdparty/caffe/distribute
# WITH_CERES := 1
# WITH_FLIR_CAMERA := 1
# Eigen directory (Ceres)
# WITH_EIGEN := 1
EIGEN_DIR := /usr/include/eigen3/
# Spinnaker directory
SPINNAKER_DIR := /usr/include/spinnaker
......
......@@ -63,6 +63,7 @@ CAFFE_DIR := 3rdparty/caffe/distribute
# WITH_CERES := 1
# WITH_FLIR_CAMERA := 1
# Eigen directory (Ceres)
# WITH_EIGEN := 1
EIGEN_DIR := /usr/include/eigen3/
# Spinnaker directory
SPINNAKER_DIR := /usr/include/spinnaker
......
......@@ -63,6 +63,7 @@ CAFFE_DIR := 3rdparty/caffe/distribute
# WITH_CERES := 1
# WITH_FLIR_CAMERA := 1
# Eigen directory (Ceres)
# WITH_EIGEN := 1
EIGEN_DIR := /usr/include/eigen3/
# Spinnaker directory
SPINNAKER_DIR := /usr/include/spinnaker
......
......@@ -88,6 +88,11 @@ ifneq ($(WITH_CERES), 0)
LIBRARIES += ceres
INCLUDE_DIRS += $(EIGEN_DIR)
endif
# Eigen support
WITH_EIGEN ?= 0
ifneq ($(WITH_EIGEN), 0)
COMMON_FLAGS += -DWITH_EIGEN
endif
# Spinnaker SDK
WITH_FLIR_CAMERA ?= 0
ifneq ($(WITH_FLIR_CAMERA), 0)
......