Commit 084d1691 written by author: G gineshidalgo99

Added and cleaned some examples

Parent efdf39d6
......@@ -25,12 +25,14 @@ To compile, enable `BUILD_PYTHON` in cmake. Pybind selects the latest version of
## Installation
Check [doc/installation.md#python-module](../installation.md#python-api) for installation steps.
The Python API requires Numpy for array management, and OpenCV for image loading. They can be installed via:
The Python API requires python-dev, Numpy (for array management), and OpenCV (for image loading). They can be installed via:
```
# Python 2
sudo apt-get install python-dev
sudo pip install numpy opencv-python
# Python 3 (recommended)
sudo apt-get install python3-dev
sudo pip3 install numpy opencv-python
```
......
......@@ -314,6 +314,7 @@ OpenPose Library - Release Notes
31. Replaced the old Python wrapper with an updated Pybind11 wrapper that includes all the functionality of the C++ API.
32. Function getFilesOnDirectory() can extract all basic image file types at once without requiring the user to manually enumerate them.
33. Added the flags `--face_detector` and `--hand_detector`, which let the user select the face/hand rectangle detector used for the subsequent face/hand keypoint detection. It includes OpenCV (for face), and also allows the user to provide their own input. The flag `--hand_tracking` has been removed and integrated into this flag too.
34. Maximum queue size per OpenPose thread is configurable through the Wrapper class.
2. Functions or parameters renamed:
1. By default, python example `tutorial_developer/python_2_pose_from_heatmaps.py` was using 2 scales starting at -1x736, changed to 1 scale at -1x368.
2. WrapperStructPose default parameters changed to match those of the OpenPose demo binary.
......
......@@ -10,14 +10,14 @@ Note that this method will be faster than the current system if there is few peo
```
## Custom Standalone Face or Hand Keypoint Detector
Check the examples in `examples/tutorial_api_cpp/`, in particular [examples/tutorial_api_cpp/09_face_from_image.cpp](https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/examples/tutorial_api_cpp/09_face_from_image.cpp) and [examples/tutorial_api_cpp/10_hand_from_image.cpp](https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/examples/tutorial_api_cpp/10_hand_from_image.cpp). The provide examples of face and/or hand keypoint detection given a known bounding box or rectangle for the face and/or hand locations. These examples are equivalent to use the following flags:
Check the examples in `examples/tutorial_api_cpp/`, in particular [examples/tutorial_api_cpp/06_face_from_image.cpp](https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/examples/tutorial_api_cpp/06_face_from_image.cpp) and [examples/tutorial_api_cpp/07_hand_from_image.cpp](https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/examples/tutorial_api_cpp/07_hand_from_image.cpp). They provide examples of face and/or hand keypoint detection given a known bounding box or rectangle for the face and/or hand locations. These examples are equivalent to using the following flags:
```
# Face
examples/tutorial_api_cpp/09_face_from_image.cpp --body_disable --face --face_detector 2
examples/tutorial_api_cpp/06_face_from_image.cpp --body_disable --face --face_detector 2
# Hands
examples/tutorial_api_cpp/10_hand_from_image.cpp --body_disable --hand --hand_detector 2
examples/tutorial_api_cpp/07_hand_from_image.cpp --body_disable --hand --hand_detector 2
```
Note: both the `FaceExtractor` and `HandExtractor` classes require **squared rectangles** as input.
Advance solution: If you wanna use the whole OpenPose framework, you can use the synchronous examples of the `tutorial_api_cpp` folder with the configuration used for [examples/tutorial_api_cpp/09_face_from_image.cpp](https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/examples/tutorial_api_cpp/09_face_from_image.cpp) and [examples/tutorial_api_cpp/10_hand_from_image.cpp](https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/examples/tutorial_api_cpp/10_hand_from_image.cpp).
Advanced solution: If you want to use the whole OpenPose framework, you can use the synchronous examples of the `tutorial_api_cpp` folder with the configuration used for [examples/tutorial_api_cpp/06_face_from_image.cpp](https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/examples/tutorial_api_cpp/06_face_from_image.cpp) and [examples/tutorial_api_cpp/07_hand_from_image.cpp](https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/examples/tutorial_api_cpp/07_hand_from_image.cpp).
......@@ -46,30 +46,30 @@ void printKeypoints(const std::shared_ptr<std::vector<std::shared_ptr<op::Datum>
if (datumsPtr != nullptr && !datumsPtr->empty())
{
// Alternative 1
op::log("Body keypoints: " + datumsPtr->at(0)->poseKeypoints.toString());
op::log("Body keypoints: " + datumsPtr->at(0)->poseKeypoints.toString(), op::Priority::High);
// // Alternative 2
// op::log(datumsPtr->at(0).poseKeypoints);
// op::log(datumsPtr->at(0).poseKeypoints, op::Priority::High);
// // Alternative 3
// std::cout << datumsPtr->at(0).poseKeypoints << std::endl;
        // // Alternative 4 - Accessing each element of the keypoints
// op::log("\nKeypoints:");
// op::log("\nKeypoints:", op::Priority::High);
// const auto& poseKeypoints = datumsPtr->at(0).poseKeypoints;
// op::log("Person pose keypoints:");
// op::log("Person pose keypoints:", op::Priority::High);
// for (auto person = 0 ; person < poseKeypoints.getSize(0) ; person++)
// {
// op::log("Person " + std::to_string(person) + " (x, y, score):");
// op::log("Person " + std::to_string(person) + " (x, y, score):", op::Priority::High);
// for (auto bodyPart = 0 ; bodyPart < poseKeypoints.getSize(1) ; bodyPart++)
// {
// std::string valueToPrint;
// for (auto xyscore = 0 ; xyscore < poseKeypoints.getSize(2) ; xyscore++)
// valueToPrint += std::to_string( poseKeypoints[{person, bodyPart, xyscore}] ) + " ";
// op::log(valueToPrint);
// op::log(valueToPrint, op::Priority::High);
// }
// }
// op::log(" ");
// op::log(" ", op::Priority::High);
}
else
op::log("Nullptr or empty datumsPtr found.", op::Priority::High);
......
......@@ -45,10 +45,10 @@ void printKeypoints(const std::shared_ptr<std::vector<std::shared_ptr<op::Datum>
// Example: How to use the pose keypoints
if (datumsPtr != nullptr && !datumsPtr->empty())
{
op::log("Body keypoints: " + datumsPtr->at(0)->poseKeypoints.toString());
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString());
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString());
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString());
op::log("Body keypoints: " + datumsPtr->at(0)->poseKeypoints.toString(), op::Priority::High);
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString(), op::Priority::High);
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString(), op::Priority::High);
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString(), op::Priority::High);
}
else
op::log("Nullptr or empty datumsPtr found.", op::Priority::High);
......
// ----------------------- OpenPose C++ API Tutorial - Example 3 - Body from image configurable -----------------------
// ----------------------- OpenPose C++ API Tutorial - Example 3 - Body from image -----------------------
// It reads an image, processes it, and displays it with the pose (and optionally hand and face) keypoints. In addition,
// it includes all the OpenPose configuration flags (enable/disable hand, face, output saving, etc.).
......@@ -47,10 +47,10 @@ void printKeypoints(const std::shared_ptr<std::vector<std::shared_ptr<op::Datum>
// Example: How to use the pose keypoints
if (datumsPtr != nullptr && !datumsPtr->empty())
{
op::log("Body keypoints: " + datumsPtr->at(0)->poseKeypoints.toString());
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString());
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString());
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString());
op::log("Body keypoints: " + datumsPtr->at(0)->poseKeypoints.toString(), op::Priority::High);
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString(), op::Priority::High);
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString(), op::Priority::High);
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString(), op::Priority::High);
}
else
op::log("Nullptr or empty datumsPtr found.", op::Priority::High);
......
// ----------------------- OpenPose C++ API Tutorial - Example 4 - Body from images configurable ----------------------
// ----------------------- OpenPose C++ API Tutorial - Example 4 - Body from images ----------------------
// It reads images, processes them, and displays them with the pose (and optionally hand and face) keypoints. In addition,
// it includes all the OpenPose configuration flags (enable/disable hand, face, output saving, etc.).
......@@ -25,15 +25,14 @@ bool display(const std::shared_ptr<std::vector<std::shared_ptr<op::Datum>>>& dat
// User's displaying/saving/other processing here
// datum.cvOutputData: rendered frame with pose or heatmaps
// datum.poseKeypoints: Array<float> with the estimated pose
char key = ' ';
if (datumsPtr != nullptr && !datumsPtr->empty())
{
// Display image and sleeps at least 1 ms (it usually sleeps ~5-10 msec to display the image)
cv::imshow(OPEN_POSE_NAME_AND_VERSION + " - Tutorial C++ API", datumsPtr->at(0)->cvOutputData);
key = (char)cv::waitKey(1);
}
else
op::log("Nullptr or empty datumsPtr found.", op::Priority::High);
const auto key = (char)cv::waitKey(1);
return (key == 27);
}
catch (const std::exception& e)
......@@ -50,10 +49,10 @@ void printKeypoints(const std::shared_ptr<std::vector<std::shared_ptr<op::Datum>
// Example: How to use the pose keypoints
if (datumsPtr != nullptr && !datumsPtr->empty())
{
op::log("Body keypoints: " + datumsPtr->at(0)->poseKeypoints.toString());
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString());
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString());
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString());
op::log("Body keypoints: " + datumsPtr->at(0)->poseKeypoints.toString(), op::Priority::High);
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString(), op::Priority::High);
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString(), op::Priority::High);
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString(), op::Priority::High);
}
else
op::log("Nullptr or empty datumsPtr found.", op::Priority::High);
......
// --------------- OpenPose C++ API Tutorial - Example 5 - Body from images configurable and multi GPU ---------------
// --------------- OpenPose C++ API Tutorial - Example 5 - Body from images and multi GPU ---------------
// It reads images, processes them, and displays them with the pose (and optionally hand and face) keypoints. In addition,
// it includes all the OpenPose configuration flags (enable/disable hand, face, output saving, etc.).
......@@ -31,15 +31,14 @@ bool display(const std::shared_ptr<std::vector<std::shared_ptr<op::Datum>>>& dat
// User's displaying/saving/other processing here
// datum.cvOutputData: rendered frame with pose or heatmaps
// datum.poseKeypoints: Array<float> with the estimated pose
char key = ' ';
if (datumsPtr != nullptr && !datumsPtr->empty())
{
// Display image and sleeps at least 1 ms (it usually sleeps ~5-10 msec to display the image)
cv::imshow(OPEN_POSE_NAME_AND_VERSION + " - Tutorial C++ API", datumsPtr->at(0)->cvOutputData);
key = (char)cv::waitKey(1);
}
else
op::log("Nullptr or empty datumsPtr found.", op::Priority::High);
const auto key = (char)cv::waitKey(1);
return (key == 27);
}
catch (const std::exception& e)
......@@ -56,10 +55,10 @@ void printKeypoints(const std::shared_ptr<std::vector<std::shared_ptr<op::Datum>
// Example: How to use the pose keypoints
if (datumsPtr != nullptr && !datumsPtr->empty())
{
op::log("Body keypoints: " + datumsPtr->at(0)->poseKeypoints.toString());
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString());
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString());
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString());
op::log("Body keypoints: " + datumsPtr->at(0)->poseKeypoints.toString(), op::Priority::High);
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString(), op::Priority::High);
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString(), op::Priority::High);
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString(), op::Priority::High);
}
else
op::log("Nullptr or empty datumsPtr found.", op::Priority::High);
......@@ -177,9 +176,6 @@ int tutorialApiCpp()
// Read frames on directory
const auto imagePaths = op::getFilesOnDirectory(FLAGS_image_dir, op::Extensions::Images);
// Read number of GPUs in your system
const auto numberGPUs = op::getGpuNumber();
// Process and display images
// Option a) Harder to implement but the fastest method
// Create 2 different threads:
......@@ -188,6 +184,9 @@ int tutorialApiCpp()
// Option b) Much easier and faster to implement but slightly slower runtime performance
if (!FLAGS_latency_is_irrelevant_and_computer_with_lots_of_ram)
{
// Read number of GPUs in your system
const auto numberGPUs = op::getGpuNumber();
for (auto imageBaseId = 0u ; imageBaseId < imagePaths.size() ; imageBaseId+=numberGPUs)
{
// Read and push images into OpenPose wrapper
......
......@@ -50,10 +50,10 @@ void printKeypoints(const std::shared_ptr<std::vector<std::shared_ptr<op::Datum>
// Example: How to use the pose keypoints
if (datumsPtr != nullptr && !datumsPtr->empty())
{
op::log("Body keypoints: " + datumsPtr->at(0)->poseKeypoints.toString());
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString());
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString());
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString());
op::log("Body keypoints: " + datumsPtr->at(0)->poseKeypoints.toString(), op::Priority::High);
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString(), op::Priority::High);
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString(), op::Priority::High);
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString(), op::Priority::High);
}
else
op::log("Nullptr or empty datumsPtr found.", op::Priority::High);
......@@ -100,15 +100,14 @@ void configureWrapper(op::Wrapper& opWrapper)
// >1 camera view?
const auto multipleView = (FLAGS_3d || FLAGS_3d_views > 1);
// Face and hand detectors
const auto faceDetector = op::Detector::Provided;
const auto faceDetector = op::flagsToDetector(FLAGS_face_detector);
const auto handDetector = op::flagsToDetector(FLAGS_hand_detector);
// Enabling Google Logging
const bool enableGoogleLogging = true;
// Pose configuration (use WrapperStructPose{} for default and recommended configuration)
const auto bodyEnable = false;
const op::WrapperStructPose wrapperStructPose{
bodyEnable, netInputSize, outputSize, keypointScaleMode, FLAGS_num_gpu, FLAGS_num_gpu_start,
!FLAGS_body_disable, netInputSize, outputSize, keypointScaleMode, FLAGS_num_gpu, FLAGS_num_gpu_start,
FLAGS_scale_number, (float)FLAGS_scale_gap, op::flagsToRenderMode(FLAGS_render_pose, multipleView),
poseModel, !FLAGS_disable_blending, (float)FLAGS_alpha_pose, (float)FLAGS_alpha_heatmap,
FLAGS_part_to_show, FLAGS_model_folder, heatMapTypes, heatMapScaleMode, FLAGS_part_candidates,
......@@ -116,9 +115,8 @@ void configureWrapper(op::Wrapper& opWrapper)
FLAGS_prototxt_path, FLAGS_caffemodel_path, enableGoogleLogging};
opWrapper.configure(wrapperStructPose);
// Face configuration (use op::WrapperStructFace{} to disable it)
const auto face = true;
const op::WrapperStructFace wrapperStructFace{
face, faceDetector, faceNetInputSize,
FLAGS_face, faceDetector, faceNetInputSize,
op::flagsToRenderMode(FLAGS_face_render, multipleView, FLAGS_render_pose),
(float)FLAGS_face_alpha_pose, (float)FLAGS_face_alpha_heatmap, (float)FLAGS_face_render_threshold};
opWrapper.configure(wrapperStructFace);
......@@ -158,6 +156,11 @@ int tutorialApiCpp()
op::log("Starting OpenPose demo...", op::Priority::High);
const auto opTimer = op::getTimerInit();
        // Required flags to enable face keypoint detection from user-provided rectangles
FLAGS_body_disable = true;
FLAGS_face = true;
FLAGS_face_detector = 2;
// Configuring OpenPose
op::log("Configuring OpenPose...", op::Priority::High);
op::Wrapper opWrapper{op::ThreadManagerMode::Asynchronous};
......@@ -196,7 +199,7 @@ int tutorialApiCpp()
op::log("Image could not be processed.", op::Priority::High);
// Info
op::log("NOTE: In addition with the user flags, this demo has auto-selected the following flags:"
        op::log("NOTE: In addition to the user flags, this demo has auto-selected the following flags:\n"
" `--body_disable --face --face_detector 2`", op::Priority::High);
// Measuring total time
......
......@@ -50,10 +50,10 @@ void printKeypoints(const std::shared_ptr<std::vector<std::shared_ptr<op::Datum>
// Example: How to use the pose keypoints
if (datumsPtr != nullptr && !datumsPtr->empty())
{
op::log("Body keypoints: " + datumsPtr->at(0)->poseKeypoints.toString());
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString());
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString());
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString());
op::log("Body keypoints: " + datumsPtr->at(0)->poseKeypoints.toString(), op::Priority::High);
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString(), op::Priority::High);
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString(), op::Priority::High);
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString(), op::Priority::High);
}
else
op::log("Nullptr or empty datumsPtr found.", op::Priority::High);
......@@ -101,14 +101,13 @@ void configureWrapper(op::Wrapper& opWrapper)
const auto multipleView = (FLAGS_3d || FLAGS_3d_views > 1);
// Face and hand detectors
const auto faceDetector = op::flagsToDetector(FLAGS_face_detector);
const auto handDetector = op::Detector::Provided;
const auto handDetector = op::flagsToDetector(FLAGS_hand_detector);
// Enabling Google Logging
const bool enableGoogleLogging = true;
// Pose configuration (use WrapperStructPose{} for default and recommended configuration)
const auto bodyEnable = false;
const op::WrapperStructPose wrapperStructPose{
bodyEnable, netInputSize, outputSize, keypointScaleMode, FLAGS_num_gpu, FLAGS_num_gpu_start,
!FLAGS_body_disable, netInputSize, outputSize, keypointScaleMode, FLAGS_num_gpu, FLAGS_num_gpu_start,
FLAGS_scale_number, (float)FLAGS_scale_gap, op::flagsToRenderMode(FLAGS_render_pose, multipleView),
poseModel, !FLAGS_disable_blending, (float)FLAGS_alpha_pose, (float)FLAGS_alpha_heatmap,
FLAGS_part_to_show, FLAGS_model_folder, heatMapTypes, heatMapScaleMode, FLAGS_part_candidates,
......@@ -122,9 +121,8 @@ void configureWrapper(op::Wrapper& opWrapper)
(float)FLAGS_face_alpha_pose, (float)FLAGS_face_alpha_heatmap, (float)FLAGS_face_render_threshold};
opWrapper.configure(wrapperStructFace);
// Hand configuration (use op::WrapperStructHand{} to disable it)
const auto hand = true;
const op::WrapperStructHand wrapperStructHand{
hand, handDetector, handNetInputSize, FLAGS_hand_scale_number, (float)FLAGS_hand_scale_range,
FLAGS_hand, handDetector, handNetInputSize, FLAGS_hand_scale_number, (float)FLAGS_hand_scale_range,
op::flagsToRenderMode(FLAGS_hand_render, multipleView, FLAGS_render_pose), (float)FLAGS_hand_alpha_pose,
(float)FLAGS_hand_alpha_heatmap, (float)FLAGS_hand_render_threshold};
opWrapper.configure(wrapperStructHand);
......@@ -158,6 +156,11 @@ int tutorialApiCpp()
op::log("Starting OpenPose demo...", op::Priority::High);
const auto opTimer = op::getTimerInit();
        // Required flags to enable hand keypoint detection from user-provided rectangles
FLAGS_body_disable = true;
FLAGS_hand = true;
FLAGS_hand_detector = 2;
// Configuring OpenPose
op::log("Configuring OpenPose...", op::Priority::High);
op::Wrapper opWrapper{op::ThreadManagerMode::Asynchronous};
......@@ -205,7 +208,7 @@ int tutorialApiCpp()
op::log("Image could not be processed.", op::Priority::High);
// Info
op::log("NOTE: In addition with the user flags, this demo has auto-selected the following flags:"
        op::log("NOTE: In addition to the user flags, this demo has auto-selected the following flags:\n"
" `--body_disable --hand --hand_detector 2`", op::Priority::High);
// Measuring total time
......
// ----------------------- OpenPose C++ API Tutorial - Example 8 - Heatmaps from image -----------------------
// It reads an image, processes it, and displays it with the body heatmaps. In addition, it includes all the
// OpenPose configuration flags (enable/disable hand, face, output saving, etc.).
// Command-line user interface
#define OPENPOSE_FLAGS_DISABLE_PRODUCER
#define OPENPOSE_FLAGS_DISABLE_DISPLAY
#include <openpose/flags.hpp>
// OpenPose dependencies
#include <openpose/headers.hpp>
// Custom OpenPose flags
// Producer
DEFINE_string(image_path, "examples/media/COCO_val2014_000000000294.jpg",
"Process an image. Read all standard formats (jpg, png, bmp, etc.).");
// Display
DEFINE_bool(no_display, false,
"Enable to disable the visual display.");
// This function displays the desired heatmap channel blended over the network input image
bool display(
const std::shared_ptr<std::vector<std::shared_ptr<op::Datum>>>& datumsPtr, const int desiredChannel = 0)
{
try
{
if (datumsPtr != nullptr && !datumsPtr->empty())
{
// Note: Heatmaps are in net_resolution size, which does not necessarily match the final image size
// Read heatmaps
auto& poseHeatMaps = datumsPtr->at(0)->poseHeatMaps;
// Read desired channel
const auto numberChannels = poseHeatMaps.getSize(0);
const auto height = poseHeatMaps.getSize(1);
const auto width = poseHeatMaps.getSize(2);
const cv::Mat desiredChannelHeatMap(
height, width, CV_32F, &poseHeatMaps.getPtr()[desiredChannel % numberChannels*height*width]);
            // Read the image used by the OpenPose body network (same resolution as the heatmaps)
auto& inputNetData = datumsPtr->at(0)->inputNetData[0];
const cv::Mat inputNetDataB(
height, width, CV_32F, &inputNetData.getPtr()[0]);
const cv::Mat inputNetDataG(
height, width, CV_32F, &inputNetData.getPtr()[height*width]);
const cv::Mat inputNetDataR(
height, width, CV_32F, &inputNetData.getPtr()[2*height*width]);
cv::Mat netInputImage;
cv::merge({inputNetDataB, inputNetDataG, inputNetDataR}, netInputImage);
netInputImage = (netInputImage+0.5)*255;
// Turn into uint8 cv::Mat
cv::Mat netInputImageUint8;
netInputImage.convertTo(netInputImageUint8, CV_8UC1);
cv::Mat desiredChannelHeatMapUint8;
desiredChannelHeatMap.convertTo(desiredChannelHeatMapUint8, CV_8UC1);
// Combining both images
cv::Mat imageToRender;
cv::applyColorMap(desiredChannelHeatMapUint8, desiredChannelHeatMapUint8, cv::COLORMAP_JET);
cv::addWeighted(netInputImageUint8, 0.5, desiredChannelHeatMapUint8, 0.5, 0., imageToRender);
// Display image
cv::imshow(OPEN_POSE_NAME_AND_VERSION + " - Tutorial C++ API", imageToRender);
}
else
op::log("Nullptr or empty datumsPtr found.", op::Priority::High);
const auto key = (char)cv::waitKey(1);
return (key == 27);
}
catch (const std::exception& e)
{
op::error(e.what(), __LINE__, __FUNCTION__, __FILE__);
return true;
}
}
void printKeypoints(const std::shared_ptr<std::vector<std::shared_ptr<op::Datum>>>& datumsPtr)
{
try
{
// Example: How to use the pose keypoints
if (datumsPtr != nullptr && !datumsPtr->empty())
{
const auto& poseHeatMaps = datumsPtr->at(0)->poseHeatMaps;
const auto numberChannels = poseHeatMaps.getSize(0);
const auto height = poseHeatMaps.getSize(1);
const auto width = poseHeatMaps.getSize(2);
            op::log("Body heatmaps have " + std::to_string(numberChannels) + " channels, and each channel has a"
" dimension of " + std::to_string(width) + " x " + std::to_string(height) + " pixels.",
op::Priority::High);
}
else
op::log("Nullptr or empty datumsPtr found.", op::Priority::High);
}
catch (const std::exception& e)
{
op::error(e.what(), __LINE__, __FUNCTION__, __FILE__);
}
}
void configureWrapper(op::Wrapper& opWrapper)
{
try
{
// Configuring OpenPose
// logging_level
op::check(0 <= FLAGS_logging_level && FLAGS_logging_level <= 255, "Wrong logging_level value.",
__LINE__, __FUNCTION__, __FILE__);
op::ConfigureLog::setPriorityThreshold((op::Priority)FLAGS_logging_level);
op::Profiler::setDefaultX(FLAGS_profile_speed);
// Applying user defined configuration - GFlags to program variables
// outputSize
const auto outputSize = op::flagsToPoint(FLAGS_output_resolution, "-1x-1");
// netInputSize
const auto netInputSize = op::flagsToPoint(FLAGS_net_resolution, "-1x368");
// faceNetInputSize
const auto faceNetInputSize = op::flagsToPoint(FLAGS_face_net_resolution, "368x368 (multiples of 16)");
// handNetInputSize
const auto handNetInputSize = op::flagsToPoint(FLAGS_hand_net_resolution, "368x368 (multiples of 16)");
// poseModel
const auto poseModel = op::flagsToPoseModel(FLAGS_model_pose);
// JSON saving
if (!FLAGS_write_keypoint.empty())
op::log("Flag `write_keypoint` is deprecated and will eventually be removed."
" Please, use `write_json` instead.", op::Priority::Max);
// keypointScaleMode
const auto keypointScaleMode = op::flagsToScaleMode(FLAGS_keypoint_scale);
// heatmaps to add
const auto heatMapTypes = op::flagsToHeatMaps(FLAGS_heatmaps_add_parts, FLAGS_heatmaps_add_bkg,
FLAGS_heatmaps_add_PAFs);
const auto heatMapScaleMode = op::flagsToHeatMapScaleMode(FLAGS_heatmaps_scale);
// >1 camera view?
const auto multipleView = (FLAGS_3d || FLAGS_3d_views > 1);
// Face and hand detectors
const auto faceDetector = op::flagsToDetector(FLAGS_face_detector);
const auto handDetector = op::flagsToDetector(FLAGS_hand_detector);
// Enabling Google Logging
const bool enableGoogleLogging = true;
// Pose configuration (use WrapperStructPose{} for default and recommended configuration)
const op::WrapperStructPose wrapperStructPose{
!FLAGS_body_disable, netInputSize, outputSize, keypointScaleMode, FLAGS_num_gpu, FLAGS_num_gpu_start,
FLAGS_scale_number, (float)FLAGS_scale_gap, op::flagsToRenderMode(FLAGS_render_pose, multipleView),
poseModel, !FLAGS_disable_blending, (float)FLAGS_alpha_pose, (float)FLAGS_alpha_heatmap,
FLAGS_part_to_show, FLAGS_model_folder, heatMapTypes, heatMapScaleMode, FLAGS_part_candidates,
(float)FLAGS_render_threshold, FLAGS_number_people_max, FLAGS_maximize_positives, FLAGS_fps_max,
FLAGS_prototxt_path, FLAGS_caffemodel_path, enableGoogleLogging};
opWrapper.configure(wrapperStructPose);
// Face configuration (use op::WrapperStructFace{} to disable it)
const op::WrapperStructFace wrapperStructFace{
FLAGS_face, faceDetector, faceNetInputSize,
op::flagsToRenderMode(FLAGS_face_render, multipleView, FLAGS_render_pose),
(float)FLAGS_face_alpha_pose, (float)FLAGS_face_alpha_heatmap, (float)FLAGS_face_render_threshold};
opWrapper.configure(wrapperStructFace);
// Hand configuration (use op::WrapperStructHand{} to disable it)
const op::WrapperStructHand wrapperStructHand{
FLAGS_hand, handDetector, handNetInputSize, FLAGS_hand_scale_number, (float)FLAGS_hand_scale_range,
op::flagsToRenderMode(FLAGS_hand_render, multipleView, FLAGS_render_pose), (float)FLAGS_hand_alpha_pose,
(float)FLAGS_hand_alpha_heatmap, (float)FLAGS_hand_render_threshold};
opWrapper.configure(wrapperStructHand);
// Extra functionality configuration (use op::WrapperStructExtra{} to disable it)
const op::WrapperStructExtra wrapperStructExtra{
FLAGS_3d, FLAGS_3d_min_views, FLAGS_identification, FLAGS_tracking, FLAGS_ik_threads};
opWrapper.configure(wrapperStructExtra);
// Output (comment or use default argument to disable any output)
const op::WrapperStructOutput wrapperStructOutput{
FLAGS_cli_verbose, FLAGS_write_keypoint, op::stringToDataFormat(FLAGS_write_keypoint_format),
FLAGS_write_json, FLAGS_write_coco_json, FLAGS_write_coco_foot_json, FLAGS_write_coco_json_variant,
FLAGS_write_images, FLAGS_write_images_format, FLAGS_write_video, FLAGS_write_video_fps,
FLAGS_write_video_with_audio, FLAGS_write_heatmaps, FLAGS_write_heatmaps_format, FLAGS_write_video_3d,
FLAGS_write_video_adam, FLAGS_write_bvh, FLAGS_udp_host, FLAGS_udp_port};
opWrapper.configure(wrapperStructOutput);
// No GUI. Equivalent to: opWrapper.configure(op::WrapperStructGui{});
// Set to single-thread (for sequential processing and/or debugging and/or reducing latency)
if (FLAGS_disable_multi_thread)
opWrapper.disableMultiThreading();
}
catch (const std::exception& e)
{
op::error(e.what(), __LINE__, __FUNCTION__, __FILE__);
}
}
int tutorialApiCpp()
{
try
{
op::log("Starting OpenPose demo...", op::Priority::High);
const auto opTimer = op::getTimerInit();
// Required flags to enable heatmaps
FLAGS_heatmaps_add_parts = true;
FLAGS_heatmaps_add_bkg = true;
FLAGS_heatmaps_add_PAFs = true;
FLAGS_heatmaps_scale = 2;
// Configuring OpenPose
op::log("Configuring OpenPose...", op::Priority::High);
op::Wrapper opWrapper{op::ThreadManagerMode::Asynchronous};
configureWrapper(opWrapper);
// Starting OpenPose
op::log("Starting thread(s)...", op::Priority::High);
opWrapper.start();
// Process and display image
const auto imageToProcess = cv::imread(FLAGS_image_path);
auto datumProcessed = opWrapper.emplaceAndPop(imageToProcess);
if (datumProcessed != nullptr)
{
printKeypoints(datumProcessed);
if (!FLAGS_no_display)
{
const auto numberChannels = datumProcessed->at(0)->poseHeatMaps.getSize(0);
for (auto desiredChannel = 0 ; desiredChannel < numberChannels ; desiredChannel++)
if (display(datumProcessed, desiredChannel))
break;
}
}
else
op::log("Image could not be processed.", op::Priority::High);
// Info
        op::log("NOTE: In addition to the user flags, this demo has auto-selected the following flags:\n"
" `--heatmaps_add_parts --heatmaps_add_bkg --heatmaps_add_PAFs`",
op::Priority::High);
// Measuring total time
op::printTime(opTimer, "OpenPose demo successfully finished. Total time: ", " seconds.", op::Priority::High);
// Return
return 0;
}
catch (const std::exception& e)
{
return -1;
}
}
int main(int argc, char *argv[])
{
// Parsing command line flags
gflags::ParseCommandLineFlags(&argc, &argv, true);
// Running tutorialApiCpp
return tutorialApiCpp();
}
......@@ -22,15 +22,14 @@ public:
// User's displaying/saving/other processing here
// datumPtr->cvOutputData: rendered frame with pose or heatmaps
// datumPtr->poseKeypoints: Array<float> with the estimated pose
char key = ' ';
if (datumsPtr != nullptr && !datumsPtr->empty())
{
cv::imshow(OPEN_POSE_NAME_AND_VERSION + " - Tutorial C++ API", datumsPtr->at(0)->cvOutputData);
// Display image and sleeps at least 1 ms (it usually sleeps ~5-10 msec to display the image)
key = (char)cv::waitKey(1);
}
else
op::log("Nullptr or empty datumsPtr found.", op::Priority::High);
const auto key = (char)cv::waitKey(1);
return (key == 27);
}
void printKeypoints(const std::shared_ptr<std::vector<std::shared_ptr<op::Datum>>>& datumsPtr)
......@@ -57,9 +56,9 @@ public:
}
op::log(" ");
// Alternative: just getting std::string equivalent
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString());
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString());
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString());
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString(), op::Priority::High);
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString(), op::Priority::High);
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString(), op::Priority::High);
// Heatmaps
const auto& poseHeatMaps = datumsPtr->at(0)->poseHeatMaps;
if (!poseHeatMaps.empty())
......
......@@ -103,15 +103,14 @@ public:
// User's displaying/saving/other processing here
// datumPtr->cvOutputData: rendered frame with pose or heatmaps
// datumPtr->poseKeypoints: Array<float> with the estimated pose
char key = ' ';
if (datumsPtr != nullptr && !datumsPtr->empty())
{
// Display image and sleeps at least 1 ms (it usually sleeps ~5-10 msec to display the image)
cv::imshow(OPEN_POSE_NAME_AND_VERSION + " - Tutorial C++ API", datumsPtr->at(0)->cvOutputData);
key = (char)cv::waitKey(1);
}
else
op::log("Nullptr or empty datumsPtr found.", op::Priority::High);
const auto key = (char)cv::waitKey(1);
return (key == 27);
}
catch (const std::exception& e)
......@@ -142,9 +141,9 @@ public:
}
op::log(" ");
// Alternative: just getting std::string equivalent
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString());
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString());
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString());
op::log("Face keypoints: " + datumsPtr->at(0)->faceKeypoints.toString(), op::Priority::High);
op::log("Left hand keypoints: " + datumsPtr->at(0)->handKeypoints[0].toString(), op::Priority::High);
op::log("Right hand keypoints: " + datumsPtr->at(0)->handKeypoints[1].toString(), op::Priority::High);
// Heatmaps
const auto& poseHeatMaps = datumsPtr->at(0)->poseHeatMaps;
if (!poseHeatMaps.empty())
......
......@@ -6,7 +6,7 @@ set(EXAMPLE_FILES
05_keypoints_from_images_multi_gpu.cpp
06_face_from_image.cpp
07_hand_from_image.cpp
# 08_heatmaps_from_image.cpp
08_heatmaps_from_image.cpp
09_asynchronous_custom_input.cpp
10_asynchronous_custom_output.cpp
11_asynchronous_custom_input_output_and_datum.cpp
......
......@@ -33,8 +33,9 @@ if [[ $RUN_EXAMPLES == true ]] ; then
echo " "
echo "Tutorial API C++: Example 5..."
./build/examples/tutorial_api_cpp/05_keypoints_from_images_multi_gpu.bin --net_resolution -1x32 --write_json output/ --write_images output/ --no_display
./build/examples/tutorial_api_cpp/05_keypoints_from_images_multi_gpu.bin --net_resolution -1x32 --write_json output/ --write_images output/ --no_display --latency_is_irrelevant_and_computer_with_lots_of_ram
    # Default configuration of this demo requires getGpuNumber(), which is not implemented in CPU-only mode
# ./build/examples/tutorial_api_cpp/05_keypoints_from_images_multi_gpu.bin --net_resolution -1x32 --write_json output/ --write_images output/ --no_display
echo " "
echo "Tutorial API C++: Example 6..."
......@@ -45,9 +46,9 @@ if [[ $RUN_EXAMPLES == true ]] ; then
./build/examples/tutorial_api_cpp/07_hand_from_image.bin --hand_net_resolution 32x32 --write_json output/ --write_images output/ --no_display
echo " "
# echo "Tutorial API C++: Example 8..."
# ./build/examples/tutorial_api_cpp/08_heatmaps_from_image.bin --hand_net_resolution 32x32 --write_json output/ --write_images output/ --no_display
# echo " "
echo "Tutorial API C++: Example 8..."
./build/examples/tutorial_api_cpp/08_heatmaps_from_image.bin --hand_net_resolution 32x32 --write_json output/ --write_images output/ --no_display
echo " "
echo "Tutorial API C++: Example 9..."
./build/examples/tutorial_api_cpp/09_asynchronous_custom_input.bin --image_dir examples/media/ --net_resolution -1x32 --write_json output/ --write_images output/ --display 0
......