Commit 5f768fe5 authored by: G gineshidalgo99

Doc for standalone face/hand + Travis scripts

Parent 5b841135
OpenPose Library - Standalone Face Or Hand Keypoint Detector
====================================
In the case of camera views where the hands are visible but not the rest of the body, or if you do not need the body keypoint detector and want to speed up the process, you can use the OpenPose face or hand keypoint detectors with your own face or hand detectors, rather than using the body keypoint detector as the initial detector for those.
## OpenCV-based Face Keypoint Detector
Note that this method will be faster than the current system when there are few people in the image, but it is also much less accurate (the OpenCV face detector only works with large, frontal faces, whereas OpenPose handles more scales and face rotations).
```
./build/examples/openpose/openpose.bin --body_disable --face --face_detector 1
```
## Custom Standalone Face Keypoint Detector
There are 2 ways to add the OpenPose face keypoint detector to your own code without using the body pose keypoint extractor as the initial face detector:
1. Easiest solution: Forget about the `OpenPose demo` and `wrapper/wrapper.hpp`, and instead use the `include/openpose/face/faceExtractorNet.hpp` class with the output of your face detector. Recommended if you do not want to use any other OpenPose functionality.
2. Elegant solution: If you want to use the whole OpenPose framework, simply copy `include/wrapper/wrapper.hpp` as, e.g., `examples/userCode/wrapperFace.hpp`, and replace our `FaceDetector` or `FaceDetectorOpenCV` class with your custom face detector class inside your `WrapperFace` class. If you do not need pose keypoint detection, you can simply use the `body_disable` flag for a big speed-up.
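The "easiest solution" above boils down to: run your own detector, convert its boxes into the square rectangles OpenPose expects, and feed them to the extractor. A minimal, self-contained sketch of the conversion step follows; `Box` and `FaceRectangle` are illustrative stand-ins for your detector's output and for `op::Rectangle<float>`, not the real OpenPose API:

```cpp
#include <algorithm>
#include <vector>

// Illustrative stand-ins: your detector's box and a square face rectangle.
struct Box { float x, y, width, height; };           // arbitrary aspect ratio
struct FaceRectangle { float x, y, width, height; }; // must satisfy width == height

// Expand a detector box to a square around its center, since the face
// extractor expects square rectangles (see note further below).
FaceRectangle toSquareRectangle(const Box& box)
{
    const float side = std::max(box.width, box.height);
    const float centerX = box.x + box.width / 2.f;
    const float centerY = box.y + box.height / 2.f;
    return FaceRectangle{centerX - side / 2.f, centerY - side / 2.f, side, side};
}

// Convert a whole frame's worth of detections (one rectangle per face).
std::vector<FaceRectangle> toSquareRectangles(const std::vector<Box>& boxes)
{
    std::vector<FaceRectangle> rectangles;
    rectangles.reserve(boxes.size());
    for (const auto& box : boxes)
        rectangles.push_back(toSquareRectangle(box));
    return rectangles;
}
```

With the real OpenPose types, you would then pass the resulting `std::vector<op::Rectangle<float>>` to the face extractor's forward pass; see the `tutorial_api_cpp` examples referenced below for the exact calls.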
## Custom Standalone Face or Hand Keypoint Detector
Check the examples in `examples/tutorial_api_cpp/`, in particular [examples/tutorial_api_cpp/09_face_from_image.cpp](https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/examples/tutorial_api_cpp/09_face_from_image.cpp) and [examples/tutorial_api_cpp/10_hand_from_image.cpp](https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/examples/tutorial_api_cpp/10_hand_from_image.cpp). They provide examples of face and/or hand keypoint detection given a known bounding box or rectangle for the face and/or hand locations. These examples are equivalent to using the following flags:
```
# Face
examples/tutorial_api_cpp/09_face_from_image.cpp --body_disable --face --face_detector 2
# Hands
examples/tutorial_api_cpp/10_hand_from_image.cpp --body_disable --hand --hand_detector 2
```
Note: both the `FaceExtractor` and `HandExtractor` classes require **square rectangles** (width = height) as input. In addition, the function **`initializationOnThread()` must be called only once, and from the same thread in which `forwardPass` will be run**.
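The threading contract above can be sketched with a stub extractor; `StubExtractor` is a hypothetical stand-in for `FaceExtractor`/`HandExtractor` that only tracks the calling pattern: one worker thread calls `initializationOnThread()` exactly once, then runs every `forwardPass` on that same thread.

```cpp
#include <cassert>
#include <thread>

// Hypothetical stand-in: records which thread initialized it and asserts
// that all forward passes happen on that same thread.
class StubExtractor
{
public:
    void initializationOnThread()
    {
        mInitThreadId = std::this_thread::get_id();
        mInitialized = true;
    }
    void forwardPass()
    {
        // Contract: same thread as the initializationOnThread() call.
        assert(mInitialized && std::this_thread::get_id() == mInitThreadId);
        ++mFramesProcessed;
    }
    int framesProcessed() const { return mFramesProcessed; }
private:
    bool mInitialized = false;
    std::thread::id mInitThreadId;
    int mFramesProcessed = 0;
};

// Correct pattern: a single worker thread owns both init and inference.
int runWorker(StubExtractor& extractor, const int numFrames)
{
    std::thread worker([&extractor, numFrames]
    {
        extractor.initializationOnThread(); // called once, in this thread
        for (int frame = 0; frame < numFrames; ++frame)
            extractor.forwardPass();        // same thread as the init call
    });
    worker.join();
    return extractor.framesProcessed();
}
```

Calling `forwardPass` from a different thread (e.g., the main thread) would violate the contract and, with the real extractors, typically fails because per-thread GPU/net state was set up elsewhere.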
## Custom Standalone Hand Keypoint Detector
The analogous steps apply to the hand keypoint detector, using `include/openpose/hand/handExtractorNet.hpp` instead.
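For hands, each person carries a left/right pair of rectangles, and (as in the `10_hand_from_image.cpp` example) an all-zero rectangle marks a hand that is not visible. A small sketch of iterating such pairs, with `HandRectangle` as an illustrative stand-in for `op::Rectangle<float>`:

```cpp
#include <array>
#include <vector>

// Illustrative stand-in for op::Rectangle<float>.
struct HandRectangle { float x, y, width, height; };

// Convention from the tutorial examples: an all-zero rectangle means
// "this hand is not visible, skip it".
bool isEmptyRectangle(const HandRectangle& rectangle)
{
    return rectangle.width <= 0.f || rectangle.height <= 0.f;
}

// Count how many hand keypoint estimations would actually run,
// given one {left, right} pair per person.
int countVisibleHands(const std::vector<std::array<HandRectangle, 2>>& handRectangles)
{
    int visible = 0;
    for (const auto& person : handRectangles)   // one pair per person
        for (const auto& hand : person)         // index 0: left, index 1: right
            if (!isEmptyRectangle(hand))
                ++visible;
    return visible;
}
```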
Advanced solution: If you want to use the whole OpenPose framework, you can use the synchronous examples in the `tutorial_api_cpp` folder with the configuration used for [examples/tutorial_api_cpp/09_face_from_image.cpp](https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/examples/tutorial_api_cpp/09_face_from_image.cpp) and [examples/tutorial_api_cpp/10_hand_from_image.cpp](https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/examples/tutorial_api_cpp/10_hand_from_image.cpp).
......@@ -170,9 +170,9 @@ int tutorialApiCpp()
// Read image and face rectangle locations
const auto imageToProcess = cv::imread(FLAGS_image_path);
const std::vector<op::Rectangle<float>> faceRectangles{
op::Rectangle<float>{330.119385f, 277.532715f, 48.717274f, 48.717274f}, // Face of person 0
op::Rectangle<float>{24.036991f, 267.918793f, 65.175171f, 65.175171f}, // Face of person 1
op::Rectangle<float>{151.803436f, 32.477852f, 108.295761f, 108.295761f} // Face of person 2
};
// Create new datum
......
......@@ -170,18 +170,18 @@ int tutorialApiCpp()
// Read image and hand rectangle locations
const auto imageToProcess = cv::imread(FLAGS_image_path);
const std::vector<std::array<op::Rectangle<float>, 2>> handRectangles{
// Left/Right hands of person 0
std::array<op::Rectangle<float>, 2>{
op::Rectangle<float>{320.035889f, 377.675049f, 69.300949f, 69.300949f}, // Left hand
op::Rectangle<float>{0.f, 0.f, 0.f, 0.f}}, // Right hand
// Left/Right hands of person 1
std::array<op::Rectangle<float>, 2>{
op::Rectangle<float>{80.155792f, 407.673492f, 80.812706f, 80.812706f}, // Left hand
op::Rectangle<float>{46.449715f, 404.559753f, 98.898178f, 98.898178f}}, // Right hand
// Left/Right hands of person 2
std::array<op::Rectangle<float>, 2>{
op::Rectangle<float>{185.692673f, 303.112244f, 157.587555f, 157.587555f}, // Left hand
op::Rectangle<float>{88.984360f, 268.866547f, 117.818230f, 117.818230f}} // Right hand
};
// Create new datum
......
......@@ -30,8 +30,9 @@ namespace op
WrapperStructPose& wrapperStructPose, const WrapperStructFace& wrapperStructFace,
const WrapperStructHand& wrapperStructHand, const WrapperStructExtra& wrapperStructExtra,
const WrapperStructInput& wrapperStructInput, const WrapperStructOutput& wrapperStructOutput,
const WrapperStructGui& wrapperStructGui, const bool renderOutput, const bool userInputAndPreprocessingWsEmpty,
const bool userOutputWsEmpty, const std::shared_ptr<Producer>& producerSharedPtr,
const ThreadManagerMode threadManagerMode);
/**
* Thread ID increase (private internal function).
......@@ -132,11 +133,12 @@ namespace op
const auto renderHandGpu = wrapperStructHand.enable && wrapperStructHand.renderMode == RenderMode::Gpu;
// Check no wrong/contradictory flags enabled
const auto userInputAndPreprocessingWsEmpty = userInputWs.empty();
const auto userOutputWsEmpty = userOutputWs.empty();
wrapperConfigureSanityChecks(
wrapperStructPose, wrapperStructFace, wrapperStructHand, wrapperStructExtra, wrapperStructInput,
wrapperStructOutput, wrapperStructGui, renderOutput, userInputAndPreprocessingWsEmpty,
userOutputWsEmpty, producerSharedPtr, threadManagerMode);
// Get number threads
auto numberThreads = wrapperStructPose.gpuNumber;
......
......@@ -11,11 +11,11 @@ if [[ $RUN_EXAMPLES == true ]] ; then
echo " "
echo "OpenPose demo..."
./build/examples/openpose/openpose.bin --net_resolution -1x32 --image_dir examples/media/ --write_json output/ --write_images output/ --display 0 --render_pose 1
echo " "
echo "Tutorial Add Module: Example 1..."
./build/examples/tutorial_add_module/1_custom_post_processing.bin --net_resolution -1x32 --image_dir examples/media/ --write_json output/ --write_images output/ --display 0 --render_pose 1
echo " "
# # Note: Examples 1-2 require the whole OpenPose resolution (too much RAM memory) and the GUI
......@@ -37,7 +37,7 @@ if [[ $RUN_EXAMPLES == true ]] ; then
echo " "
echo "Tutorial API C++: Example 6..."
./build/examples/tutorial_api_cpp/06_asynchronous_custom_input.bin --net_resolution -1x32 --image_dir examples/media/ --write_json output/ --write_images output/ --display 0 --render_pose 1
echo " "
echo "Tutorial API C++: Example 7..."
......@@ -57,11 +57,11 @@ if [[ $RUN_EXAMPLES == true ]] ; then
echo " "
echo "Tutorial API C++: Example 11..."
./build/examples/tutorial_api_cpp/11_synchronous_custom_input.bin --net_resolution -1x32 --image_dir examples/media/ --write_json output/ --write_images output/ --display 0 --render_pose 1
echo " "
echo "Tutorial API C++: Example 13..."
./build/examples/tutorial_api_cpp/13_synchronous_custom_postprocessing.bin --net_resolution -1x32 --image_dir examples/media/ --write_json output/ --write_images output/ --display 0 --render_pose 1
echo " "
echo "Tutorial API C++: Example 14..."
......@@ -76,7 +76,7 @@ if [[ $RUN_EXAMPLES == true ]] ; then
if [[ $WITH_PYTHON == true ]] ; then
echo "Tutorial API Python: OpenPose demo..."
cd build/examples/tutorial_api_python
python openpose_python.py --net_resolution -1x32 --image_dir ../../../examples/media/ --write_json output/ --write_images output/ --display 0 --render_pose 1
echo " "
# Note: All Python examples require GUI
fi
......
......@@ -39,8 +39,8 @@ namespace op
};
#ifdef USE_CAFFE
void updateFaceHeatMapsForPerson(
Array<float>& heatMaps, const int person, const ScaleMode heatMapScaleMode, const float* heatMapsGpuPtr)
{
try
{
......@@ -198,6 +198,11 @@ namespace op
for (auto person = 0 ; person < numberPeople ; person++)
{
const auto& faceRectangle = faceRectangles.at(person);
// Sanity check
if (faceRectangle.width != faceRectangle.height)
error("Face rectangle for face keypoint estimation must be squared, i.e.,"
" width = height (" + std::to_string(faceRectangle.width) + " vs. "
+ std::to_string(faceRectangle.height) + ").", __LINE__, __FUNCTION__, __FILE__);
// Only consider faces with a minimum pixel area
const auto minFaceSize = fastMin(faceRectangle.width, faceRectangle.height);
// // Debugging -> red rectangle
......@@ -246,9 +251,10 @@ namespace op
if (!upImpl->netInitialized)
{
upImpl->netInitialized = true;
reshapeFaceExtractorCaffe(
upImpl->spResizeAndMergeCaffe, upImpl->spMaximumCaffe,
upImpl->spCaffeNetOutputBlob, upImpl->spHeatMapsBlob,
upImpl->spPeaksBlob, upImpl->mGpuId);
}
// 2. Resize heat maps + merge different scales
......@@ -268,12 +274,12 @@ namespace op
const auto score = facePeaksPtr[xyIndex + 2];
const auto baseIndex = mFaceKeypoints.getSize(2)
* (part + person * mFaceKeypoints.getSize(1));
mFaceKeypoints[baseIndex] = float(
Mscaling.at<double>(0,0) * x + Mscaling.at<double>(0,1) * y
+ Mscaling.at<double>(0,2));
mFaceKeypoints[baseIndex+1] = float(
Mscaling.at<double>(1,0) * x + Mscaling.at<double>(1,1) * y
+ Mscaling.at<double>(1,2));
mFaceKeypoints[baseIndex+2] = score;
}
// HeatMaps: storing
......
......@@ -291,6 +291,11 @@ namespace op
for (auto person = 0 ; person < numberPeople ; person++)
{
const auto& handRectangle = handRectangles.at(person).at(hand);
// Sanity check
if (handRectangle.width != handRectangle.height)
error("Hand rectangle for hand keypoint estimation must be squared, i.e.,"
" width = height (" + std::to_string(handRectangle.width) + " vs. "
+ std::to_string(handRectangle.height) + ").", __LINE__, __FUNCTION__, __FILE__);
// Only consider hands with a minimum pixel area
const auto minHandSize = fastMin(handRectangle.width, handRectangle.height);
// // Debugging -> red rectangle
......
......@@ -4,17 +4,13 @@
namespace op
{
void wrapperConfigureSanityChecks(
WrapperStructPose& wrapperStructPose, const WrapperStructFace& wrapperStructFace,
const WrapperStructHand& wrapperStructHand, const WrapperStructExtra& wrapperStructExtra,
const WrapperStructInput& wrapperStructInput, const WrapperStructOutput& wrapperStructOutput,
const WrapperStructGui& wrapperStructGui, const bool renderOutput,
const bool userInputAndPreprocessingWsEmpty, const bool userOutputWsEmpty,
const std::shared_ptr<Producer>& producerSharedPtr, const ThreadManagerMode threadManagerMode)
{
try
{
......@@ -114,6 +110,24 @@ namespace op
error("Body, face, and hand keypoint detectors are disabled. You must enable at least one (i.e.,"
" unselect `--body_disable`, select `--face`, or select `--hand`).",
__LINE__, __FUNCTION__, __FILE__);
const auto ownDetectorProvided = (wrapperStructFace.detector == Detector::Provided
|| wrapperStructHand.detector == Detector::Provided);
if (ownDetectorProvided && userInputAndPreprocessingWsEmpty
&& threadManagerMode != ThreadManagerMode::Asynchronous
&& threadManagerMode != ThreadManagerMode::AsynchronousIn)
error("You have selected to provide your own face and/or hand rectangle detections (`face_detector 2`"
" and/or `hand_detector 2`), thus OpenPose will not detect face and/or hand keypoints based on"
" the body keypoints. However, you are not providing any information about the location of the"
" faces and/or hands. Either provide the location of the face and/or hands (e.g., see the"
" `examples/tutorial_api_cpp/` examples), or change the value of `--face_detector` and/or"
" `--hand_detector`.", __LINE__, __FUNCTION__, __FILE__);
// Warning
if (ownDetectorProvided && wrapperStructPose.enable)
log("Warning: Body keypoint estimation is enabled while you have also selected to provide your own"
" face and/or hand rectangle detections (`face_detector 2` and/or `hand_detector 2`). Therefore,"
" OpenPose will not detect face and/or hand keypoints based on the body keypoints. Are you sure"
" you want to keep the body keypoint detector enabled? (Disable it with `--body_disable`.)",
Priority::High);
// If 3-D module, 1 person is the maximum
if (wrapperStructExtra.reconstruct3d && wrapperStructPose.numberPeopleMax != 1)
{
......