Problem with server-side C++ inference on Linux
Created by: yjmm10
Following the official tutorial: cmake 3.10.3, opencv 3.4.7, gcc/g++ 4.9.3, running in a conda virtual environment with CUDA 10.2 and cudnn 7.6. The inference folder is placed under the paddle folder of the C++ inference library:

inference/
|-- det_db
|   |-- model
|   |-- params
|-- rec_rcnn
|   |-- model
|   |-- params
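A quick listing to confirm that layout on disk (just a sanity-check sketch; the directory and file names are the ones from the tree above):

find inference -maxdepth 2 -type f
# expected:
# inference/det_db/model
# inference/det_db/params
# inference/rec_rcnn/model
# inference/rec_rcnn/params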
----------------------------------------------------------------------------------- build.sh
OPENCV_DIR=/home/petrichor/github/Paddle/opencv-3.4.7/opencv3
LIB_DIR=/home/petrichor/github/Paddle/download/fluid_inference-avx-mkl/fluid_inference
CUDA_LIB_DIR=/usr/local/cuda/lib64
CUDNN_LIB_DIR=/usr/lib/x86_64-linux-gnu/
BUILD_DIR=build
rm -rf ${BUILD_DIR}
mkdir ${BUILD_DIR}
cd ${BUILD_DIR}
cmake .. \
    -DPADDLE_LIB=${LIB_DIR} \
    -DWITH_MKL=ON \
    -DWITH_GPU=OFF \
    -DWITH_STATIC_LIB=OFF \
    -DUSE_TENSORRT=OFF \
    -DOPENCV_DIR=${OPENCV_DIR} \
    -DCUDNN_LIB=${CUDNN_LIB_DIR} \
    -DCUDA_LIB=${CUDA_LIB_DIR}
make -j
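For reference, a quick way to sanity-check the two library paths before building (a rough sketch; that the fluid_inference package ships its main library as paddle/lib/libpaddle_fluid.so is an assumption about its layout):

ls ${LIB_DIR}/paddle/lib        # should contain libpaddle_fluid.so (assumed name)
ls ${OPENCV_DIR}/lib64          # should contain the opencv libs (libopencv_*.a)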
-----------------------------------------------------------------------------------run.sh
./build/ocr_system ./tools/config.txt ../../doc/imgs/12.jpg
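Note: the relative paths in run.sh (./build/..., ./tools/..., ../../doc/...) are meant to resolve from deploy/cpp_infer, i.e. roughly:

cd /home/petrichor/github/PaddleOCR/deploy/cpp_infer
./build/ocr_system ./tools/config.txt ../../doc/imgs/12.jpg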
-----------------------------------------------------------------------------------build output
-- The CXX compiler identification is GNU 4.9.3
-- The C compiler identification is GNU 4.9.3
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Found OpenCV: /home/petrichor/github/Paddle/opencv-3.4.7/opencv3 (found version "3.4.7")
flags -g -o3 -std=c++11
CMake Warning (dev) in CMakeLists.txt:
  No cmake_minimum_required command is present.  A line of code such as

    cmake_minimum_required(VERSION 3.10)

  should be added at the top of the file.  The version specified may be lower
  if you wish to support older CMake versions for this project.  For more
  information run "cmake --help-policy CMP0000".
This warning is for project developers.  Use -Wno-dev to suppress it.
-- Configuring done
CMake Warning (dev) at CMakeLists.txt:188 (add_executable):
  Policy CMP0003 should be set before this line.  Add code such as

    if(COMMAND cmake_policy)
      cmake_policy(SET CMP0003 NEW)
    endif(COMMAND cmake_policy)

  as early as possible but after the most recent call to
  cmake_minimum_required or cmake_policy(VERSION).  This warning appears
  because target "ocr_system" links to some libraries for which the linker
  must search:

    glog, gflags, protobuf, z, xxhash, -ldl -lrt -lgomp -lz -lm -lpthread, dl,
    m, pthread, rt

  and other libraries with known full path:

    /home/petrichor/github/Paddle/download/fluid_inference-avx-mkl/fluid_inference/third_party/install/mklml/lib/libiomp5.so
    /home/petrichor/github/Paddle/download/fluid_inference-avx-mkl/fluid_inference/third_party/install/mkldnn/lib/libmkldnn.so.0
    /home/petrichor/github/Paddle/opencv-3.4.7/opencv3/lib64/libopencv_calib3d.a
    /home/petrichor/github/Paddle/opencv-3.4.7/opencv3/share/OpenCV/3rdparty/lib64/liblibprotobuf.a

  CMake is adding directories in the second list to the linker search path in
  case they are needed to find libraries from the first list (for backwards
  compatibility with CMake 2.4).  Set policy CMP0003 to OLD or NEW to enable
  or disable this behavior explicitly.  Run "cmake --help-policy CMP0003" for
  more information.
This warning is for project developers.  Use -Wno-dev to suppress it.
-- Generating done
CMake Warning:
  Manually-specified variables were not used by the project:

    USE_TENSORRT
-- Build files have been written to: /home/petrichor/github/PaddleOCR/deploy/cpp_infer/build
-----------------------------------------------------------------------------------run output
=======Paddle OCR inference config======
char_list_file : ../../ppocr/utils/ppocr_keys_v1.txt
cpu_math_library_num_threads : 10
det_db_box_thresh : 0.5
det_db_thresh : 0.3
det_db_unclip_ratio : 2.0
det_model_dir : ./inference/ch_det_mv3_db
gpu_id : 0
gpu_mem : 4000
max_side_len : 960
rec_model_dir : ./inference/ch_rec_mv3_crnn
use_gpu : 0
use_mkldnn : 0
visualize : 1
=======End of Paddle OCR inference config======
Segmentation fault (core dumped)
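Since every path in the dump above is relative to the working directory, here is a quick way to double-check them and to get a backtrace for the crash (a generic debugging sketch, not PaddleOCR-specific; paths are copied verbatim from the printed config):

# check that the configured paths resolve from where ocr_system is started:
ls ./inference/ch_det_mv3_db            # should contain `model` and `params`
ls ./inference/ch_rec_mv3_crnn          # should contain `model` and `params`
ls ../../ppocr/utils/ppocr_keys_v1.txt

# get a backtrace for the segfault:
gdb --args ./build/ocr_system ./tools/config.txt ../../doc/imgs/12.jpg
# inside gdb: `run`, then `bt` once it crashes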
In the end there is no recognition output, only the segmentation fault. Along the way a "libiomp5.so not found" error also appeared; after I copied the libiomp5.so file from the Paddle inference library into /usr/lib/x86_64-linux-gnu/, that error went away.
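For reference, the libiomp5.so lookup can also be handled without copying the file into /usr/lib, by putting the mklml lib directory of the inference package on the loader path (a sketch; the path is the one printed in the CMake warning above):

export LD_LIBRARY_PATH=/home/petrichor/github/Paddle/download/fluid_inference-avx-mkl/fluid_inference/third_party/install/mklml/lib:${LD_LIBRARY_PATH}
./build/ocr_system ./tools/config.txt ../../doc/imgs/12.jpg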
Any advice would be appreciated!