model_optimize_tool was not built
Created by: xiaolvtaomi
I followed the model_optimize_tool document in the wiki. First: after running build.sh as described in step 2 of that document, it reported neither success nor failure. Second: in the directory named in step 3, and in inference_lite_lib.android.armv8, there is no bin/model_optimize_tool. Under inference_lite_lib.android.armv8 there are only the demo and java subdirectories; the bin directory is missing.
```
root@docker-desktop:/Paddle-Lite# /Paddle-Lite/lite/tools/build.sh --arm_os=android --arm_abi=armv8 --arm_lang=gcc --android_stl=c++_static full_pulish
USAGE:
compile tiny publish so lib: ./build.sh --arm_os=<os> --arm_abi=<abi> --arm_lang=<lang> --android_stl=<stl> tiny_publish
compile full publish so lib: ./build.sh --arm_os=<os> --arm_abi=<abi> --arm_lang=<lang> --android_stl=<stl> full_publish
compile all arm tests: ./build.sh --arm_os=<os> --arm_abi=<abi> --arm_lang=<lang> test
argument choices:
--arm_os: android    --arm_abi: armv8|armv7    --arm_lang: gcc|clang    --android_stl: c++_static|c++_shared
tasks:
tiny_publish: a small library for deployment.
full_publish: a full library for debug and test.
test: produce all the unittests.
root@docker-desktop:/Paddle-Lite# cd build.lite.android.armv8.gcc/
root@docker-desktop:/Paddle-Lite/build.lite.android.armv8.gcc# ls
CMakeCache.txt  Makefile  inference_lite_lib.android.armv8  lite_libs.txt  paddle_api_light_bundled.ar  third_party
CMakeFiles  cmake_install.cmake  lite  lite_tests.txt  paddle_api_light_bundled.ar.in
root@docker-desktop:/Paddle-Lite/build.lite.android.armv8.gcc# cd lite
root@docker-desktop:/Paddle-Lite/build.lite.android.armv8.gcc/lite# ls
CMakeFiles  Makefile  api  arm  cmake_install.cmake  core  cuda  fluid  fpga  host  kernels  model_parser  npu  opencl  operators  utils  x86
root@docker-desktop:/Paddle-Lite/build.lite.android.armv8.gcc/lite# cd api/
root@docker-desktop:/Paddle-Lite/build.lite.android.armv8.gcc/lite/api# ls
CMakeFiles  Makefile  android  cmake_install.cmake
root@docker-desktop:/Paddle-Lite/build.lite.android.armv8.gcc/lite/api# cd ..
root@docker-desktop:/Paddle-Lite/build.lite.android.armv8.gcc/lite# ls
CMakeFiles  Makefile  api  arm  cmake_install.cmake  core  cuda  fluid  fpga  host  kernels  model_parser  npu  opencl  operators  utils  x86
root@docker-desktop:/Paddle-Lite/build.lite.android.armv8.gcc/lite# cd ..
root@docker-desktop:/Paddle-Lite/build.lite.android.armv8.gcc# ls
CMakeCache.txt  Makefile  inference_lite_lib.android.armv8  lite_libs.txt  paddle_api_light_bundled.ar  third_party
CMakeFiles  cmake_install.cmake  lite  lite_tests.txt  paddle_api_light_bundled.ar.in
root@docker-desktop:/Paddle-Lite/build.lite.android.armv8.gcc# cd inference_lite_lib.android.armv8/
root@docker-desktop:/Paddle-Lite/build.lite.android.armv8.gcc/inference_lite_lib.android.armv8# ls
demo  java
root@docker-desktop:/Paddle-Lite/build.lite.android.armv8.gcc/inference_lite_lib.android.armv8# pwd
/Paddle-Lite/build.lite.android.armv8.gcc/inference_lite_lib.android.armv8
root@docker-desktop:/Paddle-Lite/build.lite.android.armv8.gcc/inference_lite_lib.android.armv8# ls
demo  java
```
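One thing worth noting in the transcript above: the task name passed to build.sh is misspelled (`full_pulish` instead of `full_publish`), which would explain why the script only printed its USAGE text and never started a build. A minimal sketch of that kind of task dispatch, assuming build.sh matches task names exactly and falls through to USAGE on anything unrecognized (the `case` branches here are an illustration, not the script's actual code):

```shell
#!/bin/sh
# Simulate build.sh task dispatch: exact task names are recognized,
# anything else (including the typo "full_pulish") falls through.
dispatch() {
  task="$1"
  case "$task" in
    tiny_publish|full_publish|test)
      echo "recognized task: $task"
      ;;
    *)
      echo "unrecognized task '$task' -- would print USAGE and exit"
      ;;
  esac
}

dispatch full_pulish   # the typo from the transcript: falls through
dispatch full_publish  # the correct spelling: would start the build
```

If the misspelling is indeed the cause, re-running the original command with the task spelled `full_publish` may produce the missing bin directory under inference_lite_lib.android.armv8.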