Commit 36cd1193 authored by: LielinJiang

pull develop

......@@ -4,7 +4,6 @@ __pycache__/
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
......
......@@ -64,11 +64,15 @@ $ pip install -r requirements.txt
* [How to train U-Net](./turtorial/finetune_unet.md)
* [How to train ICNet](./turtorial/finetune_icnet.md)
* [How to train PSPNet](./turtorial/finetune_pspnet.md)
* [How to train HRNet](./turtorial/finetune_hrnet.md)
### Inference deployment
* [Model export](./docs/model_export.md)
* [Using the C++ inference library](./inference)
* [Inference with Python](./deploy/python/)
* [Inference with C++](./deploy/cpp/)
* [Mobile deployment](./deploy/lite/)
### Advanced features
......
TRAIN_CROP_SIZE: (512, 512) # (width, height), for unpadding, rangescaling and stepscaling
EVAL_CROP_SIZE: (512, 512) # (width, height), for unpadding, rangescaling and stepscaling
AUG:
AUG_METHOD: "unpadding" # choose from unpadding, rangescaling and stepscaling
FIX_RESIZE_SIZE: (512, 512) # (width, height), for unpadding
INF_RESIZE_VALUE: 500 # for rangescaling
MAX_RESIZE_VALUE: 600 # for rangescaling
MIN_RESIZE_VALUE: 400 # for rangescaling
MAX_SCALE_FACTOR: 1.25 # for stepscaling
MIN_SCALE_FACTOR: 0.75 # for stepscaling
SCALE_STEP_SIZE: 0.25 # for stepscaling
MIRROR: True
BATCH_SIZE: 4
DATASET:
DATA_DIR: "./dataset/mini_pet/"
IMAGE_TYPE: "rgb" # choose rgb or rgba
NUM_CLASSES: 3
TEST_FILE_LIST: "./dataset/mini_pet/file_list/test_list.txt"
TRAIN_FILE_LIST: "./dataset/mini_pet/file_list/train_list.txt"
VAL_FILE_LIST: "./dataset/mini_pet/file_list/val_list.txt"
VIS_FILE_LIST: "./dataset/mini_pet/file_list/test_list.txt"
IGNORE_INDEX: 255
SEPARATOR: " "
FREEZE:
MODEL_FILENAME: "__model__"
PARAMS_FILENAME: "__params__"
MODEL:
MODEL_NAME: "hrnet"
DEFAULT_NORM_TYPE: "bn"
HRNET:
STAGE2:
NUM_CHANNELS: [18, 36]
STAGE3:
NUM_CHANNELS: [18, 36, 72]
STAGE4:
NUM_CHANNELS: [18, 36, 72, 144]
TRAIN:
PRETRAINED_MODEL_DIR: "./pretrained_model/hrnet_w18_bn_cityscapes/"
MODEL_SAVE_DIR: "./saved_model/hrnet_w18_bn_pet/"
SNAPSHOT_EPOCH: 10
TEST:
TEST_MODEL: "./saved_model/hrnet_w18_bn_pet/final"
SOLVER:
NUM_EPOCHS: 100
LR: 0.005
LR_POLICY: "poly"
OPTIMIZER: "sgd"
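The stepscaling settings above define a discrete set of candidate scale factors between MIN_SCALE_FACTOR and MAX_SCALE_FACTOR in SCALE_STEP_SIZE increments. As an illustrative sketch (not PaddleSeg's actual implementation), the implied scale set could be enumerated as:

```python
# Sketch: candidate scales implied by the stepscaling settings above
# (MIN_SCALE_FACTOR=0.75, MAX_SCALE_FACTOR=1.25, SCALE_STEP_SIZE=0.25).
# Illustrative only; the real augmentation code may sample differently.
def candidate_scales(min_factor, max_factor, step):
    """Enumerate the discrete scale factors between min and max (inclusive)."""
    n_steps = int(round((max_factor - min_factor) / step)) + 1
    return [round(min_factor + i * step, 6) for i in range(n_steps)]

print(candidate_scales(0.75, 1.25, 0.25))  # [0.75, 1.0, 1.25]
```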
# PaddleSeg Inference Deployment
`PaddleSeg` currently supports deployment with `Python` and `C++` on `Windows` and `Linux`, and can also be integrated with `PaddleServing` for service-based deployment on `Linux`.
[1. Python inference (Linux and Windows)](./python/)
[2. C++ inference (Linux and Windows)](./cpp/)
[3. Serving deployment (Linux only)](./serving)
[4. Mobile deployment (Android only)](./lite)
......@@ -195,8 +195,8 @@ endif(NOT WIN32)
if(WITH_GPU)
if(NOT WIN32)
if (USE_TENSORRT)
set(DEPS ${DEPS} ${PADDLE_DIR}/third_party/install/tensorrt/lib/libnvinfer${CMAKE_STATIC_LIBRARY_SUFFIX})
set(DEPS ${DEPS} ${PADDLE_DIR}/third_party/install/tensorrt/lib/libnvinfer_plugin${CMAKE_STATIC_LIBRARY_SUFFIX})
set(DEPS ${DEPS} ${PADDLE_DIR}/third_party/install/tensorrt/lib/libnvinfer${CMAKE_SHARED_LIBRARY_SUFFIX})
set(DEPS ${DEPS} ${PADDLE_DIR}/third_party/install/tensorrt/lib/libnvinfer_plugin${CMAKE_SHARED_LIBRARY_SUFFIX})
endif()
set(DEPS ${DEPS} ${CUDA_LIB}/libcudart${CMAKE_SHARED_LIBRARY_SUFFIX})
set(DEPS ${DEPS} ${CUDNN_LIB}/libcudnn${CMAKE_SHARED_LIBRARY_SUFFIX})
......
# PaddleSeg C++ Inference Deployment Solution
# PaddleSeg Inference Deployment Solution
[1. Introduction](#1-introduction)
......@@ -11,7 +11,7 @@
## 1. Introduction
This directory provides a cross-platform C++ inference deployment solution for image segmentation models. With a small amount of configuration and code, users can integrate a model into their own services to perform image segmentation.
This directory provides a cross-platform C++ inference deployment solution for `PaddlePaddle` image segmentation models. With a small amount of configuration and code, users can integrate a model into their own services to perform image segmentation.
The main design goals are the following four points:
- Cross-platform: supports compiling, developing and deploying on Windows and Linux
......@@ -19,11 +19,13 @@
- Extensibility: users can implement custom data pre-processing and post-processing logic for new models
- High performance: besides the performance advantages of `PaddlePaddle` itself, key steps are optimized for the characteristics of image segmentation
**Note**: for the `Python` inference deployment method, please refer to [Python deployment](../python/)
## 2. Main directories and files
```
inference
cpp
├── demo.cpp # C++ demo code that loads the model, reads input data and runs prediction
|
├── conf
......@@ -88,6 +90,8 @@ deeplabv3p_xception65_humanseg
DEPLOY:
# whether to use GPU for inference
USE_GPU: 1
# whether this is a new PaddleSeg 0.3.0 model
USE_PR: 1
# directory containing the model and parameter files
MODEL_PATH: "/root/projects/models/deeplabv3p_xception65_humanseg"
# model file name
......@@ -123,11 +127,11 @@ DEPLOY:
On `Linux`, run the following commands:
```shell
./demo --conf=/root/projects/PaddleSeg/inference/conf/humanseg.yaml --input_dir=/root/projects/PaddleSeg/inference/images/humanseg/
./demo --conf=/root/projects/PaddleSeg/deploy/cpp/conf/humanseg.yaml --input_dir=/root/projects/PaddleSeg/deploy/cpp/images/humanseg/
```
On `Windows`, run the following commands:
```shell
D:\projects\PaddleSeg\inference\build\Release>demo.exe --conf=D:\\projects\\PaddleSeg\\inference\\conf\\humanseg.yaml --input_dir=D:\\projects\\PaddleSeg\\inference\\images\\humanseg\\
D:\projects\PaddleSeg\deploy\cpp\build\Release>demo.exe --conf=D:\\projects\\PaddleSeg\\deploy\\cpp\\conf\\humanseg.yaml --input_dir=D:\\projects\\PaddleSeg\\deploy\\cpp\\images\\humanseg\\
```
......@@ -139,7 +143,7 @@ D:\projects\PaddleSeg\inference\build\Release>demo.exe --conf=D:\\projects\\Padd
| input_dir | directory of images to predict |
See the previous step for the configuration file description. The sample program scans all images with a **jpg or jpeg** suffix under input_dir and generates the corresponding prediction results (if there are no such images under input_dir, the program reports an error). Image segmentation classifies every pixel of `demo.jpg`, and the prediction result is saved in `demo_jpg.png`. The raw prediction image cannot be viewed directly and must be visualized. For binary segmentation models, the sample program automatically converts the prediction into a visualization saved in `demo_jpg_scoremap.png`, with the prediction at the original size in `demo_jpg_recover.png`, as shown below. For **multi-class** segmentation models, please refer to [visualization script usage](./docs/vis.md)
See the previous step for the configuration file description. The sample program scans all images with a **jpg or jpeg** suffix under input_dir and generates the corresponding prediction results (if there are no such images under input_dir, the program reports an error). Image segmentation classifies every pixel of `demo.jpg`, and the prediction result is saved in `demo_jpg_mask.png`. The raw prediction image cannot be viewed directly and must be visualized. To **visualize** the prediction results, please refer to [visualization script usage](./docs/vis.md)
Input image
![avatar](images/humanseg/demo2.jpeg)
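The output naming convention above (`demo.jpg` becomes `demo_jpg_mask.png`) mirrors the `rfind(".")` replacement done in the C++ demo code. A hypothetical Python sketch of the same rule (not part of the PaddleSeg codebase):

```python
# Sketch of the output-name convention: replace the last '.' with '_'
# and append the mask suffix, e.g. "demo.jpg" -> "demo_jpg_mask.png".
def mask_name(image_path):
    """Derive the mask file name from an input image path."""
    pos = image_path.rfind(".")
    return image_path[:pos] + "_" + image_path[pos + 1:] + "_mask.png"

print(mask_name("demo.jpg"))  # demo_jpg_mask.png
```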
......
......@@ -24,7 +24,7 @@ int main(int argc, char** argv) {
google::ParseCommandLineFlags(&argc, &argv, true);
if (FLAGS_conf.empty() || FLAGS_input_dir.empty()) {
std::cout << "Usage: ./predictor --conf=/config/path/to/your/model "
<< "--input_dir=/directory/of/your/input/images";
<< "--input_dir=/directory/of/your/input/images" << std::endl;
return -1;
}
// 1. create a predictor and init it with conf
......
......@@ -16,7 +16,7 @@
1. `mkdir -p /root/projects/ && cd /root/projects`
2. `git clone https://github.com/PaddlePaddle/PaddleSeg.git`
The `C++` inference code is in the `/root/projects/PaddleSeg/inference` directory, which does not depend on any other directory under `PaddleSeg`.
The `C++` inference code is in the `/root/projects/PaddleSeg/deploy/cpp` directory, which does not depend on any other directory under `PaddleSeg`.
### Step 2: Download the PaddlePaddle C++ inference library fluid_inference
......@@ -25,9 +25,9 @@ PaddlePaddle C++ 预测库主要分为CPU版本和GPU版本。其中,针对不
| Version | Link |
| ---- | ---- |
| CPU version | [fluid_inference.tgz](https://bj.bcebos.com/paddlehub/paddle_inference_lib/fluid_inference_linux_cpu_1.6.1.tgz) |
| CUDA 9.0 version | [fluid_inference.tgz](https://bj.bcebos.com/paddlehub/paddle_inference_lib/fluid_inference_linux_cuda97_1.6.1.tgz) |
| CUDA 10.0 version | [fluid_inference.tgz](https://bj.bcebos.com/paddlehub/paddle_inference_lib/fluid_inference_linux_cuda10_1.6.1.tgz) |
| CPU version | [fluid_inference.tgz](https://paddle-inference-lib.bj.bcebos.com/1.6.1-cpu-avx-mkl/fluid_inference.tgz) |
| CUDA 9.0 version | [fluid_inference.tgz](https://paddle-inference-lib.bj.bcebos.com/1.6.1-gpu-cuda9-cudnn7-avx-mkl/fluid_inference.tgz) |
| CUDA 10.0 version | [fluid_inference.tgz](https://paddle-inference-lib.bj.bcebos.com/1.6.1-gpu-cuda10-cudnn7-avx-mkl/fluid_inference.tgz) |
For different CPU types and instruction sets, more prebuilt inference library versions are officially available; the 1.6 inference library has now been released. For other versions, see: [C++ inference library download list](https://www.paddlepaddle.org.cn/documentation/docs/zh/develop/advanced_usage/deploy/inference/build_and_install_lib_cn.html)
......@@ -75,7 +75,7 @@ make install
When compiling with the **GPU version** of the inference library, run the following. **Note**: change the corresponding parameters to the actual paths of the dependencies above:
```shell
cd /root/projects/PaddleSeg/inference
cd /root/projects/PaddleSeg/deploy/cpp
mkdir build && cd build
cmake .. -DWITH_GPU=ON -DPADDLE_DIR=/root/projects/fluid_inference -DCUDA_LIB=/usr/local/cuda/lib64/ -DOPENCV_DIR=/root/projects/opencv3/ -DCUDNN_LIB=/usr/local/cuda/lib64/ -DWITH_STATIC_LIB=OFF
make
......@@ -83,7 +83,7 @@ make
When compiling with the **CPU version** of the inference library, run the following.
```shell
cd /root/projects/PaddleSeg/inference
cd /root/projects/PaddleSeg/deploy/cpp
mkdir build && cd build
cmake .. -DWITH_GPU=OFF -DPADDLE_DIR=/root/projects/fluid_inference -DOPENCV_DIR=/root/projects/opencv3/ -DWITH_STATIC_LIB=OFF
......@@ -98,4 +98,4 @@ make
./demo --conf=/path/to/your/conf --input_dir=/path/to/your/input/data/directory
```
For more details, see the README: [prediction and visualization](../README.md)
\ No newline at end of file
For more details, see the README: [prediction and visualization](../README.md)
......@@ -12,7 +12,7 @@ cd inference/tools/
# copy the image with the saved segmentation prediction result into this directory
cp XXX/demo_jpg.png .
# run the visualization script
python visualize.py demo.jpg demo_jpg.png vis_result.png
python visualize.py demo.jpg demo_jpg_mask.png vis_result.png
```
The meaning of each argument in the visualization example above is listed below; change the arguments to the **actual paths** of the images on your test machine.
......
......@@ -15,7 +15,7 @@
1. Open `cmd` and run `cd /d D:\projects`
2. `git clone http://gitlab.baidu.com/Paddle/PaddleSeg.git`
The `C++` inference code is in the `D:\projects\PaddleSeg\inference` directory, which does not depend on any other directory under `PaddleSeg`.
The `C++` inference code is in the `D:\projects\PaddleSeg\deploy\cpp` directory, which does not depend on any other directory under `PaddleSeg`.
### Step 2: Download the PaddlePaddle C++ inference library fluid_inference
......@@ -24,9 +24,9 @@ PaddlePaddle C++ 预测库主要分为两大版本:CPU版本和GPU版本。其
| Version | Link |
| ---- | ---- |
| CPU version | [fluid_inference_install_dir.zip](https://bj.bcebos.com/paddlehub/paddle_inference_lib/fluid_install_dir_win_cpu_1.6.zip) |
| CUDA 9.0 version | [fluid_inference_install_dir.zip](https://bj.bcebos.com/paddlehub/paddle_inference_lib/fluid_inference_install_dir_win_cuda9_1.6.1.zip) |
| CUDA 10.0 version | [fluid_inference_install_dir.zip](https://bj.bcebos.com/paddlehub/paddle_inference_lib/fluid_inference_install_dir_win_cuda10_1.6.1.zip) |
| CPU version | [fluid_inference_install_dir.zip](https://paddle-wheel.bj.bcebos.com/1.6.2/win-infer/mkl/cpu/fluid_inference_install_dir.zip) |
| CUDA 9.0 version | [fluid_inference_install_dir.zip](https://paddle-wheel.bj.bcebos.com/1.6.2/win-infer/mkl/post97/fluid_inference_install_dir.zip) |
| CUDA 10.0 version | [fluid_inference_install_dir.zip](https://paddle-wheel.bj.bcebos.com/1.6.2/win-infer/mkl/post107/fluid_inference_install_dir.zip) |
After extraction, the `D:\projects\fluid_inference` directory contains:
```
......@@ -70,19 +70,19 @@ call "C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\vcvarsall.bat" amd6
```bash
# switch to the inference code directory
cd /d D:\projects\PaddleSeg\inference\
cd /d D:\projects\PaddleSeg\deploy\cpp\
# create the build directory; to rebuild, just delete this directory
mkdir build
cd build
# generate the VS project with cmake
D:\projects\PaddleSeg\inference\build> cmake .. -G "Visual Studio 14 2015 Win64" -DWITH_GPU=ON -DPADDLE_DIR=D:\projects\fluid_inference -DCUDA_LIB=D:\projects\cudalib\v9.0\lib\x64 -DOPENCV_DIR=D:\projects\opencv -T host=x64
D:\projects\PaddleSeg\deploy\cpp\build> cmake .. -G "Visual Studio 14 2015 Win64" -DWITH_GPU=ON -DPADDLE_DIR=D:\projects\fluid_inference -DCUDA_LIB=D:\projects\cudalib\v9.0\lib\x64 -DOPENCV_DIR=D:\projects\opencv -T host=x64
```
When compiling with the **CPU version** of the inference library, run the following.
```bash
# switch to the inference code directory
cd /d D:\projects\PaddleSeg\inference\
cd /d D:\projects\PaddleSeg\deploy\cpp\
# create the build directory; to rebuild, just delete this directory
mkdir build
cd build
......@@ -102,7 +102,7 @@ D:\projects\PaddleSeg\inference\build> msbuild /m /p:Configuration=Release cpp_i
The executable produced by the `Visual Studio 2015` build is in the `build\release` directory; switch to it:
```
cd /d D:\projects\PaddleSeg\inference\build\release
cd /d D:\projects\PaddleSeg\deploy\cpp\build\release
```
Then run:
......
......@@ -15,7 +15,7 @@ Windows 平台下,我们使用`Visual Studio 2015` 和 `Visual Studio 2019 Com
### Step 1: Download the code
1. Download the source code: [download link](https://github.com/PaddlePaddle/PaddleSeg/archive/release/v0.2.0.zip)
1. Download the source code: [download link](https://github.com/PaddlePaddle/PaddleSeg/archive/release/v0.3.0.zip)
2. Extract it and rename the extracted directory to `PaddleSeg`
The following uses `D:\projects\PaddleSeg` as the code directory path.
......@@ -27,9 +27,9 @@ PaddlePaddle C++ 预测库主要分为两大版本:CPU版本和GPU版本。其
| Version | Link |
| ---- | ---- |
| CPU version | [fluid_inference_install_dir.zip](https://bj.bcebos.com/paddlehub/paddle_inference_lib/fluid_install_dir_win_cpu_1.6.zip) |
| CUDA 9.0 version | [fluid_inference_install_dir.zip](https://bj.bcebos.com/paddlehub/paddle_inference_lib/fluid_inference_install_dir_win_cuda9_1.6.1.zip) |
| CUDA 10.0 version | [fluid_inference_install_dir.zip](https://bj.bcebos.com/paddlehub/paddle_inference_lib/fluid_inference_install_dir_win_cuda10_1.6.1.zip) |
| CPU version | [fluid_inference_install_dir.zip](https://paddle-wheel.bj.bcebos.com/1.6.1/win-infer/mkl/cpu/fluid_inference_install_dir.zip) |
| CUDA 9.0 version | [fluid_inference_install_dir.zip](https://paddle-wheel.bj.bcebos.com/1.6.1/win-infer/mkl/post97/fluid_inference_install_dir.zip) |
| CUDA 10.0 version | [fluid_inference_install_dir.zip](https://paddle-wheel.bj.bcebos.com/1.6.1/win-infer/mkl/post107/fluid_inference_install_dir.zip) |
After extraction, the `D:\projects\fluid_inference` directory contains:
```
......@@ -74,6 +74,7 @@ fluid_inference
| *CUDA_LIB | path to the CUDA libraries |
| OPENCV_DIR | OpenCV installation path |
| PADDLE_DIR | path to the Paddle inference library |
**Note**: when using the CPU version of the inference library, uncheck CUDA_LIB.
![step4](https://paddleseg.bj.bcebos.com/inference/vs2019_step5.png)
......@@ -89,7 +90,7 @@ fluid_inference
The executable produced by the `Visual Studio 2019` build is in the `out\build\x64-Release` directory; open `cmd` and switch to it:
```
cd /d D:\projects\PaddleSeg\inference\out\build\x64-Release
cd /d D:\projects\PaddleSeg\deploy\cpp\out\build\x64-Release
```
Then run:
......
......@@ -36,7 +36,6 @@ namespace PaddleSolution {
const auto& model_dir = _model_config._model_path;
const auto& model_filename = _model_config._model_file_name;
const auto& params_filename = _model_config._param_file_name;
// load paddle model file
if (_model_config._predictor_mode == "NATIVE") {
paddle::NativeConfig config;
......@@ -52,6 +51,12 @@ namespace PaddleSolution {
paddle::AnalysisConfig config;
if (use_gpu) {
config.EnableUseGpu(100, 0);
if (TRT_MAP.find(_model_config._trt_mode) != TRT_MAP.end()) {
auto precision = TRT_MAP[_model_config._trt_mode];
bool use_calib = (precision == paddle::AnalysisConfig::Precision::kInt8);
config.EnableTensorRtEngine(1 << 30, _model_config._batch_size, 40,
precision, false, use_calib);
}
}
auto prog_file = utils::path_join(model_dir, model_filename);
auto param_file = utils::path_join(model_dir, params_filename);
......@@ -83,7 +88,6 @@ namespace PaddleSolution {
int blob_out_len = length;
int seg_out_len = eval_height * eval_width * eval_num_class;
if (blob_out_len != seg_out_len) {
LOG(ERROR) << " [FATAL] unequal: input vs output [" <<
seg_out_len << "|" << blob_out_len << "]" << std::endl;
......@@ -97,25 +101,22 @@ namespace PaddleSolution {
cv::Mat mask_png = cv::Mat(eval_height, eval_width, CV_8UC1);
mask_png.data = _mask.data();
std::string nname(fname);
auto pos = fname.find(".");
auto pos = fname.rfind(".");
nname[pos] = '_';
std::string mask_save_name = nname + ".png";
std::string mask_save_name = nname + "_mask.png";
cv::imwrite(mask_save_name, mask_png);
cv::Mat scoremap_png = cv::Mat(eval_height, eval_width, CV_8UC1);
scoremap_png.data = _scoremap.data();
std::string scoremap_save_name = nname
+ std::string("_scoremap.png");
std::string scoremap_save_name = nname + std::string("_scoremap.png");
cv::imwrite(scoremap_save_name, scoremap_png);
std::cout << "save mask of [" << fname << "] done" << std::endl;
if (height && width) {
int recover_height = *height;
int recover_width = *width;
cv::Mat recover_png = cv::Mat(recover_height,
recover_width, CV_8UC1);
cv::Mat recover_png = cv::Mat(recover_height, recover_width, CV_8UC1);
cv::resize(scoremap_png, recover_png,
cv::Size(recover_width, recover_height),
0, 0, cv::INTER_CUBIC);
cv::Size(recover_width, recover_height), 0, 0, cv::INTER_CUBIC);
std::string recover_name = nname + std::string("_recover.png");
cv::imwrite(recover_name, recover_png);
}
......@@ -176,8 +177,13 @@ namespace PaddleSolution {
}
paddle::PaddleTensor im_tensor;
im_tensor.name = "image";
im_tensor.shape = std::vector<int>{ batch_size, channels,
eval_height, eval_width };
if (!_model_config._use_pr) {
im_tensor.shape = std::vector<int>{ batch_size, channels,
eval_height, eval_width };
} else {
im_tensor.shape = std::vector<int>{ batch_size, eval_height,
eval_width, channels};
}
im_tensor.data.Reset(input_buffer.data(),
real_buffer_size * sizeof(float));
im_tensor.dtype = paddle::PaddleDType::FLOAT32;
......@@ -202,19 +208,45 @@ namespace PaddleSolution {
std::cout << _outputs[0].shape[j] << ",";
}
std::cout << ")" << std::endl;
const size_t nums = _outputs.front().data.length()
/ sizeof(float);
if (out_num % batch_size != 0 || out_num != nums) {
LOG(ERROR) << "outputs data size mismatch with shape size.";
size_t nums = _outputs.front().data.length() / sizeof(float);
if (_model_config._use_pr) {
nums = _outputs.front().data.length() / sizeof(int64_t);
}
// size mismatch checking
bool size_mismatch = (out_num % batch_size) != 0;
size_mismatch |= (!_model_config._use_pr) && (nums != out_num);
size_mismatch |= _model_config._use_pr && (nums != eval_height * eval_width);
if (size_mismatch) {
LOG(ERROR) << "output with an unexpected size";
return -1;
}
if (_model_config._use_pr) {
std::vector<uchar> out_data;
out_data.resize(out_num);
auto addr = reinterpret_cast<int64_t*>(_outputs[0].data.data());
for (int r = 0; r < out_num; ++r) {
out_data[r] = (int)(addr[r]);
}
for (int r = 0; r < batch_size; ++r) {
cv::Mat mask_png = cv::Mat(eval_height, eval_width, CV_8UC1);
mask_png.data = out_data.data() + eval_height*eval_width*r;
auto name = imgs_batch[r];
auto pos = name.rfind(".");
name[pos] = '_';
std::string mask_save_name = name + "_mask.png";
cv::imwrite(mask_save_name, mask_png);
}
continue;
}
for (int i = 0; i < batch_size; ++i) {
float* output_addr = reinterpret_cast<float*>(
_outputs[0].data.data())
+ i * (out_num / batch_size);
+ i * (nums / batch_size);
output_mask(imgs_batch[i], output_addr,
out_num / batch_size,
nums / batch_size,
&org_height[i],
&org_width[i]);
}
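The branches above select the input tensor layout: NHWC when `USE_PR` is set (the optimized 0.3.0 model format) and NCHW otherwise. A minimal Python sketch of that layout choice, with illustrative names:

```python
# Sketch: input tensor shape toggles between NCHW and NHWC depending
# on the USE_PR flag, as in the C++ code above. Illustrative only.
def input_tensor_shape(batch_size, channels, height, width, use_pr):
    """Return NHWC for optimized (USE_PR) models, NCHW otherwise."""
    if use_pr:
        return [batch_size, height, width, channels]
    return [batch_size, channels, height, width]

print(input_tensor_shape(1, 3, 513, 513, True))   # [1, 513, 513, 3]
print(input_tensor_shape(1, 3, 513, 513, False))  # [1, 3, 513, 513]
```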
......@@ -278,8 +310,14 @@ namespace PaddleSolution {
return -1;
}
auto im_tensor = _main_predictor->GetInputTensor("image");
im_tensor->Reshape({ batch_size, channels,
if (!_model_config._use_pr) {
im_tensor->Reshape({ batch_size, channels,
eval_height, eval_width });
} else {
im_tensor->Reshape({ batch_size, eval_height,
eval_width, channels});
}
im_tensor->copy_from_cpu(input_buffer.data());
auto t1 = std::chrono::high_resolution_clock::now();
......@@ -292,7 +330,6 @@ namespace PaddleSolution {
auto output_names = _main_predictor->GetOutputNames();
auto output_t = _main_predictor->GetOutputTensor(
output_names[0]);
std::vector<float> out_data;
std::vector<int> output_shape = output_t->shape();
int out_num = 1;
......@@ -303,6 +340,30 @@ namespace PaddleSolution {
}
std::cout << ")" << std::endl;
if (_model_config._use_pr) {
std::vector<int64_t> out_data;
out_data.resize(out_num);
output_t->copy_to_cpu(out_data.data());
std::vector<uchar> mask_data;
mask_data.resize(out_num);
auto addr = reinterpret_cast<int64_t*>(out_data.data());
for (int r = 0; r < out_num; ++r) {
mask_data[r] = (int)(addr[r]);
}
for (int r = 0; r < batch_size; ++r) {
cv::Mat mask_png = cv::Mat(eval_height, eval_width, CV_8UC1);
mask_png.data = mask_data.data() + eval_height*eval_width*r;
auto name = imgs_batch[r];
auto pos = name.rfind(".");
name[pos] = '_';
std::string mask_save_name = name + "_mask.png";
cv::imwrite(mask_save_name, mask_png);
}
continue;
}
std::vector<float> out_data;
out_data.resize(out_num);
output_t->copy_to_cpu(out_data.data());
for (int i = 0; i < batch_size; ++i) {
......
......@@ -55,5 +55,10 @@ class Predictor {
PaddleSolution::PaddleSegModelConfigPaser _model_config;
std::shared_ptr<PaddleSolution::ImagePreProcessor> _preprocessor;
std::unique_ptr<paddle::PaddlePredictor> _main_predictor;
std::map<std::string, paddle::AnalysisConfig::Precision> TRT_MAP = {
{"FP32", paddle::AnalysisConfig::Precision::kFloat32},
{"FP16", paddle::AnalysisConfig::Precision::kHalf},
{"INT8", paddle::AnalysisConfig::Precision::kInt8}
};
};
} // namespace PaddleSolution
......@@ -40,14 +40,18 @@ namespace PaddleSolution {
LOG(ERROR) << "Only rgb (gray) and rgba images are supported.";
return false;
}
cv::Size resize_size(_config->_resize[0], _config->_resize[1]);
int rw = resize_size.width;
int rh = resize_size.height;
if (*ori_h != rh || *ori_w != rw) {
cv::resize(im, im, resize_size, 0, 0, cv::INTER_LINEAR);
}
utils::normalize(im, data, _config->_mean, _config->_std);
if (!_config->_use_pr) {
utils::normalize(im, data, _config->_mean, _config->_std);
} else {
utils::flatten_mat(im, data);
}
return true;
}
......
......@@ -25,6 +25,7 @@ class PaddleSegModelConfigPaser {
:_class_num(0),
_channels(0),
_use_gpu(0),
_use_pr(0),
_batch_size(1),
_model_file_name("__model__"),
_param_file_name("__params__") {
......@@ -40,10 +41,12 @@ class PaddleSegModelConfigPaser {
_class_num = 0;
_channels = 0;
_use_gpu = 0;
_use_pr = 0;
_batch_size = 1;
_model_file_name.clear();
_model_path.clear();
_param_file_name.clear();
_trt_mode.clear();
}
std::string process_parenthesis(const std::string& str) {
......@@ -70,43 +73,120 @@ class PaddleSegModelConfigPaser {
bool load_config(const std::string& conf_file) {
reset();
YAML::Node config = YAML::LoadFile(conf_file);
YAML::Node config;
try {
config = YAML::LoadFile(conf_file);
} catch(...) {
return false;
}
// 1. get resize
auto str = config["DEPLOY"]["EVAL_CROP_SIZE"].as<std::string>();
_resize = parse_str_to_vec<int>(process_parenthesis(str));
if (config["DEPLOY"]["EVAL_CROP_SIZE"].IsDefined()) {
auto str = config["DEPLOY"]["EVAL_CROP_SIZE"].as<std::string>();
_resize = parse_str_to_vec<int>(process_parenthesis(str));
} else {
std::cerr << "Please set EVAL_CROP_SIZE: (xx, xx)" << std::endl;
return false;
}
// 2. get mean
for (const auto& item : config["DEPLOY"]["MEAN"]) {
_mean.push_back(item.as<float>());
if (config["DEPLOY"]["MEAN"].IsDefined()) {
for (const auto& item : config["DEPLOY"]["MEAN"]) {
_mean.push_back(item.as<float>());
}
} else {
std::cerr << "Please set MEAN: [xx, xx, xx]" << std::endl;
return false;
}
// 3. get std
for (const auto& item : config["DEPLOY"]["STD"]) {
_std.push_back(item.as<float>());
if(config["DEPLOY"]["STD"].IsDefined()) {
for (const auto& item : config["DEPLOY"]["STD"]) {
_std.push_back(item.as<float>());
}
} else {
std::cerr << "Please set STD: [xx, xx, xx]" << std::endl;
return false;
}
// 4. get image type
_img_type = config["DEPLOY"]["IMAGE_TYPE"].as<std::string>();
if (config["DEPLOY"]["IMAGE_TYPE"].IsDefined()) {
_img_type = config["DEPLOY"]["IMAGE_TYPE"].as<std::string>();
} else {
std::cerr << "Please set IMAGE_TYPE: \"rgb\" or \"rgba\"" << std::endl;
return false;
}
// 5. get class number
_class_num = config["DEPLOY"]["NUM_CLASSES"].as<int>();
if (config["DEPLOY"]["NUM_CLASSES"].IsDefined()) {
_class_num = config["DEPLOY"]["NUM_CLASSES"].as<int>();
} else {
std::cerr << "Please set NUM_CLASSES: x" << std::endl;
return false;
}
// 7. set model path
_model_path = config["DEPLOY"]["MODEL_PATH"].as<std::string>();
if (config["DEPLOY"]["MODEL_PATH"].IsDefined()) {
_model_path = config["DEPLOY"]["MODEL_PATH"].as<std::string>();
} else {
std::cerr << "Please set MODEL_PATH: \"/path/to/model_dir\"" << std::endl;
return false;
}
// 8. get model file_name
_model_file_name = config["DEPLOY"]["MODEL_FILENAME"].as<std::string>();
if (config["DEPLOY"]["MODEL_FILENAME"].IsDefined()) {
_model_file_name = config["DEPLOY"]["MODEL_FILENAME"].as<std::string>();
} else {
_model_file_name = "__model__";
}
// 9. get model param file name
_param_file_name =
config["DEPLOY"]["PARAMS_FILENAME"].as<std::string>();
if (config["DEPLOY"]["PARAMS_FILENAME"].IsDefined()) {
_param_file_name
= config["DEPLOY"]["PARAMS_FILENAME"].as<std::string>();
} else {
_param_file_name = "__params__";
}
// 10. get pre_processor
_pre_processor = config["DEPLOY"]["PRE_PROCESSOR"].as<std::string>();
if (config["DEPLOY"]["PRE_PROCESSOR"].IsDefined()) {
_pre_processor = config["DEPLOY"]["PRE_PROCESSOR"].as<std::string>();
} else {
std::cerr << "Please set PRE_PROCESSOR: \"DetectionPreProcessor\"" << std::endl;
return false;
}
// 11. use_gpu
_use_gpu = config["DEPLOY"]["USE_GPU"].as<int>();
if (config["DEPLOY"]["USE_GPU"].IsDefined()) {
_use_gpu = config["DEPLOY"]["USE_GPU"].as<int>();
} else {
_use_gpu = 0;
}
// 12. predictor_mode
_predictor_mode = config["DEPLOY"]["PREDICTOR_MODE"].as<std::string>();
if (config["DEPLOY"]["PREDICTOR_MODE"].IsDefined()) {
_predictor_mode = config["DEPLOY"]["PREDICTOR_MODE"].as<std::string>();
} else {
std::cerr << "Please set PREDICTOR_MODE: \"NATIVE\" or \"ANALYSIS\"" << std::endl;
return false;
}
// 13. batch_size
_batch_size = config["DEPLOY"]["BATCH_SIZE"].as<int>();
if (config["DEPLOY"]["BATCH_SIZE"].IsDefined()) {
_batch_size = config["DEPLOY"]["BATCH_SIZE"].as<int>();
} else {
_batch_size = 1;
}
// 14. channels
_channels = config["DEPLOY"]["CHANNELS"].as<int>();
if (config["DEPLOY"]["CHANNELS"].IsDefined()) {
_channels = config["DEPLOY"]["CHANNELS"].as<int>();
} else {
std::cerr << "Please set CHANNELS: x" << std::endl;
return false;
}
// 15. use_pr
if (config["DEPLOY"]["USE_PR"].IsDefined()) {
_use_pr = config["DEPLOY"]["USE_PR"].as<int>();
} else {
_use_pr = 0;
}
// 16. trt_mode
if (config["DEPLOY"]["TRT_MODE"].IsDefined()) {
_trt_mode = config["DEPLOY"]["TRT_MODE"].as<std::string>();
} else {
_trt_mode = "";
}
return true;
}
......@@ -173,6 +253,10 @@ class PaddleSegModelConfigPaser {
std::string _predictor_mode;
// DEPLOY.BATCH_SIZE
int _batch_size;
// DEPLOY.USE_PR: OP Optimized model
int _use_pr;
// DEPLOY.TRT_MODE: TRT Precision
std::string _trt_mode;
};
} // namespace PaddleSolution
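The `load_config` changes above treat some `DEPLOY` keys as required (failing with a message) and fall back to defaults for the rest. A rough Python sketch of the same defaulting behaviour (names here are illustrative, not PaddleSeg API):

```python
# Sketch of load_config's defaulting: required keys fail loudly,
# optional keys fall back to defaults. Illustrative only.
REQUIRED = ["EVAL_CROP_SIZE", "MEAN", "STD", "IMAGE_TYPE", "NUM_CLASSES",
            "MODEL_PATH", "PRE_PROCESSOR", "PREDICTOR_MODE", "CHANNELS"]
DEFAULTS = {"MODEL_FILENAME": "__model__", "PARAMS_FILENAME": "__params__",
            "USE_GPU": 0, "BATCH_SIZE": 1, "USE_PR": 0, "TRT_MODE": ""}

def load_deploy_config(deploy):
    """Validate a DEPLOY dict; return the merged config, or None on error."""
    for key in REQUIRED:
        if key not in deploy:
            print("Please set %s" % key)
            return None
    merged = dict(DEFAULTS)
    merged.update(deploy)
    return merged
```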
......@@ -23,7 +23,8 @@
#include <opencv2/highgui/highgui.hpp>
#ifdef _WIN32
#include <filesystem>
#define GLOG_NO_ABBREVIATED_SEVERITIES
#include <windows.h>
#else
#include <dirent.h>
#include <sys/types.h>
......@@ -67,15 +68,21 @@ namespace utils {
// scan a directory and get all files with input extensions
inline std::vector<std::string> get_directory_images(
const std::string& path, const std::string& exts) {
std::string pattern(path);
pattern.append("\\*");
std::vector<std::string> imgs;
for (const auto& item :
std::experimental::filesystem::directory_iterator(path)) {
auto suffix = item.path().extension().string();
if (exts.find(suffix) != std::string::npos && suffix.size() > 0) {
auto fullname = path_join(path,
item.path().filename().string());
imgs.push_back(item.path().string());
}
WIN32_FIND_DATA data;
HANDLE hFind;
if ((hFind = FindFirstFile(pattern.c_str(), &data)) != INVALID_HANDLE_VALUE) {
do {
auto fname = std::string(data.cFileName);
auto pos = fname.rfind(".");
auto ext = fname.substr(pos + 1);
if (ext.size() > 1 && exts.find(ext) != std::string::npos) {
imgs.push_back(path + "\\" + data.cFileName);
}
} while (FindNextFile(hFind, &data) != 0);
FindClose(hFind);
}
return imgs;
}
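For comparison, a rough Python analogue of `get_directory_images` above, using the same substring-based extension match (an illustrative sketch, not part of the PaddleSeg codebase):

```python
# Sketch: collect files in a directory whose extension appears in `exts`,
# mirroring the Win32 FindFirstFile loop above.
import os

def get_directory_images(path, exts):
    """Return sorted paths of files under `path` whose extension is in `exts`."""
    imgs = []
    for fname in sorted(os.listdir(path)):
        ext = fname.rsplit(".", 1)[-1] if "." in fname else ""
        # substring match, like the C++ exts.find(ext) call
        if ext and ext in exts:
            imgs.append(os.path.join(path, fname))
    return imgs
```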
......@@ -103,6 +110,25 @@ namespace utils {
}
}
// flatten a cv::mat
inline void flatten_mat(cv::Mat& im, float* data) {
int rh = im.rows;
int rw = im.cols;
int rc = im.channels();
#pragma omp parallel for
for (int h = 0; h < rh; ++h) {
const uchar* ptr = im.ptr<uchar>(h);
int im_index = 0;
int top_index = h * rw * rc;
for (int w = 0; w < rw; ++w) {
for (int c = 0; c < rc; ++c) {
float pixel = static_cast<float>(ptr[im_index++]);
data[top_index++] = pixel;
}
}
}
}
// argmax
inline void argmax(float* out, std::vector<int>& shape,
std::vector<uchar>& mask, std::vector<uchar>& scoremap) {
......
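The `flatten_mat` helper above copies an HxWxC image into a flat float buffer in HWC (channel-last) order, which matches the NHWC input layout used when `USE_PR` is enabled. A plain-Python sketch of the same traversal (illustrative, using nested lists instead of `cv::Mat`):

```python
# Sketch of flatten_mat: flatten an HxWxC image into a float list in
# HWC order (rows, then columns, then channels). Illustrative only.
def flatten_hwc(image):
    """image: nested list [h][w][c]; returns a flat float list in HWC order."""
    data = []
    for row in image:
        for pixel in row:
            for channel in pixel:
                data.append(float(channel))
    return data

print(flatten_hwc([[[1, 2], [3, 4]]]))  # [1.0, 2.0, 3.0, 4.0]
```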
# Deploying Portrait Segmentation on Mobile
## 1. Introduction
Using portrait segmentation on Android as an example, this document describes how to deploy segmentation models to mobile devices with [Paddle-Lite](https://github.com/PaddlePaddle/Paddle-Lite). Section 2 explains how to use the Android portrait segmentation demo; the following sections explain how to deploy a PaddleSeg model to an Android device.
## 2. Using the Android Demo
### 2.1 Requirements
* Android Studio 3.4;
* An Android phone or development board;
### 2.2 Installation
* git clone https://github.com/PaddlePaddle/PaddleSeg.git ;
* Open Android Studio; in the "Welcome to Android Studio" window, click "Open an existing Android Studio project", navigate to the "/PaddleSeg/lite/humanseg-android-demo/" directory in the path selection dialog, and click "Open" in the lower right corner to import the project
* Connect the Android phone or development board via USB;
* After the project loads, click Run->Run 'App' in the menu bar, select the connected Android device in the "Select Deployment Target" dialog, and click "OK";
* The demo's main screen appears on the phone; select the "Image Segmentation" icon to enter the portrait segmentation demo;
* The portrait segmentation demo loads a portrait image by default and shows the CPU prediction result below the image;
* In the demo, you can also load test images from the gallery or camera via the "Gallery" and "Take Photo" buttons at the top;
### 2.3 Miscellaneous
This Android demo is based on [Paddle-Lite-Demo](https://github.com/PaddlePaddle/Paddle-Lite-Demo); see that repo for more details.<br>
*Note: photos taken in the demo are automatically compressed. To test the effect on an uncompressed photo, take it with the phone camera and then open it from the gallery for prediction.*
### 2.4 Demo Results
<img src="example/human_1.png" width="20%" ><img src="example/human_2.png" width="20%" ><img src="example/human_3.png" width="20%" >
## 3. Model Export
The portrait segmentation model for this demo is available at this [download link](https://paddleseg.bj.bcebos.com/models/humanseg_mobilenetv2_1_0_bn_freeze_model_pr_po.zip). It is a humanseg model based on Deeplab_v3+ and MobileNet_v2; for an introduction to humanseg, see [specialized segmentation models](./contrib). For exporting other segmentation models, see: [model export](https://github.com/PaddlePaddle/PaddleSeg/blob/release/v0.2.0/docs/model_export.md)
## 4. Model Conversion
### 4.1 Model conversion tool
After preparing the model and parameter files exported by PaddleSeg, use the model_optimize_tool provided by Paddle-Lite to optimize the model and convert it to the file format supported by Paddle-Lite. There are two ways to do this:
* Build model_optimize_tool manually
For the detailed conversion procedure, see the official Paddle-Lite documentation: [model conversion](https://paddlepaddle.github.io/Paddle-Lite/v2.0.0/model_optimize_tool/). Running model_optimize_tool on the model exported from PaddleSeg produces the model.nb and param.nb files.
* Use a prebuilt model_optimize_tool; for the latest prebuilt binaries, see the [releases](https://github.com/PaddlePaddle/Paddle-Lite/releases/). This demo uses this version: [model_optimize_tool](https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.0.0/model_optimize_tool)
*Note: if it fails to run, use model_optimize_tool inside the development environment described in [building Paddle-Lite from source](https://paddlepaddle.github.io/Paddle-Lite/v2.0.0/source_compile/)*
### 4.2 Updating the model
Replace the files under app/src/main/assets/image_segmentation/models/deeplab_mobilenet_for_cpu with the optimized model.nb and param.nb files.
## 5. Updating the Inference Library
Building Paddle-Lite currently supports Docker, Linux and Mac OS development environments; Docker is recommended to avoid dependency problems, and prebuilt inference libraries are also provided. The Paddle-Lite Android inference library consists of three files:
* PaddlePredictor.jar;
* arm64-v8a/libpaddle_lite_jni.so;
* armeabi-v7a/libpaddle_lite_jni.so;
There are two ways to obtain them:
* Use a prebuilt inference library; for the latest prebuilt files, see the [releases](https://github.com/PaddlePaddle/Paddle-Lite/releases/). This demo uses:
* arm64-v8a: [inference_lite_lib.android.armv8](https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.0.0/inference_lite_lib.android.armv8.gcc.c++_shared.with_extra.full_publish.tar.gz) ;
* armeabi-v7a: [inference_lite_lib.android.armv7](https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.0.0/inference_lite_lib.android.armv7.gcc.c++_shared.with_extra.full_publish.tar.gz) ;
After extracting the two archives above, PaddlePredictor.jar is located in either one at inference_lite_lib.android.xxx/java/jar/PaddlePredictor.jar;
After extracting inference_lite_lib.android.armv8, arm64-v8a/libpaddle_lite_jni.so is at inference_lite_lib.android.armv8/java/so/libpaddle_lite_jni.so;
After extracting inference_lite_lib.android.armv7, armeabi-v7a/libpaddle_lite_jni.so is at inference_lite_lib.android.armv7/java/so/libpaddle_lite_jni.so;
* Build the Paddle-Lite inference library manually
For environment setup and build instructions, see: [building Paddle-Lite from source](https://paddlepaddle.github.io/Paddle-Lite/v2.0.0/source_compile/)
With the files above in place, you can run inference on Android following the [java_api](https://paddlepaddle.github.io/Paddle-Lite/v2.0.0/java_api_doc/) documentation. For how to use the inference library, see the "update inference library" section of the documentation in [Paddle-Lite-Demo](https://github.com/PaddlePaddle/Paddle-Lite-Demo).
*.iml
.gradle
/local.properties
/.idea/caches
/.idea/libraries
/.idea/modules.xml
/.idea/workspace.xml
/.idea/navEditor.xml
/.idea/assetWizardSettings.xml
.DS_Store
/build
/captures
.externalNativeBuild
apply plugin: 'com.android.application'
android {
compileSdkVersion 28
defaultConfig {
applicationId "com.baidu.paddle.lite.demo"
minSdkVersion 15
targetSdkVersion 28
versionCode 1
versionName "1.0"
testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}
}
}
dependencies {
implementation fileTree(include: ['*.jar'], dir: 'libs')
implementation 'com.android.support:appcompat-v7:28.0.0'
implementation 'com.android.support.constraint:constraint-layout:1.1.3'
implementation 'com.android.support:design:28.0.0'
testImplementation 'junit:junit:4.12'
androidTestImplementation 'com.android.support.test:runner:1.0.2'
androidTestImplementation 'com.android.support.test.espresso:espresso-core:3.0.2'
implementation files('libs/PaddlePredictor.jar')
}
#Mon Nov 25 17:01:58 CST 2019
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-5.4.1-all.zip
#!/usr/bin/env sh
##############################################################################
##
## Gradle start up script for UN*X
##
##############################################################################
# Attempt to set APP_HOME
# Resolve links: $0 may be a link
PRG="$0"
# Need this for relative symlinks.
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG=`dirname "$PRG"`"/$link"
fi
done
SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >/dev/null
APP_HOME="`pwd -P`"
cd "$SAVED" >/dev/null
APP_NAME="Gradle"
APP_BASE_NAME=`basename "$0"`
# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
DEFAULT_JVM_OPTS=""
# Use the maximum available, or set MAX_FD != -1 to use that value.
MAX_FD="maximum"
warn () {
echo "$*"
}
die () {
echo
echo "$*"
echo
exit 1
}
# OS specific support (must be 'true' or 'false').
cygwin=false
msys=false
darwin=false
nonstop=false
case "`uname`" in
CYGWIN* )
cygwin=true
;;
Darwin* )
darwin=true
;;
MINGW* )
msys=true
;;
NONSTOP* )
nonstop=true
;;
esac
CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar
# Determine the Java command to use to start the JVM.
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
if [ ! -x "$JAVACMD" ] ; then
die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
else
JAVACMD="java"
which java >/dev/null 2>&1 || die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
# Increase the maximum file descriptors if we can.
if [ "$cygwin" = "false" -a "$darwin" = "false" -a "$nonstop" = "false" ] ; then
MAX_FD_LIMIT=`ulimit -H -n`
if [ $? -eq 0 ] ; then
if [ "$MAX_FD" = "maximum" -o "$MAX_FD" = "max" ] ; then
MAX_FD="$MAX_FD_LIMIT"
fi
ulimit -n $MAX_FD
if [ $? -ne 0 ] ; then
warn "Could not set maximum file descriptor limit: $MAX_FD"
fi
else
warn "Could not query maximum file descriptor limit: $MAX_FD_LIMIT"
fi
fi
# For Darwin, add options to specify how the application appears in the dock
if $darwin; then
GRADLE_OPTS="$GRADLE_OPTS \"-Xdock:name=$APP_NAME\" \"-Xdock:icon=$APP_HOME/media/gradle.icns\""
fi
# For Cygwin, switch paths to Windows format before running java
if $cygwin ; then
APP_HOME=`cygpath --path --mixed "$APP_HOME"`
CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
JAVACMD=`cygpath --unix "$JAVACMD"`
# We build the pattern for arguments to be converted via cygpath
ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`
SEP=""
for dir in $ROOTDIRSRAW ; do
ROOTDIRS="$ROOTDIRS$SEP$dir"
SEP="|"
done
OURCYGPATTERN="(^($ROOTDIRS))"
# Add a user-defined pattern to the cygpath arguments
if [ "$GRADLE_CYGPATTERN" != "" ] ; then
OURCYGPATTERN="$OURCYGPATTERN|($GRADLE_CYGPATTERN)"
fi
# Now convert the arguments - kludge to limit ourselves to /bin/sh
i=0
for arg in "$@" ; do
CHECK=`echo "$arg"|egrep -c "$OURCYGPATTERN" -`
CHECK2=`echo "$arg"|egrep -c "^-"` ### Determine if an option
if [ $CHECK -ne 0 ] && [ $CHECK2 -eq 0 ] ; then ### Added a condition
eval `echo args$i`=`cygpath --path --ignore --mixed "$arg"`
else
eval `echo args$i`="\"$arg\""
fi
i=$((i+1))
done
case $i in
(0) set -- ;;
(1) set -- "$args0" ;;
(2) set -- "$args0" "$args1" ;;
(3) set -- "$args0" "$args1" "$args2" ;;
(4) set -- "$args0" "$args1" "$args2" "$args3" ;;
(5) set -- "$args0" "$args1" "$args2" "$args3" "$args4" ;;
(6) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" ;;
(7) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" ;;
(8) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" ;;
(9) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" "$args8" ;;
esac
fi
# Escape application args
save () {
for i do printf %s\\n "$i" | sed "s/'/'\\\\''/g;1s/^/'/;\$s/\$/' \\\\/" ; done
echo " "
}
APP_ARGS=$(save "$@")
# Collect all arguments for the java command, following the shell quoting and substitution rules
eval set -- $DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS "\"-Dorg.gradle.appname=$APP_BASE_NAME\"" -classpath "\"$CLASSPATH\"" org.gradle.wrapper.GradleWrapperMain "$APP_ARGS"
# by default we should be in the correct project dir, but when run from Finder on Mac, the cwd is wrong
if [ "$(uname)" = "Darwin" ] && [ "$HOME" = "$PWD" ]; then
cd "$(dirname "$0")"
fi
exec "$JAVACMD" "$@"
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem
@rem Gradle startup script for Windows
@rem
@rem ##########################################################################
@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal
set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS=
@rem Find java.exe
if defined JAVA_HOME goto findJavaFromJavaHome
set JAVA_EXE=java.exe
%JAVA_EXE% -version >NUL 2>&1
if "%ERRORLEVEL%" == "0" goto init
echo.
echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:findJavaFromJavaHome
set JAVA_HOME=%JAVA_HOME:"=%
set JAVA_EXE=%JAVA_HOME%/bin/java.exe
if exist "%JAVA_EXE%" goto init
echo.
echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME%
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:init
@rem Get command-line arguments, handling Windows variants
if not "%OS%" == "Windows_NT" goto win9xME_args
:win9xME_args
@rem Slurp the command line arguments.
set CMD_LINE_ARGS=
set _SKIP=2
:win9xME_args_slurp
if "x%~1" == "x" goto execute
set CMD_LINE_ARGS=%*
:execute
@rem Setup the command line
set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
@rem Execute Gradle
"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %CMD_LINE_ARGS%
:end
@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd
:fail
rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
rem the _cmd.exe /c_ return code!
if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1
exit /b 1
:mainEnd
if "%OS%"=="Windows_NT" endlocal
:omega
# Add project specific ProGuard rules here.
# You can control the set of applied configuration files using the
# proguardFiles setting in build.gradle.
#
# For more details, see
# http://developer.android.com/guide/developing/tools/proguard.html
# If your project uses WebView with JS, uncomment the following
# and specify the fully qualified class name to the JavaScript interface
# class:
#-keepclassmembers class fqcn.of.javascript.interface.for.webview {
# public *;
#}
# Uncomment this to preserve the line number information for
# debugging stack traces.
#-keepattributes SourceFile,LineNumberTable
# If you keep the line number information, uncomment this to
# hide the original source file name.
#-renamesourcefileattribute SourceFile
package com.baidu.paddle.lite.demo;
import android.content.Context;
import android.support.test.InstrumentationRegistry;
import android.support.test.runner.AndroidJUnit4;
import org.junit.Test;
import org.junit.runner.RunWith;
import static org.junit.Assert.*;
/**
* Instrumented test, which will execute on an Android device.
*
* @see <a href="http://d.android.com/tools/testing">Testing documentation</a>
*/
@RunWith(AndroidJUnit4.class)
public class ExampleInstrumentedTest {
@Test
public void useAppContext() {
// Context of the app under test.
Context appContext = InstrumentationRegistry.getTargetContext();
assertEquals("com.baidu.paddle.lite.demo", appContext.getPackageName());
}
}
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.baidu.paddle.lite.demo">
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.CAMERA"/>
<application
android:allowBackup="true"
android:largeHeap="true"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true"
android:theme="@style/AppTheme">
<activity android:name=".MainActivity">
<intent-filter>
<action android:name="android.intent.action.MAIN"/>
<category android:name="android.intent.category.LAUNCHER"/>
</intent-filter>
</activity>
<activity
android:name=".segmentation.ImgSegActivity"
android:label="Image Segmentation"/>
<activity
android:name=".segmentation.ImgSegSettingsActivity"
android:label="Settings">
</activity>
</application>
</manifest>
/*
* Copyright (C) 2014 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.baidu.paddle.lite.demo;
import android.content.res.Configuration;
import android.os.Bundle;
import android.preference.PreferenceActivity;
import android.support.annotation.LayoutRes;
import android.support.annotation.Nullable;
import android.support.v7.app.ActionBar;
import android.support.v7.app.AppCompatDelegate;
import android.support.v7.widget.Toolbar;
import android.view.MenuInflater;
import android.view.View;
import android.view.ViewGroup;
/**
* A {@link android.preference.PreferenceActivity} which implements and proxies the necessary calls
* to be used with AppCompat.
* <p>
* This technique can be used with an {@link android.app.Activity} class, not just
* {@link android.preference.PreferenceActivity}.
*/
public abstract class AppCompatPreferenceActivity extends PreferenceActivity {
private AppCompatDelegate mDelegate;
@Override
protected void onCreate(Bundle savedInstanceState) {
getDelegate().installViewFactory();
getDelegate().onCreate(savedInstanceState);
super.onCreate(savedInstanceState);
}
@Override
protected void onPostCreate(Bundle savedInstanceState) {
super.onPostCreate(savedInstanceState);
getDelegate().onPostCreate(savedInstanceState);
}
public ActionBar getSupportActionBar() {
return getDelegate().getSupportActionBar();
}
public void setSupportActionBar(@Nullable Toolbar toolbar) {
getDelegate().setSupportActionBar(toolbar);
}
@Override
public MenuInflater getMenuInflater() {
return getDelegate().getMenuInflater();
}
@Override
public void setContentView(@LayoutRes int layoutResID) {
getDelegate().setContentView(layoutResID);
}
@Override
public void setContentView(View view) {
getDelegate().setContentView(view);
}
@Override
public void setContentView(View view, ViewGroup.LayoutParams params) {
getDelegate().setContentView(view, params);
}
@Override
public void addContentView(View view, ViewGroup.LayoutParams params) {
getDelegate().addContentView(view, params);
}
@Override
protected void onPostResume() {
super.onPostResume();
getDelegate().onPostResume();
}
@Override
protected void onTitleChanged(CharSequence title, int color) {
super.onTitleChanged(title, color);
getDelegate().setTitle(title);
}
@Override
public void onConfigurationChanged(Configuration newConfig) {
super.onConfigurationChanged(newConfig);
getDelegate().onConfigurationChanged(newConfig);
}
@Override
protected void onStop() {
super.onStop();
getDelegate().onStop();
}
@Override
protected void onDestroy() {
super.onDestroy();
getDelegate().onDestroy();
}
public void invalidateOptionsMenu() {
getDelegate().invalidateOptionsMenu();
}
private AppCompatDelegate getDelegate() {
if (mDelegate == null) {
mDelegate = AppCompatDelegate.create(this, null);
}
return mDelegate;
}
}
package com.baidu.paddle.lite.demo;
import android.Manifest;
import android.app.ProgressDialog;
import android.content.ContentResolver;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import android.os.Bundle;
import android.os.Environment;
import android.os.Handler;
import android.os.HandlerThread;
import android.os.Message;
import android.provider.MediaStore;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.support.v4.content.FileProvider;
import android.support.v7.app.ActionBar;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.Menu;
import android.view.MenuInflater;
import android.view.MenuItem;
import android.widget.Toast;
import java.io.File;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
public class CommonActivity extends AppCompatActivity {
private static final String TAG = CommonActivity.class.getSimpleName();
public static final int OPEN_GALLERY_REQUEST_CODE = 0;
public static final int TAKE_PHOTO_REQUEST_CODE = 1;
public static final int REQUEST_LOAD_MODEL = 0;
public static final int REQUEST_RUN_MODEL = 1;
public static final int RESPONSE_LOAD_MODEL_SUCCESSED = 0;
public static final int RESPONSE_LOAD_MODEL_FAILED = 1;
public static final int RESPONSE_RUN_MODEL_SUCCESSED = 2;
public static final int RESPONSE_RUN_MODEL_FAILED = 3;
protected ProgressDialog pbLoadModel = null;
protected ProgressDialog pbRunModel = null;
protected Handler receiver = null; // receive messages from worker thread
protected Handler sender = null; // send command to worker thread
protected HandlerThread worker = null; // worker thread to load&run model
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
ActionBar supportActionBar = getSupportActionBar();
if (supportActionBar != null) {
supportActionBar.setDisplayHomeAsUpEnabled(true);
}
receiver = new Handler() {
@Override
public void handleMessage(Message msg) {
switch (msg.what) {
case RESPONSE_LOAD_MODEL_SUCCESSED:
pbLoadModel.dismiss();
onLoadModelSuccessed();
break;
case RESPONSE_LOAD_MODEL_FAILED:
pbLoadModel.dismiss();
Toast.makeText(CommonActivity.this, "Load model failed!", Toast.LENGTH_SHORT).show();
onLoadModelFailed();
break;
case RESPONSE_RUN_MODEL_SUCCESSED:
pbRunModel.dismiss();
onRunModelSuccessed();
break;
case RESPONSE_RUN_MODEL_FAILED:
pbRunModel.dismiss();
Toast.makeText(CommonActivity.this, "Run model failed!", Toast.LENGTH_SHORT).show();
onRunModelFailed();
break;
default:
break;
}
}
};
worker = new HandlerThread("Predictor Worker");
worker.start();
sender = new Handler(worker.getLooper()) {
public void handleMessage(Message msg) {
switch (msg.what) {
case REQUEST_LOAD_MODEL:
// load model and reload test image
if (onLoadModel()) {
receiver.sendEmptyMessage(RESPONSE_LOAD_MODEL_SUCCESSED);
} else {
receiver.sendEmptyMessage(RESPONSE_LOAD_MODEL_FAILED);
}
break;
case REQUEST_RUN_MODEL:
// run model if model is loaded
if (onRunModel()) {
receiver.sendEmptyMessage(RESPONSE_RUN_MODEL_SUCCESSED);
} else {
receiver.sendEmptyMessage(RESPONSE_RUN_MODEL_FAILED);
}
break;
default:
break;
}
}
};
}
public void loadModel() {
pbLoadModel = ProgressDialog.show(this, "", "Loading model...", false, false);
sender.sendEmptyMessage(REQUEST_LOAD_MODEL);
}
public void runModel() {
pbRunModel = ProgressDialog.show(this, "", "Running model...", false, false);
sender.sendEmptyMessage(REQUEST_RUN_MODEL);
}
public boolean onLoadModel() {
return true;
}
public boolean onRunModel() {
return true;
}
public void onLoadModelSuccessed() {
}
public void onLoadModelFailed() {
}
public void onRunModelSuccessed() {
}
public void onRunModelFailed() {
}
public void onImageChanged(Bitmap image) {
}
public void onImageChanged(String path) {
}
public void onSettingsClicked() {
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
MenuInflater inflater = getMenuInflater();
inflater.inflate(R.menu.menu_action_options, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
switch (item.getItemId()) {
case android.R.id.home:
finish();
break;
case R.id.open_gallery:
if (requestAllPermissions()) {
openGallery();
}
break;
case R.id.take_photo:
if (requestAllPermissions()) {
takePhoto();
}
break;
case R.id.settings:
if (requestAllPermissions()) {
// make sure we have SDCard r&w permissions to load model from SDCard
onSettingsClicked();
}
break;
}
return super.onOptionsItemSelected(item);
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions,
@NonNull int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (grantResults.length < 2 || grantResults[0] != PackageManager.PERMISSION_GRANTED || grantResults[1] != PackageManager.PERMISSION_GRANTED) {
Toast.makeText(this, "Permission Denied", Toast.LENGTH_SHORT).show();
}
}
private boolean requestAllPermissions() {
if (ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE)
!= PackageManager.PERMISSION_GRANTED || ContextCompat.checkSelfPermission(this,
Manifest.permission.CAMERA)
!= PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.CAMERA},
0);
return false;
}
return true;
}
private void openGallery() {
Intent intent = new Intent(Intent.ACTION_PICK, null);
intent.setDataAndType(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, "image/*");
startActivityForResult(intent, OPEN_GALLERY_REQUEST_CODE);
}
private void takePhoto() {
Intent takePhotoIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
if (takePhotoIntent.resolveActivity(getPackageManager()) != null) {
startActivityForResult(takePhotoIntent, TAKE_PHOTO_REQUEST_CODE);
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (resultCode == RESULT_OK && data != null) {
switch (requestCode) {
case OPEN_GALLERY_REQUEST_CODE:
try {
ContentResolver resolver = getContentResolver();
Uri uri = data.getData();
Bitmap image = MediaStore.Images.Media.getBitmap(resolver, uri);
onImageChanged(image);
} catch (IOException e) {
Log.e(TAG, e.toString());
}
break;
case TAKE_PHOTO_REQUEST_CODE:
Bitmap image = (Bitmap) data.getParcelableExtra("data");
onImageChanged(image);
break;
default:
break;
}
}
}
@Override
protected void onResume() {
super.onResume();
}
@Override
protected void onDestroy() {
worker.quit();
super.onDestroy();
}
}
package com.baidu.paddle.lite.demo;
import android.content.Intent;
import android.content.SharedPreferences;
import android.os.Bundle;
import android.preference.PreferenceManager;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.View;
import com.baidu.paddle.lite.demo.segmentation.ImgSegActivity;
public class MainActivity extends AppCompatActivity implements View.OnClickListener {
private static final String TAG = MainActivity.class.getSimpleName();
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
// clear all setting items to avoid the app crashing due to incorrect settings
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
SharedPreferences.Editor editor = sharedPreferences.edit();
editor.clear();
editor.commit();
}
@Override
public void onClick(View v) {
switch (v.getId()) {
case R.id.v_img_seg: {
Intent intent = new Intent(MainActivity.this, ImgSegActivity.class);
startActivity(intent);
} break;
}
}
@Override
protected void onDestroy() {
super.onDestroy();
System.exit(0);
}
}
package com.baidu.paddle.lite.demo;
import android.content.Context;
import android.util.Log;
import com.baidu.paddle.lite.*;
import java.util.ArrayList;
import java.util.Date;
public class Predictor {
private static final String TAG = Predictor.class.getSimpleName();
public boolean isLoaded = false;
public int warmupIterNum = 0;
public int inferIterNum = 1;
protected Context appCtx = null;
public int cpuThreadNum = 1;
public String cpuPowerMode = "LITE_POWER_HIGH";
public String modelPath = "";
public String modelName = "";
protected PaddlePredictor paddlePredictor = null;
protected float inferenceTime = 0;
public Predictor() {
}
public boolean init(Context appCtx, String modelPath, int cpuThreadNum, String cpuPowerMode) {
this.appCtx = appCtx;
isLoaded = loadModel(modelPath, cpuThreadNum, cpuPowerMode);
return isLoaded;
}
protected boolean loadModel(String modelPath, int cpuThreadNum, String cpuPowerMode) {
// release model if exists
releaseModel();
// load model
if (modelPath.isEmpty()) {
return false;
}
String realPath = modelPath;
if (!modelPath.substring(0, 1).equals("/")) {
// if the model path does not start with '/', copy the model from assets to the cache
// directory; otherwise read the model files directly from the given absolute path
realPath = appCtx.getCacheDir() + "/" + modelPath;
Utils.copyDirectoryFromAssets(appCtx, modelPath, realPath);
}
if (realPath.isEmpty()) {
return false;
}
MobileConfig config = new MobileConfig();
config.setModelDir(realPath);
config.setThreads(cpuThreadNum);
if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_HIGH")) {
config.setPowerMode(PowerMode.LITE_POWER_HIGH);
} else if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_LOW")) {
config.setPowerMode(PowerMode.LITE_POWER_LOW);
} else if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_FULL")) {
config.setPowerMode(PowerMode.LITE_POWER_FULL);
} else if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_NO_BIND")) {
config.setPowerMode(PowerMode.LITE_POWER_NO_BIND);
} else if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_RAND_HIGH")) {
config.setPowerMode(PowerMode.LITE_POWER_RAND_HIGH);
} else if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_RAND_LOW")) {
config.setPowerMode(PowerMode.LITE_POWER_RAND_LOW);
} else {
Log.e(TAG, "unknown cpu power mode!");
return false;
}
paddlePredictor = PaddlePredictor.createPaddlePredictor(config);
this.cpuThreadNum = cpuThreadNum;
this.cpuPowerMode = cpuPowerMode;
this.modelPath = realPath;
this.modelName = realPath.substring(realPath.lastIndexOf("/") + 1);
return true;
}
public void releaseModel() {
paddlePredictor = null;
isLoaded = false;
cpuThreadNum = 1;
cpuPowerMode = "LITE_POWER_HIGH";
modelPath = "";
modelName = "";
}
public Tensor getInput(int idx) {
if (!isLoaded()) {
return null;
}
return paddlePredictor.getInput(idx);
}
public Tensor getOutput(int idx) {
if (!isLoaded()) {
return null;
}
return paddlePredictor.getOutput(idx);
}
public boolean runModel() {
if (!isLoaded()) {
return false;
}
// warm up
for (int i = 0; i < warmupIterNum; i++){
paddlePredictor.run();
}
// inference
Date start = new Date();
for (int i = 0; i < inferIterNum; i++) {
paddlePredictor.run();
}
Date end = new Date();
inferenceTime = (end.getTime() - start.getTime()) / (float) inferIterNum;
return true;
}
public boolean isLoaded() {
return paddlePredictor != null && isLoaded;
}
public String modelPath() {
return modelPath;
}
public String modelName() {
return modelName;
}
public int cpuThreadNum() {
return cpuThreadNum;
}
public String cpuPowerMode() {
return cpuPowerMode;
}
public float inferenceTime() {
return inferenceTime;
}
}
package com.baidu.paddle.lite.demo;
import android.content.Context;
import android.os.Environment;
import java.io.*;
public class Utils {
private static final String TAG = Utils.class.getSimpleName();
public static void copyFileFromAssets(Context appCtx, String srcPath, String dstPath) {
if (srcPath.isEmpty() || dstPath.isEmpty()) {
return;
}
InputStream is = null;
OutputStream os = null;
try {
is = new BufferedInputStream(appCtx.getAssets().open(srcPath));
os = new BufferedOutputStream(new FileOutputStream(new File(dstPath)));
byte[] buffer = new byte[1024];
int length = 0;
while ((length = is.read(buffer)) != -1) {
os.write(buffer, 0, length);
}
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
try {
// guard against NPE when open() or the FileOutputStream failed before assignment
if (os != null) os.close();
if (is != null) is.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
public static void copyDirectoryFromAssets(Context appCtx, String srcDir, String dstDir) {
if (srcDir.isEmpty() || dstDir.isEmpty()) {
return;
}
try {
if (!new File(dstDir).exists()) {
new File(dstDir).mkdirs();
}
for (String fileName : appCtx.getAssets().list(srcDir)) {
String srcSubPath = srcDir + File.separator + fileName;
String dstSubPath = dstDir + File.separator + fileName;
if (new File(srcSubPath).isDirectory()) {
copyDirectoryFromAssets(appCtx, srcSubPath, dstSubPath);
} else {
copyFileFromAssets(appCtx, srcSubPath, dstSubPath);
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
public static float[] parseFloatsFromString(String string, String delimiter) {
String[] pieces = string.trim().toLowerCase().split(delimiter);
float[] floats = new float[pieces.length];
for (int i = 0; i < pieces.length; i++) {
floats[i] = Float.parseFloat(pieces[i].trim());
}
return floats;
}
public static long[] parseLongsFromString(String string, String delimiter) {
String[] pieces = string.trim().toLowerCase().split(delimiter);
long[] longs = new long[pieces.length];
for (int i = 0; i < pieces.length; i++) {
longs[i] = Long.parseLong(pieces[i].trim());
}
return longs;
}
public static String getSDCardDirectory() {
return Environment.getExternalStorageDirectory().getAbsolutePath();
}
public static boolean isSupportedNPU() {
String hardware = android.os.Build.HARDWARE;
return hardware.equalsIgnoreCase("kirin810") || hardware.equalsIgnoreCase("kirin990");
}
}
package com.baidu.paddle.lite.demo.segmentation;
import android.content.Intent;
import android.content.SharedPreferences;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.os.Bundle;
import android.preference.PreferenceManager;
import android.text.method.ScrollingMovementMethod;
import android.util.Log;
import android.view.Menu;
import android.widget.ImageView;
import android.widget.TextView;
import android.widget.Toast;
import com.baidu.paddle.lite.demo.CommonActivity;
import com.baidu.paddle.lite.demo.R;
import com.baidu.paddle.lite.demo.Utils;
import com.baidu.paddle.lite.demo.segmentation.config.Config;
import com.baidu.paddle.lite.demo.segmentation.preprocess.Preprocess;
import com.baidu.paddle.lite.demo.segmentation.visual.Visualize;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
public class ImgSegActivity extends CommonActivity {
private static final String TAG = ImgSegActivity.class.getSimpleName();
protected TextView tvInputSetting;
protected ImageView ivInputImage;
protected TextView tvOutputResult;
protected TextView tvInferenceTime;
// model config
Config config = new Config();
protected ImgSegPredictor predictor = new ImgSegPredictor();
Preprocess preprocess = new Preprocess();
Visualize visualize = new Visualize();
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_img_seg);
tvInputSetting = findViewById(R.id.tv_input_setting);
ivInputImage = findViewById(R.id.iv_input_image);
tvInferenceTime = findViewById(R.id.tv_inference_time);
tvOutputResult = findViewById(R.id.tv_output_result);
tvInputSetting.setMovementMethod(ScrollingMovementMethod.getInstance());
tvOutputResult.setMovementMethod(ScrollingMovementMethod.getInstance());
}
@Override
public boolean onLoadModel() {
return super.onLoadModel() && predictor.init(ImgSegActivity.this, config);
}
@Override
public boolean onRunModel() {
return super.onRunModel() && predictor.isLoaded() && predictor.runModel(preprocess,visualize);
}
@Override
public void onLoadModelSuccessed() {
super.onLoadModelSuccessed();
// load test image from file_paths and run model
try {
if (config.imagePath.isEmpty()) {
return;
}
Bitmap image = null;
// if the image path does not start with '/', read the test image from assets;
// otherwise read it from the given absolute file path
if (!config.imagePath.substring(0, 1).equals("/")) {
InputStream imageStream = getAssets().open(config.imagePath);
image = BitmapFactory.decodeStream(imageStream);
} else {
if (!new File(config.imagePath).exists()) {
return;
}
image = BitmapFactory.decodeFile(config.imagePath);
}
if (image != null && predictor.isLoaded()) {
predictor.setInputImage(image);
runModel();
}
} catch (IOException e) {
Toast.makeText(ImgSegActivity.this, "Load image failed!", Toast.LENGTH_SHORT).show();
e.printStackTrace();
}
}
@Override
public void onLoadModelFailed() {
super.onLoadModelFailed();
}
@Override
public void onRunModelSuccessed() {
super.onRunModelSuccessed();
// obtain results and update UI
tvInferenceTime.setText("Inference time: " + predictor.inferenceTime() + " ms");
Bitmap outputImage = predictor.outputImage();
if (outputImage != null) {
ivInputImage.setImageBitmap(outputImage);
}
tvOutputResult.setText(predictor.outputResult());
tvOutputResult.scrollTo(0, 0);
}
@Override
public void onRunModelFailed() {
super.onRunModelFailed();
}
@Override
public void onImageChanged(Bitmap image) {
super.onImageChanged(image);
// rerun model if users pick test image from gallery or camera
if (image != null && predictor.isLoaded()) {
// predictor.setConfig(config);
predictor.setInputImage(image);
runModel();
}
}
@Override
public void onImageChanged(String path) {
super.onImageChanged(path);
Bitmap image = BitmapFactory.decodeFile(path);
// guard against decode failures and an unloaded predictor, as in onImageChanged(Bitmap)
if (image != null && predictor.isLoaded()) {
predictor.setInputImage(image);
runModel();
}
}
@Override
public void onSettingsClicked() {
super.onSettingsClicked();
startActivity(new Intent(ImgSegActivity.this, ImgSegSettingsActivity.class));
}
@Override
public boolean onPrepareOptionsMenu(Menu menu) {
boolean isLoaded = predictor.isLoaded();
menu.findItem(R.id.open_gallery).setEnabled(isLoaded);
menu.findItem(R.id.take_photo).setEnabled(isLoaded);
return super.onPrepareOptionsMenu(menu);
}
@Override
protected void onResume() {
Log.i(TAG,"begin onResume");
super.onResume();
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
boolean settingsChanged = false;
String model_path = sharedPreferences.getString(getString(R.string.ISG_MODEL_PATH_KEY),
getString(R.string.ISG_MODEL_PATH_DEFAULT));
String label_path = sharedPreferences.getString(getString(R.string.ISG_LABEL_PATH_KEY),
getString(R.string.ISG_LABEL_PATH_DEFAULT));
String image_path = sharedPreferences.getString(getString(R.string.ISG_IMAGE_PATH_KEY),
getString(R.string.ISG_IMAGE_PATH_DEFAULT));
settingsChanged |= !model_path.equalsIgnoreCase(config.modelPath);
settingsChanged |= !label_path.equalsIgnoreCase(config.labelPath);
settingsChanged |= !image_path.equalsIgnoreCase(config.imagePath);
int cpu_thread_num = Integer.parseInt(sharedPreferences.getString(getString(R.string.ISG_CPU_THREAD_NUM_KEY),
getString(R.string.ISG_CPU_THREAD_NUM_DEFAULT)));
settingsChanged |= cpu_thread_num != config.cpuThreadNum;
String cpu_power_mode =
sharedPreferences.getString(getString(R.string.ISG_CPU_POWER_MODE_KEY),
getString(R.string.ISG_CPU_POWER_MODE_DEFAULT));
settingsChanged |= !cpu_power_mode.equalsIgnoreCase(config.cpuPowerMode);
String input_color_format =
sharedPreferences.getString(getString(R.string.ISG_INPUT_COLOR_FORMAT_KEY),
getString(R.string.ISG_INPUT_COLOR_FORMAT_DEFAULT));
settingsChanged |= !input_color_format.equalsIgnoreCase(config.inputColorFormat);
long[] input_shape =
Utils.parseLongsFromString(sharedPreferences.getString(getString(R.string.ISG_INPUT_SHAPE_KEY),
getString(R.string.ISG_INPUT_SHAPE_DEFAULT)), ",");
settingsChanged |= input_shape.length != config.inputShape.length;
if (!settingsChanged) {
for (int i = 0; i < input_shape.length; i++) {
settingsChanged |= input_shape[i] != config.inputShape[i];
}
}
if (settingsChanged) {
config.init(model_path,label_path,image_path,cpu_thread_num,cpu_power_mode,
input_color_format,input_shape);
preprocess.init(config);
// update UI
tvInputSetting.setText("Model: " + config.modelPath.substring(config.modelPath.lastIndexOf("/") + 1) + "\n" + "CPU" +
" Thread Num: " + Integer.toString(config.cpuThreadNum) + "\n" + "CPU Power Mode: " + config.cpuPowerMode);
tvInputSetting.scrollTo(0, 0);
// reload model if configure has been changed
loadModel();
}
}
@Override
protected void onDestroy() {
if (predictor != null) {
predictor.releaseModel();
}
super.onDestroy();
}
}
package com.baidu.paddle.lite.demo.segmentation;
import android.content.Context;
import android.graphics.Bitmap;
import android.util.Log;
import com.baidu.paddle.lite.Tensor;
import com.baidu.paddle.lite.demo.Predictor;
import com.baidu.paddle.lite.demo.segmentation.config.Config;
import com.baidu.paddle.lite.demo.segmentation.preprocess.Preprocess;
import com.baidu.paddle.lite.demo.segmentation.visual.Visualize;
import java.io.InputStream;
import java.util.Date;
import java.util.Vector;
public class ImgSegPredictor extends Predictor {
private static final String TAG = ImgSegPredictor.class.getSimpleName();
protected Vector<String> wordLabels = new Vector<String>();
Config config;
protected Bitmap inputImage = null;
protected Bitmap scaledImage = null;
protected Bitmap outputImage = null;
protected String outputResult = "";
protected float preprocessTime = 0;
protected float postprocessTime = 0;
public ImgSegPredictor() {
super();
}
public boolean init(Context appCtx, Config config) {
if (config.inputShape.length != 4) {
Log.i(TAG, "size of input shape should be: 4");
return false;
}
if (config.inputShape[0] != 1) {
Log.i(TAG, "only batch size 1 is supported in this image segmentation demo; you can use any batch size " +
"in your own apps!");
return false;
}
if (config.inputShape[1] != 1 && config.inputShape[1] != 3) {
Log.i(TAG, "only one/three channels are supported in this image segmentation demo; you can use any " +
"channel size in your own apps!");
return false;
}
if (!config.inputColorFormat.equalsIgnoreCase("RGB") && !config.inputColorFormat.equalsIgnoreCase("BGR")) {
Log.i(TAG, "only RGB and BGR color formats are supported.");
return false;
}
super.init(appCtx, config.modelPath, config.cpuThreadNum, config.cpuPowerMode);
if (!super.isLoaded()) {
return false;
}
this.config = config;
return isLoaded;
}
protected boolean loadLabel(String labelPath) {
wordLabels.clear();
// load word labels from file
try {
InputStream assetsInputStream = appCtx.getAssets().open(labelPath);
int available = assetsInputStream.available();
byte[] lines = new byte[available];
assetsInputStream.read(lines);
assetsInputStream.close();
String words = new String(lines);
String[] contents = words.split("\n");
for (String content : contents) {
wordLabels.add(content);
}
Log.i(TAG, "word label size: " + wordLabels.size());
} catch (Exception e) {
Log.e(TAG, e.getMessage());
return false;
}
return true;
}
public Tensor getInput(int idx) {
return super.getInput(idx);
}
public Tensor getOutput(int idx) {
return super.getOutput(idx);
}
public boolean runModel(Bitmap image) {
setInputImage(image);
return runModel();
}
public boolean runModel(Preprocess preprocess, Visualize visualize) {
if (inputImage == null) {
return false;
}
// set input shape
Tensor inputTensor = getInput(0);
inputTensor.resize(config.inputShape);
// pre-process image
Date start = new Date();
preprocess.init(config);
if (!preprocess.to_array(scaledImage)) {
return false;
}
// feed input tensor with pre-processed data
inputTensor.setData(preprocess.inputData);
Date end = new Date();
preprocessTime = (float) (end.getTime() - start.getTime());
// inference
super.runModel();
Tensor outputTensor = getOutput(0);
// post-process: colorize the predicted mask and overlay it on the input image
start = new Date();
this.outputImage = visualize.draw(inputImage, outputTensor);
end = new Date();
postprocessTime = (float) (end.getTime() - start.getTime());
outputResult = "";
return true;
}
public void setConfig(Config config){
this.config = config;
}
public Bitmap inputImage() {
return inputImage;
}
public Bitmap outputImage() {
return outputImage;
}
public String outputResult() {
return outputResult;
}
public float preprocessTime() {
return preprocessTime;
}
public float postprocessTime() {
return postprocessTime;
}
public void setInputImage(Bitmap image) {
if (image == null) {
return;
}
// scale image to the size of input tensor
Bitmap rgbaImage = image.copy(Bitmap.Config.ARGB_8888, true);
Bitmap scaleImage = Bitmap.createScaledBitmap(rgbaImage, (int) this.config.inputShape[3], (int) this.config.inputShape[2], true);
this.inputImage = rgbaImage;
this.scaledImage = scaleImage;
}
}
package com.baidu.paddle.lite.demo.segmentation;
import android.content.SharedPreferences;
import android.os.Bundle;
import android.preference.CheckBoxPreference;
import android.preference.EditTextPreference;
import android.preference.ListPreference;
import android.support.v7.app.ActionBar;
import com.baidu.paddle.lite.demo.AppCompatPreferenceActivity;
import com.baidu.paddle.lite.demo.R;
import com.baidu.paddle.lite.demo.Utils;
import java.util.ArrayList;
import java.util.List;
public class ImgSegSettingsActivity extends AppCompatPreferenceActivity implements SharedPreferences.OnSharedPreferenceChangeListener {
ListPreference lpChoosePreInstalledModel = null;
CheckBoxPreference cbEnableCustomSettings = null;
EditTextPreference etModelPath = null;
EditTextPreference etLabelPath = null;
EditTextPreference etImagePath = null;
ListPreference lpCPUThreadNum = null;
ListPreference lpCPUPowerMode = null;
ListPreference lpInputColorFormat = null;
EditTextPreference etInputShape = null;
EditTextPreference etInputMean = null;
EditTextPreference etInputStd = null;
List<String> preInstalledModelPaths = null;
List<String> preInstalledLabelPaths = null;
List<String> preInstalledImagePaths = null;
List<String> preInstalledInputShapes = null;
List<String> preInstalledCPUThreadNums = null;
List<String> preInstalledCPUPowerModes = null;
List<String> preInstalledInputColorFormats = null;
List<String> preInstalledInputMeans = null;
List<String> preInstalledInputStds = null;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
addPreferencesFromResource(R.xml.settings_img_seg);
ActionBar supportActionBar = getSupportActionBar();
if (supportActionBar != null) {
supportActionBar.setDisplayHomeAsUpEnabled(true);
}
// initialize pre-installed models
preInstalledModelPaths = new ArrayList<String>();
preInstalledLabelPaths = new ArrayList<String>();
preInstalledImagePaths = new ArrayList<String>();
preInstalledInputShapes = new ArrayList<String>();
preInstalledCPUThreadNums = new ArrayList<String>();
preInstalledCPUPowerModes = new ArrayList<String>();
preInstalledInputColorFormats = new ArrayList<String>();
preInstalledInputMeans = new ArrayList<String>();
preInstalledInputStds = new ArrayList<String>();
// add deeplab_mobilenet_for_cpu
preInstalledModelPaths.add(getString(R.string.ISG_MODEL_PATH_DEFAULT));
preInstalledLabelPaths.add(getString(R.string.ISG_LABEL_PATH_DEFAULT));
preInstalledImagePaths.add(getString(R.string.ISG_IMAGE_PATH_DEFAULT));
preInstalledCPUThreadNums.add(getString(R.string.ISG_CPU_THREAD_NUM_DEFAULT));
preInstalledCPUPowerModes.add(getString(R.string.ISG_CPU_POWER_MODE_DEFAULT));
preInstalledInputColorFormats.add(getString(R.string.ISG_INPUT_COLOR_FORMAT_DEFAULT));
preInstalledInputShapes.add(getString(R.string.ISG_INPUT_SHAPE_DEFAULT));
// initialize UI components
lpChoosePreInstalledModel =
(ListPreference) findPreference(getString(R.string.ISG_CHOOSE_PRE_INSTALLED_MODEL_KEY));
String[] preInstalledModelNames = new String[preInstalledModelPaths.size()];
for (int i = 0; i < preInstalledModelPaths.size(); i++) {
preInstalledModelNames[i] =
preInstalledModelPaths.get(i).substring(preInstalledModelPaths.get(i).lastIndexOf("/") + 1);
}
lpChoosePreInstalledModel.setEntries(preInstalledModelNames);
lpChoosePreInstalledModel.setEntryValues(preInstalledModelPaths.toArray(new String[preInstalledModelPaths.size()]));
cbEnableCustomSettings =
(CheckBoxPreference) findPreference(getString(R.string.ISG_ENABLE_CUSTOM_SETTINGS_KEY));
etModelPath = (EditTextPreference) findPreference(getString(R.string.ISG_MODEL_PATH_KEY));
etModelPath.setTitle("Model Path (SDCard: " + Utils.getSDCardDirectory() + ")");
etLabelPath = (EditTextPreference) findPreference(getString(R.string.ISG_LABEL_PATH_KEY));
etImagePath = (EditTextPreference) findPreference(getString(R.string.ISG_IMAGE_PATH_KEY));
lpCPUThreadNum =
(ListPreference) findPreference(getString(R.string.ISG_CPU_THREAD_NUM_KEY));
lpCPUPowerMode =
(ListPreference) findPreference(getString(R.string.ISG_CPU_POWER_MODE_KEY));
lpInputColorFormat =
(ListPreference) findPreference(getString(R.string.ISG_INPUT_COLOR_FORMAT_KEY));
etInputShape = (EditTextPreference) findPreference(getString(R.string.ISG_INPUT_SHAPE_KEY));
}
private void reloadPreferenceAndUpdateUI() {
SharedPreferences sharedPreferences = getPreferenceScreen().getSharedPreferences();
boolean enableCustomSettings =
sharedPreferences.getBoolean(getString(R.string.ISG_ENABLE_CUSTOM_SETTINGS_KEY), false);
String modelPath = sharedPreferences.getString(getString(R.string.ISG_CHOOSE_PRE_INSTALLED_MODEL_KEY),
getString(R.string.ISG_MODEL_PATH_DEFAULT));
int modelIdx = lpChoosePreInstalledModel.findIndexOfValue(modelPath);
if (modelIdx >= 0 && modelIdx < preInstalledModelPaths.size()) {
if (!enableCustomSettings) {
SharedPreferences.Editor editor = sharedPreferences.edit();
editor.putString(getString(R.string.ISG_MODEL_PATH_KEY), preInstalledModelPaths.get(modelIdx));
editor.putString(getString(R.string.ISG_LABEL_PATH_KEY), preInstalledLabelPaths.get(modelIdx));
editor.putString(getString(R.string.ISG_IMAGE_PATH_KEY), preInstalledImagePaths.get(modelIdx));
editor.putString(getString(R.string.ISG_CPU_THREAD_NUM_KEY), preInstalledCPUThreadNums.get(modelIdx));
editor.putString(getString(R.string.ISG_CPU_POWER_MODE_KEY), preInstalledCPUPowerModes.get(modelIdx));
editor.putString(getString(R.string.ISG_INPUT_COLOR_FORMAT_KEY),
preInstalledInputColorFormats.get(modelIdx));
editor.putString(getString(R.string.ISG_INPUT_SHAPE_KEY), preInstalledInputShapes.get(modelIdx));
editor.commit();
}
lpChoosePreInstalledModel.setSummary(modelPath);
}
cbEnableCustomSettings.setChecked(enableCustomSettings);
etModelPath.setEnabled(enableCustomSettings);
etLabelPath.setEnabled(enableCustomSettings);
etImagePath.setEnabled(enableCustomSettings);
lpCPUThreadNum.setEnabled(enableCustomSettings);
lpCPUPowerMode.setEnabled(enableCustomSettings);
lpInputColorFormat.setEnabled(enableCustomSettings);
etInputShape.setEnabled(enableCustomSettings);
modelPath = sharedPreferences.getString(getString(R.string.ISG_MODEL_PATH_KEY),
getString(R.string.ISG_MODEL_PATH_DEFAULT));
String labelPath = sharedPreferences.getString(getString(R.string.ISG_LABEL_PATH_KEY),
getString(R.string.ISG_LABEL_PATH_DEFAULT));
String imagePath = sharedPreferences.getString(getString(R.string.ISG_IMAGE_PATH_KEY),
getString(R.string.ISG_IMAGE_PATH_DEFAULT));
String cpuThreadNum = sharedPreferences.getString(getString(R.string.ISG_CPU_THREAD_NUM_KEY),
getString(R.string.ISG_CPU_THREAD_NUM_DEFAULT));
String cpuPowerMode = sharedPreferences.getString(getString(R.string.ISG_CPU_POWER_MODE_KEY),
getString(R.string.ISG_CPU_POWER_MODE_DEFAULT));
String inputColorFormat = sharedPreferences.getString(getString(R.string.ISG_INPUT_COLOR_FORMAT_KEY),
getString(R.string.ISG_INPUT_COLOR_FORMAT_DEFAULT));
String inputShape = sharedPreferences.getString(getString(R.string.ISG_INPUT_SHAPE_KEY),
getString(R.string.ISG_INPUT_SHAPE_DEFAULT));
etModelPath.setSummary(modelPath);
etModelPath.setText(modelPath);
etLabelPath.setSummary(labelPath);
etLabelPath.setText(labelPath);
etImagePath.setSummary(imagePath);
etImagePath.setText(imagePath);
lpCPUThreadNum.setValue(cpuThreadNum);
lpCPUThreadNum.setSummary(cpuThreadNum);
lpCPUPowerMode.setValue(cpuPowerMode);
lpCPUPowerMode.setSummary(cpuPowerMode);
lpInputColorFormat.setValue(inputColorFormat);
lpInputColorFormat.setSummary(inputColorFormat);
etInputShape.setSummary(inputShape);
etInputShape.setText(inputShape);
}
@Override
protected void onResume() {
super.onResume();
getPreferenceScreen().getSharedPreferences().registerOnSharedPreferenceChangeListener(this);
reloadPreferenceAndUpdateUI();
}
@Override
protected void onPause() {
super.onPause();
getPreferenceScreen().getSharedPreferences().unregisterOnSharedPreferenceChangeListener(this);
}
@Override
public void onSharedPreferenceChanged(SharedPreferences sharedPreferences, String key) {
if (key.equals(getString(R.string.ISG_CHOOSE_PRE_INSTALLED_MODEL_KEY))) {
SharedPreferences.Editor editor = sharedPreferences.edit();
editor.putBoolean(getString(R.string.ISG_ENABLE_CUSTOM_SETTINGS_KEY), false);
editor.commit();
}
reloadPreferenceAndUpdateUI();
}
}
package com.baidu.paddle.lite.demo.segmentation.config;
import android.graphics.Bitmap;
public class Config {
public String modelPath = "";
public String labelPath = "";
public String imagePath = "";
public int cpuThreadNum = 1;
public String cpuPowerMode = "";
public String inputColorFormat = "";
public long[] inputShape = new long[]{};
public void init(String modelPath, String labelPath, String imagePath, int cpuThreadNum,
String cpuPowerMode, String inputColorFormat,long[] inputShape){
this.modelPath = modelPath;
this.labelPath = labelPath;
this.imagePath = imagePath;
this.cpuThreadNum = cpuThreadNum;
this.cpuPowerMode = cpuPowerMode;
this.inputColorFormat = inputColorFormat;
this.inputShape = inputShape;
}
public void setInputShape(Bitmap inputImage){
// allocate a fresh NCHW shape so this also works when inputShape is still empty
this.inputShape = new long[]{1, 3, inputImage.getHeight(), inputImage.getWidth()};
}
}
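Config.init receives the input shape already parsed from the preference string (the activity calls Utils.parseLongsFromString on a value such as "1,3,513,513"). A minimal plain-Java sketch of that parsing step, under the assumption of simple comma-separated longs; ShapeParseDemo and parseShape are hypothetical stand-ins, not the demo's actual helper:

```java
public class ShapeParseDemo {
    // Hypothetical stand-in for Utils.parseLongsFromString: split a
    // preference string such as "1,3,513,513" into a long[] NCHW shape.
    static long[] parseShape(String s, String delim) {
        String[] parts = s.trim().split(delim);
        long[] shape = new long[parts.length];
        for (int i = 0; i < parts.length; i++) {
            shape[i] = Long.parseLong(parts[i].trim());
        }
        return shape;
    }

    public static void main(String[] args) {
        long[] shape = parseShape("1,3,513,513", ",");
        // print rank, then height (index 2) of the NCHW shape
        System.out.println(shape.length + ":" + shape[2]);
    }
}
```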
package com.baidu.paddle.lite.demo.segmentation.preprocess;
import android.graphics.Bitmap;
import android.util.Log;
import com.baidu.paddle.lite.demo.segmentation.config.Config;
import static android.graphics.Color.blue;
import static android.graphics.Color.green;
import static android.graphics.Color.red;
public class Preprocess {
private static final String TAG = Preprocess.class.getSimpleName();
Config config;
int channels;
int width;
int height;
public float[] inputData;
public void init(Config config){
this.config = config;
this.channels = (int) config.inputShape[1];
this.height = (int) config.inputShape[2];
this.width = (int) config.inputShape[3];
this.inputData = new float[channels * width * height];
}
public boolean to_array(Bitmap inputImage){
if (channels == 3) {
int[] channelIdx = null;
if (config.inputColorFormat.equalsIgnoreCase("RGB")) {
channelIdx = new int[]{0, 1, 2};
} else if (config.inputColorFormat.equalsIgnoreCase("BGR")) {
channelIdx = new int[]{2, 1, 0};
} else {
Log.i(TAG, "unknown color format " + config.inputColorFormat + ", only RGB and BGR color formats are " +
"supported!");
return false;
}
int[] channelStride = new int[]{width * height, width * height * 2};
for (int y = 0; y < height; y++) {
for (int x = 0; x < width; x++) {
int color = inputImage.getPixel(x, y);
float[] rgb = new float[]{(float) red(color) , (float) green(color) ,
(float) blue(color)};
inputData[y * width + x] = rgb[channelIdx[0]] ;
inputData[y * width + x + channelStride[0]] = rgb[channelIdx[1]] ;
inputData[y * width + x + channelStride[1]] = rgb[channelIdx[2]];
}
}
} else if (channels == 1) {
for (int y = 0; y < height; y++) {
for (int x = 0; x < width; x++) {
int color = inputImage.getPixel(x, y);
// average the three channels to obtain a grayscale value in [0, 255]
float gray = (red(color) + green(color) + blue(color)) / 3.0f;
inputData[y * width + x] = gray;
}
}
} else {
Log.i(TAG, "unsupported channel size " + channels + ", only 1 and 3 channels are " +
"supported!");
return false;
}
return true;
}
}
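Preprocess.to_array above packs interleaved HWC bitmap pixels into a planar NCHW float array: channel c occupies the slice starting at c * width * height, and the channelIdx table swaps channels for BGR input. The same index arithmetic as a self-contained plain-Java sketch (no Android dependencies; NchwPackDemo and its sample pixel values are hypothetical):

```java
public class NchwPackDemo {
    // Pack an interleaved HWC image (height*width*3 ints) into planar CHW
    // floats, mirroring Preprocess.to_array: plane c starts at c*width*height,
    // and channelIdx selects the source channel for each destination plane.
    static float[] pack(int[] hwc, int height, int width, int[] channelIdx) {
        float[] out = new float[3 * width * height];
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int base = (y * width + x) * 3;
                for (int c = 0; c < 3; c++) {
                    out[c * width * height + y * width + x] = hwc[base + channelIdx[c]];
                }
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // 1x2 image: pixel0 = (R=10, G=20, B=30), pixel1 = (R=40, G=50, B=60)
        int[] hwc = {10, 20, 30, 40, 50, 60};
        float[] rgb = pack(hwc, 1, 2, new int[]{0, 1, 2});
        float[] bgr = pack(hwc, 1, 2, new int[]{2, 1, 0});
        // first R-plane value, first G-plane value, and B-first value for BGR
        System.out.println(rgb[0] + "," + rgb[2] + "," + bgr[0]);
    }
}
```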
package com.baidu.paddle.lite.demo.segmentation.visual;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Matrix;
import android.graphics.Paint;
import android.util.Log;
import com.baidu.paddle.lite.Tensor;
public class Visualize {
private static final String TAG = Visualize.class.getSimpleName();
public Bitmap draw(Bitmap inputImage, Tensor outputTensor){
final int[] colors_map = {0xFF000000, 0xFFFFFF00};
float[] output = outputTensor.getFloatData();
long outputShape[] = outputTensor.shape();
long outputSize = 1;
for (long s : outputShape) {
outputSize *= s;
}
int[] objectColor = new int[(int)outputSize];
for(int i=0;i<output.length;i++){
objectColor[i] = colors_map[(int)output[i]];
}
Bitmap.Config config = inputImage.getConfig();
Bitmap outputImage = null;
if(outputShape.length==3){
outputImage = Bitmap.createBitmap(objectColor, (int)outputShape[2], (int)outputShape[1], config);
outputImage = Bitmap.createScaledBitmap(outputImage, inputImage.getWidth(), inputImage.getHeight(),true);
}
else if (outputShape.length==4){
outputImage = Bitmap.createBitmap(objectColor, (int)outputShape[3], (int)outputShape[2], config);
outputImage = Bitmap.createScaledBitmap(outputImage, inputImage.getWidth(), inputImage.getHeight(), true);
}
if (outputImage == null) {
// unexpected output rank: fall back to the unmodified input image
return inputImage;
}
Bitmap bmOverlay = Bitmap.createBitmap(inputImage.getWidth(), inputImage.getHeight(), inputImage.getConfig());
Canvas canvas = new Canvas(bmOverlay);
canvas.drawBitmap(inputImage, new Matrix(), null);
Paint paint = new Paint();
paint.setAlpha(0x80);
canvas.drawBitmap(outputImage, 0, 0, paint);
return bmOverlay;
}
}
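Visualize.draw maps each per-pixel class index from the output tensor through colors_map to an ARGB pixel before building the mask bitmap. That lookup in isolation, as a testable plain-Java sketch (MaskColorDemo and the sample class-index array are hypothetical; the two-entry colors_map matches the demo's background/foreground palette):

```java
public class MaskColorDemo {
    // Map per-pixel class indices (floats from the output tensor) to ARGB
    // colors, mirroring the objectColor loop in Visualize.draw.
    static int[] colorize(float[] classIds, int[] colorsMap) {
        int[] pixels = new int[classIds.length];
        for (int i = 0; i < classIds.length; i++) {
            pixels[i] = colorsMap[(int) classIds[i]];
        }
        return pixels;
    }

    public static void main(String[] args) {
        int[] colorsMap = {0xFF000000, 0xFFFFFF00}; // background, foreground
        int[] px = colorize(new float[]{0, 1, 1, 0}, colorsMap);
        // a foreground pixel takes the second palette entry
        System.out.println(Integer.toHexString(px[1]));
    }
}
```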
<vector xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:aapt="http://schemas.android.com/aapt"
android:width="108dp"
android:height="108dp"
android:viewportWidth="108"
android:viewportHeight="108">
<path
android:fillType="evenOdd"
android:pathData="M32,64C32,64 38.39,52.99 44.13,50.95C51.37,48.37 70.14,49.57 70.14,49.57L108.26,87.69L108,109.01L75.97,107.97L32,64Z"
android:strokeWidth="1"
android:strokeColor="#00000000">
<aapt:attr name="android:fillColor">
<gradient
android:endX="78.5885"
android:endY="90.9159"
android:startX="48.7653"
android:startY="61.0927"
android:type="linear">
<item
android:color="#44000000"
android:offset="0.0" />
<item
android:color="#00000000"
android:offset="1.0" />
</gradient>
</aapt:attr>
</path>
<path
android:fillColor="#FFFFFF"
android:fillType="nonZero"
android:pathData="M66.94,46.02L66.94,46.02C72.44,50.07 76,56.61 76,64L32,64C32,56.61 35.56,50.11 40.98,46.06L36.18,41.19C35.45,40.45 35.45,39.3 36.18,38.56C36.91,37.81 38.05,37.81 38.78,38.56L44.25,44.05C47.18,42.57 50.48,41.71 54,41.71C57.48,41.71 60.78,42.57 63.68,44.05L69.11,38.56C69.84,37.81 70.98,37.81 71.71,38.56C72.44,39.3 72.44,40.45 71.71,41.19L66.94,46.02ZM62.94,56.92C64.08,56.92 65,56.01 65,54.88C65,53.76 64.08,52.85 62.94,52.85C61.8,52.85 60.88,53.76 60.88,54.88C60.88,56.01 61.8,56.92 62.94,56.92ZM45.06,56.92C46.2,56.92 47.13,56.01 47.13,54.88C47.13,53.76 46.2,52.85 45.06,52.85C43.92,52.85 43,53.76 43,54.88C43,56.01 43.92,56.92 45.06,56.92Z"
android:strokeWidth="1"
android:strokeColor="#00000000" />
</vector>
<?xml version="1.0" encoding="utf-8"?>
<vector xmlns:android="http://schemas.android.com/apk/res/android"
android:width="108dp"
android:height="108dp"
android:viewportWidth="108"
android:viewportHeight="108">
<path
android:fillColor="#008577"
android:pathData="M0,0h108v108h-108z" />
<path
android:fillColor="#00000000"
android:pathData="M9,0L9,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,0L19,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M29,0L29,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M39,0L39,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M49,0L49,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M59,0L59,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M69,0L69,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M79,0L79,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M89,0L89,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M99,0L99,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,9L108,9"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,19L108,19"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,29L108,29"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,39L108,39"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,49L108,49"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,59L108,59"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,69L108,69"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,79L108,79"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,89L108,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,99L108,99"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,29L89,29"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,39L89,39"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,49L89,49"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,59L89,59"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,69L89,69"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,79L89,79"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M29,19L29,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M39,19L39,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M49,19L49,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M59,19L59,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M69,19L69,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M79,19L79,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
</vector>
<?xml version="1.0" encoding="utf-8"?>
<android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".segmentation.ImgSegActivity">
<RelativeLayout
android:layout_width="match_parent"
android:layout_height="match_parent">
<LinearLayout
android:id="@+id/v_input_info"
android:layout_width="fill_parent"
android:layout_height="wrap_content"
android:layout_alignParentTop="true"
android:orientation="vertical">
<TextView
android:id="@+id/tv_input_setting"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:scrollbars="vertical"
android:layout_marginLeft="12dp"
android:layout_marginRight="12dp"
android:layout_marginTop="10dp"
android:layout_marginBottom="5dp"
android:lineSpacingExtra="4dp"
android:singleLine="false"
android:maxLines="6"
android:text=""/>
</LinearLayout>
<RelativeLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_above="@+id/v_output_info"
android:layout_below="@+id/v_input_info">
<ImageView
android:id="@+id/iv_input_image"
android:layout_width="400dp"
android:layout_height="400dp"
android:layout_centerHorizontal="true"
android:layout_centerVertical="true"
android:layout_marginLeft="12dp"
android:layout_marginRight="12dp"
android:layout_marginTop="5dp"
android:layout_marginBottom="5dp"
android:adjustViewBounds="true"
android:scaleType="fitCenter"/>
</RelativeLayout>
<RelativeLayout
android:id="@+id/v_output_info"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_centerHorizontal="true">
<TextView
android:id="@+id/tv_output_result"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentTop="true"
android:layout_centerHorizontal="true"
android:layout_centerVertical="true"
android:scrollbars="vertical"
android:layout_marginLeft="12dp"
android:layout_marginRight="12dp"
android:layout_marginTop="5dp"
android:layout_marginBottom="5dp"
android:textAlignment="center"
android:lineSpacingExtra="5dp"
android:singleLine="false"
android:maxLines="5"
android:text=""/>
<TextView
android:id="@+id/tv_inference_time"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@+id/tv_output_result"
android:layout_centerHorizontal="true"
android:layout_centerVertical="true"
android:textAlignment="center"
android:layout_marginLeft="12dp"
android:layout_marginRight="12dp"
android:layout_marginTop="5dp"
android:layout_marginBottom="10dp"
android:text=""/>
</RelativeLayout>
</RelativeLayout>
</android.support.constraint.ConstraintLayout>
\ No newline at end of file
<?xml version="1.0" encoding="utf-8"?>
<android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="fill_parent"
android:layout_height="fill_parent"
tools:context=".MainActivity">
<ScrollView
android:layout_width="fill_parent"
android:layout_height="fill_parent"
android:fadingEdge="vertical"
android:scrollbars="vertical">
<LinearLayout
android:layout_width="fill_parent"
android:layout_height="fill_parent"
android:orientation="vertical">
<LinearLayout
android:layout_width="fill_parent"
android:layout_height="300dp"
android:orientation="horizontal">
<RelativeLayout
android:id="@+id/v_img_seg"
android:layout_width="wrap_content"
android:layout_height="fill_parent"
android:layout_weight="1"
android:clickable="true"
android:onClick="onClick">
<ImageView
android:id="@+id/iv_img_seg_image"
android:layout_width="200dp"
android:layout_height="200dp"
android:layout_centerHorizontal="true"
android:layout_centerVertical="true"
android:layout_margin="12dp"
android:adjustViewBounds="true"
android:src="@drawable/image_segementation"
android:scaleType="fitCenter"/>
<TextView
android:id="@+id/iv_img_seg_title"
android:layout_below="@id/iv_img_seg_image"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerHorizontal="true"
android:layout_margin="8dp"
android:text="Image Segmentation"
android:textStyle="bold"
android:textAllCaps="false"
android:singleLine="false"/>
</RelativeLayout>
</LinearLayout>
</LinearLayout>
</ScrollView>
</android.support.constraint.ConstraintLayout>
\ No newline at end of file
<menu xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto">
<group android:id="@+id/pick_image">
<item
android:id="@+id/open_gallery"
android:title="Open Gallery"
app:showAsAction="withText"/>
<item
android:id="@+id/take_photo"
android:title="Take Photo"
app:showAsAction="withText"/>
</group>
<group>
<item
android:id="@+id/settings"
android:title="Settings..."
app:showAsAction="withText"/>
</group>
</menu>
<?xml version="1.0" encoding="utf-8"?>
<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
<background android:drawable="@drawable/ic_launcher_background" />
<foreground android:drawable="@drawable/ic_launcher_foreground" />
</adaptive-icon>
\ No newline at end of file
<?xml version="1.0" encoding="utf-8"?>
<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
<background android:drawable="@drawable/ic_launcher_background" />
<foreground android:drawable="@drawable/ic_launcher_foreground" />
</adaptive-icon>
\ No newline at end of file
<?xml version="1.0" encoding="utf-8"?>
<resources>
<string-array name="cpu_thread_num_entries">
<item>1 thread</item>
<item>2 threads</item>
<item>4 threads</item>
<item>8 threads</item>
</string-array>
<string-array name="cpu_thread_num_values">
<item>1</item>
<item>2</item>
<item>4</item>
<item>8</item>
</string-array>
<string-array name="cpu_power_mode_entries">
<item>HIGH(only big cores)</item>
<item>LOW(only LITTLE cores)</item>
<item>FULL(all cores)</item>
<item>NO_BIND(depends on system)</item>
<item>RAND_HIGH</item>
<item>RAND_LOW</item>
</string-array>
<string-array name="cpu_power_mode_values">
<item>LITE_POWER_HIGH</item>
<item>LITE_POWER_LOW</item>
<item>LITE_POWER_FULL</item>
<item>LITE_POWER_NO_BIND</item>
<item>LITE_POWER_RAND_HIGH</item>
<item>LITE_POWER_RAND_LOW</item>
</string-array>
<string-array name="input_color_format_entries">
<item>BGR color format</item>
<item>RGB color format</item>
</string-array>
<string-array name="input_color_format_values">
<item>BGR</item>
<item>RGB</item>
</string-array>
</resources>
\ No newline at end of file
<?xml version="1.0" encoding="utf-8"?>
<resources>
<color name="colorPrimary">#008577</color>
<color name="colorPrimaryDark">#00574B</color>
<color name="colorAccent">#D81B60</color>
</resources>
<resources>
<string name="app_name">Segmentation-demo</string>
<!-- image segmentation settings -->
<string name="ISG_CHOOSE_PRE_INSTALLED_MODEL_KEY">ISG_CHOOSE_PRE_INSTALLED_MODEL_KEY</string>
<string name="ISG_ENABLE_CUSTOM_SETTINGS_KEY">ISG_ENABLE_CUSTOM_SETTINGS_KEY</string>
<string name="ISG_MODEL_PATH_KEY">ISG_MODEL_PATH_KEY</string>
<string name="ISG_LABEL_PATH_KEY">ISG_LABEL_PATH_KEY</string>
<string name="ISG_IMAGE_PATH_KEY">ISG_IMAGE_PATH_KEY</string>
<string name="ISG_CPU_THREAD_NUM_KEY">ISG_CPU_THREAD_NUM_KEY</string>
<string name="ISG_CPU_POWER_MODE_KEY">ISG_CPU_POWER_MODE_KEY</string>
<string name="ISG_INPUT_COLOR_FORMAT_KEY">ISG_INPUT_COLOR_FORMAT_KEY</string>
<string name="ISG_INPUT_SHAPE_KEY">ISG_INPUT_SHAPE_KEY</string>
<string name="ISG_MODEL_PATH_DEFAULT">image_segmentation/models/deeplab_mobilenet_for_cpu</string>
<string name="ISG_LABEL_PATH_DEFAULT">image_segmentation/labels/label_list</string>
<string name="ISG_IMAGE_PATH_DEFAULT">image_segmentation/images/human.jpg</string>
<string name="ISG_CPU_THREAD_NUM_DEFAULT">1</string>
<string name="ISG_CPU_POWER_MODE_DEFAULT">LITE_POWER_HIGH</string>
<string name="ISG_INPUT_COLOR_FORMAT_DEFAULT">RGB</string>
<string name="ISG_INPUT_SHAPE_DEFAULT">1,3,513,513</string>
</resources>
<resources>
<!-- Base application theme. -->
<style name="AppTheme" parent="Theme.AppCompat.Light.DarkActionBar">
<!-- Customize your theme here. -->
<item name="colorPrimary">@color/colorPrimary</item>
<item name="colorPrimaryDark">@color/colorPrimaryDark</item>
<item name="colorAccent">@color/colorAccent</item>
<item name="actionOverflowMenuStyle">@style/OverflowMenuStyle</item>
</style>
<style name="OverflowMenuStyle" parent="Widget.AppCompat.Light.PopupMenu.Overflow">
<item name="overlapAnchor">false</item>
</style>
<style name="AppTheme.NoActionBar">
<item name="windowActionBar">false</item>
<item name="windowNoTitle">true</item>
</style>
<style name="AppTheme.AppBarOverlay" parent="ThemeOverlay.AppCompat.Dark.ActionBar"/>
<style name="AppTheme.PopupOverlay" parent="ThemeOverlay.AppCompat.Light"/>
</resources>
<?xml version="1.0" encoding="utf-8"?>
<PreferenceScreen xmlns:android="http://schemas.android.com/apk/res/android" >
<PreferenceCategory android:title="Model Settings">
<ListPreference
android:defaultValue="@string/ISG_MODEL_PATH_DEFAULT"
android:key="@string/ISG_CHOOSE_PRE_INSTALLED_MODEL_KEY"
android:negativeButtonText="@null"
android:positiveButtonText="@null"
android:title="Choose pre-installed models" />
<CheckBoxPreference
android:defaultValue="false"
android:key="@string/ISG_ENABLE_CUSTOM_SETTINGS_KEY"
android:summaryOn="Enable"
android:summaryOff="Disable"
android:title="Enable custom settings"/>
<EditTextPreference
android:key="@string/ISG_MODEL_PATH_KEY"
android:defaultValue="@string/ISG_MODEL_PATH_DEFAULT"
android:title="Model Path" />
<EditTextPreference
android:key="@string/ISG_LABEL_PATH_KEY"
android:defaultValue="@string/ISG_LABEL_PATH_DEFAULT"
android:title="Label Path" />
<EditTextPreference
android:key="@string/ISG_IMAGE_PATH_KEY"
android:defaultValue="@string/ISG_IMAGE_PATH_DEFAULT"
android:title="Image Path" />
</PreferenceCategory>
<PreferenceCategory android:title="CPU Settings">
<ListPreference
android:defaultValue="@string/ISG_CPU_THREAD_NUM_DEFAULT"
android:key="@string/ISG_CPU_THREAD_NUM_KEY"
android:negativeButtonText="@null"
android:positiveButtonText="@null"
android:title="CPU Thread Num"
android:entries="@array/cpu_thread_num_entries"
android:entryValues="@array/cpu_thread_num_values"/>
<ListPreference
android:defaultValue="@string/ISG_CPU_POWER_MODE_DEFAULT"
android:key="@string/ISG_CPU_POWER_MODE_KEY"
android:negativeButtonText="@null"
android:positiveButtonText="@null"
android:title="CPU Power Mode"
android:entries="@array/cpu_power_mode_entries"
android:entryValues="@array/cpu_power_mode_values"/>
</PreferenceCategory>
<PreferenceCategory android:title="Input Settings">
<ListPreference
android:defaultValue="@string/ISG_INPUT_COLOR_FORMAT_DEFAULT"
android:key="@string/ISG_INPUT_COLOR_FORMAT_KEY"
android:negativeButtonText="@null"
android:positiveButtonText="@null"
android:title="Input Color Format: BGR or RGB"
android:entries="@array/input_color_format_entries"
android:entryValues="@array/input_color_format_values"/>
<EditTextPreference
android:key="@string/ISG_INPUT_SHAPE_KEY"
android:defaultValue="@string/ISG_INPUT_SHAPE_DEFAULT"
android:title="Input Shape: (1,1,h,w) or (1,3,h,w)" />
</PreferenceCategory>
</PreferenceScreen>
package com.baidu.paddle.lite.demo;
import org.junit.Test;
import static org.junit.Assert.*;
/**
* Example local unit test, which will execute on the development machine (host).
*
* @see <a href="http://d.android.com/tools/testing">Testing documentation</a>
*/
public class ExampleUnitTest {
@Test
public void addition_isCorrect() {
assertEquals(4, 2 + 2);
}
}
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
repositories {
google()
jcenter()
}
dependencies {
classpath 'com.android.tools.build:gradle:3.4.0'
// NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files
}
}
allprojects {
repositories {
google()
jcenter()
}
}
task clean(type: Delete) {
delete rootProject.buildDir
}
# Project-wide Gradle settings.
# IDE (e.g. Android Studio) users:
# Gradle settings configured through the IDE *will override*
# any settings specified in this file.
# For more details on how to configure your build environment visit
# http://www.gradle.org/docs/current/userguide/build_environment.html
# Specifies the JVM arguments used for the daemon process.
# The setting is particularly useful for tweaking memory settings.
org.gradle.jvmargs=-Xmx1536m
# When configured, Gradle will run in incubating parallel mode.
# This option should only be used with decoupled projects. More details, visit
# http://www.gradle.org/docs/current/userguide/multi_project_builds.html#sec:decoupled_projects
# org.gradle.parallel=true
#Thu Aug 22 15:05:37 CST 2019
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-5.1.1-all.zip
#!/usr/bin/env sh
##############################################################################
##
## Gradle start up script for UN*X
##
##############################################################################
# Attempt to set APP_HOME
# Resolve links: $0 may be a link
PRG="$0"
# Need this for relative symlinks.
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG=`dirname "$PRG"`"/$link"
fi
done
SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >/dev/null
APP_HOME="`pwd -P`"
cd "$SAVED" >/dev/null
APP_NAME="Gradle"
APP_BASE_NAME=`basename "$0"`
# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
DEFAULT_JVM_OPTS=""
# Use the maximum available, or set MAX_FD != -1 to use that value.
MAX_FD="maximum"
warn () {
echo "$*"
}
die () {
echo
echo "$*"
echo
exit 1
}
# OS specific support (must be 'true' or 'false').
cygwin=false
msys=false
darwin=false
nonstop=false
case "`uname`" in
CYGWIN* )
cygwin=true
;;
Darwin* )
darwin=true
;;
MINGW* )
msys=true
;;
NONSTOP* )
nonstop=true
;;
esac
CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar
# Determine the Java command to use to start the JVM.
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
if [ ! -x "$JAVACMD" ] ; then
die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
else
JAVACMD="java"
which java >/dev/null 2>&1 || die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
# Increase the maximum file descriptors if we can.
if [ "$cygwin" = "false" -a "$darwin" = "false" -a "$nonstop" = "false" ] ; then
MAX_FD_LIMIT=`ulimit -H -n`
if [ $? -eq 0 ] ; then
if [ "$MAX_FD" = "maximum" -o "$MAX_FD" = "max" ] ; then
MAX_FD="$MAX_FD_LIMIT"
fi
ulimit -n $MAX_FD
if [ $? -ne 0 ] ; then
warn "Could not set maximum file descriptor limit: $MAX_FD"
fi
else
warn "Could not query maximum file descriptor limit: $MAX_FD_LIMIT"
fi
fi
# For Darwin, add options to specify how the application appears in the dock
if $darwin; then
GRADLE_OPTS="$GRADLE_OPTS \"-Xdock:name=$APP_NAME\" \"-Xdock:icon=$APP_HOME/media/gradle.icns\""
fi
# For Cygwin, switch paths to Windows format before running java
if $cygwin ; then
APP_HOME=`cygpath --path --mixed "$APP_HOME"`
CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
JAVACMD=`cygpath --unix "$JAVACMD"`
# We build the pattern for arguments to be converted via cygpath
ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`
SEP=""
for dir in $ROOTDIRSRAW ; do
ROOTDIRS="$ROOTDIRS$SEP$dir"
SEP="|"
done
OURCYGPATTERN="(^($ROOTDIRS))"
# Add a user-defined pattern to the cygpath arguments
if [ "$GRADLE_CYGPATTERN" != "" ] ; then
OURCYGPATTERN="$OURCYGPATTERN|($GRADLE_CYGPATTERN)"
fi
# Now convert the arguments - kludge to limit ourselves to /bin/sh
i=0
for arg in "$@" ; do
CHECK=`echo "$arg"|egrep -c "$OURCYGPATTERN" -`
CHECK2=`echo "$arg"|egrep -c "^-"` ### Determine if an option
if [ $CHECK -ne 0 ] && [ $CHECK2 -eq 0 ] ; then ### Added a condition
eval `echo args$i`=`cygpath --path --ignore --mixed "$arg"`
else
eval `echo args$i`="\"$arg\""
fi
i=$((i+1))
done
case $i in
(0) set -- ;;
(1) set -- "$args0" ;;
(2) set -- "$args0" "$args1" ;;
(3) set -- "$args0" "$args1" "$args2" ;;
(4) set -- "$args0" "$args1" "$args2" "$args3" ;;
(5) set -- "$args0" "$args1" "$args2" "$args3" "$args4" ;;
(6) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" ;;
(7) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" ;;
(8) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" ;;
(9) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" "$args8" ;;
esac
fi
# Escape application args
save () {
for i do printf %s\\n "$i" | sed "s/'/'\\\\''/g;1s/^/'/;\$s/\$/' \\\\/" ; done
echo " "
}
APP_ARGS=$(save "$@")
# Collect all arguments for the java command, following the shell quoting and substitution rules
eval set -- $DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS "\"-Dorg.gradle.appname=$APP_BASE_NAME\"" -classpath "\"$CLASSPATH\"" org.gradle.wrapper.GradleWrapperMain "$APP_ARGS"
# by default we should be in the correct project dir, but when run from Finder on Mac, the cwd is wrong
if [ "$(uname)" = "Darwin" ] && [ "$HOME" = "$PWD" ]; then
cd "$(dirname "$0")"
fi
exec "$JAVACMD" "$@"
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem
@rem Gradle startup script for Windows
@rem
@rem ##########################################################################
@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal
set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS=
@rem Find java.exe
if defined JAVA_HOME goto findJavaFromJavaHome
set JAVA_EXE=java.exe
%JAVA_EXE% -version >NUL 2>&1
if "%ERRORLEVEL%" == "0" goto init
echo.
echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:findJavaFromJavaHome
set JAVA_HOME=%JAVA_HOME:"=%
set JAVA_EXE=%JAVA_HOME%/bin/java.exe
if exist "%JAVA_EXE%" goto init
echo.
echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME%
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:init
@rem Get command-line arguments, handling Windows variants
if not "%OS%" == "Windows_NT" goto win9xME_args
:win9xME_args
@rem Slurp the command line arguments.
set CMD_LINE_ARGS=
set _SKIP=2
:win9xME_args_slurp
if "x%~1" == "x" goto execute
set CMD_LINE_ARGS=%*
:execute
@rem Setup the command line
set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
@rem Execute Gradle
"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %CMD_LINE_ARGS%
:end
@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd
:fail
rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
rem the _cmd.exe /c_ return code!
if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1
exit /b 1
:mainEnd
if "%OS%"=="Windows_NT" endlocal
:omega
# PaddleSeg Python Deployment for Inference
## 1. Introduction
This solution provides a reference cross-platform `Python` deployment for `PaddlePaddle` image segmentation models. With a small amount of configuration and code, you can integrate a model into your own service to perform image segmentation.
If the **hardware supports it** (e.g. a `Tesla V100 GPU`), this program can use `Nvidia TensorRT` to accelerate inference at either `FP32` or `FP16` precision.
## 2. Prerequisites
* Python 2.7 / Python 3
## 3. Directory Structure
```
├── infer.py          # Core code: runs segmentation inference and visualizes results
├── requirements.txt  # Required Python packages
└── README.md         # This document
```
## 4. Environment Setup
### 4.1 Install PaddlePaddle
To choose and install an appropriate version of `PaddlePaddle`, see the [PaddlePaddle installation guide](https://www.paddlepaddle.org.cn/install/doc/).
**Note**: If your hardware supports it and you need `TensorRT` features such as `FP16` half-precision optimization, you must install `TensorRT` yourself and build `PaddlePaddle` from source. See the [build and installation guide](docs/compile_paddle_with_tensorrt.md).
### 4.2 Install Python Dependencies
In the **current** directory, install the `Python` dependencies with `pip`:
```bash
pip install -r requirements.txt
```
### 4.3 Install OpenCV Runtime Libraries
The inference code uses `OpenCV`, so the shared libraries that `OpenCV` depends on must also be installed.
On `Ubuntu`:
```bash
apt-get install -y libglib2.0-0 libsm6 libxext6 libxrender-dev
```
On CentOS:
```bash
yum install -y libXext libSM libXrender
```
## 5. Running Inference
### 5.1 Prepare the Model
Export your model with the [model export tool](../../docs/model_export.md), or download our [human segmentation sample model](https://bj.bcebos.com/paddleseg/inference/human_freeze_model.zip) for testing.
The export directory typically contains three files:
```
├── model        # model file
├── params       # parameter file
└── deploy.yaml  # configuration file, used by the C++ and Python predictors
```
The main fields of the configuration file are:
```yaml
DEPLOY:
    # whether to run inference on GPU
    USE_GPU: 1
    # directory containing the model and parameter files
    MODEL_PATH: "/root/projects/models/deeplabv3p_xception65_humanseg"
    # model file name
    MODEL_FILENAME: "__model__"
    # parameter file name
    PARAMS_FILENAME: "__params__"
    # standard input size for inference; images of other sizes are resized
    EVAL_CROP_SIZE: (513, 513)
    # per-channel mean
    MEAN: [0.5, 0.5, 0.5]
    # per-channel standard deviation
    STD: [0.5, 0.5, 0.5]
    # number of classes
    NUM_CLASSES: 2
    # number of image channels
    CHANNELS : 3
    # predictor mode, NATIVE or ANALYSIS
    PREDICTOR_MODE: "ANALYSIS"
    # batch_size per inference run
    BATCH_SIZE : 3
```
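As a rough sketch of how such a file can be consumed (the field names follow the sample above; `load_deploy_config` is a hypothetical helper, not part of `infer.py`), the configuration can be read with PyYAML:

```python
import yaml  # PyYAML, listed in requirements.txt


def load_deploy_config(path):
    """Load a deploy.yaml exported by PaddleSeg and return its DEPLOY section."""
    with open(path) as f:
        conf = yaml.safe_load(f)
    deploy = conf["DEPLOY"]
    # YAML parses "(513, 513)" as a plain string; convert it to an int tuple
    deploy["EVAL_CROP_SIZE"] = tuple(
        int(v) for v in deploy["EVAL_CROP_SIZE"].strip("()").split(","))
    return deploy
```

The crop-size conversion is the only non-trivial step: every other field is already a usable scalar or list after `yaml.safe_load`.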
### 5.2 Run Inference
Run the following command in a terminal:
```bash
python infer.py --conf=/path/to/deploy.yaml --input_dir=/path/to/images_directory --use_pr=False
```
The parameters are:
| Parameter | Required | Meaning |
|-------|-------|----------|
| conf | Yes | path to the model's YAML configuration file |
| input_dir | Yes | directory of images to predict |
| use_pr | No | whether to use the optimized model; defaults to False |
* Optimized model: models exported with `PaddleSeg 0.3.0` or later are optimized; models exported with earlier versions are not. An optimized model fuses image pre- and post-processing into the network so they run on the `GPU`, improving performance over doing them on the `CPU`.
**Note**: If your hardware supports it and your `PaddlePaddle` was built from source with `TensorRT` integration, you can pass `--trt_mode=fp16` to enable `FP16` precision optimization, or `--trt_mode=fp32` to use `FP32` precision.
After it runs, the program scans all supported images under `input_dir` and produces a `prediction mask` and a `visualized result` for each.
For an image `a.jpeg`, the `prediction mask` is saved as `a_jpeg.png` and the visualized result as `a_jpeg_result.png`.
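The naming rule above can be sketched as a small helper (hypothetical, for illustration; `infer.py` implements its own version of this mapping):

```python
import os


def output_names(image_path):
    """Map an input image path to the (mask, visualization) file names,
    replacing dots in the file name with underscores as described above."""
    base = os.path.basename(image_path).replace(".", "_")
    return base + ".png", base + "_result.png"
```

For example, `output_names("/data/a.jpeg")` yields `("a_jpeg.png", "a_jpeg_result.png")`.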
Sample input:
![avatar](../cpp/images/humanseg/demo2.jpeg)
Output:
![avatar](../cpp/images/humanseg/demo2.jpeg_result.png)
# PaddleSeg Segmentation Model Inference Benchmark
## Test Environment
- CUDA 9.0
- CUDNN 7.6
- TensorRT-5.1.5
- PaddlePaddle v1.6.1
- Ubuntu 16.04
- GPU: Tesla V100
- CPU: Intel(R) Xeon(R) Gold 6148 CPU @ 2.40GHz
## Methodology
- Input: 1000 RGB images, with batch_size fixed at 1.
- Each test runs several rounds; the first (warm-up) round is discarded and the remaining rounds are averaged. Timing covers copying data to the GPU, the inference engine's compute time, and copying results back to the CPU.
- Inference uses the Fluid C++ prediction engine.
- FLAGS_cudnn_exhaustive_search=True is enabled, so convolution algorithms are selected by exhaustive search.
- For each model, both the `OP`-optimized model and the original model were tested, each with `FP16` and with `FP32`.
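The timing protocol above (discard the warm-up round, average the rest) can be sketched as follows; `run_once` stands in for one full inference pass and is an assumption of this example, not part of the benchmark harness:

```python
import time


def benchmark(run_once, rounds=5):
    """Average per-round latency, discarding the first (warm-up) round.

    run_once performs one complete inference pass: host-to-device copy,
    engine compute, and device-to-host copy, matching the methodology above.
    """
    timings = []
    for _ in range(rounds):
        start = time.perf_counter()
        run_once()
        timings.append(time.perf_counter() - start)
    # Skip timings[0]: the first round includes engine warm-up costs
    return sum(timings[1:]) / (rounds - 1)
```

Excluding the warm-up round matters because the first pass typically includes one-time costs (memory allocation, algorithm search) that would skew a steady-state average.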
## Results
**Note**: `OP-optimized model` refers to the new model format exported by `PaddleSeg 0.3.0` and later, which moves image pre- and post-processing onto the GPU for better performance. Each model was tested at three `eval_crop_size` values: `192x192` / `512x512` / `768x768`.
<table width="1440">
<tbody>
<tr>
<td rowspan="2" width="432">
<p>Model</p>
</td>
<td colspan="3" width="535">
<p>Original model (ms/image)</p>
</td>
<td colspan="3" width="588">
<p>OP-optimized model (ms/image)</p>
</td>
</tr>
<tr>
<td>
<p>Fluid</p>
</td>
<td>
<p>Fluid-TRT FP32</p>
</td>
<td>
<p>Fluid-TRT FP16</p>
</td>
<td>
<p>Fluid</p>
</td>
<td>
<p>Fluid-TRT FP32</p>
</td>
<td>
<p>Fluid-TRT FP16</p>
</td>
</tr>
<tr>
<td>
<p>deeplabv3p_mobilenetv2-1-0_bn_192x192</p>
</td>
<td>
<p>4.717</p>
</td>
<td>
<p>3.085</p>
</td>
<td>
<p>2.607</p>
</td>
<td>
<p>3.705</p>
</td>
<td>
<p>2.09</p>
</td>
<td>
<p>1.775</p>
</td>
</tr>
<tr>
<td>
<p>deeplabv3p_mobilenetv2-1-0_bn_512x512</p>
</td>
<td>
<p>15.848</p>
</td>
<td>
<p>14.243</p>
</td>
<td>
<p>13.699</p>
</td>
<td>
<p>8.284</p>
</td>
<td>
<p>6.972</p>
</td>
<td>
<p>6.013</p>
</td>
</tr>
<tr>
<td>
<p>deeplabv3p_mobilenetv2-1-0_bn_768x768</p>
</td>
<td>
<p>63.148</p>
</td>
<td>
<p>61.133</p>
</td>
<td>
<p>59.262</p>
</td>
<td>
<p>16.242</p>
</td>
<td>
<p>13.624</p>
</td>
<td>
<p>12.018</p>
</td>
</tr>
<tr>
<td>
<p>deeplabv3p_xception65_bn_192x192</p>
</td>
<td>
<p>9.703</p>
</td>
<td>
<p>9.393</p>
</td>
<td>
<p>6.46</p>
</td>
<td>
<p>8.555</p>
</td>
<td>
<p>8.202</p>
</td>
<td>
<p>5.15</p>
</td>
</tr>
<tr>
<td>
<p>deeplabv3p_xception65_bn_512x512</p>
</td>
<td>
<p>30.944</p>
</td>
<td>
<p>30.031</p>
</td>
<td>
<p>20.716</p>
</td>
<td>
<p>23.571</p>
</td>
<td>
<p>22.601</p>
</td>
<td>
<p>13.327</p>
</td>
</tr>
<tr>
<td>
<p>deeplabv3p_xception65_bn_768x768</p>
</td>
<td>
<p>92.109</p>
</td>
<td>
<p>89.338</p>
</td>
<td>
<p>43.342</p>
</td>
<td>
<p>44.341</p>
</td>
<td>
<p>41.945</p>
</td>
<td>
<p>25.486</p>
</td>
</tr>
<tr>
<td>
<p>icnet_bn_192x192</p>
</td>
<td>
<p>5.706</p>
</td>
<td>
<p>5.057</p>
</td>
<td>
<p>4.515</p>
</td>
<td>
<p>4.694</p>
</td>
<td>
<p>4.066</p>
</td>
<td>
<p>3.369</p>
</td>
</tr>
<tr>
<td>
<p>icnet_bn_512x512</p>
</td>
<td>
<p>18.326</p>
</td>
<td>
<p>16.971</p>
</td>
<td>
<p>16.663</p>
</td>
<td>
<p>10.576</p>
</td>
<td>
<p>9.779</p>
</td>
<td>
<p>9.389</p>
</td>
</tr>
<tr>
<td>
<p>icnet_bn_768x768</p>
</td>
<td>
<p>67.542</p>
</td>
<td>
<p>65.436</p>
</td>
<td>
<p>64.197</p>
</td>
<td>
<p>18.464</p>
</td>
<td>
<p>17.881</p>
</td>
<td>
<p>16.958</p>
</td>
</tr>
<tr>
<td>
<p>pspnet101_bn_192x192</p>
</td>
<td>
<p>20.978</p>
</td>
<td>
<p>18.089</p>
</td>
<td>
<p>11.946</p>
</td>
<td>
<p>20.102</p>
</td>
<td>
<p>17.128</p>
</td>
<td>
<p>11.011</p>
</td>
</tr>
<tr>
<td>
<p>pspnet101_bn_512x512</p>
</td>
<td>
<p>72.085</p>
</td>
<td>
<p>71.114</p>
</td>
<td>
<p>43.009</p>
</td>
<td>
<p>64.584</p>
</td>
<td>
<p>63.715</p>
</td>
<td>
<p>35.806</p>
</td>
</tr>
<tr>
<td>
<p>pspnet101_bn_768x768</p>
</td>
<td>
<p>160.552</p>
</td>
<td>
<p>157.791</p>
</td>
<td>
<p>110.544</p>
</td>
<td>
<p>111.996</p>
</td>
<td>
<p>111.22</p>
</td>
<td>
<p>69.646</p>
</td>
</tr>
<tr>
<td>
<p>pspnet50_bn_192x192</p>
</td>
<td>
<p>13.854</p>
</td>
<td>
<p>12.491</p>
</td>
<td>
<p>9.357</p>
</td>
<td>
<p>12.889</p>
</td>
<td>
<p>11.479</p>
</td>
<td>
<p>8.516</p>
</td>
</tr>
<tr>
<td>
<p>pspnet50_bn_512x512</p>
</td>
<td>
<p>55.868</p>
</td>
<td>
<p>55.205</p>
</td>
<td>
<p>39.659</p>
</td>
<td>
<p>48.647</p>
</td>
<td>
<p>48.076</p>
</td>
<td>
<p>32.403</p>
</td>
</tr>
<tr>
<td>
<p>pspnet50_bn_768x768</p>
</td>
<td>
<p>135.268</p>
</td>
<td>
<p>131.268</p>
</td>
<td>
<p>109.732</p>
</td>
<td>
<p>85.167</p>
</td>
<td>
<p>84.615</p>
</td>
<td>
<p>65.483</p>
</td>
</tr>
<tr>
<td>
<p>unet_bn_coco_192x192</p>
</td>
<td>
<p>7.557</p>
</td>
<td>
<p>7.979</p>
</td>
<td>
<p>8.049</p>
</td>
<td>
<p>4.933</p>
</td>
<td>
<p>4.952</p>
</td>
<td>
<p>4.959</p>
</td>
</tr>
<tr>
<td>
<p>unet_bn_coco_512x512</p>
</td>
<td>
<p>37.131</p>
</td>
<td>
<p>36.668</p>
</td>
<td>
<p>36.706</p>
</td>
<td>
<p>26.857</p>
</td>
<td>
<p>26.917</p>
</td>
<td>
<p>26.928</p>
</td>
</tr>
<tr>
<td>
<p>unet_bn_coco_768x768</p>
</td>
<td>
<p>110.578</p>
</td>
<td>
<p>110.031</p>
</td>
<td>
<p>109.979</p>
</td>
<td>
<p>59.118</p>
</td>
<td>
<p>59.173</p>
</td>
<td>
<p>59.124</p>
</td>
</tr>
</tbody>
</table>
## Analysis
### 1. Speedup from the new OP-optimized models
The chart below compares the OP-optimized models from `PaddleSeg 0.3.0` with the original models (at 512x512):
![OP optimization comparison](https://paddleseg.bj.bcebos.com/inference/benchmark/op_opt_512x512.png)
`Findings`:
- The optimized models show a clear speedup on every model, up to 100% in the best case.
- The larger the model's `eval_crop_size`, the greater the speedup.
### 2. Effect of TensorRT FP16 and FP32 optimization
Speedup on the original models:
![original models](https://paddleseg.bj.bcebos.com/inference/benchmark/trt_opt_origin_512x512.png)
Speedup on the optimized models:
![optimized models](https://paddleseg.bj.bcebos.com/inference/benchmark/trt_opt_new_512x512.png)
`Findings`:
- For the unet and icnet models, Fluid-TensorRT brings little or no speedup.
- For deeplabv3p_mobilenetv2, Fluid-TensorRT speeds up the original model by only 3%-5%, but speeds up the optimized model by up to 20%.
- For `deeplabv3_xception`, `pspnet50`, and `pspnet101`, `fp16` brings a substantial speedup, up to 110% at `768x768`.
### 3. Effect of EVAL_CROP_SIZE on performance
Comparison on `deeplabv3p_xception`:
![xception](https://paddleseg.bj.bcebos.com/inference/benchmark/xception.png)
Comparison on `deeplabv3p_mobilenet`:
![mobilenet](https://paddleseg.bj.bcebos.com/inference/benchmark/mobilenet.png)
Comparison on `unet`:
![unet](https://paddleseg.bj.bcebos.com/inference/benchmark/unet.png)
Comparison on `icnet`:
![icnet](https://paddleseg.bj.bcebos.com/inference/benchmark/unet.png)
Comparison on `pspnet101`:
![pspnet101](https://paddleseg.bj.bcebos.com/inference/benchmark/pspnet101.png)
Comparison on `pspnet50`:
![pspnet50](https://paddleseg.bj.bcebos.com/inference/benchmark/pspnet50.png)
`Findings`:
- For a given model, inference slows as `eval_crop_size` grows.
- For a given model, both the TensorRT and the OP optimizations become more effective as `eval_crop_size` grows.