Commit 0382af8d, authored by S sunyanfang01

Merge remote-tracking branch 'upstream/develop' into develop

<p align="center">
  <img src="./docs/gui/images/paddlex.png" width="360" height="55" alt="PaddleX" align="middle" />
</p>

[![License](https://img.shields.io/badge/license-Apache%202-red.svg)](LICENSE)
![support os](https://img.shields.io/badge/os-linux%2C%20win%2C%20mac-yellow.svg)
![QQGroup](https://img.shields.io/badge/QQ_Group-1045148026-52B6EF?style=social&logo=tencent-qq&logoColor=000&logoWidth=20)
**PaddleX, the all-in-one development kit for PaddlePaddle**, integrates the capabilities of the PaddlePaddle vision suites (PaddleClas, PaddleDetection, PaddleSeg), the model-compression tool PaddleSlim, the visualization tool VisualDL, and the lightweight inference engine Paddle Lite, combined with the PaddlePaddle team's hands-on industry experience. It connects the whole deep-learning workflow end to end, from data preparation through model training and tuning to multi-platform deployment, offering developers a best-practice path for full-workflow development with PaddlePaddle.

**PaddleX provides a minimal API design together with an officially maintained GUI**, lowering the barrier to entry as much as possible. Developers can use the **PaddleX GUI** to quickly walk through the full deep-learning workflow, or build more flexibly against the **PaddleX API** directly.
## Three Key Features of PaddleX
### End-to-End Workflow

- **Data preparation**: Compatible with common dataset formats such as ImageNet, VOC, and COCO, and integrates smoothly with Labelme, 精灵标注助手 (Colabeler), and the [EasyData smart data service platform](https://ai.baidu.com/easydata/), helping developers finish data preparation faster.
- **Data preprocessing and augmentation**: Provides Transforms, a minimal image preprocessing and augmentation API, and adapts the imgaug augmentation library with hundreds of augmentation strategies, helping developers quickly mitigate the problems of training on small datasets (see the sketch after this list).
- **Model training**: Integrates the [PaddleClas](https://github.com/PaddlePaddle/PaddleClas), [PaddleDetection](https://github.com/PaddlePaddle/PaddleDetection), and [PaddleSeg](https://github.com/PaddlePaddle/PaddleSeg) vision suites and offers a large set of carefully selected, industry-proven, high-quality pretrained models, so developers reach production-grade accuracy faster.
- **Model tuning**: A built-in model interpretability module and the [VisualDL](https://github.com/PaddlePaddle/VisualDL) visualization tool let developers inspect a model's salient feature regions and the evolution of training metrics more intuitively, and thus optimize models quickly.
- **Multi-platform secure deployment**: The built-in [PaddleSlim](https://github.com/PaddlePaddle/PaddleSlim) model-compression tool and **model encryption module** connect seamlessly with the native Paddle Inference library and the high-performance edge inference engine [Paddle Lite](https://github.com/PaddlePaddle/Paddle-Lite), enabling fast, high-performance, secure deployment across platforms.
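
As a rough illustration of the Transforms API mentioned in the preprocessing bullet above, the sketch below composes a few augmentations for classification. It assumes the PaddleX 1.x Python API (`paddlex.cls.transforms`); operator names such as `RandomCrop` and `ResizeByShort` follow the documented style but should be double-checked against the API reference.

```
# A minimal sketch of composing PaddleX image transforms
# (assumes the 1.x classification transforms API).
from paddlex.cls import transforms

# Augmentations applied on the fly during training.
train_transforms = transforms.Compose([
    transforms.RandomCrop(crop_size=224),   # random crop to the input size
    transforms.RandomHorizontalFlip(),      # random left-right flip
    transforms.Normalize()                  # scale and normalize pixel values
])

# Deterministic preprocessing for evaluation.
eval_transforms = transforms.Compose([
    transforms.ResizeByShort(short_size=256),
    transforms.CenterCrop(crop_size=224),
    transforms.Normalize()
])
```

The same `Compose` pattern applies to the detection and segmentation tasks through their counterparts under `paddlex.det.transforms` and `paddlex.seg.transforms`.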
### Grounded in Industry Practice

- **Industry-validated**: Proven in real application scenarios across more than ten industries, including **quality inspection**, **security**, **facility inspection**, **remote sensing**, **retail**, and **healthcare**, with adaptations for industry data formats and deployment-environment requirements.
- **Distilled experience**: Condenses hands-on experience from industry practice and **provides a rich set of case-study tutorials** to speed up developers' path to production.
- **Co-built with industry developers**: Absorbs code contributed by developers working in industry; PaddleX comes from industry and gives back to industry.

### Easy to Use and Integrate

- **Easy to use**: A unified end-to-end API: train a model in 5 steps, and deploy it with high performance in about 10 lines of Python/C++ code.
- **Easy to integrate**: Developers can freely adapt and integrate PaddleX to build products for their own industrial needs. The officially provided cross-platform visual tool built on the PaddleX API, **PaddleX GUI**, lets developers quickly experience the full PaddlePaddle deep-learning workflow and serves as a starting point for custom development.
## Installation

**PaddleX provides two development modes to meet different needs:**

1. **Python development mode:** Through a concise, easy-to-understand Python API, PaddleX balances functional completeness, development flexibility, and ease of integration to give developers the smoothest deep-learning development experience (a hedged quick-start sketch follows at the end of this section).

**Prerequisites**

> - paddlepaddle >= 1.8.0
> - python >= 3.5
> - cython
> - pycocotools
```
pip install paddlex -i https://mirror.baidu.com/pypi/simple
```
For detailed installation instructions, see [PaddleX Installation](https://paddlex.readthedocs.io/zh_CN/develop/install.html).
2. **Paddle GUI mode:** A no-code visual client built on the PaddleX API, which helps developers quickly validate industrial projects and serves as a reference for building their own deep-learning software and applications.

- Visit the [PaddleX homepage](https://www.paddlepaddle.org.cn/paddle/paddlex) to request the one-click PaddleX GUI installer.
- See the [PaddleX GUI tutorial](./docs/gui/how_to_use.md) for usage details.
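
To make the "train a model in 5 steps" claim concrete for the Python mode, here is a hedged quick-start sketch loosely modeled on the official 10-minute tutorial. The `vegetables_cls` dataset layout, the model choice, and the exact `train` keyword arguments are assumptions; verify them against the documentation linked below.

```
# A sketch of the 5-step PaddleX training flow (assuming the 1.x API).
import paddlex as pdx
from paddlex.cls import transforms

# Step 1: define training / evaluation transforms.
train_transforms = transforms.Compose([
    transforms.RandomCrop(crop_size=224),
    transforms.RandomHorizontalFlip(),
    transforms.Normalize()])
eval_transforms = transforms.Compose([
    transforms.ResizeByShort(short_size=256),
    transforms.CenterCrop(crop_size=224),
    transforms.Normalize()])

# Steps 2-3: define training and evaluation datasets
# (ImageNet-style file lists; the paths are illustrative).
train_dataset = pdx.datasets.ImageNet(
    data_dir='vegetables_cls',
    file_list='vegetables_cls/train_list.txt',
    label_list='vegetables_cls/labels.txt',
    transforms=train_transforms)
eval_dataset = pdx.datasets.ImageNet(
    data_dir='vegetables_cls',
    file_list='vegetables_cls/val_list.txt',
    label_list='vegetables_cls/labels.txt',
    transforms=eval_transforms)

# Step 4: pick a model.
model = pdx.cls.MobileNetV2(num_classes=len(train_dataset.labels))

# Step 5: train; the best checkpoint is saved under save_dir.
model.train(
    num_epochs=10,
    train_dataset=train_dataset,
    train_batch_size=32,
    eval_dataset=eval_dataset,
    learning_rate=0.025,
    save_dir='output/mobilenetv2')
```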
## Full Documentation and API Reference

- [Complete PaddleX documentation index](https://paddlex.readthedocs.io/zh_CN/develop/index.html)
- [10-minute quick-start tutorials](https://paddlex.readthedocs.io/zh_CN/develop/quick_start.html)
- [PaddleX model training tutorials](https://paddlex.readthedocs.io/zh_CN/develop/train/index.html)
- [PaddleX API reference](https://paddlex.readthedocs.io/zh_CN/develop/apis/index.html)
## Online Project Examples

To help developers pick up the PaddleX API faster, we have built a series of complete example projects that can be run online on the AIStudio one-stop development platform:

- [PaddleX quick start: cosmetics classification with MobileNetV3-ssld](https://aistudio.baidu.com/aistudio/projectdetail/450220)
- [PaddleX quick start: AI insect detection with Faster-RCNN](https://aistudio.baidu.com/aistudio/projectdetail/439888)
- [PaddleX quick start: optic disc segmentation with DeepLabv3+](https://aistudio.baidu.com/aistudio/projectdetail/440197)
## Communication and Feedback

- Project homepage: https://www.paddlepaddle.org.cn/paddle/paddlex
- PaddleX user QQ group: 1045148026 (scan the QR code below with mobile QQ to join quickly)

<img src="./docs/gui/images/QR.jpg" width="250" height="300" alt="QQGroup" align="center" />
## [FAQ](./docs/gui/faq.md)
## Change Log

> [Release history and changes](https://paddlex.readthedocs.io/zh_CN/develop/change_log.html)

- 2020.07.12 v1.0.8
- 2020.05.20 v1.0.0
- 2020.05.17 v0.1.8
## Contributing
......
# Model Deployment

This directory contains the PaddleX model-deployment code. For build and usage tutorials, see:
- [Server-side deployment](../docs/deploy/server/)
  - [Python deployment](../docs/deploy/server/python.md)
  - [C++ deployment](../docs/deploy/server/cpp/)
    - [Windows deployment](../docs/deploy/server/cpp/windows.md)
    - [Linux deployment](../docs/deploy/server/cpp/linux.md)
  - [Model encryption deployment](../docs/deploy/server/encryption.md)
- [Nvidia Jetson board deployment](../docs/deploy/nvidia-jetson.md)
- [Mobile deployment](../docs/deploy/paddlelite/)
  - [Model compression](../docs/deploy/paddlelite/slim)
    - [Model quantization](../docs/deploy/paddlelite/slim/quant.md)
    - [Model pruning](../docs/deploy/paddlelite/slim/prune.md)
  - [Android deployment](../docs/deploy/paddlelite/android.md)
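
For the server-side Python route in the list above, prediction from an exported inference model looks roughly like the sketch below. It assumes the PaddleX 1.x `paddlex.deploy.Predictor` interface; the argument names (`use_gpu`, `image`) should be verified against the Python deployment document.

```
# A sketch of server-side Python prediction (assuming the 1.x API).
import paddlex as pdx

# Load an exported inference model directory.
predictor = pdx.deploy.Predictor('./inference_model', use_gpu=True)

# Predict a single image; the result structure depends on the task type
# (classifier / detector / segmenter).
result = predictor.predict(image='test.jpg')
print(result)
```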
......
      FLAGS_use_ir_optim);

  // Run prediction
  int imgs = 1;
  if (FLAGS_image_list != "") {
    std::ifstream inf(FLAGS_image_list);
......
    }
    imgs = image_paths.size();
    for (int i = 0; i < image_paths.size(); i += FLAGS_batch_size) {
      // Read a batch of images
      int im_vec_size =
          std::min(static_cast<int>(image_paths.size()), i + FLAGS_batch_size);
......
      for (int j = i; j < im_vec_size; ++j) {
        im_vec[j - i] = std::move(cv::imread(image_paths[j], 1));
      }
      model.predict(im_vec, &results, thread_num);
      for (int j = i; j < im_vec_size; ++j) {
        std::cout << "Path:" << image_paths[j]
                  << ", predict label: " << results[j - i].category
......
      }
    }
  } else {
    PaddleX::ClsResult result;
    cv::Mat im = cv::imread(FLAGS_image, 1);
    model.predict(im, &result);
    std::cout << "Predict label: " << result.category
              << ", label_id:" << result.category_id
              << ", score: " << result.score << std::endl;
  }
  return 0;
}
......
      FLAGS_gpu_id,
      FLAGS_key,
      FLAGS_use_ir_optim);

  int imgs = 1;
  // Run prediction
  if (FLAGS_image_list != "") {
......
    }
    imgs = image_paths.size();
    for (int i = 0; i < image_paths.size(); i += FLAGS_batch_size) {
      int im_vec_size =
          std::min(static_cast<int>(image_paths.size()), i + FLAGS_batch_size);
      std::vector<cv::Mat> im_vec(im_vec_size - i);
......
      for (int j = i; j < im_vec_size; ++j) {
        im_vec[j - i] = std::move(cv::imread(image_paths[j], 1));
      }
      model.predict(im_vec, &results, thread_num);
      // Print the predicted bounding boxes
      for (int j = 0; j < im_vec_size - i; ++j) {
        for (int k = 0; k < results[j].boxes.size(); ++k) {
......
      // Visualization
      for (int j = 0; j < im_vec_size - i; ++j) {
        cv::Mat vis_img = PaddleX::Visualize(
            im_vec[j], results[j], model.labels, FLAGS_threshold);
        std::string save_path =
            PaddleX::generate_save_path(FLAGS_save_dir, image_paths[i + j]);
        cv::imwrite(save_path, vis_img);
......
      }
    }
  } else {
    PaddleX::DetResult result;
    cv::Mat im = cv::imread(FLAGS_image, 1);
    model.predict(im, &result);
    // Print the predicted bounding boxes
    for (int i = 0; i < result.boxes.size(); ++i) {
      std::cout << "image file: " << FLAGS_image << std::endl;
......
    // Visualization
    cv::Mat vis_img =
        PaddleX::Visualize(im, result, model.labels, FLAGS_threshold);
    std::string save_path =
        PaddleX::generate_save_path(FLAGS_save_dir, FLAGS_image);
    cv::imwrite(save_path, vis_img);
......
    std::cout << "Visualized output saved as " << save_path << std::endl;
  }
  return 0;
}
......
      FLAGS_gpu_id,
      FLAGS_key,
      FLAGS_use_ir_optim);

  int imgs = 1;
  // Run prediction
  if (FLAGS_image_list != "") {
    std::ifstream inf(FLAGS_image_list);
......
    }
    imgs = image_paths.size();
    for (int i = 0; i < image_paths.size(); i += FLAGS_batch_size) {
      int im_vec_size =
          std::min(static_cast<int>(image_paths.size()), i + FLAGS_batch_size);
      std::vector<cv::Mat> im_vec(im_vec_size - i);
......
      for (int j = i; j < im_vec_size; ++j) {
        im_vec[j - i] = std::move(cv::imread(image_paths[j], 1));
      }
      model.predict(im_vec, &results, thread_num);
      // Visualization
      for (int j = 0; j < im_vec_size - i; ++j) {
        cv::Mat vis_img =
            PaddleX::Visualize(im_vec[j], results[j], model.labels);
        std::string save_path =
            PaddleX::generate_save_path(FLAGS_save_dir, image_paths[i + j]);
        cv::imwrite(save_path, vis_img);
......
      }
    }
  } else {
    PaddleX::SegResult result;
    cv::Mat im = cv::imread(FLAGS_image, 1);
    model.predict(im, &result);
    // Visualization
    cv::Mat vis_img = PaddleX::Visualize(im, result, model.labels);
    std::string save_path =
        PaddleX::generate_save_path(FLAGS_save_dir, FLAGS_image);
    cv::imwrite(save_path, vis_img);
    result.clear();
    std::cout << "Visualized output saved as " << save_path << std::endl;
  }
  return 0;
}
......
        height_ = item["target_size"].as<std::vector<int>>()[1];
      }
    }
    if (item["im_padding_value"].IsDefined()) {
      im_value_ = item["im_padding_value"].as<std::vector<float>>();
    } else {
      im_value_ = {0, 0, 0};
    }
  }
  virtual bool Run(cv::Mat* im, ImageBlob* data);

......
  int coarsest_stride_ = -1;
  int width_ = 0;
  int height_ = 0;
  std::vector<float> im_value_;
};
/*
* @brief
......
......
 * @param img: initial image matrix
 * @param results: the detection result
 * @param labels: label map
 * @param threshold: minimum confidence to display
 * @return visualized image matrix
 * */
cv::Mat Visualize(const cv::Mat& img,
                  const DetResult& results,
                  const std::map<int, std::string>& labels,
                  float threshold = 0.5);
/*
......
 * @param img: initial image matrix
 * @param result: the segmentation result
 * @param labels: label map
 * @return visualized image matrix
 * */
cv::Mat Visualize(const cv::Mat& img,
                  const SegResult& result,
                  const std::map<int, std::string>& labels);
/*
* @brief
......
......
              << ", but they should be greater than 0." << std::endl;
    return false;
  }
  // Pad with the configured im_padding_value (defaults to {0, 0, 0}).
  cv::Scalar value = cv::Scalar(im_value_[0], im_value_[1], im_value_[2]);
  cv::copyMakeBorder(
      *im, *im, 0, padding_h, 0, padding_w, cv::BORDER_CONSTANT, value);
  data->new_im_size_[0] = im->rows;
  data->new_im_size_[1] = im->cols;
  return true;
......
......
cv::Mat Visualize(const cv::Mat& img,
                  const DetResult& result,
                  const std::map<int, std::string>& labels,
                  float threshold) {
  auto colormap = GenerateColorMap(labels.size());
  cv::Mat vis_img = img.clone();
  auto boxes = result.boxes;
  for (int i = 0; i < boxes.size(); ++i) {
......
cv::Mat Visualize(const cv::Mat& img,
                  const SegResult& result,
                  const std::map<int, std::string>& labels) {
  auto colormap = GenerateColorMap(labels.size());
  std::vector<uint8_t> label_map(result.label_map.data.begin(),
                                 result.label_map.data.end());
  cv::Mat mask(result.label_map.shape[0],
......
*.iml
.gradle
/local.properties
/.idea/caches
/.idea/libraries
/.idea/modules.xml
/.idea/workspace.xml
/.idea/navEditor.xml
/.idea/assetWizardSettings.xml
.DS_Store
/build
/captures
.externalNativeBuild
import java.security.MessageDigest
apply plugin: 'com.android.application'
android {
compileSdkVersion 28
defaultConfig {
applicationId "com.baidu.paddlex.lite.demo"
minSdkVersion 15
targetSdkVersion 28
versionCode 1
versionName "1.0"
testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}
}
}
dependencies {
implementation fileTree(include: ['*.aar'], dir: 'libs')
implementation 'com.android.support:appcompat-v7:28.0.0'
implementation 'com.android.support.constraint:constraint-layout:1.1.3'
implementation 'com.android.support:design:28.0.0'
testImplementation 'junit:junit:4.12'
androidTestImplementation 'com.android.support.test:runner:1.0.2'
androidTestImplementation 'com.android.support.test.espresso:espresso-core:3.0.2'
}
def paddlexAndroidSdk = 'https://bj.bcebos.com/paddlex/deploy/lite/paddlex_lite_11cbd50e.tar.gz'
task downloadAndExtractPaddleXAndroidSdk(type: DefaultTask) {
doFirst {
    println "Downloading and extracting PaddleX Android SDK"
}
doLast {
// Prepare cache folder for sdk
if (!file("cache").exists()) {
mkdir "cache"
}
// Generate cache name for sdk
MessageDigest messageDigest = MessageDigest.getInstance('MD5')
messageDigest.update(paddlexAndroidSdk.bytes)
String cacheName = new BigInteger(1, messageDigest.digest()).toString(32)
// Download sdk
if (!file("cache/${cacheName}.tar.gz").exists()) {
ant.get(src: paddlexAndroidSdk, dest: file("cache/${cacheName}.tar.gz"))
}
// Unpack sdk
copy {
from tarTree("cache/${cacheName}.tar.gz")
into "cache/${cacheName}"
}
// Copy sdk
if (!file("libs/paddlex.aar").exists()) {
copy {
from "cache/${cacheName}/paddlex.aar"
into "libs"
}
}
}
}
preBuild.dependsOn downloadAndExtractPaddleXAndroidSdk
def paddleXLiteModel = 'https://bj.bcebos.com/paddlex/deploy/lite/mobilenetv2_imagenet_lite2.6.1.tar.gz'
task downloadAndExtractPaddleXLiteModel(type: DefaultTask) {
doFirst {
    println "Downloading and extracting PaddleX Lite model"
}
doLast {
// Prepare cache folder for model
if (!file("cache").exists()) {
mkdir "cache"
}
// Generate cache name for model
MessageDigest messageDigest = MessageDigest.getInstance('MD5')
messageDigest.update(paddleXLiteModel.bytes)
String cacheName = new BigInteger(1, messageDigest.digest()).toString(32)
// Download model
if (!file("cache/${cacheName}.tar.gz").exists()) {
ant.get(src: paddleXLiteModel, dest: file("cache/${cacheName}.tar.gz"))
}
// Unpack model
copy {
from tarTree("cache/${cacheName}.tar.gz")
into "cache/${cacheName}"
}
// Copy model.nb
if (!file("src/main/assets/model/model.nb").exists()) {
copy {
from "cache/${cacheName}/model.nb"
into "src/main/assets/model/"
}
}
// Copy config file model.yml
if (!file("src/main/assets/config/model.yml").exists()) {
copy {
from "cache/${cacheName}/model.yml"
into "src/main/assets/config/"
}
}
// Copy test image test.jpg
if (!file("src/main/assets/images/test.jpg").exists()) {
copy {
from "cache/${cacheName}/test.jpg"
into "src/main/assets/images/"
}
}
}
}
preBuild.dependsOn downloadAndExtractPaddleXLiteModel
# Add project specific ProGuard rules here.
# You can control the set of applied configuration files using the
# proguardFiles setting in build.gradle.
#
# For more details, see
# http://developer.android.com/guide/developing/tools/proguard.html
# If your project uses WebView with JS, uncomment the following
# and specify the fully qualified class name to the JavaScript interface
# class:
#-keepclassmembers class fqcn.of.javascript.interface.for.webview {
# public *;
#}
# Uncomment this to preserve the line number information for
# debugging stack traces.
#-keepattributes SourceFile,LineNumberTable
# If you keep the line number information, uncomment this to
# hide the original source file name.
#-renamesourcefileattribute SourceFile
package com.baidu.paddlex.lite.demo;
import android.content.Context;
import android.content.res.AssetManager;
import android.support.test.InstrumentationRegistry;
import android.support.test.runner.AndroidJUnit4;
import com.baidu.paddlex.config.ConfigParser;
import org.junit.Test;
import org.junit.runner.RunWith;
import java.io.IOException;
import java.io.InputStream;
import static org.junit.Assert.assertEquals;
/**
* Instrumented test, which will execute on an Android device.
*
* @see <a href="http://d.android.com/tools/testing">Testing documentation</a>
*/
@RunWith(AndroidJUnit4.class)
public class ExampleInstrumentedTest {
@Test
public void useAppContext() throws IOException {
// Context of the app under test.
Context appContext = InstrumentationRegistry.getTargetContext();
AssetManager ass = appContext.getAssets();
assertEquals("com.baidu.paddlex.lite.demo", appContext.getPackageName());
}
}
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.baidu.paddlex.lite.demo">
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
<application
android:allowBackup="true"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true"
android:largeHeap="true"
android:theme="@style/AppTheme">
<activity android:name="com.baidu.paddlex.lite.demo.MainActivity">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
<activity
android:name="com.baidu.paddlex.lite.demo.SettingsActivity"
android:label="Settings"></activity>
</application>
</manifest>
// Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.baidu.paddlex.lite.demo;
import android.content.res.Configuration;
import android.os.Bundle;
import android.preference.PreferenceActivity;
import android.support.annotation.LayoutRes;
import android.support.annotation.Nullable;
import android.support.v7.app.ActionBar;
import android.support.v7.app.AppCompatDelegate;
import android.support.v7.widget.Toolbar;
import android.view.MenuInflater;
import android.view.View;
import android.view.ViewGroup;
/**
* A {@link android.preference.PreferenceActivity} which implements and proxies the necessary calls
* to be used with AppCompat.
* <p>
* This technique can be used with an {@link android.app.Activity} class, not just
* {@link android.preference.PreferenceActivity}.
*/
public abstract class AppCompatPreferenceActivity extends PreferenceActivity {
private AppCompatDelegate mDelegate;
@Override
protected void onCreate(Bundle savedInstanceState) {
getDelegate().installViewFactory();
getDelegate().onCreate(savedInstanceState);
super.onCreate(savedInstanceState);
}
@Override
protected void onPostCreate(Bundle savedInstanceState) {
super.onPostCreate(savedInstanceState);
getDelegate().onPostCreate(savedInstanceState);
}
public ActionBar getSupportActionBar() {
return getDelegate().getSupportActionBar();
}
public void setSupportActionBar(@Nullable Toolbar toolbar) {
getDelegate().setSupportActionBar(toolbar);
}
@Override
public MenuInflater getMenuInflater() {
return getDelegate().getMenuInflater();
}
@Override
public void setContentView(@LayoutRes int layoutResID) {
getDelegate().setContentView(layoutResID);
}
@Override
public void setContentView(View view) {
getDelegate().setContentView(view);
}
@Override
public void setContentView(View view, ViewGroup.LayoutParams params) {
getDelegate().setContentView(view, params);
}
@Override
public void addContentView(View view, ViewGroup.LayoutParams params) {
getDelegate().addContentView(view, params);
}
@Override
protected void onPostResume() {
super.onPostResume();
getDelegate().onPostResume();
}
@Override
protected void onTitleChanged(CharSequence title, int color) {
super.onTitleChanged(title, color);
getDelegate().setTitle(title);
}
@Override
public void onConfigurationChanged(Configuration newConfig) {
super.onConfigurationChanged(newConfig);
getDelegate().onConfigurationChanged(newConfig);
}
@Override
protected void onStop() {
super.onStop();
getDelegate().onStop();
}
@Override
protected void onDestroy() {
super.onDestroy();
getDelegate().onDestroy();
}
public void invalidateOptionsMenu() {
getDelegate().invalidateOptionsMenu();
}
private AppCompatDelegate getDelegate() {
if (mDelegate == null) {
mDelegate = AppCompatDelegate.create(this, null);
}
return mDelegate;
}
}
// Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.baidu.paddlex.lite.demo;
import android.Manifest;
import android.app.ProgressDialog;
import android.content.ContentResolver;
import android.content.Intent;
import android.content.SharedPreferences;
import android.content.pm.PackageManager;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import android.os.Bundle;
import android.os.Handler;
import android.os.HandlerThread;
import android.os.Message;
import android.preference.PreferenceManager;
import android.provider.MediaStore;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.support.v7.app.AppCompatActivity;
import android.text.method.ScrollingMovementMethod;
import android.util.Log;
import android.view.Menu;
import android.view.MenuInflater;
import android.view.MenuItem;
import android.view.View;
import android.widget.Button;
import android.widget.ImageView;
import android.widget.TextView;
import android.widget.Toast;
import com.baidu.paddlex.Predictor;
import com.baidu.paddlex.Utils;
import com.baidu.paddlex.config.ConfigParser;
import com.baidu.paddlex.postprocess.ClsResult;
import com.baidu.paddlex.postprocess.DetResult;
import com.baidu.paddlex.postprocess.SegResult;
import com.baidu.paddlex.visual.Visualize;
import org.opencv.core.Mat;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
public class MainActivity extends AppCompatActivity {
public static final int OPEN_GALLERY_REQUEST_CODE = 0;
public static final int TAKE_PHOTO_REQUEST_CODE = 1;
public static final int REQUEST_LOAD_MODEL = 0;
public static final int REQUEST_RUN_MODEL = 1;
public static final int RESPONSE_LOAD_MODEL_SUCCESSED = 0;
public static final int RESPONSE_LOAD_MODEL_FAILED = 1;
public static final int RESPONSE_RUN_MODEL_SUCCESSED = 2;
public static final int RESPONSE_RUN_MODEL_FAILED = 3;
private static final String TAG = MainActivity.class.getSimpleName();
protected ProgressDialog pbLoadModel = null;
protected ProgressDialog pbRunModel = null;
protected Handler receiver = null; // receive messages from worker thread
protected Handler sender = null; // send command to worker thread
protected HandlerThread worker = null; // worker thread to load&run model
protected TextView tvInputSetting;
protected ImageView ivInputImage;
protected TextView tvOutputResult;
protected TextView tvInferenceTime;
private Button predictButton;
protected String testImagePathFromAsset;
protected String testYamlPathFromAsset;
protected String testModelPathFromAsset;
// Predictor
protected Predictor predictor = new Predictor();
// model config
protected ConfigParser configParser = new ConfigParser();
// Visualize
protected Visualize visualize = new Visualize();
// Predict Mat of Opencv
protected Mat predictMat;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
receiver = new Handler() {
@Override
public void handleMessage(Message msg) {
switch (msg.what) {
case RESPONSE_LOAD_MODEL_SUCCESSED:
pbLoadModel.dismiss();
Toast.makeText(MainActivity.this, "Load model successfully!", Toast.LENGTH_SHORT).show();
break;
case RESPONSE_LOAD_MODEL_FAILED:
pbLoadModel.dismiss();
Toast.makeText(MainActivity.this, "Load model failed!", Toast.LENGTH_SHORT).show();
break;
case RESPONSE_RUN_MODEL_SUCCESSED:
pbRunModel.dismiss();
onRunModelSuccessed();
break;
case RESPONSE_RUN_MODEL_FAILED:
pbRunModel.dismiss();
Toast.makeText(MainActivity.this, "Run model failed!", Toast.LENGTH_SHORT).show();
onRunModelFailed();
break;
default:
break;
}
}
};
worker = new HandlerThread("Predictor Worker");
worker.start();
sender = new Handler(worker.getLooper()) {
public void handleMessage(Message msg) {
switch (msg.what) {
case REQUEST_LOAD_MODEL:
// load model and reload test image
if (onLoadModel()) {
receiver.sendEmptyMessage(RESPONSE_LOAD_MODEL_SUCCESSED);
} else {
receiver.sendEmptyMessage(RESPONSE_LOAD_MODEL_FAILED);
}
break;
case REQUEST_RUN_MODEL:
// run model if model is loaded
if (onRunModel()) {
receiver.sendEmptyMessage(RESPONSE_RUN_MODEL_SUCCESSED);
} else {
receiver.sendEmptyMessage(RESPONSE_RUN_MODEL_FAILED);
}
break;
default:
break;
}
}
};
tvInputSetting = findViewById(R.id.tv_input_setting);
ivInputImage = findViewById(R.id.iv_input_image);
predictButton = findViewById(R.id.iv_predict_button);
tvInferenceTime = findViewById(R.id.tv_inference_time);
tvOutputResult = findViewById(R.id.tv_output_result);
tvInputSetting.setMovementMethod(ScrollingMovementMethod.getInstance());
tvOutputResult.setMovementMethod(ScrollingMovementMethod.getInstance());
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
String image_path = sharedPreferences.getString(getString(R.string.IMAGE_PATH_KEY),
getString(R.string.IMAGE_PATH_DEFAULT));
Utils.initialOpencv();
loadTestImageFromAsset(image_path);
predictButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
if (predictor.isLoaded()) {
onLoadModelSuccessed();
}
}
});
}
public boolean onLoadModel() {
return predictor.init(configParser);
}
public boolean onRunModel() {
return predictor.isLoaded() && predictor.predict();
}
public void onRunModelFailed() {
}
public void loadModel() {
pbLoadModel = ProgressDialog.show(this, "", "Loading model...", false, false);
sender.sendEmptyMessage(REQUEST_LOAD_MODEL);
}
public void runModel() {
pbRunModel = ProgressDialog.show(this, "", "Running model...", false, false);
sender.sendEmptyMessage(REQUEST_RUN_MODEL);
}
public void onLoadModelSuccessed() {
if (predictMat != null && predictor.isLoaded()) {
int w = predictMat.width();
int h = predictMat.height();
int c = predictMat.channels();
predictor.setInputMat(predictMat);
runModel();
}
}
public void onRunModelSuccessed() {
// obtain results and update UI
tvInferenceTime.setText("Inference time: " + predictor.getInferenceTime() + " ms");
if (configParser.getModelType().equalsIgnoreCase("segmenter")) {
SegResult segResult = predictor.getSegResult();
Mat maskMat = visualize.draw(segResult, predictMat.clone(), predictor.getImageBlob(), 1);
Imgproc.cvtColor(maskMat, maskMat, Imgproc.COLOR_BGRA2RGBA);
Bitmap outputImage = Bitmap.createBitmap(maskMat.width(), maskMat.height(), Bitmap.Config.ARGB_8888);
org.opencv.android.Utils.matToBitmap(maskMat, outputImage);
if (outputImage != null) {
ivInputImage.setImageBitmap(outputImage);
}
} else if (configParser.getModelType().equalsIgnoreCase("detector")) {
DetResult detResult = predictor.getDetResult();
Mat roiMat = visualize.draw(detResult, predictMat.clone());
Imgproc.cvtColor(roiMat, roiMat, Imgproc.COLOR_BGR2RGB);
Bitmap outputImage = Bitmap.createBitmap(roiMat.width(),roiMat.height(), Bitmap.Config.ARGB_8888);
org.opencv.android.Utils.matToBitmap(roiMat,outputImage);
if (outputImage != null) {
ivInputImage.setImageBitmap(outputImage);
}
} else if (configParser.getModelType().equalsIgnoreCase("classifier")) {
ClsResult clsResult = predictor.getClsResult();
if (configParser.getLabeList().size() > 0) {
String outputResult = "Top1: " + clsResult.getCategory() + " - " + String.format("%.3f", clsResult.getScore());
tvOutputResult.setText(outputResult);
tvOutputResult.scrollTo(0, 0);
}
}
}
public void onMatChanged(Mat mat) {
this.predictMat = mat.clone();
}
public void onImageChanged(Bitmap image) {
ivInputImage.setImageBitmap(image);
tvOutputResult.setText("");
tvInferenceTime.setText("Inference time: -- ms");
}
public void onSettingsClicked() {
startActivity(new Intent(MainActivity.this, SettingsActivity.class));
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
MenuInflater inflater = getMenuInflater();
inflater.inflate(R.menu.menu_action_options, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
switch (item.getItemId()) {
case android.R.id.home:
finish();
break;
case R.id.open_gallery:
if (requestAllPermissions()) {
openGallery();
}
break;
case R.id.take_photo:
if (requestAllPermissions()) {
takePhoto();
}
break;
case R.id.settings:
if (requestAllPermissions()) {
// make sure we have SDCard r&w permissions to load model from SDCard
onSettingsClicked();
}
break;
}
return super.onOptionsItemSelected(item);
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions,
@NonNull int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (grantResults[0] != PackageManager.PERMISSION_GRANTED || grantResults[1] != PackageManager.PERMISSION_GRANTED) {
Toast.makeText(this, "Permission Denied", Toast.LENGTH_SHORT).show();
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (resultCode == RESULT_OK && data != null) {
switch (requestCode) {
case OPEN_GALLERY_REQUEST_CODE:
try {
ContentResolver resolver = getContentResolver();
Uri uri = data.getData();
Bitmap image = MediaStore.Images.Media.getBitmap(resolver, uri);
String[] proj = {MediaStore.Images.Media.DATA};
Cursor cursor = managedQuery(uri, proj, null, null, null);
cursor.moveToFirst();
int columnIndex = cursor.getColumnIndex(proj[0]);
String imgDecodableString = cursor.getString(columnIndex);
File file = new File(imgDecodableString);
Mat mat = Imgcodecs.imread(file.getAbsolutePath(),Imgcodecs.IMREAD_COLOR);
onImageChanged(image);
onMatChanged(mat);
} catch (IOException e) {
Log.e(TAG, e.toString());
}
break;
case TAKE_PHOTO_REQUEST_CODE:
Bitmap image = (Bitmap) data.getParcelableExtra("data");
Mat mat = new Mat();
org.opencv.android.Utils.bitmapToMat(image, mat);
Imgproc.cvtColor(mat, mat, Imgproc.COLOR_RGBA2BGR);
onImageChanged(image);
onMatChanged(mat);
break;
default:
break;
}
}
}
private boolean requestAllPermissions() {
if (ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE)
!= PackageManager.PERMISSION_GRANTED || ContextCompat.checkSelfPermission(this,
Manifest.permission.CAMERA)
!= PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.CAMERA},
0);
return false;
}
return true;
}
private void openGallery() {
Intent intent = new Intent(Intent.ACTION_PICK, null);
intent.setDataAndType(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, "image/*");
startActivityForResult(intent, OPEN_GALLERY_REQUEST_CODE);
}
private void takePhoto() {
Intent takePhotoIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
if (takePhotoIntent.resolveActivity(getPackageManager()) != null) {
startActivityForResult(takePhotoIntent, TAKE_PHOTO_REQUEST_CODE);
}
}
@Override
public boolean onPrepareOptionsMenu(Menu menu) {
boolean isLoaded = predictor.isLoaded();
menu.findItem(R.id.open_gallery).setEnabled(isLoaded);
menu.findItem(R.id.take_photo).setEnabled(isLoaded);
return super.onPrepareOptionsMenu(menu);
}
@Override
protected void onResume() {
Log.i(TAG, "begin onResume");
super.onResume();
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
boolean settingsChanged = false;
boolean testImageChanged = false;
String modelPath = sharedPreferences.getString(getString(R.string.MODEL_PATH_KEY),
getString(R.string.MODEL_PATH_DEFAULT));
settingsChanged |= !modelPath.equalsIgnoreCase(testModelPathFromAsset);
String yamlPath = sharedPreferences.getString(getString(R.string.YAML_PATH_KEY),
getString(R.string.YAML_PATH_DEFAULT));
settingsChanged |= !yamlPath.equalsIgnoreCase(testYamlPathFromAsset);
int cpuThreadNum = Integer.parseInt(sharedPreferences.getString(getString(R.string.CPU_THREAD_NUM_KEY),
getString(R.string.CPU_THREAD_NUM_DEFAULT)));
settingsChanged |= cpuThreadNum != configParser.getCpuThreadNum();
String cpuPowerMode = sharedPreferences.getString(getString(R.string.CPU_POWER_MODE_KEY),
getString(R.string.CPU_POWER_MODE_DEFAULT));
settingsChanged |= !cpuPowerMode.equalsIgnoreCase(configParser.getCpuPowerMode());
String imagePath = sharedPreferences.getString(getString(R.string.IMAGE_PATH_KEY),
getString(R.string.IMAGE_PATH_DEFAULT));
testImageChanged |= !imagePath.equalsIgnoreCase(testImagePathFromAsset);
testYamlPathFromAsset = yamlPath;
testModelPathFromAsset = modelPath;
if (settingsChanged) {
try {
String realModelPath = modelPath;
if (!modelPath.substring(0, 1).equals("/")) {
String modelFileName = Utils.getFileNameFromString(modelPath);
realModelPath = this.getCacheDir() + File.separator + modelFileName;
Utils.copyFileFromAssets(this, modelPath, realModelPath);
}
String realYamlPath = yamlPath;
if (!yamlPath.substring(0, 1).equals("/")) {
String yamlFileName = Utils.getFileNameFromString(yamlPath);
realYamlPath = this.getCacheDir() + File.separator + yamlFileName;
Utils.copyFileFromAssets(this, yamlPath, realYamlPath);
}
configParser.init(realModelPath, realYamlPath, cpuThreadNum, cpuPowerMode);
visualize.init(configParser.getNumClasses());
} catch (IOException e) {
e.printStackTrace();
Toast.makeText(MainActivity.this, "Load config failed!", Toast.LENGTH_SHORT).show();
}
// update UI
tvInputSetting.setText("Model: " + configParser.getModel()+ "\n" + "CPU" +
" Thread Num: " + Integer.toString(configParser.getCpuThreadNum()) + "\n" + "CPU Power Mode: " + configParser.getCpuPowerMode());
tvInputSetting.scrollTo(0, 0);
// reload model if configure has been changed
loadModel();
}
if (testImageChanged){
loadTestImageFromAsset(imagePath);
}
}
public void loadTestImageFromAsset(String imagePath){
if (imagePath.isEmpty()) {
return;
}
// If the path starts with '/', read the test image from the file system;
// otherwise read it from the app assets.
testImagePathFromAsset = imagePath;
if (!imagePath.substring(0, 1).equals("/")) {
InputStream imageStream = null;
try {
imageStream = getAssets().open(imagePath);
} catch (IOException e) {
e.printStackTrace();
}
onImageChanged(BitmapFactory.decodeStream(imageStream));
String realPath;
String imageFileName = Utils.getFileNameFromString(imagePath);
realPath = this.getCacheDir() + File.separator + imageFileName;
Utils.copyFileFromAssets(this, imagePath, realPath);
onMatChanged(Imgcodecs.imread(realPath, Imgcodecs.IMREAD_COLOR));
} else {
if (!new File(imagePath).exists()) {
return;
}
onMatChanged(Imgcodecs.imread(imagePath, Imgcodecs.IMREAD_COLOR));
onImageChanged( BitmapFactory.decodeFile(imagePath));
}
}
@Override
protected void onDestroy() {
if (predictor != null) {
predictor.releaseModel();
}
worker.quit();
super.onDestroy();
}
}
// Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.baidu.paddlex.lite.demo;
import com.baidu.paddlex.Utils;
import android.content.SharedPreferences;
import android.os.Bundle;
import android.preference.CheckBoxPreference;
import android.preference.EditTextPreference;
import android.preference.ListPreference;
import android.support.v7.app.ActionBar;
import java.util.ArrayList;
import java.util.List;
public class SettingsActivity extends AppCompatPreferenceActivity implements SharedPreferences.OnSharedPreferenceChangeListener {
ListPreference lpChoosePreInstalledModel = null;
CheckBoxPreference cbEnableCustomSettings = null;
EditTextPreference etModelPath = null;
EditTextPreference etYamlPath = null;
EditTextPreference etImagePath = null;
ListPreference lpCPUThreadNum = null;
ListPreference lpCPUPowerMode = null;
List<String> preInstalledModelPaths = null;
List<String> preInstalledYamlPaths = null;
List<String> preInstalledImagePaths = null;
List<String> preInstalledCPUThreadNums = null;
List<String> preInstalledCPUPowerModes = null;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
addPreferencesFromResource(R.xml.settings);
ActionBar supportActionBar = getSupportActionBar();
if (supportActionBar != null) {
supportActionBar.setDisplayHomeAsUpEnabled(true);
}
// initialized pre-installed models
preInstalledModelPaths = new ArrayList<String>();
preInstalledYamlPaths = new ArrayList<String>();
preInstalledImagePaths = new ArrayList<String>();
preInstalledCPUThreadNums = new ArrayList<String>();
preInstalledCPUPowerModes = new ArrayList<String>();
preInstalledModelPaths.add(getString(R.string.MODEL_PATH_DEFAULT));
preInstalledYamlPaths.add(getString(R.string.YAML_PATH_DEFAULT));
preInstalledImagePaths.add(getString(R.string.IMAGE_PATH_DEFAULT));
preInstalledCPUThreadNums.add(getString(R.string.CPU_THREAD_NUM_DEFAULT));
preInstalledCPUPowerModes.add(getString(R.string.CPU_POWER_MODE_DEFAULT));
// initialize UI components
lpChoosePreInstalledModel =
(ListPreference) findPreference(getString(R.string.CHOOSE_PRE_INSTALLED_MODEL_KEY));
String[] preInstalledModelNames = new String[preInstalledModelPaths.size()];
for (int i = 0; i < preInstalledModelPaths.size(); i++) {
preInstalledModelNames[i] =
preInstalledModelPaths.get(i).substring(preInstalledModelPaths.get(i).lastIndexOf("/") + 1);
}
lpChoosePreInstalledModel.setEntries(preInstalledModelNames);
lpChoosePreInstalledModel.setEntryValues(preInstalledModelPaths.toArray(new String[preInstalledModelPaths.size()]));
cbEnableCustomSettings =
(CheckBoxPreference) findPreference(getString(R.string.ENABLE_CUSTOM_SETTINGS_KEY));
etModelPath = (EditTextPreference) findPreference(getString(R.string.MODEL_PATH_KEY));
etModelPath.setTitle("Model Path (SDCard: " + Utils.getSDCardDirectory() + ")");
etYamlPath = (EditTextPreference) findPreference(getString(R.string.YAML_PATH_KEY));
etImagePath = (EditTextPreference) findPreference(getString(R.string.IMAGE_PATH_KEY));
lpCPUThreadNum =
(ListPreference) findPreference(getString(R.string.CPU_THREAD_NUM_KEY));
lpCPUPowerMode =
(ListPreference) findPreference(getString(R.string.CPU_POWER_MODE_KEY));
}
private void reloadPreferenceAndUpdateUI() {
SharedPreferences sharedPreferences = getPreferenceScreen().getSharedPreferences();
boolean enableCustomSettings =
sharedPreferences.getBoolean(getString(R.string.ENABLE_CUSTOM_SETTINGS_KEY), false);
String modelPath = sharedPreferences.getString(getString(R.string.CHOOSE_PRE_INSTALLED_MODEL_KEY),
getString(R.string.MODEL_PATH_DEFAULT));
int modelIdx = lpChoosePreInstalledModel.findIndexOfValue(modelPath);
if (modelIdx >= 0 && modelIdx < preInstalledModelPaths.size()) {
if (!enableCustomSettings) {
SharedPreferences.Editor editor = sharedPreferences.edit();
editor.putString(getString(R.string.MODEL_PATH_KEY), preInstalledModelPaths.get(modelIdx));
editor.putString(getString(R.string.YAML_PATH_KEY), preInstalledYamlPaths.get(modelIdx));
editor.putString(getString(R.string.IMAGE_PATH_KEY), preInstalledImagePaths.get(modelIdx));
editor.putString(getString(R.string.CPU_THREAD_NUM_KEY), preInstalledCPUThreadNums.get(modelIdx));
editor.putString(getString(R.string.CPU_POWER_MODE_KEY), preInstalledCPUPowerModes.get(modelIdx));
editor.commit();
}
lpChoosePreInstalledModel.setSummary(modelPath);
}
cbEnableCustomSettings.setChecked(enableCustomSettings);
etModelPath.setEnabled(enableCustomSettings);
etYamlPath.setEnabled(enableCustomSettings);
etImagePath.setEnabled(enableCustomSettings);
lpCPUThreadNum.setEnabled(enableCustomSettings);
lpCPUPowerMode.setEnabled(enableCustomSettings);
modelPath = sharedPreferences.getString(getString(R.string.MODEL_PATH_KEY),
getString(R.string.MODEL_PATH_DEFAULT));
String yamlPath = sharedPreferences.getString(getString(R.string.YAML_PATH_KEY),
getString(R.string.YAML_PATH_DEFAULT));
String imagePath = sharedPreferences.getString(getString(R.string.IMAGE_PATH_KEY),
getString(R.string.IMAGE_PATH_DEFAULT));
String cpuThreadNum = sharedPreferences.getString(getString(R.string.CPU_THREAD_NUM_KEY),
getString(R.string.CPU_THREAD_NUM_DEFAULT));
String cpuPowerMode = sharedPreferences.getString(getString(R.string.CPU_POWER_MODE_KEY),
getString(R.string.CPU_POWER_MODE_DEFAULT));
etModelPath.setSummary(modelPath);
etModelPath.setText(modelPath);
etYamlPath.setSummary(yamlPath);
etYamlPath.setText(yamlPath);
etImagePath.setSummary(imagePath);
etImagePath.setText(imagePath);
lpCPUThreadNum.setValue(cpuThreadNum);
lpCPUThreadNum.setSummary(cpuThreadNum);
lpCPUPowerMode.setValue(cpuPowerMode);
lpCPUPowerMode.setSummary(cpuPowerMode);
}
@Override
protected void onResume() {
super.onResume();
getPreferenceScreen().getSharedPreferences().registerOnSharedPreferenceChangeListener(this);
reloadPreferenceAndUpdateUI();
}
@Override
protected void onPause() {
super.onPause();
getPreferenceScreen().getSharedPreferences().unregisterOnSharedPreferenceChangeListener(this);
}
@Override
public void onSharedPreferenceChanged(SharedPreferences sharedPreferences, String key) {
if (key.equals(getString(R.string.CHOOSE_PRE_INSTALLED_MODEL_KEY))) {
SharedPreferences.Editor editor = sharedPreferences.edit();
editor.putBoolean(getString(R.string.ENABLE_CUSTOM_SETTINGS_KEY), false);
editor.commit();
}
reloadPreferenceAndUpdateUI();
}
}
<vector xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:aapt="http://schemas.android.com/aapt"
android:width="108dp"
android:height="108dp"
android:viewportWidth="108"
android:viewportHeight="108">
<path
android:fillType="evenOdd"
android:pathData="M32,64C32,64 38.39,52.99 44.13,50.95C51.37,48.37 70.14,49.57 70.14,49.57L108.26,87.69L108,109.01L75.97,107.97L32,64Z"
android:strokeWidth="1"
android:strokeColor="#00000000">
<aapt:attr name="android:fillColor">
<gradient
android:endX="78.5885"
android:endY="90.9159"
android:startX="48.7653"
android:startY="61.0927"
android:type="linear">
<item
android:color="#44000000"
android:offset="0.0" />
<item
android:color="#00000000"
android:offset="1.0" />
</gradient>
</aapt:attr>
</path>
<path
android:fillColor="#FFFFFF"
android:fillType="nonZero"
android:pathData="M66.94,46.02L66.94,46.02C72.44,50.07 76,56.61 76,64L32,64C32,56.61 35.56,50.11 40.98,46.06L36.18,41.19C35.45,40.45 35.45,39.3 36.18,38.56C36.91,37.81 38.05,37.81 38.78,38.56L44.25,44.05C47.18,42.57 50.48,41.71 54,41.71C57.48,41.71 60.78,42.57 63.68,44.05L69.11,38.56C69.84,37.81 70.98,37.81 71.71,38.56C72.44,39.3 72.44,40.45 71.71,41.19L66.94,46.02ZM62.94,56.92C64.08,56.92 65,56.01 65,54.88C65,53.76 64.08,52.85 62.94,52.85C61.8,52.85 60.88,53.76 60.88,54.88C60.88,56.01 61.8,56.92 62.94,56.92ZM45.06,56.92C46.2,56.92 47.13,56.01 47.13,54.88C47.13,53.76 46.2,52.85 45.06,52.85C43.92,52.85 43,53.76 43,54.88C43,56.01 43.92,56.92 45.06,56.92Z"
android:strokeWidth="1"
android:strokeColor="#00000000" />
</vector>
<?xml version="1.0" encoding="utf-8"?>
<vector xmlns:android="http://schemas.android.com/apk/res/android"
android:width="108dp"
android:height="108dp"
android:viewportWidth="108"
android:viewportHeight="108">
<path
android:fillColor="#008577"
android:pathData="M0,0h108v108h-108z" />
<path
android:fillColor="#00000000"
android:pathData="M9,0L9,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,0L19,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M29,0L29,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M39,0L39,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M49,0L49,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M59,0L59,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M69,0L69,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M79,0L79,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M89,0L89,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M99,0L99,108"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,9L108,9"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,19L108,19"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,29L108,29"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,39L108,39"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,49L108,49"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,59L108,59"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,69L108,69"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,79L108,79"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,89L108,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M0,99L108,99"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,29L89,29"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,39L89,39"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,49L89,49"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,59L89,59"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,69L89,69"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M19,79L89,79"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M29,19L29,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M39,19L39,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M49,19L49,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M59,19L59,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M69,19L69,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
<path
android:fillColor="#00000000"
android:pathData="M79,19L79,89"
android:strokeWidth="0.8"
android:strokeColor="#33FFFFFF" />
</vector>
<?xml version="1.0" encoding="utf-8"?>
<android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context="com.baidu.paddlex.lite.demo.MainActivity">
<RelativeLayout
android:layout_width="match_parent"
android:layout_height="match_parent">
<LinearLayout
android:id="@+id/v_input_info"
android:layout_width="fill_parent"
android:layout_height="wrap_content"
android:layout_alignParentTop="true"
android:orientation="vertical">
<TextView
android:id="@+id/tv_input_setting"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginLeft="12dp"
android:layout_marginTop="10dp"
android:layout_marginRight="12dp"
android:layout_marginBottom="5dp"
android:lineSpacingExtra="4dp"
android:maxLines="6"
android:scrollbars="vertical"
android:singleLine="false"
android:text="" />
</LinearLayout>
<RelativeLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_above="@+id/v_output_info"
android:layout_below="@+id/v_input_info">
<ImageView
android:id="@+id/iv_input_image"
android:layout_width="400dp"
android:layout_height="400dp"
android:layout_centerInParent="true"
android:layout_marginLeft="12dp"
android:layout_marginTop="5dp"
android:layout_marginRight="12dp"
android:layout_marginBottom="5dp"
android:adjustViewBounds="true"
android:scaleType="fitCenter" />
</RelativeLayout>
<RelativeLayout
android:id="@+id/v_output_info"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_centerHorizontal="true">
<TextView
android:id="@+id/tv_output_result"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentTop="true"
android:layout_centerHorizontal="true"
android:layout_centerVertical="true"
android:layout_marginLeft="12dp"
android:layout_marginTop="5dp"
android:layout_marginRight="12dp"
android:layout_marginBottom="5dp"
android:lineSpacingExtra="5dp"
android:maxLines="5"
android:scrollbars="vertical"
android:singleLine="false"
android:text=""
android:textAlignment="center" />
<TextView
android:id="@+id/tv_inference_time"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@+id/tv_output_result"
android:layout_centerHorizontal="true"
android:layout_centerVertical="true"
android:layout_marginLeft="12dp"
android:layout_marginTop="5dp"
android:layout_marginRight="12dp"
android:layout_marginBottom="10dp"
android:text=""
android:textAlignment="center" />
<Button
android:id="@+id/iv_predict_button"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@+id/tv_inference_time"
android:layout_centerHorizontal="true"
android:layout_centerVertical="true"
android:layout_marginLeft="12dp"
android:layout_marginTop="5dp"
android:layout_marginRight="12dp"
android:layout_marginBottom="10dp"
android:text="Predict"
android:textAlignment="center" />
</RelativeLayout>
</RelativeLayout>
</android.support.constraint.ConstraintLayout>
<menu xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto">
<group android:id="@+id/pick_image">
<item
android:id="@+id/open_gallery"
android:title="Open Gallery"
app:showAsAction="withText" />
<item
android:id="@+id/take_photo"
android:title="Take Photo"
app:showAsAction="withText" />
</group>
<group>
<item
android:id="@+id/settings"
android:title="Settings..."
app:showAsAction="withText" />
</group>
</menu>
<?xml version="1.0" encoding="utf-8"?>
<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
<background android:drawable="@drawable/ic_launcher_background" />
<foreground android:drawable="@drawable/ic_launcher_foreground" />
</adaptive-icon>
<?xml version="1.0" encoding="utf-8"?>
<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
<background android:drawable="@drawable/ic_launcher_background" />
<foreground android:drawable="@drawable/ic_launcher_foreground" />
</adaptive-icon>
<?xml version="1.0" encoding="utf-8"?>
<resources>
<string-array name="cpu_thread_num_entries">
<item>1 threads</item>
<item>2 threads</item>
<item>4 threads</item>
<item>8 threads</item>
</string-array>
<string-array name="cpu_thread_num_values">
<item>1</item>
<item>2</item>
<item>4</item>
<item>8</item>
</string-array>
<string-array name="cpu_power_mode_entries">
<item>HIGH(only big cores)</item>
<item>LOW(only LITTLE cores)</item>
<item>FULL(all cores)</item>
<item>NO_BIND(depends on system)</item>
<item>RAND_HIGH</item>
<item>RAND_LOW</item>
</string-array>
<string-array name="cpu_power_mode_values">
<item>LITE_POWER_HIGH</item>
<item>LITE_POWER_LOW</item>
<item>LITE_POWER_FULL</item>
<item>LITE_POWER_NO_BIND</item>
<item>LITE_POWER_RAND_HIGH</item>
<item>LITE_POWER_RAND_LOW</item>
</string-array>
<string-array name="input_color_format_entries">
<item>BGR color format</item>
<item>RGB color format</item>
</string-array>
<string-array name="input_color_format_values">
<item>BGR</item>
<item>RGB</item>
</string-array>
</resources>
<?xml version="1.0" encoding="utf-8"?>
<resources>
<color name="colorPrimary">#008577</color>
<color name="colorPrimaryDark">#00574B</color>
<color name="colorAccent">#D81B60</color>
</resources>
<resources>
<string name="app_name">PaddleX Demo</string>
<!-- settings -->
<string name="CHOOSE_PRE_INSTALLED_MODEL_KEY">CHOOSE_PRE_INSTALLED_MODEL_KEY</string>
<string name="ENABLE_CUSTOM_SETTINGS_KEY">ENABLE_CUSTOM_SETTINGS_KEY</string>
<string name="MODEL_PATH_KEY">MODEL_PATH_KEY</string>
<string name="YAML_PATH_KEY">YAML_PATH_KEY</string>
<string name="IMAGE_PATH_KEY">IMAGE_PATH_KEY</string>
<string name="CPU_POWER_MODE_KEY">CPU_POWER_MODE_KEY</string>
<string name="CPU_THREAD_NUM_KEY">CPU_THREAD_NUM_KEY</string>
<string name="MODEL_PATH_DEFAULT">model/model.nb</string>
<string name="YAML_PATH_DEFAULT">config/model.yml</string>
<string name="IMAGE_PATH_DEFAULT">images/test.jpg</string>
<string name="CPU_THREAD_NUM_DEFAULT">1</string>
<string name="CPU_POWER_MODE_DEFAULT">LITE_POWER_HIGH</string>
</resources>
<resources>
<!-- Base application theme. -->
<style name="AppTheme" parent="Theme.AppCompat.Light.DarkActionBar">
<!-- Customize your theme here. -->
<item name="colorPrimary">@color/colorPrimary</item>
<item name="colorPrimaryDark">@color/colorPrimaryDark</item>
<item name="colorAccent">@color/colorAccent</item>
<item name="actionOverflowMenuStyle">@style/OverflowMenuStyle</item>
</style>
<style name="OverflowMenuStyle" parent="Widget.AppCompat.Light.PopupMenu.Overflow">
<item name="overlapAnchor">false</item>
</style>
<style name="AppTheme.NoActionBar">
<item name="windowActionBar">false</item>
<item name="windowNoTitle">true</item>
</style>
<style name="AppTheme.AppBarOverlay" parent="ThemeOverlay.AppCompat.Dark.ActionBar" />
<style name="AppTheme.PopupOverlay" parent="ThemeOverlay.AppCompat.Light" />
</resources>
<?xml version="1.0" encoding="utf-8"?>
<PreferenceScreen xmlns:android="http://schemas.android.com/apk/res/android">
<PreferenceCategory android:title="Model Settings">
<ListPreference
android:defaultValue="@string/MODEL_PATH_DEFAULT"
android:key="@string/CHOOSE_PRE_INSTALLED_MODEL_KEY"
android:negativeButtonText="@null"
android:positiveButtonText="@null"
android:title="Choose pre-installed models" />
<CheckBoxPreference
android:defaultValue="false"
android:key="@string/ENABLE_CUSTOM_SETTINGS_KEY"
android:summaryOff="Disable"
android:summaryOn="Enable"
android:title="Enable custom settings" />
<EditTextPreference
android:defaultValue="@string/MODEL_PATH_DEFAULT"
android:key="@string/MODEL_PATH_KEY"
android:title="Model Path" />
<EditTextPreference
android:defaultValue="@string/YAML_PATH_DEFAULT"
android:key="@string/YAML_PATH_KEY"
android:title="Yaml Path" />
</PreferenceCategory>
<PreferenceCategory android:title="Image Settings">
<EditTextPreference
android:defaultValue="@string/IMAGE_PATH_DEFAULT"
android:key="@string/IMAGE_PATH_KEY"
android:title="Image Path" />
</PreferenceCategory>
<PreferenceCategory android:title="CPU Settings">
<ListPreference
android:defaultValue="@string/CPU_THREAD_NUM_DEFAULT"
android:entries="@array/cpu_thread_num_entries"
android:entryValues="@array/cpu_thread_num_values"
android:key="@string/CPU_THREAD_NUM_KEY"
android:negativeButtonText="@null"
android:positiveButtonText="@null"
android:title="CPU Thread Num" />
<ListPreference
android:defaultValue="@string/CPU_POWER_MODE_DEFAULT"
android:entries="@array/cpu_power_mode_entries"
android:entryValues="@array/cpu_power_mode_values"
android:key="@string/CPU_POWER_MODE_KEY"
android:negativeButtonText="@null"
android:positiveButtonText="@null"
android:title="CPU Power Mode" />
</PreferenceCategory>
</PreferenceScreen>
package com.baidu.paddlex.lite.demo;
import org.junit.Test;
import static org.junit.Assert.assertEquals;
/**
* Example local unit test, which will execute on the development machine (host).
*
* @see <a href="http://d.android.com/tools/testing">Testing documentation</a>
*/
public class ExampleUnitTest {
@Test
public void addition_isCorrect() {
assertEquals(4, 2 + 2);
}
}
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
repositories {
google()
jcenter()
}
dependencies {
classpath 'com.android.tools.build:gradle:3.4.0'
// NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files
}
}
allprojects {
repositories {
google()
jcenter()
}
}
task clean(type: Delete) {
delete rootProject.buildDir
}
# Project-wide Gradle settings.
# IDE (e.g. Android Studio) users:
# Gradle settings configured through the IDE *will override*
# any settings specified in this file.
# For more details on how to configure your build environment visit
# http://www.gradle.org/docs/current/userguide/build_environment.html
# Specifies the JVM arguments used for the daemon process.
# The setting is particularly useful for tweaking memory settings.
org.gradle.jvmargs=-Xmx1536m
# When configured, Gradle will run in incubating parallel mode.
# This option should only be used with decoupled projects. More details, visit
# http://www.gradle.org/docs/current/userguide/multi_project_builds.html#sec:decoupled_projects
# org.gradle.parallel=true
#Thu Aug 22 15:05:37 CST 2019
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-5.1.1-all.zip
#!/usr/bin/env sh
##############################################################################
##
## Gradle start up script for UN*X
##
##############################################################################
# Attempt to set APP_HOME
# Resolve links: $0 may be a link
PRG="$0"
# Need this for relative symlinks.
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG=`dirname "$PRG"`"/$link"
fi
done
SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >/dev/null
APP_HOME="`pwd -P`"
cd "$SAVED" >/dev/null
APP_NAME="Gradle"
APP_BASE_NAME=`basename "$0"`
# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
DEFAULT_JVM_OPTS=""
# Use the maximum available, or set MAX_FD != -1 to use that value.
MAX_FD="maximum"
warn () {
echo "$*"
}
die () {
echo
echo "$*"
echo
exit 1
}
# OS specific support (must be 'true' or 'false').
cygwin=false
msys=false
darwin=false
nonstop=false
case "`uname`" in
CYGWIN* )
cygwin=true
;;
Darwin* )
darwin=true
;;
MINGW* )
msys=true
;;
NONSTOP* )
nonstop=true
;;
esac
CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar
# Determine the Java command to use to start the JVM.
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
if [ ! -x "$JAVACMD" ] ; then
die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
else
JAVACMD="java"
which java >/dev/null 2>&1 || die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
# Increase the maximum file descriptors if we can.
if [ "$cygwin" = "false" -a "$darwin" = "false" -a "$nonstop" = "false" ] ; then
MAX_LIMIT=`ulimit -H -n`
if [ $? -eq 0 ] ; then
if [ "$MAX_FD" = "maximum" -o "$MAX_FD" = "max" ] ; then
MAX_FD="$MAX_LIMIT"
fi
ulimit -n $MAX_FD
if [ $? -ne 0 ] ; then
warn "Could not set maximum file descriptor limit: $MAX_FD"
fi
else
warn "Could not query maximum file descriptor limit: $MAX_LIMIT"
fi
fi
# For Darwin, add options to specify how the application appears in the dock
if $darwin; then
GRADLE_OPTS="$GRADLE_OPTS \"-Xdock:name=$APP_NAME\" \"-Xdock:icon=$APP_HOME/media/gradle.icns\""
fi
# For Cygwin, switch paths to Windows format before running java
if $cygwin ; then
APP_HOME=`cygpath --path --mixed "$APP_HOME"`
CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
JAVACMD=`cygpath --unix "$JAVACMD"`
# We build the pattern for arguments to be converted via cygpath
ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`
SEP=""
for dir in $ROOTDIRSRAW ; do
ROOTDIRS="$ROOTDIRS$SEP$dir"
SEP="|"
done
OURCYGPATTERN="(^($ROOTDIRS))"
# Add a user-defined pattern to the cygpath arguments
if [ "$GRADLE_CYGPATTERN" != "" ] ; then
OURCYGPATTERN="$OURCYGPATTERN|($GRADLE_CYGPATTERN)"
fi
# Now convert the arguments - kludge to limit ourselves to /bin/sh
i=0
for arg in "$@" ; do
CHECK=`echo "$arg"|egrep -c "$OURCYGPATTERN" -`
CHECK2=`echo "$arg"|egrep -c "^-"` ### Determine if an option
if [ $CHECK -ne 0 ] && [ $CHECK2 -eq 0 ] ; then ### Added a condition
eval `echo args$i`=`cygpath --path --ignore --mixed "$arg"`
else
eval `echo args$i`="\"$arg\""
fi
i=$((i+1))
done
case $i in
(0) set -- ;;
(1) set -- "$args0" ;;
(2) set -- "$args0" "$args1" ;;
(3) set -- "$args0" "$args1" "$args2" ;;
(4) set -- "$args0" "$args1" "$args2" "$args3" ;;
(5) set -- "$args0" "$args1" "$args2" "$args3" "$args4" ;;
(6) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" ;;
(7) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" ;;
(8) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" ;;
(9) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" "$args8" ;;
esac
fi
# Escape application args
save () {
for i do printf %s\\n "$i" | sed "s/'/'\\\\''/g;1s/^/'/;\$s/\$/' \\\\/" ; done
echo " "
}
APP_ARGS=$(save "$@")
# Collect all arguments for the java command, following the shell quoting and substitution rules
eval set -- $DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS "\"-Dorg.gradle.appname=$APP_BASE_NAME\"" -classpath "\"$CLASSPATH\"" org.gradle.wrapper.GradleWrapperMain "$APP_ARGS"
# by default we should be in the correct project dir, but when run from Finder on Mac, the cwd is wrong
if [ "$(uname)" = "Darwin" ] && [ "$HOME" = "$PWD" ]; then
cd "$(dirname "$0")"
fi
exec "$JAVACMD" "$@"
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem
@rem Gradle startup script for Windows
@rem
@rem ##########################################################################
@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal
set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS=
@rem Find java.exe
if defined JAVA_HOME goto findJavaFromJavaHome
set JAVA_EXE=java.exe
%JAVA_EXE% -version >NUL 2>&1
if "%ERRORLEVEL%" == "0" goto init
echo.
echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:findJavaFromJavaHome
set JAVA_HOME=%JAVA_HOME:"=%
set JAVA_EXE=%JAVA_HOME%/bin/java.exe
if exist "%JAVA_EXE%" goto init
echo.
echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME%
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:init
@rem Get command-line arguments, handling Windows variants
if not "%OS%" == "Windows_NT" goto win9xME_args
:win9xME_args
@rem Slurp the command line arguments.
set CMD_LINE_ARGS=
set _SKIP=2
:win9xME_args_slurp
if "x%~1" == "x" goto execute
set CMD_LINE_ARGS=%*
:execute
@rem Setup the command line
set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
@rem Execute Gradle
"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %CMD_LINE_ARGS%
:end
@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd
:fail
rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
rem the _cmd.exe /c_ return code!
if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1
exit /b 1
:mainEnd
if "%OS%"=="Windows_NT" endlocal
:omega
ECLIPSE ANDROID PROJECT IMPORT SUMMARY
======================================
Ignored Files:
--------------
The following files were *not* copied into the new Gradle project; you
should evaluate whether these are still needed in your project and if
so manually move them:
* javadoc/
* javadoc/allclasses-frame.html
* javadoc/allclasses-noframe.html
* javadoc/constant-values.html
* javadoc/help-doc.html
* javadoc/index-all.html
* javadoc/index.html
* javadoc/org/
* javadoc/org/opencv/
* javadoc/org/opencv/android/
* javadoc/org/opencv/android/BaseLoaderCallback.html
* javadoc/org/opencv/android/Camera2Renderer.html
* javadoc/org/opencv/android/CameraBridgeViewBase.CvCameraViewFrame.html
* javadoc/org/opencv/android/CameraBridgeViewBase.CvCameraViewListener.html
* javadoc/org/opencv/android/CameraBridgeViewBase.CvCameraViewListener2.html
* javadoc/org/opencv/android/CameraBridgeViewBase.ListItemAccessor.html
* javadoc/org/opencv/android/CameraBridgeViewBase.html
* javadoc/org/opencv/android/CameraGLRendererBase.html
* javadoc/org/opencv/android/CameraGLSurfaceView.CameraTextureListener.html
* javadoc/org/opencv/android/CameraGLSurfaceView.html
* javadoc/org/opencv/android/CameraRenderer.html
* javadoc/org/opencv/android/FpsMeter.html
* javadoc/org/opencv/android/InstallCallbackInterface.html
* javadoc/org/opencv/android/JavaCamera2View.html
* javadoc/org/opencv/android/JavaCameraView.JavaCameraSizeAccessor.html
* javadoc/org/opencv/android/JavaCameraView.html
* javadoc/org/opencv/android/LoaderCallbackInterface.html
* javadoc/org/opencv/android/OpenCVLoader.html
* javadoc/org/opencv/android/Utils.html
* javadoc/org/opencv/android/package-frame.html
* javadoc/org/opencv/android/package-summary.html
* javadoc/org/opencv/android/package-tree.html
* javadoc/org/opencv/calib3d/
* javadoc/org/opencv/calib3d/Calib3d.html
* javadoc/org/opencv/calib3d/StereoBM.html
* javadoc/org/opencv/calib3d/StereoMatcher.html
* javadoc/org/opencv/calib3d/StereoSGBM.html
* javadoc/org/opencv/calib3d/package-frame.html
* javadoc/org/opencv/calib3d/package-summary.html
* javadoc/org/opencv/calib3d/package-tree.html
* javadoc/org/opencv/core/
* javadoc/org/opencv/core/Algorithm.html
* javadoc/org/opencv/core/Core.MinMaxLocResult.html
* javadoc/org/opencv/core/Core.html
* javadoc/org/opencv/core/CvException.html
* javadoc/org/opencv/core/CvType.html
* javadoc/org/opencv/core/DMatch.html
* javadoc/org/opencv/core/KeyPoint.html
* javadoc/org/opencv/core/Mat.html
* javadoc/org/opencv/core/MatOfByte.html
* javadoc/org/opencv/core/MatOfDMatch.html
* javadoc/org/opencv/core/MatOfDouble.html
* javadoc/org/opencv/core/MatOfFloat.html
* javadoc/org/opencv/core/MatOfFloat4.html
* javadoc/org/opencv/core/MatOfFloat6.html
* javadoc/org/opencv/core/MatOfInt.html
* javadoc/org/opencv/core/MatOfInt4.html
* javadoc/org/opencv/core/MatOfKeyPoint.html
* javadoc/org/opencv/core/MatOfPoint.html
* javadoc/org/opencv/core/MatOfPoint2f.html
* javadoc/org/opencv/core/MatOfPoint3.html
* javadoc/org/opencv/core/MatOfPoint3f.html
* javadoc/org/opencv/core/MatOfRect.html
* javadoc/org/opencv/core/MatOfRect2d.html
* javadoc/org/opencv/core/MatOfRotatedRect.html
* javadoc/org/opencv/core/Point.html
* javadoc/org/opencv/core/Point3.html
* javadoc/org/opencv/core/Range.html
* javadoc/org/opencv/core/Rect.html
* javadoc/org/opencv/core/Rect2d.html
* javadoc/org/opencv/core/RotatedRect.html
* javadoc/org/opencv/core/Scalar.html
* javadoc/org/opencv/core/Size.html
* javadoc/org/opencv/core/TermCriteria.html
* javadoc/org/opencv/core/TickMeter.html
* javadoc/org/opencv/core/package-frame.html
* javadoc/org/opencv/core/package-summary.html
* javadoc/org/opencv/core/package-tree.html
* javadoc/org/opencv/dnn/
* javadoc/org/opencv/dnn/DictValue.html
* javadoc/org/opencv/dnn/Dnn.html
* javadoc/org/opencv/dnn/Layer.html
* javadoc/org/opencv/dnn/Net.html
* javadoc/org/opencv/dnn/package-frame.html
* javadoc/org/opencv/dnn/package-summary.html
* javadoc/org/opencv/dnn/package-tree.html
* javadoc/org/opencv/features2d/
* javadoc/org/opencv/features2d/AKAZE.html
* javadoc/org/opencv/features2d/AgastFeatureDetector.html
* javadoc/org/opencv/features2d/BFMatcher.html
* javadoc/org/opencv/features2d/BOWImgDescriptorExtractor.html
* javadoc/org/opencv/features2d/BOWKMeansTrainer.html
* javadoc/org/opencv/features2d/BOWTrainer.html
* javadoc/org/opencv/features2d/BRISK.html
* javadoc/org/opencv/features2d/DescriptorMatcher.html
* javadoc/org/opencv/features2d/FastFeatureDetector.html
* javadoc/org/opencv/features2d/Feature2D.html
* javadoc/org/opencv/features2d/Features2d.html
* javadoc/org/opencv/features2d/FlannBasedMatcher.html
* javadoc/org/opencv/features2d/GFTTDetector.html
* javadoc/org/opencv/features2d/KAZE.html
* javadoc/org/opencv/features2d/MSER.html
* javadoc/org/opencv/features2d/ORB.html
* javadoc/org/opencv/features2d/Params.html
* javadoc/org/opencv/features2d/package-frame.html
* javadoc/org/opencv/features2d/package-summary.html
* javadoc/org/opencv/features2d/package-tree.html
* javadoc/org/opencv/imgcodecs/
* javadoc/org/opencv/imgcodecs/Imgcodecs.html
* javadoc/org/opencv/imgcodecs/package-frame.html
* javadoc/org/opencv/imgcodecs/package-summary.html
* javadoc/org/opencv/imgcodecs/package-tree.html
* javadoc/org/opencv/imgproc/
* javadoc/org/opencv/imgproc/CLAHE.html
* javadoc/org/opencv/imgproc/Imgproc.html
* javadoc/org/opencv/imgproc/LineSegmentDetector.html
* javadoc/org/opencv/imgproc/Moments.html
* javadoc/org/opencv/imgproc/Subdiv2D.html
* javadoc/org/opencv/imgproc/package-frame.html
* javadoc/org/opencv/imgproc/package-summary.html
* javadoc/org/opencv/imgproc/package-tree.html
* javadoc/org/opencv/ml/
* javadoc/org/opencv/ml/ANN_MLP.html
* javadoc/org/opencv/ml/ANN_MLP_ANNEAL.html
* javadoc/org/opencv/ml/Boost.html
* javadoc/org/opencv/ml/DTrees.html
* javadoc/org/opencv/ml/EM.html
* javadoc/org/opencv/ml/KNearest.html
* javadoc/org/opencv/ml/LogisticRegression.html
* javadoc/org/opencv/ml/Ml.html
* javadoc/org/opencv/ml/NormalBayesClassifier.html
* javadoc/org/opencv/ml/ParamGrid.html
* javadoc/org/opencv/ml/RTrees.html
* javadoc/org/opencv/ml/SVM.html
* javadoc/org/opencv/ml/SVMSGD.html
* javadoc/org/opencv/ml/StatModel.html
* javadoc/org/opencv/ml/TrainData.html
* javadoc/org/opencv/ml/package-frame.html
* javadoc/org/opencv/ml/package-summary.html
* javadoc/org/opencv/ml/package-tree.html
* javadoc/org/opencv/objdetect/
* javadoc/org/opencv/objdetect/BaseCascadeClassifier.html
* javadoc/org/opencv/objdetect/CascadeClassifier.html
* javadoc/org/opencv/objdetect/HOGDescriptor.html
* javadoc/org/opencv/objdetect/Objdetect.html
* javadoc/org/opencv/objdetect/QRCodeDetector.html
* javadoc/org/opencv/objdetect/package-frame.html
* javadoc/org/opencv/objdetect/package-summary.html
* javadoc/org/opencv/objdetect/package-tree.html
* javadoc/org/opencv/osgi/
* javadoc/org/opencv/osgi/OpenCVInterface.html
* javadoc/org/opencv/osgi/OpenCVNativeLoader.html
* javadoc/org/opencv/osgi/package-frame.html
* javadoc/org/opencv/osgi/package-summary.html
* javadoc/org/opencv/osgi/package-tree.html
* javadoc/org/opencv/photo/
* javadoc/org/opencv/photo/AlignExposures.html
* javadoc/org/opencv/photo/AlignMTB.html
* javadoc/org/opencv/photo/CalibrateCRF.html
* javadoc/org/opencv/photo/CalibrateDebevec.html
* javadoc/org/opencv/photo/CalibrateRobertson.html
* javadoc/org/opencv/photo/MergeDebevec.html
* javadoc/org/opencv/photo/MergeExposures.html
* javadoc/org/opencv/photo/MergeMertens.html
* javadoc/org/opencv/photo/MergeRobertson.html
* javadoc/org/opencv/photo/Photo.html
* javadoc/org/opencv/photo/Tonemap.html
* javadoc/org/opencv/photo/TonemapDrago.html
* javadoc/org/opencv/photo/TonemapMantiuk.html
* javadoc/org/opencv/photo/TonemapReinhard.html
* javadoc/org/opencv/photo/package-frame.html
* javadoc/org/opencv/photo/package-summary.html
* javadoc/org/opencv/photo/package-tree.html
* javadoc/org/opencv/utils/
* javadoc/org/opencv/utils/Converters.html
* javadoc/org/opencv/utils/package-frame.html
* javadoc/org/opencv/utils/package-summary.html
* javadoc/org/opencv/utils/package-tree.html
* javadoc/org/opencv/video/
* javadoc/org/opencv/video/BackgroundSubtractor.html
* javadoc/org/opencv/video/BackgroundSubtractorKNN.html
* javadoc/org/opencv/video/BackgroundSubtractorMOG2.html
* javadoc/org/opencv/video/DenseOpticalFlow.html
* javadoc/org/opencv/video/DualTVL1OpticalFlow.html
* javadoc/org/opencv/video/FarnebackOpticalFlow.html
* javadoc/org/opencv/video/KalmanFilter.html
* javadoc/org/opencv/video/SparseOpticalFlow.html
* javadoc/org/opencv/video/SparsePyrLKOpticalFlow.html
* javadoc/org/opencv/video/Video.html
* javadoc/org/opencv/video/package-frame.html
* javadoc/org/opencv/video/package-summary.html
* javadoc/org/opencv/video/package-tree.html
* javadoc/org/opencv/videoio/
* javadoc/org/opencv/videoio/VideoCapture.html
* javadoc/org/opencv/videoio/VideoWriter.html
* javadoc/org/opencv/videoio/Videoio.html
* javadoc/org/opencv/videoio/package-frame.html
* javadoc/org/opencv/videoio/package-summary.html
* javadoc/org/opencv/videoio/package-tree.html
* javadoc/overview-frame.html
* javadoc/overview-summary.html
* javadoc/overview-tree.html
* javadoc/package-list
* javadoc/resources/
* javadoc/resources/background.gif
* javadoc/resources/tab.gif
* javadoc/resources/titlebar.gif
* javadoc/resources/titlebar_end.gif
* javadoc/serialized-form.html
* javadoc/stylesheet.css
Moved Files:
------------
Android Gradle projects use a different directory structure than ADT
Eclipse projects. Here's how the projects were restructured:
* AndroidManifest.xml => openCVLibrary346/src/main/AndroidManifest.xml
* lint.xml => openCVLibrary346/lint.xml
* res/ => openCVLibrary346/src/main/res/
* src/ => openCVLibrary346/src/main/java/
* src/org/opencv/engine/OpenCVEngineInterface.aidl => openCVLibrary346/src/main/aidl/org/opencv/engine/OpenCVEngineInterface.aidl
Next Steps:
-----------
You can now build the project. The Gradle project needs network
connectivity to download dependencies.
Bugs:
-----
If for some reason your project does not build, and you determine that
it is due to a bug or limitation of the Eclipse to Gradle importer,
please file a bug at http://b.android.com with category
Component-Tools.
(This import summary is for your information only, and can be deleted
after import once you are satisfied with the results.)
include ':app'
import java.security.MessageDigest
apply plugin: 'com.android.library'
android {
compileSdkVersion 28
buildToolsVersion "29.0.2"
defaultConfig {
minSdkVersion 15
targetSdkVersion 28
versionCode 1
versionName "1.0"
testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
consumerProguardFiles 'consumer-rules.pro'
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}
}
}
dependencies {
implementation fileTree(dir: 'libs', include: ['*.jar','*.aar'])
implementation 'com.android.support:appcompat-v7:28.0.0'
implementation 'com.android.support.constraint:constraint-layout:1.1.3'
implementation 'com.android.support:design:28.0.0'
testImplementation 'junit:junit:4.12'
androidTestImplementation 'androidx.test.ext:junit:1.1.1'
androidTestImplementation 'androidx.test.espresso:espresso-core:3.2.0'
}
def paddleLiteLibs = 'https://bj.bcebos.com/paddlex/deploy/lite/paddle_lite_version_11cbd50e.tar.gz'
task downloadAndExtractPaddleLiteLibs(type: DefaultTask) {
doFirst {
println "Downloading and extracting Paddle Lite libs"
}
doLast {
// Prepare cache folder for libs
if (!file("cache").exists()) {
mkdir "cache"
}
// Generate cache name for libs
MessageDigest messageDigest = MessageDigest.getInstance('MD5')
messageDigest.update(paddleLiteLibs.bytes)
String cacheName = new BigInteger(1, messageDigest.digest()).toString(32)
// Download libs
if (!file("cache/${cacheName}.tar.gz").exists()) {
ant.get(src: paddleLiteLibs, dest: file("cache/${cacheName}.tar.gz"))
}
// Unpack libs
copy {
from tarTree("cache/${cacheName}.tar.gz")
into "cache/${cacheName}"
}
// Copy PaddlePredictor.jar
if (!file("libs/PaddlePredictor.jar").exists()) {
copy {
from "cache/${cacheName}/PaddlePredictor.jar"
into "libs"
}
}
// Copy libpaddle_lite_jni.so for armeabi-v7a and arm64-v8a
if (!file("src/main/jniLibs/armeabi-v7a/libpaddle_lite_jni.so").exists()) {
copy {
from "cache/${cacheName}/libs/armeabi-v7a/"
into "src/main/jniLibs/armeabi-v7a"
}
}
if (!file("src/main/jniLibs/arm64-v8a/libpaddle_lite_jni.so").exists()) {
copy {
from "cache/${cacheName}/libs/arm64-v8a/"
into "src/main/jniLibs/arm64-v8a"
}
}
}
}
preBuild.dependsOn downloadAndExtractPaddleLiteLibs
def snakeYamlLibs = 'https://bj.bcebos.com/paddlex/deploy/lite/snakeyaml-1.18-android.tar.gz'
task downloadAndExtractSnakeYamlLibs(type: DefaultTask) {
doFirst {
println "Downloading and extracting snake yaml sdk"
}
doLast {
// Prepare cache folder for sdk
if (!file("cache").exists()) {
mkdir "cache"
}
// Generate cache name for sdk
MessageDigest messageDigest = MessageDigest.getInstance('MD5')
messageDigest.update(snakeYamlLibs.bytes)
String cacheName = new BigInteger(1, messageDigest.digest()).toString(32)
// Download libs
if (!file("cache/${cacheName}.tar.gz").exists()) {
ant.get(src: snakeYamlLibs, dest: file("cache/${cacheName}.tar.gz"))
}
// Unpack libs
copy {
from tarTree("cache/${cacheName}.tar.gz")
into "cache/${cacheName}"
}
// Copy .jar
if (!file("libs/snakeyaml-1.18-android.jar").exists()) {
copy {
from "cache/${cacheName}/snakeyaml-1.18-android.jar"
into "libs"
}
}
}
}
preBuild.dependsOn downloadAndExtractSnakeYamlLibs
def opencvLibs = 'https://bj.bcebos.com/paddlex/deploy/lite/opencv-3.4.6-android.tar.gz'
task downloadAndExtractOpencvLibs(type: DefaultTask) {
doFirst {
println "Downloading and extracting opencv sdk"
}
doLast {
// Prepare cache folder for sdk
if (!file("cache").exists()) {
mkdir "cache"
}
// Generate cache name for sdk
MessageDigest messageDigest = MessageDigest.getInstance('MD5')
messageDigest.update(opencvLibs.bytes)
String cacheName = new BigInteger(1, messageDigest.digest()).toString(32)
// Download libs
if (!file("cache/${cacheName}.tar.gz").exists()) {
ant.get(src: opencvLibs, dest: file("cache/${cacheName}.tar.gz"))
}
// Unpack libs
copy {
from tarTree("cache/${cacheName}.tar.gz")
into "cache/${cacheName}"
}
// Copy .jar
if (!file("libs/opencv346.jar").exists()) {
copy {
from "cache/${cacheName}/opencv346.jar"
into "libs"
}
}
// Copy .so for armeabi-v7a and arm64-v8a
if (!file("src/main/jniLibs/armeabi-v7a/libopencv_java3.so").exists()) {
copy {
from "cache/${cacheName}/libs/armeabi-v7a/"
into "src/main/jniLibs/armeabi-v7a"
}
}
if (!file("src/main/jniLibs/arm64-v8a/libopencv_java3.so").exists()) {
copy {
from "cache/${cacheName}/libs/arm64-v8a/"
into "src/main/jniLibs/arm64-v8a"
}
}
}
}
preBuild.dependsOn downloadAndExtractOpencvLibs
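// Note: the three download tasks above share one caching scheme — each artifact
// URL is MD5-hashed (rendered in base 32) to name its cache entry under cache/,
// so changing a URL automatically invalidates the old download.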
## This file must *NOT* be checked into Version Control Systems,
# as it contains information specific to your local configuration.
#
# Location of the SDK. This is only used by Gradle.
# For customization when using a Version Control System, please read the
# header note.
#Tue Jun 16 10:08:04 CST 2020
# Add project specific ProGuard rules here.
# You can control the set of applied configuration files using the
# proguardFiles setting in build.gradle.
#
# For more details, see
# http://developer.android.com/guide/developing/tools/proguard.html
# If your project uses WebView with JS, uncomment the following
# and specify the fully qualified class name to the JavaScript interface
# class:
#-keepclassmembers class fqcn.of.javascript.interface.for.webview {
# public *;
#}
# Uncomment this to preserve the line number information for
# debugging stack traces.
#-keepattributes SourceFile,LineNumberTable
# If you keep the line number information, uncomment this to
# hide the original source file name.
#-renamesourcefileattribute SourceFile
package com.example.paddlex;
import android.content.Context;
import android.content.res.AssetManager;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import androidx.test.platform.app.InstrumentationRegistry;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import com.baidu.paddlex.config.ConfigParser;
import org.json.JSONException;
import org.junit.Test;
import org.junit.runner.RunWith;
import java.io.IOException;
import java.io.InputStream;
/**
* Instrumented test, which will execute on an Android device.
*
* @see <a href="http://d.android.com/tools/testing">Testing documentation</a>
*/
@RunWith(AndroidJUnit4.class)
public class ExampleInstrumentedTest {
@Test
public void useAppContext() throws IOException, JSONException {
// Context of the app under test.
Context appContext = InstrumentationRegistry.getInstrumentation().getTargetContext();
AssetManager assets = appContext.getAssets();
}
}
<manifest package="com.example.paddlex" />
// Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.baidu.paddlex;
import android.util.Log;
import com.baidu.paddle.lite.MobileConfig;
import com.baidu.paddle.lite.PaddlePredictor;
import com.baidu.paddle.lite.PowerMode;
import com.baidu.paddle.lite.Tensor;
import com.baidu.paddlex.config.ConfigParser;
import com.baidu.paddlex.postprocess.ClsResult;
import com.baidu.paddlex.postprocess.DetResult;
import com.baidu.paddlex.postprocess.Result;
import com.baidu.paddlex.postprocess.SegResult;
import com.baidu.paddlex.preprocess.ImageBlob;
import com.baidu.paddlex.preprocess.Transforms;
import java.util.Date;
import org.opencv.core.Mat;
public class Predictor {
private static final String TAG = Predictor.class.getSimpleName();
protected boolean isLoaded = false;
protected int warmupIterNum = 0;
protected int inferIterNum = 1;
protected int cpuThreadNum = 1;
protected String cpuPowerMode = "LITE_POWER_HIGH";
protected String modelPath = "";
protected String modelName = "";
protected float inferenceTime = 0;
protected float preprocessTime = 0;
protected float postprocessTime = 0;
protected PaddlePredictor paddlePredictor = null;
protected ImageBlob imageBlob = new ImageBlob();
protected Transforms transforms = new Transforms();
protected ConfigParser configParser = new ConfigParser();
protected Mat inputMat;
protected Result result;
public Predictor() {
super();
}
public boolean init(String modelPath, int cpuThreadNum, String cpuPowerMode) {
if (configParser.getModelType().equalsIgnoreCase("classifier")) {
result = new ClsResult();
} else if (configParser.getModelType().equalsIgnoreCase("detector")) {
result = new DetResult();
} else if (configParser.getModelType().equalsIgnoreCase("segmenter")) {
result = new SegResult();
} else {
Log.i(TAG, "model type: " + configParser.getModelType() + " is not support! Only support: 'classifier' or 'detector' or 'segmenter'");
}
isLoaded = loadModel(modelPath, cpuThreadNum, cpuPowerMode);
return isLoaded;
}
public boolean init(ConfigParser configParser) {
this.configParser = configParser;
init(configParser.getModelPath(), configParser.getCpuThreadNum(), configParser.getCpuPowerMode());
transforms.loadConfig(configParser.getTransformsList(), configParser.getTransformsMode());
if (!isLoaded()) {
return false;
}
Log.i(TAG, configParser.toString());
return isLoaded;
}
public boolean predict() {
this.imageBlob.clear();
this.imageBlob = transforms.run(inputMat, imageBlob);
if (configParser.getModelType().equalsIgnoreCase("classifier")) {
runModel((ClsResult) result);
} else if (configParser.getModelType().equalsIgnoreCase("detector")) {
runModel((DetResult) result);
} else if (configParser.getModelType().equalsIgnoreCase("segmenter")) {
runModel((SegResult) result);
}
return true;
}
private boolean runModel(DetResult detResult) {
// set input shape & data
Tensor imTensor = getInput(0);
imTensor.resize(imageBlob.getNewImageSize());
imTensor.setData(imageBlob.getImageData());
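// Besides the image tensor, YOLOv3 takes the original image size as a second
// input, while FasterRCNN takes im_info (resized h, w, scale) and
// im_shape (original h, w, 1) as extra inputs.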
if (configParser.getModel().equalsIgnoreCase("YOLOv3")) {
Tensor imSizeTensor = getInput(1);
long[] imSize = {1, 2};
imSizeTensor.resize(imSize);
imSizeTensor.setData(new int[]{(int) imageBlob.getOriImageSize()[2], (int) imageBlob.getOriImageSize()[3]});
} else if (configParser.getModel().equalsIgnoreCase("FasterRCNN")) {
Tensor imInfoTensor = getInput(1);
long[] imInfo = {1, 3};
imInfoTensor.resize(imInfo);
imInfoTensor.setData(new float[]{imageBlob.getNewImageSize()[2], imageBlob.getNewImageSize()[3], imageBlob.getScale()});
Tensor imShapeTensor = getInput(2);
long[] imShape = {1, 3};
imShapeTensor.resize(imShape);
imShapeTensor.setData(new float[]{imageBlob.getOriImageSize()[2], imageBlob.getOriImageSize()[3], 1});
}
// run model
runModel();
// Fetch output tensor
Tensor outputTensor = getOutput(0);
float[] output = outputTensor.getFloatData();
long[] outputShape = outputTensor.shape();
long outputSize = 1;
for (long s : outputShape) {
outputSize *= s;
}
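// The detection output is a flat float array; each box occupies 6 values:
// [category_id, score, xmin, ymin, xmax, ymax]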
int num_boxes = (int) (outputSize / 6);
for (int i = 0; i < num_boxes; i++) {
DetResult.Box box = detResult.new Box();
box.setCategoryId((int) output[i * 6]);
box.setCategory(configParser.getLabeList().get(box.getCategoryId()));
box.setScore(output[i * 6 + 1]);
float xmin = output[i * 6 + 2];
float ymin = output[i * 6 + 3];
float xmax = output[i * 6 + 4];
float ymax = output[i * 6 + 5];
box.setCoordinate(new float[]{xmin, ymin, xmax, ymax});
detResult.getBoxes().add(box);
}
return true;
}
private boolean runModel(SegResult segResult) {
// set input shape & data
Tensor imTensor = getInput(0);
imTensor.resize(imageBlob.getNewImageSize());
imTensor.setData(imageBlob.getImageData());
// run model
runModel();
Tensor labelTensor = getOutput(0);
// Fetch output tensor
long[] labelData = labelTensor.getLongData();
segResult.getMask().setLabelShape(labelTensor.shape());
long labelSize = 1;
for (long s : segResult.getMask().getLabelShape()) {
labelSize *= s;
}
segResult.getMask().setLabelData(labelData);
Tensor scoreTensor = getOutput(1);
float[] scoreData = scoreTensor.getFloatData();
segResult.getMask().setScoreShape(scoreTensor.shape());
segResult.getMask().setScoreData(scoreData);
return true;
}
private boolean runModel(ClsResult clsResult) {
// set input shape & data
Tensor imTensor = getInput(0);
imTensor.resize(imageBlob.getNewImageSize());
imTensor.setData(imageBlob.getImageData());
// run model
runModel();
// Fetch output tensor
Tensor outputTensor = getOutput(0);
long[] outputShape = outputTensor.shape();
long outputSize = 1;
for (long s : outputShape) {
outputSize *= s;
}
int max_index = 0; // index of the top-1 score
float max_score = 0; // value of the top-1 score
for (int i = 0; i < outputSize; i++) {
float tmp = outputTensor.getFloatData()[i];
if (tmp > max_score) {
max_index = i;
max_score = tmp;
}
}
clsResult.setCategoryId(max_index);
clsResult.setCategory(configParser.getLabeList().get(max_index));
clsResult.setScore(max_score);
return true;
}
private boolean loadModel(String modelPath, int cpuThreadNum, String cpuPowerMode) {
// release model if exists
releaseModel();
// load model
if (modelPath.isEmpty()) {
return false;
}
MobileConfig config = new MobileConfig();
config.setModelFromFile(modelPath);
config.setThreads(cpuThreadNum);
if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_HIGH")) {
config.setPowerMode(PowerMode.LITE_POWER_HIGH);
} else if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_LOW")) {
config.setPowerMode(PowerMode.LITE_POWER_LOW);
} else if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_FULL")) {
config.setPowerMode(PowerMode.LITE_POWER_FULL);
} else if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_NO_BIND")) {
config.setPowerMode(PowerMode.LITE_POWER_NO_BIND);
} else if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_RAND_HIGH")) {
config.setPowerMode(PowerMode.LITE_POWER_RAND_HIGH);
} else if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_RAND_LOW")) {
config.setPowerMode(PowerMode.LITE_POWER_RAND_LOW);
} else {
Log.e(TAG, "unknown cpu power mode!");
return false;
}
paddlePredictor = PaddlePredictor.createPaddlePredictor(config);
this.cpuThreadNum = cpuThreadNum;
this.cpuPowerMode = cpuPowerMode;
this.modelPath = modelPath;
this.modelName = configParser.getModel();
return true;
}
private boolean runModel() {
if (!isLoaded()) {
return false;
}
// warm up
for (int i = 0; i < warmupIterNum; i++) {
paddlePredictor.run();
}
Date start = new Date();
// inference
for (int i = 0; i < inferIterNum; i++) {
paddlePredictor.run();
}
Date end = new Date();
inferenceTime = (end.getTime() - start.getTime()) / (float) inferIterNum;
return true;
}
public void releaseModel() {
paddlePredictor = null;
isLoaded = false;
cpuThreadNum = 1;
cpuPowerMode = "LITE_POWER_HIGH";
modelPath = "";
modelName = "";
}
public boolean isLoaded() {
return paddlePredictor != null && isLoaded;
}
public void setLoaded(boolean loaded) {
isLoaded = loaded;
}
public int getWarmupIterNum() {
return warmupIterNum;
}
public void setWarmupIterNum(int warmupIterNum) {
this.warmupIterNum = warmupIterNum;
}
public int getInferIterNum() {
return inferIterNum;
}
public void setInferIterNum(int inferIterNum) {
this.inferIterNum = inferIterNum;
}
public float getInferenceTime() {
return inferenceTime;
}
public void setInferenceTime(float inferenceTime) {
this.inferenceTime = inferenceTime;
}
public int getCpuThreadNum() {
return cpuThreadNum;
}
public void setCpuThreadNum(int cpuThreadNum) {
this.cpuThreadNum = cpuThreadNum;
}
public String getCpuPowerMode() {
return cpuPowerMode;
}
public void setCpuPowerMode(String cpuPowerMode) {
this.cpuPowerMode = cpuPowerMode;
}
public String getModelPath() {
return modelPath;
}
public void setModelPath(String modelPath) {
this.modelPath = modelPath;
}
public String getModelName() {
return modelName;
}
public void setModelName(String modelName) {
this.modelName = modelName;
}
public Result getResult() {
return result;
}
public void setResult(Result result) {
this.result = result;
}
public PaddlePredictor getPaddlePredictor() {
return paddlePredictor;
}
public void setPaddlePredictor(PaddlePredictor paddlePredictor) {
this.paddlePredictor = paddlePredictor;
}
public float getPreprocessTime() {
return preprocessTime;
}
public void setPreprocessTime(float preprocessTime) {
this.preprocessTime = preprocessTime;
}
public float getPostprocessTime() {
return postprocessTime;
}
public void setPostprocessTime(float postprocessTime) {
this.postprocessTime = postprocessTime;
}
public void setConfigParser(ConfigParser configParser) {
this.configParser = configParser;
}
public Mat getInputMat() {
return inputMat;
}
public void setInputMat(Mat inputMat) {
Mat copyMat = new Mat();
inputMat.copyTo(copyMat);
this.inputMat = copyMat;
}
public DetResult getDetResult() {
if (result.getType() != "det") {
Log.e(TAG, "this model_type is not detector");
return null;
}
return (DetResult) result;
}
public SegResult getSegResult() {
if (result.getType() != "seg") {
Log.e(TAG, "this model_type is not segmeter");
return null;
}
return (SegResult) result;
}
public ClsResult getClsResult() {
if (result.getType() != "cls") {
Log.e(TAG, "this model_type is not classifier");
return null;
}
return (ClsResult) result;
}
public ImageBlob getImageBlob() {
return imageBlob;
}
public void setImageBlob(ImageBlob imageBlob) {
this.imageBlob = imageBlob;
}
public Tensor getInput(int idx) {
if (!isLoaded()) {
return null;
}
return paddlePredictor.getInput(idx);
}
public Tensor getOutput(int idx) {
if (!isLoaded()) {
return null;
}
return paddlePredictor.getOutput(idx);
}
}
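For orientation, a minimal end-to-end sketch of driving the Predictor above follows. It is reconstructed from the API shown here; the file paths and log tag are hypothetical (the demo app first copies model, yaml and test image out of its assets), it additionally needs org.opencv.imgcodecs.Imgcodecs and java.io.IOException, and OpenCV must be loaded beforehand (see Utils below).
try {
    ConfigParser configParser = new ConfigParser();
    configParser.init("/sdcard/paddlex/model.nb", "/sdcard/paddlex/model.yml",
            1, "LITE_POWER_HIGH");
    Predictor predictor = new Predictor();
    if (predictor.init(configParser)) {
        predictor.setInputMat(Imgcodecs.imread("/sdcard/paddlex/test.jpg"));
        predictor.predict();
        DetResult detResult = predictor.getDetResult(); // for model_type "detector"
        Log.i("PaddleXDemo", "inference time: " + predictor.getInferenceTime() + " ms");
    }
} catch (IOException e) {
    e.printStackTrace();
}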
// Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.baidu.paddlex;
import android.content.Context;
import android.os.Environment;
import org.opencv.android.OpenCVLoader;
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
public class Utils {
private static final String TAG = Utils.class.getSimpleName();
public static void copyFileFromAssets(Context appCtx, String srcPath, String dstPath) {
if (srcPath.isEmpty() || dstPath.isEmpty()) {
return;
}
InputStream is = null;
OutputStream os = null;
try {
is = new BufferedInputStream(appCtx.getAssets().open(srcPath));
os = new BufferedOutputStream(new FileOutputStream(new File(dstPath)));
byte[] buffer = new byte[1024];
int length = 0;
while ((length = is.read(buffer)) != -1) {
os.write(buffer, 0, length);
}
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
try {
os.close();
is.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
public static void copyDirectoryFromAssets(Context appCtx, String srcDir, String dstDir) {
if (srcDir.isEmpty() || dstDir.isEmpty()) {
return;
}
try {
if (!new File(dstDir).exists()) {
new File(dstDir).mkdirs();
}
for (String fileName : appCtx.getAssets().list(srcDir)) {
String srcSubPath = srcDir + File.separator + fileName;
String dstSubPath = dstDir + File.separator + fileName;
// AssetManager has no isDirectory(); an entry with children is a directory
if (appCtx.getAssets().list(srcSubPath).length > 0) {
copyDirectoryFromAssets(appCtx, srcSubPath, dstSubPath);
} else {
copyFileFromAssets(appCtx, srcSubPath, dstSubPath);
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
public static String getFileNameFromString(String srcDir) {
if (srcDir.isEmpty()) {
return null;
}
try {
String fileName = srcDir.substring(srcDir.lastIndexOf("/") + 1);
return fileName;
} catch (Exception e) {
e.printStackTrace();
}
return null;
}
public static float[] parseFloatsFromString(String string, String delimiter) {
String[] pieces = string.trim().toLowerCase().split(delimiter);
float[] floats = new float[pieces.length];
for (int i = 0; i < pieces.length; i++) {
floats[i] = Float.parseFloat(pieces[i].trim());
}
return floats;
}
public static long[] parseLongsFromString(String string, String delimiter) {
String[] pieces = string.trim().toLowerCase().split(delimiter);
long[] longs = new long[pieces.length];
for (int i = 0; i < pieces.length; i++) {
longs[i] = Long.parseLong(pieces[i].trim());
}
return longs;
}
public static String getSDCardDirectory() {
return Environment.getExternalStorageDirectory().getAbsolutePath();
}
public static boolean isSupportedNPU() {
String hardware = android.os.Build.HARDWARE;
return hardware.equalsIgnoreCase("kirin810") || hardware.equalsIgnoreCase("kirin990");
}
public static boolean initialOpencv() {
return OpenCVLoader.initDebug();
}
}
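A quick, hypothetical illustration of the two string-parsing helpers above — the inputs are made-up examples, and any delimiter-separated numeric string works:
float[] means = Utils.parseFloatsFromString("0.485, 0.456, 0.406", ",");
long[] shape = Utils.parseLongsFromString("1,3,224,224", ",");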
// Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.baidu.paddlex.config;
import android.content.Context;
import android.content.res.AssetManager;
import org.yaml.snakeyaml.Yaml;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
public class ConfigParser {
protected String model = "";
protected List<String> labeList = new ArrayList<>();
protected int numClasses = 0;
protected String modelType = "";
protected String transformsMode = "RGB";
protected List transformsList = new ArrayList();
protected String modelPath = "";
protected int cpuThreadNum = 1;
protected String cpuPowerMode = "";
protected String yamlPath = "";
public void init(String modelPath, String yamlPath, int cpuThreadNum,
String cpuPowerMode) throws IOException {
this.modelPath = modelPath;
this.cpuThreadNum = cpuThreadNum;
this.cpuPowerMode = cpuPowerMode;
this.yamlPath = yamlPath;
InputStream ymlStream = new FileInputStream(new File(yamlPath));
Yaml yml = new Yaml();
HashMap yml_map = (HashMap) yml.load(ymlStream);
model = (String) yml_map.get("Model");
if (yml_map.containsKey("TransformsMode")) {
transformsMode = (String) yml_map.get("TransformsMode");
}
HashMap _Attributes = (HashMap) yml_map.get("_Attributes");
// parser label_list
labeList = (List<String>) _Attributes.get("labels");
numClasses = (int) _Attributes.get("num_classes");
// parser model_type(classifier, segmenter, detector)
modelType = (String) _Attributes.get("model_type");
// parser Transforms
transformsList = (List) yml_map.get("Transforms");
}
@Override
public String toString() {
return "ConfigParser{" +
"model='" + model + '\'' +
", labeList=" + labeList +
", numClasses=" + numClasses +
", modelType='" + modelType + '\'' +
", transformsMode='" + transformsMode + '\'' +
", transformsList=" + transformsList +
", modelPath='" + modelPath + '\'' +
", cpuThreadNum=" + cpuThreadNum +
", cpuPowerMode='" + cpuPowerMode + '\'' +
", yamlPath='" + yamlPath + '\'' +
'}';
}
public int getNumClasses() {
return numClasses;
}
public void setNumClasses(int numClasses) {
this.numClasses = numClasses;
}
public List<String> getLabeList() {
return labeList;
}
public void setLabeList(List<String> labeList) {
this.labeList = labeList;
}
public String getModelType() {
return modelType;
}
public void setModelType(String modelType) {
this.modelType = modelType;
}
public List getTransformsList() {
return transformsList;
}
public void setTransformsList(List transformsList) {
this.transformsList = transformsList;
}
public String getModel() {
return model;
}
public void setModel(String model) {
this.model = model;
}
public String getTransformsMode() {
return transformsMode;
}
public void setTransformsMode(String transformsMode) {
this.transformsMode = transformsMode;
}
public String getModelPath() {
return modelPath;
}
public void setModelPath(String modelPath) {
this.modelPath = modelPath;
}
public int getCpuThreadNum() {
return cpuThreadNum;
}
public void setCpuThreadNum(int cpuThreadNum) {
this.cpuThreadNum = cpuThreadNum;
}
public String getCpuPowerMode() {
return cpuPowerMode;
}
public void setCpuPowerMode(String cpuPowerMode) {
this.cpuPowerMode = cpuPowerMode;
}
public String getYamlPath() {
return yamlPath;
}
public void setYamlPath(String yamlPath) {
this.yamlPath = yamlPath;
}
}
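Reconstructed from the keys this parser reads, the model.yml it consumes has roughly the following shape. This is an illustrative sketch only — the values are invented, not an official schema:
Model: YOLOv3
TransformsMode: RGB
Transforms:
  - ResizeByShort:
      short_size: 608
      max_size: -1
      interp: LINEAR
_Attributes:
  model_type: detector
  num_classes: 3
  labels: [person, bicycle, car]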
// Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.baidu.paddlex.postprocess;
public class ClsResult extends Result {
static String type = "cls";
protected int categoryId;
protected String category;
protected float score;
public int getCategoryId() {
return categoryId;
}
public void setCategoryId(int categoryId) {
this.categoryId = categoryId;
}
public String getCategory() {
return category;
}
public void setCategory(String category) {
this.category = category;
}
public float getScore() {
return score;
}
public void setScore(float score) {
this.score = score;
}
@Override
public String getType() {
return type;
}
}
// Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.baidu.paddlex.postprocess;
import java.util.ArrayList;
import java.util.List;
public class DetResult extends Result {
static String type = "det";
protected List<Box> boxes = new ArrayList<Box>();
public List<Box> getBoxes() {
return boxes;
}
public void setBoxes(List<Box> boxes) {
this.boxes = boxes;
}
@Override
public String getType() {
return type;
}
public class Box {
protected int categoryId;
protected String category;
protected float score;
protected float[] coordinate = new float[4];
public int getCategoryId() {
return categoryId;
}
public void setCategoryId(int category_id) {
this.categoryId = category_id;
}
public String getCategory() {
return category;
}
public void setCategory(String category) {
this.category = category;
}
public float getScore() {
return score;
}
public void setScore(float score) {
this.score = score;
}
public float[] getCoordinate() {
return coordinate;
}
public void setCoordinate(float[] coordinate) {
this.coordinate = coordinate;
}
}
}
// Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.baidu.paddlex.postprocess;
public class Result {
static String type = "base";
public String getType() {
return type;
}
}
// Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.baidu.paddlex.postprocess;
public class SegResult extends Result {
static String type = "seg";
protected Mask mask = new Mask();
public Mask getMask() {
return mask;
}
public void setMask(Mask mask) {
this.mask = mask;
}
@Override
public String getType() {
return type;
}
public class Mask {
protected float[] scoreData;
protected long[] labelData;
protected long[] labelShape = new long[4];
protected long[] scoreShape = new long[4];
public float[] getScoreData() {
return scoreData;
}
public void setScoreData(float[] score_data) {
this.scoreData = score_data;
}
public long[] getLabelData() {
return labelData;
}
public void setLabelData(long[] label_data) {
this.labelData = label_data;
}
public long[] getLabelShape() {
return labelShape;
}
public void setLabelShape(long[] labelShape) {
this.labelShape = labelShape;
}
public long[] getScoreShape() {
return scoreShape;
}
public void setScoreShape(long[] scoreShape) {
this.scoreShape = scoreShape;
}
}
}
// Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.baidu.paddlex.preprocess;
import java.util.LinkedHashMap;
public class ImageBlob {
// Original image height and width
private long[] oriImageSize = new long[]{1, 3, -1, -1};
// Image height and width after the latest preprocessing op
private long[] newImageSize = new long[]{1, 3, -1, -1};
// Reshape info: for each resize/padding op, the image height and width before it was applied
private LinkedHashMap<String, int[]> reshapeInfo = new LinkedHashMap<String, int[]>();
// Resize scale
private float scale = 1;
// Buffer for image data after preprocessing
private float[] imageData;
public void clear() {
oriImageSize = new long[]{1, 3, -1, -1};
newImageSize = new long[]{1, 3, -1, -1};
reshapeInfo.clear();
imageData = null;
}
public long[] getOriImageSize() {
return oriImageSize;
}
public void setOriImageSize(long[] oriImageSize) {
this.oriImageSize = oriImageSize;
}
public void setOriImageSize(long dim, int idx) {
this.oriImageSize[idx] = dim;
}
public long[] getNewImageSize() {
return newImageSize;
}
public void setNewImageSize(long[] newImageSize) {
this.newImageSize = newImageSize;
}
public void setNewImageSize(long dim, int idx) {
this.newImageSize[idx] = dim;
}
public LinkedHashMap<String, int[]> getReshapeInfo() {
return reshapeInfo;
}
public void setReshapeInfo(LinkedHashMap<String, int[]> reshapeInfo) {
this.reshapeInfo = reshapeInfo;
}
public float getScale() {
return scale;
}
public void setScale(float scale) {
this.scale = scale;
}
public float[] getImageData() {
return imageData;
}
public void setImageData(float[] imageData) {
this.imageData = imageData;
}
}
// Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.baidu.paddlex.preprocess;
import android.util.Log;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.Rect;
import org.opencv.core.Scalar;
import org.opencv.core.Size;
import org.opencv.imgproc.Imgproc;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
public class Transforms {
private static final String TAG = Transforms.class.getSimpleName();
private List<transformOp> transformOps = new ArrayList<transformOp>();
private String transformsMode = "RGB";
private HashMap<String, Integer> interpMap = new HashMap<String, Integer>(){{
put("LINEAR", Imgproc.INTER_LINEAR);
put("NEAREST", Imgproc.INTER_NEAREST);
put("AREA", Imgproc.INTER_AREA);
put("CUBIC", Imgproc.INTER_CUBIC);
put("LANCZOS4", Imgproc.INTER_LANCZOS4);
}
};
public void loadConfig(List transforms_list, String transformsMode) {
if (!OpenCVLoader.initDebug()) {
Log.e(TAG,"OpenCV Loadding failed.");
}
this.transformsMode = transformsMode;
for (int i = 0; i < transforms_list.size(); i++) {
HashMap transform_op = (HashMap) (transforms_list.get(i));
if (transform_op.containsKey("ResizeByShort")) {
HashMap info = (HashMap) transform_op.get("ResizeByShort");
ResizeByShort resizeByShort = new ResizeByShort();
resizeByShort.max_size = (int)info.get("max_size");
resizeByShort.short_size = (int)info.get("short_size");
if (info.containsKey("interp")) {
resizeByShort.interp = (String) info.get("interp");
}
transformOps.add(resizeByShort);
} else if (transform_op.containsKey("ResizeByLong")) {
HashMap info = (HashMap) transform_op.get("ResizeByLong");
ResizeByLong resizeByLong = new ResizeByLong();
resizeByLong.long_size = (int)info.get("long_size");
if (info.containsKey("interp")) {
resizeByLong.interp = (String) info.get("interp");
}
transformOps.add(resizeByLong);
} else if (transform_op.containsKey("CenterCrop")) {
HashMap info = (HashMap) transform_op.get("CenterCrop");
CenterCrop centerCrop = new CenterCrop();
if (info.get("crop_size") instanceof Integer) {
centerCrop.cropHeight = (int) info.get("crop_size");
centerCrop.cropWidth = (int) info.get("crop_size");
} else {
centerCrop.cropWidth = ((List<Integer>) info.get("crop_size")).get(0);
centerCrop.cropHeight = ((List<Integer>) info.get("crop_size")).get(1);
}
transformOps.add(centerCrop);
} else if (transform_op.containsKey("Normalize")) {
HashMap<String, List<Float>> info = (HashMap<String, List<Float>>) transform_op.get("Normalize");
Normalize normalize = new Normalize();
normalize.mean = info.get("mean").toArray(new Double[info.get("mean").size()]);
normalize.std = info.get("std").toArray(new Double[info.get("std").size()]);
transformOps.add(normalize);
} else if (transform_op.containsKey("Resize")) {
HashMap info = (HashMap) transform_op.get("Resize");
Resize resize = new Resize();
if (info.get("target_size") instanceof Integer) {
resize.width = (int) info.get("target_size");
resize.height = (int) info.get("target_size");
} else {
resize.width = ((List<Integer>) info.get("target_size")).get(0);
resize.height = ((List<Integer>) info.get("target_size")).get(1);
}
if (info.containsKey("interp")) {
resize.interp = (String) info.get("interp");
}
transformOps.add(resize);
} else if (transform_op.containsKey("Padding")) {
HashMap info = (HashMap) transform_op.get("Padding");
Padding padding = new Padding();
if (info.containsKey("coarsest_stride")) {
padding.coarsest_stride = (int) info.get("coarsest_stride");
}
if (info.containsKey("target_size")) {
if (info.get("target_size") instanceof Integer) {
padding.width = (int) info.get("target_size");
padding.height = (int) info.get("target_size");
} else {
padding.width = ((List<Integer>) info.get("target_size")).get(0);
padding.height = ((List<Integer>) info.get("target_size")).get(1);
}
}
transformOps.add(padding);
}
}
}
public ImageBlob run(Mat inputMat, ImageBlob imageBlob) {
imageBlob.setOriImageSize(inputMat.height(),2);
imageBlob.setOriImageSize(inputMat.width(),3);
imageBlob.setNewImageSize(inputMat.height(),2);
imageBlob.setNewImageSize(inputMat.width(),3);
if(transformsMode.equalsIgnoreCase("RGB")){
Imgproc.cvtColor(inputMat, inputMat, Imgproc.COLOR_BGR2RGB);
}else if(!transformsMode.equalsIgnoreCase("BGR")){
Log.e(TAG, "transformsMode only support RGB or BGR");
}
inputMat.convertTo(inputMat, CvType.CV_32FC(3));
for (transformOp op : transformOps) {
inputMat = op.run(inputMat, imageBlob);
}
int w = inputMat.width();
int h = inputMat.height();
int c = inputMat.channels();
imageBlob.setImageData(new float[w * h * c]);
int[] channelStride = new int[]{w * h, w * h * 2};
for (int y = 0; y < h; y++) {
for (int x = 0; x < w; x++) {
double[] color = inputMat.get(y, x);
imageBlob.getImageData()[y * w + x] = (float) (color[0]);
imageBlob.getImageData()[y * w + x + channelStride[0]] = (float) (color[1]);
imageBlob.getImageData()[y * w + x + channelStride[1]] = (float) (color[2]);
}
}
return imageBlob;
}
private class transformOp {
public Mat run(Mat inputMat, ImageBlob data) {
return inputMat;
}
}
private class ResizeByShort extends transformOp {
private int max_size;
private int short_size;
private String interp = "LINEAR";
public Mat run(Mat inputMat, ImageBlob imageBlob) {
int origin_w = inputMat.width();
int origin_h = inputMat.height();
imageBlob.getReshapeInfo().put("resize", new int[]{origin_w, origin_h});
int im_size_max = Math.max(origin_w, origin_h);
int im_size_min = Math.min(origin_w, origin_h);
float scale = (float) (short_size) / (float) (im_size_min);
if (max_size > 0) {
if (Math.round(scale * im_size_max) > max_size) {
scale = (float) (max_size) / (float) (im_size_max);
}
}
int width = Math.round(scale * origin_w);
int height = Math.round(scale * origin_h);
Size sz = new Size(width, height);
Imgproc.resize(inputMat, inputMat, sz,0,0, interpMap.get(interp));
imageBlob.setNewImageSize(inputMat.height(),2);
imageBlob.setNewImageSize(inputMat.width(),3);
imageBlob.setScale(scale);
return inputMat;
}
}
private class ResizeByLong extends transformOp {
private int long_size;
private String interp = "LINEAR";
public Mat run(Mat inputMat, ImageBlob imageBlob) {
int origin_w = inputMat.width();
int origin_h = inputMat.height();
imageBlob.getReshapeInfo().put("resize", new int[]{origin_w, origin_h});
int im_size_max = Math.max(origin_w, origin_h);
float scale = (float) (long_size) / (float) (im_size_max);
int width = Math.round(scale * origin_w);
int height = Math.round(scale * origin_h);
Size sz = new Size(width, height);
Imgproc.resize(inputMat, inputMat, sz,0,0, interpMap.get(interp));
imageBlob.setNewImageSize(inputMat.height(),2);
imageBlob.setNewImageSize(inputMat.width(),3);
imageBlob.setScale(scale);
return inputMat;
}
}
private class CenterCrop extends transformOp {
private int cropHeight;
private int cropWidth;
public Mat run(Mat inputMat, ImageBlob imageBlob) {
int origin_w = inputMat.width();
int origin_h = inputMat.height();
if (origin_h < cropHeight || origin_w < cropWidth) {
Log.e(TAG, "[CenterCrop] Image size less than crop size");
}
int offset_x, offset_y;
offset_x = (origin_w - cropWidth) / 2;
offset_y = (origin_h - cropHeight) / 2;
offset_x = Math.max(Math.min(offset_x, origin_w - cropWidth), 0);
offset_y = Math.max(Math.min(offset_y, origin_h - cropHeight), 0);
// OpenCV's Rect takes (x, y, width, height).
Rect crop_roi = new Rect(offset_x, offset_y, cropWidth, cropHeight);
inputMat = inputMat.submat(crop_roi);
imageBlob.setNewImageSize(inputMat.height(),2);
imageBlob.setNewImageSize(inputMat.width(),3);
return inputMat;
}
}
private class Resize extends transformOp {
private int height;
private int width;
private String interp = "LINEAR";
public Mat run(Mat inputMat, ImageBlob imageBlob) {
int origin_w = inputMat.width();
int origin_h = inputMat.height();
imageBlob.getReshapeInfo().put("resize", new int[]{origin_w, origin_h});
Size sz = new Size(width, height);
Imgproc.resize(inputMat, inputMat, sz,0,0, interpMap.get(interp));
imageBlob.setNewImageSize(inputMat.height(),2);
imageBlob.setNewImageSize(inputMat.width(),3);
return inputMat;
}
}
private class Padding extends transformOp {
private double width;
private double height;
private double coarsest_stride;
public Mat run(Mat inputMat, ImageBlob imageBlob) {
int origin_w = inputMat.width();
int origin_h = inputMat.height();
imageBlob.getReshapeInfo().put("padding", new int[]{origin_w, origin_h});
double padding_w = 0;
double padding_h = 0;
// Pad to a fixed target size, or up to the next multiple of coarsest_stride.
// copyMakeBorder expects the amount of border to add, not the target size.
if (width > 1 && height > 1) {
padding_w = width - origin_w;
padding_h = height - origin_h;
} else if (coarsest_stride > 1) {
padding_h = Math.ceil(origin_h / coarsest_stride) * coarsest_stride - origin_h;
padding_w = Math.ceil(origin_w / coarsest_stride) * coarsest_stride - origin_w;
}
Core.copyMakeBorder(inputMat, inputMat, 0, (int) padding_h, 0, (int) padding_w, Core.BORDER_CONSTANT, new Scalar(0));
imageBlob.setNewImageSize(inputMat.height(), 2);
imageBlob.setNewImageSize(inputMat.width(), 3);
return inputMat;
}
}
private class Normalize extends transformOp {
private Double[] mean = new Double[3];
private Double[] std = new Double[3];
public Mat run(Mat inputMat, ImageBlob imageBlob) {
inputMat.convertTo(inputMat, CvType.CV_32FC(3), 1/255.0);
Scalar meanScalar = new Scalar(mean[0], mean[1], mean[2]);
Scalar stdScalar = new Scalar(std[0], std[1], std[2]);
Core.subtract(inputMat, meanScalar, inputMat);
Core.divide(inputMat, stdScalar, inputMat);
return inputMat;
}
}
}
// Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package com.baidu.paddlex.visual;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.util.Log;
import com.baidu.paddlex.postprocess.DetResult;
import com.baidu.paddlex.postprocess.SegResult;
import com.baidu.paddlex.preprocess.ImageBlob;
import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.Point;
import org.opencv.core.Rect;
import org.opencv.core.Scalar;
import org.opencv.core.Size;
import org.opencv.imgproc.Imgproc;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.ListIterator;
import java.util.Map;
public class Visualize {
protected static final String TAG = Visualize.class.getSimpleName();
protected float detectConfidenceThreshold = (float) 0.5;
protected Scalar[] colormap = new Scalar[]{};
protected void generateColorMap(int num_class) {
this.colormap = new Scalar[num_class];
for (int i = 0; i < num_class; i++) {
int j = 0;
int lab = i;
int r = 0, g = 0, b = 0;
// Accumulate the bits of the label id into the RGB channels
// (standard PASCAL VOC colormap); label 0 stays (0, 0, 0).
while (lab > 0) {
r |= (((lab >> 0) & 1) << (7 - j));
g |= (((lab >> 1) & 1) << (7 - j));
b |= (((lab >> 2) & 1) << (7 - j));
++j;
lab >>= 3;
}
this.colormap[i] = new Scalar(r, g, b);
}
}
public float getDetectConfidenceThreshold() {
return detectConfidenceThreshold;
}
public void setDetectConfidenceThreshold(float detectConfidenceThreshold) {
this.detectConfidenceThreshold = detectConfidenceThreshold;
}
public Scalar[] getColormap() {
return colormap;
}
public void setColormap(Scalar[] colormap) {
this.colormap = colormap;
}
public void init(int num_class) {
generateColorMap(num_class);
}
public Mat draw(DetResult result, Mat visualizeMat) {
Paint rectPaint = new Paint();
rectPaint.setStyle(Paint.Style.STROKE);
rectPaint.setStrokeWidth(2);
Paint txtPaint = new Paint();
txtPaint.setTextSize(15);
txtPaint.setAntiAlias(true);
for (DetResult.Box box : result.getBoxes()) {
if (box.getScore() < detectConfidenceThreshold) {
continue;
}
String text = box.getCategory() + ":" + String.format("%.2f", box.getScore());
Scalar roiColor = colormap[box.getCategoryId()];
double font_scale = 0.5;
int thickness = 1;
int font_face = Core.FONT_HERSHEY_SIMPLEX;
Point roiXyMin = new Point(box.getCoordinate()[0],box.getCoordinate()[1]);
Point roiXyMax = new Point(box.getCoordinate()[2],box.getCoordinate()[3]);
Size text_size = Imgproc.getTextSize(text, font_face,font_scale, thickness,null);
Imgproc.rectangle(visualizeMat, roiXyMin, roiXyMax, roiColor,2);
Point textXyMin = new Point(box.getCoordinate()[0],box.getCoordinate()[1]-text_size.height);
Point textXyMax = new Point(box.getCoordinate()[0]+text_size.width,box.getCoordinate()[1]);
Imgproc.rectangle(visualizeMat,textXyMin, textXyMax, roiColor,-1);
Imgproc.putText(visualizeMat,
text,
roiXyMin,
font_face,
font_scale,
new Scalar(255, 255, 255));
}
return visualizeMat;
}
public Mat draw(SegResult result, Mat visualizeMat, ImageBlob imageBlob, int cutoutClass) {
int new_h = (int)imageBlob.getNewImageSize()[2];
int new_w = (int)imageBlob.getNewImageSize()[3];
Mat mask = new Mat(new_h, new_w, CvType.CV_8UC(1));
for (int h = 0; h < new_h; h++) {
for (int w = 0; w < new_w; w++){
// Score data is laid out as NCHW; offset into the cutoutClass channel row-major.
mask.put(h, w, (1 - result.getMask().getScoreData()[cutoutClass * new_h * new_w + h * new_w + w]) * 255);
}
}
ListIterator<Map.Entry<String, int[]>> reverseReshapeInfo = new ArrayList<Map.Entry<String, int[]>>(imageBlob.getReshapeInfo().entrySet()).listIterator(imageBlob.getReshapeInfo().size());
while (reverseReshapeInfo.hasPrevious()) {
Map.Entry<String, int[]> entry = reverseReshapeInfo.previous();
if (entry.getKey().equalsIgnoreCase("padding")) {
Rect crop_roi = new Rect(0, 0, entry.getValue()[0], entry.getValue()[1]);
mask = mask.submat(crop_roi);
} else if (entry.getKey().equalsIgnoreCase("resize")) {
Size sz = new Size(entry.getValue()[0], entry.getValue()[1]);
Imgproc.resize(mask, mask, sz,0,0,Imgproc.INTER_LINEAR);
}
Log.i(TAG, "postprocess operator: " + entry.getKey());
Log.i(TAG, "shape:: " + String.valueOf(mask.width()) + ","+ String.valueOf(mask.height()));
}
Mat dst = new Mat();
List<Mat> listMat = Arrays.asList(visualizeMat, mask);
Core.merge(listMat, dst);
return dst;
}
}
<resources>
<string name="app_name">PaddleX</string>
</resources>
package com.example.paddlex;
import org.junit.Test;
import static org.junit.Assert.*;
/**
* Example local unit test, which will execute on the development machine (host).
*
* @see <a href="http://d.android.com/tools/testing">Testing documentation</a>
*/
public class ExampleUnitTest {
@Test
public void addition_isCorrect() {
assertEquals(4, 2 + 2);
}
}
def export_lite():
    opt = lite.Opt()
    model_file = os.path.join(FLAGS.model_dir, '__model__')
    params_file = os.path.join(FLAGS.model_dir, '__params__')
    opt.run_optimize("", model_file, params_file, 'naive_buffer', FLAGS.place,
                     FLAGS.save_file)


if __name__ == '__main__':
# FAQ
## 1. How do I tune the training parameters?
> See the [parameter tuning documentation](appendix/parameters.md)
## 2. Training fails because GPU memory runs out
> Run `nvidia-smi` in a terminal to check whether the GPU is occupied by other tasks, and try to clear them;
> Lower the `batch_size` used for training to reduce the memory requirement; note that `learning_rate` and related parameters should be scaled down proportionally;
> Choose a smaller model or backbone.
## 3. Is there a smaller model suitable for running on lower-end devices?
> Yes, you can use model pruning; see the [model pruning tutorial](slim/prune.md). The size of the pruned model can be controlled through the pruning parameters. In practice, e.g. on VOC detection data with yolov3-mobilenet, the original model size was XXM and XX M after pruning, with essentially unchanged accuracy
## 4. How do I configure the number of GPUs used for training?
> Export an environment variable in the terminal, or set it in Python code; see [CPU / multi-GPU training](appendix/gpu_configure.md)
## 5. How do I continue training from previously trained model weights?
> When calling the `train` interface, set `pretrain_weights` to the save path of the previous model
## 6. PaddleX saves models during normal training, pruning training, deployment export and quantization. What are the differences and how do I tell them apart?
**Functional differences**
>1. Saved during normal training
>
>>Model directories saved every n epochs during normal training. They can be used as pretrained weights, loaded by PaddleX for prediction, or exported as deployment models
>2. Saved during pruning training
>
>>Model directories saved every n epochs during pruning training. They cannot be used as pretrained weights, but can be loaded by PaddleX for prediction or exported as deployment models
>3. Exported deployment models
>
>>Model directories exported for server-side deployment. They cannot be used as pretrained weights, but can be loaded by PaddleX for prediction
>4. Quantized models
>
>>Model directories whose weights were quantized to speed up prediction. They cannot be used as pretrained weights, but can be loaded by PaddleX for prediction
**How to tell them apart**
>> The `status` field in the model directory's model.yml distinguishes the model types: 'Normal', 'Prune', 'Infer' and 'Quant' denote normal training saves, pruning training saves, exported deployment models and quantized models respectively
## 7. Training takes too long, or is too slow. How do I speed it up?
> 1. Training speed depends on the chosen model size and the configured `batch_size`; model sizes are listed in the [model zoo](model_zoo.md). In general, the larger the model, the slower the training;
> 2. Besides per-step speed, the total training time depends on the configured number of epochs `num_epochs`. You can watch the model's metrics on the validation set to decide whether to terminate training early (set `save_interval_epochs` during training: metrics are then computed on the validation set and the model is saved every `save_interval_epochs` epochs);
## 8. How do I set the number of epochs?
> 1. If you are unsure, set the number of epochs higher and also set `save_interval_epochs`, so that the model is evaluated on the validation set and saved at that interval. Based on the validation metrics at different epochs, you can judge whether the model has converged and stop training once it has
## 9. I only have a CPU and no GPU. How do I speed up training?
> Without a GPU, you can train with multiple CPUs depending on your machine's configuration; see [multi-CPU/GPU training](appendix/gpu_configure.md)
## 10. The machine is offline and training fails while downloading the pretrained model. What can I do?
> Prepare the pretrained model in advance by other means, then set `pretrain_weights` to it during training; see [training without internet access](how_to_offline_run.md)
## 11. Every new training run downloads the pretrained model again. How do I download it only once?
> 1. Solve it the same way as in question 10
> 2. Set `paddlex.pretrain_dir` before each training run, e.g. `paddlex.pretrain_dir='/usrname/paddlex'`. Downloaded pretrained models are then stored under `/usrname/paddlex`, and models already present in that directory are not downloaded again (see the sketch at the end of this FAQ)
## 12. PaddleX GUI fails to start with "Failed to execute script PaddleX". How do I fix it?
> 1. Check whether the path of the PaddleX program on the target machine contains Chinese characters. Chinese paths are currently unsupported; try moving the program to a directory with only ASCII characters.
> 2. On Windows 7 or Windows Server 2012, the cause is that DLLs OpenCV depends on, such as MFPlat.DLL/MF.dll/MFReadWrite.dll, are missing. Install the Desktop Experience feature as follows: open Server Manager via "My Computer" -> "Properties" -> "Manage", click "Manage" in the top-right corner and choose "Add Roles and Features". Under "Server Selection" -> "Features", scroll to the bottom, expand "User Interfaces and Infrastructure", check "Desktop Experience" and click "Install"; after installation completes, try running PaddleX again.
> 3. Check whether another PaddleX program or process is running on the target machine; if so, exit it or reboot the machine and check whether the problem is resolved
> 4. Make sure the user running the program has administrator privileges; if not, try running it as administrator
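A minimal sketch of the cached/offline pretrained-weights setup from questions 10 and 11; all paths and dataset files below are illustrative placeholders:

```python
import paddlex as pdx
from paddlex.cls import transforms

# Cache downloaded pretrained weights in one fixed directory so they
# are fetched only once across training runs (question 11).
pdx.pretrain_dir = '/usrname/paddlex'

train_transforms = transforms.Compose([
    transforms.RandomCrop(crop_size=224),
    transforms.Normalize()])

# Paths below are illustrative placeholders.
train_dataset = pdx.datasets.ImageNet(
    data_dir='dataset',
    file_list='dataset/train_list.txt',
    label_list='dataset/labels.txt',
    transforms=train_transforms)

model = pdx.cls.ResNet50(num_classes=10)
model.train(
    num_epochs=10,
    train_dataset=train_dataset,
    # pretrain_weights may instead point to weights prepared offline
    # (question 10), e.g. './pretrained/ResNet50_pretrained'.
    pretrain_weights='IMAGENET',
    save_dir='output/resnet50')
```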
# Dataset Loading

## paddlex.datasets.ImageNet
> **For image classification models**
```
paddlex.datasets.ImageNet(data_dir, file_list, label_list, transforms=None, num_workers='auto', buffer_size=100, parallel_method='thread', shuffle=False)
```
Reads a classification dataset in ImageNet format and applies the corresponding processing to each sample. The ImageNet dataset format is described in [Dataset format description](../data/format/index.html).

Example: [code file](https://github.com/PaddlePaddle/PaddleX/blob/develop/tutorials/train/image_classification/mobilenetv2.py)

> **Parameters**
> > * **data_dir** (str): directory containing the dataset.
> > * **file_list** (str): path of the file listing the image files and their class ids (each line holds a path relative to `data_dir`).
> > * **label_list** (str): path of the file describing the classes contained in the dataset.
> > * **transforms** (paddlex.cls.transforms): preprocessing/augmentation operators applied to each sample; see [paddlex.cls.transforms](./transforms/cls_transforms.md).
> > * **num_workers** (int|str): number of threads or processes used to preprocess the samples. Defaults to 'auto': if half the number of CPU cores is greater than 8, `num_workers` is set to 8, otherwise to half the number of CPU cores.
> > * **buffer_size** (int): length, in samples, of the preprocessing queue. Defaults to 100.
> > * **parallel_method** (str): parallelization method for preprocessing, either 'thread' or 'process'. Defaults to 'process' (on Windows and Mac, 'thread' is forced and this parameter has no effect).
> > * **shuffle** (bool): whether to shuffle the samples of the dataset. Defaults to False.

## paddlex.datasets.VOCDetection
> **For object detection models**
```
paddlex.datasets.VOCDetection(data_dir, file_list, label_list, transforms=None, num_workers='auto', buffer_size=100, parallel_method='thread', shuffle=False)
```
> Reads a detection dataset in PascalVOC format and applies the corresponding processing to each sample. The PascalVOC dataset format is described in [Dataset format description](../data/format/index.html).

> Example: [code file](https://github.com/PaddlePaddle/PaddleX/blob/develop/tutorials/train/object_detection/yolov3_darknet53.py)

> **Parameters**
> > * **data_dir** (str): directory containing the dataset.
> > * **file_list** (str): path of the file listing the image files and their corresponding annotation files (each line holds a path relative to `data_dir`).
> > * **label_list** (str): path of the file describing the classes contained in the dataset.
> > * **transforms** (paddlex.det.transforms): preprocessing/augmentation operators applied to each sample; see [paddlex.det.transforms](./transforms/det_transforms.md).
> > * **num_workers** (int|str): number of threads or processes used to preprocess the samples. Defaults to 'auto': if half the number of CPU cores is greater than 8, `num_workers` is set to 8, otherwise to half the number of CPU cores.
> > * **buffer_size** (int): length, in samples, of the preprocessing queue. Defaults to 100.
> > * **parallel_method** (str): parallelization method for preprocessing, either 'thread' or 'process'. Defaults to 'process' (on Windows and Mac, 'thread' is forced and this parameter has no effect).
> > * **shuffle** (bool): whether to shuffle the samples of the dataset. Defaults to False.

> [Optional] Background images without any ground-truth objects can be added during training to reduce false detections on background. After constructing the VOCDetection instance, call its member function `add_negative_samples` (a usage sketch appears at the end of this page):
> ```
> add_negative_samples(image_dir)
> ```
> > Example: [code](../../tuning_strategy/detection/negatives_training.html#id4)

> > **Parameters**
> > > * **image_dir** (str): directory containing the background images.

## paddlex.datasets.CocoDetection
> **For instance segmentation / object detection models**
```
paddlex.datasets.CocoDetection(data_dir, ann_file, transforms=None, num_workers='auto', buffer_size=100, parallel_method='thread', shuffle=False)
```
> Reads a detection dataset in MSCOCO format and applies the corresponding processing to each sample; datasets in this format can also be used to train instance segmentation models. The MSCOCO dataset format is described in [Dataset format description](../data/format/index.html).

> Example: [code file](https://github.com/PaddlePaddle/PaddleX/blob/develop/tutorials/train/instance_segmentation/mask_rcnn_r50_fpn.py)

> **Parameters**
> > * **data_dir** (str): directory containing the dataset.
> > * **ann_file** (str): annotation file of the dataset, a standalone json file.
> > * **transforms** (paddlex.det.transforms): preprocessing/augmentation operators applied to each sample; see [paddlex.det.transforms](./transforms/det_transforms.md).
> > * **num_workers** (int|str): number of threads or processes used to preprocess the samples. Defaults to 'auto': if half the number of CPU cores is greater than 8, `num_workers` is set to 8, otherwise to half the number of CPU cores.
> > * **buffer_size** (int): length, in samples, of the preprocessing queue. Defaults to 100.
> > * **parallel_method** (str): parallelization method for preprocessing, either 'thread' or 'process'. Defaults to 'process' (on Windows and Mac, 'thread' is forced and this parameter has no effect).
> > * **shuffle** (bool): whether to shuffle the samples of the dataset. Defaults to False.

> [Optional] Background images without any ground-truth objects can be added during training to reduce false detections on background. After constructing the CocoDetection instance, call its member function `add_negative_samples`:
> ```
> add_negative_samples(image_dir)
> ```
> > Example: [code](../../tuning_strategy/detection/negatives_training.html#id4)

> > **Parameters**
> > > * **image_dir** (str): directory containing the background images.

## paddlex.datasets.SegDataset
> **For semantic segmentation models**
```
paddlex.datasets.SegDataset(data_dir, file_list, label_list, transforms=None, num_workers='auto', buffer_size=100, parallel_method='thread', shuffle=False)
```
> Reads a semantic segmentation dataset and applies the corresponding processing to each sample. The dataset format for semantic segmentation is described in [Dataset format description](../data/format/index.html).

> Example: [code file](https://github.com/PaddlePaddle/PaddleX/blob/develop/tutorials/train/semantic_segmentation/unet.py)

> **Parameters**
> > * **data_dir** (str): directory containing the dataset.
> > * **file_list** (str): path of the file listing the image files and their corresponding annotation files (each line holds a path relative to `data_dir`).
> > * **label_list** (str): path of the file describing the classes contained in the dataset.
> > * **transforms** (paddlex.seg.transforms): preprocessing/augmentation operators applied to each sample; see [paddlex.seg.transforms](./transforms/seg_transforms.md).
> > * **num_workers** (int|str): number of threads or processes used to preprocess the samples. Defaults to 'auto': if half the number of CPU cores is greater than 8, `num_workers` is set to 8, otherwise to half the number of CPU cores.
> > * **buffer_size** (int): length, in samples, of the preprocessing queue. Defaults to 100.
> > * **parallel_method** (str): parallelization method for preprocessing, either 'thread' or 'process'. Defaults to 'process' (on Windows and Mac, 'thread' is forced and this parameter has no effect).
> > * **shuffle** (bool): whether to shuffle the samples of the dataset. Defaults to False.

## paddlex.datasets.EasyDataCls
> **For image classification models**
```
paddlex.datasets.EasyDataCls(data_dir, file_list, label_list, transforms=None, num_workers='auto', buffer_size=100, parallel_method='thread', shuffle=False)
```
> Reads an image classification dataset annotated on the EasyData platform and applies the corresponding processing to each sample.

> **Parameters**
> > * **data_dir** (str): directory containing the dataset.
> > * **file_list** (str): path of the file listing the image files and their corresponding annotation files (each line holds a path relative to `data_dir`).
> > * **label_list** (str): path of the file describing the classes contained in the dataset.
> > * **transforms** (paddlex.cls.transforms): preprocessing/augmentation operators applied to each sample; see [paddlex.cls.transforms](./transforms/cls_transforms.md).
> > * **num_workers** (int|str): number of threads or processes used to preprocess the samples. Defaults to 'auto': if half the number of CPU cores is greater than 8, `num_workers` is set to 8, otherwise to half the number of CPU cores.
> > * **buffer_size** (int): length, in samples, of the preprocessing queue. Defaults to 100.
> > * **parallel_method** (str): parallelization method for preprocessing, either 'thread' or 'process'. Defaults to 'process' (on Windows and Mac, 'thread' is forced and this parameter has no effect).
> > * **shuffle** (bool): whether to shuffle the samples of the dataset. Defaults to False.

## paddlex.datasets.EasyDataDet
> **For object detection / instance segmentation models**
```
paddlex.datasets.EasyDataDet(data_dir, file_list, label_list, transforms=None, num_workers='auto', buffer_size=100, parallel_method='thread', shuffle=False)
```
> Reads an object detection / instance segmentation dataset in EasyData format and applies the corresponding processing to each sample; datasets in this format can also be used to train instance segmentation models.

> **Parameters**
> > * **data_dir** (str): directory containing the dataset.
> > * **file_list** (str): path of the file listing the image files and their corresponding annotation files (each line holds a path relative to `data_dir`).
> > * **label_list** (str): path of the file describing the classes contained in the dataset.
> > * **transforms** (paddlex.det.transforms): preprocessing/augmentation operators applied to each sample; see [paddlex.det.transforms](./transforms/det_transforms.md).
> > * **num_workers** (int|str): number of threads or processes used to preprocess the samples. Defaults to 'auto': if half the number of CPU cores is greater than 8, `num_workers` is set to 8, otherwise to half the number of CPU cores.
> > * **buffer_size** (int): length, in samples, of the preprocessing queue. Defaults to 100.
> > * **parallel_method** (str): parallelization method for preprocessing, either 'thread' or 'process'. Defaults to 'process' (on Windows and Mac, 'thread' is forced and this parameter has no effect).
> > * **shuffle** (bool): whether to shuffle the samples of the dataset. Defaults to False.

> [Optional] Background images without any ground-truth objects can be added during training to reduce false detections on background. After constructing the EasyDataDet instance, call its member function `add_negative_samples`:
> ```
> add_negative_samples(image_dir)
> ```
> > Example: [code](../../tuning_strategy/detection/negatives_training.html#id4)

> > **Parameters**
> > > * **image_dir** (str): directory containing the background images.

## paddlex.datasets.EasyDataSeg
> **For semantic segmentation models**
```
paddlex.datasets.EasyDataSeg(data_dir, file_list, label_list, transforms=None, num_workers='auto', buffer_size=100, parallel_method='thread', shuffle=False)
```
> Reads a semantic segmentation dataset in EasyData format and applies the corresponding processing to each sample.

> **Parameters**
> > * **data_dir** (str): directory containing the dataset.
> > * **file_list** (str): path of the file listing the image files and their corresponding annotation files (each line holds a path relative to `data_dir`).
> > * **label_list** (str): path of the file describing the classes contained in the dataset.
> > * **transforms** (paddlex.seg.transforms): preprocessing/augmentation operators applied to each sample; see [paddlex.seg.transforms](./transforms/seg_transforms.md).
> > * **num_workers** (int|str): number of threads or processes used to preprocess the samples. Defaults to 'auto': if half the number of CPU cores is greater than 8, `num_workers` is set to 8, otherwise to half the number of CPU cores.
> > * **buffer_size** (int): length, in samples, of the preprocessing queue. Defaults to 100.
> > * **parallel_method** (str): parallelization method for preprocessing, either 'thread' or 'process'. Defaults to 'process' (on Windows and Mac, 'thread' is forced and this parameter has no effect).
> > * **shuffle** (bool): whether to shuffle the samples of the dataset. Defaults to False.
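A minimal sketch that wires a transform pipeline into one of the readers above; the paths are illustrative placeholders and the transform list is only an example:

```python
import paddlex as pdx
from paddlex.det import transforms

# An example preprocessing/augmentation pipeline for detection.
train_transforms = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.Normalize(),
    transforms.ResizeByShort(short_size=800, max_size=1333),
    transforms.Padding(coarsest_stride=32)])

# Directory layout follows the PascalVOC format described above.
train_dataset = pdx.datasets.VOCDetection(
    data_dir='dataset',
    file_list='dataset/train_list.txt',
    label_list='dataset/labels.txt',
    transforms=train_transforms,
    shuffle=True)

# Optional: add unannotated background images to reduce false detections.
train_dataset.add_negative_samples('dataset/background_images')
```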
# Dataset Conversion
## labelme2voc
```python
pdx.tools.labelme2voc(image_dir, json_dir, dataset_save_dir)
```
Converts a dataset annotated with LabelMe into a VOC dataset. A usage sketch appears at the end of this page.
> **Parameters**
> > * **image_dir** (str): path where the image files are stored.
> > * **json_dir** (str): path where the json file corresponding to each image is stored.
> > * **dataset_save_dir** (str): path where the converted dataset is saved.
## Other dataset conversions
### easydata2imagenet
```python
pdx.tools.easydata2imagenet(image_dir, json_dir, dataset_save_dir)
```
### easydata2voc
```python
pdx.tools.easydata2voc(image_dir, json_dir, dataset_save_dir)
```
### easydata2coco
```python
pdx.tools.easydata2coco(image_dir, json_dir, dataset_save_dir)
```
### easydata2seg
```python
pdx.tools.easydata2seg(image_dir, json_dir, dataset_save_dir)
```
### labelme2coco
```python
pdx.tools.labelme2coco(image_dir, json_dir, dataset_save_dir)
```
### labelme2seg
```python
pdx.tools.labelme2seg(image_dir, json_dir, dataset_save_dir)
```
### jingling2seg
```python
pdx.tools.jingling2seg(image_dir, json_dir, dataset_save_dir)
```
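A minimal usage sketch; the directory names are illustrative, and all converters above share the same three-argument signature:

```python
import paddlex as pdx

# Convert LabelMe annotations into a VOC-format detection dataset.
pdx.tools.labelme2voc(
    image_dir='labelme_data/images',
    json_dir='labelme_data/annotations',
    dataset_save_dir='converted_voc')
```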
Datasets
============================

PaddleX supports the mainstream CV dataset formats as well as the annotation format of the `EasyData <https://ai.baidu.com/easydata/>`_ data annotation platform. PaddleX also provides dataset format conversion APIs that cover the LabelMe, Jingling (精灵标注助手) and EasyData annotation formats; see the PaddleX tools API documentation.

The table below maps each dataset format to the tasks it supports.

+------------------------+----------------------+------------------+-----------------------+-----------------------+
| Dataset format         | Image classification | Object detection | Instance segmentation | Semantic segmentation |
+========================+======================+==================+=======================+=======================+
| ImageNet               | √                    | -                | -                     | -                     |
+------------------------+----------------------+------------------+-----------------------+-----------------------+
| VOCDetection           | -                    | √                | -                     | -                     |
+------------------------+----------------------+------------------+-----------------------+-----------------------+
| CocoDetection          | -                    | √                | √                     | -                     |
+------------------------+----------------------+------------------+-----------------------+-----------------------+
| SegDataset             | -                    | -                | -                     | √                     |
+------------------------+----------------------+------------------+-----------------------+-----------------------+
| EasyDataCls            | √                    | -                | -                     | -                     |
+------------------------+----------------------+------------------+-----------------------+-----------------------+
| EasyDataDet            | -                    | √                | √                     | -                     |
+------------------------+----------------------+------------------+-----------------------+-----------------------+
| EasyDataSeg            | -                    | -                | -                     | √                     |
+------------------------+----------------------+------------------+-----------------------+-----------------------+

.. toctree::
   :maxdepth: 2

   classification.md
   detection.md
   semantic_segmentation.md
   dataset_convert.md
## Predictor class

A unified predictor for image classification, object detection, instance segmentation and semantic segmentation that implements high-performance prediction.

```
paddlex.deploy.Predictor(model_dir, use_gpu=False, gpu_id=0, use_mkl=False, use_trt=False, use_glog=False, memory_optimize=True)
```

> **Parameters**
>
> * **model_dir** (str): path of a model exported in inference format.
> * **use_gpu** (bool): whether to use the GPU for prediction.
> * **gpu_id** (int): id of the GPU to use.
> * **use_mkl** (bool): whether to use the mkldnn acceleration library.
> * **use_trt** (bool): whether to use the TensorRT inference engine.
> * **use_glog** (bool): whether to print intermediate logs.
> * **memory_optimize** (bool): whether to optimize memory usage.

> ### Example
>
> ```
> import paddlex
>
> model = paddlex.deploy.Predictor(model_dir, use_gpu=True)
> result = model.predict(image_file)
> ```

### predict

```
predict(image, topk=1)
```

Single-image prediction interface.

> **Parameters**
>
> > * **image** (str|np.ndarray): path of the image to predict, or a numpy array (HWC layout, BGR format).
> > * **topk** (int): used in image classification; the top-k most likely classes are returned.

### batch_predict

```
batch_predict(image_list, topk=1, thread_num=2)
```

Batch image prediction interface.

> **Parameters**
>
> > * **image_list** (list|tuple): the images in the list (or tuple) are predicted together; elements may be image paths or numpy arrays (HWC layout, BGR format).
> > * **topk** (int): used in image classification; the top-k most likely classes are returned.
> > * **thread_num** (int): number of threads used to preprocess the images concurrently.
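A minimal deployment sketch; the model directory and image paths are illustrative, and `use_gpu=True` assumes a CUDA-capable device is available:

```python
import paddlex

# model_dir must contain a model exported in inference format.
predictor = paddlex.deploy.Predictor('./inference_model', use_gpu=True)

# Predict two images as one batch, preprocessing them with 2 threads.
results = predictor.batch_predict(
    image_list=['images/1.jpg', 'images/2.jpg'],
    thread_num=2)
```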
API Reference
============================

.. toctree::
   :maxdepth: 2

   transforms/index.rst
   datasets.md
   models/index.rst
   slim.md
   load_model.md
   visualize.md
   deploy.md
   interpret.md
# Model Interpretability

PaddleX can currently visualize explanations of image classification results, supporting the LIME and NormLIME interpretability algorithms.

## paddlex.interpret.lime
> **Visualize LIME interpretability results**
```
paddlex.interpret.lime(img_file,
                       model,
                       num_samples=3000,
                       batch_size=50,
                       save_dir='./')
```
Uses the LIME algorithm to visualize the interpretability of the model's predictions.

LIME (Local Interpretable Model-agnostic Explanations) can explain any model. Its idea is to sample randomly in the space around an input, feed each sample through the original model to obtain an output, and then fit this set of inputs and outputs with a simple, interpretable model (for example a linear regression). The resulting weight of each input dimension explains the model.

**Note:** interpretability visualization currently supports classification models only.

### Parameters
>* **img_file** (str): path of the image to predict.
>* **model** (paddlex.cv.models): a PaddleX model.
>* **num_samples** (int): number of samples LIME uses to learn the linear model; defaults to 3000.
>* **batch_size** (int): batch size for prediction; defaults to 50.
>* **save_dir** (str): path where the visualization (saved as a png file) and intermediate files are stored.

### Example
> See the [code](https://github.com/PaddlePaddle/PaddleX/blob/develop/tutorials/interpret/lime.py) for visualizing interpretability results.
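A minimal usage sketch, assuming a trained PaddleX classification model under an illustrative path:

```python
import paddlex as pdx

# Load a trained classification model saved by PaddleX (illustrative path).
model = pdx.load_model('output/mobilenetv2/best_model')

# Explain one prediction with LIME; the visualization is written to save_dir.
pdx.interpret.lime('test.jpg', model, num_samples=3000, batch_size=50, save_dir='./')
```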
## paddlex.interpret.normlime
> **Visualize NormLIME interpretability results**
```
paddlex.interpret.normlime(img_file,
                           model,
                           dataset=None,
                           num_samples=3000,
                           batch_size=50,
                           save_dir='./',
                           normlime_weights_file=None)
```
Uses the NormLIME algorithm to visualize the interpretability of the model's predictions.

NormLIME derives a global explanation from a number of samples. Because NormLIME is computationally expensive, a simplified scheme is used here: features are extracted from a number of test samples (currently all test samples by default) and mapped into a common feature space; a linear regression is then fitted with these features as input and the model output as target, which yields a global relationship between input and output. When explaining a test sample, this global NormLIME explanation is used to filter the LIME result, making the final visualization more stable.

**Note:** interpretability visualization currently supports classification models only.

### Parameters
>* **img_file** (str): path of the image to predict.
>* **model** (paddlex.cv.models): a PaddleX model.
>* **dataset** (paddlex.datasets): dataset reader. Defaults to None.
>* **num_samples** (int): number of samples LIME uses to learn the linear model; defaults to 3000.
>* **batch_size** (int): batch size for prediction; defaults to 50.
>* **save_dir** (str): path where the visualization (saved as a png file) and intermediate files are stored.
>* **normlime_weights_file** (str): NormLIME initialization file. If it does not exist, it is computed once and saved at this path; if it exists, it is loaded directly.

**Note:** `dataset` reads a dataset; it should not be too large, otherwise the computation takes long, but it should contain data of all classes. NormLIME interpretability visualization currently supports classification models only.

### Example
> See the [code](https://github.com/PaddlePaddle/PaddleX/blob/develop/tutorials/interpret/normlime.py) for visualizing interpretability results.
# Model Loading

PaddleX provides a unified model-loading interface that loads models saved by PaddleX, which can then be evaluated on a validation set or used to predict test images.

## paddlex.load_model
> **Load a model saved by PaddleX**
```
paddlex.load_model(model_dir)
```
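A minimal usage sketch; the model directory is an illustrative placeholder:

```python
import paddlex as pdx

# Load a model directory saved by PaddleX during training.
model = pdx.load_model('output/resnet50/best_model')

# Predict a test image; for a classifier this returns the top
# candidate classes with their scores.
result = model.predict('test.jpg')
print(result)
```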
# Image Classification

## paddlex.cls.ResNet50

```python
paddlex.cls.ResNet50(num_classes=1000)
```

> - **num_classes** (int): number of classes. Defaults to 1000.

### train

```python
train(self, num_epochs, train_dataset, train_batch_size=64, eval_dataset=None, save_interval_epochs=1, log_interval_steps=2, save_dir='output', pretrain_weights='IMAGENET', optimizer=None, learning_rate=0.025, warmup_steps=0, warmup_start_lr=0.0, lr_decay_epochs=[30, 60, 90], lr_decay_gamma=0.1, use_vdl=False, sensitivities_file=None, eval_metric_loss=0.05, early_stop=False, early_stop_patience=5, resume_checkpoint=None)
```

> > - **early_stop_patience** (int): when early stopping is enabled, training is terminated if the validation accuracy drops or stays flat for `early_stop_patience` consecutive epochs. Defaults to 5.
> > - **resume_checkpoint** (str): path of a previously saved model from which to resume training. If None, training is not resumed. Defaults to None.

### evaluate

```python
evaluate(self, eval_dataset, batch_size=1, epoch_id=None, return_details=False)
```

> > - **dict**: when return_details is False, a dict is returned with the keys 'acc1' and 'acc5', the top-1 and top-5 accuracies.
> > - **tuple** (metrics, eval_details): when `return_details` is True, an additional dict is returned with the keys 'true_labels' and 'pred_scores', the ground-truth class ids and the predicted score of each class.

### predict

```python
predict(self, img_file, transforms=None, topk=5)
```

> **Parameters**
>
> > - **img_file** (str|np.ndarray): path of the image to predict, or a numpy array (HWC layout, BGR format).
> > - **transforms** (paddlex.cls.transforms): data preprocessing operators.
> > - **topk** (int): the top-k most likely classes returned at prediction time.

> **Return value**
>
> > - **list**: a list of dicts with the keys 'category_id', 'category' and 'score', corresponding to the predicted class id, class label and score.

### batch_predict

```python
batch_predict(self, img_file_list, transforms=None, topk=5, thread_num=2)
```

> Batch prediction interface for classification models. Note that the image preprocessing pipeline used at prediction time is stored in `ResNet50.test_transforms` and `ResNet50.eval_transforms` only if eval_dataset was defined during training. If eval_dataset was not defined at training time, you must define test_transforms yourself and pass it to the `predict` interface when predicting.

> **Parameters**
>
> > - **img_file_list** (list|tuple): the images in the list (or tuple) are predicted together; elements may be image paths or numpy arrays (HWC layout, BGR format).
> > - **transforms** (paddlex.cls.transforms): data preprocessing operators.
> > - **topk** (int): the top-k most likely classes returned at prediction time.
> > - **thread_num** (int): number of threads used to preprocess the images concurrently.

> **Return value**
>
> > - **list**: one element per image, each itself a list of dicts with the keys 'category_id', 'category' and 'score', corresponding to the predicted class id, class label and score.

## Other classification models

PaddleX provides 22 classification models in total. All of them expose the same `train`, `evaluate` and `predict` interfaces as `ResNet50`; model performance can be found in the [model zoo](https://paddlex.readthedocs.io/zh_CN/latest/appendix/model_zoo.html).

| Model | Interface |
| :---------------- | :---------------------- |
| ResNet18 | paddlex.cls.ResNet18(num_classes=1000) |
| ResNet34 | paddlex.cls.ResNet34(num_classes=1000) |
| ResNet50 | paddlex.cls.ResNet50(num_classes=1000) |
| ResNet50_vd | paddlex.cls.ResNet50_vd(num_classes=1000) |
| ResNet50_vd_ssld | paddlex.cls.ResNet50_vd_ssld(num_classes=1000) |
| ResNet101 | paddlex.cls.ResNet101(num_classes=1000) |
| ResNet101_vd | paddlex.cls.ResNet101_vd(num_classes=1000) |
| ResNet101_vd_ssld | paddlex.cls.ResNet101_vd_ssld(num_classes=1000) |
| DarkNet53 | paddlex.cls.DarkNet53(num_classes=1000) |
| MobileNetV1 | paddlex.cls.MobileNetV1(num_classes=1000) |
| MobileNetV2 | paddlex.cls.MobileNetV2(num_classes=1000) |
| MobileNetV3_small | paddlex.cls.MobileNetV3_small(num_classes=1000) |
| MobileNetV3_small_ssld | paddlex.cls.MobileNetV3_small_ssld(num_classes=1000) |
| MobileNetV3_large | paddlex.cls.MobileNetV3_large(num_classes=1000) |
| MobileNetV3_large_ssld | paddlex.cls.MobileNetV3_large_ssld(num_classes=1000) |
| Xception65 | paddlex.cls.Xception65(num_classes=1000) |
| Xception71 | paddlex.cls.Xception71(num_classes=1000) |
| ShuffleNetV2 | paddlex.cls.ShuffleNetV2(num_classes=1000) |
| DenseNet121 | paddlex.cls.DenseNet121(num_classes=1000) |
| DenseNet161 | paddlex.cls.DenseNet161(num_classes=1000) |
| DenseNet201 | paddlex.cls.DenseNet201(num_classes=1000) |
| HRNet_W18 | paddlex.cls.HRNet_W18(num_classes=1000) |
| AlexNet | paddlex.cls.AlexNet(num_classes=1000) |
Vision Models
============================

PaddleX currently supports `four vision task solutions <../../cv_solutions.html>`_: image classification, object detection, instance segmentation and semantic segmentation. For each task, PaddleX provides one or more models; choose according to your needs and application scenario.

.. toctree::
   :maxdepth: 3

   classification.md
   detection.md
# Instance Segmentation

## MaskRCNN

```python
paddlex.det.MaskRCNN(num_classes=81, backbone='ResNet50', with_fpn=True, aspect_ratios=[0.5, 1.0, 2.0], anchor_sizes=[32, 64, 128, 256, 512])
```

> > - **aspect_ratios** (list): candidate aspect ratios for anchor generation. Defaults to [0.5, 1.0, 2.0].
> > - **anchor_sizes** (list): candidate sizes for anchor generation. Defaults to [32, 64, 128, 256, 512].

#### train

```python
train(self, num_epochs, train_dataset, train_batch_size=1, eval_dataset=None, save_interval_epochs=1, log_interval_steps=20, save_dir='output', pretrain_weights='IMAGENET', optimizer=None, learning_rate=1.0/800, warmup_steps=500, warmup_start_lr=1.0 / 2400, lr_decay_epochs=[8, 11], lr_decay_gamma=0.1, metric=None, use_vdl=False, early_stop=False, early_stop_patience=5, resume_checkpoint=None)
```

> > - **early_stop_patience** (int): when early stopping is enabled, training is terminated if the validation accuracy drops or stays flat for `early_stop_patience` consecutive epochs. Defaults to 5.
> > - **resume_checkpoint** (str): path of a previously saved model from which to resume training. If None, training is not resumed. Defaults to None.

#### evaluate

```python
evaluate(self, eval_dataset, batch_size=1, epoch_id=None, metric=None, return_details=False)
```

> **Return value**
>
> > - **tuple** (metrics, eval_details) | **dict** (metrics): when `return_details` is True, (metrics, eval_details) is returned; when return_details is False, only metrics is returned. metrics is a dict with the keys 'bbox_mmap' and 'segm_mmap' (or 'bbox_map' and 'segm_map'): the mean average precision of predicted boxes and of segmentation regions averaged over the IoU thresholds (mmAP), or the mean average precision (mAP). eval_details is a dict with the keys: 'bbox', a list of box predictions, each consisting of image id, predicted box class id, box coordinates and box score; 'mask', a list of region predictions, each consisting of image id, predicted region class id, region coordinates and region score; and 'gt', information about the ground-truth boxes and regions.

#### predict

```python
predict(self, img_file, transforms=None)
```

> **Parameters**
>
> > - **img_file** (str|np.ndarray): path of the image to predict, or a numpy array (HWC layout, BGR format).
> > - **transforms** (paddlex.det.transforms): data preprocessing operators.
>
> **Return value**
>
> > - **list**: a list of prediction results. Each element is a dict with the keys 'bbox', 'mask', 'category', 'category_id' and 'score', corresponding to the box coordinates, the mask, the class, the class id and the confidence of each predicted object. Box coordinates are [xmin, ymin, w, h], i.e. the top-left x and y coordinates and the width and height of the box. The mask is a binary map of the original image size in which 1 marks pixels belonging to the predicted class and 0 marks background.

#### batch_predict

```python
batch_predict(self, img_file_list, transforms=None, thread_num=2)
```

> Batch prediction interface for MaskRCNN. Note that the image preprocessing pipeline used at prediction time is stored in MaskRCNN.test_transforms and MaskRCNN.eval_transforms only if eval_dataset was defined during training. If eval_dataset was not defined at training time, you must define test_transforms yourself and pass it to the predict interface when predicting.

> **Parameters**
>
> > - **img_file_list** (list|tuple): the images in the list (or tuple) are predicted together; elements may be image paths or numpy arrays (HWC layout, BGR format).
> > - **transforms** (paddlex.det.transforms): data preprocessing operators.
> > - **thread_num** (int): number of threads used to preprocess the images concurrently.
>
> **Return value**
>
> > - **list**: one element per image, each itself a list of prediction results. In each image's list, every element is a dict with the keys 'bbox', 'mask', 'category', 'category_id' and 'score', corresponding to the box coordinates, the mask, the class, the class id and the confidence of each predicted object. Box coordinates are [xmin, ymin, w, h], i.e. the top-left x and y coordinates and the width and height of the box. The mask is a binary map of the original image size in which 1 marks pixels belonging to the predicted class and 0 marks background.
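A minimal prediction sketch; the model path and image names are illustrative placeholders, and `paddlex.det.visualize` is assumed here as the detection/segmentation visualization helper:

```python
import paddlex as pdx

# Load a trained MaskRCNN model saved by PaddleX (illustrative path).
model = pdx.load_model('output/mask_rcnn_r50_fpn/best_model')

# Predict two images as one batch with 2 preprocessing threads.
results = model.batch_predict(['street_1.jpg', 'street_2.jpg'], thread_num=2)

# Draw boxes and masks with scores above 0.5 for the first image.
pdx.det.visualize('street_1.jpg', results[0], threshold=0.5, save_dir='./output')
```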
Data Processing and Augmentation
================================

transforms provides the data preprocessing and data augmentation interfaces for PaddleX model training.
Anaconda is an open-source Python distribution that includes conda, Python, and over 180 scientific packages together with their dependencies.

### Step 2: Installation

Run the downloaded installer (with the .exe suffix) and follow the wizard to complete the installation; you may change the installation directory (see the figure below).

![](images/anaconda_windows.png)

### Step 3: Usage

- Click the Windows icon at the bottom-left corner of the system and open: All Programs -> Anaconda3/2 (64-bit) -> Anaconda Prompt