diff --git a/configs/picodet/README.md b/configs/picodet/README.md
index 235b494e03845af9582f80cb55f51c34ebf5a9e7..cd25cfe64f0beec341049f8b8038919e707636b6 100644
--- a/configs/picodet/README.md
+++ b/configs/picodet/README.md
@@ -3,19 +3,19 @@
![](../../docs/images/picedet_demo.jpeg)
## Introduction
-We developed a series of lightweight models, which named `PP-PicoDet`. Because of its excellent performance, it is very suitable for deployment on mobile or CPU. For more details, please refer to our [report on arXiv](https://arxiv.org/abs/2111.00902).
+We developed a series of lightweight models named `PP-PicoDet`. Thanks to their excellent performance, these models are well suited for deployment on mobile devices and CPUs. For more details, please refer to our [report on arXiv](https://arxiv.org/abs/2111.00902).
- 🌟 Higher mAP: the **first** object detector to surpass **30+** mAP(0.5:0.95) within 1M parameters at an input size of 416.
- 🚀 Lower latency: 150 FPS on mobile ARM CPU.
- 😊 Deployment friendly: supports PaddleLite/MNN/NCNN/OpenVINO and provides C++/Python/Android implementations.
-- 😍 Advanced algorithm: use the most advanced algorithms and innovate, such as ESNet, CSP-PAN, SimOTA with VFL, etc.
+- 😍 Advanced algorithms: use state-of-the-art algorithms together with our own innovations, such as ESNet, CSP-PAN, SimOTA with VFL, etc.
-### Comming soon
+### Coming Soon
- [ ] More model series, such as smaller or larger models.
- [ ] Pretrained models for more scenarios.
- [ ] More requested features.
@@ -35,7 +35,7 @@ We developed a series of lightweight models, which named `PP-PicoDet`. Because o
| PicoDet-L | 416*416 | 36.6 | 52.5 | 3.30 | 3.76 | 23.36 | **21.85** | [model](https://paddledet.bj.bcebos.com/models/picodet_l_416_coco.pdparams) | [log](https://paddledet.bj.bcebos.com/logs/train_picodet_l_416_coco.log) | [config](https://github.com/PaddlePaddle/PaddleDetection/tree/release/2.3/configs/picodet/picodet_l_416_coco.yml) |
| PicoDet-L | 640*640 | 40.9 | 57.6 | 3.30 | 8.91 | 54.11 | **50.55** | [model](https://paddledet.bj.bcebos.com/models/picodet_l_640_coco.pdparams) | [log](https://paddledet.bj.bcebos.com/logs/train_picodet_l_640_coco.log) | [config](https://github.com/PaddlePaddle/PaddleDetection/tree/release/2.3/configs/picodet/picodet_l_640_coco.yml) |
-#### More config
+#### More Configs
| Model | Input size | mAP<sup>val</sup><br>0.5:0.95 | mAP<sup>val</sup><br>0.5 | Params<br>(M) | FLOPS<br>(G) | Latency [NCNN](#latency)<br>(ms) | Latency [Lite](#latency)<br>(ms) | download | config |
| :--------------------------- | :--------: | :---------------------: | :----------------: | :----------------: | :---------------: | :-----------------------------: | :-----------------------------: | :-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------: | :------------------------------------------------------------------------------------------------------------------------------------------- |
@@ -52,10 +52,26 @@ We developed a series of lightweight models, which named `PP-PicoDet`. Because o
+#### Benchmark of Other Models
+
+| Model | Input size | mAP<sup>val</sup><br>0.5:0.95 | mAP<sup>val</sup><br>0.5 | Params<br>(M) | FLOPS<br>(G) | Latency [NCNN](#latency)<br>(ms) |
+| :-------- | :--------: | :---------------------: | :----------------: | :----------------: | :---------------: | :-----------------------------: |
+| YOLOv3-Tiny | 416*416 | 16.6 | 33.1 | 8.86 | 5.62 | 25.42 |
+| YOLOv4-Tiny | 416*416 | 21.7 | 40.2 | 6.06 | 6.96 | 23.69 |
+| PP-YOLO-Tiny | 320*320 | 20.6 | - | 1.08 | 0.58 | 6.75 |
+| PP-YOLO-Tiny | 416*416 | 22.7 | - | 1.08 | 1.02 | 10.48 |
+| Nanodet-M | 320*320 | 20.6 | - | 0.95 | 0.72 | 8.71 |
+| Nanodet-M | 416*416 | 23.5 | - | 0.95 | 1.2 | 13.35 |
+| Nanodet-M 1.5x | 416*416 | 26.8 | - | 2.08 | 2.42 | 15.83 |
+| YOLOX-Nano | 416*416 | 25.8 | - | 0.91 | 1.08 | 19.23 |
+| YOLOX-Tiny | 416*416 | 32.8 | - | 5.06 | 6.45 | 32.77 |
+| YOLOv5n | 640*640 | 28.4 | 46.0 | 1.9 | 4.5 | 40.35 |
+| YOLOv5s | 640*640 | 37.2 | 56.0 | 7.2 | 16.5 | 78.05 |
+
## Deployment
-### Export and Convert model
+### Export and Convert Model
1. Export model (click to expand)
@@ -131,14 +147,13 @@ paddle2onnx --model_dir output_inference/picodet_s_320_coco/ \
- [Android demo](https://github.com/JiweiMaster/PP-PicoDet-Android-Demo)
+Android demo visualization:
-## Slim
-
-### quantization
+## Quantization
Requirements:
@@ -176,11 +191,11 @@ python tools/post_quant.py -c configs/picodet/picodet_s_320_coco.yml \
--slim_config configs/slim/post_quant/picodet_s_ptq.yml
```
-- Notes: Now the accuracy of post quant is abnormal and it is being debugged.
+- Note: the accuracy of post-training quantization is currently abnormal; this problem is being investigated.
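Post-training quantization maps FP32 weights to INT8 with a scale factor, and the rounding error introduced by that mapping is one source of the accuracy drop noted above. A minimal NumPy sketch of symmetric per-tensor quantization follows; it is illustrative only and is not PaddleSlim's implementation:

```python
import numpy as np

def quantize_symmetric(w, num_bits=8):
    """Map FP32 weights to signed INT8 with one per-tensor scale."""
    qmax = 2 ** (num_bits - 1) - 1               # 127 for INT8
    scale = np.abs(w).max() / qmax               # per-tensor scale
    q = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an FP32 approximation of the original weights."""
    return q.astype(np.float32) * scale

w = np.array([0.02, -0.51, 1.27, -1.27], dtype=np.float32)
q, s = quantize_symmetric(w)
w_hat = dequantize(q, s)
# per-element rounding error is bounded by scale / 2
max_err = np.max(np.abs(w - w_hat))
```

Quantization-aware training reduces this error by letting the network adapt to the rounded weights during training, which is why the QAT path above tends to preserve accuracy better than pure post-training quantization.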
-## Cite PiocDet
+## Cite PP-PicoDet
If you use PP-PicoDet in your research, please cite our work using the following BibTeX entry:
```
@misc{yu2021pppicodet,
diff --git a/deploy/lite/README.md b/deploy/lite/README.md
index dd3ddc5c91a99b22723f51163e16c2a4b972dea1..1b2adac6ee270bfade6e69a1d84f408ff8cd125b 100644
--- a/deploy/lite/README.md
+++ b/deploy/lite/README.md
@@ -22,9 +22,11 @@ Paddle Lite is Paddle's lightweight inference engine, providing efficient inference for mobile and IoT devices
There are two ways to obtain the inference library:
1. [**Recommended**] Download it directly; the download links are as follows:
- |Platform|Inference library download link|
- |-|-|
- |Android|[arm7](https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.10-rc/inference_lite_lib.android.armv7.clang.c++_static.with_extra.with_cv.tar.gz) / [arm8](https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.10-rc/inference_lite_lib.android.armv8.clang.c++_static.with_extra.with_cv.tar.gz)|
+ |Platform| Architecture | Inference library download link|
+ |-|-|-|
+ |Android| arm7 | [inference_lite_lib](https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.10-rc/inference_lite_lib.android.armv7.clang.c++_static.with_extra.with_cv.tar.gz) |
+ | Android | arm8 | [inference_lite_lib](https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.10-rc/inference_lite_lib.android.armv8.clang.c++_static.with_extra.with_cv.tar.gz) |
+ | Android | arm8(FP16) | [inference_lite_lib](https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.10-rc/inference_lite_lib.android.armv8_clang_c++_static_with_extra_with_cv_with_fp16.tiny_publish.zip) |
**Note**: 1. If you download the inference library from the Paddle-Lite [official documentation](https://paddle-lite.readthedocs.io/zh/latest/quick_start/release_lib.html#android-toolchain-gcc), make sure to choose a download link built with `with_extra=ON,with_cv=ON`. 2. Currently only an Android demo is provided; for iOS, refer to the [Paddle-Lite IOS demo](https://github.com/PaddlePaddle/Paddle-Lite-Demo/tree/master/PaddleLite-ios-demo).
@@ -35,7 +37,10 @@ git clone https://github.com/PaddlePaddle/Paddle-Lite.git
cd Paddle-Lite
# If building from source, it is recommended to compile the inference library from the develop branch
git checkout develop
-./lite/tools/build_android.sh --arch=armv8 --with_cv=ON --with_extra=ON
+# FP32
+./lite/tools/build_android.sh --arch=armv8 --toolchain=clang --with_cv=ON --with_extra=ON
+# FP16
+./lite/tools/build_android.sh --arch=armv8 --toolchain=clang --with_cv=ON --with_extra=ON --with_arm82_fp16=ON
```
**Note**: when compiling Paddle-Lite to obtain the inference library, the two options `--with_cv=ON --with_extra=ON` must be enabled. `--arch` specifies the `arm` version, set to armv8 here. For more build commands, please refer to [this link](https://paddle-lite.readthedocs.io/zh/latest/source_compile/compile_andriod.html#id2).
@@ -131,7 +136,10 @@ python tools/export_model.py -c configs/picodet/picodet_s_320_coco.yml \
-o weights=https://paddledet.bj.bcebos.com/models/picodet_s_320_coco.pdparams --output_dir=output_inference
# Convert the inference model to a Paddle-Lite optimized model
+# FP32
paddle_lite_opt --valid_targets=arm --model_file=output_inference/picodet_s_320_coco/model.pdmodel --param_file=output_inference/picodet_s_320_coco/model.pdiparams --optimize_out=output_inference/picodet_s_320_coco/model
+# FP16
+paddle_lite_opt --valid_targets=arm --model_file=output_inference/picodet_s_320_coco/model.pdmodel --param_file=output_inference/picodet_s_320_coco/model.pdiparams --optimize_out=output_inference/picodet_s_320_coco/model --enable_fp16=true
# Convert the inference model config to JSON format
python deploy/lite/convert_yml_to_json.py output_inference/picodet_s_320_coco/infer_cfg.yml
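At its core, the config conversion step above is a YAML-to-JSON dump of `infer_cfg.yml`. A minimal sketch of the idea, assuming PyYAML is installed (the actual `convert_yml_to_json.py` may also reorganize fields):

```python
import json
import yaml  # PyYAML

def yml_to_json(yml_path, json_path):
    """Load an infer_cfg.yml and write the same config out as JSON."""
    with open(yml_path) as f:
        cfg = yaml.safe_load(f)
    with open(json_path, "w") as f:
        json.dump(cfg, f, indent=2, ensure_ascii=False)
    return cfg

# Hypothetical usage mirroring the command above:
# yml_to_json("output_inference/picodet_s_320_coco/infer_cfg.yml",
#             "output_inference/picodet_s_320_coco/infer_cfg.json")
```

The JSON form exists because the C++ Lite demo parses the config with a JSON reader rather than a YAML parser.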
@@ -144,7 +152,7 @@ python deploy/lite/convert_yml_to_json.py output_inference/picodet_s_320_coco/in
### 2.2 Debugging on a Phone
Some preparation is required first.
-1. Prepare an arm8 Android phone. If the compiled inference library and opt file are armv7, an arm7 phone is needed, and `ARM_ABI = arm7` must be set in the Makefile.
+1. Prepare an arm8 Android phone. If the compiled inference library is armv7, an arm7 phone is needed, and `ARM_ABI=arm7` must be set in the Makefile.
2. Install the ADB tool on your computer for debugging. ADB can be installed as follows:
2.1. Install ADB on a Mac:
diff --git a/docs/images/picodet_android_demo1.jpg b/docs/images/picodet_android_demo1.jpg
index 6f5d2bc6716c2b3edf2fc9bda74756fafc53283d..ecb5dae5135d1912ef65485e31e3b00715a8195b 100644
Binary files a/docs/images/picodet_android_demo1.jpg and b/docs/images/picodet_android_demo1.jpg differ
diff --git a/docs/images/picodet_android_demo2.jpg b/docs/images/picodet_android_demo2.jpg
index 3eb2d77f08d618a6160dcf74cb5447026aa8ad47..e08ffb8b0cb4859a76b62017c6beef951cbc2f85 100644
Binary files a/docs/images/picodet_android_demo2.jpg and b/docs/images/picodet_android_demo2.jpg differ