PaddlePaddle / PaddleClas, commit 0f86c555
Authored May 26, 2023 by gaotingquan
Committed by Tingquan Gao on May 29, 2023
add amp args, use_amp=False
Parent: 2d8346cd
Showing 295 changed files with 3,540 additions and 0 deletions (+3540, -0).
Changed files (each +12, -0):
ppcls/configs/ImageNet/AlexNet/AlexNet.yaml
ppcls/configs/ImageNet/CSPNet/CSPDarkNet53.yaml
ppcls/configs/ImageNet/CSWinTransformer/CSWinTransformer_base_224.yaml
ppcls/configs/ImageNet/CSWinTransformer/CSWinTransformer_base_384.yaml
ppcls/configs/ImageNet/CSWinTransformer/CSWinTransformer_large_224.yaml
ppcls/configs/ImageNet/CSWinTransformer/CSWinTransformer_large_384.yaml
ppcls/configs/ImageNet/CSWinTransformer/CSWinTransformer_small_224.yaml
ppcls/configs/ImageNet/CSWinTransformer/CSWinTransformer_tiny_224.yaml
ppcls/configs/ImageNet/ConvNeXt/ConvNeXt_base_224.yaml
ppcls/configs/ImageNet/ConvNeXt/ConvNeXt_base_384.yaml
ppcls/configs/ImageNet/ConvNeXt/ConvNeXt_large_224.yaml
ppcls/configs/ImageNet/ConvNeXt/ConvNeXt_large_384.yaml
ppcls/configs/ImageNet/ConvNeXt/ConvNeXt_small.yaml
ppcls/configs/ImageNet/ConvNeXt/ConvNeXt_tiny.yaml
ppcls/configs/ImageNet/CvT/CvT_13_224.yaml
ppcls/configs/ImageNet/CvT/CvT_13_384.yaml
ppcls/configs/ImageNet/CvT/CvT_21_224.yaml
ppcls/configs/ImageNet/CvT/CvT_21_384.yaml
ppcls/configs/ImageNet/CvT/CvT_W24_384.yaml
ppcls/configs/ImageNet/DLA/DLA102.yaml
ppcls/configs/ImageNet/DLA/DLA102x.yaml
ppcls/configs/ImageNet/DLA/DLA102x2.yaml
ppcls/configs/ImageNet/DLA/DLA169.yaml
ppcls/configs/ImageNet/DLA/DLA34.yaml
ppcls/configs/ImageNet/DLA/DLA46_c.yaml
ppcls/configs/ImageNet/DLA/DLA46x_c.yaml
ppcls/configs/ImageNet/DLA/DLA60.yaml
ppcls/configs/ImageNet/DLA/DLA60x.yaml
ppcls/configs/ImageNet/DLA/DLA60x_c.yaml
ppcls/configs/ImageNet/DPN/DPN107.yaml
ppcls/configs/ImageNet/DPN/DPN131.yaml
ppcls/configs/ImageNet/DPN/DPN68.yaml
ppcls/configs/ImageNet/DPN/DPN92.yaml
ppcls/configs/ImageNet/DPN/DPN98.yaml
ppcls/configs/ImageNet/DSNet/DSNet_base.yaml
ppcls/configs/ImageNet/DSNet/DSNet_small.yaml
ppcls/configs/ImageNet/DSNet/DSNet_tiny.yaml
ppcls/configs/ImageNet/DarkNet/DarkNet53.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_AutoAugment.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_Baseline.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_Cutmix.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_Cutout.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_GridMask.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_HideAndSeek.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_Mixup.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_RandAugment.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_RandomErasing.yaml
ppcls/configs/ImageNet/DeiT/DeiT_base_distilled_patch16_224.yaml
ppcls/configs/ImageNet/DeiT/DeiT_base_distilled_patch16_384.yaml
ppcls/configs/ImageNet/DeiT/DeiT_base_patch16_224.yaml
ppcls/configs/ImageNet/DeiT/DeiT_base_patch16_384.yaml
ppcls/configs/ImageNet/DeiT/DeiT_small_distilled_patch16_224.yaml
ppcls/configs/ImageNet/DeiT/DeiT_small_patch16_224.yaml
ppcls/configs/ImageNet/DeiT/DeiT_tiny_distilled_patch16_224.yaml
ppcls/configs/ImageNet/DeiT/DeiT_tiny_patch16_224.yaml
ppcls/configs/ImageNet/DenseNet/DenseNet121.yaml
ppcls/configs/ImageNet/DenseNet/DenseNet161.yaml
ppcls/configs/ImageNet/DenseNet/DenseNet169.yaml
ppcls/configs/ImageNet/DenseNet/DenseNet201.yaml
ppcls/configs/ImageNet/DenseNet/DenseNet264.yaml
ppcls/configs/ImageNet/Distillation/mv3_large_x1_0_distill_mv3_small_x1_0.yaml
ppcls/configs/ImageNet/Distillation/resnet34_distill_resnet18_afd.yaml
ppcls/configs/ImageNet/Distillation/resnet34_distill_resnet18_dist.yaml
ppcls/configs/ImageNet/Distillation/resnet34_distill_resnet18_dkd.yaml
ppcls/configs/ImageNet/Distillation/resnet34_distill_resnet18_mgd.yaml
ppcls/configs/ImageNet/Distillation/resnet34_distill_resnet18_pefd.yaml
ppcls/configs/ImageNet/Distillation/resnet34_distill_resnet18_skd.yaml
ppcls/configs/ImageNet/Distillation/resnet34_distill_resnet18_wsl.yaml
ppcls/configs/ImageNet/ESNet/ESNet_x0_25.yaml
ppcls/configs/ImageNet/ESNet/ESNet_x0_5.yaml
ppcls/configs/ImageNet/ESNet/ESNet_x0_75.yaml
ppcls/configs/ImageNet/ESNet/ESNet_x1_0.yaml
ppcls/configs/ImageNet/EfficientNet/EfficientNetB0.yaml
ppcls/configs/ImageNet/EfficientNet/EfficientNetB1.yaml
ppcls/configs/ImageNet/EfficientNet/EfficientNetB2.yaml
ppcls/configs/ImageNet/EfficientNet/EfficientNetB3.yaml
ppcls/configs/ImageNet/EfficientNet/EfficientNetB4.yaml
ppcls/configs/ImageNet/EfficientNet/EfficientNetB5.yaml
ppcls/configs/ImageNet/EfficientNet/EfficientNetB6.yaml
ppcls/configs/ImageNet/EfficientNet/EfficientNetB7.yaml
ppcls/configs/ImageNet/GhostNet/GhostNet_x0_5.yaml
ppcls/configs/ImageNet/GhostNet/GhostNet_x1_0.yaml
ppcls/configs/ImageNet/GhostNet/GhostNet_x1_3.yaml
ppcls/configs/ImageNet/HRNet/HRNet_W18_C.yaml
ppcls/configs/ImageNet/HRNet/HRNet_W30_C.yaml
ppcls/configs/ImageNet/HRNet/HRNet_W32_C.yaml
ppcls/configs/ImageNet/HRNet/HRNet_W40_C.yaml
ppcls/configs/ImageNet/HRNet/HRNet_W44_C.yaml
ppcls/configs/ImageNet/HRNet/HRNet_W48_C.yaml
ppcls/configs/ImageNet/HRNet/HRNet_W64_C.yaml
ppcls/configs/ImageNet/HarDNet/HarDNet39_ds.yaml
ppcls/configs/ImageNet/HarDNet/HarDNet68.yaml
ppcls/configs/ImageNet/HarDNet/HarDNet68_ds.yaml
ppcls/configs/ImageNet/HarDNet/HarDNet85.yaml
ppcls/configs/ImageNet/Inception/GoogLeNet.yaml
ppcls/configs/ImageNet/Inception/InceptionV3.yaml
ppcls/configs/ImageNet/Inception/InceptionV4.yaml
ppcls/configs/ImageNet/LeViT/LeViT_128.yaml
ppcls/configs/ImageNet/LeViT/LeViT_128S.yaml
ppcls/configs/ImageNet/LeViT/LeViT_192.yaml
ppcls/configs/ImageNet/LeViT/LeViT_256.yaml
ppcls/configs/ImageNet/LeViT/LeViT_384.yaml
ppcls/configs/ImageNet/MicroNet/MicroNet_M0.yaml
ppcls/configs/ImageNet/MicroNet/MicroNet_M1.yaml
ppcls/configs/ImageNet/MicroNet/MicroNet_M2.yaml
ppcls/configs/ImageNet/MicroNet/MicroNet_M3.yaml
ppcls/configs/ImageNet/MixNet/MixNet_L.yaml
ppcls/configs/ImageNet/MixNet/MixNet_M.yaml
ppcls/configs/ImageNet/MixNet/MixNet_S.yaml
ppcls/configs/ImageNet/MobileNeXt/MobileNeXt_x1_0.yaml
ppcls/configs/ImageNet/MobileNetV1/MobileNetV1.yaml
ppcls/configs/ImageNet/MobileNetV1/MobileNetV1_x0_25.yaml
ppcls/configs/ImageNet/MobileNetV1/MobileNetV1_x0_5.yaml
ppcls/configs/ImageNet/MobileNetV1/MobileNetV1_x0_75.yaml
ppcls/configs/ImageNet/MobileNetV2/MobileNetV2.yaml
ppcls/configs/ImageNet/MobileNetV2/MobileNetV2_x0_25.yaml
ppcls/configs/ImageNet/MobileNetV2/MobileNetV2_x0_5.yaml
ppcls/configs/ImageNet/MobileNetV2/MobileNetV2_x0_75.yaml
ppcls/configs/ImageNet/MobileNetV2/MobileNetV2_x1_5.yaml
ppcls/configs/ImageNet/MobileNetV2/MobileNetV2_x2_0.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_large_x0_35.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_large_x0_5.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_large_x0_75.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_large_x1_0.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_large_x1_25.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_small_x0_35.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_small_x0_5.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_small_x0_75.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_small_x1_0.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_small_x1_0_fp32_ultra.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_small_x1_25.yaml
ppcls/configs/ImageNet/MobileViT/MobileViT_S.yaml
ppcls/configs/ImageNet/MobileViT/MobileViT_XS.yaml
ppcls/configs/ImageNet/MobileViT/MobileViT_XXS.yaml
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x0_25.yaml
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x0_35.yaml
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x0_5.yaml
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x0_75.yaml
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x1_0.yaml
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x1_0_fp32_ultra.yaml
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x1_5.yaml
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x2_0.yaml
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x2_5.yaml
ppcls/configs/ImageNet/PPLCNetV2/PPLCNetV2_base.yaml
ppcls/configs/ImageNet/PPLCNetV2/PPLCNetV2_large.yaml
ppcls/configs/ImageNet/PPLCNetV2/PPLCNetV2_small.yaml
ppcls/configs/ImageNet/PVTV2/PVT_V2_B0.yaml
ppcls/configs/ImageNet/PVTV2/PVT_V2_B1.yaml
ppcls/configs/ImageNet/PVTV2/PVT_V2_B2.yaml
ppcls/configs/ImageNet/PVTV2/PVT_V2_B2_Linear.yaml
ppcls/configs/ImageNet/PVTV2/PVT_V2_B3.yaml
ppcls/configs/ImageNet/PVTV2/PVT_V2_B4.yaml
ppcls/configs/ImageNet/PVTV2/PVT_V2_B5.yaml
ppcls/configs/ImageNet/PeleeNet/PeleeNet.yaml
ppcls/configs/ImageNet/ReXNet/ReXNet_1_0.yaml
ppcls/configs/ImageNet/ReXNet/ReXNet_1_3.yaml
ppcls/configs/ImageNet/ReXNet/ReXNet_1_5.yaml
ppcls/configs/ImageNet/ReXNet/ReXNet_2_0.yaml
ppcls/configs/ImageNet/ReXNet/ReXNet_3_0.yaml
ppcls/configs/ImageNet/RedNet/RedNet101.yaml
ppcls/configs/ImageNet/RedNet/RedNet152.yaml
ppcls/configs/ImageNet/RedNet/RedNet26.yaml
ppcls/configs/ImageNet/RedNet/RedNet38.yaml
ppcls/configs/ImageNet/RedNet/RedNet50.yaml
ppcls/configs/ImageNet/RegNet/RegNetX_12GF.yaml
ppcls/configs/ImageNet/RegNet/RegNetX_1600MF.yaml
ppcls/configs/ImageNet/RegNet/RegNetX_16GF.yaml
ppcls/configs/ImageNet/RegNet/RegNetX_200MF.yaml
ppcls/configs/ImageNet/RegNet/RegNetX_3200MF.yaml
ppcls/configs/ImageNet/RegNet/RegNetX_32GF.yaml
ppcls/configs/ImageNet/RegNet/RegNetX_400MF.yaml
ppcls/configs/ImageNet/RegNet/RegNetX_600MF.yaml
ppcls/configs/ImageNet/RegNet/RegNetX_6400MF.yaml
ppcls/configs/ImageNet/RegNet/RegNetX_800MF.yaml
ppcls/configs/ImageNet/RegNet/RegNetX_8GF.yaml
ppcls/configs/ImageNet/RepVGG/RepVGG_A0.yaml
ppcls/configs/ImageNet/RepVGG/RepVGG_A1.yaml
ppcls/configs/ImageNet/RepVGG/RepVGG_A2.yaml
ppcls/configs/ImageNet/RepVGG/RepVGG_B0.yaml
ppcls/configs/ImageNet/RepVGG/RepVGG_B1.yaml
ppcls/configs/ImageNet/RepVGG/RepVGG_B1g2.yaml
ppcls/configs/ImageNet/RepVGG/RepVGG_B1g4.yaml
ppcls/configs/ImageNet/RepVGG/RepVGG_B2.yaml
ppcls/configs/ImageNet/RepVGG/RepVGG_B2g4.yaml
ppcls/configs/ImageNet/RepVGG/RepVGG_B3.yaml
ppcls/configs/ImageNet/RepVGG/RepVGG_B3g4.yaml
ppcls/configs/ImageNet/RepVGG/RepVGG_D2se.yaml
ppcls/configs/ImageNet/Res2Net/Res2Net101_vd_26w_4s.yaml
ppcls/configs/ImageNet/Res2Net/Res2Net200_vd_26w_4s.yaml
ppcls/configs/ImageNet/Res2Net/Res2Net50_14w_8s.yaml
ppcls/configs/ImageNet/Res2Net/Res2Net50_26w_4s.yaml
ppcls/configs/ImageNet/Res2Net/Res2Net50_vd_26w_4s.yaml
ppcls/configs/ImageNet/ResNeSt/ResNeSt101.yaml
ppcls/configs/ImageNet/ResNeSt/ResNeSt200.yaml
ppcls/configs/ImageNet/ResNeSt/ResNeSt269.yaml
ppcls/configs/ImageNet/ResNeSt/ResNeSt50.yaml
ppcls/configs/ImageNet/ResNeSt/ResNeSt50_fast_1s1x64d.yaml
ppcls/configs/ImageNet/ResNeXt/ResNeXt101_32x4d.yaml
ppcls/configs/ImageNet/ResNeXt/ResNeXt101_64x4d.yaml
ppcls/configs/ImageNet/ResNeXt/ResNeXt101_vd_32x4d.yaml
ppcls/configs/ImageNet/ResNeXt/ResNeXt101_vd_64x4d.yaml
ppcls/configs/ImageNet/ResNeXt/ResNeXt152_32x4d.yaml
ppcls/configs/ImageNet/ResNeXt/ResNeXt152_64x4d.yaml
ppcls/configs/ImageNet/ResNeXt/ResNeXt152_vd_32x4d.yaml
ppcls/configs/ImageNet/ResNeXt/ResNeXt152_vd_64x4d.yaml
ppcls/configs/ImageNet/ResNeXt/ResNeXt50_32x4d.yaml
ppcls/configs/ImageNet/ResNeXt/ResNeXt50_64x4d.yaml
ppcls/configs/ImageNet/ResNeXt/ResNeXt50_vd_32x4d.yaml
ppcls/configs/ImageNet/ResNeXt/ResNeXt50_vd_64x4d.yaml
ppcls/configs/ImageNet/ResNeXt101_wsl/ResNeXt101_32x16d_wsl.yaml
ppcls/configs/ImageNet/ResNeXt101_wsl/ResNeXt101_32x32d_wsl.yaml
ppcls/configs/ImageNet/ResNeXt101_wsl/ResNeXt101_32x48d_wsl.yaml
ppcls/configs/ImageNet/ResNeXt101_wsl/ResNeXt101_32x8d_wsl.yaml
ppcls/configs/ImageNet/ResNet/ResNet101.yaml
ppcls/configs/ImageNet/ResNet/ResNet101_vd.yaml
ppcls/configs/ImageNet/ResNet/ResNet152.yaml
ppcls/configs/ImageNet/ResNet/ResNet152_vd.yaml
ppcls/configs/ImageNet/ResNet/ResNet18.yaml
ppcls/configs/ImageNet/ResNet/ResNet18_dbb.yaml
ppcls/configs/ImageNet/ResNet/ResNet18_vd.yaml
ppcls/configs/ImageNet/ResNet/ResNet200_vd.yaml
ppcls/configs/ImageNet/ResNet/ResNet34.yaml
ppcls/configs/ImageNet/ResNet/ResNet34_vd.yaml
ppcls/configs/ImageNet/ResNet/ResNet50.yaml
ppcls/configs/ImageNet/ResNet/ResNet50_fp32_ultra.yaml
ppcls/configs/ImageNet/ResNet/ResNet50_vd.yaml
ppcls/configs/ImageNet/SENet/SENet154_vd.yaml
ppcls/configs/ImageNet/SENet/SE_ResNeXt101_32x4d.yaml
ppcls/configs/ImageNet/SENet/SE_ResNeXt50_32x4d.yaml
ppcls/configs/ImageNet/SENet/SE_ResNeXt50_vd_32x4d.yaml
ppcls/configs/ImageNet/SENet/SE_ResNet18_vd.yaml
ppcls/configs/ImageNet/SENet/SE_ResNet34_vd.yaml
ppcls/configs/ImageNet/SENet/SE_ResNet50_vd.yaml
ppcls/configs/ImageNet/ShuffleNet/ShuffleNetV2_swish.yaml
ppcls/configs/ImageNet/ShuffleNet/ShuffleNetV2_x0_25.yaml
ppcls/configs/ImageNet/ShuffleNet/ShuffleNetV2_x0_33.yaml
ppcls/configs/ImageNet/ShuffleNet/ShuffleNetV2_x0_5.yaml
ppcls/configs/ImageNet/ShuffleNet/ShuffleNetV2_x1_0.yaml
ppcls/configs/ImageNet/ShuffleNet/ShuffleNetV2_x1_5.yaml
ppcls/configs/ImageNet/ShuffleNet/ShuffleNetV2_x2_0.yaml
ppcls/configs/ImageNet/SqueezeNet/SqueezeNet1_0.yaml
ppcls/configs/ImageNet/SqueezeNet/SqueezeNet1_1.yaml
ppcls/configs/ImageNet/SwinTransformer/SwinTransformer_base_patch4_window12_384.yaml
ppcls/configs/ImageNet/SwinTransformer/SwinTransformer_base_patch4_window7_224.yaml
ppcls/configs/ImageNet/SwinTransformer/SwinTransformer_large_patch4_window12_384.yaml
ppcls/configs/ImageNet/SwinTransformer/SwinTransformer_large_patch4_window7_224.yaml
ppcls/configs/ImageNet/SwinTransformer/SwinTransformer_small_patch4_window7_224.yaml
ppcls/configs/ImageNet/SwinTransformer/SwinTransformer_tiny_patch4_window7_224.yaml
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_base_patch4_window16_256.yaml
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_base_patch4_window24_384.yaml
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_base_patch4_window8_256.yaml
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_large_patch4_window16_256.yaml
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_large_patch4_window24_384.yaml
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_small_patch4_window16_256.yaml
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_small_patch4_window8_256.yaml
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_tiny_patch4_window16_256.yaml
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_tiny_patch4_window8_256.yaml
ppcls/configs/ImageNet/TNT/TNT_base.yaml
ppcls/configs/ImageNet/TNT/TNT_small.yaml
ppcls/configs/ImageNet/TinyNet/TinyNet_A.yaml
ppcls/configs/ImageNet/TinyNet/TinyNet_B.yaml
ppcls/configs/ImageNet/TinyNet/TinyNet_C.yaml
ppcls/configs/ImageNet/TinyNet/TinyNet_D.yaml
ppcls/configs/ImageNet/TinyNet/TinyNet_E.yaml
ppcls/configs/ImageNet/Twins/alt_gvt_base.yaml
ppcls/configs/ImageNet/Twins/alt_gvt_large.yaml
ppcls/configs/ImageNet/Twins/alt_gvt_small.yaml
ppcls/configs/ImageNet/Twins/pcpvt_base.yaml
ppcls/configs/ImageNet/Twins/pcpvt_large.yaml
ppcls/configs/ImageNet/Twins/pcpvt_small.yaml
ppcls/configs/ImageNet/UniFormer/UniFormer_base.yaml
ppcls/configs/ImageNet/UniFormer/UniFormer_base_ls.yaml
ppcls/configs/ImageNet/UniFormer/UniFormer_small.yaml
ppcls/configs/ImageNet/UniFormer/UniFormer_small_plus.yaml
ppcls/configs/ImageNet/UniFormer/UniFormer_small_plus_dim64.yaml
ppcls/configs/ImageNet/VAN/VAN_B0.yaml
ppcls/configs/ImageNet/VAN/VAN_B1.yaml
ppcls/configs/ImageNet/VAN/VAN_B2.yaml
ppcls/configs/ImageNet/VAN/VAN_B3.yaml
ppcls/configs/ImageNet/VGG/VGG11.yaml
ppcls/configs/ImageNet/VGG/VGG13.yaml
ppcls/configs/ImageNet/VGG/VGG16.yaml
ppcls/configs/ImageNet/VGG/VGG19.yaml
ppcls/configs/ImageNet/VisionTransformer/ViT_base_patch16_224.yaml
ppcls/configs/ImageNet/VisionTransformer/ViT_base_patch16_384.yaml
ppcls/configs/ImageNet/VisionTransformer/ViT_base_patch32_384.yaml
ppcls/configs/ImageNet/VisionTransformer/ViT_large_patch16_224.yaml
ppcls/configs/ImageNet/VisionTransformer/ViT_large_patch16_384.yaml
ppcls/configs/ImageNet/VisionTransformer/ViT_large_patch32_384.yaml
ppcls/configs/ImageNet/VisionTransformer/ViT_small_patch16_224.yaml
ppcls/configs/ImageNet/Xception/Xception41.yaml
ppcls/configs/ImageNet/Xception/Xception41_deeplab.yaml
ppcls/configs/ImageNet/Xception/Xception65.yaml
ppcls/configs/ImageNet/Xception/Xception65_deeplab.yaml
ppcls/configs/ImageNet/Xception/Xception71.yaml
ppcls/configs/ImageNet/AlexNet/AlexNet.yaml
@@ -14,6 +14,18 @@ Global:
   image_shape: [3, 224, 224]
   save_inference_dir: ./inference
 
+# mixed precision
+AMP:
+  use_amp: False
+  use_fp16_test: False
+  scale_loss: 128.0
+  use_dynamic_loss_scaling: True
+  use_promote: False
+  # O1: mixed fp16, O2: pure fp16
+  level: O1
+
 # model architecture
 Arch:
   name: AlexNet
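The block above is plain configuration; it only takes effect once a trainer reads it and translates the flags into PaddlePaddle AMP calls. The following is a minimal, self-contained sketch of how these fields typically map onto the standard paddle.amp API (GradScaler, auto_cast, decorate). It is not PaddleClas's actual engine code: the model, optimizer, and batch are placeholders, use_amp is switched on here for illustration (the shipped configs default it to False), and use_fp16_test / use_promote are flags this sketch does not exercise.

import paddle
import paddle.nn.functional as F

# Illustrative values mirroring the AMP section added by this commit;
# in PaddleClas they would come from the parsed YAML.
amp_cfg = {
    "use_amp": True,
    "use_fp16_test": False,
    "scale_loss": 128.0,
    "use_dynamic_loss_scaling": True,
    "use_promote": False,
    "level": "O1",  # O1: mixed fp16, O2: pure fp16
}

# Placeholders standing in for whatever Arch/Optimizer the config builds.
model = paddle.vision.models.resnet50(num_classes=1000)
optimizer = paddle.optimizer.Momentum(learning_rate=0.1,
                                      parameters=model.parameters())

if amp_cfg["use_amp"] and amp_cfg["level"] == "O2":
    # Pure-fp16 mode: decorate model and optimizer up front.
    model, optimizer = paddle.amp.decorate(
        models=model, optimizers=optimizer, level="O2")

# init_loss_scaling corresponds to scale_loss in the YAML.
scaler = paddle.amp.GradScaler(
    enable=amp_cfg["use_amp"],
    init_loss_scaling=amp_cfg["scale_loss"],
    use_dynamic_loss_scaling=amp_cfg["use_dynamic_loss_scaling"])

images = paddle.randn([8, 3, 224, 224])   # dummy batch
labels = paddle.randint(0, 1000, [8])

# Forward in the requested precision; auto_cast is a no-op when use_amp is False.
with paddle.amp.auto_cast(enable=amp_cfg["use_amp"], level=amp_cfg["level"]):
    loss = F.cross_entropy(model(images), labels)

scaled = scaler.scale(loss)          # multiply the loss by the current loss scale
scaled.backward()
scaler.minimize(optimizer, scaled)   # unscale grads, skip the step on inf/nan, adjust the scale
optimizer.clear_grad()

Because the section defaults to use_amp: False, the commit changes no training behavior by itself; in PaddleClas these values can typically be flipped per run without editing the YAML, e.g. with the trainer's -o overrides such as -o AMP.use_amp=True -o AMP.level=O1, assuming the repository's usual config-override mechanism.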
ppcls/configs/ImageNet/CSPNet/CSPDarkNet53.yaml
@@ -16,6 +16,18 @@ Global:
   # training model under @to_static
   to_static: False
 
+# mixed precision
+AMP:
+  use_amp: False
+  use_fp16_test: False
+  scale_loss: 128.0
+  use_dynamic_loss_scaling: True
+  use_promote: False
+  # O1: mixed fp16, O2: pure fp16
+  level: O1
+
 # model architecture
 Arch:
   name: CSPDarkNet53
ppcls/configs/ImageNet/CSWinTransformer/CSWinTransformer_base_224.yaml
@@ -17,6 +17,18 @@ Global:
   # training model under @to_static
   to_static: False
 
+# mixed precision
+AMP:
+  use_amp: False
+  use_fp16_test: False
+  scale_loss: 128.0
+  use_dynamic_loss_scaling: True
+  use_promote: False
+  # O1: mixed fp16, O2: pure fp16
+  level: O1
+
 # model architecture
 Arch:
   name: CSWinTransformer_base_224

The same hunk, with Arch.name matching the model, is added to:
ppcls/configs/ImageNet/CSWinTransformer/CSWinTransformer_base_384.yaml
ppcls/configs/ImageNet/CSWinTransformer/CSWinTransformer_large_224.yaml
ppcls/configs/ImageNet/CSWinTransformer/CSWinTransformer_large_384.yaml
ppcls/configs/ImageNet/CSWinTransformer/CSWinTransformer_small_224.yaml
ppcls/configs/ImageNet/CSWinTransformer/CSWinTransformer_tiny_224.yaml
ppcls/configs/ImageNet/ConvNeXt/ConvNeXt_base_224.yaml
@@ -22,6 +22,18 @@ EMA:
   decay: 0.9999
 
+# mixed precision
+AMP:
+  use_amp: False
+  use_fp16_test: False
+  scale_loss: 128.0
+  use_dynamic_loss_scaling: True
+  use_promote: False
+  # O1: mixed fp16, O2: pure fp16
+  level: O1
+
 # model architecture
 Arch:
   name: ConvNeXt_base_224

The same hunk, with Arch.name matching the model, is added to:
ppcls/configs/ImageNet/ConvNeXt/ConvNeXt_base_384.yaml
ppcls/configs/ImageNet/ConvNeXt/ConvNeXt_large_224.yaml
ppcls/configs/ImageNet/ConvNeXt/ConvNeXt_large_384.yaml
ppcls/configs/ImageNet/ConvNeXt/ConvNeXt_small.yaml
ppcls/configs/ImageNet/ConvNeXt/ConvNeXt_tiny.yaml
ppcls/configs/ImageNet/CvT/CvT_13_224.yaml
@@ -17,6 +17,18 @@ Global:
   to_static: False
   update_freq: 2  # for 8 cards
 
+# mixed precision
+AMP:
+  use_amp: False
+  use_fp16_test: False
+  scale_loss: 128.0
+  use_dynamic_loss_scaling: True
+  use_promote: False
+  # O1: mixed fp16, O2: pure fp16
+  level: O1
+
 # model architecture
 Arch:
   name: CvT_13_224

The same hunk, with Arch.name matching the model, is added to:
ppcls/configs/ImageNet/CvT/CvT_13_384.yaml
ppcls/configs/ImageNet/CvT/CvT_21_224.yaml
ppcls/configs/ImageNet/CvT/CvT_21_384.yaml
ppcls/configs/ImageNet/CvT/CvT_W24_384.yaml
ppcls/configs/ImageNet/DLA/DLA102.yaml
@@ -14,6 +14,18 @@ Global:
   image_shape: [3, 224, 224]
   save_inference_dir: ./inference
 
+# mixed precision
+AMP:
+  use_amp: False
+  use_fp16_test: False
+  scale_loss: 128.0
+  use_dynamic_loss_scaling: True
+  use_promote: False
+  # O1: mixed fp16, O2: pure fp16
+  level: O1
+
 # model architecture
 Arch:
   name: DLA102

The same hunk, with Arch.name matching the model, is added to:
ppcls/configs/ImageNet/DLA/DLA102x.yaml
ppcls/configs/ImageNet/DLA/DLA102x2.yaml
ppcls/configs/ImageNet/DLA/DLA169.yaml
ppcls/configs/ImageNet/DLA/DLA34.yaml
ppcls/configs/ImageNet/DLA/DLA46_c.yaml
ppcls/configs/ImageNet/DLA/DLA46x_c.yaml
ppcls/configs/ImageNet/DLA/DLA60.yaml
ppcls/configs/ImageNet/DLA/DLA60x.yaml
ppcls/configs/ImageNet/DLA/DLA60x_c.yaml
ppcls/configs/ImageNet/DPN/DPN107.yaml
@@ -14,6 +14,18 @@ Global:
   image_shape: [3, 224, 224]
   save_inference_dir: ./inference
 
+# mixed precision
+AMP:
+  use_amp: False
+  use_fp16_test: False
+  scale_loss: 128.0
+  use_dynamic_loss_scaling: True
+  use_promote: False
+  # O1: mixed fp16, O2: pure fp16
+  level: O1
+
 # model architecture
 Arch:
   name: DPN107

The same hunk, with Arch.name matching the model, is added to:
ppcls/configs/ImageNet/DPN/DPN131.yaml
ppcls/configs/ImageNet/DPN/DPN68.yaml
ppcls/configs/ImageNet/DPN/DPN92.yaml
ppcls/configs/ImageNet/DPN/DPN98.yaml
ppcls/configs/ImageNet/DSNet/DSNet_base.yaml
@@ -16,6 +16,18 @@ Global:
   # training model under @to_static
   to_static: False
 
+# mixed precision
+AMP:
+  use_amp: False
+  use_fp16_test: False
+  scale_loss: 128.0
+  use_dynamic_loss_scaling: True
+  use_promote: False
+  # O1: mixed fp16, O2: pure fp16
+  level: O1
+
 # model architecture
 Arch:
   name: DSNet_base

The same hunk, with Arch.name matching the model, is added to:
ppcls/configs/ImageNet/DSNet/DSNet_small.yaml
ppcls/configs/ImageNet/DSNet/DSNet_tiny.yaml
ppcls/configs/ImageNet/DarkNet/DarkNet53.yaml
@@ -14,6 +14,18 @@ Global:
   image_shape: [3, 256, 256]
   save_inference_dir: ./inference
 
+# mixed precision
+AMP:
+  use_amp: False
+  use_fp16_test: False
+  scale_loss: 128.0
+  use_dynamic_loss_scaling: True
+  use_promote: False
+  # O1: mixed fp16, O2: pure fp16
+  level: O1
+
 # model architecture
 Arch:
   name: DarkNet53
ppcls/configs/ImageNet/DataAugment/ResNet50_AutoAugment.yaml
@@ -14,6 +14,18 @@ Global:
   image_shape: [3, 224, 224]
   save_inference_dir: ./inference
 
+# mixed precision
+AMP:
+  use_amp: False
+  use_fp16_test: False
+  scale_loss: 128.0
+  use_dynamic_loss_scaling: True
+  use_promote: False
+  # O1: mixed fp16, O2: pure fp16
+  level: O1
+
 # model architecture
 Arch:
   name: ResNet50

The same hunk (Arch.name is ResNet50 in all of these) is added to:
ppcls/configs/ImageNet/DataAugment/ResNet50_Baseline.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_Cutmix.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_Cutout.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_GridMask.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_HideAndSeek.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_Mixup.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_RandAugment.yaml
ppcls/configs/ImageNet/DataAugment/ResNet50_RandomErasing.yaml
ppcls/configs/ImageNet/DeiT/DeiT_base_distilled_patch16_224.yaml   (image_shape: [3, 224, 224])
ppcls/configs/ImageNet/DeiT/DeiT_base_distilled_patch16_384.yaml   (image_shape: [3, 384, 384])
ppcls/configs/ImageNet/DeiT/DeiT_base_patch16_224.yaml             (image_shape: [3, 224, 224])
ppcls/configs/ImageNet/DeiT/DeiT_base_patch16_384.yaml             (image_shape: [3, 384, 384])
ppcls/configs/ImageNet/DeiT/DeiT_small_distilled_patch16_224.yaml  (image_shape: [3, 224, 224])
ppcls/configs/ImageNet/DeiT/DeiT_small_patch16_224.yaml            (image_shape: [3, 224, 224])
ppcls/configs/ImageNet/DeiT/DeiT_tiny_distilled_patch16_224.yaml   (image_shape: [3, 224, 224])
ppcls/configs/ImageNet/DeiT/DeiT_tiny_patch16_224.yaml             (image_shape: [3, 224, 224])
  All eight: @@ -14,6 +14,18 @@ Global: ; save_inference_dir: ./inference ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/DenseNet/DenseNet121.yaml
ppcls/configs/ImageNet/DenseNet/DenseNet161.yaml
ppcls/configs/ImageNet/DenseNet/DenseNet169.yaml
ppcls/configs/ImageNet/DenseNet/DenseNet201.yaml
ppcls/configs/ImageNet/DenseNet/DenseNet264.yaml
  All five: @@ -14,6 +14,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: ./inference ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/Distillation/mv3_large_x1_0_distill_mv3_small_x1_0.yaml
  @@ -15,6 +15,18 @@ Global: ; save_inference_dir: "./inference" ; use_dali: false ; + AMP block ; Arch name: "DistillationModel"
ppcls/configs/ImageNet/Distillation/resnet34_distill_resnet18_afd.yaml
  @@ -14,6 +14,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: "./inference" ; + AMP block ; Arch name: "DistillationModel"
ppcls/configs/ImageNet/Distillation/resnet34_distill_resnet18_dist.yaml
  @@ -15,6 +15,18 @@ Global: ; save_inference_dir: ./inference ; to_static: False ; + AMP block ; Arch name: "DistillationModel"
ppcls/configs/ImageNet/Distillation/resnet34_distill_resnet18_dkd.yaml
  @@ -14,6 +14,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: "./inference" ; + AMP block ; Arch name: "DistillationModel"
ppcls/configs/ImageNet/Distillation/resnet34_distill_resnet18_mgd.yaml
  @@ -15,6 +15,18 @@ Global: ; save_inference_dir: ./inference ; to_static: False ; + AMP block ; Arch name: "DistillationModel"
ppcls/configs/ImageNet/Distillation/resnet34_distill_resnet18_pefd.yaml
  @@ -15,6 +15,18 @@ Global: ; save_inference_dir: ./inference ; to_static: False ; + AMP block ; Arch name: "DistillationModel"
ppcls/configs/ImageNet/Distillation/resnet34_distill_resnet18_skd.yaml
  @@ -14,6 +14,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: "./inference" ; + AMP block ; Arch name: "DistillationModel"
ppcls/configs/ImageNet/Distillation/resnet34_distill_resnet18_wsl.yaml
  @@ -14,6 +14,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: "./inference" ; + AMP block ; Arch name: "DistillationModel"
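Every block added in this commit ends with the same hint, "# O1: mixed fp16, O2: pure fp16". Under O1 only the forward pass inside auto_cast runs selected ops in fp16; O2 additionally converts the model parameters themselves to fp16. A hedged sketch of that difference, again written against the generic paddle.amp API rather than PaddleClas's own trainer (model and optimizer are placeholders):

    import paddle

    def apply_amp_level(model, optimizer, level="O1"):
        # O1: weights stay fp32; precision is handled per-op inside auto_cast.
        # O2: decorate() casts the network to fp16, while fp32 master weights
        #     are typically kept for the parameter update.
        if level == "O2":
            model, optimizer = paddle.amp.decorate(
                models=model, optimizers=optimizer, level="O2")
        return model, optimizer

With level: O2 in the YAML, the auto_cast context from the earlier sketch would likewise be entered with level="O2".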
ppcls/configs/ImageNet/ESNet/ESNet_x0_25.yaml
ppcls/configs/ImageNet/ESNet/ESNet_x0_5.yaml
ppcls/configs/ImageNet/ESNet/ESNet_x0_75.yaml
ppcls/configs/ImageNet/ESNet/ESNet_x1_0.yaml
  All four: @@ -13,6 +13,18 @@ Global: ; # used for static mode and model export ; image_shape: [3, 224, 224] ; save_inference_dir: ./inference ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/EfficientNet/EfficientNetB0.yaml  (image_shape: [3, 224, 224])
ppcls/configs/ImageNet/EfficientNet/EfficientNetB1.yaml  (image_shape: [3, 240, 240])
ppcls/configs/ImageNet/EfficientNet/EfficientNetB2.yaml  (image_shape: [3, 260, 260])
ppcls/configs/ImageNet/EfficientNet/EfficientNetB3.yaml  (image_shape: [3, 300, 300])
ppcls/configs/ImageNet/EfficientNet/EfficientNetB4.yaml  (image_shape: [3, 380, 380])
ppcls/configs/ImageNet/EfficientNet/EfficientNetB5.yaml  (image_shape: [3, 456, 456])
ppcls/configs/ImageNet/EfficientNet/EfficientNetB6.yaml  (image_shape: [3, 528, 528])
ppcls/configs/ImageNet/EfficientNet/EfficientNetB7.yaml  (image_shape: [3, 600, 600])
  All eight: @@ -14,6 +14,18 @@ Global: ; save_inference_dir: ./inference ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/GhostNet/GhostNet_x0_5.yaml
ppcls/configs/ImageNet/GhostNet/GhostNet_x1_0.yaml
ppcls/configs/ImageNet/GhostNet/GhostNet_x1_3.yaml
  All three: @@ -14,6 +14,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: ./inference ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/HRNet/HRNet_W18_C.yaml
ppcls/configs/ImageNet/HRNet/HRNet_W30_C.yaml
ppcls/configs/ImageNet/HRNet/HRNet_W32_C.yaml
ppcls/configs/ImageNet/HRNet/HRNet_W40_C.yaml
ppcls/configs/ImageNet/HRNet/HRNet_W44_C.yaml
ppcls/configs/ImageNet/HRNet/HRNet_W48_C.yaml
ppcls/configs/ImageNet/HRNet/HRNet_W64_C.yaml
  All seven: @@ -14,6 +14,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: ./inference ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/HarDNet/HarDNet39_ds.yaml
ppcls/configs/ImageNet/HarDNet/HarDNet68.yaml
ppcls/configs/ImageNet/HarDNet/HarDNet68_ds.yaml
ppcls/configs/ImageNet/HarDNet/HarDNet85.yaml
  All four: @@ -14,6 +14,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: ./inference ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/Inception/GoogLeNet.yaml    (image_shape: [3, 224, 224])
ppcls/configs/ImageNet/Inception/InceptionV3.yaml  (image_shape: [3, 299, 299])
ppcls/configs/ImageNet/Inception/InceptionV4.yaml  (image_shape: [3, 299, 299])
  All three: @@ -14,6 +14,18 @@ Global: ; save_inference_dir: ./inference ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/LeViT/LeViT_128.yaml
ppcls/configs/ImageNet/LeViT/LeViT_128S.yaml
ppcls/configs/ImageNet/LeViT/LeViT_192.yaml
ppcls/configs/ImageNet/LeViT/LeViT_256.yaml
ppcls/configs/ImageNet/LeViT/LeViT_384.yaml
  All five: @@ -14,6 +14,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: ./inference ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/MicroNet/MicroNet_M0.yaml
ppcls/configs/ImageNet/MicroNet/MicroNet_M1.yaml
ppcls/configs/ImageNet/MicroNet/MicroNet_M2.yaml
ppcls/configs/ImageNet/MicroNet/MicroNet_M3.yaml
  All four: @@ -16,6 +16,18 @@ Global: ; # training model under @to_static ; to_static: False ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/MixNet/MixNet_L.yaml
ppcls/configs/ImageNet/MixNet/MixNet_M.yaml
ppcls/configs/ImageNet/MixNet/MixNet_S.yaml
  All three: @@ -16,6 +16,18 @@ Global: ; # training model under @to_static ; to_static: False ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/MobileNeXt/MobileNeXt_x1_0.yaml
  @@ -16,6 +16,18 @@ Global: ; # training model under @to_static ; to_static: False ; + AMP block ; Arch name: MobileNeXt_x1_0
ppcls/configs/ImageNet/MobileNetV1/MobileNetV1.yaml
  @@ -16,6 +16,18 @@ Global: ; # training model under @to_static ; to_static: False ; + AMP block ; Arch name: MobileNetV1
ppcls/configs/ImageNet/MobileNetV1/MobileNetV1_x0_25.yaml
ppcls/configs/ImageNet/MobileNetV1/MobileNetV1_x0_5.yaml
ppcls/configs/ImageNet/MobileNetV1/MobileNetV1_x0_75.yaml
  These three: @@ -14,6 +14,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: ./inference ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/MobileNetV2/MobileNetV2.yaml
  @@ -16,6 +16,18 @@ Global: ; # training model under @to_static ; to_static: False ; + AMP block ; Arch name: MobileNetV2
ppcls/configs/ImageNet/MobileNetV2/MobileNetV2_x0_25.yaml
ppcls/configs/ImageNet/MobileNetV2/MobileNetV2_x0_5.yaml
ppcls/configs/ImageNet/MobileNetV2/MobileNetV2_x0_75.yaml
ppcls/configs/ImageNet/MobileNetV2/MobileNetV2_x1_5.yaml
ppcls/configs/ImageNet/MobileNetV2/MobileNetV2_x2_0.yaml
  These five: @@ -14,6 +14,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: ./inference ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_large_x0_35.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_large_x0_5.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_large_x0_75.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_large_x1_0.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_large_x1_25.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_small_x0_35.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_small_x0_5.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_small_x0_75.yaml
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_small_x1_0.yaml
  These nine: @@ -14,6 +14,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: ./inference ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_small_x1_0_fp32_ultra.yaml
  @@ -15,6 +15,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: ./inference ; + AMP block ; Arch name: MobileNetV3_small_x1_0
ppcls/configs/ImageNet/MobileNetV3/MobileNetV3_small_x1_25.yaml
  @@ -14,6 +14,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: ./inference ; + AMP block ; Arch name: MobileNetV3_small_x1_25
ppcls/configs/ImageNet/MobileViT/MobileViT_S.yaml
ppcls/configs/ImageNet/MobileViT/MobileViT_XS.yaml
ppcls/configs/ImageNet/MobileViT/MobileViT_XXS.yaml
  All three: @@ -14,6 +14,18 @@ Global: ; image_shape: [3, 256, 256] ; save_inference_dir: ./inference ; use_dali: False ; + AMP block ; Arch name matches the file name
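Because use_amp defaults to False in every file touched here, mixed precision stays opt-in: a run can enable it without editing the YAML by overriding the dotted keys on the command line (PaddleClas's training entry point accepts -o KEY=VALUE overrides for this). The helper below shows how such dotted overrides can be applied to a loaded config dict; it is a simplified, hypothetical stand-in rather than the project's own config code, and the file it opens is just one of the configs from this commit:

    import yaml

    def apply_override(cfg: dict, dotted_key: str, value):
        """Set e.g. 'AMP.use_amp' = True inside a nested config dict."""
        keys = dotted_key.split(".")
        node = cfg
        for k in keys[:-1]:
            node = node.setdefault(k, {})
        node[keys[-1]] = value
        return cfg

    with open("ppcls/configs/ImageNet/PPLCNet/PPLCNet_x1_0.yaml") as f:
        cfg = yaml.safe_load(f)
    apply_override(cfg, "AMP.use_amp", True)   # turn mixed precision on for this run
    apply_override(cfg, "AMP.level", "O2")     # optionally switch to pure fp16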
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x0_25.yaml
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x0_35.yaml
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x0_5.yaml
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x0_75.yaml
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x1_0.yaml
  These five: @@ -13,6 +13,18 @@ Global: ; # used for static mode and model export ; image_shape: [3, 224, 224] ; save_inference_dir: ./inference ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x1_0_fp32_ultra.yaml
  @@ -15,6 +15,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: ./inference ; + AMP block ; Arch name: PPLCNet_x1_0
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x1_5.yaml
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x2_0.yaml
ppcls/configs/ImageNet/PPLCNet/PPLCNet_x2_5.yaml
  These three: @@ -13,6 +13,18 @@ Global: ; # used for static mode and model export ; image_shape: [3, 224, 224] ; save_inference_dir: ./inference ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/PPLCNetV2/PPLCNetV2_base.yaml
ppcls/configs/ImageNet/PPLCNetV2/PPLCNetV2_large.yaml
ppcls/configs/ImageNet/PPLCNetV2/PPLCNetV2_small.yaml
  All three: @@ -14,6 +14,18 @@ Global: ; image_shape: [3, 224, 224] ; save_inference_dir: ./inference ; + AMP block ; Arch name matches the file name
ppcls/configs/ImageNet/PVTV2/PVT_V2_B0.yaml
ppcls/configs/ImageNet/PVTV2/PVT_V2_B1.yaml
ppcls/configs/ImageNet/PVTV2/PVT_V2_B2.yaml
ppcls/configs/ImageNet/PVTV2/PVT_V2_B2_Linear.yaml
ppcls/configs/ImageNet/PVTV2/PVT_V2_B3.yaml
ppcls/configs/ImageNet/PVTV2/PVT_V2_B4.yaml
ppcls/configs/ImageNet/PVTV2/PVT_V2_B5.yaml
ppcls/configs/ImageNet/PeleeNet/PeleeNet.yaml
  All eight: @@ -16,6 +16,18 @@ Global: ; # training model under @to_static ; to_static: False ; + AMP block ; Arch name matches the file name
The five ReXNet configs change at @@ -16,6 +16,18 @@ Global:, after # training model under @to_static and to_static: False (Arch name matches the file):
  ppcls/configs/ImageNet/ReXNet/ReXNet_1_0.yaml
  ppcls/configs/ImageNet/ReXNet/ReXNet_1_3.yaml
  ppcls/configs/ImageNet/ReXNet/ReXNet_1_5.yaml
  ppcls/configs/ImageNet/ReXNet/ReXNet_2_0.yaml
  ppcls/configs/ImageNet/ReXNet/ReXNet_3_0.yaml
The five RedNet configs change at @@ -14,6 +14,18 @@ Global:, after image_shape: [3, 224, 224] and save_inference_dir: ./inference (Arch name matches the file):
  ppcls/configs/ImageNet/RedNet/RedNet101.yaml
  ppcls/configs/ImageNet/RedNet/RedNet152.yaml
  ppcls/configs/ImageNet/RedNet/RedNet26.yaml
  ppcls/configs/ImageNet/RedNet/RedNet38.yaml
  ppcls/configs/ImageNet/RedNet/RedNet50.yaml
The eleven RegNetX configs change at @@ -14,6 +14,18 @@ Global:, after image_shape: [3, 256, 256] and save_inference_dir: ./inference (Arch name matches the file):
  ppcls/configs/ImageNet/RegNet/RegNetX_12GF.yaml
  ppcls/configs/ImageNet/RegNet/RegNetX_1600MF.yaml
  ppcls/configs/ImageNet/RegNet/RegNetX_16GF.yaml
  ppcls/configs/ImageNet/RegNet/RegNetX_200MF.yaml
  ppcls/configs/ImageNet/RegNet/RegNetX_3200MF.yaml
  ppcls/configs/ImageNet/RegNet/RegNetX_32GF.yaml
  ppcls/configs/ImageNet/RegNet/RegNetX_400MF.yaml
  ppcls/configs/ImageNet/RegNet/RegNetX_600MF.yaml
  ppcls/configs/ImageNet/RegNet/RegNetX_6400MF.yaml
  ppcls/configs/ImageNet/RegNet/RegNetX_800MF.yaml
  ppcls/configs/ImageNet/RegNet/RegNetX_8GF.yaml
The twelve RepVGG configs change at @@ -14,6 +14,18 @@ Global:, after image_shape and save_inference_dir: ./inference (Arch name matches the file); image_shape is [3, 224, 224] everywhere except RepVGG_D2se, which uses [3, 320, 320]:
  ppcls/configs/ImageNet/RepVGG/RepVGG_A0.yaml
  ppcls/configs/ImageNet/RepVGG/RepVGG_A1.yaml
  ppcls/configs/ImageNet/RepVGG/RepVGG_A2.yaml
  ppcls/configs/ImageNet/RepVGG/RepVGG_B0.yaml
  ppcls/configs/ImageNet/RepVGG/RepVGG_B1.yaml
  ppcls/configs/ImageNet/RepVGG/RepVGG_B1g2.yaml
  ppcls/configs/ImageNet/RepVGG/RepVGG_B1g4.yaml
  ppcls/configs/ImageNet/RepVGG/RepVGG_B2.yaml
  ppcls/configs/ImageNet/RepVGG/RepVGG_B2g4.yaml
  ppcls/configs/ImageNet/RepVGG/RepVGG_B3.yaml
  ppcls/configs/ImageNet/RepVGG/RepVGG_B3g4.yaml
  ppcls/configs/ImageNet/RepVGG/RepVGG_D2se.yaml
The five Res2Net configs change at @@ -14,6 +14,18 @@ Global:, after image_shape: [3, 224, 224] and save_inference_dir: ./inference (Arch name matches the file):
  ppcls/configs/ImageNet/Res2Net/Res2Net101_vd_26w_4s.yaml
  ppcls/configs/ImageNet/Res2Net/Res2Net200_vd_26w_4s.yaml
  ppcls/configs/ImageNet/Res2Net/Res2Net50_14w_8s.yaml
  ppcls/configs/ImageNet/Res2Net/Res2Net50_26w_4s.yaml
  ppcls/configs/ImageNet/Res2Net/Res2Net50_vd_26w_4s.yaml
The five ResNeSt configs change at @@ -14,6 +14,18 @@ Global:, after image_shape and save_inference_dir: ./inference (Arch name matches the file); image_shape differs per model:
  ppcls/configs/ImageNet/ResNeSt/ResNeSt101.yaml (image_shape: [3, 256, 256])
  ppcls/configs/ImageNet/ResNeSt/ResNeSt200.yaml (image_shape: [3, 320, 320])
  ppcls/configs/ImageNet/ResNeSt/ResNeSt269.yaml (image_shape: [3, 416, 416])
  ppcls/configs/ImageNet/ResNeSt/ResNeSt50.yaml (image_shape: [3, 224, 224])
  ppcls/configs/ImageNet/ResNeSt/ResNeSt50_fast_1s1x64d.yaml (image_shape: [3, 224, 224])
The twelve ResNeXt configs change at @@ -14,6 +14,18 @@ Global:, after image_shape: [3, 224, 224] and save_inference_dir: ./inference (Arch name matches the file):
  ppcls/configs/ImageNet/ResNeXt/ResNeXt101_32x4d.yaml
  ppcls/configs/ImageNet/ResNeXt/ResNeXt101_64x4d.yaml
  ppcls/configs/ImageNet/ResNeXt/ResNeXt101_vd_32x4d.yaml
  ppcls/configs/ImageNet/ResNeXt/ResNeXt101_vd_64x4d.yaml
  ppcls/configs/ImageNet/ResNeXt/ResNeXt152_32x4d.yaml
  ppcls/configs/ImageNet/ResNeXt/ResNeXt152_64x4d.yaml
  ppcls/configs/ImageNet/ResNeXt/ResNeXt152_vd_32x4d.yaml
  ppcls/configs/ImageNet/ResNeXt/ResNeXt152_vd_64x4d.yaml
  ppcls/configs/ImageNet/ResNeXt/ResNeXt50_32x4d.yaml
  ppcls/configs/ImageNet/ResNeXt/ResNeXt50_64x4d.yaml
  ppcls/configs/ImageNet/ResNeXt/ResNeXt50_vd_32x4d.yaml
  ppcls/configs/ImageNet/ResNeXt/ResNeXt50_vd_64x4d.yaml
The four ResNeXt101_wsl configs change at @@ -14,6 +14,18 @@ Global:, after image_shape: [3, 224, 224] and save_inference_dir: ./inference (Arch name matches the file):
  ppcls/configs/ImageNet/ResNeXt101_wsl/ResNeXt101_32x16d_wsl.yaml
  ppcls/configs/ImageNet/ResNeXt101_wsl/ResNeXt101_32x32d_wsl.yaml
  ppcls/configs/ImageNet/ResNeXt101_wsl/ResNeXt101_32x48d_wsl.yaml
  ppcls/configs/ImageNet/ResNeXt101_wsl/ResNeXt101_32x8d_wsl.yaml
The ResNet configs below receive the same block; the hunk position and context vary with the file (Arch name matches the file unless noted):
  ppcls/configs/ImageNet/ResNet/ResNet101.yaml (@@ -16,6 +16,18 @@ Global:, after to_static: False)
  ppcls/configs/ImageNet/ResNet/ResNet101_vd.yaml (@@ -14,6 +14,18 @@ Global:, after image_shape: [3, 224, 224] and save_inference_dir: ./inference)
  ppcls/configs/ImageNet/ResNet/ResNet152.yaml (@@ -16,6 +16,18 @@ Global:, after to_static: False)
  ppcls/configs/ImageNet/ResNet/ResNet152_vd.yaml (@@ -14,6 +14,18 @@ Global:, after image_shape: [3, 224, 224] and save_inference_dir: ./inference)
  ppcls/configs/ImageNet/ResNet/ResNet18.yaml (@@ -14,6 +14,18 @@ Global:, after image_shape: [3, 224, 224] and save_inference_dir: ./inference)
  ppcls/configs/ImageNet/ResNet/ResNet18_dbb.yaml (@@ -15,6 +15,18 @@ Global:, after save_inference_dir: ./inference; Arch name: ResNet18)
  ppcls/configs/ImageNet/ResNet/ResNet18_vd.yaml (@@ -14,6 +14,18 @@ Global:, after image_shape: [3, 224, 224] and save_inference_dir: ./inference)
  ppcls/configs/ImageNet/ResNet/ResNet200_vd.yaml (@@ -14,6 +14,18 @@ Global:, after image_shape: [3, 224, 224] and save_inference_dir: ./inference)
  ppcls/configs/ImageNet/ResNet/ResNet34.yaml (@@ -14,6 +14,18 @@ Global:, after image_shape: [3, 224, 224] and save_inference_dir: ./inference)
  ppcls/configs/ImageNet/ResNet/ResNet34_vd.yaml (@@ -14,6 +14,18 @@ Global:, after image_shape: [3, 224, 224] and save_inference_dir: ./inference)
  ppcls/configs/ImageNet/ResNet/ResNet50.yaml (@@ -16,6 +16,18 @@ Global:, after to_static: False)
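Per the same inline comment, level: O2 selects pure fp16 rather than mixed fp16. A sketch of what an O2 variant of this block could look like, again not part of this commit and assuming only the key names it adds (use_fp16_test is assumed here to switch evaluation to fp16 as well):

# mixed precision (illustrative O2 variant, not taken from the commit)
AMP:
  use_amp: True
  use_fp16_test: True
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O2

The loss-scaling keys are left at the defaults introduced above; only use_amp, use_fp16_test, and level change.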
The two remaining ResNet50 variants follow the same pattern:
  ppcls/configs/ImageNet/ResNet/ResNet50_fp32_ultra.yaml (@@ -17,6 +17,18 @@ Global:, after to_static: False; Arch name: ResNet50)
  ppcls/configs/ImageNet/ResNet/ResNet50_vd.yaml (@@ -14,6 +14,18 @@ Global:, after image_shape: [3, 224, 224] and save_inference_dir: ./inference; Arch name: ResNet50_vd)
The seven SENet-family configs change at @@ -14,6 +14,18 @@ Global:, after image_shape: [3, 224, 224] and save_inference_dir: ./inference (Arch name matches the file):
  ppcls/configs/ImageNet/SENet/SENet154_vd.yaml
  ppcls/configs/ImageNet/SENet/SE_ResNeXt101_32x4d.yaml
  ppcls/configs/ImageNet/SENet/SE_ResNeXt50_32x4d.yaml
  ppcls/configs/ImageNet/SENet/SE_ResNeXt50_vd_32x4d.yaml
  ppcls/configs/ImageNet/SENet/SE_ResNet18_vd.yaml
  ppcls/configs/ImageNet/SENet/SE_ResNet34_vd.yaml
  ppcls/configs/ImageNet/SENet/SE_ResNet50_vd.yaml
The seven ShuffleNetV2 configs change at @@ -14,6 +14,18 @@ Global:, after image_shape: [3, 224, 224] and save_inference_dir: ./inference (Arch name matches the file):
  ppcls/configs/ImageNet/ShuffleNet/ShuffleNetV2_swish.yaml
  ppcls/configs/ImageNet/ShuffleNet/ShuffleNetV2_x0_25.yaml
  ppcls/configs/ImageNet/ShuffleNet/ShuffleNetV2_x0_33.yaml
  ppcls/configs/ImageNet/ShuffleNet/ShuffleNetV2_x0_5.yaml
  ppcls/configs/ImageNet/ShuffleNet/ShuffleNetV2_x1_0.yaml
  ppcls/configs/ImageNet/ShuffleNet/ShuffleNetV2_x1_5.yaml
  ppcls/configs/ImageNet/ShuffleNet/ShuffleNetV2_x2_0.yaml
The two SqueezeNet configs change at @@ -14,6 +14,18 @@ Global:, after image_shape: [3, 224, 224] and save_inference_dir: ./inference (Arch name matches the file):
  ppcls/configs/ImageNet/SqueezeNet/SqueezeNet1_0.yaml
  ppcls/configs/ImageNet/SqueezeNet/SqueezeNet1_1.yaml
The six SwinTransformer configs change at @@ -16,6 +16,18 @@ Global:, after # training model under @to_static and to_static: False (Arch name matches the file):
  ppcls/configs/ImageNet/SwinTransformer/SwinTransformer_base_patch4_window12_384.yaml
  ppcls/configs/ImageNet/SwinTransformer/SwinTransformer_base_patch4_window7_224.yaml
  ppcls/configs/ImageNet/SwinTransformer/SwinTransformer_large_patch4_window12_384.yaml
  ppcls/configs/ImageNet/SwinTransformer/SwinTransformer_large_patch4_window7_224.yaml
  ppcls/configs/ImageNet/SwinTransformer/SwinTransformer_small_patch4_window7_224.yaml
  ppcls/configs/ImageNet/SwinTransformer/SwinTransformer_tiny_patch4_window7_224.yaml
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_base_patch4_window16_256.yaml
浏览文件 @
0f86c555
...
@@ -16,6 +16,18 @@ Global:
...
@@ -16,6 +16,18 @@ Global:
# training model under @to_static
# training model under @to_static
to_static
:
False
to_static
:
False
# mixed precision
AMP
:
use_amp
:
False
use_fp16_test
:
False
scale_loss
:
128.0
use_dynamic_loss_scaling
:
True
use_promote
:
False
# O1: mixed fp16, O2: pure fp16
level
:
O1
# model architecture
# model architecture
Arch
:
Arch
:
name
:
SwinTransformerV2_base_patch4_window16_256
name
:
SwinTransformerV2_base_patch4_window16_256
...
...
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_base_patch4_window24_384.yaml
浏览文件 @
0f86c555
...
@@ -16,6 +16,18 @@ Global:
...
@@ -16,6 +16,18 @@ Global:
# training model under @to_static
# training model under @to_static
to_static
:
False
to_static
:
False
# mixed precision
AMP
:
use_amp
:
False
use_fp16_test
:
False
scale_loss
:
128.0
use_dynamic_loss_scaling
:
True
use_promote
:
False
# O1: mixed fp16, O2: pure fp16
level
:
O1
# model architecture
# model architecture
Arch
:
Arch
:
name
:
SwinTransformerV2_base_patch4_window24_384
name
:
SwinTransformerV2_base_patch4_window24_384
...
...
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_base_patch4_window8_256.yaml
浏览文件 @
0f86c555
...
@@ -16,6 +16,18 @@ Global:
...
@@ -16,6 +16,18 @@ Global:
# training model under @to_static
# training model under @to_static
to_static
:
False
to_static
:
False
# mixed precision
AMP
:
use_amp
:
False
use_fp16_test
:
False
scale_loss
:
128.0
use_dynamic_loss_scaling
:
True
use_promote
:
False
# O1: mixed fp16, O2: pure fp16
level
:
O1
# model architecture
# model architecture
Arch
:
Arch
:
name
:
SwinTransformerV2_base_patch4_window8_256
name
:
SwinTransformerV2_base_patch4_window8_256
...
...
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_large_patch4_window16_256.yaml
浏览文件 @
0f86c555
...
@@ -16,6 +16,18 @@ Global:
...
@@ -16,6 +16,18 @@ Global:
# training model under @to_static
# training model under @to_static
to_static
:
False
to_static
:
False
# mixed precision
AMP
:
use_amp
:
False
use_fp16_test
:
False
scale_loss
:
128.0
use_dynamic_loss_scaling
:
True
use_promote
:
False
# O1: mixed fp16, O2: pure fp16
level
:
O1
# model architecture
# model architecture
Arch
:
Arch
:
name
:
SwinTransformerV2_large_patch4_window16_256
name
:
SwinTransformerV2_large_patch4_window16_256
...
...
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_large_patch4_window24_384.yaml
...
@@ -16,6 +16,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: SwinTransformerV2_large_patch4_window24_384
...
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_small_patch4_window16_256.yaml
...
@@ -16,6 +16,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: SwinTransformerV2_small_patch4_window16_256
...
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_small_patch4_window8_256.yaml
...
@@ -16,6 +16,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: SwinTransformerV2_small_patch4_window8_256
...
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_tiny_patch4_window16_256.yaml
...
@@ -16,6 +16,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: SwinTransformerV2_tiny_patch4_window16_256
...
ppcls/configs/ImageNet/SwinTransformerV2/SwinTransformerV2_tiny_patch4_window8_256.yaml
...
@@ -16,6 +16,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: SwinTransformerV2_tiny_patch4_window8_256
...
ppcls/configs/ImageNet/TNT/TNT_base.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 224, 224]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: TNT_base
...
ppcls/configs/ImageNet/TNT/TNT_small.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 224, 224]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: TNT_small
...
ppcls/configs/ImageNet/TinyNet/TinyNet_A.yaml
...
@@ -18,6 +18,18 @@ Global:
EMA:
  decay: 0.9999

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: TinyNet_A
...
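The level comment added to every file notes that O1 selects mixed fp16 while O2 selects pure fp16. As a further illustrative sketch that is not part of this commit, a config such as TinyNet_A could request pure-fp16 training, with fp16 also used at evaluation time via use_fp16_test:

# mixed precision (illustrative sketch: pure fp16 via the O2 level, fp16 also used for testing)
AMP:
  use_amp: True
  use_fp16_test: True
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O2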
ppcls/configs/ImageNet/TinyNet/TinyNet_B.yaml
...
@@ -18,6 +18,18 @@ Global:
EMA:
  decay: 0.9999

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: TinyNet_B
...
ppcls/configs/ImageNet/TinyNet/TinyNet_C.yaml
...
@@ -18,6 +18,18 @@ Global:
EMA:
  decay: 0.9999

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: TinyNet_C
...
ppcls/configs/ImageNet/TinyNet/TinyNet_D.yaml
...
@@ -18,6 +18,18 @@ Global:
EMA:
  decay: 0.9999

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: TinyNet_D
...
ppcls/configs/ImageNet/TinyNet/TinyNet_E.yaml
...
@@ -18,6 +18,18 @@ Global:
EMA:
  decay: 0.9999

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: TinyNet_E
...
ppcls/configs/ImageNet/Twins/alt_gvt_base.yaml
...
@@ -16,6 +16,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: alt_gvt_base
...
ppcls/configs/ImageNet/Twins/alt_gvt_large.yaml
...
@@ -16,6 +16,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: alt_gvt_large
...
ppcls/configs/ImageNet/Twins/alt_gvt_small.yaml
...
@@ -16,6 +16,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: alt_gvt_small
...
ppcls/configs/ImageNet/Twins/pcpvt_base.yaml
...
@@ -16,6 +16,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: pcpvt_base
...
ppcls/configs/ImageNet/Twins/pcpvt_large.yaml
...
@@ -16,6 +16,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: pcpvt_large
...
ppcls/configs/ImageNet/Twins/pcpvt_small.yaml
...
@@ -16,6 +16,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: pcpvt_small
...
ppcls/configs/ImageNet/UniFormer/UniFormer_base.yaml
...
@@ -17,6 +17,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: UniFormer_base
...
ppcls/configs/ImageNet/UniFormer/UniFormer_base_ls.yaml
...
@@ -17,6 +17,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: UniFormer_base_ls
...
ppcls/configs/ImageNet/UniFormer/UniFormer_small.yaml
...
@@ -17,6 +17,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: UniFormer_small
...
ppcls/configs/ImageNet/UniFormer/UniFormer_small_plus.yaml
...
@@ -17,6 +17,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: UniFormer_small_plus
...
ppcls/configs/ImageNet/UniFormer/UniFormer_small_plus_dim64.yaml
...
@@ -17,6 +17,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: UniFormer_small_plus_dim64
...
ppcls/configs/ImageNet/VAN/VAN_B0.yaml
...
@@ -16,6 +16,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: VAN_B0
...
ppcls/configs/ImageNet/VAN/VAN_B1.yaml
...
@@ -16,6 +16,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: VAN_B1
...
ppcls/configs/ImageNet/VAN/VAN_B2.yaml
...
@@ -16,6 +16,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: VAN_B2
...
ppcls/configs/ImageNet/VAN/VAN_B3.yaml
...
@@ -16,6 +16,18 @@ Global:
  # training model under @to_static
  to_static: False

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: VAN_B3
...
ppcls/configs/ImageNet/VGG/VGG11.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 224, 224]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: VGG11
...
ppcls/configs/ImageNet/VGG/VGG13.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 224, 224]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: VGG13
...
ppcls/configs/ImageNet/VGG/VGG16.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 224, 224]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: VGG16
...
ppcls/configs/ImageNet/VGG/VGG19.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 224, 224]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: VGG19
...
ppcls/configs/ImageNet/VisionTransformer/ViT_base_patch16_224.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 224, 224]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: ViT_base_patch16_224
...
ppcls/configs/ImageNet/VisionTransformer/ViT_base_patch16_384.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 384, 384]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: ViT_base_patch16_384
...
ppcls/configs/ImageNet/VisionTransformer/ViT_base_patch32_384.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 384, 384]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: ViT_base_patch32_384
...
ppcls/configs/ImageNet/VisionTransformer/ViT_large_patch16_224.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 224, 224]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: ViT_large_patch16_224
...
ppcls/configs/ImageNet/VisionTransformer/ViT_large_patch16_384.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 384, 384]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: ViT_large_patch16_384
...
ppcls/configs/ImageNet/VisionTransformer/ViT_large_patch32_384.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 384, 384]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: ViT_large_patch32_384
...
ppcls/configs/ImageNet/VisionTransformer/ViT_small_patch16_224.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 224, 224]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: ViT_small_patch16_224
...
ppcls/configs/ImageNet/Xception/Xception41.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 299, 299]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: Xception41
...
ppcls/configs/ImageNet/Xception/Xception41_deeplab.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 299, 299]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: Xception41_deeplab
...
ppcls/configs/ImageNet/Xception/Xception65.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 299, 299]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: Xception65
...
ppcls/configs/ImageNet/Xception/Xception65_deeplab.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 299, 299]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: Xception65_deeplab
...
ppcls/configs/ImageNet/Xception/Xception71.yaml
...
@@ -14,6 +14,18 @@ Global:
  image_shape: [3, 299, 299]
  save_inference_dir: ./inference

# mixed precision
AMP:
  use_amp: False
  use_fp16_test: False
  scale_loss: 128.0
  use_dynamic_loss_scaling: True
  use_promote: False
  # O1: mixed fp16, O2: pure fp16
  level: O1

# model architecture
Arch:
  name: Xception71
...