Commit b31b07de authored by littletomatodonkey

improve distillation config

Parent 8a01dcda
@@ -18,7 +18,7 @@ Global:
 # model architecture
 Arch:
   name: "DistillationModel"
-  class_num: 1000
+  class_num: &class_num 1000
   # if not null, its lengths should be same as models
   pretrained_list:
   # if not null, its lengths should be same as models
@@ -28,11 +28,13 @@ Arch:
   models:
     - Teacher:
         name: MobileNetV3_large_x1_0
+        class_num: *class_num
         pretrained: True
         use_ssld: True
         dropout_prob: null
     - Student:
         name: MobileNetV3_small_x1_0
+        class_num: *class_num
         pretrained: False
         dropout_prob: null
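The change above replaces the hard-coded `class_num: 1000` with a YAML anchor (`&class_num`) that both the Teacher and Student entries reuse via an alias (`*class_num`), so the class count only needs to be edited in one place. A minimal sketch of how the anchor resolves when the config is parsed, using PyYAML on a simplified fragment (not the full PaddleClas config):

```python
import yaml

# Simplified fragment illustrating the anchor/alias pattern from this commit:
# `&class_num` defines an anchor on the value 1000, and each `*class_num`
# alias resolves to that same value at load time.
config_text = """
Arch:
  name: "DistillationModel"
  class_num: &class_num 1000
  models:
    - Teacher:
        name: MobileNetV3_large_x1_0
        class_num: *class_num
    - Student:
        name: MobileNetV3_small_x1_0
        class_num: *class_num
"""

config = yaml.safe_load(config_text)
teacher = config["Arch"]["models"][0]["Teacher"]
student = config["Arch"]["models"][1]["Student"]
# Both models see the anchored value; changing the anchor updates both.
print(teacher["class_num"], student["class_num"])
```

Running this prints `1000 1000`: the parser expands the aliases before the model code ever sees the config, so each model receives an ordinary `class_num` field.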