Commit 68da674d authored by LDOUBLEV

fix distill

Parent 7e5d6055
@@ -35,7 +35,7 @@ Architecture:
         name: DBHead
         kernel_list: [7,2,2]
         k: 50
-    Teacher:
+    Student2:
       return_all_feats: false
       model_type: det
       algorithm: DB
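For context, CML distillation for detection trains one teacher together with two lightweight students, so the `Models` section of `ch_PP-OCRv3_det_cml.yml` is expected to declare the keys `Teacher`, `Student`, and `Student2`; the hunk above renames the mislabeled sub-model key to `Student2`, presumably so that it matches the model names referenced by the distillation losses. The block below is an illustrative sketch of such an `Architecture` section, with backbones, necks, and heads omitted; the field values are assumptions, not the verbatim file contents.

```yaml
# Illustrative sketch only; keys mirror those visible in the diff plus
# commonly used distillation-config fields (freeze_params is an assumption).
Architecture:
  name: DistillationModel
  algorithm: Distillation
  model_type: det
  Models:
    Teacher:                 # large detector, typically frozen during CML training
      freeze_params: true
      return_all_feats: false
      model_type: det
      algorithm: DB
    Student:                 # first lightweight student
      return_all_feats: false
      model_type: det
      algorithm: DB
    Student2:                # second lightweight student; the key fixed by this commit
      return_all_feats: false
      model_type: det
      algorithm: DB
```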
@@ -448,7 +448,7 @@ Architecture:
 <a name="222"></a>
 #### 2.2.2 Loss Function
-The distillation loss configuration for detection (ch_PP-OCRv3_det_cml.yml) is shown below. Compared with the loss configuration of ch_PP-OCRv3_det_distill.yml, the CML distillation loss configuration makes three changes:
+The distillation loss configuration for detection (ch_PP-OCRv3_det_cml.yml) is shown below.
 ```yaml
 Loss:
   name: CombinedLoss
@@ -464,7 +464,7 @@ The key contains `backbone_out`, `neck_out`, `head_out`, and `value` is the tensor of each module.
 <a name="222"></a>
 #### 2.2.2 Loss Function
-The distillation loss function configuration(`ch_PP-OCRv3_det_cml.yml`) is shown below. Compared with the loss function configuration of ch_PP-OCRv3_det_distill.yml, there are three changes:
+The distillation loss function configuration(`ch_PP-OCRv3_det_cml.yml`) is shown below.
 ```yaml
 Loss:
   name: CombinedLoss
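# Editorial sketch (not part of the commit diff above). The hunks above cut
# off after "name: CombinedLoss". In PaddleOCR's CML detection distillation,
# CombinedLoss typically stacks three terms: a distillation DB loss that
# supervises each student with the teacher's output, a DML (mutual learning)
# loss between the two students, and the ordinary DB loss against the ground
# truth. The weights and keys below are assumptions for illustration and are
# not the verbatim contents of ch_PP-OCRv3_det_cml.yml.
Loss:
  name: CombinedLoss
  loss_config_list:
    - DistillationDilaDBLoss:        # students learn from the teacher's predicted maps
        weight: 1.0
        model_name_pairs:
          - ["Student", "Teacher"]
          - ["Student2", "Teacher"]
        key: maps
    - DistillationDMLLoss:           # deep mutual learning between Student and Student2
        weight: 1.0
        model_name_pairs:
          - ["Student", "Student2"]
        key: maps
    - DistillationDBLoss:            # standard DB loss of each student vs. the ground truth
        weight: 1.0
        model_name_list: ["Student", "Student2"]
        key: maps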