Commit 85ff9692 authored by yangfukui

Update single_distiller.py

Parent 49a17a25
@@ -162,7 +162,7 @@ def soft_label_loss(teacher_var_name,
     return soft_label_loss

-def self_defined_loss(program, loss_func, **kwargs):
+def loss(program, loss_func, **kwargs):
     """
     Combine variables from student model and teacher model by self defined loss.
     Args:
...
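For context, the hunk above only renames the user-defined-loss entry point from self_defined_loss to loss, keeping the (program, loss_func, **kwargs) signature and the docstring that describes combining student and teacher variables with a custom loss. The sketch below is a minimal, hypothetical usage example of the renamed API; the paddle.fluid / paddleslim.dist import paths, the variable names, and the assumption that keyword arguments are resolved to variables of `program` and forwarded to `loss_func` are inferred from the docstring, not confirmed by this diff.

```python
# Hypothetical usage sketch for the renamed `loss` helper.
# Assumptions (not shown in the diff): `program` already contains both the
# student and teacher variables, and string-valued **kwargs are looked up as
# variables of `program` before being passed to `loss_func`.
import paddle.fluid as fluid
from paddleslim.dist import loss  # renamed from self_defined_loss in this commit


def my_distill_loss(student_var, teacher_var):
    # User-defined way of combining a student variable with a teacher variable.
    return fluid.layers.reduce_mean(
        fluid.layers.square_error_cost(student_var, teacher_var))


# Assumed to be a program that already holds both graphs (e.g. after merging).
main_program = fluid.default_main_program()

distill_loss = loss(
    main_program,
    my_distill_loss,
    student_var='student_fc_out',   # hypothetical variable names
    teacher_var='teacher_fc_out')
```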