paddle.fluid.layers.cross_entropy(input, label, soft_label=True): label is modified after running
Created by: linrjing
The loss uses fluid.layers.cross_entropy(input=self.network_outputs[0], label=gt_label, soft_label=True).
Input: gt_label is a soft label I set myself: [0. 0. 0. 0.5 0. 0.5 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
After the run() call during training, the gt_label printed from fetch_out has changed to:
The full loss code is as follows:

```python
cost_cls = fluid.layers.cross_entropy(input=self.network_outputs[0], label=self.label_id_input, soft_label=True)
# cost_cls = fluid.layers.sigmoid_cross_entropy_with_logits(x=self.logit, label=self.label_id_input)
cost_cls = fluid.layers.reduce_sum(cost_cls, dim=-1)
sum_cost_cls = fluid.layers.reduce_sum(cost_cls)
self.loss_cls_ = fluid.layers.scale(sum_cost_cls, scale=self.num_gpus, bias_after_scale=False)
```
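For reference, here is a minimal standalone sketch of how I would check whether the fed label comes back changed after run(), assuming the Paddle 1.x fluid API; the variable names `probs`/`gt` and the uniform prediction are placeholders I made up, not part of my actual model:

```python
import numpy as np
import paddle.fluid as fluid

# Placeholder inputs: 16-class soft label, prediction assumed to already be probabilities.
probs = fluid.layers.data(name='probs', shape=[16], dtype='float32')
gt = fluid.layers.data(name='gt', shape=[16], dtype='float32')

loss = fluid.layers.cross_entropy(input=probs, label=gt, soft_label=True)
loss = fluid.layers.reduce_sum(loss)

exe = fluid.Executor(fluid.CPUPlace())
exe.run(fluid.default_startup_program())

# Same soft label as in the report: 0.5 at indices 3 and 5.
gt_np = np.zeros([1, 16], dtype='float32')
gt_np[0, 3] = 0.5
gt_np[0, 5] = 0.5
probs_np = np.full([1, 16], 1.0 / 16, dtype='float32')

fetched_gt, fetched_loss = exe.run(fluid.default_main_program(),
                                   feed={'probs': probs_np, 'gt': gt_np},
                                   fetch_list=[gt, loss])
# Compare the fetched label with what was fed in.
print(fetched_gt)
print(np.allclose(fetched_gt, gt_np))
```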
I also tried fluid.layers.sigmoid_cross_entropy_with_logits instead; with it, gt_label is not changed.
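If the in-place modification is confirmed, a possible interim workaround (my own assumption, not a verified fix) would be to pass a copy of the label into cross_entropy so the fed tensor itself is not touched, e.g. via fluid.layers.assign:

```python
# Hypothetical workaround sketch: feed a copied label into cross_entropy so the
# original label tensor is (presumably) left untouched. Not verified.
label_copy = fluid.layers.assign(self.label_id_input)  # copies the label variable
label_copy.stop_gradient = True                        # treat the copy as a constant target
cost_cls = fluid.layers.cross_entropy(input=self.network_outputs[0],
                                      label=label_copy,
                                      soft_label=True)
```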