Commit 8f59d79d authored by: Q qiaolongfei

update doc for sigmoid_cross_entropy_with_logits

Parent 5b50307b
@@ -113,14 +113,14 @@ The logistic loss is given as follows:
$$loss = -Labels * \log(\sigma(X)) - (1 - Labels) * \log(1 - \sigma(X))$$
-We know that $$\sigma(X) = (1 / (1 + \exp(-X)))$$. By substituting this we get:
+We know that $$\sigma(X) = \\frac{1}{1 + \exp(-X)}$$. By substituting this we get:
$$loss = X - X * Labels + \log(1 + \exp(-X))$$
For stability and to prevent overflow of $$\exp(-X)$$ when X < 0,
we reformulate the loss as follows:
-$$loss = \max(X, 0) - X * Labels + \log(1 + \exp(-|X|))$$
+$$loss = \max(X, 0) - X * Labels + \log(1 + \exp(-\|X\|))$$
Both the input `X` and `Labels` can carry the LoD (Level of Details) information.
However, the output only shares the LoD with the input `X`.
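
The "by substituting this we get" step in the hunk above compresses a few lines of algebra. A worked expansion (editorial, not part of the patch), using $\log(\sigma(X)) = -\log(1 + \exp(-X))$ and $\log(1 - \sigma(X)) = -X - \log(1 + \exp(-X))$:

```latex
\begin{aligned}
loss &= -Labels \cdot \log(\sigma(X)) - (1 - Labels) \cdot \log(1 - \sigma(X)) \\
     &= Labels \cdot \log(1 + \exp(-X))
        + (1 - Labels) \cdot \bigl(X + \log(1 + \exp(-X))\bigr) \\
     &= X - X \cdot Labels + \log(1 + \exp(-X))
\end{aligned}
```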
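
To see the overflow issue and the fix concretely, here is a minimal NumPy sketch contrasting the direct loss with the reformulated one. This is not Paddle's implementation; the function names are hypothetical.

```python
import numpy as np

def sigmoid_ce_naive(x, labels):
    # Direct form: -labels*log(sigmoid(x)) - (1-labels)*log(1-sigmoid(x)).
    # np.exp(-x) overflows for large negative x, so this version is unstable.
    sig = 1.0 / (1.0 + np.exp(-x))
    return -labels * np.log(sig) - (1.0 - labels) * np.log(1.0 - sig)

def sigmoid_ce_stable(x, labels):
    # Reformulated loss: max(x, 0) - x*labels + log(1 + exp(-|x|)).
    # The exponent is always <= 0, so exp(-|x|) <= 1 and can never overflow.
    return np.maximum(x, 0.0) - x * labels + np.log1p(np.exp(-np.abs(x)))

x = np.array([-3.0, 0.5, 4.0])
labels = np.array([0.0, 1.0, 1.0])
print(sigmoid_ce_naive(x, labels))   # agrees with the stable form for moderate x
print(sigmoid_ce_stable(x, labels))  # stays finite even for x like -1e4
```

Using `np.log1p` rather than `np.log(1 + ...)` also preserves precision when `exp(-|x|)` is tiny.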