Commit da6fde1b authored by A. Unique TensorFlower, committed by TensorFlower Gardener

fixed typo tf.nn.(sparse_)softmax_cross_entropy_with_logits

Change: 139913751
Parent c9b6ce1d
@@ -343,13 +343,13 @@ each element of `y_` with the corresponding element of `tf.log(y)`. Then
 `reduction_indices=[1]` parameter. Finally, `tf.reduce_mean` computes the mean
 over all the examples in the batch.
-(Note that in the source code, we don't use this formulation, because it is
+Note that in the source code, we don't use this formulation, because it is
 numerically unstable. Instead, we apply
 `tf.nn.softmax_cross_entropy_with_logits` on the unnormalized logits (e.g., we
 call `softmax_cross_entropy_with_logits` on `tf.matmul(x, W) + b`), because this
 more numerically stable function internally computes the softmax activation. In
-your code, consider using tf.nn.(sparse_)softmax_cross_entropy_with_logits
-instead).
+your code, consider using `tf.nn.softmax_cross_entropy_with_logits`
+instead.
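The numerical-stability point above can be illustrated outside TensorFlow. Below is a minimal NumPy sketch (not code from the tutorial) contrasting the naive softmax-then-log formulation with the log-sum-exp form that a fused op such as `tf.nn.softmax_cross_entropy_with_logits` computes internally:

```python
import numpy as np

def naive_cross_entropy(logits, labels):
    # The textbook formulation: softmax first, then log.
    # exp() overflows for large logits, so this can produce inf/nan.
    exp = np.exp(logits)
    probs = exp / exp.sum(axis=1, keepdims=True)
    return -np.sum(labels * np.log(probs), axis=1)

def stable_cross_entropy(logits, labels):
    # Log-softmax computed directly from the logits with the log-sum-exp
    # trick: subtracting the row max keeps every exp() argument <= 0,
    # so nothing overflows.
    m = logits.max(axis=1, keepdims=True)
    log_probs = logits - m - np.log(np.exp(logits - m).sum(axis=1, keepdims=True))
    return -np.sum(labels * log_probs, axis=1)

logits = np.array([[1000.0, 0.0, -1000.0]])  # extreme unnormalized scores
labels = np.array([[1.0, 0.0, 0.0]])         # one-hot target

with np.errstate(over="ignore", invalid="ignore"):
    print(naive_cross_entropy(logits, labels))   # [nan] -- exp(1000) overflows
print(stable_cross_entropy(logits, labels))      # [0.] -- correct loss
```

Both functions agree for moderate logits; the naive version only breaks down once `exp(logit)` leaves floating-point range, which is exactly why the tutorial recommends handing the unnormalized logits to the fused op.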
Now that we know what we want our model to do, it's very easy to have TensorFlow
train it to do so. Because TensorFlow knows the entire graph of your
......