BaiXuePrincess / Paddle
Forked from PaddlePaddle / Paddle
Commit bb64efb1
Unverified commit bb64efb1
Authored on Nov 27, 2020 by Guanghua Yu; committed via GitHub on Nov 27, 2020
fix softmax_with_cross_entropy api en docs (#29116)
Parent: 0dfb8161
Showing 1 changed file with 13 additions and 12 deletions (+13 −12)
python/paddle/fluid/layers/loss.py @ bb64efb1

@@ -1162,9 +1162,6 @@ def softmax_with_cross_entropy(logits,
                                return_softmax=False,
                                axis=-1):
     r"""
-    :alias_main: paddle.nn.functional.softmax_with_cross_entropy
-    :alias: paddle.nn.functional.softmax_with_cross_entropy,paddle.nn.functional.loss.softmax_with_cross_entropy
-    :old_api: paddle.fluid.layers.softmax_with_cross_entropy
     This operator implements the cross entropy loss function with softmax. This function
     combines the calculation of the softmax operation and the cross entropy loss function
...
@@ -1209,8 +1206,8 @@ def softmax_with_cross_entropy(logits,
     and then cross entropy loss is calculated by softmax and label.

     Args:
-        logits (Variable): A multi-dimension ``Tensor`` , and the data type is float32 or float64. The input tensor of unscaled log probabilities.
-        label (Variable): The ground truth ``Tensor`` , data type is the same
+        logits (Tensor): A multi-dimension ``Tensor`` , and the data type is float32 or float64. The input tensor of unscaled log probabilities.
+        label (Tensor): The ground truth ``Tensor`` , data type is the same
         as the ``logits`` . If :attr:`soft_label` is set to :attr:`True`,
         Label is a ``Tensor`` in the same shape with :attr:`logits`.
         If :attr:`soft_label` is set to :attr:`True`, Label is a ``Tensor``
...
@@ -1236,7 +1233,7 @@ def softmax_with_cross_entropy(logits,
     is the rank of input :attr:`logits`. Default: -1.

     Returns:
-        ``Variable`` or Tuple of two ``Variable`` : Return the cross entropy loss if \
+        ``Tensor`` or Tuple of two ``Tensor`` : Return the cross entropy loss if \
         `return_softmax` is False, otherwise the tuple \
         (loss, softmax), softmax is in the same shape \
         with input logits and cross entropy loss is in \
...
@@ -1246,13 +1243,17 @@ def softmax_with_cross_entropy(logits,
     Examples:
         .. code-block:: python

-            import paddle.fluid as fluid
+            import paddle
+            import numpy as np

-            data = fluid.data(name='data', shape=[-1, 128], dtype='float32')
-            label = fluid.data(name='label', shape=[-1, 1], dtype='int64')
-            fc = fluid.layers.fc(input=data, size=100)
-            out = fluid.layers.softmax_with_cross_entropy(
-                logits=fc, label=label)
+            data = np.random.rand(128).astype("float32")
+            label = np.random.rand(1).astype("int64")
+            data = paddle.to_tensor(data)
+            label = paddle.to_tensor(label)
+            linear = paddle.nn.Linear(128, 100)
+            x = linear(data)
+            out = paddle.nn.functional.softmax_with_cross_entropy(logits=x, label=label)
+            print(out)
     """
     if in_dygraph_mode():
         softmax, loss = core.ops.softmax_with_cross_entropy(
...
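The docstring being edited says this operator "combines the calculation of the softmax operation and the cross entropy loss function" — fusing the two is what gives the numerically stable result, since the softmax is never materialized before the log is taken. As a rough illustration of that idea (a NumPy sketch for hard index labels only, not Paddle's actual implementation, which also handles `soft_label` and `ignore_index`):

```python
import numpy as np

def softmax_with_cross_entropy(logits, label, return_softmax=False):
    """Numerically stable softmax + cross entropy over the last axis.

    logits: float array of shape [N, C]; label: int array of shape [N, 1].
    """
    # Subtract the per-row max before exponentiating to avoid overflow,
    # then compute log-softmax directly instead of log(softmax(...)).
    shifted = logits - np.max(logits, axis=-1, keepdims=True)
    log_softmax = shifted - np.log(np.sum(np.exp(shifted), axis=-1, keepdims=True))
    # Cross entropy for hard labels picks -log_softmax at the true class index.
    idx = label.reshape(-1)
    loss = -log_softmax[np.arange(logits.shape[0]), idx].reshape(-1, 1)
    if return_softmax:
        return loss, np.exp(log_softmax)
    return loss

rng = np.random.default_rng(0)
logits = rng.standard_normal((4, 10)).astype("float32")
label = rng.integers(0, 10, size=(4, 1))
loss, softmax = softmax_with_cross_entropy(logits, label, return_softmax=True)
print(loss.shape, softmax.shape)  # (4, 1) (4, 10)
```

As in the patched docstring's `Returns` section, the loss comes back per example, and `return_softmax=True` additionally yields a softmax array in the same shape as the logits.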