s920243400 / PaddleDetection
Forked from PaddlePaddle / PaddleDetection (in sync with the upstream project)
Commit 3c8aa787
Authored Jan 30, 2019 by xuezhong

define sampled_softmax_with_cross_entropy

Parent: 15d52f09
Showing 2 changed files with 31 additions and 17 deletions:

python/paddle/fluid/layers/nn.py                      +30  -16
python/paddle/fluid/tests/unittests/test_layers.py     +1   -1
python/paddle/fluid/layers/nn.py
@@ -87,7 +87,7 @@ __all__ = [
     'transpose',
     'im2sequence',
     'nce',
-    'sample_logits',
+    'sampled_softmax_with_cross_entropy',
     'hsigmoid',
     'beam_search',
     'row_conv',
@@ -5765,23 +5765,22 @@ def softmax_with_cross_entropy(logits,
     return loss
 
 
-def sample_logits(logits,
-                  label,
-                  num_samples,
-                  uniq=True,
-                  remove_accidental_hits=True,
-                  use_custom_samples=False,
-                  custom_samples=None,
-                  custom_probabilities=None,
-                  seed=0):
+def sampled_softmax_with_cross_entropy(logits,
+                                       label,
+                                       num_samples,
+                                       num_true=num_true,
+                                       remove_accidental_hits=True,
+                                       use_custom_samples=False,
+                                       custom_samples=None,
+                                       custom_probabilities=None,
+                                       seed=0):
     """
     **Sampled Softmax With Cross Entropy Operator.**
 
     Cross entropy loss with sampled softmax is used as the output layer for
     larger output classes extensively. This operator samples a number of samples
-    for each example(row), and computes the softmax normalized values for each
+    for all examples, and computes the softmax normalized values for each
     row of the sampled tensor, after which cross-entropy loss is computed.
     This provides a more numerically stable gradient.
 
     Because this operator performs a softmax on logits internally, it expects
     unscaled logits. This operator should not be used with the output of
@@ -5810,13 +5809,19 @@ def sample_logits(logits,
             labels per example.
         num_samples (int): The number for each example, num_samples should be
             less than the number of class.
-        seed (int): The random seed for generating random number, which is used
-            in the process of sampling. Default is 0.
+        num_true(int): The number of target classes per training example.
         remove_accidental_hits (bool): A flag indicating whether to remove
             accidental hits when sampling. If True and if a sample[i, j]
             accidentally hits true labels, then the corresponding
             sampled_logits[i, j] is minus by 1e20 to make its softmax result
             close to zero. Default is True.
+        use_custom_samples (bool): Whether to use custom samples and probabities to sample
+            logits.
+        custom_samples (Variable): User defined samples, which is a 1-D tensor with shape [S]. S is the num_samples.
+        custom_probabilities (Variable): User defined probabilities of samples, a 1-D tensor which has the same shape with custom_samples.
+        seed (int): The random seed for generating random number, which is used
+            in the process of sampling. Default is 0.
 
     Returns:
         Variable: Return the cross entropy loss which is a 2-D tensor with shape
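The remove_accidental_hits behaviour documented above can be seen in isolation with a small NumPy sketch (the arrays below are hypothetical): any sampled class that happens to equal a true label has 1e20 subtracted from its logit, so its softmax probability collapses to essentially zero.

import numpy as np

# Hypothetical sampler output for 2 examples with 4 sampled classes each.
samples = np.array([[7, 42, 3, 19],
                    [5, 11, 60, 80]])
sampled_logits = np.random.randn(2, 4)
true_label = np.array([42, 80])          # one true class per example

# An "accidental hit" is a sampled class that equals the true label.
hits = samples == true_label[:, None]
sampled_logits = np.where(hits, sampled_logits - 1e20, sampled_logits)

# exp() of the shifted logit underflows to zero, so the accidental hit
# contributes essentially nothing to the softmax denominator.
probs = np.exp(sampled_logits - sampled_logits.max(axis=1, keepdims=True))
probs /= probs.sum(axis=1, keepdims=True)
print(probs.round(3))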
@@ -5855,12 +5860,21 @@ def sample_logits(logits,
         },
         attrs={
             'use_custom_samples': use_custom_samples,
-            'uniq': uniq,
+            'uniq': True,
             'remove_accidental_hits': remove_accidental_hits,
             'num_samples': num_samples,
             'seed': seed
         })
-    return sampled_logits, sampled_label, samples, probabilities
+    helper.append_op(
+        type='softmax_with_cross_entropy',
+        inputs={
+            'Logits': sampled_logits,
+            'Label': sampled_label,
+            'soft_label': False,
+        },
+        outputs={
+            'loss': samples,
+        })
+    return outputs / num_true
 
 
 def smooth_l1(x, y, inside_weight=None, outside_weight=None, sigma=None):
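Structurally, the layer at this revision chains two ops: a sampling op that produces sampled_logits and a remapped sampled_label, followed by softmax_with_cross_entropy over the sampled row, with the result divided by num_true. A plain-NumPy sketch of that pipeline follows; both helper functions are stand-ins for the Fluid kernels rather than their actual implementations, and the final division reflects one plausible reading of the return outputs / num_true line (outputs is not defined anywhere in the hunk above, so this revision reads as work in progress).

import numpy as np

def softmax_with_cross_entropy(logits, label):
    # Stand-in for the second appended op: row-wise softmax + cross entropy.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -np.take_along_axis(log_probs, label, axis=1)

def sample_logits(logits, label, num_samples, seed=0):
    # Stand-in for the sampling op: true columns first, then sampled columns,
    # with labels remapped to their positions inside the sampled row.
    rng = np.random.default_rng(seed)
    batch, num_classes = logits.shape
    num_true = label.shape[1]
    sampled_logits = np.empty((batch, num_true + num_samples))
    for i in range(batch):
        negatives = rng.choice(num_classes, size=num_samples, replace=False)
        sampled_logits[i] = logits[i, np.concatenate((label[i], negatives))]
    sampled_label = np.tile(np.arange(num_true), (batch, 1))
    return sampled_logits, sampled_label

logits = np.random.randn(3, 500)
label = np.array([[4, 9], [100, 7], [250, 251]])      # num_true = 2
sampled_logits, sampled_label = sample_logits(logits, label, num_samples=20)
loss = softmax_with_cross_entropy(sampled_logits, sampled_label)
print(loss.sum(axis=1, keepdims=True) / label.shape[1])   # average over true labels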
python/paddle/fluid/tests/unittests/test_layers.py
@@ -374,7 +374,7 @@ class TestBook(unittest.TestCase):
         self.assertIsNotNone(output)
         print(str(program))
 
-    def test_sample_logits(self):
+    def test_sampled_softmax_with_cross_entropy(self):
         program = Program()
         with program_guard(program):
             logits = layers.data(name='Logits', shape=[256], dtype='float64')
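Mirroring the updated test, user code would call the renamed layer roughly as follows. This is a sketch against the signature introduced in nn.py above: the Label shape, dtype, and num_samples value are assumptions, and the commit itself may not import cleanly (the num_true=num_true default in the new signature has no definition in scope), so treat this as intended usage rather than something verified against this exact revision.

import paddle.fluid as fluid

program = fluid.Program()
with fluid.program_guard(program):
    # Unscaled class scores, declared as in the unit test above; the Label
    # shape and dtype are assumptions, not taken from this diff.
    logits = fluid.layers.data(name='Logits', shape=[256], dtype='float64')
    label = fluid.layers.data(name='Label', shape=[1], dtype='int64')
    loss = fluid.layers.sampled_softmax_with_cross_entropy(
        logits=logits, label=label, num_samples=25)
print(str(program))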