PaddlePaddle / Paddle

Commit 2e40660e
Authored on Apr 04, 2018 by wanghaoshuang

Fix some issues.

Parent: 19db989e
Showing 4 changed files with 18 additions and 16 deletions (+18 -16):

python/paddle/fluid/framework.py    +2  -2
python/paddle/fluid/layers/nn.py    +11 -9
python/paddle/fluid/optimizer.py    +2  -2
python/paddle/fluid/param_attr.py   +3  -3
python/paddle/fluid/framework.py
@@ -1155,7 +1155,7 @@ class Parameter(Variable):
         self.gradient_clip_attr = kwargs.get('gradient_clip_attr', None)

-        self.average = kwargs.get('average', True)
+        self.do_model_average = kwargs.get('do_model_average', None)

     def __str__(self):
         return self.to_string(True)

@@ -1177,7 +1177,7 @@ class Parameter(Variable):
         if with_details:
             res_str = Variable.to_string(self, throw_on_error, True)
             additional_attr = ("trainable", "optimize_attr", "regularizer",
-                               "gradient_clip_attr", "average")
+                               "gradient_clip_attr", "do_model_average")
             for attr_name in additional_attr:
                 res_str += "%s: %s\n" % (attr_name, str(getattr(self, attr_name)))
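In short, Parameter now reads a do_model_average keyword defaulting to None, where the old average keyword defaulted to True. A minimal plain-Python sketch of that change in defaults; kwargs here simply stands in for the keyword arguments forwarded to Parameter.__init__:

# `kwargs` stands in for the keyword arguments forwarded to Parameter.__init__.
kwargs = {}

# Before this commit: every parameter was opted into model averaging by default.
old_value = kwargs.get('average', True)            # -> True
# After this commit: the setting stays undecided unless the caller passes it.
new_value = kwargs.get('do_model_average', None)   # -> None
print(old_value, new_value)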
python/paddle/fluid/layers/nn.py
@@ -1489,8 +1489,7 @@ def batch_norm(input,
                name=None,
                moving_mean_name=None,
                moving_variance_name=None,
-               average_mean=True,
-               average_variance=True):
+               do_model_average_for_mean_and_var=False):
     """
     This function helps create an operator to implement
     the BatchNorm layer using the configurations from the input parameters.

@@ -1519,12 +1518,15 @@ def batch_norm(input,
     bias = helper.create_parameter(
         attr=helper.bias_attr, shape=param_shape, dtype=dtype, is_bias=True)

+    if do_model_average_for_mean_and_var:
+        do_model_average_for_mean_and_var = None
+
     mean = helper.create_parameter(
         attr=ParamAttr(
             name=moving_mean_name,
             initializer=Constant(0.0),
             trainable=False,
-            average=average_variance),
+            do_model_average=do_model_average_for_mean_and_var),
         shape=param_shape,
         dtype=input.dtype)
     mean.stop_gradient = True

@@ -1534,7 +1536,7 @@ def batch_norm(input,
             name=moving_variance_name,
             initializer=Constant(1.0),
             trainable=False,
-            average=average_mean),
+            do_model_average=do_model_average_for_mean_and_var),
         shape=param_shape,
         dtype=input.dtype)
     variance.stop_gradient = True
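The separate average_mean and average_variance flags are folded into a single do_model_average_for_mean_and_var switch (default False); a truthy value is remapped to None before being handed to ParamAttr, so the moving mean and variance are excluded from averaging only when the flag is left False. A hedged usage sketch under that reading; the surrounding layer names are purely illustrative:

import paddle.fluid as fluid

# Illustrative network; only the batch_norm argument comes from this commit.
image = fluid.layers.data(name='image', shape=[3, 32, 32], dtype='float32')
conv = fluid.layers.conv2d(input=image, num_filters=16, filter_size=3)
# Keep the moving mean/variance out of ModelAverage (the new default).
bn = fluid.layers.batch_norm(input=conv, do_model_average_for_mean_and_var=False)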
@@ -3352,14 +3354,14 @@ def reshape(x, shape, actual_shape=None, act=None, inplace=True, name=None):
     Here are some examples to explain it.

     1. Given a 3-D tensor x with a shape [2, 4, 6], and the target shape
-    is [6, 8], the reshape operator will transform x into a 2-D tensor with 
+    is [6, 8], the reshape operator will transform x into a 2-D tensor with
     shape [6, 8] and leaving x's data unchanged.

     2. Given a 3-D tensor x with a shape [2, 4, 6], and the target shape
     specified is [2, 3, -1, 2], the reshape operator will transform x into a
     4-D tensor with shape [2, 3, 4, 2] and leaving x's data unchanged. In this
-    case, one dimension of the target shape is set to -1, the value of this
-    dimension is inferred from the total element number of x and remaining
+    case, one dimension of the target shape is set to -1, the value of this
+    dimension is inferred from the total element number of x and remaining
     dimensions.

     3. Given a 3-D tensor x with a shape [2, 4, 6], and the target shape
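Aside from the whitespace cleanup, the docstring's second example is worth spelling out: with one target dimension set to -1, reshape infers it from the element count of x. A plain-Python sketch of that inference:

from functools import reduce
import operator

x_shape = [2, 4, 6]          # 48 elements in total
target = [2, 3, -1, 2]       # one unknown dimension

known = reduce(operator.mul, (d for d in target if d != -1))   # 2 * 3 * 2 = 12
inferred = [reduce(operator.mul, x_shape) // known if d == -1 else d for d in target]
print(inferred)              # [2, 3, 4, 2], matching example 2 above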
@@ -3593,7 +3595,7 @@ def lrn(input, n=5, k=1.0, alpha=1e-4, beta=0.75, name=None):
 def pad(x, paddings, pad_value=0., name=None):
     """
     Pads a tensor with a constant value given by :attr:`pad_value`, and the
-    padded width is specified by :attr:`paddings`. 
+    padded width is specified by :attr:`paddings`.

     Specifically, the number of values padded before the contents of :attr:`x`
     in dimension :attr:`i` is indicated by :attr:`paddings[i]`, and the number

@@ -3621,7 +3623,7 @@ def pad(x, paddings, pad_value=0., name=None):
         x (Variable): The input tensor variable.
         paddings (list): A list of integers. Its elements specify the padded
                          width before and after for each dimension in turn.
-                         The length of :attr:paddings must be 
+                         The length of :attr:paddings must be
                          :math:`rank(x) \\times 2`.
         pad_value (float): The constant value used to pad.
         name(str|None): A name for this layer(optional). If set None, the layer
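Likewise for pad: paddings lists, in order, the number of values added before and after each dimension, so its length must be rank(x) * 2. A small sketch of the resulting output shape under that interleaved before/after reading, using made-up sizes:

x_shape = [2, 3]                  # made-up 2-D input shape
paddings = [1, 2, 3, 4]           # [before_0, after_0, before_1, after_1]; length = rank(x) * 2
assert len(paddings) == 2 * len(x_shape)

out_shape = [before + size + after
             for size, (before, after) in zip(x_shape,
                                              zip(paddings[::2], paddings[1::2]))]
print(out_shape)                  # [5, 10]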
python/paddle/fluid/optimizer.py
@@ -840,7 +840,7 @@ class ModelAverage(Optimizer):
     """

     def __init__(self,
-                 average_window_rate=0.15,
+                 average_window_rate,
                  params_grads=None,
                  min_average_window=10000,
                  max_average_window=10000,

@@ -856,7 +856,7 @@ class ModelAverage(Optimizer):
                 params[param.name] = (param, grad)
         for param in framework.default_main_program().global_block(
         ).all_parameters():
-            if param.name not in params and param.average:
+            if param.name not in params and param.do_model_average != False:
                 grad = param.block.create_var(
                     name=unique_name.generate(".".join([param.name, 'tmp'])),
                     dtype=param.dtype,
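Two behavioural changes sit in this file: average_window_rate loses its 0.15 default, so callers must now supply it, and the parameter filter keys off do_model_average, skipping only parameters that explicitly set it to False. A hedged construction sketch; the rate shown is just the old default, not a recommendation:

import paddle.fluid as fluid

# After this commit the window rate must be passed explicitly.
model_average = fluid.optimizer.ModelAverage(
    average_window_rate=0.15,
    min_average_window=10000,
    max_average_window=10000)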
python/paddle/fluid/param_attr.py
@@ -29,14 +29,14 @@ class ParamAttr(object):
                  regularizer=None,
                  trainable=True,
                  gradient_clip=None,
-                 average=True):
+                 do_model_average=None):
         self.name = name
         self.initializer = initializer
         self.learning_rate = learning_rate
         self.regularizer = regularizer
         self.trainable = trainable
         self.gradient_clip = gradient_clip
-        self.average = average
+        self.model_average = do_model_average

     def set_default_initializer(self, initializer):
         if initializer is None:

@@ -83,7 +83,7 @@ class ParamAttr(object):
             'regularizer': self.regularizer,
             'trainable': self.trainable,
             'gradient_clip_attr': self.gradient_clip,
-            'average': self.average
+            'model_average': self.model_average
         }
         if with_initializer:
             kwargs['initializer'] = self.initializer
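On the ParamAttr side the constructor argument is renamed from average to do_model_average (default None), stored as model_average and forwarded through to_kwargs(). A hedged sketch of opting one weight out of averaging; the layer and variable names are illustrative:

import paddle.fluid as fluid
from paddle.fluid.param_attr import ParamAttr

x = fluid.layers.data(name='x', shape=[32], dtype='float32')
# Exclude this weight from ModelAverage via the renamed argument.
w_attr = ParamAttr(name='fc_w', do_model_average=False)
out = fluid.layers.fc(input=x, size=64, param_attr=w_attr)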