BaiXuePrincess / Paddle (forked from PaddlePaddle / Paddle)
Commit 4c56586a

Authored Nov 06, 2019 by bingyanghuang; committed by Tao Luo, Nov 06, 2019.

[Cherry-pick] 21028: Remove fuse_with_relu argument from batch_norm constructor (#21049)

Parent: f504d6f1
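The effect of removing the flag can be sketched in plain Python (hypothetical helper names, not Paddle's implementation): with `fuse_with_relu` gone, the operator always runs unfused, and callers who relied on the fused path apply relu explicitly afterwards, which is numerically identical.

```python
import math

def batch_norm_infer(x, mean, var, scale, bias, epsilon=1e-5):
    # Inference-mode batch norm on a flat list of values
    # (hypothetical stand-in for the batch_norm op).
    return [(v - mean) / math.sqrt(var + epsilon) * scale + bias for v in x]

def relu(x):
    return [max(0.0, v) for v in x]

def fused_batch_norm_relu(x, mean, var, scale, bias, epsilon=1e-5):
    # What fuse_with_relu=True used to compute inside a single op.
    return relu(batch_norm_infer(x, mean, var, scale, bias, epsilon))

x = [-2.0, -0.5, 0.0, 1.5, 3.0]
mean, var, scale, bias = 0.5, 4.0, 1.0, 0.0

# After this commit, the explicit unfused spelling is the only one,
# and it matches the old fused path exactly.
unfused = relu(batch_norm_infer(x, mean, var, scale, bias))
fused = fused_batch_norm_relu(x, mean, var, scale, bias)
assert unfused == fused
```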
Changes: 2 changed files, with 2 additions and 7 deletions (+2, -7)

python/paddle/fluid/dygraph/nn.py  (+1, -4)
python/paddle/fluid/layers/nn.py   (+1, -3)
python/paddle/fluid/dygraph/nn.py

@@ -1200,8 +1200,6 @@ class BatchNorm(layers.Layer):
         moving_variance_name(str, optional): The name of the moving_variance which store the global Variance. Default: None.
         do_model_average_for_mean_and_var(bool, optional): Whether parameter mean and variance should do model
             average when model average is enabled. Default: True.
-        fuse_with_relu (bool, optional): When setting fuse_with_relu True, this OP performs relu after batch norm.
-            Default: False.
         use_global_stats(bool, optional): Whether to use global mean and
             variance. In inference or test mode, set use_global_stats to true
             or is_test to true, and the behavior is equivalent.
@@ -1243,7 +1241,6 @@ class BatchNorm(layers.Layer):
                  moving_mean_name=None,
                  moving_variance_name=None,
                  do_model_average_for_mean_and_var=True,
-                 fuse_with_relu=False,
                  use_global_stats=False,
                  trainable_statistics=False):
         super(BatchNorm, self).__init__(name_scope, dtype)
@@ -1302,7 +1299,7 @@ class BatchNorm(layers.Layer):
         self._momentum = momentum
         self._epsilon = epsilon
         self._is_test = is_test
-        self._fuse_with_relu = fuse_with_relu
+        self._fuse_with_relu = False
         self._use_global_stats = use_global_stats
         self._trainable_statistics = trainable_statistics
python/paddle/fluid/layers/nn.py

@@ -4126,7 +4126,6 @@ def batch_norm(input,
                moving_mean_name=None,
                moving_variance_name=None,
                do_model_average_for_mean_and_var=True,
-               fuse_with_relu=False,
                use_global_stats=False):
     """
     **Batch Normalization Layer**
@@ -4211,7 +4210,6 @@ def batch_norm(input,
             will save global variance with the string.
         do_model_average_for_mean_and_var(bool, Default True): Whether parameter mean and variance should do model
             average when model average is enabled.
-        fuse_with_relu (bool): if True, this OP performs relu after batch norm.
         use_global_stats(bool, Default False): Whether to use global mean and
             variance. In inference or test mode, set use_global_stats to true
             or is_test to true, and the behavior is equivalent.
@@ -4327,7 +4325,7 @@ def batch_norm(input,
             "is_test": is_test,
             "data_layout": data_layout,
             "use_mkldnn": False,
-            "fuse_with_relu": fuse_with_relu,
+            "fuse_with_relu": False,
             "use_global_stats": use_global_stats
         })
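The API impact on call sites can be sketched with a minimal stand-in (hypothetical signature, not Paddle's actual `batch_norm`): the keyword no longer exists, so any caller still passing `fuse_with_relu` fails with a `TypeError`, while the op attribute is now hardcoded to `False` as in the diff above.

```python
def batch_norm(input, is_test=False, use_global_stats=False):
    # Post-commit signature sketch: fuse_with_relu is no longer a parameter.
    attrs = {
        "is_test": is_test,
        "use_mkldnn": False,
        "fuse_with_relu": False,  # hardcoded, matching the diff
        "use_global_stats": use_global_stats,
    }
    return attrs

# Old call sites that passed the removed keyword now raise TypeError.
try:
    batch_norm([1.0], fuse_with_relu=True)
except TypeError:
    pass  # expected: the keyword was removed

assert batch_norm([1.0])["fuse_with_relu"] is False
```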