Commit e8b26dbd
Authored on Jul 18, 2020 by mindspore-ci-bot
Committed by Gitee on Jul 18, 2020
!3151 Fix verification that BatchNorm2d input is 4D.
Merge pull request !3151 from liuxiao93/fix-BatchNorm2d
Parents: 55cd091f, 75881e5f
Showing 4 changed files with 14 additions and 8 deletions (+14, -8)
mindspore/nn/layer/normalization.py  (+11, -5)
mindspore/ops/operations/math_ops.py  (+1, -1)
tests/ut/python/parallel/test_auto_parallel_tuple_depend.py  (+1, -1)
tests/ut/python/parallel/test_auto_parallel_two_bn.py  (+1, -1)
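The diffs below make BatchNorm2d verify at construct time that its input is 4-D, and document BatchNorm1d as taking 2-D input. A minimal usage sketch of the behaviour being enforced, assuming the mindspore.nn and Tensor APIs of this release; the tensors and variable names here are illustrative and not part of the diff:

# Illustrative only: shows the input ranks the two layers expect after this fix.
import numpy as np
import mindspore.nn as nn
from mindspore import Tensor

bn2d = nn.BatchNorm2d(num_features=16)
x_4d = Tensor(np.ones([8, 16, 32, 32]).astype(np.float32))  # (N, C, H, W): accepted by BatchNorm2d
y = bn2d(x_4d)

bn1d = nn.BatchNorm1d(num_features=16)
x_2d = Tensor(np.ones([8, 16]).astype(np.float32))          # (N, C): the shape BatchNorm1d documents
z = bn1d(x_2d)

# With this commit, passing x_2d to bn2d fails the new 4-D shape check in construct()
# instead of silently normalizing a mis-shaped input.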
mindspore/nn/layer/normalization.py

@@ -44,7 +44,8 @@ class _BatchNorm(Cell):
                  moving_mean_init='zeros',
                  moving_var_init='ones',
                  use_batch_statistics=None,
-                 device_num_each_group=1):
+                 device_num_each_group=1,
+                 input_dims='1d'):
         super(_BatchNorm, self).__init__()
         if num_features < 1:
             raise ValueError("num_features must be at least 1")
@@ -55,6 +56,7 @@ class _BatchNorm(Cell):
         self.use_batch_statistics = use_batch_statistics
         self.num_features = num_features
         self.eps = eps
+        self.input_dims = input_dims
         self.moving_mean = Parameter(initializer(
             moving_mean_init, num_features), name="mean", requires_grad=False)
         self.moving_variance = Parameter(initializer(
@@ -145,6 +147,8 @@ class _BatchNorm(Cell):
         return y

     def construct(self, x):
+        if self.input_dims == '2d':
+            _shape_check(self.shape(x))
         if self.use_batch_statistics is None:
             flag = self.training
         else:
@@ -253,10 +257,10 @@ class BatchNorm1d(_BatchNorm):
         mean and variance. Default: None.

     Inputs:
-        - **input** (Tensor) - Tensor of shape :math:`(N, C_{in}, H_{in}, W_{in})`.
+        - **input** (Tensor) - Tensor of shape :math:`(N, C_{in})`.

     Outputs:
-        Tensor, the normalized, scaled, offset tensor, of shape :math:`(N, C_{out}, H_{out}, W_{out})`.
+        Tensor, the normalized, scaled, offset tensor, of shape :math:`(N, C_{out})`.

     Examples:
         >>> net = nn.BatchNorm1d(num_features=16)
@@ -282,7 +286,8 @@ class BatchNorm1d(_BatchNorm):
                                           beta_init,
                                           moving_mean_init,
                                           moving_var_init,
-                                          use_batch_statistics)
+                                          use_batch_statistics,
+                                          input_dims='1d')

     def _check_data_dim(self, x):
         if x.dim() != 2:
@@ -357,7 +362,8 @@ class BatchNorm2d(_BatchNorm):
                                           beta_init,
                                           moving_mean_init,
                                           moving_var_init,
-                                          use_batch_statistics)
+                                          use_batch_statistics,
+                                          input_dims='2d')

     def _check_data_dim(self, x):
         if x.dim() != 4:
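The new branch in construct() delegates to a _shape_check helper that lives elsewhere in normalization.py and is not part of this hunk. A plausible minimal form of such a validator, written as an assumption rather than the repository's exact code (MindSpore commonly wraps graph-time checks like this with constexpr so they run at compile time):

from mindspore.ops.primitive import constexpr

# Assumed sketch of the validator referenced above; the real _shape_check is not shown in this diff.
@constexpr
def _shape_check(in_shape):
    # BatchNorm2d only accepts 4-D (N, C, H, W) inputs.
    if len(in_shape) != 4:
        raise ValueError("For BatchNorm2d, the dimension of input should be 4d.")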
mindspore/ops/operations/math_ops.py

@@ -2931,7 +2931,7 @@ class Round(PrimitiveWithInfer):

 class Tan(PrimitiveWithInfer):
     """
-    Computes tan of `input_x` element-wise.
+    Computes tangent of `input_x` element-wise.

     Inputs:
         - **input_x** (Tensor) - The shape of tensor is :math:`(x_1, x_2, ..., x_R)`.
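For context, a small usage sketch of the Tan primitive whose docstring is corrected above, assuming the mindspore.ops.operations interface of this release; the input values are chosen arbitrarily:

import numpy as np
from mindspore import Tensor
from mindspore.ops import operations as P

tan = P.Tan()
input_x = Tensor(np.array([-1.0, 0.0, 1.0]).astype(np.float32))
output = tan(input_x)  # element-wise tangent of input_x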
tests/ut/python/parallel/test_auto_parallel_tuple_depend.py

@@ -46,7 +46,7 @@ class GradWrap(nn.Cell):
 def bn_with_initialize(out_channels):
-    bn = nn.BatchNorm2d(out_channels, momentum=0.1, eps=1e-5)
+    bn = nn.BatchNorm1d(out_channels, momentum=0.1, eps=1e-5)
     return bn
tests/ut/python/parallel/test_auto_parallel_two_bn.py

@@ -40,7 +40,7 @@ class NetWithLoss(nn.Cell):
 class Blockcell(nn.Cell):
     def __init__(self):
         super(Blockcell, self).__init__()
-        self.bn = nn.BatchNorm2d(64, momentum=0.9)
+        self.bn = nn.BatchNorm1d(64, momentum=0.9)

     def construct(self, x):
         out = self.bn(x)