Commit 2ebc8f77 (unverified)
Authored by zhangbo9674 on Dec 29, 2021; committed by GitHub on Dec 29, 2021
[AMP] Add BatchNorm_1D_2D_3D skip for paddle.amp.decorate (#38541)
* add bn_1d_2d_3d for fp16 decorate
* add unittest
Parent: e3faf345
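The user-visible effect described by the commit message: after this change, paddle.amp.decorate at level 'O2' leaves the parameters of BatchNorm1D/2D/3D in float32, just as it already did for BatchNorm and LayerNorm, while other layers are cast to float16. Below is a minimal sketch of that behavior; the Sequential/Linear toy model and the printout are illustrative additions, not part of this commit, and assume a Paddle build with AMP (GPU) support.

# Illustrative sketch (not part of the commit): normalization layers are kept
# in float32 by paddle.amp.decorate(level='O2'); other layers become float16.
import paddle

model = paddle.nn.Sequential(
    paddle.nn.Linear(16, 16),    # expected to be cast to float16
    paddle.nn.BatchNorm1D(16),   # expected to stay float32 thanks to this commit
)
model = paddle.amp.decorate(models=model, level='O2')

for name, param in model.named_parameters():
    # Linear weight/bias print as paddle.float16; BatchNorm1D params as paddle.float32.
    print(name, param.dtype)

Keeping normalization parameters and running statistics in float32 under O2 is common AMP practice, since they are sensitive to reduced precision.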
Showing 2 changed files with 30 additions and 2 deletions (+30 -2)
python/paddle/fluid/dygraph/amp/auto_cast.py (+4 -2)
python/paddle/fluid/tests/unittests/test_imperative_auto_mixed_precision.py (+26 -0)
python/paddle/fluid/dygraph/amp/auto_cast.py @ 2ebc8f77
@@ -130,8 +130,10 @@ def pure_fp16_initialize(models):
     for idx in range(len(models)):
         for layer in models[idx].sublayers(include_self=True):
             layer._casted_by_pure_fp16 = True
-            if (layer._dtype is 'float16') or isinstance(
-                    layer, (paddle.nn.BatchNorm, paddle.nn.LayerNorm)):
+            if (layer._dtype is 'float16') or isinstance(
+                    layer, (paddle.nn.BatchNorm, paddle.nn.BatchNorm1D,
+                            paddle.nn.BatchNorm2D, paddle.nn.BatchNorm3D,
+                            paddle.nn.LayerNorm)):
                 continue
             layer._to_impl(dtype='float16', include_sublayers=False)
     return models
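For readers skimming the hunk above: pure_fp16_initialize walks every sublayer of each model and skips the cast for the listed normalization types, so nested BatchNorm1D/2D/3D layers also stay in float32. The snippet below is a small illustration of that walk and the widened isinstance check; the toy model is an assumption for demonstration, not Paddle source.

# Illustration only: mirrors the sublayers() walk and the expanded skip tuple
# used by pure_fp16_initialize after this commit.
import paddle

KEEP_FP32 = (paddle.nn.BatchNorm, paddle.nn.BatchNorm1D,
             paddle.nn.BatchNorm2D, paddle.nn.BatchNorm3D,
             paddle.nn.LayerNorm)

model = paddle.nn.Sequential(paddle.nn.Linear(8, 8), paddle.nn.BatchNorm1D(8))
for layer in model.sublayers(include_self=True):
    skipped = isinstance(layer, KEEP_FP32)
    print(type(layer).__name__, "kept in float32" if skipped else "cast to float16")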
python/paddle/fluid/tests/unittests/test_imperative_auto_mixed_precision.py @ 2ebc8f77
@@ -598,6 +598,32 @@ class TestAmpDecorator(unittest.TestCase):
         self.assertEqual(optimizers[0]._multi_precision, False)
         self.assertEqual(optimizers[1]._multi_precision, False)
 
+    def test_skip_BatchNorm_Layer_norm(self):
+        model = paddle.nn.LayerNorm(1)
+        model = paddle.amp.decorate(models=model, level='O2')
+        for param in model.parameters():
+            self.assertEqual((param.dtype == paddle.float32), True)
+
+        model = paddle.nn.BatchNorm(1)
+        model = paddle.amp.decorate(models=model, level='O2')
+        for param in model.parameters():
+            self.assertEqual((param.dtype == paddle.float32), True)
+
+        model = paddle.nn.BatchNorm1D(1)
+        model = paddle.amp.decorate(models=model, level='O2')
+        for param in model.parameters():
+            self.assertEqual((param.dtype == paddle.float32), True)
+
+        model = paddle.nn.BatchNorm2D(1)
+        model = paddle.amp.decorate(models=model, level='O2')
+        for param in model.parameters():
+            self.assertEqual((param.dtype == paddle.float32), True)
+
+        model = paddle.nn.BatchNorm3D(1)
+        model = paddle.amp.decorate(models=model, level='O2')
+        for param in model.parameters():
+            self.assertEqual((param.dtype == paddle.float32), True)
+
 
 class TestPureFp16SaveLoad(unittest.TestCase):
     def test_save_dtype_exception(self):