Crayon鑫 / Paddle (forked from PaddlePaddle / Paddle)
Commit 12ab017e (unverified)
Authored on Sep 22, 2021 by zhangbo9674; committed by GitHub on Sep 22, 2021
Parent: 77134300

fix bug of module 'paddle' has no attribute 'fluid' for python3.6 (#35862)

1 changed file with 11 additions and 13 deletions:
python/paddle/fluid/dygraph/amp/auto_cast.py  (+11, -13)
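For context on why the hunks below drop the module-level `import paddle.fluid as fluid`: on Python 3.6, `import a.b as c` inside a module that is itself imported while package `a` is still initializing raises `AttributeError` (Python 3.7 added a fallback to `sys.modules`). Deferring the lookup to call time, as the patch does with `paddle.fluid.…`, is safe on every version. The following is a minimal, self-contained reproduction of that pattern; the package layout (`pkg`, `early`, `sub`) is illustrative and not Paddle's real structure.

```python
# Sketch of the circular-import pattern the commit works around.
# pkg/__init__.py imports pkg.early while 'pkg' is only partially
# initialized, mirroring how paddle/__init__.py pulls in auto_cast.py
# before the 'fluid' attribute exists on the 'paddle' module.
import os
import sys
import tempfile
import textwrap

root = tempfile.mkdtemp()
pkg_dir = os.path.join(root, "pkg")
os.makedirs(pkg_dir)

with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
    f.write(textwrap.dedent("""
        from . import early   # imported while 'pkg' is still initializing
        from . import sub     # only after this does pkg.sub exist as an attribute
    """))

with open(os.path.join(pkg_dir, "early.py"), "w") as f:
    f.write(textwrap.dedent("""
        import pkg  # safe even mid-initialization: resolved via sys.modules

        def get_value():
            # Attribute access happens at call time, after 'pkg' has
            # finished importing, so pkg.sub is guaranteed to be set.
            # A module-level 'import pkg.sub as sub' here would raise
            # AttributeError on Python 3.6 during the circular import.
            return pkg.sub.VALUE
    """))

with open(os.path.join(pkg_dir, "sub.py"), "w") as f:
    f.write("VALUE = 42\n")

sys.path.insert(0, root)
import pkg  # noqa: E402

print(pkg.early.get_value())  # -> 42
```

This is the same trade the patch makes: `auto_cast.py` already has `import paddle`, so spelling out `paddle.fluid.dygraph.…` at use sites avoids binding the submodule during package initialization.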
@@ -23,7 +23,6 @@ import functools
 import paddle
 import operator
 import types
-import paddle.fluid as fluid
 
 __all__ = ['amp_guard', 'amp_decorate']
@@ -220,16 +219,16 @@ def amp_guard(enable=True,
         .. code-block:: python
 
             import numpy as np
-            import paddle.fluid as fluid
+            import paddle
 
             data = np.random.uniform(-1, 1, [10, 3, 32, 32]).astype('float32')
-            with fluid.dygraph.guard():
-                conv2d = fluid.dygraph.Conv2D(3, 2, 3)
-                data = fluid.dygraph.to_variable(data)
-                with fluid.dygraph.amp_guard():
+            with paddle.fluid.dygraph.guard():
+                conv2d = paddle.fluid.dygraph.Conv2D(3, 2, 3)
+                data = paddle.fluid.dygraph.to_variable(data)
+                with paddle.fluid.dygraph.amp_guard():
                     conv = conv2d(data)
                     print(conv.dtype) # FP16
-                with fluid.dygraph.amp_guard(enable=False):
+                with paddle.fluid.dygraph.amp_guard(enable=False):
                     conv = conv2d(data)
                     print(conv.dtype) # FP32
@@ -301,7 +300,7 @@ class StateDictHook(object):
     def __call__(self, state_dict):
         for key in state_dict:
             param = state_dict[key]
-            with fluid.dygraph.guard():
+            with paddle.fluid.dygraph.guard():
                 param_applied = paddle.cast(param, self._save_dtype)
                 param_applied.name = param.name
                 state_dict[key] = param_applied
@@ -335,16 +334,15 @@ def amp_decorate(models,
             # required: gpu
             # Demo1: single model and optimizer:
             import paddle
-            import paddle.fluid as fluid
 
             model = paddle.nn.Conv2D(3, 2, 3, bias_attr=False)
             optimzier = paddle.optimizer.SGD(parameters=model.parameters())
 
-            model, optimizer = fluid.dygraph.amp_decorate(models=model, optimizers=optimzier, level='O2')
+            model, optimizer = paddle.fluid.dygraph.amp_decorate(models=model, optimizers=optimzier, level='O2')
 
             data = paddle.rand([10, 3, 32, 32])
 
-            with fluid.dygraph.amp_guard(enable=True, custom_white_list=None, custom_black_list=None, level='O2'):
+            with paddle.fluid.dygraph.amp_guard(enable=True, custom_white_list=None, custom_black_list=None, level='O2'):
                 output = model(data)
                 print(output.dtype) # FP16
@@ -353,11 +351,11 @@ def amp_decorate(models,
             model2 = paddle.nn.Conv2D(3, 2, 3, bias_attr=False)
             optimizer2 = paddle.optimizer.Adam(parameters=model2.parameters())
 
-            models, optimizers = fluid.dygraph.amp_decorate(models=[model, model2], optimizers=[optimzier, optimizer2], level='O2')
+            models, optimizers = paddle.fluid.dygraph.amp_decorate(models=[model, model2], optimizers=[optimzier, optimizer2], level='O2')
 
             data = paddle.rand([10, 3, 32, 32])
 
-            with fluid.dygraph.amp_guard(enable=True, custom_white_list=None, custom_black_list=None, level='O2'):
+            with paddle.fluid.dygraph.amp_guard(enable=True, custom_white_list=None, custom_black_list=None, level='O2'):
                 output = models[0](data)
                 output2 = models[1](data)
                 print(output.dtype) # FP16