机器未来 / Paddle
Forked from PaddlePaddle / Paddle (in sync with the fork source)
Commit c037d625 (unverified)
Authored Aug 22, 2020 by Zhen Wang; committed via GitHub on Aug 22, 2020
Parent: c11c83fb

Update the demo of paddle.grad (#26498)

* update the demo codes of paddle.grad.
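The change is a docstring-only migration: the demos for paddle.grad move from the legacy fluid spelling (fluid.dygraph.grad, fluid.layers.ones, fluid.dygraph.guard) to the paddle-namespace imperative API. As a minimal sketch of the new calling convention, assuming a Paddle 2.0-era build where paddle.disable_static(), paddle.ones(), and paddle.grad() behave as the updated demos show:

import paddle
paddle.disable_static()  # imperative (dygraph) mode; the updated demos assume it

x = paddle.ones(shape=[1], dtype='float32')
x.stop_gradient = False  # include x in autograd

y = x * x
# paddle.grad replaces fluid.dygraph.grad and returns one gradient
# tensor per entry of `inputs`.
dx = paddle.grad(outputs=[y], inputs=[x])[0]
print(dx.numpy())  # [2.], since d(x*x)/dx = 2*x and x = 1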
Changes: 1 changed file with 37 additions and 38 deletions

python/paddle/fluid/dygraph/base.py  (+37, -38)
python/paddle/fluid/dygraph/base.py @ c037d625

@@ -375,47 +375,46 @@ def grad(outputs,
     Examples 1:
         .. code-block:: python

-            import paddle.fluid as fluid
+            import paddle
+            paddle.disable_static()

             def test_dygraph_grad(create_graph):
-                with fluid.dygraph.guard():
-                    x = fluid.layers.ones(shape=[1], dtype='float32')
-                    x.stop_gradient = False
-                    y = x * x
-                    # Since y = x * x, dx = 2 * x
-                    dx = fluid.dygraph.grad(
-                            outputs=[y],
-                            inputs=[x],
-                            create_graph=create_graph,
-                            retain_graph=True)[0]
-                    z = y + dx
-                    # If create_graph = False, the gradient of dx
-                    # would not be backpropagated. Therefore,
-                    # z = x * x + dx, and x.gradient() = 2 * x = 2.0
-                    # If create_graph = True, the gradient of dx
-                    # would be backpropagated. Therefore,
-                    # z = x * x + dx = x * x + 2 * x, and
-                    # x.gradient() = 2 * x + 2 = 4.0
-                    z.backward()
-                    return x.gradient()
-            print(test_dygraph_grad(create_graph=False)) # [2.]
+                x = paddle.ones(shape=[1], dtype='float32')
+                x.stop_gradient = False
+                y = x * x
+                # Since y = x * x, dx = 2 * x
+                dx = paddle.grad(
+                        outputs=[y],
+                        inputs=[x],
+                        create_graph=create_graph,
+                        retain_graph=True)[0]
+                z = y + dx
+                # If create_graph = False, the gradient of dx
+                # would not be backpropagated. Therefore,
+                # z = x * x + dx, and x.gradient() = 2 * x = 2.0
+                # If create_graph = True, the gradient of dx
+                # would be backpropagated. Therefore,
+                # z = x * x + dx = x * x + 2 * x, and
+                # x.gradient() = 2 * x + 2 = 4.0
+                z.backward()
+                return x.gradient()
+            print(test_dygraph_grad(create_graph=False)) # [2.]
             print(test_dygraph_grad(create_graph=True)) # [4.]

     Examples 2:
         .. code-block:: python

-            import paddle.fluid as fluid
-            fluid.enable_dygraph()
+            import paddle
+            paddle.disable_static()

             def test_dygraph_grad(grad_outputs=None):
-                x = fluid.layers.fill_constant(shape=[1], value=2.0, dtype='float32')
+                x = paddle.fill_constant(shape=[1], value=2.0, dtype='float32')
                 x.stop_gradient = False
                 y1 = x * x

@@ -431,27 +430,27 @@ def grad(outputs,
                 # Therefore, the final result would be:
                 # dx = 2 * x * dy1 + 3 * dy2 = 4 * dy1 + 3 * dy2.
-                dx = fluid.dygraph.grad(
+                dx = paddle.grad(
                     outputs=[y1, y2],
                     inputs=[x],
                     grad_outputs=grad_outputs)[0]
                 return dx.numpy()

-            THREE = fluid.layers.fill_constant(shape=[1], value=3.0, dtype='float32')
-            FOUR = fluid.layers.fill_constant(shape=[1], value=4.0, dtype='float32')
+            grad_value = paddle.fill_constant(shape=[1], value=4.0, dtype='float32')
             # dy1 = [1], dy2 = [1]
             print(test_dygraph_grad(None)) # [7.]
             # dy1 = [1], dy2 = [4]
-            print(test_dygraph_grad([None, FOUR])) # [16.]
+            print(test_dygraph_grad([None, grad_value])) # [16.]
             # dy1 = [4], dy2 = [1]
-            print(test_dygraph_grad([FOUR, None])) # [19.]
+            print(test_dygraph_grad([grad_value, None])) # [19.]
             # dy1 = [3], dy2 = [4]
-            print(test_dygraph_grad([THREE, FOUR])) # [24.]
+            grad_y1 = paddle.fill_constant(shape=[1], value=3.0, dtype='float32')
+            print(test_dygraph_grad([grad_y1, grad_value])) # [24.]
     '''

     def check_in_out(in_out_list, name):
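As a usage note on Examples 2: the printed values follow from the comment dx = 2 * x * dy1 + 3 * dy2 with x = 2, i.e. dx = 4 * dy1 + 3 * dy2, giving 4*1 + 3*1 = 7, 4*1 + 3*4 = 16, 4*4 + 3*1 = 19, and 4*3 + 3*4 = 24. The sketch below restates the updated demo as a standalone script under the same Paddle 2.0-era assumptions; the definition of y2 falls in lines elided from this view, so y2 = x * 3 here is a hypothetical stand-in chosen only so that d(y2)/dx = 3 matches that comment:

import paddle
paddle.disable_static()  # imperative mode, as in the updated demo

def test_dygraph_grad(grad_outputs=None):
    x = paddle.fill_constant(shape=[1], value=2.0, dtype='float32')
    x.stop_gradient = False
    y1 = x * x
    # HYPOTHETICAL stand-in for the elided definition of y2; any output
    # with d(y2)/dx = 3 reproduces "dx = 2 * x * dy1 + 3 * dy2".
    y2 = x * 3
    # With x = 2: dx = 2 * x * dy1 + 3 * dy2 = 4 * dy1 + 3 * dy2.
    dx = paddle.grad(
        outputs=[y1, y2],
        inputs=[x],
        grad_outputs=grad_outputs)[0]
    return dx.numpy()

grad_value = paddle.fill_constant(shape=[1], value=4.0, dtype='float32')
grad_y1 = paddle.fill_constant(shape=[1], value=3.0, dtype='float32')
print(test_dygraph_grad(None))                   # [7.]
print(test_dygraph_grad([None, grad_value]))     # [16.]
print(test_dygraph_grad([grad_value, None]))     # [19.]
print(test_dygraph_grad([grad_y1, grad_value]))  # [24.]

Per the demo's own comments, a None entry in grad_outputs defaults that output's gradient to ones, which is why test_dygraph_grad(None) corresponds to dy1 = dy2 = [1].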