机器未来 / Paddle (forked from PaddlePaddle / Paddle)
Commit af66fcb2
Authored Oct 24, 2017 by Travis CI

Deploy to GitHub Pages: dd0008d5

Parent: 09e2dd36
Showing 6 changed files with 6 additions and 62 deletions (+6 −62):

develop/doc/_sources/design/optimizer.md.txt     +1  −15
develop/doc/design/optimizer.html                +1  −15
develop/doc/searchindex.js                       +1  −1
develop/doc_cn/_sources/design/optimizer.md.txt  +1  −15
develop/doc_cn/design/optimizer.html             +1  −15
develop/doc_cn/searchindex.js                    +1  −1
develop/doc/_sources/design/optimizer.md.txt

@@ -65,20 +65,6 @@ class Optimizer(object):
     def __init__(self):
         pass
 
-    def create_backward_pass(self, loss, parameter_list=None):
-        """
-        create and add gradient Operators in BlockDesc to Compute gradients of `loss`
-        for parameters in parameter_list
-
-        Args:
-          loss: an variable generated by cost function.
-          parameter_list: parameters that need to compute gradient and update to optimize the lost.
-
-        Returns:
-          list of (parameters, gradients) pair.
-        """
-        return None
-
     def create_optimization_pass(self, parameters_and_grads):
         """Add optimization operators to update gradients to variables.

@@ -93,7 +79,7 @@ class Optimizer(object):
     def minimize(self, loss, parameter_list):
         """Add operations to minimize `loss` by updating `parameter_list`.
 
-        This method combines interface `create_backward_pass()` and
+        This method combines interface `append_backward_ops()` and
         `create_optimization_pass()` into one.
         """
         params_grads = self.create_backward_pass(loss, parameter_list)
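For context, the hunks above only rename the backward-pass entry point in the `minimize` docstring: the design now has `minimize()` combine a framework-level `append_backward_ops()` with the optimizer's `create_optimization_pass()`. Below is a minimal, hypothetical sketch of that flow; the names follow the diff, but the bodies (string placeholders for variables, gradients, and update ops) are illustrative assumptions, not PaddlePaddle's actual implementation.

# Minimal sketch of the Optimizer design described above.
# `append_backward_ops` stands in for the framework-level backward pass;
# its body is a placeholder, not Paddle's real gradient machinery.

def append_backward_ops(loss, parameter_list=None):
    """Append gradient ops for `loss`; return (parameter, gradient) pairs."""
    parameter_list = parameter_list or []
    return [(param, "grad_of_" + param) for param in parameter_list]

class Optimizer(object):
    def create_optimization_pass(self, parameters_and_grads):
        """Add optimization operators to update gradients to variables."""
        raise NotImplementedError

    def minimize(self, loss, parameter_list):
        """Add operations to minimize `loss` by updating `parameter_list`.

        Combines `append_backward_ops()` and `create_optimization_pass()`.
        """
        params_grads = append_backward_ops(loss, parameter_list)
        return self.create_optimization_pass(params_grads)

class SGDOptimizer(Optimizer):
    def create_optimization_pass(self, parameters_and_grads):
        # Emit one (mock) SGD update op per (parameter, gradient) pair.
        return ["sgd_update(%s, %s)" % pg for pg in parameters_and_grads]

opt = SGDOptimizer()
print(opt.minimize(loss="cost", parameter_list=["W", "b"]))
# ['sgd_update(W, grad_of_W)', 'sgd_update(b, grad_of_b)']

Note that the commit updates only the docstring: the body of `minimize` in the committed document still calls `self.create_backward_pass(loss, parameter_list)`, as the last context line of the diff shows.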
develop/doc/design/optimizer.html

@@ -243,20 +243,6 @@ and @@ -271,7 +257,7 @@: the same two hunks as in
optimizer.md.txt above, applied to the Pygments-highlighted HTML rendering
of the page (the span-wrapped listing of `create_backward_pass` is removed,
and the `minimize` docstring now reads `append_backward_ops()` instead of
`create_backward_pass()`).
develop/doc/searchindex.js

Source diff too large to display; view the blob instead.
develop/doc_cn/_sources/design/optimizer.md.txt

Same two hunks as develop/doc/_sources/design/optimizer.md.txt above
(@@ -65,20 +65,6 @@ and @@ -93,7 +79,7 @@), applied to the Chinese
documentation copy of the design doc.
develop/doc_cn/design/optimizer.html

@@ -257,20 +257,6 @@ and @@ -285,7 +271,7 @@: the same change applied to
the Pygments-highlighted HTML rendering of the Chinese doc page.
develop/doc_cn/searchindex.js

Diff collapsed.