机器未来 / Paddle
Forked from PaddlePaddle / Paddle
Commit 26d3e0ef
Deploy to GitHub Pages: 886e66a5
Authored Aug 11, 2017 by Travis CI
Parent: cac0f6f5
Showing 4 changed files with 30 additions and 2 deletions (+30 -2)
develop/doc/api/v2/config/optimizer.html      +14 -0
develop/doc/searchindex.js                     +1 -1
develop/doc_cn/api/v2/config/optimizer.html   +14 -0
develop/doc_cn/searchindex.js                  +1 -1
develop/doc/api/v2/config/optimizer.html
@@ -190,6 +190,8 @@
<h1>Optimizer<a class="headerlink" href="#optimizer" title="Permalink to this headline">¶</a></h1>
<div class="section" id="momentum">
<h2>Momentum<a class="headerlink" href="#momentum" title="Permalink to this headline">¶</a></h2>
<p>Optimizers(update equation) for SGD method.</p>
<p>TODO(yuyang18): Complete comments.</p>
<dl class="class">
<dt><em class="property">class </em><code class="descclassname">paddle.v2.optimizer.</code><code class="descname">Momentum</code><span class="sig-paren">(</span><em>momentum=None</em>, <em>sparse=False</em>, <em>**kwargs</em><span class="sig-paren">)</span></dt>
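Aside: a minimal usage sketch of the signature above, assuming the common paddle.v2 pattern. momentum= and sparse= are the documented parameters; passing learning_rate through **kwargs is an assumption, not part of the documented parameter list.

    import paddle.v2 as paddle

    # Classical momentum SGD. momentum= and sparse= come from the
    # signature above; learning_rate via **kwargs is an assumption.
    optimizer = paddle.optimizer.Momentum(momentum=0.9, sparse=False,
                                          learning_rate=1e-3)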
@@ -215,6 +217,8 @@ be learned. The i is the i-th observation in (trainning) data.</p>
</div>
<div class="section" id="adam">
<h2>Adam<a class="headerlink" href="#adam" title="Permalink to this headline">¶</a></h2>
<p>Optimizers(update equation) for SGD method.</p>
<p>TODO(yuyang18): Complete comments.</p>
<dl class="class">
<dt><em class="property">class </em><code class="descclassname">paddle.v2.optimizer.</code><code class="descname">Adam</code><span class="sig-paren">(</span><em>beta1=0.9</em>, <em>beta2=0.999</em>, <em>epsilon=1e-08</em>, <em>**kwargs</em><span class="sig-paren">)</span></dt>
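Aside: a minimal sketch constructing Adam with its documented defaults written out. beta1 and beta2 are the decay rates of the first- and second-moment estimates; epsilon guards against the division by zero mentioned in the next hunk's context line.

    import paddle.v2 as paddle

    # Adam with the defaults from the signature above made explicit.
    optimizer = paddle.optimizer.Adam(beta1=0.9, beta2=0.999, epsilon=1e-08)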
@@ -243,6 +247,8 @@ divided by zero.</li>
</div>
<div class="section" id="adamax">
<h2>Adamax<a class="headerlink" href="#adamax" title="Permalink to this headline">¶</a></h2>
<p>Optimizers(update equation) for SGD method.</p>
<p>TODO(yuyang18): Complete comments.</p>
<dl class="class">
<dt><em class="property">class </em><code class="descclassname">paddle.v2.optimizer.</code><code class="descname">Adamax</code><span class="sig-paren">(</span><em>beta1=0.9</em>, <em>beta2=0.999</em>, <em>**kwargs</em><span class="sig-paren">)</span></dt>
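Aside: Adamax is the infinity-norm variant of Adam, consistent with the update w_t = w_{t-1} - (eta/(1-beta1^t))*m_t/u_t in the next hunk's context line; note that, unlike Adam, the documented signature has no epsilon parameter. A minimal construction sketch:

    import paddle.v2 as paddle

    # Adamax takes only the two moment-decay rates (plus **kwargs).
    optimizer = paddle.optimizer.Adamax(beta1=0.9, beta2=0.999)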
@@ -269,6 +275,8 @@ w_t & = w_{t-1} - (\eta/(1-\beta_1^t))*m_t/u_t\end{split}\]</div>
</div>
<div class="section" id="adagrad">
<h2>AdaGrad<a class="headerlink" href="#adagrad" title="Permalink to this headline">¶</a></h2>
<p>Optimizers(update equation) for SGD method.</p>
<p>TODO(yuyang18): Complete comments.</p>
<dl class="class">
<dt><em class="property">class </em><code class="descclassname">paddle.v2.optimizer.</code><code class="descname">AdaGrad</code><span class="sig-paren">(</span><em>**kwargs</em><span class="sig-paren">)</span></dt>
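Aside: the documented AdaGrad signature takes only **kwargs (its per-parameter scaling w = w - eta*diag(G)^{-1/2} o g appears in the next hunk's context line), so any argument below is an assumption about what the underlying config accepts:

    import paddle.v2 as paddle

    # Only **kwargs are documented; learning_rate is the usual key and
    # is assumed here.
    optimizer = paddle.optimizer.AdaGrad(learning_rate=1e-2)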
@@ -283,6 +291,8 @@ w & = w - \eta diag(G)^{-\frac{1}{2}} \circ g\end{split}\]</div>
</div>
<div class="section" id="decayedadagrad">
<h2>DecayedAdaGrad<a class="headerlink" href="#decayedadagrad" title="Permalink to this headline">¶</a></h2>
<p>Optimizers(update equation) for SGD method.</p>
<p>TODO(yuyang18): Complete comments.</p>
<dl class="class">
<dt><em class="property">class </em><code class="descclassname">paddle.v2.optimizer.</code><code class="descname">DecayedAdaGrad</code><span class="sig-paren">(</span><em>rho=0.95</em>, <em>epsilon=1e-06</em>, <em>**kwargs</em><span class="sig-paren">)</span></dt>
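Aside: DecayedAdaGrad replaces AdaGrad's ever-growing accumulator with a decaying average of squared gradients controlled by rho; epsilon keeps the 1/sqrt(E(g_t^2) + epsilon) factor in the next hunk's context line finite. A minimal sketch:

    import paddle.v2 as paddle

    # rho decays the running average of squared gradients; epsilon
    # avoids a zero denominator in the effective learning rate.
    optimizer = paddle.optimizer.DecayedAdaGrad(rho=0.95, epsilon=1e-06)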
@@ -308,6 +318,8 @@ learning\_rate &= 1/sqrt( ( E(g_t^2) + \epsilon )\end{split}\]</div>
</div>
<div class="section" id="adadelta">
<h2>AdaDelta<a class="headerlink" href="#adadelta" title="Permalink to this headline">¶</a></h2>
<p>Optimizers(update equation) for SGD method.</p>
<p>TODO(yuyang18): Complete comments.</p>
<dl class="class">
<dt><em class="property">class </em><code class="descclassname">paddle.v2.optimizer.</code><code class="descname">AdaDelta</code><span class="sig-paren">(</span><em>rho=0.95</em>, <em>epsilon=1e-06</em>, <em>**kwargs</em><span class="sig-paren">)</span></dt>
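Aside: AdaDelta shares the same (rho, epsilon) signature; rho governs the running averages E(g^2) and E(dx^2) visible in the next hunk's context line. A minimal sketch:

    import paddle.v2 as paddle

    optimizer = paddle.optimizer.AdaDelta(rho=0.95, epsilon=1e-06)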
@@ -335,6 +347,8 @@ E(dx_t^2) &= \rho * E(dx_{t-1}^2) + (1-\rho) * (-g*learning\_rate)^2\end{spl
</div>
<div class="section" id="rmsprop">
<h2>RMSProp<a class="headerlink" href="#rmsprop" title="Permalink to this headline">¶</a></h2>
<p>Optimizers(update equation) for SGD method.</p>
<p>TODO(yuyang18): Complete comments.</p>
<dl class="class">
<dt><em class="property">class </em><code class="descclassname">paddle.v2.optimizer.</code><code class="descname">RMSProp</code><span class="sig-paren">(</span><em>rho=0.95</em>, <em>epsilon=1e-06</em>, <em>**kwargs</em><span class="sig-paren">)</span></dt>
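Aside: a fuller wiring sketch in the style of the paddle.v2 book's fit_a_line example. Everything apart from the RMSProp signature itself (the layer names, mse_cost, the trainer API) is an assumption about the surrounding v2 API, not something this diff documents.

    import paddle.v2 as paddle

    paddle.init(use_gpu=False, trainer_count=1)

    # Minimal stand-in network so the optimizer has parameters to update
    # (assumed v2-book style, not part of this diff).
    x = paddle.layer.data(name='x', type=paddle.data_type.dense_vector(13))
    y = paddle.layer.data(name='y', type=paddle.data_type.dense_vector(1))
    y_predict = paddle.layer.fc(input=x, size=1, act=paddle.activation.Linear())
    cost = paddle.layer.mse_cost(input=y_predict, label=y)

    parameters = paddle.parameters.create(cost)
    optimizer = paddle.optimizer.RMSProp(rho=0.95, epsilon=1e-06)

    # The optimizer is handed to the trainer as its update equation.
    trainer = paddle.trainer.SGD(cost=cost, parameters=parameters,
                                 update_equation=optimizer)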
develop/doc/searchindex.js
Source diff not shown because it is too large; view the blob instead.
develop/doc_cn/api/v2/config/optimizer.html
@@ -195,6 +195,8 @@
<h1>Optimizer<a class="headerlink" href="#optimizer" title="永久链接至标题">¶</a></h1>
<div class="section" id="momentum">
<h2>Momentum<a class="headerlink" href="#momentum" title="永久链接至标题">¶</a></h2>
<p>Optimizers(update equation) for SGD method.</p>
<p>TODO(yuyang18): Complete comments.</p>
<dl class="class">
<dt><em class="property">class </em><code class="descclassname">paddle.v2.optimizer.</code><code class="descname">Momentum</code><span class="sig-paren">(</span><em>momentum=None</em>, <em>sparse=False</em>, <em>**kwargs</em><span class="sig-paren">)</span></dt>
@@ -220,6 +222,8 @@ be learned. The i is the i-th observation in (trainning) data.</p>
</div>
<div class="section" id="adam">
<h2>Adam<a class="headerlink" href="#adam" title="永久链接至标题">¶</a></h2>
<p>Optimizers(update equation) for SGD method.</p>
<p>TODO(yuyang18): Complete comments.</p>
<dl class="class">
<dt><em class="property">class </em><code class="descclassname">paddle.v2.optimizer.</code><code class="descname">Adam</code><span class="sig-paren">(</span><em>beta1=0.9</em>, <em>beta2=0.999</em>, <em>epsilon=1e-08</em>, <em>**kwargs</em><span class="sig-paren">)</span></dt>
@@ -248,6 +252,8 @@ divided by zero.</li>
</div>
<div class="section" id="adamax">
<h2>Adamax<a class="headerlink" href="#adamax" title="永久链接至标题">¶</a></h2>
<p>Optimizers(update equation) for SGD method.</p>
<p>TODO(yuyang18): Complete comments.</p>
<dl class="class">
<dt><em class="property">class </em><code class="descclassname">paddle.v2.optimizer.</code><code class="descname">Adamax</code><span class="sig-paren">(</span><em>beta1=0.9</em>, <em>beta2=0.999</em>, <em>**kwargs</em><span class="sig-paren">)</span></dt>
@@ -274,6 +280,8 @@ w_t & = w_{t-1} - (\eta/(1-\beta_1^t))*m_t/u_t\end{split}\]</div>
</div>
<div class="section" id="adagrad">
<h2>AdaGrad<a class="headerlink" href="#adagrad" title="永久链接至标题">¶</a></h2>
<p>Optimizers(update equation) for SGD method.</p>
<p>TODO(yuyang18): Complete comments.</p>
<dl class="class">
<dt><em class="property">class </em><code class="descclassname">paddle.v2.optimizer.</code><code class="descname">AdaGrad</code><span class="sig-paren">(</span><em>**kwargs</em><span class="sig-paren">)</span></dt>
@@ -288,6 +296,8 @@ w & = w - \eta diag(G)^{-\frac{1}{2}} \circ g\end{split}\]</div>
</div>
<div class="section" id="decayedadagrad">
<h2>DecayedAdaGrad<a class="headerlink" href="#decayedadagrad" title="永久链接至标题">¶</a></h2>
<p>Optimizers(update equation) for SGD method.</p>
<p>TODO(yuyang18): Complete comments.</p>
<dl class="class">
<dt><em class="property">class </em><code class="descclassname">paddle.v2.optimizer.</code><code class="descname">DecayedAdaGrad</code><span class="sig-paren">(</span><em>rho=0.95</em>, <em>epsilon=1e-06</em>, <em>**kwargs</em><span class="sig-paren">)</span></dt>
@@ -313,6 +323,8 @@ learning\_rate &= 1/sqrt( ( E(g_t^2) + \epsilon )\end{split}\]</div>
</div>
<div class="section" id="adadelta">
<h2>AdaDelta<a class="headerlink" href="#adadelta" title="永久链接至标题">¶</a></h2>
<p>Optimizers(update equation) for SGD method.</p>
<p>TODO(yuyang18): Complete comments.</p>
<dl class="class">
<dt><em class="property">class </em><code class="descclassname">paddle.v2.optimizer.</code><code class="descname">AdaDelta</code><span class="sig-paren">(</span><em>rho=0.95</em>, <em>epsilon=1e-06</em>, <em>**kwargs</em><span class="sig-paren">)</span></dt>
@@ -340,6 +352,8 @@ E(dx_t^2) &= \rho * E(dx_{t-1}^2) + (1-\rho) * (-g*learning\_rate)^2\end{spl
</div>
<div class="section" id="rmsprop">
<h2>RMSProp<a class="headerlink" href="#rmsprop" title="永久链接至标题">¶</a></h2>
<p>Optimizers(update equation) for SGD method.</p>
<p>TODO(yuyang18): Complete comments.</p>
<dl class="class">
<dt><em class="property">class </em><code class="descclassname">paddle.v2.optimizer.</code><code class="descname">RMSProp</code><span class="sig-paren">(</span><em>rho=0.95</em>, <em>epsilon=1e-06</em>, <em>**kwargs</em><span class="sig-paren">)</span></dt>
develop/doc_cn/searchindex.js
This diff is collapsed.