Commit 4acfa557, authored by: T Travis CI

Deploy to GitHub Pages: 68b958c3

Parent: 9b1461a4
@@ -11,8 +11,7 @@ Data layer

 data
 ----
-.. automodule:: paddle.v2.layer
-    :members: data
+.. autoclass:: paddle.v2.layer.data
     :noindex:

 Fully Connected Layers
@@ -22,14 +21,12 @@ Fully Connected Layers

 fc
 --
-.. automodule:: paddle.v2.layer
-    :members: fc
+.. autoclass:: paddle.v2.layer.fc
     :noindex:

 selective_fc
 ------------
-.. automodule:: paddle.v2.layer
-    :members: selective_fc
+.. autoclass:: paddle.v2.layer.selective_fc
     :noindex:

 Conv Layers
@@ -37,34 +34,29 @@ Conv Layers

 conv_operator
 -------------
-.. automodule:: paddle.v2.layer
-    :members: conv_operator
+.. autoclass:: paddle.v2.layer.conv_operator
     :noindex:

 conv_projection
 ---------------
-.. automodule:: paddle.v2.layer
-    :members: conv_projection
+.. autoclass:: paddle.v2.layer.conv_projection
     :noindex:

 conv_shift
 ----------
-.. automodule:: paddle.v2.layer
-    :members: conv_shift
+.. autoclass:: paddle.v2.layer.conv_shift
     :noindex:

 img_conv
 --------
-.. automodule:: paddle.v2.layer
-    :members: img_conv
+.. autoclass:: paddle.v2.layer.img_conv
     :noindex:

 .. _api_v2.layer_context_projection:

 context_projection
 ------------------
-.. automodule:: paddle.v2.layer
-    :members: context_projection
+.. autoclass:: paddle.v2.layer.context_projection
     :noindex:

 Image Pooling Layer
@@ -72,20 +64,17 @@ Image Pooling Layer

 img_pool
 --------
-.. automodule:: paddle.v2.layer
-    :members: img_pool
+.. autoclass:: paddle.v2.layer.img_pool
     :noindex:

 spp
 ---
-.. automodule:: paddle.v2.layer
-    :members: spp
+.. autoclass:: paddle.v2.layer.spp
     :noindex:

 maxout
 ------
-.. automodule:: paddle.v2.layer
-    :members: maxout
+.. autoclass:: paddle.v2.layer.maxout
     :noindex:

 Norm Layer
@@ -93,26 +82,22 @@ Norm Layer

 img_cmrnorm
 -----------
-.. automodule:: paddle.v2.layer
-    :members: img_cmrnorm
+.. autoclass:: paddle.v2.layer.img_cmrnorm
     :noindex:

 batch_norm
 ----------
-.. automodule:: paddle.v2.layer
-    :members: batch_norm
+.. autoclass:: paddle.v2.layer.batch_norm
     :noindex:

 sum_to_one_norm
 ---------------
-.. automodule:: paddle.v2.layer
-    :members: sum_to_one_norm
+.. autoclass:: paddle.v2.layer.sum_to_one_norm
     :noindex:

 cross_channel_norm
 ------------------
-.. automodule:: paddle.v2.layer
-    :members: cross_channel_norm
+.. autoclass:: paddle.v2.layer.cross_channel_norm
     :noindex:

 Recurrent Layers
@@ -120,20 +105,17 @@ Recurrent Layers

 recurrent
 ---------
-.. automodule:: paddle.v2.layer
-    :members: recurrent
+.. autoclass:: paddle.v2.layer.recurrent
     :noindex:

 lstmemory
 ---------
-.. automodule:: paddle.v2.layer
-    :members: lstmemory
+.. autoclass:: paddle.v2.layer.lstmemory
     :noindex:

 grumemory
 ---------
-.. automodule:: paddle.v2.layer
-    :members: grumemory
+.. autoclass:: paddle.v2.layer.grumemory
     :noindex:

 Recurrent Layer Group
@@ -141,38 +123,32 @@ Recurrent Layer Group

 memory
 ------
-.. automodule:: paddle.v2.layer
-    :members: memory
+.. autoclass:: paddle.v2.layer.memory
     :noindex:

 recurrent_group
 ---------------
-.. automodule:: paddle.v2.layer
-    :members: recurrent_group
+.. autoclass:: paddle.v2.layer.recurrent_group
     :noindex:

 lstm_step
 ---------
-.. automodule:: paddle.v2.layer
-    :members: lstm_step
+.. autoclass:: paddle.v2.layer.lstm_step
     :noindex:

 gru_step
 --------
-.. automodule:: paddle.v2.layer
-    :members: gru_step
+.. autoclass:: paddle.v2.layer.gru_step
     :noindex:

 beam_search
 ------------
-.. automodule:: paddle.v2.layer
-    :members: beam_search
+.. autoclass:: paddle.v2.layer.beam_search
     :noindex:

 get_output
 ----------
-.. automodule:: paddle.v2.layer
-    :members: get_output
+.. autoclass:: paddle.v2.layer.get_output
     :noindex:

 Mixed Layer
@@ -182,59 +158,50 @@ Mixed Layer

 mixed
 -----
-.. automodule:: paddle.v2.layer
-    :members: mixed
+.. autoclass:: paddle.v2.layer.mixed
     :noindex:

 .. _api_v2.layer_embedding:

 embedding
 ---------
-.. automodule:: paddle.v2.layer
-    :members: embedding
+.. autoclass:: paddle.v2.layer.embedding
     :noindex:

 scaling_projection
 ------------------
-.. automodule:: paddle.v2.layer
-    :members: scaling_projection
+.. autoclass:: paddle.v2.layer.scaling_projection
     :noindex:

 dotmul_projection
 -----------------
-.. automodule:: paddle.v2.layer
-    :members: dotmul_projection
+.. autoclass:: paddle.v2.layer.dotmul_projection
     :noindex:

 dotmul_operator
 ---------------
-.. automodule:: paddle.v2.layer
-    :members: dotmul_operator
+.. autoclass:: paddle.v2.layer.dotmul_operator
     :noindex:

 full_matrix_projection
 ----------------------
-.. automodule:: paddle.v2.layer
-    :members: full_matrix_projection
+.. autoclass:: paddle.v2.layer.full_matrix_projection
     :noindex:

 identity_projection
 -------------------
-.. automodule:: paddle.v2.layer
-    :members: identity_projection
+.. autoclass:: paddle.v2.layer.identity_projection
     :noindex:

 table_projection
 ----------------
-.. automodule:: paddle.v2.layer
-    :members: table_projection
+.. autoclass:: paddle.v2.layer.table_projection
     :noindex:

 trans_full_matrix_projection
 ----------------------------
-.. automodule:: paddle.v2.layer
-    :members: trans_full_matrix_projection
+.. autoclass:: paddle.v2.layer.trans_full_matrix_projection
     :noindex:

 Aggregate Layers
@@ -244,36 +211,31 @@ Aggregate Layers

 pooling
 -------
-.. automodule:: paddle.v2.layer
-    :members: pooling
+.. autoclass:: paddle.v2.layer.pooling
     :noindex:

 .. _api_v2.layer_last_seq:

 last_seq
 --------
-.. automodule:: paddle.v2.layer
-    :members: last_seq
+.. autoclass:: paddle.v2.layer.last_seq
     :noindex:

 .. _api_v2.layer_first_seq:

 first_seq
 ---------
-.. automodule:: paddle.v2.layer
-    :members: first_seq
+.. autoclass:: paddle.v2.layer.first_seq
     :noindex:

 concat
 ------
-.. automodule:: paddle.v2.layer
-    :members: concat
+.. autoclass:: paddle.v2.layer.concat
     :noindex:

 seq_concat
 ----------
-.. automodule:: paddle.v2.layer
-    :members: seq_concat
+.. autoclass:: paddle.v2.layer.seq_concat
     :noindex:

 Reshaping Layers
@@ -281,34 +243,29 @@ Reshaping Layers

 block_expand
 ------------
-.. automodule:: paddle.v2.layer
-    :members: block_expand
+.. autoclass:: paddle.v2.layer.block_expand
     :noindex:

 .. _api_v2.layer_expand:

 expand
 ------
-.. automodule:: paddle.v2.layer
-    :members: expand
+.. autoclass:: paddle.v2.layer.expand
     :noindex:

 repeat
 ------
-.. automodule:: paddle.v2.layer
-    :members: repeat
+.. autoclass:: paddle.v2.layer.repeat
     :noindex:

 rotate
 ------
-.. automodule:: paddle.v2.layer
-    :members: rotate
+.. autoclass:: paddle.v2.layer.rotate
     :noindex:

 seq_reshape
 -----------
-.. automodule:: paddle.v2.layer
-    :members: seq_reshape
+.. autoclass:: paddle.v2.layer.seq_reshape
     :noindex:

 Math Layers
@@ -316,64 +273,54 @@ Math Layers

 addto
 -----
-.. automodule:: paddle.v2.layer
-    :members: addto
+.. autoclass:: paddle.v2.layer.addto
     :noindex:

 linear_comb
 -----------
-.. automodule:: paddle.v2.layer
-    :members: linear_comb
+.. autoclass:: paddle.v2.layer.linear_comb
     :noindex:

 interpolation
 -------------
-.. automodule:: paddle.v2.layer
-    :members: interpolation
+.. autoclass:: paddle.v2.layer.interpolation
     :noindex:

 bilinear_interp
 ---------------
-.. automodule:: paddle.v2.layer
-    :members: bilinear_interp
+.. autoclass:: paddle.v2.layer.bilinear_interp
     :noindex:

 power
 -----
-.. automodule:: paddle.v2.layer
-    :members: power
+.. autoclass:: paddle.v2.layer.power
     :noindex:

 scaling
 -------
-.. automodule:: paddle.v2.layer
-    :members: scaling
+.. autoclass:: paddle.v2.layer.scaling
     :noindex:

 slope_intercept
 ---------------
-.. automodule:: paddle.v2.layer
-    :members: slope_intercept
+.. autoclass:: paddle.v2.layer.slope_intercept
     :noindex:

 tensor
 ------
-.. automodule:: paddle.v2.layer
-    :members: tensor
+.. autoclass:: paddle.v2.layer.tensor
     :noindex:

 .. _api_v2.layer_cos_sim:

 cos_sim
 -------
-.. automodule:: paddle.v2.layer
-    :members: cos_sim
+.. autoclass:: paddle.v2.layer.cos_sim
     :noindex:

 trans
 -----
-.. automodule:: paddle.v2.layer
-    :members: trans
+.. autoclass:: paddle.v2.layer.trans
     :noindex:

 Sampling Layers
@@ -381,14 +328,12 @@ Sampling Layers

 maxid
 -----
-.. automodule:: paddle.v2.layer
-    :members: maxid
+.. autoclass:: paddle.v2.layer.max_id
     :noindex:

 sampling_id
 -----------
-.. automodule:: paddle.v2.layer
-    :members: sampling_id
+.. autoclass:: paddle.v2.layer.sampling_id
     :noindex:

 Slicing and Joining Layers
@@ -396,8 +341,7 @@ Slicing and Joining Layers

 pad
 ----
-.. automodule:: paddle.v2.layer
-    :members: pad
+.. autoclass:: paddle.v2.layer.pad
     :noindex:

 .. _api_v2.layer_costs:

@@ -407,80 +351,72 @@ Cost Layers

 cross_entropy_cost
 ------------------
-.. automodule:: paddle.v2.layer
-    :members: cross_entropy_cost
+.. autoclass:: paddle.v2.layer.cross_entropy_cost
     :noindex:

 cross_entropy_with_selfnorm_cost
 --------------------------------
-.. automodule:: paddle.v2.layer
-    :members: cross_entropy_with_selfnorm_cost
+.. autoclass:: paddle.v2.layer.cross_entropy_with_selfnorm_cost
     :noindex:

 multi_binary_label_cross_entropy_cost
 -------------------------------------
-.. automodule:: paddle.v2.layer
-    :members: multi_binary_label_cross_entropy_cost
+.. autoclass:: paddle.v2.layer.multi_binary_label_cross_entropy_cost
     :noindex:

 huber_cost
 ----------
-.. automodule:: paddle.v2.layer
-    :members: huber_cost
+.. autoclass:: paddle.v2.layer.huber_cost
     :noindex:

 lambda_cost
 -----------
-.. automodule:: paddle.v2.layer
-    :members: lambda_cost
+.. autoclass:: paddle.v2.layer.lambda_cost
+    :noindex:
+
+mse_cost
+--------
+.. autoclass:: paddle.v2.layer.mse_cost
     :noindex:

 rank_cost
 ---------
-.. automodule:: paddle.v2.layer
-    :members: rank_cost
+.. autoclass:: paddle.v2.layer.rank_cost
     :noindex:

 sum_cost
 ---------
-.. automodule:: paddle.v2.layer
-    :members: sum_cost
+.. autoclass:: paddle.v2.layer.sum_cost
     :noindex:

 crf
 ---
-.. automodule:: paddle.v2.layer
-    :members: crf
+.. autoclass:: paddle.v2.layer.crf
     :noindex:

 crf_decoding
 ------------
-.. automodule:: paddle.v2.layer
-    :members: crf_decoding
+.. autoclass:: paddle.v2.layer.crf_decoding
     :noindex:

 ctc
 ---
-.. automodule:: paddle.v2.layer
-    :members: ctc
+.. autoclass:: paddle.v2.layer.ctc
     :noindex:

 warp_ctc
 --------
-.. automodule:: paddle.v2.layer
-    :members: warp_ctc
+.. autoclass:: paddle.v2.layer.warp_ctc
     :noindex:

 nce
 ---
-.. automodule:: paddle.v2.layer
-    :members: nce
+.. autoclass:: paddle.v2.layer.nce
     :noindex:

 hsigmoid
 ---------
-.. automodule:: paddle.v2.layer
-    :members: hsigmoid
+.. autoclass:: paddle.v2.layer.hsigmoid
     :noindex:

 Check Layer
@@ -488,6 +424,5 @@ Check Layer

 eos
 ---
-.. automodule:: paddle.v2.layer
-    :members: eos
+.. autoclass:: paddle.v2.layer.eos
     :noindex:
@@ -270,7 +270,7 @@ layer that not support this attribute, paddle will print an error and core.</p>
 <li><strong>drop_rate</strong> (<em>float</em>) &#8211; Dropout rate. Dropout will create a mask on layer output.
 The dropout rate is the zero rate of this mask. The
 details of what dropout is please refer to <a class="reference external" href="https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf">here</a>.</li>
-<li><strong>device</strong> (<em>int</em>) &#8211; <p>device ID of layer. device=-1, use CPU. device&gt;0, use GPU.
+<li><strong>device</strong> (<em>int</em>) &#8211; <p>device ID of layer. device=-1, use CPU. device&gt;=0, use GPU.
 The details allocation in parallel_nn please refer to <a class="reference external" href="http://www.paddlepaddle.org/doc/ui/cmd_argument/use_case.html#case-2-specify-layers-in-different-devices">here</a>.</p>
 </li>
 </ul>
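The hunk above corrects the documented device semantics: `-1` means CPU, and any id `>= 0` selects a GPU (the old `device>0` wording wrongly excluded GPU 0). A minimal plain-Python sketch of that dispatch rule; the helper name is hypothetical, not PaddlePaddle API:

```python
def resolve_device(device: int) -> str:
    # Per the corrected doc: -1 -> CPU, any id >= 0 -> that GPU.
    if device == -1:
        return "cpu"
    if device >= 0:
        return f"gpu:{device}"
    raise ValueError("device must be -1 (CPU) or a GPU id >= 0")

print(resolve_device(0))  # "gpu:0" -- valid under the corrected rule
```

Note that under the old `device>0` wording, `resolve_device(0)` would have been ambiguous; the fix makes GPU 0 addressable.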
@@ -1757,11 +1757,11 @@ It performs element-wise multiplication with weight.</p>
 <code class="descclassname">paddle.trainer_config_helpers.layers.</code><code class="descname">dotmul_operator</code><span class="sig-paren">(</span><em>a=None</em>, <em>b=None</em>, <em>scale=1</em>, <em>**kwargs</em><span class="sig-paren">)</span></dt>
 <dd><p>DotMulOperator takes two inputs and performs element-wise multiplication:</p>
 <div class="math">
-\[out.row[i] += scale * (x.row[i] .* y.row[i])\]</div>
+\[out.row[i] += scale * (a.row[i] .* b.row[i])\]</div>
 <p>where <span class="math">\(.*\)</span> means element-wise multiplication, and
 scale is a config scalar, its default value is one.</p>
 <p>The example usage is:</p>
-<div class="highlight-python"><div class="highlight"><pre><span></span><span class="n">op</span> <span class="o">=</span> <span class="n">dotmul_operator</span><span class="p">(</span><span class="n">x</span><span class="o">=</span><span class="n">layer1</span><span class="p">,</span> <span class="n">y</span><span class="o">=</span><span class="n">layer2</span><span class="p">,</span> <span class="n">scale</span><span class="o">=</span><span class="mf">0.5</span><span class="p">)</span>
+<div class="highlight-python"><div class="highlight"><pre><span></span><span class="n">op</span> <span class="o">=</span> <span class="n">dotmul_operator</span><span class="p">(</span><span class="n">a</span><span class="o">=</span><span class="n">layer1</span><span class="p">,</span> <span class="n">b</span><span class="o">=</span><span class="n">layer2</span><span class="p">,</span> <span class="n">scale</span><span class="o">=</span><span class="mf">0.5</span><span class="p">)</span>
 </pre></div>
 </div>
 <table class="docutils field-list" frame="void" rules="none">
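The hunk above renames the docstring's variables so the formula matches the signature: `out.row[i] += scale * (a.row[i] .* b.row[i])`. A minimal plain-Python sketch of that element-wise operation on one row, for illustration only (this is not PaddlePaddle's implementation, which operates on layer outputs inside a network):

```python
def dotmul_row(a, b, scale=1.0):
    # out[i] = scale * (a[i] * b[i]), element-wise over one row.
    if len(a) != len(b):
        raise ValueError("rows must have equal length")
    return [scale * x * y for x, y in zip(a, b)]

print(dotmul_row([1.0, 2.0], [3.0, 4.0], scale=0.5))
```

This also shows why the rename matters: the keyword arguments in the usage example (`a=`, `b=`) now line up with the symbols in the formula.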
@@ -3077,9 +3077,9 @@ Input should be a vector of positive numbers, without normalization.</p>
 <dd><blockquote>
 <div><p>mean squared error cost:</p>
 <div class="math">
-\[$\]</div>
+\[\]</div>
 </div></blockquote>
-<p>rac{1}{N}sum_{i=1}^N(t _i- y_i)^2$</p>
+<p>rac{1}{N}sum_{i=1}^N(t_i-y_i)^2</p>
 <blockquote>
 <div><table class="docutils field-list" frame="void" rules="none">
 <col class="field-name" />
@@ -301,7 +301,7 @@ layer that not support this attribute, paddle will print an error and core.</p>
 <li><strong>drop_rate</strong> (<em>float</em>) &#8211; Dropout rate. Dropout will create a mask on layer output.
 The dropout rate is the zero rate of this mask. The
 details of what dropout is please refer to <a class="reference external" href="https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf">here</a>.</li>
-<li><strong>device</strong> (<em>int</em>) &#8211; <p>device ID of layer. device=-1, use CPU. device&gt;0, use GPU.
+<li><strong>device</strong> (<em>int</em>) &#8211; <p>device ID of layer. device=-1, use CPU. device&gt;=0, use GPU.
 The details allocation in parallel_nn please refer to <a class="reference external" href="http://www.paddlepaddle.org/doc/ui/cmd_argument/use_case.html#case-2-specify-layers-in-different-devices">here</a>.</p>
 </li>
 </ul>
This source diff is too large to display. You can view the blob instead.
This source diff is too large to display. You can view the blob instead.
...@@ -11,8 +11,7 @@ Data layer ...@@ -11,8 +11,7 @@ Data layer
data data
---- ----
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.data
:members: data
:noindex: :noindex:
Fully Connected Layers Fully Connected Layers
...@@ -22,14 +21,12 @@ Fully Connected Layers ...@@ -22,14 +21,12 @@ Fully Connected Layers
fc fc
-- --
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.fc
:members: fc
:noindex: :noindex:
selective_fc selective_fc
------------ ------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.selective_fc
:members: selective_fc
:noindex: :noindex:
Conv Layers Conv Layers
...@@ -37,34 +34,29 @@ Conv Layers ...@@ -37,34 +34,29 @@ Conv Layers
conv_operator conv_operator
------------- -------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.conv_operator
:members: conv_operator
:noindex: :noindex:
conv_projection conv_projection
--------------- ---------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.conv_projection
:members: conv_projection
:noindex: :noindex:
conv_shift conv_shift
---------- ----------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.conv_shift
:members: conv_shift
:noindex: :noindex:
img_conv img_conv
-------- --------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.img_conv
:members: img_conv
:noindex: :noindex:
.. _api_v2.layer_context_projection: .. _api_v2.layer_context_projection:
context_projection context_projection
------------------ ------------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.context_projection
:members: context_projection
:noindex: :noindex:
Image Pooling Layer Image Pooling Layer
...@@ -72,20 +64,17 @@ Image Pooling Layer ...@@ -72,20 +64,17 @@ Image Pooling Layer
img_pool img_pool
-------- --------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.img_pool
:members: img_pool
:noindex: :noindex:
spp spp
--- ---
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.spp
:members: spp
:noindex: :noindex:
maxout maxout
------ ------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.maxout
:members: maxout
:noindex: :noindex:
Norm Layer Norm Layer
...@@ -93,26 +82,22 @@ Norm Layer ...@@ -93,26 +82,22 @@ Norm Layer
img_cmrnorm img_cmrnorm
----------- -----------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.img_cmrnorm
:members: img_cmrnorm
:noindex: :noindex:
batch_norm batch_norm
---------- ----------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.batch_norm
:members: batch_norm
:noindex: :noindex:
sum_to_one_norm sum_to_one_norm
--------------- ---------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.sum_to_one_norm
:members: sum_to_one_norm
:noindex: :noindex:
cross_channel_norm cross_channel_norm
------------------ ------------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.cross_channel_norm
:members: cross_channel_norm
:noindex: :noindex:
Recurrent Layers Recurrent Layers
...@@ -120,20 +105,17 @@ Recurrent Layers ...@@ -120,20 +105,17 @@ Recurrent Layers
recurrent recurrent
--------- ---------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.recurrent
:members: recurrent
:noindex: :noindex:
lstmemory lstmemory
--------- ---------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.lstmemory
:members: lstmemory
:noindex: :noindex:
grumemory grumemory
--------- ---------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.grumemory
:members: grumemory
:noindex: :noindex:
Recurrent Layer Group Recurrent Layer Group
...@@ -141,38 +123,32 @@ Recurrent Layer Group ...@@ -141,38 +123,32 @@ Recurrent Layer Group
memory memory
------ ------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.memory
:members: memory
:noindex: :noindex:
recurrent_group recurrent_group
--------------- ---------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.recurrent_group
:members: recurrent_group
:noindex: :noindex:
lstm_step lstm_step
--------- ---------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.lstm_step
:members: lstm_step
:noindex: :noindex:
gru_step gru_step
-------- --------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.gru_step
:members: gru_step
:noindex: :noindex:
beam_search beam_search
------------ ------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.beam_search
:members: beam_search
:noindex: :noindex:
get_output get_output
---------- ----------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.get_output
:members: get_output
:noindex: :noindex:
Mixed Layer Mixed Layer
...@@ -182,59 +158,50 @@ Mixed Layer ...@@ -182,59 +158,50 @@ Mixed Layer
mixed mixed
----- -----
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.mixed
:members: mixed
:noindex: :noindex:
.. _api_v2.layer_embedding: .. _api_v2.layer_embedding:
embedding embedding
--------- ---------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.embedding
:members: embedding
:noindex: :noindex:
scaling_projection scaling_projection
------------------ ------------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.scaling_projection
:members: scaling_projection
:noindex: :noindex:
dotmul_projection dotmul_projection
----------------- -----------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.dotmul_projection
:members: dotmul_projection
:noindex: :noindex:
dotmul_operator dotmul_operator
--------------- ---------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.dotmul_operator
:members: dotmul_operator
:noindex: :noindex:
full_matrix_projection full_matrix_projection
---------------------- ----------------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.full_matrix_projection
:members: full_matrix_projection
:noindex: :noindex:
identity_projection identity_projection
------------------- -------------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.identity_projection
:members: identity_projection
:noindex: :noindex:
table_projection table_projection
---------------- ----------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.table_projection
:members: table_projection
:noindex: :noindex:
trans_full_matrix_projection trans_full_matrix_projection
---------------------------- ----------------------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.trans_full_matrix_projection
:members: trans_full_matrix_projection
:noindex: :noindex:
Aggregate Layers Aggregate Layers
...@@ -244,36 +211,31 @@ Aggregate Layers ...@@ -244,36 +211,31 @@ Aggregate Layers
pooling pooling
------- -------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.pooling
:members: pooling
:noindex: :noindex:
.. _api_v2.layer_last_seq: .. _api_v2.layer_last_seq:
last_seq last_seq
-------- --------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.last_seq
:members: last_seq
:noindex: :noindex:
.. _api_v2.layer_first_seq: .. _api_v2.layer_first_seq:
first_seq first_seq
--------- ---------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.first_seq
:members: first_seq
:noindex: :noindex:
concat concat
------ ------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.concat
:members: concat
:noindex: :noindex:
seq_concat seq_concat
---------- ----------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.seq_concat
:members: seq_concat
:noindex: :noindex:
Reshaping Layers Reshaping Layers
...@@ -281,34 +243,29 @@ Reshaping Layers ...@@ -281,34 +243,29 @@ Reshaping Layers
block_expand block_expand
------------ ------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.block_expand
:members: block_expand
:noindex: :noindex:
.. _api_v2.layer_expand: .. _api_v2.layer_expand:
expand expand
------ ------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.expand
:members: expand
:noindex: :noindex:
repeat repeat
------ ------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.repeat
:members: repeat
:noindex: :noindex:
rotate rotate
------ ------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.rotate
:members: rotate
:noindex: :noindex:
seq_reshape seq_reshape
----------- -----------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.seq_reshape
:members: seq_reshape
:noindex: :noindex:
Math Layers Math Layers
...@@ -316,64 +273,54 @@ Math Layers ...@@ -316,64 +273,54 @@ Math Layers
addto addto
----- -----
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.addto
:members: addto
:noindex: :noindex:
linear_comb linear_comb
----------- -----------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.linear_comb
:members: linear_comb
:noindex: :noindex:
interpolation interpolation
------------- -------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.interpolation
:members: interpolation
:noindex: :noindex:
bilinear_interp bilinear_interp
--------------- ---------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.bilinear_interp
:members: bilinear_interp
:noindex: :noindex:
power power
----- -----
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.power
:members: power
:noindex: :noindex:
scaling scaling
------- -------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.scaling
:members: scaling
:noindex: :noindex:
slope_intercept slope_intercept
--------------- ---------------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.slope_intercept
:members: slope_intercept
:noindex: :noindex:
tensor tensor
------ ------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.tensor
:members: tensor
:noindex: :noindex:
.. _api_v2.layer_cos_sim: .. _api_v2.layer_cos_sim:
cos_sim cos_sim
------- -------
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.cos_sim
:members: cos_sim
:noindex: :noindex:
trans trans
----- -----
.. automodule:: paddle.v2.layer .. autoclass:: paddle.v2.layer.trans
:members: trans
   :noindex:

Sampling Layers
===============

maxid
-----
.. autoclass:: paddle.v2.layer.max_id
   :noindex:

sampling_id
-----------
.. autoclass:: paddle.v2.layer.sampling_id
   :noindex:

Slicing and Joining Layers
==========================

pad
---
.. autoclass:: paddle.v2.layer.pad
   :noindex:

.. _api_v2.layer_costs:

Cost Layers
===========

cross_entropy_cost
------------------
.. autoclass:: paddle.v2.layer.cross_entropy_cost
   :noindex:

cross_entropy_with_selfnorm_cost
--------------------------------
.. autoclass:: paddle.v2.layer.cross_entropy_with_selfnorm_cost
   :noindex:

multi_binary_label_cross_entropy_cost
-------------------------------------
.. autoclass:: paddle.v2.layer.multi_binary_label_cross_entropy_cost
   :noindex:

huber_cost
----------
.. autoclass:: paddle.v2.layer.huber_cost
   :noindex:

lambda_cost
-----------
.. autoclass:: paddle.v2.layer.lambda_cost
   :noindex:

mse_cost
--------
.. autoclass:: paddle.v2.layer.mse_cost
   :noindex:

rank_cost
---------
.. autoclass:: paddle.v2.layer.rank_cost
   :noindex:

sum_cost
--------
.. autoclass:: paddle.v2.layer.sum_cost
   :noindex:

crf
---
.. autoclass:: paddle.v2.layer.crf
   :noindex:

crf_decoding
------------
.. autoclass:: paddle.v2.layer.crf_decoding
   :noindex:

ctc
---
.. autoclass:: paddle.v2.layer.ctc
   :noindex:

warp_ctc
--------
.. autoclass:: paddle.v2.layer.warp_ctc
   :noindex:

nce
---
.. autoclass:: paddle.v2.layer.nce
   :noindex:

hsigmoid
--------
.. autoclass:: paddle.v2.layer.hsigmoid
   :noindex:

Check Layer
===========

eos
---
.. autoclass:: paddle.v2.layer.eos
   :noindex:
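The directives above only pull in generated docstrings. For intuition, ``cross_entropy_cost`` averages the negative log-probability assigned to the true class. A minimal NumPy sketch of that arithmetic (illustrative only, not the Paddle implementation; the function name and signature here are hypothetical):

```python
import numpy as np

def cross_entropy_cost(probs, labels, eps=1e-12):
    # probs: (N, C) rows of predicted class probabilities
    # labels: (N,) integer class ids
    probs = np.clip(probs, eps, 1.0)  # avoid log(0)
    n = len(labels)
    # average negative log-likelihood of the true class
    return -np.mean(np.log(probs[np.arange(n), labels]))

probs = np.array([[0.5, 0.5], [0.25, 0.75]])
labels = np.array([0, 1])
cost = cross_entropy_cost(probs, labels)
```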
@@ -277,7 +277,7 @@ layer that not support this attribute, paddle will print an error and core.</p>
<li><strong>drop_rate</strong> (<em>float</em>) &#8211; Dropout rate. Dropout will create a mask on layer output.
The dropout rate is the zero rate of this mask. The
details of what dropout is please refer to <a class="reference external" href="https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf">here</a>.</li>
<li><strong>device</strong> (<em>int</em>) &#8211; <p>device ID of layer. device=-1, use CPU. device&gt;=0, use GPU.
The details allocation in parallel_nn please refer to <a class="reference external" href="http://www.paddlepaddle.org/doc/ui/cmd_argument/use_case.html#case-2-specify-layers-in-different-devices">here</a>.</p>
</li>
</ul>
......
@@ -1764,11 +1764,11 @@ It performs element-wise multiplication with weight.</p>
<code class="descclassname">paddle.trainer_config_helpers.layers.</code><code class="descname">dotmul_operator</code><span class="sig-paren">(</span><em>a=None</em>, <em>b=None</em>, <em>scale=1</em>, <em>**kwargs</em><span class="sig-paren">)</span></dt>
<dd><p>DotMulOperator takes two inputs and performs element-wise multiplication:</p>
<div class="math">
\[out.row[i] += scale * (a.row[i] .* b.row[i])\]</div>
<p>where <span class="math">\(.*\)</span> means element-wise multiplication, and
scale is a config scalar, its default value is one.</p>
<p>The example usage is:</p>
<div class="highlight-python"><div class="highlight"><pre><span></span><span class="n">op</span> <span class="o">=</span> <span class="n">dotmul_operator</span><span class="p">(</span><span class="n">a</span><span class="o">=</span><span class="n">layer1</span><span class="p">,</span> <span class="n">b</span><span class="o">=</span><span class="n">layer2</span><span class="p">,</span> <span class="n">scale</span><span class="o">=</span><span class="mf">0.5</span><span class="p">)</span>
</pre></div>
</div>
<table class="docutils field-list" frame="void" rules="none">
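The dotmul formula is plain element-wise arithmetic per batch row. A NumPy sketch of what the operator computes (a sketch of the math only, not the trainer_config_helpers implementation):

```python
import numpy as np

def dotmul(a, b, scale=1.0):
    # out.row[i] += scale * (a.row[i] .* b.row[i]),
    # where .* denotes element-wise multiplication
    out = np.zeros_like(a, dtype=float)
    out += scale * (a * b)
    return out

a = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([[5.0, 6.0], [7.0, 8.0]])
out = dotmul(a, b, scale=0.5)
```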
@@ -3084,9 +3084,9 @@ Input should be a vector of positive numbers, without normalization.</p>
<dd><blockquote>
<div><p>mean squared error cost:</p>
<div class="math">
\[\frac{1}{N}\sum_{i=1}^{N}(t_i - y_i)^2\]</div>
</div></blockquote>
<blockquote>
<div><table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
......
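The mean squared error cost is a one-liner in NumPy; this sketch mirrors the formula, not Paddle's kernel:

```python
import numpy as np

def mse_cost(t, y):
    # mean squared error: (1/N) * sum_i (t_i - y_i)^2
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)
    return np.mean((t - y) ** 2)

cost = mse_cost([3.0, 1.0], [1.0, 1.0])
```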
@@ -308,7 +308,7 @@ layer that not support this attribute, paddle will print an error and core.</p>
<li><strong>drop_rate</strong> (<em>float</em>) &#8211; Dropout rate. Dropout will create a mask on layer output.
The dropout rate is the zero rate of this mask. The
details of what dropout is please refer to <a class="reference external" href="https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf">here</a>.</li>
<li><strong>device</strong> (<em>int</em>) &#8211; <p>device ID of layer. device=-1, use CPU. device&gt;=0, use GPU.
The details allocation in parallel_nn please refer to <a class="reference external" href="http://www.paddlepaddle.org/doc/ui/cmd_argument/use_case.html#case-2-specify-layers-in-different-devices">here</a>.</p>
</li>
</ul>
......