Commit 7f1d6c5a authored by dangqingqing

Fix some documentation.

ISSUE=4611579

git-svn-id: https://svn.baidu.com/idl/trunk/paddle@1473 1ad973e4-5ce8-4261-8a94-b56d1f490c56
Parent 80790017
@@ -45,5 +45,5 @@ sphinx_add_target(paddle_docs
                   ${SPHINX_HTML_DIR})
 add_dependencies(paddle_docs
-  gen_proto_py
-  paddle_doxygen_docs)
+  gen_proto_py)
+  #paddle_doxygen_docs)
\ No newline at end of file
@@ -173,7 +173,7 @@ python -m paddle.utils.plotcurve -i $log > plot.png
 - The script `plotcurve.py` requires the python module of `matplotlib`, so if it fails, maybe you need to install `matplotlib`.
-After training finishes, the training and testing error curve will be saved to `plot.png` using `plotcurve.py` script. An example of the plot is shown below:
+After training finishes, the training and testing error curves will be saved to `plot.png` by the `plotcurve.py` script. An example of the plot is shown below:
 <center>![Training and testing curves.](./plot.png)</center>
...
 # Model Zoo - ImageNet #
-[ImageNet](http://www.image-net.org/) is a popular dataset for generic object classification. This tutorial provided convolutional neural network(CNN) models for ImageNet.
+[ImageNet](http://www.image-net.org/) is a popular dataset for generic object classification. This tutorial provides convolutional neural network (CNN) models for ImageNet.
 ## ResNet Introduction
@@ -48,11 +48,11 @@ We present three ResNet models, which are converted from the models provided by
 ## ResNet Model
-See ```demo/model_zoo/resnet/resnet.py```. This confgiure contains network of 50, 101 and 152 layers. You can specify layer number by adding argument like this ```--config_args=layer_num=50``` in command line arguments.
+See ```demo/model_zoo/resnet/resnet.py```. This config contains networks of 50, 101 and 152 layers. You can specify the layer number by adding an argument like ```--config_args=layer_num=50``` on the command line.
 ### Network Visualization
-You can get a diagram of ResNet network by running the following command. The script generates dot file and then converts dot file to PNG file, which uses installed draw_dot tool in our server. If you can not access the server, just install graphviz to convert dot file.
+You can get a diagram of the ResNet network by running the following commands. The script generates a dot file and then converts it to a PNG file using the draw_dot tool installed on our server. If you cannot access the server, just install graphviz to convert the dot file.
 ```
 cd demo/model_zoo/resnet
@@ -190,8 +190,7 @@ Second, specify layers to extract features in `Outputs()` of `resnet.py`. For ex
 Outputs("res5_3_branch2c_conv", "res5_3_branch2c_bn")
 ```
-Third, specify model path and output directory in `extract_fea_c++.sh
-`, and then run following commands
+Third, specify the model path and output directory in `extract_fea_c++.sh`, and then run the following commands.
 ```
 cd demo/model_zoo/resnet
...
@@ -10,7 +10,7 @@ customized, with sacrificing the efficiency only a little. This is extremly
 useful when you have to dynamically generate certain kinds of data according to,
 for example, the training performance.
-Besides, users also can also customize a C++ :code:`DataProvider` for a more
+Besides, users can also customize a C++ :code:`DataProvider` for a more
 complex usage, or for a higher efficiency.
 The following parameters are required to define in the PaddlePaddle network
...
@@ -17,10 +17,10 @@ how to write a simple PyDataProvider.
 MNIST is a handwriting classification data set. It contains 70,000 digital
 grayscale images. Labels of the training sample range from 0 to 9. All the
-images have been size-normalized and centered into images with a same size
+images have been size-normalized and centered into images with the same size
 of 28 x 28 pixels.
-A small part of the original data as an example can be found in the path below:
+A small part of the original data is shown below as an example:
 .. literalinclude:: ../../../doc_cn/ui/data_provider/mnist_train.txt
@@ -31,10 +31,9 @@ Just write path of the above data into train.list. It looks like this:
 .. literalinclude:: ../../../doc_cn/ui/data_provider/train.list
-The corresponding dataprovider can be found in the path below:
+The corresponding dataprovider is shown below:
 .. literalinclude:: ../../../doc_cn/ui/data_provider/mnist_provider.py
-  : linenos:
 The first line imports PyDataProvider2 package.
 The main function is the process function, that has two parameters.
@@ -45,8 +44,8 @@ This parameter is passed to the process function by PaddlePaddle.
 :code:`@provider` is a Python
 `Decorator <http://www.learnpython.org/en/Decorators>`_ .
 It sets some properties to DataProvider, and constructs a real PaddlePaddle
-DataProvider from a very sample user implemented python function. It does not
-matter if you are not familiar with `Decorator`_. You can keep it sample by
+DataProvider from a very simple user implemented python function. It does not
+matter if you are not familiar with `Decorator`_. You can keep it simple by
 just taking :code:`@provider` as a fixed mark above the provider function you
 implemented.
@@ -59,9 +58,9 @@ document of `input_types`_ for more details.
 The process method is the core part to construct a real DataProvider in
 PaddlePaddle. It implements how to open the text file, how to read one sample
-from the original text file, converted them into `input_types`_, and give them
+from the original text file, convert them into `input_types`_, and give them
 back to PaddlePaddle process at line 23.
-Note that data yields by the process function must follow a same order that
+Note that data yielded by the process function must follow the same order that
 `input_types`_ are defined.
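To make this concrete, here is a minimal provider sketch in the spirit of the `mnist_provider.py` listed above (the text format `label;pixel pixel ...` and all names besides the PyDataProvider2 API are illustrative):
```
# A minimal sketch of the process/@provider pattern described above.
# The input text format (label;pixels) is assumed, not prescribed.
from paddle.trainer.PyDataProvider2 import provider, dense_vector, integer_value

@provider(input_types=[dense_vector(28 * 28),  # one flattened 28x28 image
                       integer_value(10)])     # one label in [0, 9]
def process(settings, filename):
    with open(filename, 'r') as f:
        for line in f:
            label, pixels = line.split(';')
            # Yield fields in the same order as input_types above.
            yield [float(x) for x in pixels.split()], int(label)
```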
@@ -111,7 +110,7 @@ The corresponding data provider can be found in the path below:
 .. literalinclude:: ../../../doc_cn/ui/data_provider/sentimental_provider.py
-This data provider for sequential model is a little bit complex than that
+This data provider for sequential model is a little more complex than that
 for MINST dataset.
 A new initialization method is introduced here.
 The method :code:`on_init` is configured to DataProvider by :code:`@provider`'s
@@ -243,7 +242,7 @@ parameters which your init_hook does not use.
 cache
 +++++
 DataProvider provides two simple cache strategy. They are
-* CacheType.NO_CACHE means do not cache any data, then data is read runtime by
+* CacheType.NO_CACHE means do not cache any data, then data is read at runtime by
 the user implemented python module every pass.
 * CacheType.CACHE_PASS_IN_MEM means the first pass reads data by the user
 implemented python module, and the rest passes will directly read data from
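A hedged sketch of opting into the in-memory cache, assuming `@provider` takes a `cache` keyword that accepts these values:
```
# Hypothetical: cache the first pass in memory; later passes read from the
# cache instead of calling this provider. Assumes the `cache` keyword.
from paddle.trainer.PyDataProvider2 import (provider, dense_vector,
                                            integer_value, CacheType)

@provider(input_types=[dense_vector(28 * 28), integer_value(10)],
          cache=CacheType.CACHE_PASS_IN_MEM)
def process(settings, filename):
    with open(filename, 'r') as f:
        for line in f:
            label, pixels = line.split(';')
            yield [float(x) for x in pixels.split()], int(label)
```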
...
@@ -613,7 +613,7 @@ def data_layer(name, size, layer_attr=None):
     :type size: int
     :param layer_attr: Extra Layer Attribute.
     :type layer_attr: ExtraLayerAttribute.
-    :return: Layer Output Object.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     Layer(type=LayerType.DATA, name=name, size=size,
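Since every helper returns a `LayerOutput`, layers chain directly. A hypothetical usage sketch (names and sizes are illustrative):
```
# Hypothetical config sketch: declare an input and feed it onward.
from paddle.trainer_config_helpers import *

img = data_layer(name='pixel', size=28 * 28)  # returns a LayerOutput
hidden = fc_layer(input=img, size=128, act=ReluActivation())
```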
@@ -640,7 +640,7 @@ def embedding_layer(input, size, name=None, param_attr=None, layer_attr=None):
     :type param_attr: ParameterAttribute|None
     :param layer_attr: Extra layer Config. Default is None.
     :type layer_attr: ExtraLayerAttribute|None
-    :return: Embedding Layer output
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     with mixed_layer(name=name, size=size, act=LinearActivation(),
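A hedged sketch of the embedding helper on top of an integer-id input (vocabulary size and dimension are illustrative):
```
# Hypothetical: map word ids from a 10,000-word vocabulary to 256-d vectors.
from paddle.trainer_config_helpers import *

word = data_layer(name='word', size=10000)
emb = embedding_layer(input=word, size=256)
```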
@@ -692,7 +692,7 @@ def fc_layer(input, size, act=None, name=None,
     :type bias_attr: ParameterAttribute|None|Any
     :param layer_attr: Extra Layer config.
     :type layer_attr: ExtraLayerAttribute|None
-    :return: Layer Name.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     if isinstance(input, LayerOutput):
@@ -756,7 +756,7 @@ def pooling_layer(input, pooling_type=None, name=None, bias_attr=None,
     :type bias_attr: ParameterAttribute|None|False
     :param layer_attr: The Extra Attributes for layer, such as dropout.
     :type layer_attr: ExtraLayerAttribute|None
-    :return: layer name.
+    :return: LayerOutput object.
     :rtype: LayerType
     """
     extra_dict = dict()
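A hedged sketch of sequence pooling, assuming the `MaxPooling` pooling type from this package:
```
# Hypothetical: collapse a sequence of 256-d embeddings into one vector.
from paddle.trainer_config_helpers import *

word = data_layer(name='word', size=10000)
emb = embedding_layer(input=word, size=256)
seq_vec = pooling_layer(input=emb, pooling_type=MaxPooling())
```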
@@ -842,7 +842,7 @@ def lstmemory(input, name=None, reverse=False, act=None,
     :type param_attr: ParameterAttribute|None|False
     :param layer_attr: Extra Layer attribute
     :type layer_attr: ExtraLayerAttribute|None
-    :return: Layer name.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
@@ -943,7 +943,7 @@ def grumemory(input, name=None, reverse=False, act=None,
     :type param_attr: ParameterAttribute|None|False
     :param layer_attr: Extra Layer attribute
     :type layer_attr: ExtraLayerAttribute|None
-    :return: Layer name.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
@@ -977,7 +977,7 @@ def last_seq(input, name=None, agg_level=AggregateLevel.EACH_TIMESTEP,
     :type input: LayerOutput
     :param layer_attr: extra layer attributes.
     :type layer_attr: ExtraLayerAttribute.
-    :return: layer name.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     Layer(
@@ -1005,7 +1005,7 @@ def first_seq(input, name=None, agg_level=AggregateLevel.EACH_TIMESTEP,
     :type input: LayerOutput
     :param layer_attr: extra layer attributes.
     :type layer_attr: ExtraLayerAttribute.
-    :return: layer name.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     Layer(
@@ -1055,7 +1055,7 @@ def expand_layer(input, expand_as,
     :type expand_level: ExpandLevel
     :param layer_attr: extra layer attributes.
     :type layer_attr: ExtraLayerAttribute.
-    :return: layer name
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
@@ -1102,7 +1102,7 @@ def interpolation_layer(input, weight, name=None, layer_attr=None):
     :type name: basestring
     :param layer_attr: extra layer attributes.
     :type layer_attr: ExtraLayerAttribute.
-    :return: layer name.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     assert isinstance(input, list) or isinstance(input, tuple)
@@ -1147,7 +1147,7 @@ def power_layer(input, weight, name=None, layer_attr=None):
     :type name: basestring
     :param layer_attr: extra layer attributes.
     :type layer_attr: ExtraLayerAttribute.
-    :return: layer name.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     assert weight.size == 1
@@ -1187,7 +1187,7 @@ def scaling_layer(input, weight, name=None, layer_attr=None):
     :type name: basestring
     :param layer_attr: extra layer attributes.
     :type layer_attr: ExtraLayerAttribute.
-    :return: layer name.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     assert weight.size == 1
@@ -1224,7 +1224,7 @@ def trans_layer(input, name=None, layer_attr=None):
     :type name: basestring
     :param layer_attr: extra layer attributes.
     :type layer_attr: ExtraLayerAttribute.
-    :return: layer name.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     Layer(
@@ -1244,8 +1244,8 @@ def cos_sim(a, b, scale=5, size=1, name=None, layer_attr=None):
     Cosine Similarity Layer. The cosine similarity equation is here.
     .. math::
-       similarity = cos(\\theta) = {\\mathbf{A} \\cdot \\mathbf{B}
-       \\over \\|\\mathbf{A}\\| \\|\\mathbf{B}\\|}
+       similarity = cos(\\theta) = {\\mathbf{a} \\cdot \\mathbf{b}
+       \\over \\|\\mathbf{a}\\| \\|\\mathbf{b}\\|}
     And the input dimension is :math:`a \in R^M`, :math:`b \in R^{MN}`. The
     similarity will be calculated N times by step M. The output dimension is
@@ -1263,7 +1263,7 @@ def cos_sim(a, b, scale=5, size=1, name=None, layer_attr=None):
     :type size: int
     :param layer_attr: Extra Layer Attribute.
     :type layer_attr: ExtraLayerAttribute
-    :return: layer name.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     Layer(
@@ -1308,7 +1308,7 @@ def hsigmoid(input, label, num_classes, name=None, bias_attr=None, layer_attr=No
     :type bias_attr: ParameterAttribute|False
     :param layer_attr: Extra Layer Attribute.
     :type layer_attr: ExtraLayerAttribute
-    :return: layer name.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     if isinstance(input, LayerOutput):
@@ -1400,7 +1400,7 @@ def img_conv_layer(input, filter_size, num_filters,
     :type shared_biases: bool
     :param layer_attr: Layer Extra Attribute.
     :type layer_attr: ExtraLayerAttribute
-    :return: Layer output.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     if num_channels is None:
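A hedged sketch of the convolution helper (filter counts, sizes, and the input layout are illustrative):
```
# Hypothetical: one 3x3 convolution with 64 filters over a 32x32 RGB input.
from paddle.trainer_config_helpers import *

img = data_layer(name='image', size=32 * 32 * 3)
conv = img_conv_layer(input=img, filter_size=3, num_filters=64,
                      num_channels=3, stride=1, padding=1,
                      act=ReluActivation())
```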
@@ -1464,7 +1464,8 @@ def img_pool_layer(input, pool_size, name=None,
     :type start: int
     :param layer_attr: Extra Layer attribute.
     :type layer_attr: ExtraLayerAttribute
-    :return: LayerOutput
+    :return: LayerOutput object.
+    :rtype: LayerOutput
     """
     if num_channels is None:
         assert input.num_filters is not None
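Continuing that sketch, a pooling step on the conv output (still hypothetical):
```
# Hypothetical, continuing the convolution sketch above:
# 2x2 max pooling over the 64-channel conv output.
pool = img_pool_layer(input=conv, pool_size=2, stride=2,
                      num_channels=64, pool_type=MaxPooling())
```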
@@ -1514,29 +1515,30 @@ def __img_norm_layer__(name, input, size, norm_type, scale, power,
 @wrap_name_default("crmnorm")
 @layer_support()
-def img_cmrnorm_layer(input, size, scale, power, name=None, num_channels=None,
+def img_cmrnorm_layer(input, size, scale=0.0128, power=0.75,
+                      name=None, num_channels=None,
                       blocked=0, layer_attr=None):
     """
     Convolution cross-map-response-normalize layer.
-    TODO(yuyang18): Add reference and equations, to explain why cmr is work?
+    For details, please refer to
+    `Alex's paper <http://www.cs.toronto.edu/~fritz/absps/imagenet.pdf>`_.
     :param name: layer name.
-    :type name: basestring
+    :type name: None|basestring
     :param input: layer's input.
     :type input: LayerOutput
     :param size: cross map response size.
     :type size: int
-    :param scale: TODO(yuyang18)
+    :param scale: The hyper-parameter.
     :type scale: float
-    :param power: TODO(yuyang18)
+    :param power: The hyper-parameter.
     :type power: float
     :param num_channels: input layer's filers number or channels. If
                          num_channels is None, it will be set automatically.
-    :param blocked: TODO(yuyang18)
+    :param blocked: namely normalize in number of blocked feature maps.
     :param layer_attr: Extra Layer Attribute.
     :type layer_attr: ExtraLayerAttribute
-    :return: Layer's output
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     return __img_norm_layer__(name, input, size, "cmrnorm-projection", scale,
@@ -1548,19 +1550,19 @@ def img_cmrnorm_layer(input, size, scale, power, name=None, num_channels=None,
 def img_rnorm_layer(input, size, scale, power, name=None, num_channels=None,
                     layer_attr=None):
     """
-    TODO(yuyang18): add comments
-    TODO(yuyang18): Why it is always not implemented whenever use_gpu or not?
+    Normalize the input in local region, namely response normalization
+    across feature maps.
-    :param name:
+    :param name: The name of this layer.
+    :type name: None|basestring
-    :param input:
+    :param input: The input of this layer.
     :param size:
     :param scale:
     :param power:
     :param num_channels:
     :param layer_attr:
-    :return:
+    :return: LayerOutput object.
+    :rtype: LayerOutput
     """
     return __img_norm_layer__(name, input, size, 'rnorm', scale, power,
                               num_channels, 0, layer_attr)
@@ -1637,7 +1639,7 @@ def batch_norm_layer(input, act=None, name=None, num_channels=None,
     :math:`runningMean = newMean*(1-factor)
                          + runningMean*factor`
     :type moving_average_fraction: float.
-    :return: Layer's output
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     if not isinstance(act, ReluActivation):
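And a batch normalization step; the code above suggests this implementation expects a ReLU activation, so the sketch passes one (still hypothetical):
```
# Hypothetical, continuing the sketch: batch normalization with the ReLU
# activation this implementation checks for.
bn = batch_norm_layer(input=norm, act=ReluActivation())
```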
@@ -1701,7 +1703,7 @@ def sum_to_one_norm_layer(input, name=None, layer_attr=None):
     :type name: basestring
     :param layer_attr: extra layer attributes.
     :type layer_attr: ExtraLayerAttribute.
-    :return: layer name.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     Layer(
@@ -1761,7 +1763,7 @@ def addto_layer(input, act=None, name=None, bias_attr=None,
     :type bias_attr: ParameterAttribute|bool
     :param layer_attr: Extra Layer attribute.
     :type layer_attr: ExtraLayerAttribute
-    :return: layer's output
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     num_filters = None
@@ -1803,7 +1805,7 @@ def concat_layer(input, act=None, name=None, layer_attr=None):
     :type act: BaseActivation
     :param layer_attr: Extra Layer Attribute.
     :type layer_attr: ExtraLayerAttribute
-    :return: layer's output
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
@@ -1901,7 +1903,7 @@ def memory(name, size, is_seq=False, boot_layer=None,
     :type boot_bias_active_type: BaseActivation
     :param boot_with_const_id: boot layer's id.
     :type boot_with_const_id: int
-    :return: Memory layer's output
+    :return: LayerOutput object which is a memory.
     :rtype: LayerOutput
     """
     if boot_bias_active_type is None:
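A memory pairs with a same-named layer inside a recurrent step function; a minimal hypothetical sketch (see also the recurrent_group example further below):
```
# Hypothetical: expose the previous time step of a layer named 'rnn_state'.
from paddle.trainer_config_helpers import *

prev = memory(name='rnn_state', size=128)
```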
@@ -1993,7 +1995,7 @@ def lstm_step_layer(input, state, size, act=None,
     :type bias_attr: ParameterAttribute
     :param layer_attr: layer's extra attribute.
     :type layer_attr: ExtraLayerAttribute
-    :return: lstm step's layer output
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     Layer(
@@ -2032,7 +2034,7 @@ def gru_step_layer(input, output_mem, size=None, act=None,
     :param gate_act:
     :param bias_attr:
     :param layer_attr:
-    :return:
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     assert input.size % 3 == 0
@@ -2073,7 +2075,7 @@ def get_output_layer(input, arg_name, name=None, layer_attr=None):
     :param arg_name: Output name from input.
     :type arg_name: basestring
     :param layer_attr: Layer's extra attribute.
-    :return: Layer's output
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     # GetOutputLayer
@@ -2107,7 +2109,7 @@ def recurrent_layer(input, act=None, bias_attr=None,
     :param param_attr:
     :param name:
     :param layer_attr:
-    :return:
+    :return: LayerOutput object.
     """
     Layer(name=name,
           type=LayerType.RECURRENT_LAYER,
@@ -2201,7 +2203,7 @@ def recurrent_group(step, input, reverse=False, name=None):
     :param reverse: If reverse is set true, the recurrent unit will process the
                     input sequence in a reverse order.
     :type reverse: bool
-    :return: Layer output object
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     model_type('recurrent_nn')
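A hedged sketch of the memory/recurrent_group pattern (the step wiring and all names are illustrative):
```
# Hypothetical simple RNN: each step mixes the current input with the
# previous output, which the same-named memory feeds back in.
from paddle.trainer_config_helpers import *

def rnn_step(x):
    prev = memory(name='rnn_state', size=128)
    return fc_layer(input=[x, prev], size=128,
                    act=TanhActivation(), name='rnn_state')

seq = data_layer(name='word', size=10000)
emb = embedding_layer(input=seq, size=256)
rnn = recurrent_group(step=rnn_step, input=emb)
```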
@@ -2319,7 +2321,7 @@ def maxid_layer(input, name=None, layer_attr=None):
     :type name: basestring
     :param layer_attr: extra layer attributes.
     :type layer_attr: ExtraLayerAttribute.
-    :return: layer name.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
@@ -2356,7 +2358,7 @@ def eos_layer(input, eos_id, name=None, layer_attr=None):
     :type eos_id: int
     :param layer_attr: extra layer attributes.
     :type layer_attr: ExtraLayerAttribute.
-    :return: layer name.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     Layer(name=name,
@@ -2528,7 +2530,7 @@ def regression_cost(input, label, cost='square_error', name=None):
     :param input: Network prediction.
     :param label: Data label.
     :param cost: Cost method.
-    :return: layer name.
+    :return: LayerOutput object.
     """
     Layer(inputs=[Input(input.name), Input(label.name)], type=cost, name=name)
     return LayerOutput(
@@ -2552,7 +2554,7 @@ def classification_cost(input, label, name=None,
     :param cost: cost method.
     :type cost: basestring
     :param evaluator: Evaluator method.
-    :return: layer name.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     assert input.layer_type != LayerType.DATA
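A hedged end-to-end sketch wiring a prediction and a label into the cost (sizes and names are illustrative):
```
# Hypothetical: a 10-class softmax classifier plus its training cost.
from paddle.trainer_config_helpers import *

img = data_layer(name='pixel', size=28 * 28)
prob = fc_layer(input=img, size=10, act=SoftmaxActivation())
label = data_layer(name='label', size=10)
cost = classification_cost(input=prob, label=label)
outputs(cost)
```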
@@ -2667,7 +2669,7 @@ def conv_shift_layer(input, name=None):
     :type name: basestring
     :param input: Input layer.
     :type input: LayerOutput|list|tuple.
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     assert isinstance(input, list) or isinstance(input, tuple)
@@ -2722,7 +2724,7 @@ def tensor_layer(input, size, act=None, name=None,
     :type bias_attr: ParameterAttribute|None|Any
     :param layer_attr: Extra Layer config.
     :type layer_attr: ExtraLayerAttribute|None
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     assert isinstance(input, list) or isinstance(input, tuple)
@@ -2816,7 +2818,7 @@ def selective_fc_layer(input, size, act=None, name=None,
     :type bias_attr: ParameterAttribute|None|Any
     :param layer_attr: Extra Layer config.
     :type layer_attr: ExtraLayerAttribute|None
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     if isinstance(input, LayerOutput):
@@ -2867,7 +2869,7 @@ def sampling_id_layer(input, name=None):
     :type input: LayerOutput
     :param name: The Layer Name.
     :type name: basestring
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     Layer(
@@ -2901,7 +2903,7 @@ def slope_intercept_layer(input, name=None, slope=1.0, intercept=0.0):
     :type slope: float.
     :param intercept: the offset.
     :type intercept: float.
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     Layer(
@@ -2946,7 +2948,7 @@ def convex_comb_layer(input, size, name=None):
     :type size: int
     :param name: The Layer Name.
     :type name: basestring
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
@@ -3016,7 +3018,7 @@ def block_expand_layer(input,
     :type padding_y: int
     :param name: The name of this layer, which can not specify.
     :type name: None|basestring.
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     Layer(name=name,
@@ -3061,7 +3063,7 @@ def ctc_layer(input, label, size, name=None, norm_by_times=False):
     :type name: string|None
     :param norm_by_times: Whether to normalization by times. False by default.
     :type norm_by_times: bool
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     assert isinstance(input, LayerOutput)
@@ -3102,7 +3104,7 @@ def crf_layer(input, label, size, weight=None, param_attr=None, name=None):
     :type param_attr: ParameterAttribute
     :param name: The name of this layers. It is not necessary.
     :type name: None|basestring
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     assert isinstance(input, LayerOutput)
@@ -3144,7 +3146,7 @@ def crf_decoding_layer(input, size, label=None, param_attr=None, name=None):
     :type param_attr: ParameterAttribute
     :param name: The name of this layers. It is not necessary.
     :type name: None|basestring
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
@@ -3213,7 +3215,7 @@ def rank_cost(left, right, lable, weight=None, name=None, coeff=1.0):
     :type name: None|basestring
     :param coeff: The coefficient affects the gradient in the backward.
     :type coeff: float
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     assert left.size == 1
@@ -3270,7 +3272,7 @@ def lambda_cost(input, score, NDCG_num=5, max_sort_size=-1, coeff=1.0):
     :type name: None|basestring
     :param coeff: The coefficient affects the gradient in the backward.
     :type coeff: float
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
     Layer(name=name,
@@ -3302,7 +3304,7 @@ def cross_entropy(input, label, name=None, coeff=1.0):
     :type name: None|basestring.
     :param coeff: The coefficient affects the gradient in the backward.
     :type coeff: float.
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput.
     """
@@ -3335,7 +3337,7 @@ def cross_entropy_with_selfnorm(input, label, name=None, coeff=1.0,
     :type coeff: float.
     :param softmax_selfnorm_alpha: The scale factor affects the cost.
     :type softmax_selfnorm_alpha: float.
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput.
     """
     Layer(name=name,
@@ -3368,7 +3370,7 @@ def huber_cost(input, label, name=None, coeff=1.0):
     :type name: None|basestring.
     :param coeff: The coefficient affects the gradient in the backward.
     :type coeff: float.
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput.
     """
@@ -3398,7 +3400,7 @@ def multi_binary_label_cross_entropy(input, label, name=None, coeff=1.0):
     :type name: None|basestring
     :param coeff: The coefficient affects the gradient in the backward.
     :type coeff: float
-    :return: a object of LayerOutput.
+    :return: LayerOutput object.
     :rtype: LayerOutput
     """
...