Commit 28384912 authored by dangqingqing

add smooth_l1 interface to v2 doc.

Parent 6c654c01
@@ -419,6 +419,11 @@ hsigmoid
 .. autoclass:: paddle.v2.layer.hsigmoid
     :noindex:
+smooth_l1
+---------
+.. automodule:: paddle.v2.layer.smooth_l1
+    :noindex:
 Check Layer
 ============
......
@@ -116,7 +116,7 @@ __all__ = [
     'spp_layer',
     'pad_layer',
     'eos_layer',
-    'smooth_l1_cost',
+    'smooth_l1',
     'layer_support',
 ]
@@ -5283,7 +5283,7 @@ def multi_binary_label_cross_entropy(input,
 @wrap_name_default()
 @layer_support()
-def smooth_l1_cost(input, label, name=None, layer_attr=None):
+def smooth_l1(input, label, name=None, layer_attr=None):
     """
     This is a L1 loss but more smooth. It requires that the
     size of input and label are equal. The formula is as follows,
@@ -5307,8 +5307,8 @@ def smooth_l1_cost(input, label, name=None, layer_attr=None):
     .. code-block:: python
-       cost = smooth_l1_cost(input=input_layer,
-                             label=label_layer)
+       cost = smooth_l1(input=input_layer,
+                        label=label_layer)
     :param input: The input layer.
     :type input: LayerOutput
......
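Note: the formula that the docstring refers to lies in the collapsed part of the hunk, so it is not visible above. For reference only, the commonly used smooth L1 definition (0.5 * d^2 where |d| < 1, |d| - 0.5 otherwise, summed over the last dimension) can be sketched in NumPy as follows; this is an assumption about the standard formulation, not code taken from the PaddlePaddle layer in this diff, and the helper name smooth_l1_reference is hypothetical.

import numpy as np

def smooth_l1_reference(x, y):
    """Element-wise smooth L1 between predictions x and targets y,
    summed over the last axis so each sample gets one cost value."""
    diff = np.abs(x - y)
    # Quadratic near zero (differentiable at 0), linear further out
    # (less sensitive to outliers than a squared-error loss).
    per_element = np.where(diff < 1.0, 0.5 * diff ** 2, diff - 0.5)
    return per_element.sum(axis=-1)

# Tiny worked example: two 3-dimensional predictions and their targets.
x = np.array([[0.2, 1.5, -0.3], [0.0, 0.0, 0.0]])
y = np.array([[0.0, 1.0,  0.0], [2.0, 0.0, 0.0]])
print(smooth_l1_reference(x, y))  # -> [0.19 1.5]

After this commit, callers would use the renamed entry point, e.g. cost = paddle.v2.layer.smooth_l1(input=some_layer, label=some_label_layer), matching the updated __all__ entry and the new v2 doc section added above.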