Unverified commit 787c09cd, authored by zhupengyang, committed by GitHub

logsigmoid -> log_sigmoid (#2602)

Parent 653611db
@@ -88,8 +88,8 @@ paddle.nn
     nn/Linear.rst
     nn/linear_lr_warmup.rst
     nn/log_loss.rst
+    nn/log_sigmoid.rst
     nn/log_softmax.rst
-    nn/logsigmoid.rst
     nn/loss.rst
     nn/lrn.rst
     nn/margin_rank_loss.rst
......
-.. _api_nn_logsigmoid:
+.. _api_nn_log_sigmoid:

-logsigmoid
+log_sigmoid
 -------------------------------

-.. autofunction:: paddle.nn.functional.logsigmoid
+.. autofunction:: paddle.nn.functional.log_sigmoid
     :noindex:
@@ -5,12 +5,6 @@ logsigmoid
 .. py:function:: paddle.fluid.layers.logsigmoid(x, name=None)

-:alias_main: paddle.nn.functional.logsigmoid
-:alias: paddle.nn.functional.logsigmoid,paddle.nn.functional.activation.logsigmoid
-:old_api: paddle.fluid.layers.logsigmoid

 Logsigmoid activation function
......
@@ -109,8 +109,8 @@ paddle.nn
     nn_cn/leaky_relu_cn.rst
     nn_cn/Linear_cn.rst
     nn_cn/linear_lr_warmup_cn.rst
-    nn_cn/logsigmoid_cn.rst
     nn_cn/log_loss_cn.rst
+    nn_cn/log_sigmoid_cn.rst
     nn_cn/log_softmax_cn.rst
     nn_cn/lrn_cn.rst
     nn_cn/margin_ranking_loss_cn.rst
......
-.. _cn_api_nn_cn_logsigmoid:
+.. _cn_api_nn_cn_log_sigmoid:

-logsigmoid
+log_sigmoid
 -------------------------------

-.. py:function:: paddle.nn.functional.logsigmoid(x, name=None)
+.. py:function:: paddle.nn.functional.log_sigmoid(x, name=None)

-logsigmoid activation layer. Computed as:
+log_sigmoid activation layer. Computed as:

 .. math::

-    logsigmoid(x) = \log \frac{1}{1 + e^{-x}}
+    log\_sigmoid(x) = \log \frac{1}{1 + e^{-x}}

 where :math:`x` is the input Tensor.

@@ -29,9 +29,8 @@ log_sigmoid activation layer. Computed as:
     import paddle
     import paddle.nn.functional as F
-    import numpy as np

     paddle.disable_static()
-    x = paddle.to_tensor(np.array([1.0, 2.0, 3.0, 4.0]))
-    out = F.logsigmoid(x) # [-0.313262 -0.126928 -0.0485874 -0.0181499]
+    x = paddle.to_tensor([1.0, 2.0, 3.0, 4.0])
+    out = F.log_sigmoid(x) # [-0.313262 -0.126928 -0.0485874 -0.0181499]
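As a plain-Python sanity check on the values in the example above (independent of Paddle, using only the standard library), log_sigmoid can be evaluated directly from the formula:

```python
import math

def log_sigmoid(x):
    # log(1 / (1 + e^{-x})), rewritten as -log1p(e^{-x}) for numerical stability
    return -math.log1p(math.exp(-x))

xs = [1.0, 2.0, 3.0, 4.0]
out = [log_sigmoid(x) for x in xs]
print(out)  # approximately [-0.313262, -0.126928, -0.0485874, -0.0181499]
```

These match the outputs listed in the docstring example to the printed precision.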
@@ -5,7 +5,7 @@ paddle.nn.functional.loss.l1_loss	paddle.nn.functional.l1_loss
 paddle.nn.functional.loss.margin_ranking_loss	paddle.nn.functional.margin_ranking_loss
 paddle.nn.layer.pooling.AdaptiveAvgPool3d	paddle.nn.AdaptiveAvgPool3d,paddle.nn.layer.AdaptiveAvgPool3d
 paddle.nn.functional.common.alpha_dropout	paddle.nn.functional.alpha_dropout
-paddle.nn.functional.activation.logsigmoid	paddle.nn.functional.logsigmoid
+paddle.nn.functional.activation.log_sigmoid	paddle.nn.functional.log_sigmoid
 paddle.fluid.executor.Executor	paddle.static.Executor
 paddle.nn.functional.pooling.avg_pool2d	paddle.nn.functional.avg_pool2d
 paddle.fluid.dygraph.checkpoint.load_dygraph	paddle.load,paddle.framework.load
......
@@ -398,7 +398,7 @@ Decoder	.. _api_paddle_fluid_layers_Decoder:
 array_read	.. _api_paddle_fluid_layers_array_read:
 floor_divide	.. _api_paddle_tensor_math_floor_divide:
 floor_mod	.. _api_paddle_tensor_math_floor_mod:
-logsigmoid	.. _api_paddle_nn_functional_logsigmoid:
+log_sigmoid	.. _api_paddle_nn_functional_log_sigmoid:
 generate_mask_labels	.. _api_paddle_fluid_layers_generate_mask_labels:
 square	.. _api_paddle_fluid_layers_square:
 reset_profiler	.. _api_paddle_fluid_profiler_reset_profiler:
......
-.. _cn_api_nn_cn_logsigmoid:
+.. _cn_api_nn_cn_log_sigmoid:

-logsigmoid
+log_sigmoid
 -------------------------------

-.. py:function:: paddle.nn.functional.logsigmoid(x, name=None)
+.. py:function:: paddle.nn.functional.log_sigmoid(x, name=None)

-logsigmoid activation layer. Computed as:
+log_sigmoid activation layer. Computed as:

 .. math::

-    logsigmoid(x) = \log \frac{1}{1 + e^{-x}}
+    log\_sigmoid(x) = \log \frac{1}{1 + e^{-x}}

 where :math:`x` is the input Tensor.

@@ -29,9 +29,8 @@ log_sigmoid activation layer. Computed as:
     import paddle
     import paddle.nn.functional as F
-    import numpy as np

     paddle.disable_static()
-    x = paddle.to_tensor(np.array([1.0, 2.0, 3.0, 4.0]))
-    out = F.logsigmoid(x) # [-0.313262 -0.126928 -0.0485874 -0.0181499]
+    x = paddle.to_tensor([1.0, 2.0, 3.0, 4.0])
+    out = F.log_sigmoid(x) # [-0.313262 -0.126928 -0.0485874 -0.0181499]