Unverified commit 1fb4d90b authored by: D Dandelight committed by: GitHub

Add description to `nn.functional.celu` (#48074)

Parent 3b18d96b
@@ -35,17 +35,19 @@ def celu(x, alpha=1.0, name=None):
     r"""
     celu activation.
+    Apply the following operation to each element of the input Tensor according to the `Continuously Differentiable Exponential Linear Units <https://arxiv.org/abs/1704.07483>`_.

     .. math::

-        celu(x) = max(0, x) + min(0, \alpha * (e^{x/\alpha}-1))
+        \operatorname{celu}(x) = \max(0, x) + \min(0, \alpha * (\mathrm{e}^{x/\alpha}-1))

     Parameters:
-        x (Tensor): The input Tensor with data type float32, float64.
+        x (Tensor): The input Tensor with data type float16, float32, or float64.
-        alpha (float, optional): The 'alpha' value of the CELU formulation. Default is 1.0.
+        alpha (float, optional): The 'alpha' value of the CELU formula. Default is 1.0.
         name (str, optional): For details, please refer to :ref:`api_guide_Name`. Generally, no setting is required. Default: None.

     Returns:
-        A Tensor with the same data type and shape as ``x`` .
+        A ``Tensor`` with the same data type and shape as ``x`` .

     Examples:
         .. code-block:: python
......
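The CELU formula documented in the diff above can be sanity-checked with a small NumPy sketch. This is not Paddle's implementation, just an elementwise restatement of the docstring's math; the function name `celu` and the use of NumPy here are illustrative assumptions:

```python
import numpy as np

def celu(x, alpha=1.0):
    # celu(x) = max(0, x) + min(0, alpha * (e^(x/alpha) - 1)),
    # matching the formula in the docstring; expm1 computes e^t - 1 stably.
    x = np.asarray(x, dtype=np.float64)
    return np.maximum(0.0, x) + np.minimum(0.0, alpha * np.expm1(x / alpha))

# Positive inputs pass through unchanged; negative inputs
# saturate smoothly toward -alpha.
print(celu([-2.0, -1.0, 0.0, 1.0]))
print(celu([-2.0, -1.0, 0.0, 1.0], alpha=0.5))
```

For `x >= 0` the second term is zero, so `celu` is the identity there; for `x < 0` it decays smoothly to `-alpha`, which is what makes the unit continuously differentiable at 0.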