Unverified commit e647ac00, authored by Infinity_lee and committed by GitHub

fix englishdocs typo error in 8th2s (#49493)

Parent 69aae171
@@ -454,12 +454,14 @@ def leaky_relu(x, negative_slope=0.01, name=None):
def prelu(x, weight, data_format="NCHW", name=None):
"""
prelu activation.
prelu activation. The calculation formula is as follows:
.. math::
prelu(x) = max(0, x) + weight * min(0, x)
x and weight are input Tensors.
Parameters:
x (Tensor): The input Tensor with data type float32, float64.
weight (Tensor): The learnable parameter with data type same as ``x``.
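The formula added above matches the documented computation. A minimal usage sketch of `paddle.nn.functional.prelu` (the tensor values and the single shared slope of 0.25 are illustrative only):

```python
import paddle
import paddle.nn.functional as F

x = paddle.to_tensor([[-2.0, 3.0, -4.0, 5.0]])
w = paddle.to_tensor([0.25])   # one learnable slope shared by all channels
out = F.prelu(x, w)            # max(0, x) + 0.25 * min(0, x)
# out: [[-0.5, 3.0, -1.0, 5.0]]
```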
@@ -592,8 +594,8 @@ def rrelu(x, lower=1.0 / 8.0, upper=1.0 / 3.0, training=True, name=None):
Parameters:
x (Tensor): The input Tensor with data type float16, float32, float64.
lower (float, optional): The lower bound of uniform distribution. Default: 0.125.
upper (float, optional): The upper bound of uniform distribution. Default: 0.333.
lower (float, optional): The lower bound of uniform distribution. Default: 1.0/8.0.
upper (float, optional): The upper bound of uniform distribution. Default: 1.0/3.0.
training (bool, optional): Current mode is in training or others. Default is True.
name (str, optional): For details, please refer to :ref:`api_guide_Name`. Generally, no setting is required. Default: None.
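For reference, a hedged sketch of the documented defaults; with `training=False` the negative slope is not sampled but fixed at `(lower + upper) / 2`, so the result is deterministic (input values are illustrative):

```python
import paddle
import paddle.nn.functional as F

x = paddle.to_tensor([[-4.0, 2.0], [3.0, -1.0]])
# training=False -> every negative element uses slope (1/8 + 1/3) / 2
out = F.rrelu(x, lower=1.0 / 8.0, upper=1.0 / 3.0, training=False)
```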
@@ -673,12 +675,14 @@ def rrelu(x, lower=1.0 / 8.0, upper=1.0 / 3.0, training=True, name=None):
def relu(x, name=None):
"""
relu activation.
relu activation. The calculation formula is as follows:
.. math::
out = max(x, 0)
x is the input Tensor.
Parameters:
x (Tensor): The input Tensor with data type float32, float64.
name (str, optional): For details, please refer to :ref:`api_guide_Name`. Generally, no setting is required. Default: None.
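A minimal sketch of the functional form described by this docstring (example values are made up):

```python
import paddle
import paddle.nn.functional as F

out = F.relu(paddle.to_tensor([-2.0, 0.0, 1.0]))
# out: [0., 0., 1.]
```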
@@ -900,8 +904,8 @@ def selu(
Parameters:
x (Tensor): The input Tensor with data type float32, float64.
scale (float, optional): The value of scale(must be greater than 1.0) for selu. Default is 1.0507009873554804934193349852946
alpha (float, optional): The value of alpha(must be no less than zero) for selu. Default is 1.6732632423543772848170429916717
scale (float, optional): The value of scale(must be greater than 1.0) for selu. Default is 1.0507009873554804934193349852946.
alpha (float, optional): The value of alpha(must be no less than zero) for selu. Default is 1.6732632423543772848170429916717.
name (str, optional): For details, please refer to :ref:`api_guide_Name`. Generally, no setting is required. Default: None.
Returns:
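A short usage sketch assuming the default `scale` and `alpha` constants listed above (input values are illustrative):

```python
import paddle
import paddle.nn.functional as F

x = paddle.to_tensor([[0.0, 1.0], [2.0, 3.0]])
out = F.selu(x)   # uses the default scale and alpha constants
```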
@@ -356,12 +356,16 @@ class Hardtanh(Layer):
class PReLU(Layer):
"""
PReLU Activation.
PReLU Activation. The calculation formula is as follows:
.. math::
PReLU(x) = max(0, x) + weight * min(0, x)
x is the input Tensor.
Parameters:
num_parameters (int, optional): Number of `weight` to learn. The supported values are:
1 - a single parameter `alpha` is used for all input channels;
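A minimal sketch of the layer form, assuming the single-parameter mode described above (`num_parameters=1`, initial slope 0.25; the input tensor is illustrative):

```python
import paddle

m = paddle.nn.PReLU(num_parameters=1, init=0.25)
y = m(paddle.to_tensor([[-1.0, 2.0, -3.0, 4.0]]))
# negative inputs are scaled by the learnable weight (initialised to 0.25)
```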
@@ -479,8 +483,8 @@ class RReLU(Layer):
:math:`lower` and :math:`upper` are the bounds of uniform distribution.
Parameters:
lower (float, optional): The lower bound of uniform distribution. Default: 0.125.
upper (float, optional): The upper bound of uniform distribution. Default: 0.333.
lower (float, optional): The lower bound of uniform distribution. Default: 1.0/8.0.
upper (float, optional): The upper bound of uniform distribution. Default: 1.0/3.0.
name (str, optional): Name for the operation (optional, default is None).
For more information, please refer to :ref:`api_guide_Name`.
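A hedged layer-form sketch of the same defaults; in evaluation mode the slope is fixed at `(lower + upper) / 2` (input values are illustrative):

```python
import paddle

m = paddle.nn.RReLU(lower=1.0 / 8.0, upper=1.0 / 3.0)
m.eval()   # outside training the slope is deterministic
y = m(paddle.to_tensor([[-4.0, 2.0], [3.0, -1.0]]))
```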
@@ -531,12 +535,14 @@ class RReLU(Layer):
class ReLU(Layer):
"""
ReLU Activation.
ReLU Activation. The calculation formula is as follows:
.. math::
ReLU(x) = max(x, 0)
x is the input Tensor.
Parameters:
name (str, optional): Name for the operation (optional, default is None).
For more information, please refer to :ref:`api_guide_Name`.
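The layer form mirrors the functional one; a minimal sketch (example values only):

```python
import paddle

m = paddle.nn.ReLU()
y = m(paddle.to_tensor([-2.0, 0.0, 1.0]))   # -> [0., 0., 1.]
```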
@@ -577,6 +583,8 @@ class ReLU6(Layer):
ReLU6(x) = min(max(0,x), 6)
x is the input Tensor.
Parameters:
name (str, optional): Name for the operation (optional, default is None).
For more information, please refer to :ref:`api_guide_Name`.
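A short sketch of the clipping behaviour given by `min(max(0, x), 6)` (input values are illustrative):

```python
import paddle

m = paddle.nn.ReLU6()
y = m(paddle.to_tensor([-1.0, 0.3, 6.5]))   # -> [0., 0.3, 6.]
```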
@@ -624,8 +632,8 @@ class SELU(Layer):
\right.
Parameters:
scale (float, optional): The value of scale(must be greater than 1.0) for SELU. Default is 1.0507009873554804934193349852946
alpha (float, optional): The value of alpha(must be no less than zero) for SELU. Default is 1.6732632423543772848170429916717
scale (float, optional): The value of scale(must be greater than 1.0) for SELU. Default is 1.0507009873554804934193349852946.
alpha (float, optional): The value of alpha(must be no less than zero) for SELU. Default is 1.6732632423543772848170429916717.
name (str, optional): Name for the operation (optional, default is None).
For more information, please refer to :ref:`api_guide_Name`.
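A minimal layer-form sketch assuming the default `scale` and `alpha` constants documented above (input values are illustrative):

```python
import paddle

m = paddle.nn.SELU()   # default scale and alpha from the docstring above
y = m(paddle.to_tensor([[0.0, 1.0], [2.0, 3.0]]))
```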