Unverified commit dbcef732, authored by hong19860320, committed by GitHub

Fix the formula of SELU/selu (#26675)

Parent c70bc3bb
...
@@ -644,7 +644,11 @@ def selu(x,

     .. math::

-        selu(x) = scale * (max(0,x) + min(0, alpha * (e^{x} - 1)))
+        selu(x) = scale *
+            \\begin{cases}
+                x, \\text{if } x > 0 \\\\
+                alpha * e^{x} - alpha, \\text{if } x <= 0
+            \\end{cases}

     Parameters:
         x (Tensor): The input Tensor with data type float32, float64.
...
...
@@ -552,7 +552,11 @@ class SELU(layers.Layer):

     .. math::

-        SELU(x) = scale * (max(0,x) + min(0, alpha * (e^{x} - 1)))
+        SELU(x) = scale *
+            \\begin{cases}
+                x, \\text{if } x > 0 \\\\
+                alpha * e^{x} - alpha, \\text{if } x <= 0
+            \\end{cases}

     Parameters:
         scale (float, optional): The value of scale for SELU. Default is 1.0507009873554804934193349852946
...
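The old and new formulas are mathematically equivalent: for `x > 0` the `min` term vanishes and `max(0, x) = x`, while for `x <= 0` the `max` term vanishes, leaving `alpha * (e^x - 1)`; the fix only rewrites the docstring in the clearer piecewise form. A minimal NumPy sketch of the corrected formula (not the Paddle kernel itself; the default `scale`/`alpha` constants are the ones shown in the docstring above):

```python
import numpy as np

# Default constants from the SELU docstring.
SCALE = 1.0507009873554804934193349852946
ALPHA = 1.6732632423543772848170429916717

def selu(x, scale=SCALE, alpha=ALPHA):
    """Piecewise SELU, matching the corrected docstring formula:
        x > 0  -> scale * x
        x <= 0 -> scale * (alpha * e^x - alpha)
    """
    x = np.asarray(x, dtype=np.float64)
    return np.where(x > 0, scale * x, scale * (alpha * np.exp(x) - alpha))

def selu_old_form(x, scale=SCALE, alpha=ALPHA):
    """The pre-fix docstring form, kept here only to show equivalence."""
    x = np.asarray(x, dtype=np.float64)
    return scale * (np.maximum(0.0, x) + np.minimum(0.0, alpha * (np.exp(x) - 1.0)))

xs = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
print(selu(xs))
```

Both forms agree elementwise, which is why the change is documentation-only.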