Activation
Abs

class paddle.v2.activation.Abs

Abs Activation.

Forward: \(f(z) = abs(z)\)

Derivative:

\[\begin{split}1 &\quad if \quad z > 0 \\ -1 &\quad if \quad z < 0 \\ 0 &\quad if \quad z = 0\end{split}\]
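In the v2 API an activation object is not applied to data directly; it is attached to a layer. A minimal sketch of how Abs (or any class on this page) might be used, assuming the paddle.v2.layer.data / paddle.v2.layer.fc configuration style; the layer names and sizes below are illustrative only:

    import paddle.v2 as paddle

    # Hypothetical 128-dimensional dense input; names are illustrative.
    data = paddle.layer.data(name='feature',
                             type=paddle.data_type.dense_vector(128))

    # The activation is passed to the layer through the `act` argument.
    hidden = paddle.layer.fc(input=data, size=64,
                             act=paddle.activation.Abs())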
Exp

class paddle.v2.activation.Exp

Exponential Activation.

\[f(z) = e^z\]
Identity

paddle.v2.activation.Identity

alias of Linear
Linear

class paddle.v2.activation.Linear

Identity Activation.

The output is passed through unchanged; both the forward and backward passes do nothing.
Log

class paddle.v2.activation.Log

Logarithm Activation.

\[f(z) = \log(z)\]
Square

class paddle.v2.activation.Square

Square Activation.

\[f(z) = z^2\]
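Exp, Log, and Square above are all simple element-wise maps. A minimal NumPy sketch that restates their forward formulas in code (illustrative only, not the Paddle implementation):

    import numpy as np

    z = np.array([0.5, 1.0, 2.0])

    exp_out = np.exp(z)      # Exp:    f(z) = e^z
    log_out = np.log(z)      # Log:    f(z) = log(z), defined for z > 0
    square_out = z ** 2      # Square: f(z) = z^2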
Sigmoid

class paddle.v2.activation.Sigmoid

Sigmoid activation.

\[f(z) = \frac{1}{1 + e^{-z}}\]
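A minimal NumPy sketch of the sigmoid formula above (illustrative only, not Paddle's implementation):

    import numpy as np

    def sigmoid(z):
        # f(z) = 1 / (1 + exp(-z))
        return 1.0 / (1.0 + np.exp(-z))

    print(sigmoid(np.array([-2.0, 0.0, 2.0])))  # ~[0.119, 0.5, 0.881]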
Softmax

class paddle.v2.activation.Softmax

Softmax activation for a simple (non-sequence) input.

\[P(y=j|x) = \frac{e^{x_j}}{\sum^K_{k=1} e^{x_k}}\]
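A minimal NumPy sketch of this softmax over a single input vector, with the usual max-subtraction for numerical stability (illustrative only):

    import numpy as np

    def softmax(x):
        # P(y=j|x) = exp(x_j) / sum_k exp(x_k)
        e = np.exp(x - x.max())      # subtract the max for numerical stability
        return e / e.sum()

    print(softmax(np.array([1.0, 2.0, 3.0])))  # ~[0.090, 0.245, 0.665]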
SequenceSoftmax

class paddle.v2.activation.SequenceSoftmax

Softmax activation for one sequence. The dimension of the input feature must be 1, and the input must be a sequence.

    result = softmax([each_feature_vector[0] for each_feature_vector in input_feature])
    for i in range(len(output)):
        output[i] = result[i]
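A runnable NumPy sketch of the same idea: the scalar feature of each time step in one sequence is gathered and the softmax is taken across the time steps (illustrative only):

    import numpy as np

    def sequence_softmax(input_feature):
        # input_feature: one sequence, each time step holding a single scalar feature
        z = np.array([step[0] for step in input_feature])
        e = np.exp(z - z.max())              # softmax across the time steps
        result = e / e.sum()
        return [[v] for v in result]         # one value per time step

    print(sequence_softmax([[1.0], [2.0], [3.0]]))  # ~[[0.090], [0.245], [0.665]]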
Relu

class paddle.v2.activation.Relu

Relu activation.

Forward: \(y = max(0, z)\)

Derivative:

\[\begin{split}1 &\quad if \quad z > 0 \\ 0 &\quad \mathrm{otherwise}\end{split}\]
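A minimal NumPy sketch of the Relu forward pass and its derivative as given above (illustrative only):

    import numpy as np

    def relu(z):
        return np.maximum(0.0, z)            # y = max(0, z)

    def relu_grad(z):
        return (z > 0).astype(z.dtype)       # 1 if z > 0, 0 otherwise

    z = np.array([-1.5, 0.0, 2.0])
    print(relu(z))        # [0. 0. 2.]
    print(relu_grad(z))   # [0. 0. 1.]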
BRelu

class paddle.v2.activation.BRelu

BRelu Activation.

Forward: \(y = min(24, max(0, z))\)

Derivative:

\[\begin{split}1 &\quad if \quad 0 < z < 24 \\ 0 &\quad \mathrm{otherwise}\end{split}\]
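A minimal NumPy sketch of BRelu, which clips the Relu output at 24 as in the forward formula above (illustrative only):

    import numpy as np

    def brelu(z, upper=24.0):
        # y = min(24, max(0, z))
        return np.minimum(upper, np.maximum(0.0, z))

    print(brelu(np.array([-3.0, 5.0, 30.0])))  # [ 0.  5. 24.]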
SoftRelu

class paddle.v2.activation.SoftRelu

SoftRelu Activation.
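This page does not state SoftRelu's formula; in most frameworks this kind of activation is the softplus, \(f(z) = \ln(1 + e^z)\). A minimal NumPy sketch under that assumption (the exact Paddle definition is not confirmed here):

    import numpy as np

    def soft_relu(z):
        # Assumed softplus form: f(z) = ln(1 + e^z); log1p improves stability.
        return np.log1p(np.exp(z))

    print(soft_relu(np.array([-2.0, 0.0, 2.0])))  # ~[0.127, 0.693, 2.127]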
Tanh

class paddle.v2.activation.Tanh

Tanh activation.

\[f(z) = \tanh(z) = \frac{e^z - e^{-z}}{e^z + e^{-z}}\]
STanh

class paddle.v2.activation.STanh

Scaled Tanh Activation.

\[f(z) = 1.7159 \tanh\left(\frac{2}{3} z\right)\]
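A minimal NumPy sketch of the scaled tanh above, reusing the plain Tanh from the previous entry; note the scaling constants make \(f(1) \approx 1\) (illustrative only):

    import numpy as np

    def stanh(z):
        # f(z) = 1.7159 * tanh(2/3 * z)
        return 1.7159 * np.tanh(2.0 / 3.0 * z)

    print(stanh(np.array([-1.0, 0.0, 1.0])))  # ~[-1.0, 0.0, 1.0]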