Activations
Activations supported by Paddle.
Each activation inherits from BaseActivation, which has two attributes:
- name: the activation's name in the paddle config.
- support_hppl: True if the activation is supported by hppl. The lstm layer can only use activations supported by hppl (the name hppl will be revised later).
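Activation classes are not called directly; an instance is passed to a layer's act argument in the model config. A minimal usage sketch, assuming the trainer_config_helpers layer API (data_layer and fc_layer with an act keyword, documented elsewhere in this reference):

    from paddle.trainer_config_helpers import *

    # 784-dimensional dense input vector.
    img = data_layer(name='pixel', size=784)

    # Hidden layer using the TanhActivation class described below.
    hidden = fc_layer(input=img, size=128, act=TanhActivation())

    # Output layer; SoftmaxActivation normalizes the 10 scores to probabilities.
    prediction = fc_layer(input=hidden, size=10, act=SoftmaxActivation())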
 
class paddle.trainer_config_helpers.activations.BaseActivation(name, support_hppl)
    Marker base class for activation classes.

class paddle.trainer_config_helpers.activations.TanhActivation
    Tanh activation.

    \[f(z) = \tanh(z) = \frac{e^z - e^{-z}}{e^z + e^{-z}}\]

class paddle.trainer_config_helpers.activations.SigmoidActivation
    Sigmoid activation.

    \[f(z) = \frac{1}{1 + \exp(-z)}\]
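As a quick numerical illustration of the two formulas above (NumPy only; this is not the Paddle implementation):

    import numpy as np

    z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])

    # Tanh written out as in the formula above; matches np.tanh.
    tanh_z = (np.exp(z) - np.exp(-z)) / (np.exp(z) + np.exp(-z))
    assert np.allclose(tanh_z, np.tanh(z))

    # Sigmoid: squashes any real z into (0, 1).
    sigmoid_z = 1.0 / (1.0 + np.exp(-z))
    # The two are related by sigmoid(z) = (tanh(z / 2) + 1) / 2.
    assert np.allclose(sigmoid_z, (np.tanh(z / 2.0) + 1.0) / 2.0)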

class paddle.trainer_config_helpers.activations.SoftmaxActivation
    Softmax activation for a plain (non-sequence) input.

    \[P(y=j|x) = \frac{e^{x_j}}{\sum_{k=1}^{K} e^{x_k}}\]
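A minimal NumPy sketch of the softmax formula (illustration only, not the Paddle kernel); subtracting the maximum is the usual trick to avoid overflow and does not change the result:

    import numpy as np

    def softmax(x):
        # exp(x_j) / sum_k exp(x_k), computed in a numerically stable way.
        e = np.exp(x - np.max(x))
        return e / np.sum(e)

    scores = np.array([1.0, 2.0, 3.0])
    probs = softmax(scores)
    assert np.isclose(probs.sum(), 1.0)   # a valid probability distribution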

class paddle.trainer_config_helpers.activations.SequenceSoftmaxActivation
    Softmax activation applied across one sequence. The input must be a sequence whose feature dimension is 1; the softmax is taken over its time steps:

        result = softmax([each_feature_vector[0] for each_feature_vector in input_feature])
        for i, each_time_step_output in enumerate(output):
            each_time_step_output = result[i]
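In other words, the single feature of every time step is collected into one vector, a softmax is taken over the time dimension, and each step's output is the corresponding probability. A NumPy sketch of that pseudo code (hypothetical helper, not the Paddle implementation):

    import numpy as np

    def sequence_softmax(input_feature):
        # input_feature: list of 1-dimensional feature vectors, one per time step.
        scores = np.array([each_feature_vector[0] for each_feature_vector in input_feature])
        e = np.exp(scores - np.max(scores))
        result = e / np.sum(e)
        # One scalar output per time step, summing to 1 across the sequence.
        return [np.array([result[i]]) for i in range(len(input_feature))]

    seq = [np.array([0.5]), np.array([1.5]), np.array([0.1])]
    out = sequence_softmax(seq)
    assert np.isclose(sum(o[0] for o in out), 1.0)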

class paddle.trainer_config_helpers.activations.IdentityActivation
    Identity activation. The output equals the input; nothing is done in either the forward or the backward pass.

paddle.trainer_config_helpers.activations.LinearActivation
    alias of IdentityActivation

class paddle.trainer_config_helpers.activations.ReluActivation
    Relu activation.

    Forward: \(y = \max(0, z)\)

    Derivative:

    \[\frac{dy}{dz} = \begin{cases}1 &\quad \text{if } z > 0 \\ 0 &\quad \text{otherwise}\end{cases}\]
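A NumPy sketch of the forward pass and its derivative (illustration only):

    import numpy as np

    z = np.array([-3.0, -0.1, 0.0, 0.1, 3.0])

    y = np.maximum(0.0, z)              # forward: max(0, z)
    dy_dz = (z > 0).astype(z.dtype)     # derivative: 1 where z > 0, else 0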

class paddle.trainer_config_helpers.activations.BReluActivation
    BRelu activation.

    Forward: \(y = \min(24, \max(0, z))\)

    Derivative:

    \[\frac{dy}{dz} = \begin{cases}1 &\quad \text{if } 0 < z < 24 \\ 0 &\quad \text{otherwise}\end{cases}\]
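BRelu is Relu with an additional upper clip at 24; in NumPy terms (illustration only):

    import numpy as np

    z = np.array([-5.0, 3.0, 30.0])

    y = np.clip(z, 0.0, 24.0)                        # min(24, max(0, z))
    dy_dz = ((z > 0) & (z < 24)).astype(z.dtype)     # 1 only inside (0, 24)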

class paddle.trainer_config_helpers.activations.SoftReluActivation
    SoftRelu activation.

class paddle.trainer_config_helpers.activations.STanhActivation
    Scaled Tanh activation.

    \[f(z) = 1.7159 \tanh\left(\frac{2}{3} z\right)\]

class paddle.trainer_config_helpers.activations.AbsActivation
    Abs activation.

    Forward: \(f(z) = |z|\)

    Derivative:

    \[\frac{df}{dz} = \begin{cases}1 &\quad \text{if } z > 0 \\ -1 &\quad \text{if } z < 0 \\ 0 &\quad \text{if } z = 0\end{cases}\]
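The derivative above is simply the sign of the input, so a NumPy sketch is (illustration only):

    import numpy as np

    z = np.array([-2.0, 0.0, 2.0])

    y = np.abs(z)          # forward: |z|
    dy_dz = np.sign(z)     # -1, 0, 1 for negative, zero, positive inputs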

class paddle.trainer_config_helpers.activations.SquareActivation
    Square activation.

    \[f(z) = z^2\]