Activations¶
- class paddle::ActivationFunction¶
  An activation function is a function that transforms a set of input signals into an output signal. The purpose of the activation function is to introduce non-linearity into the network.
- Note
- Common activation functions are provided, including linear, sigmoid, softmax, sequence_max, relu, brelu, tanh, stanh, softrelu, abs, square, exponential.
Subclassed by paddle::IdentityActivation
Public Functions
- ActivationFunction()¶
- virtual ~ActivationFunction()¶
- virtual void forward(Argument &act) = 0¶
  Forward propagation.
  act.value <- f(act.value), where f is the activation function. Suppose that before calling forward(), act.value is x; after forward() returns, act.value is y. Then y = f(x).
  Usually, act is Layer::output_.
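  To make the in-place contract concrete, here is a minimal, self-contained sketch of a sigmoid forward pass. ToyArgument is a hypothetical stand-in for paddle::Argument (the real Argument holds matrices rather than plain vectors), so this illustrates the y = f(x) convention, not the actual PaddlePaddle implementation.

    #include <cmath>
    #include <vector>

    // Hypothetical stand-in for paddle::Argument; the real type holds matrices.
    struct ToyArgument {
        std::vector<float> value;  // activations, overwritten in place by forward()
        std::vector<float> grad;   // gradients, same shape as value
    };

    // forward(): act.value <- f(act.value), with f = sigmoid as an example.
    void sigmoidForward(ToyArgument &act) {
        for (float &v : act.value) {
            v = 1.0f / (1.0f + std::exp(-v));  // y = 1 / (1 + e^(-x))
        }
    }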
- virtual void backward(Argument &act) = 0¶
  Backward propagation.
  x and y are defined in the comment above for forward().
  - Before calling backward(), act.grad = dE / dy, where E is the error/cost.
  - After backward() returns, act.grad = dE / dx = (dE / dy) * (dy / dx).
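  Continuing the sigmoid sketch above, backward() scales the incoming gradient dE/dy by dy/dx. For the sigmoid, dy/dx = y * (1 - y), which can be read directly from act.value because forward() has already replaced x with y.

    // backward(): act.grad <- (dE/dy) * (dy/dx).
    // For sigmoid, dy/dx = y * (1 - y), where y is act.value after forward().
    void sigmoidBackward(ToyArgument &act) {
        for (std::size_t i = 0; i < act.grad.size(); ++i) {
            float y = act.value[i];
            act.grad[i] *= y * (1.0f - y);
        }
    }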
- virtual const std::string &getName() const = 0¶
Public Static Functions
- ActivationFunction *create(const std::string &type)¶
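  A brief usage sketch for the factory, assuming the type string is one of the names listed in the note above and that the caller owns the returned object (the ownership convention is an assumption, not stated in this reference):

    // Look up an activation by name; "sigmoid" is one of the provided types.
    paddle::ActivationFunction *act = paddle::ActivationFunction::create("sigmoid");
    act->forward(output);   // output is a paddle::Argument, typically Layer::output_
    // ... fill output.grad with dE/dy during the backward pass ...
    act->backward(output);
    delete act;             // assumption: caller is responsible for the returned pointer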