Activations

namespace paddle
class ActivationFunction

Subclassed by paddle::IdentityActivation

Public Functions

ActivationFunction()
virtual ~ActivationFunction()
virtual void forward(Argument &act) = 0
virtual void backward(Argument &act) = 0
virtual const std::string &getName() const = 0

Public Static Functions

ActivationFunction *create(const std::string &type)
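For orientation, a minimal usage sketch of this factory follows. It assumes create() returns a caller-owned heap instance and that the Argument carries the value and gradient buffers that forward()/backward() update in place; neither detail is spelled out in this reference.

    #include <memory>

    // Hypothetical usage -- only create/forward/backward/getName come from
    // this reference; ownership and the Argument contents are assumptions.
    std::unique_ptr<paddle::ActivationFunction> act(
        paddle::ActivationFunction::create("sigmoid"));
    paddle::Argument arg;  // filled by the layer with pre-activation values
    act->forward(arg);     // overwrite the stored value with f(z) in place
    act->backward(arg);    // fold f'(z) into the stored gradient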

Defines

ACTIVATION_CLASS_NAME(ACTIVATION_NAME)
BEGIN_DEFINE_ACTIVATION(ACTIVATION_NAME)
END_DEFINE_ACTIVATION(ACTIVATION_NAME)
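These macros are presumably consumed as follows to define one concrete activation: BEGIN_DEFINE_ACTIVATION opens a subclass named via ACTIVATION_CLASS_NAME, and END_DEFINE_ACTIVATION closes it and registers it under ACTIVATION_NAME. The expansion is not shown in this reference, so this sketch is an assumption based on the interface above.

    // Hypothetical sketch of how the macros are used.
    BEGIN_DEFINE_ACTIVATION(sigmoid)
    void forward(Argument& act) {
      // apply f(z) = 1 / (1 + exp(-z)) elementwise to the stored value
    }
    void backward(Argument& act) {
      // scale the stored gradient by f'(z) = f(z) * (1 - f(z))
    }
    END_DEFINE_ACTIVATION(sigmoid)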
namespace paddle

Functions

void forward(Argument &act)

SigmoidActivation

f(z) = \frac{1}{1 + \exp(-z)}
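For reference, the standard derivative of the sigmoid is f'(z) = f(z)(1 - f(z)).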

Relu Activation.

Forward: y = max(0, z)

Derivative of relu:

1 if z > 0

0 otherwise.
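As a standalone illustration of the rule above, here is the same forward/backward pair over a plain float array rather than Paddle's matrix types:

    #include <algorithm>
    #include <cstdio>

    // relu forward: y = max(0, z), applied elementwise.
    void reluForward(float* y, const float* z, int n) {
      for (int i = 0; i < n; ++i) y[i] = std::max(0.0f, z[i]);
    }

    // relu backward: scale the incoming gradient dy by f'(z),
    // which is 1 where z > 0 and 0 otherwise.
    void reluBackward(float* dz, const float* dy, const float* z, int n) {
      for (int i = 0; i < n; ++i) dz[i] = (z[i] > 0.0f) ? dy[i] : 0.0f;
    }

    int main() {
      const float z[4] = {-1.5f, 0.0f, 0.5f, 2.0f};
      const float dy[4] = {1.0f, 1.0f, 1.0f, 1.0f};
      float y[4], dz[4];
      reluForward(y, z, 4);
      reluBackward(dz, dy, z, 4);
      for (int i = 0; i < 4; ++i)
        std::printf("z=%g  y=%g  dz=%g\n", z[i], y[i], dz[i]);
      return 0;
    }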

BRelu Activation.

Forward: y = min(24, max(0, z))

Derivative of brelu:

1 if 0 < z < 24

0 otherwise.

TODO(yuyang18): Remove magic number 24 or make it configurable.
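In the same standalone style as the relu sketch above, brelu only adds the upper clamp, using the hard-coded 24 flagged in the TODO:

    #include <algorithm>

    // brelu forward: clip at both ends; 24 is the magic number above.
    inline float brelu(float z) { return std::min(24.0f, std::max(0.0f, z)); }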

Tanh Activation.

f(z) = tanh(z) = \frac{e^z - e^{-z}}{e^z + e^{-z}}
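Its derivative is f'(z) = 1 - tanh^2(z) = 1 - f(z)^2.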

Soft Relu Activation.

f(z) = ln(1+e^z)
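The derivative of this soft relu (also known as softplus) is exactly the sigmoid: f'(z) = \frac{1}{1 + \exp(-z)}.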

Abs Activation.

Forward: f(z) = abs(z)

Derivative:

1 if z > 0

-1 if z < 0

0 if z = 0

Square Activation.

f(z) = z^2.
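Its derivative is f'(z) = 2z.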

void backward(Argument &act)
ACTIVATION_CLASS_NAME(softmax)

Softmax over all frames of one sequence. The width of each frame must be one.
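A standalone, numerically stable sketch of the per-sequence computation (Paddle operates on its own sequence/Argument structures; the plain vector here is purely illustrative):

    #include <algorithm>
    #include <cmath>
    #include <vector>

    // Softmax over the scalar frames of one sequence, subtracting the
    // maximum first so that exp() cannot overflow for large inputs.
    std::vector<float> sequenceSoftmax(const std::vector<float>& z) {
      float m = *std::max_element(z.begin(), z.end());
      std::vector<float> y(z.size());
      float sum = 0.0f;
      for (size_t i = 0; i < z.size(); ++i) sum += (y[i] = std::exp(z[i] - m));
      for (float& v : y) v /= sum;
      return y;
    }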

ACTIVATION_CLASS_NAME(stanh)

Variables

ClassRegistrar<ActivationFunction> gActivationRegistrar
InitFunction paddle::__reg_activation__identity([] {
  gActivationRegistrar.registerClass<IdentityActivation>("");
  gActivationRegistrar.registerClass<IdentityActivation>("linear");
})
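This initializer makes the empty type string and "linear" both aliases for the identity activation. ClassRegistrar and InitFunction are Paddle internals not documented here; the following standalone sketch shows the name-to-factory registry pattern they presumably implement:

    #include <functional>
    #include <map>
    #include <memory>
    #include <string>

    // Sketch of the registry behind gActivationRegistrar: static
    // initializers fill a name-to-factory map, and create() (the factory
    // documented above) looks instances up by name.
    template <class Base>
    class Registrar {
     public:
      template <class T>
      void registerClass(const std::string& name) {
        factories_[name] = [] { return std::unique_ptr<Base>(new T()); };
      }
      std::unique_ptr<Base> create(const std::string& name) const {
        auto it = factories_.find(name);
        if (it == factories_.end()) return nullptr;  // unknown name
        return it->second();
      }
     private:
      std::map<std::string, std::function<std::unique_ptr<Base>()>> factories_;
    };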
MatrixPtr sftMaxSum_

Do softmax activation for all samples. P(y=j|x) = \frac{e^{x^T w_j}}{\sum_{k=1}^{K} e^{x^T w_k}}

MatrixPtr sftMaxDot_
MatrixPtr one_
Argument argument_
real a

Scaled Tanh Activation

f(z) = 1.7159 * tanh(2/3*z)
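The constants appear to follow the scaling recommended by LeCun for efficient backprop (presumably held in the members a and b above): with a = 1.7159 and b = 2/3, f(1) = 1.7159 * tanh(2/3) ≈ 1.0, so unit-variance inputs stay in the high-gain region of the tanh.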

real b
class IdentityActivation

The IdentityActivation class.

Does nothing in either forward or backward.

Inherits from paddle::ActivationFunction

Public Functions

virtual void forward(Argument &act)
virtual void backward(Argument &act)
virtual const std::string &getName() const

Public Static Attributes

const std::string name