Dangerous untested layer forward
Created by: pengwangucla
I recently implemented a LeakyRelu layer, and I have a wrong implementation here in BaseMatrix.cu:
```cpp
DEFINE_MATRIX_BINARY_PARAMETER_OP(LeakyRelu, ONE_PARAMETER,
                                  b = a >= 0.0f ? a : p);
template<class T>
void BaseMatrixT<T>::leakyRelu(BaseMatrixT& b, T p) {
  applyBinary(binary::LeakyRelu<T>(p), b);
}

DEFINE_MATRIX_BINARY_PARAMETER_OP(LeakyReluDerivative, ONE_PARAMETER,
                                  a *= (b > 0.0f ? 1.0f : p));
template<class T>
void BaseMatrixT<T>::leakyReluDerivative(BaseMatrixT& b, T p) {
  applyBinary(binary::LeakyReluDerivative<T>(p), b);
}
```
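For comparison, this is what I believe the correct scalar behaviour should be (a standalone C++ sketch, not the BaseMatrix.cu macro form; `leakyReluRef` and `leakyReluGradRef` are names I made up for illustration):

```cpp
#include <cassert>
#include <cmath>

// Reference LeakyRelu (what the op should compute):
//   b = a       for a >= 0
//   b = a * p   for a <  0
float leakyReluRef(float a, float p) {
  return a >= 0.0f ? a : a * p;
}

// Reference derivative db/da, conditioned on the sign of the input.
float leakyReluGradRef(float a, float p) {
  return a >= 0.0f ? 1.0f : p;
}

int main() {
  const float p = 0.01f;
  // The forward in the snippet above returns the constant p for any
  // negative input, whereas the reference returns a * p.
  assert(std::fabs(leakyReluRef(-2.0f, p) - (-0.02f)) < 1e-6f);
  assert(std::fabs(leakyReluGradRef(-2.0f, p) - p) < 1e-6f);
  return 0;
}
```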
Notice that the backward formula by itself is the usual LeakyRelu derivative, but the forward value is wrong for a < 0: it outputs the constant p instead of a * p. Because that makes b positive, the gradient actually applied in backward is also wrong for inputs whose true output would be b < 0 (it multiplies by 1.0 instead of p, and it does not match the zero numerical derivative of this buggy forward either).
However, this layer passes test_ActivationGrad. What is the reason for that?
Should there also be a dedicated forward test that checks the forward values as well?
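To make the question concrete, below is the kind of check I have in mind, written as a standalone sketch rather than against the real test_ActivationGrad harness (the scalar helpers `forwardBuggy`, `gradBuggy`, and `forwardRef` are my own names): it compares the layer's forward values against a reference LeakyRelu, and also compares the analytical gradient against a central finite difference of the layer's own forward.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Buggy forward/backward as in the snippet above (scalar form, for illustration).
float forwardBuggy(float a, float p) { return a >= 0.0f ? a : p; }
float gradBuggy(float b, float p)    { return b > 0.0f ? 1.0f : p; }

// Reference forward (what LeakyRelu should compute).
float forwardRef(float a, float p)   { return a >= 0.0f ? a : a * p; }

int main() {
  const float p = 0.01f, eps = 1e-3f;
  std::vector<float> inputs = {-2.0f, -0.5f, 0.5f, 2.0f};

  for (float a : inputs) {
    // 1. Forward-value check: does the layer output match the reference?
    float out = forwardBuggy(a, p);
    float ref = forwardRef(a, p);
    if (std::fabs(out - ref) > 1e-5f)
      std::printf("forward mismatch at a=%g: got %g, expected %g\n", a, out, ref);

    // 2. Gradient check: analytical gradient vs. central finite difference
    //    of the layer's own forward.
    float analytic = gradBuggy(out, p);
    float numeric  = (forwardBuggy(a + eps, p) - forwardBuggy(a - eps, p)) / (2 * eps);
    if (std::fabs(analytic - numeric) > 1e-2f)
      std::printf("gradient mismatch at a=%g: analytic %g, numeric %g\n", a, analytic, numeric);
  }
  return 0;
}
```

With this sketch, both checks report mismatches for the negative inputs, which is why I suspect the current test never looks at the forward values at all.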