Backward error for FeatureMapExpandLayer in PaddlePaddle
Created by: yangyi02
Hi all,
I defined my own constant layer in PaddlePaddle and use FeatureMapExpandLayer on top of it for some learning algorithms. The constant layer I defined does not need backpropagation.
When I run training, PaddlePaddle gives a segmentation fault.
The reason I found is that getInputGrad(0) in FeatureMapExpandLayer returns an empty pointer (because the constant layer does not need backpropagation), so `int imgSize = inGrad->getWidth();` dereferences a null pointer and causes the segmentation fault.
Below is the backward code in FeatureMapExpandLayer.cpp:

```cpp
void FeatureMapExpandLayer::backward(const UpdateCallback& callback) {
  LOG(INFO) << "I am here";
  MatrixPtr inGrad = getInputGrad(0);
  MatrixPtr outGrad = getOutputGrad();
  size_t batchSize = getInput(0).getBatchSize();
  int imgSize = inGrad->getWidth();
  {
    AsyncGpuBlock asyncGpuBlock;
    for (size_t i = 0; i < batchSize; i++) {
      MatrixPtr outGradTmp =
          Matrix::create(outGrad->getData() + i * imgSize * numFilters_,
                         numFilters_,
                         imgSize,
                         false,
                         useGpu_);
      MatrixPtr inGradTmp = Matrix::create(
          inGrad->getData() + i * imgSize, 1, imgSize, false, useGpu_);
      inGradTmp->collectBias(outGradTmp, 1);
    }
  }
  /* Do derivation */ {
    REGISTER_TIMER_INFO("BpAvtTimer", getName().c_str());
    backwardActivation();
  }
}
```
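A possible workaround, sketched below from the snippet above rather than taken from an official patch, is to guard the gradient propagation with a null check on getInputGrad(0), so the per-sample copy is skipped when the preceding layer (here, my constant layer) allocates no gradient. Whether backwardActivation() should still run in that case is my assumption.

```cpp
void FeatureMapExpandLayer::backward(const UpdateCallback& callback) {
  MatrixPtr inGrad = getInputGrad(0);
  // The constant input layer does not need backpropagation, so its gradient
  // matrix is never allocated. Skip the gradient copy instead of
  // dereferencing a null pointer. (Sketch only; not an official fix.)
  if (inGrad) {
    MatrixPtr outGrad = getOutputGrad();
    size_t batchSize = getInput(0).getBatchSize();
    int imgSize = inGrad->getWidth();
    AsyncGpuBlock asyncGpuBlock;
    for (size_t i = 0; i < batchSize; i++) {
      MatrixPtr outGradTmp =
          Matrix::create(outGrad->getData() + i * imgSize * numFilters_,
                         numFilters_,
                         imgSize,
                         false,
                         useGpu_);
      MatrixPtr inGradTmp = Matrix::create(
          inGrad->getData() + i * imgSize, 1, imgSize, false, useGpu_);
      inGradTmp->collectBias(outGradTmp, 1);
    }
  }
  /* Do derivation */ {
    REGISTER_TIMER_INFO("BpAvtTimer", getName().c_str());
    backwardActivation();
  }
}
```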