Why does a parameter size mismatch error occur during Paddle training?
Created by: joseph-chan
While training a simple three-layer autoencoder, I hit a problem with the sizes of the network's parameters. The full network definition is below (with input_layer_dim = 100 and hidden_layer_dim = 1024). Why does the parameter size mismatch error occur? The network structure is as follows:
input_attr = ParamAttr(name='input.w', initial_mean=0., sparse_update=True,
                       l2_rate=1e-3,
                       initial_std=1. / np.sqrt(self.input_layer_dim / 2.))
input_bias = ParamAttr(name='input.bias', initial_mean=0., initial_std=0.)
hidden_attr = ParamAttr(name='hidden.w', l2_rate=1e-3, initial_mean=0.,
                        initial_std=1. / np.sqrt(self.hidden_layer_dim / 2.))
hidden_bias = ParamAttr(name='input.bias', initial_mean=0., initial_std=0.)

self.aeinput = paddle.layer.data(
    name='ae_input',
    type=paddle.data_type.sparse_vector(self.input_layer_dim))

# hidden layer
hiddenlayer = paddle.layer.fc(input=self.aeinput,
                              size=self.hidden_layer_dim,
                              param_attr=input_attr,
                              bias_attr=input_bias,
                              act=paddle.activation.Softmax())
# output (reconstruction) layer
self.predict = paddle.layer.fc(input=hiddenlayer,
                               size=self.input_layer_dim,
                               param_attr=hidden_attr,
                               bias_attr=hidden_bias,
                               act=paddle.activation.Softmax())
The following error occurs:
351 Weight.cpp:28] Check failed: param->getSize() == width * height (1024 vs. 100)
*** Check failure stack trace: ***
@ 0x7f4d29ca2d7d google::LogMessage::Fail()
@ 0x7f4d29ca682c google::LogMessage::SendToLog()
@ 0x7f4d29ca2873 google::LogMessage::Flush()
@ 0x7f4d29ca7d3e google::LogMessageFatal::~LogMessageFatal()
@ 0x7f4d29b27b32 paddle::Weight::Weight()
@ 0x7f4d299fcd0a paddle::FullyConnectedLayer::init()
@ 0x7f4d29a60f8f paddle::NeuralNetwork::init()
@ 0x7f4d29a896d1 paddle::GradientMachine::create()
@ 0x7f4d29c7fca3 GradientMachine::createFromPaddleModelPtr()
@ 0x7f4d29c7fe7f GradientMachine::createByConfigProtoStr()
@ 0x7f4d2992fd3d _wrap_GradientMachine_createByConfigProtoStr
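A likely cause worth checking (an assumption, not confirmed from this log alone): in Paddle, ParamAttr objects that carry the same `name` share one underlying parameter, so every layer using that name must require exactly the same parameter size. In the code above, `hidden_bias` also uses the name 'input.bias', so the first fc layer wants that bias to have 1024 entries (hidden_layer_dim) while the second wants 100 (input_layer_dim), which matches the "(1024 vs. 100)" in the check failure. The following pure-Python sketch models that name-sharing rule; `register_param` is a hypothetical helper, not a Paddle API:

```python
def register_param(params, name, size):
    """Register a named parameter; on reuse, mimic Weight.cpp's size check."""
    if name in params:
        if params[name] != size:
            raise ValueError(
                "Check failed: param->getSize() == width * height "
                "(%d vs. %d)" % (params[name], size))
    else:
        params[name] = size
    return params

input_layer_dim, hidden_layer_dim = 100, 1024
params = {}

# First fc layer: bias named 'input.bias', one entry per output unit (1024).
register_param(params, 'input.bias', hidden_layer_dim)

# Second fc layer reuses the name 'input.bias' but needs only
# input_layer_dim (100) entries -> the same mismatch as in the log.
try:
    register_param(params, 'input.bias', input_layer_dim)
except ValueError as e:
    print(e)
```

If this is indeed the cause, giving the second bias its own name (e.g. 'hidden.bias') would make the sizes independent, since each name would then map to its own parameter.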