error when using "recurrent_group"
Created by: CrossLee1
I made a test of the `recurrent_group` layer following this. My net config is as follows:
```python
input = data_layer(name='feat', size=feat_size)
label = data_layer(name='label', size=label_size)

def step_test(input):
    output = fc_layer(input=input, size=1024, act=LinearActivation(), bias_attr=False)
    return output

decoder = recurrent_group(name='group', step=step_test, input=[StaticInput(input=input, is_seq=True)])
last = last_seq(decoder)
output = fc_layer(input=last, size=label_size, name='output', bias_attr=bias_attr, act=SigmoidActivation())
```
and my slots are:

```python
obj.slots = [DenseSlot(obj.ftr_dim), SparseNonValueSlot(obj.label_size)]
```
When I begin training, this error occurs:
```
Current Layer forward/backward stack is
    LayerName: group
    LayerName: label
    LayerName: feat
*** Aborted at 1478511490 (unix time) try "date -d @1478511490" if you are using GNU date ***
PC: @ 0x93a62d paddle::RecurrentGradientMachine::forward()
*** SIGSEGV (@0x8) received by PID 29766 (TID 0x7f44921fd700) from PID 8; stack trace: ***
    @ 0x7f44b74c3160 (unknown)
    @ 0x93a62d paddle::RecurrentGradientMachine::forward()
    @ 0x92fde3 paddle::RecurrentLayerGroup::forward()
    @ 0x9103ac paddle::NeuralNetwork::forward()
    @ 0x90162f paddle::TrainerThread::forward()
    @ 0x902835 paddle::TrainerThread::computeThread()
    @ 0x7f44b6d5a8a0 execute_native_thread_routine
    @ 0x7f44b74bb1c3 start_thread
    @ 0x7f44b67ce12d __clone
```
What is the problem? Thanks~~