[Dimension error][LSTM] Misunderstanding about layer dimensions
Created by: 7633
What could this error mean?
F0228 16:24:54.667619 19048 Matrix.cpp:3167] Check failed: (size_t)lbl[i] < dim (4 vs. 4)
My network architecture:
from paddle.trainer_config_helpers import *

# Sequence input -> LSTM -> 4-way softmax classifier.
link_encode = data_layer(name='data', size=TERM_NUM)
lstm = simple_lstm(input=link_encode, size=emb_size)
score = fc_layer(input=lstm, size=4, act=SoftmaxActivation())

if is_predict:
    maxid = maxid_layer(score)
    outputs([maxid, score])
else:
    # Multi-task training.
    label = data_layer(name='label', size=4)
    cls = classification_cost(input=score, label=label)
    outputs(cls)
And this is the relevant part of initHook in my data provider:
settings.input_types = [
    dense_vector_sequence(TERM_NUM),
    integer_value(4)
]
The process function yields records like this: yield [input_data], int(speeds[i])
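
For context, here is roughly how the whole data provider is wired together. Only the initHook body and the yield structure are taken from my real provider; the file parsing, the TERM_NUM value, and the field layout below are simplified placeholders just to show how initHook and process connect through the @provider decorator:

from paddle.trainer.PyDataProvider2 import provider, dense_vector_sequence, integer_value

TERM_NUM = 100  # placeholder value; the real one comes from my config

def initHook(settings, **kwargs):
    # One dense-vector sequence per sample plus one integer class label.
    settings.input_types = [
        dense_vector_sequence(TERM_NUM),
        integer_value(4)
    ]

@provider(init_hook=initHook)
def process(settings, filename):
    with open(filename) as f:
        for line in f:
            # Placeholder parsing: the first TERM_NUM fields form one feature
            # vector, the last field is the label ("speed").
            values = [float(v) for v in line.split()]
            input_data = values[:TERM_NUM]
            speed = int(values[TERM_NUM])
            yield [input_data], speed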
I don't understand why and where this error occurs, because the dimension of the label layer is 4 and the dimension of the score layer is the same.
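
One thing I am not sure about: if the failed check compares each label value against the number of classes (lbl[i] < dim), then with integer_value(4) and a size-4 softmax the valid label values would be 0, 1, 2 and 3, and a raw label equal to 4 would produce exactly this "(4 vs. 4)" message even though the layer dimensions match. A quick standalone check over the raw labels (using the same placeholder file format as the provider sketch above) would confirm or rule that out:

def check_labels(filename, term_num, num_classes=4):
    # Collect (line number, label) pairs whose label falls outside [0, num_classes).
    bad = []
    with open(filename) as f:
        for line_no, line in enumerate(f, 1):
            values = line.split()
            label = int(float(values[term_num]))
            if not (0 <= label < num_classes):
                bad.append((line_no, label))
    return bad

print(check_labels('train_data.txt', 100))  # placeholder path and TERM_NUM

An empty list would mean every label is already in range; otherwise it lists the offending lines.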