Using cos_sim always core dumps at paddle::CpuMatrix::sumOfSquares()
Created by: Damon-wyg
As the title says, when using cos_sim, training always crashes in sumOfSquares. The error output is as follows:
I0714 15:39:04.527210 16309 PyDataProvider2.cpp:257] loading dataprovider data_provider::process
I0714 15:40:05.779645 16309 PyDataProvider2.cpp:257] loading dataprovider data_provider::process
I0714 15:40:05.780019 16309 GradientMachine.cpp:134] Initing parameters..
I0714 15:40:08.254307 16309 GradientMachine.cpp:141] Init parameters done.
I0714 15:40:15.136973 24206 ThreadLocal.cpp:37] thread use undeterministic rand seed:24207
Thread [140023083710208] Forwarding __regression_cost_0__, __cos_sim_0__, user_media_cos_layer, user_media_fc_layer1, media_fc_layer2, media_fc_layer1, user_fc_layer2, user_fc_layer1, ad_fc_cos_layer, ad_fc_layer2, ad_fc_layer1, media_input, ad_input, user_emb_input, label,
*** Aborted at 1500018015 (unix time) try "date -d @1500018015" if you are using GNU date ***
PC: @ 0x7d0bf7 paddle::CpuMatrix::sumOfSquares()
*** SIGSEGV (@0x29) received by PID 16309 (TID 0x7f59aa2a3700) from PID 41; stack trace: ***
@ 0x7f59d9c78160 (unknown)
@ 0x7d0bf7 paddle::CpuMatrix::sumOfSquares()
@ 0x6713c5 paddle::CostLayer::forward()
@ 0x6d5884 paddle::NeuralNetwork::forward()
@ 0x6cbcc6 paddle::TrainerThread::forward()
@ 0x6ccdec paddle::TrainerThread::computeThread()
@ 0x7f59d93f28a0 execute_native_thread_routine
@ 0x7f59d9c701c3 start_thread
@ 0x7f59d8b6312d __clone
My network configuration is as follows (the data layers and some hidden layers are omitted; a sketch of the omitted data layers follows the snippet):
ad_fc_layer2 = fc_layer(name='ad_fc_layer2', input=ad_fc_layer1, size=64, act=ReluActivation())
ad_fc_cos_layer = fc_layer(name='ad_fc_cos_layer', input=ad_fc_layer2, size=48, act=TanhActivation())
...
user_media_fc_layer1 = fc_layer(name='user_media_fc_layer1', input=[user_fc_layer2, media_fc_layer2], size=64, act=ReluActivation())
user_media_cos_layer = fc_layer(name='user_media_cos_layer', input=user_media_fc_layer1, size=48, act=TanhActivation())
...
cost = cos_sim(a=user_media_cos_layer, b=ad_fc_cos_layer, scale=1)
rc = regression_cost(input=cost, label=label)
eval = auc_evaluator(rc, label)
outputs(rc)
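For reference, the omitted data layers are declared roughly like this. This is only a sketch: the layer names match the ones in the forwarding log above, but the sizes are placeholders rather than my real feature dimensions.
from paddle.trainer_config_helpers import *

# Sketch of the omitted input declarations (sizes are placeholders, not the real dimensions).
media_input = data_layer(name='media_input', size=1024)
ad_input = data_layer(name='ad_input', size=1024)
user_emb_input = data_layer(name='user_emb_input', size=256)
# Assuming a single-value regression label to match cos_sim's one-score-per-sample output.
label = data_layer(name='label', size=1)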
What could be causing this? The network configuration looks fine to me.