softmax_with_cross_entropy raises an error during optimization
Created by: xyq019971
Here is my program:

b19 = fluid.layers.conv2d(x, num_filters=100, filter_size=[1, 1], stride=[1, 1], groups=1)
b20 = fluid.layers.transpose(b19, perm=[0, 2, 3, 1])
out1 = fluid.layers.reshape(b20, shape=[-1, 100])
out3 = fluid.layers.reshape(y, shape=[-1, 1])
out2 = fluid.layers.cast(out3, dtype="int64")
loss = fluid.layers.softmax_with_cross_entropy(out1, out2)
avg = fluid.layers.reduce_mean(loss)
reduced_loss = 0.9 * avg
regularizer = fluid.regularizer.L2Decay(0.0001)
optimizer = fluid.optimizer.Momentum(
    learning_rate=0.1, momentum=0.9, regularization=regularizer)
_, params_grads = optimizer.minimize(reduced_loss)
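For reference, here is a minimal numpy sketch of the math softmax_with_cross_entropy performs on the reshaped tensors above (shapes assumed from the reshape calls: out1 is [N, 100] logits, out2 is [N, 1] int64 class indices); note the labels are integer indices that never need gradients:

```python
import numpy as np

def softmax_with_cross_entropy(logits, labels):
    # Numerically stable log-softmax along the class axis.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # Cross entropy: negative log-probability of the true class index.
    return -np.take_along_axis(log_probs, labels, axis=1)

# Toy example with 3 classes instead of 100.
logits = np.array([[2.0, 1.0, 0.1]])
labels = np.array([[0]], dtype=np.int64)  # int64 indices, as the op requires
loss = softmax_with_cross_entropy(logits, labels)  # shape [1, 1]
```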
If I leave out the last line, _, params_grads = optimizer.minimize(reduced_loss), there is no error; as soon as I run that line, it raises:
EnforceNotMet: The input of cast op must be set at [E:\dist\Paddle\paddle\fluid\operators\cast_op.cc:42] PaddlePaddle Call Stacks: Windows not support stack backtrace yet.
Any guidance would be appreciated.