At inference time, the network output stays the same no matter how the input data changes
Created by: Ashleychen
When running inference locally, I found that the network output never changes. Printing both the network parameters and the network output looks normal, and I can't figure out the cause. Could someone help? The inference code is as follows:
if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('--data', default='./data')
    parser.add_argument('--model', default='./model')
    conf = TiebaConf()
    conf.define_feature(conf.session_wise_features, conf.item_wise_features, conf.labels)
    args = parser.parse_args()

    exe = fluid.Executor(fluid.CPUPlace())
    inference_scope = fluid.core.Scope()
    with fluid.scope_guard(inference_scope):
        [inference_program, feed_target_names, fetch_targets] = \
            fluid.io.load_inference_model(args.model, exe)
        print_params(inference_program, inference_scope, conf)

        lines = []
        with open(args.data, 'r') as fin:
            for line in fin:
                lines.append(line)
                if len(lines) == conf.batch_size:
                    # tmp = lines2tensors(lines, conf)
                    # for slot in tmp:
                    #     print(slot, numpy.mean(numpy.array(tmp[slot])))
                    network_output = exe.run(inference_program,
                                             feed=lines2tensors(lines, conf),
                                             fetch_list=fetch_targets,
                                             return_numpy=False)
                    accum_metrics(network_output, conf)
                    lines = []
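The commented-out debug lines already print per-slot means, but two different batches can share the same mean. A stronger sanity check is to fingerprint the entire feed before each `exe.run` call. Below is a minimal sketch of that idea; it assumes the feed is a dict mapping slot names to numpy-convertible arrays (the names `feed_fingerprint`, `batch_a`, and `batch_b` are illustrative, not from the original code). If consecutive batches produce identical fingerprints, the problem is in the data pipeline (`lines2tensors`) rather than in the network.

```python
import hashlib

import numpy as np


def feed_fingerprint(feed):
    """Return a stable hash over every tensor in a feed dict.

    Two batches with identical contents yield the same digest, so
    comparing digests across iterations shows whether the inputs
    actually change between exe.run calls.
    """
    h = hashlib.md5()
    for name in sorted(feed):  # sort so dict order cannot affect the hash
        arr = np.asarray(feed[name])
        h.update(name.encode())
        h.update(arr.tobytes())
    return h.hexdigest()


# Hypothetical batches standing in for lines2tensors(lines, conf) output.
batch_a = {"slot0": np.ones((4, 3)), "slot1": np.zeros((4, 1))}
batch_b = {"slot0": np.ones((4, 3)) * 2, "slot1": np.zeros((4, 1))}

print(feed_fingerprint(batch_a))
print(feed_fingerprint(batch_b))  # differs from batch_a's digest
```

In the loop above you would log `feed_fingerprint(lines2tensors(lines, conf))` next to `network_output`; constant fingerprints with constant outputs point at the input side, while changing fingerprints with constant outputs point at the loaded program.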
The environment is already packaged up; please message me privately (Baidu Hi) and I will provide it.