ERNIE finetuning: "IndexError: list index out of range"
Created by: fw339wj
Running the classifier under the finetune folder, I hit the following error:
```
Traceback (most recent call last):
  File "C:/Users/wxc/Desktop/NLP/NLPCC2019/LARK/ERNIE/run_classifier.py", line 282, in
    main(args)
  File "C:/Users/wxc/Desktop/NLP/NLPCC2019/LARK/ERNIE/run_classifier.py", line 202, in main
    graph_vars, "train")
  File "C:\Users\wxc\Desktop\NLP\NLPCC2019\LARK\ERNIE\finetune\classifier.py", line 259, in evaluate
    ret["learning_rate"] = float(outputs[4][0])
IndexError: list index out of range
W0518 13:45:05.752697 14328 graph.h:204] WARN: After a series of passes, the current graph can be quite different from OriginProgram. So, please avoid using the OriginProgram() method!
I0518 13:45:10.330855 14328 build_strategy.cc:285] SeqOnlyAllReduceOps:0, num_trainers:1
```
The relevant code is as follows:
```python
def evaluate(exe, test_program, test_pyreader, graph_vars, eval_phase):
    train_fetch_list = [
        graph_vars["loss"].name,
        graph_vars["accuracy"].name,
        graph_vars["num_seqs"].name
    ]
    if eval_phase == "train":
        if "learning_rate" in graph_vars:
            train_fetch_list.append(graph_vars["learning_rate"].name)
        outputs = exe.run(fetch_list=train_fetch_list)
        ret = {"loss": np.mean(outputs[0]), "accuracy": np.mean(outputs[1])}
        print("outputs:", outputs)
        if "learning_rate" in graph_vars:
            ret["learning_rate"] = float(outputs[4][0])
        return ret
```
I believe that in `ret["learning_rate"] = float(outputs[4][0])` the index should be `outputs[3][0]`, since at most four variables are fetched; please confirm.
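The indexing can be checked in isolation: `train_fetch_list` starts with three entries (loss, accuracy, num_seqs) and `learning_rate` is appended as the fourth, so its result sits at index 3, while `outputs[4]` is one past the end. A minimal sketch of this reasoning (plain strings standing in for the fetched variable names):

```python
# Simulate the fetch list built in evaluate(): three base variables,
# with "learning_rate" appended only when it exists in graph_vars.
train_fetch_list = ["loss", "accuracy", "num_seqs"]
graph_vars = {"loss": None, "accuracy": None,
              "num_seqs": None, "learning_rate": None}

if "learning_rate" in graph_vars:
    train_fetch_list.append("learning_rate")

# exe.run returns one output per fetched variable, so the valid
# indices are 0..len(train_fetch_list)-1, i.e. 0..3 here.
print(len(train_fetch_list))                    # 4
print(train_fetch_list.index("learning_rate"))  # 3 -> outputs[3], not outputs[4]
```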