Created by: FrostML
PR types
Bug fixes
PR changes
OPs
Describe
Before this fix, the InferShape of beam_search does not set the shape of its outputs, so the saved transformer __model__ keeps [0] as the shape information for init_score, trg_word, and init_idx. That is fine for C++ inference but fails when using load_inference_model.