    Transformer PR (#2214) · f0a6c1eb
    Committed by TianXiaogang
    * feat: add beam_search_special function for support nlp model
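The beam_search commits above center on top-k candidate selection per decoding step. A minimal sketch of just that selection step, not the actual Paddle-Lite kernel (the function name and signature are hypothetical):

```cpp
#include <algorithm>
#include <cassert>
#include <utility>
#include <vector>

// Illustrative beam-search step: from candidate (score, token-id) pairs,
// keep the beam_size highest-scoring ones, best first.
std::vector<std::pair<float, int>> BeamStep(
    std::vector<std::pair<float, int>> cands, size_t beam_size) {
  std::sort(cands.begin(), cands.end(),
            [](const std::pair<float, int>& a, const std::pair<float, int>& b) {
              return a.first > b.first;
            });
  if (cands.size() > beam_size) cands.resize(beam_size);
  return cands;
}
```

A real NLP-model kernel additionally tracks per-beam prefixes and end-of-sentence handling; this only shows the pruning rule.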
    
    * fix: add beam_search_compute kernel input and output
    
    * feat: add assign op & copy_compute kernel
    
    * feat: add fill_const_batch_size_like op & kernel
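fill_constant_batch_size_like produces a constant-filled tensor whose shape copies a given attribute, except that one axis is taken from the input's runtime batch size. A hedged sketch of just the shape rule (the function name is illustrative, not the op's actual API):

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Illustrative shape rule: start from the `shape` attribute, then overwrite
// the entry at output_dim_idx with the input tensor's size at input_dim_idx
// (typically the batch axis).
std::vector<int64_t> BatchSizeLikeShape(const std::vector<int64_t>& like_dims,
                                        std::vector<int64_t> shape,
                                        int input_dim_idx,
                                        int output_dim_idx) {
  shape[output_dim_idx] = like_dims[input_dim_idx];
  return shape;
}
```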
    
    * feat: add layer_norm op and kernel and ut
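Layer normalization standardizes each row over the innermost axis and then applies a per-channel scale and bias. A minimal reference sketch of the math, not the actual Paddle-Lite kernel (signature is illustrative):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Illustrative layer norm over the innermost axis of a row-major
// [rows, cols] tensor: y = (x - mean) / sqrt(var + eps) * scale + bias,
// with mean/var computed per row.
std::vector<float> LayerNorm(const std::vector<float>& x, int rows, int cols,
                             const std::vector<float>& scale,
                             const std::vector<float>& bias,
                             float eps = 1e-5f) {
  std::vector<float> y(x.size());
  for (int r = 0; r < rows; ++r) {
    const float* in = x.data() + r * cols;
    float* out = y.data() + r * cols;
    float mean = 0.f, var = 0.f;
    for (int c = 0; c < cols; ++c) mean += in[c];
    mean /= cols;
    for (int c = 0; c < cols; ++c) var += (in[c] - mean) * (in[c] - mean);
    var /= cols;
    const float inv_std = 1.f / std::sqrt(var + eps);
    for (int c = 0; c < cols; ++c)
      out[c] = (in[c] - mean) * inv_std * scale[c] + bias[c];
  }
  return y;
}
```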
    
    * fix: fix some bugs
        fix mul_op infer_shape bug when x_dim_idx = 2, x_dims.size() = 3 and y_dim_idx = 1, y_dims.size() = 2
        fix elementwise_compute bug when all of y's axes are 1
        fix beam_search choosing the wrong math_func
        fix layer_norm attribute-fetch bug
        fix fill_constant_batch_size_like shape-setting bug
    
    * feat: add gather op and kernel & and transform ut
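The gather op selects rows of its input by an index tensor. A minimal sketch of the core copy loop (illustrative, row-major, float-only, without the data_type handling the later commit fixes):

```cpp
#include <cassert>
#include <vector>

// Illustrative gather over a row-major [num_rows, row_len] tensor:
// output row r is the src row at index[r].
std::vector<float> Gather(const std::vector<float>& src, int row_len,
                          const std::vector<int>& index) {
  std::vector<float> out;
  out.reserve(index.size() * row_len);
  for (int idx : index)
    for (int j = 0; j < row_len; ++j)
      out.push_back(src[idx * row_len + j]);
  return out;
}
```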
    
    * feat: add ops and fix bugs to support the transformer model
        fix type_cast passes to skip `while`
        fix elementwise infer_shape bug when x.dims = 3 and y.dims = {1} with axis = 0
        fix lookup_table compute bug
        fix data_type problems in the read_from_array/beam_search/increment/compare/gather ops
    
    * fix:
        add a word-reading interface to the transformer ut
        fix copy/gather/norm/layer_norm include-path problems
    
    * fix: debug info
    
    * fix: fix input reshape bug
    
    * fix: fix norm bug
    
    * style: style fix & test=develop
    
    * style: fix operators CMakeLists
    
    * style: fix operators CMakeLists; test=develop
    
    * fix and test=develop
    
    * fix and test=develop
    
    * style: style fix; test=develop