    Move fused_attention op to phi [migrate the forward GPU OpKernel] (#51743) · a7ec8958
    Committed by Sonder
    * add kernel functions
    
    * update kernel functions
    
    * update function parameter names
    
    * create codes for gpu device
    
    * adjust file locations
    
    * fix include error
    
    * remove dependent files to phi/
    
    * restore fused_attention_op.cu
    
    * fix dependence errors
    
    * fix dependence errors
    
    * fix include error
    
    * fix all dependence errors [build successful]
    
    * remove useless include
    
    * recover useless include
    
    * use phi::ToNCCLDataType
    
    * fix namespace
    
    * update new register code
    
    * fix error in fused_gemm_epilogue_utils
    
    * fix error in FusedAttentionKernel param
    
    * finish fused_attention register code [build successful]
    
    * add paddle::optional
    
    * add sig file
    
    * fix build error
    
    * fix a include error
    
    * update CMakeLists
    
    * fix parameter sequence
    
    * add include file
    
    * update #if before include
    
    * fix grammar error
    
    * update codes for DropoutParam
    
    * remove const cast
    
    * translate some fluid APIs to phi APIs
    
    * add #if
    
    * update test code
    
    * update test codes
    
    * recover test codes
    
    * translate fused_attention to fluid
    
    * move #endif to end
    
    * move #endif
    
    * delete useless files
    
    * use fused attention utils and recover random seed
    
    * remove fluid include in phi