1. 11 Feb 2022, 1 commit
  2. 06 Feb 2022, 1 commit
  3. 18 Jan 2022, 1 commit
  4. 26 Aug 2021, 1 commit
    • Add feed_forward for fused attention op. (#34945) · d1a33bc7
      Committed by Li Min
      Describe
      
      Add feed_forward for fused attention op.
      (1) Encapsulate matmul impl (forward and backward) used in attention op.
      (2) Implement bias_add (forward and backward) used in attention op.
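      The sketch below is a minimal NumPy illustration (not PaddlePaddle's actual C++ implementation or API) of the two building blocks this commit describes: the matmul forward/backward pair and the bias_add forward/backward pair used by the fused attention feed-forward step. All function names and shapes here are illustrative assumptions.

      ```python
      import numpy as np

      def matmul_forward(x, w):
          # Forward: out = x @ w, x of shape (batch, in_dim), w of shape (in_dim, out_dim).
          return x @ w

      def matmul_backward(x, w, grad_out):
          # Backward of out = x @ w: gradients w.r.t. the input and the weight.
          grad_x = grad_out @ w.T
          grad_w = x.T @ grad_out
          return grad_x, grad_w

      def bias_add_forward(x, bias):
          # Forward: broadcast-add a per-column bias, out[i, j] = x[i, j] + bias[j].
          return x + bias

      def bias_add_backward(grad_out):
          # Backward of bias_add: input gradient passes through unchanged;
          # the bias gradient sums over the batch dimension.
          grad_x = grad_out
          grad_bias = grad_out.sum(axis=0)
          return grad_x, grad_bias

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          x = rng.standard_normal((4, 8))
          w = rng.standard_normal((8, 16))
          b = rng.standard_normal(16)
          out = bias_add_forward(matmul_forward(x, w), b)
          grad_out = np.ones_like(out)
          grad_h, grad_b = bias_add_backward(grad_out)
          grad_x, grad_w = matmul_backward(x, w, grad_h)
          print(out.shape, grad_x.shape, grad_w.shape, grad_b.shape)
      ```

      A fused kernel would combine these steps to avoid materializing intermediates; the sketch only shows the math the encapsulated helpers are responsible for.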