1. 25 October 2021 (1 commit)
    • Add fused_dropout wrapper to ease use. (#36185) (#36640) · 05d7e2fd
      Committed by Li Min
      In the fused_attention op and fused_ffn op, a fused bias_add + dropout + residual + layernorm kernel (or a bias_add + dropout + residual kernel) is used. To ease the use of these kernels, this PR provides a wrapper.
      1. To reuse the increment-computing code, we extract the corresponding code into a GetSeedDataAndIncrement routine in dropout_impl_util.h (see the sketch after this list).
      2. fused_dropout_helper.h provides the fused dropout kernel wrapper (a reference of the fused computation follows the note below).
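
      The extracted routine itself is not shown in this log, so the following is a minimal C++ sketch of the idea behind a GetSeedDataAndIncrement-style helper. The GeneratorState type, the parameter names, and the offset arithmetic are illustrative assumptions, not the Paddle implementation.

      ```cpp
      // Illustrative sketch only, not the Paddle source. A counter-based RNG
      // (e.g. Philox) lets every kernel launch derive its random stream from
      // (seed, offset). Each launch must advance the offset by the number of
      // random values a thread will consume so streams never overlap.
      #include <cstdint>
      #include <utility>

      struct GeneratorState {
        uint64_t seed;
        uint64_t offset = 0;  // advanced once per kernel launch

        // Computes the per-launch "increment" (random values each thread will
        // consume, rounded up to whole vec_size-wide RNG calls), returns the
        // seed together with the offset this launch starts from, and advances
        // the stored offset so the next launch draws fresh randomness.
        std::pair<uint64_t, uint64_t> GetSeedDataAndIncrement(
            uint64_t numel, uint64_t total_threads, uint64_t vec_size) {
          uint64_t per_thread = (numel + total_threads - 1) / total_threads;
          uint64_t increment = (per_thread + vec_size - 1) / vec_size * vec_size;
          uint64_t start = offset;
          offset += increment;
          return {seed, start};
        }
      };
      ```

      Centralizing this bookkeeping is what lets several dropout-based kernels share one piece of seed/offset logic instead of each re-deriving it.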
      
      Note: the test of this wrapper will be provided in the following fused_attention_op and fused_ffn PRs.
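
      The wrapper's signature is likewise not shown here, so below is a hedged CPU reference of the computation the fused bias_add + dropout + residual + layernorm kernel performs, i.e. out = LayerNorm(residual + Dropout(src + bias)). The function name, parameter layout, and the 1e-5 epsilon are assumptions for illustration only.

      ```cpp
      // Hedged CPU reference of the fused computation; the real wrapper in
      // fused_dropout_helper.h dispatches a single CUDA kernel instead.
      #include <cmath>
      #include <cstdint>
      #include <random>
      #include <vector>

      void FusedBiasDropoutResidualLayerNorm(
          const std::vector<float>& src,       // [rows, cols]
          const std::vector<float>& bias,      // [cols]
          const std::vector<float>& residual,  // [rows, cols]
          const std::vector<float>& gamma,     // [cols] LayerNorm scale
          const std::vector<float>& beta,      // [cols] LayerNorm shift
          float dropout_prob, uint64_t seed, int rows, int cols,
          std::vector<float>* out, std::vector<uint8_t>* mask) {
        std::mt19937_64 rng(seed);
        std::bernoulli_distribution keep(1.0f - dropout_prob);
        const float scale = 1.0f / (1.0f - dropout_prob);  // inverted dropout
        out->assign(static_cast<size_t>(rows) * cols, 0.0f);
        mask->assign(static_cast<size_t>(rows) * cols, 0);
        for (int r = 0; r < rows; ++r) {
          // bias_add + dropout + residual; the fused kernel keeps this row
          // in registers/shared memory rather than in global memory.
          std::vector<float> row(cols);
          float mean = 0.0f;
          for (int c = 0; c < cols; ++c) {
            int i = r * cols + c;
            uint8_t m = keep(rng) ? 1 : 0;
            (*mask)[i] = m;
            row[c] = residual[i] + (src[i] + bias[c]) * m * scale;
            mean += row[c];
          }
          mean /= cols;
          float var = 0.0f;
          for (int c = 0; c < cols; ++c) var += (row[c] - mean) * (row[c] - mean);
          var /= cols;
          const float inv_std = 1.0f / std::sqrt(var + 1e-5f);
          for (int c = 0; c < cols; ++c) {
            (*out)[r * cols + c] = gamma[c] * (row[c] - mean) * inv_std + beta[c];
          }
        }
      }
      ```

      The point of fusing these four steps into one kernel is to avoid three intermediate round trips to global memory and three extra kernel launches.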
2. 15 September 2021 (1 commit)