1. 16 Dec 2021, 5 commits
    • Add fmax and fmin operators (#37826) · dd3afc9d
      LJQ❤️ committed
      Add elementwise_fmax and elementwise_fmin operators
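The new elementwise fmax/fmin operators follow the C-style semantics in which a NaN operand is ignored when the other operand is a number. A minimal NumPy sketch of that behaviour (illustrative only, not the Paddle kernel itself):

```python
import numpy as np

# fmax/fmin vs. plain maximum/minimum: the f-variants skip NaN
# when only one operand is NaN; maximum/minimum propagate it.
x = np.array([1.0, np.nan, 3.0])
y = np.array([2.0, 5.0, np.nan])

print(np.fmax(x, y))     # [2. 5. 3.]   -- NaN operands are skipped
print(np.fmin(x, y))     # [1. 5. 3.]
print(np.maximum(x, y))  # [2. nan nan] -- NaN propagates
```

The same distinction is why fmax/fmin are added as separate operators rather than aliases of the existing elementwise maximum/minimum.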
    • Add sparse_attention mask, test=develop (#37973) · fa463b90
      Liu-xiandong committed
      Add key_padding_mask and attn_mask in the sparse_attention API.

      1. The key padding mask is a tensor with shape [batch_size, seq_len], and the attention mask is a tensor with shape [seq_len, seq_len]. The data types of the two masks match Q, K, and V: float32 or float64. A value of 0 in a mask means that the position is masked.

      2. The changed files are mainly paddle/fluid/operators/sparse_attention_op.cu and python/paddle/fluid/tests/unittests/test_sparse_attention_op.py. sparse_attention has three parts: sddmm, softmax, and dsd. Adding the mask operation only requires modifying the softmax step; it has no effect on the other two parts. In addition, related tests have been added to exercise the mask function.
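The masking described above happens entirely inside the softmax step: positions whose mask value is 0 receive probability 0. A hypothetical NumPy sketch of that idea (names and shapes are illustrative; this is not the sparse_attention_op.cu implementation):

```python
import numpy as np

def masked_softmax(scores, key_padding_mask):
    # scores: [seq_len, seq_len] attention scores for one batch item
    # key_padding_mask: [seq_len], where 0 means the key position is masked
    scores = np.where(key_padding_mask[None, :] == 0, -np.inf, scores)
    scores = scores - scores.max(axis=-1, keepdims=True)  # numeric stability
    exp = np.exp(scores)
    return exp / exp.sum(axis=-1, keepdims=True)

scores = np.zeros((3, 3))
mask = np.array([1.0, 1.0, 0.0])  # last key position is padding
probs = masked_softmax(scores, mask)
print(probs)  # masked column is exactly 0; each row still sums to 1
```

Because the mask only rewrites scores before normalization, the sddmm and dsd stages are untouched, which matches the commit's claim that only the softmax part needed modification.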
    • Add the transformop parameter in TensorReduceFunctorImpl (#38135) · 524389ee
      niuliling123 committed
      * Add the transformop parameter in TensorReduceFunctorImpl
    • [Pten] Modify registered kernel name (#38109) · be874c08
      YuanRisheng committed
      * Reduce reshape kernel functions in pten

      * Delete notes

      * Fix bugs when compiling

      * Modify register name

      * Fix compile bugs
    • Add float16 type for scatter op (#38136) · 9bac4a76
      Li Min committed
      * Add float16 type for scatter op.

      * Add fp16 test for scatter op.

      * Add int and int64 support for scatter_grad on GPU.

      * Add int and int64 to the check_variable_and_dtype routine.

      * Minor fixes.

      * Code format.
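For context, scatter in overwrite mode replaces the rows of the input selected by an index tensor with the given updates; this commit extends the supported dtypes (notably float16 on GPU). A rough NumPy sketch of that semantics, assuming overwrite behaviour (illustrative only, not the CUDA kernel):

```python
import numpy as np

def scatter(x, index, updates):
    # Overwrite-mode scatter: rows of x selected by index are
    # replaced by the corresponding rows of updates.
    out = x.copy()
    out[index] = updates
    return out

x = np.zeros((4, 2), dtype=np.float16)   # float16, the dtype this commit adds
index = np.array([1, 3])
updates = np.ones((2, 2), dtype=np.float16)
result = scatter(x, index, updates)
print(result)  # rows 1 and 3 become ones; rows 0 and 2 stay zero
```

The dtype of the output follows the input, so supporting float16 is mainly a matter of registering the kernel for the extra type rather than changing the algorithm.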
  2. 15 Dec 2021, 3 commits
  3. 14 Dec 2021, 6 commits
  4. 13 Dec 2021, 4 commits
  5. 10 Dec 2021, 5 commits
  6. 09 Dec 2021, 6 commits
  7. 08 Dec 2021, 6 commits
  8. 07 Dec 2021, 2 commits
  9. 06 Dec 2021, 2 commits
  10. 03 Dec 2021, 1 commit