- 17 Aug 2022, 1 commit
Committed by Nyakku Shigure
[CodeStyle][NPU] use np.testing.assert_allclose instead of self.assertTrue(np.allclose(...)) (part 1) (#44988)

* autofix
* try to resolve precision issues
* revert some changes
* clean up some `err_msg`
* 0.0001 -> 1e-4
* update commented assert code
* try to fix some shape errors
* `numpy` -> `np`
* empty commit, trigger kunlun ci, test=kunlun
* empty commit, retrigger kunlun ci, test=kunlun
* empty commit, trigger kunlun ci, try to fix npu memcpy_h2d, test=kunlun
* try to fix npu import error, test=kunlun
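For illustration, a minimal sketch of the substitution this commit applies across the NPU unit tests; the array values and the use of `rtol` below are examples only, not taken from the patch:

```python
import numpy as np

actual = np.array([1.00004, 2.0])
expected = np.array([1.0, 2.0])

# Old pattern: a bare boolean assert that reports nothing useful on failure.
# self.assertTrue(np.allclose(actual, expected, atol=1e-4))

# New pattern: on failure, assert_allclose prints the mismatched elements
# and both arrays, which makes precision issues much easier to diagnose.
np.testing.assert_allclose(actual, expected, rtol=1e-4)
```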
- 05 Jun 2022, 1 commit
Committed by Sing_chan
* use yapf to format all Python files
* exclude two unittest files from yapf, because they rely on writing and reading files and formatting would break them
* disable diff_py_file, because too many diff files caused the following command to fail
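As a rough illustration only (not part of this commit), yapf can also be applied programmatically; the `pep8` style name below is an assumption and the repository's actual yapf configuration may differ:

```python
# Hypothetical sketch: format a snippet the same way `yapf -i file.py` would.
from yapf.yapflib.yapf_api import FormatCode

source = "def add(a,b):return a+b"
result = FormatCode(source, style_config="pep8")
# Depending on the yapf version, FormatCode returns either the formatted
# string or a (formatted_string, changed) tuple.
formatted = result[0] if isinstance(result, tuple) else result
print(formatted)
```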
- 16 Dec 2021, 1 commit
Committed by Liu-xiandong
Add key_padding_mask and attn_mask in the sparse_attention API

1. The key padding mask is a tensor with shape [batch_size, seq_len], and the attention mask is a tensor with shape [seq_len, seq_len]. The data types of the two masks are consistent with Q, K, and V, i.e. float32 or float64. A value of 0 in a mask means that the corresponding position should be masked out.
2. The changed files are mainly paddle/fluid/operators/sparse_attention_op.cu and python/paddle/fluid/tests/unittests/test_sparse_attention_op.py. sparse_attention consists of three parts: sddmm, softmax, and dsd. Adding the mask operation only requires modifying the softmax part; it has no effect on the other two. In addition, related tests have been added to cover the mask functionality.
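A minimal usage sketch of the two masks described above. The exact call signature, the CSR layout tensors (offset/columns), and the concrete shapes are assumptions based on the existing sparse_attention unit test rather than this commit message, and the op itself requires a GPU build with a recent cuSPARSE:

```python
import paddle

batch, heads, seq_len, head_dim = 1, 1, 4, 2
q = paddle.rand([batch, heads, seq_len, head_dim], dtype='float32')
k = paddle.rand([batch, heads, seq_len, head_dim], dtype='float32')
v = paddle.rand([batch, heads, seq_len, head_dim], dtype='float32')

# Assumed block-sparse layout in CSR form: each query position attends to
# two key positions.
offset = paddle.to_tensor([[[0, 2, 4, 6, 8]]], dtype='int32')
columns = paddle.to_tensor([[[0, 1, 0, 1, 2, 3, 2, 3]]], dtype='int32')

# Masks introduced by this commit: dtype matches Q/K/V, and 0 means
# "mask this position out".
key_padding_mask = paddle.to_tensor([[1., 1., 1., 0.]], dtype='float32')  # [batch_size, seq_len]
attn_mask = paddle.ones([seq_len, seq_len], dtype='float32')              # [seq_len, seq_len]

out = paddle.nn.functional.sparse_attention(
    q, k, v, offset, columns,
    key_padding_mask=key_padding_mask,
    attn_mask=attn_mask)
print(out.shape)  # [1, 1, 4, 2]
```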
- 02 Nov 2021, 1 commit
Committed by Liu-xiandong
- 11 Oct 2021, 1 commit
Committed by Liu-xiandong
Add paddle.nn.functional.sparse_attention API

This PR mainly wraps the sparse_attention functionality in a Python-layer API; the main OP code is in #PR35676. In addition, corresponding unit tests have been added for the wrapped Python interface.
- 29 Sep 2021, 1 commit
Committed by Liu-xiandong
* fix cusparse compile problem, test=develop
* Modify file permissions
- 28 Sep 2021, 1 commit
Committed by Liu-xiandong
Add sparse_attention OPs; the Python API will be added in the next PR.