1. 12 May 2023, 1 commit
  2. 08 May 2023, 1 commit
  3. 25 Apr 2023, 1 commit
  4. 13 Apr 2023, 1 commit
  5. 31 Mar 2023, 1 commit
    • FIX_LINUX_Wternimate (#52307) · ffff133b
      Committed by Galaxy1458
      * this is a test PR, test=develop
      
      * solve the four [-Wterminate] warnings, test=develop
      
      * new fix for [-Wterminate], test=develop
      
      * new, test=develop
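      For context: GCC's [-Wterminate] warning fires when a throw statement can only end in a
      call to std::terminate, most commonly an exception escaping a destructor, since destructors
      are implicitly noexcept in C++11 and later. The snippet below is a minimal, hypothetical
      illustration of that pattern and the usual fix; it is not the actual Paddle code changed by
      #52307.

      ```cpp
      #include <iostream>
      #include <stdexcept>

      class FileGuard {
       public:
        ~FileGuard() {
          // BAD: a destructor is implicitly noexcept, so an uncaught throw here can only
          // reach std::terminate; GCC reports it as a [-Wterminate] warning.
          //   throw std::runtime_error("close failed");

          // Usual fix: handle the failure locally instead of letting it propagate.
          try {
            CloseOrThrow();
          } catch (const std::exception& e) {
            std::cerr << "close failed: " << e.what() << std::endl;
          }
        }

       private:
        void CloseOrThrow() { /* may throw on failure */ }
      };

      int main() {
        FileGuard guard;  // destructor runs at scope exit without terminating the program
        return 0;
      }
      ```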
  6. 28 Mar 2023, 1 commit
    • Add basic functionalities to support Scalar & Scalars in op attr (#51984) · 2e9fd5e4
      Committed by Feiyu Chan
      Add basic functionality to support Scalar & Scalars in operator attributes.
      
      1. extend the allowed types in an operator's attributes: add `paddle::experimental::Scalar` and the corresponding protobuf Message types;
      2. enhance Scalar: add formatting and equality;
      3. add code to handle Scalar & Scalars in the opmaker, conversion from paddle operators to phi kernels, opdesc construction and manipulation, the tensorrt converter, the tracer, operator construction, etc.;
      4. bind `paddle::experimental::Scalar` to Python as `libpaddle.Scalar`;
      5. add functionality to canonicalize an attribute map according to the OpProto (if the op the attribute map is used for has an OpProto);
      6. add code to manipulate the Scalar proto message via the protobuf Python API.
      
      Add unit tests.
      
      1. add test cases for formatting, equality of Scalars, and WrapAsScalars;
      2. add test cases for 'casting' between different forms of attributes;
      3. add test cases for extracting scalar & scalars from attributes;
      4. add test cases for CanonicalizeScalarAttrs (and fix a bug in the type index offset);
      5. fix gmock's library filename on the Windows platform;
      6. clean code: use canonicalize_attrs instead of inlining the function;
      7. add test cases for libpaddle.Scalar in Python code;
      8. add test cases for `make_scalar_proto`, which manipulates the proto message `Scalar` via the protobuf Python API.
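      As a rough illustration of item 2 above (equality and formatting for a variant-like scalar
      attribute), the sketch below models a much-simplified Scalar-style value that can hold a
      bool, an int64, or a double and compares and prints it in a type-aware way. The class name
      MiniScalar and its members are hypothetical stand-ins; the real
      `paddle::experimental::Scalar` implementation differs.

      ```cpp
      #include <cstdint>
      #include <iostream>
      #include <string>
      #include <variant>

      // Hypothetical stand-in for a Scalar-like operator attribute: one value that may
      // hold any of several primitive types, compared and formatted uniformly.
      class MiniScalar {
       public:
        MiniScalar(bool v) : value_(v) {}
        MiniScalar(int64_t v) : value_(v) {}
        MiniScalar(double v) : value_(v) {}

        // Equality: same stored alternative type and same value.
        bool operator==(const MiniScalar& other) const { return value_ == other.value_; }

        // Formatting: tag the stored value with its type so attributes print readably.
        std::string ToString() const {
          struct Fmt {
            std::string operator()(bool v) const {
              return std::string("Scalar(bool, ") + (v ? "true" : "false") + ")";
            }
            std::string operator()(int64_t v) const { return "Scalar(int64, " + std::to_string(v) + ")"; }
            std::string operator()(double v) const { return "Scalar(double, " + std::to_string(v) + ")"; }
          };
          return std::visit(Fmt{}, value_);
        }

       private:
        std::variant<bool, int64_t, double> value_;
      };

      int main() {
        MiniScalar a(static_cast<int64_t>(3)), b(3.0);
        std::cout << a.ToString() << " vs " << b.ToString() << ": equal? "
                  << (a == b ? "yes" : "no") << std::endl;  // "no": the stored types differ
        return 0;
      }
      ```

      A type-aware comparison along these lines is what makes it possible to tell, say, an int64
      scalar apart from a double scalar with the same numeric value, which matters when attribute
      maps are canonicalized against an OpProto (item 5).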
  7. 22 Mar 2023, 1 commit
  8. 21 Mar 2023, 1 commit
  9. 16 Mar 2023, 1 commit
  10. 24 Feb 2023, 1 commit
  11. 20 Feb 2023, 1 commit
  12. 16 Feb 2023, 1 commit
  13. 11 Feb 2023, 1 commit
    • [TRT] elementwise_add+transpose fusion (#50081) · fd0d4fa4
      Committed by Wang Bojun
      * eleadd_trans first version, log fix
      
      * refine code for linear format, add pass check
      
      * linear format refine and ut fix
      
      * fix ut
      
      * windows ut
      
      * windows ut 2
      
      * move tensorMeta and alloc to configure
  14. 09 Feb 2023, 2 commits
    • [Paddle-TRT] GroupNorm int8 nchw32 fake kernel (#50146) · d93c63a0
      Committed by zhoutianzi666
      * add fmha_flashattention oss plugin
      
      * add fmhca
      
      * add oss fmhca
      
      * code reconstruct and add ut
      
      * code style refine
      
      * fix ut and enforce check
      
      * refine trt version check; refine compile; fix compile
      
      * fix cross ut
      
      * code refine
      
      * use runtime trt version check
      
      * bug fix and code refine
      
      * compile fix
      
      * merge develop
      
      * add GN QDQ kernel
      
      * support GN int8 fake kernel
      
      * add with_int8
      
      * add GN int8 fake kernel
      
      * add GN int8 UT
      
      * add version > 8000 check in GN int8 UT
      
      * add some checks in .cu
      
      * add stdlib.h in UT
      
      * small change in .cu
      
      * remove rand_r, use rand
      
      * remove use of rand
      
      * setAxis(1)
      
      * when int8 is on, allow fallback to fp16
      
      ---------
      Co-authored-by: wwbitejotunn <wang_bojun@outlook.com>
    • [TRT] Transpose layernorm fusion with different input format (#50082) · b2bb7ec9
      Committed by Wang Bojun
      * trans_layernorm
  15. 31 Jan 2023, 1 commit
    • gn_silu (#49928) · 111075a3
      Committed by wenbin
      * gn_silu
      
      * add ut
      
      * set TIMEOUT
      
      * correct comments
      
      * comments
      
      * disable windows ut
      
      * rename parameter
  16. 12 Jan 2023, 1 commit
  17. 11 Jan 2023, 1 commit
  18. 10 Jan 2023, 3 commits
  19. 09 Jan 2023, 1 commit
    • Preln groupnorm (#49463) · 591be3bd
      Committed by wenbin
      * skip_groupnorm
      
      * init
      
      * preln
      
      * add ut
      
      * more assert
      
      * set timeout
      
      * fix windows ci issue
  20. 23 Dec 2022, 2 commits
  21. 21 Dec 2022, 1 commit
  22. 20 Dec 2022, 1 commit
  23. 19 Dec 2022, 1 commit
  24. 15 Dec 2022, 1 commit
  25. 13 Dec 2022, 2 commits
  26. 08 Dec 2022, 1 commit
  27. 05 Dec 2022, 1 commit
    • Reverse roll fuse (#46914) · feb68dd1
      Committed by Wang Bojun
      * pass
      
      * draft version
      
      * share mem opt
      
      * remove sharemem
      
      * add pattern for the case with circle_shift=0
      
      * add UT
      
      * pass opt
      
      * test_fix
      
      * code-commit
      
      * code style
      
      * ut-fix
      
      * op teller refine
      
      * resolve conflict
      
      * adjust position in the op_teller list and pass order for swin
      
      * ut code style update
      
      * adjust paddle pass order
      
      * refine pass order
  28. 01 Dec 2022, 3 commits
  29. 28 Nov 2022, 1 commit
  30. 25 Nov 2022, 3 commits
  31. 24 Nov 2022, 1 commit