1. 28 Mar 2023 · 4 commits
  2. 27 Mar 2023 · 1 commit
  3. 25 Mar 2023 · 1 commit
  4. 24 Mar 2023 · 1 commit
    • Z
      Memory Efficient Attention (#51867) · e5ad3859
      Committed by ZhangDY-6483
      * first version, notest
      
      * return final rst, notest
      
      * use infinity() instead of max
      
      * ut structure
      
      * start up of ut
      
      * generate lse
      
      * update
      
      * add dependency
      
      * reconstruct cmake
      
      * move file
      
      * add memory efficient attention and fix blasimpl
      
      * update
      
      * update cmake
      
      * add namespace
      
      * update cmake
      
      * use .cu
      
      * update for pad3d
      
      * bug fix
      
      * bug fix
      
      * update
      
      * bug fix
      
      * update enforce
      
      * add test case
      
      * merge the lse pad
      
      * fix kernel_fn of backward
      
      * fix PADDLE_ENFORCE_EQ and phi_api
      
      * fix PADDLE_ENFORCE
      
      * fix PADDLE_ENFORCE
      
      * rerun coverage
      
      * fix memory efficient attention test
      
      * rerun ci
      
      * add cuda version condition
      
      * add cuda version condition
      
      * delete WIP test
      
      * replace PADDLE_ENFORCE
      
      * edit the namespace of datatype in multiple.cc
      
      * rerun
      
      * rerun
      
      ---------
      Co-authored-by: liuyuang <liuyuang@baidu.com>
      e5ad3859
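      The commit above implements a tiled attention kernel that never materializes the full score matrix, returning the log-sum-exp ("lse") alongside the output and initializing the running maximum with infinity() rather than a finite max. A minimal NumPy sketch of that blockwise-softmax idea, with illustrative names and block size (not the kernel's actual interface):

      ```python
      import numpy as np

      def attention_blockwise(q, k, v, block=64):
          """Online-softmax attention: scores held one block at a time, plus LSE."""
          seq_q, d = q.shape
          out = np.zeros((seq_q, v.shape[1]))
          m = np.full(seq_q, -np.inf)          # running row max; -inf, not a finite max
          l = np.zeros(seq_q)                  # running softmax denominator
          for s in range(0, k.shape[0], block):
              scores = q @ k[s:s + block].T / np.sqrt(d)
              m_new = np.maximum(m, scores.max(axis=1))
              p = np.exp(scores - m_new[:, None])
              scale = np.exp(m - m_new)        # rescale previous partial sums
              l = l * scale + p.sum(axis=1)
              out = out * scale[:, None] + p @ v[s:s + block]
              m = m_new
          lse = m + np.log(l)                  # the "lse" the commit log refers to
          return out / l[:, None], lse
      ```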
  5. 23 Mar 2023 · 3 commits
  6. 22 Mar 2023 · 3 commits
  7. 21 Mar 2023 · 3 commits
  8. 20 Mar 2023 · 7 commits
    • A
      [CodeStyle][UP008] remove super call with parameters (#51812) · 81f3f6b5
      Committed by Ainavo
      * remove super call with parameters
      
      * fix bug
      81f3f6b5
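      For context, UP008 flags the Python 2 style super(Class, self) call; inside a method body in Python 3 the zero-argument form is equivalent. A self-contained illustration:

      ```python
      class Layer:                          # stand-in base class for illustration
          def __init__(self):
              self.initialized = True

      class Linear(Layer):
          def __init__(self):
              # before the cleanup: super(Linear, self).__init__()
              super().__init__()            # after: the zero-argument form

      assert Linear().initialized
      ```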
    • X
      [prim] New layer_norm grad (#51750) · 802a81d0
      Committed by xiaoguoguo626807
      * Add flatten composite rule
      
      * get the right xshape and pass func test
      
      * add cinn unit test
      
      * Remove cinn test, wait for it to be added after repair
      
      * add comp test to test_flatten_contiguous_range_op.py
      
      * remove func test on composite_ops
      
      * Add comments to maybe_wrap_dim func
      
      * remove commented code
      
      * fix the problem with 0D tensor case
      
      * add flatten split rule comment
      
      * fix syntax issues
      
      * block flatten on resnet_prim_cinn
      
      * init change
      
      * tmp commit
      
      * add layer_norm InferMeta check
      
      * cast type modify
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      * add unittest
      
      * fix typo
      
      * fix typo
      
      * fix map.at
      
      * fix find
      
      * fix test
      
      * fix cinn cache key structure realize
      
      * using ordered map for attributes
      
      * add test by review advice
      
      ---------
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * Pr 50885 (#7)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      * add unittest
      
      * fix typo
      
      * fix typo
      
      * fix map.at
      
      * fix find
      
      * fix test
      
      * fix cinn cache key structure realize
      
      * using ordered map for attributes
      
      * add test by review advice
      
      ---------
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix code in a dy2static-friendly way.
      
      * [dystatic] add hooker for prim
      
      ---------
      Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      Co-authored-by: cxxly <chenxx_id@163.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix cast prim and vjp dtype mapping error bug
      
      * recover
      
      * big tol
      
      * Cxx prim custom vjp (#8)
      
      * [dy2static-ci] fix dy2static ci errors.
      
      ---------
      Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      Co-authored-by: cxxly <chenxx_id@163.com>
      
      * [Prim] enable whitelist and blacklist for custom_vjp
      
      * debug log
      
      * clear log
      
      * fix
      
      * nothing
      
      * less memory
      
      * recover utils
      
      * fix
      
      * modify threshold value
      
      * skip layer_norm for test_bert
      
      * back to bert success state
      
      * add epsilon
      
      * delete unnecessary compute
      
      * modify amp dtype
      
      * modify multiplication order
      
      * delete sqrt check and fp16
      
      ---------
      Co-authored-by: xuyongsheng <xuyongsheng@baidu.com>
      Co-authored-by: xysheng-baidu <121540080+xysheng-baidu@users.noreply.github.com>
      Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      Co-authored-by: cxxly <chenxx_id@163.com>
      Co-authored-by: xiongkun <807377414@qq.com>
      802a81d0
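      The "prim" mechanism referenced above rewrites a fused op into primitive ops so that gradients (and transforms like custom_vjp) can be derived from the decomposition. A NumPy sketch of the kind of forward decomposition a layer_norm composite rule performs; names and the epsilon default are illustrative:

      ```python
      import numpy as np

      def layer_norm_composite(x, scale, bias, epsilon=1e-5):
          """layer_norm over the last axis, expressed in primitive ops."""
          mean = x.mean(axis=-1, keepdims=True)
          var = ((x - mean) ** 2).mean(axis=-1, keepdims=True)
          rstd = 1.0 / np.sqrt(var + epsilon)   # rsqrt as a primitive
          y = (x - mean) * rstd
          return y * scale + bias
      ```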
    • G
      [fluid clean] Move out layers and layers helper (#49415) · 1d5cad23
      Committed by GGBond8488
      * remove no used fluid beam_search_decoder
      
      * move Layer and related helper to paddle.nn.common
      
      * modify Layer references from dygraph.layers.Layer to paddle.nn.common.layers
      
      * stash change
      
      * remove fluid layer_object_helper, layers.py
      
      * remove fluid layers init
      
      * add setup
      
      * fix unittest
      
      * delete layers in fluid.dygraph
      
      * merge paddle.tensor.stat.py
      
      * fix circular import

      * fix circular import
      
      * remove redundant in_dygraph_mode import
      
      * revoke paddle.nn.common.* in fluid.__init__
      
      * recover nn.rnn
      
      * use lazy import of paddle.jit in paddle.framework to avoid circular import
      
      * remove left dygraph.layers ref
      
      * merge develop
      
      * fix import error
      
      * fix test error
      
      * fix merge error
      
      * fix test fluid.Layer
      
      * fix test error
      
      * fix test error
      
      * fix import error
      
      * fix import error
      
      * fix comments
      
      * fix circular import

      * fix rnn import error

      * fix circular import
      1d5cad23
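      Per the commit message above, internal references move from fluid's dygraph.layers.Layer to paddle.nn.common.layers, while user code keeps working through the public alias. An illustrative before/after (module paths as given in the commit message; treat the internal path as an assumption about the final layout):

      ```python
      # before (old internal path): from paddle.fluid.dygraph.layers import Layer
      # after  (per the commit msg): from paddle.nn.common.layers import Layer
      import paddle

      class MyNet(paddle.nn.Layer):   # the public alias is unaffected by the move
          def __init__(self):
              super().__init__()
      ```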
    • W
      add composite rules for squeeze op (#51539) · 89ff0d59
      Committed by warrentdrew
      * add composite rule for squeeze
      
      * fix pre commit
      
      * fix pre commit
      
      * simplify rules
      
      * arrange code
      
      * fix int axis
      
      * simplify squeeze axis rules
      
      * bugfix
      
      * fix pre commit
      89ff0d59
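      A sketch of what a composite rule for squeeze can lower to: normalize the axis argument (the "fix int axis" item above) and emit a single reshape. NumPy stand-in; the real rule targets Paddle's prim ops:

      ```python
      import numpy as np

      def squeeze_composite(x, axes=None):
          if axes is None:
              keep = [d for d in x.shape if d != 1]    # drop every size-1 dim
          else:
              if isinstance(axes, int):                # accept a bare int axis
                  axes = [axes]
              axes = [a % x.ndim for a in axes]        # normalize negative axes
              keep = [d for i, d in enumerate(x.shape)
                      if not (i in axes and d == 1)]
          return x.reshape(keep)

      assert squeeze_composite(np.zeros((2, 1, 3)), 1).shape == (2, 3)
      ```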
    • G
      Fluid clean move out fill constant (#49511) · c985b1ac
      Committed by GGBond8488
      * migrate fill_constant to paddle.tensor
      
      * move fill_constant to paddle.tensor and replace the reference
      
      * add missing fill_constant replacement
      
      * fix typo
      
      * remove unused import fill_constant
      
      * fix zeros import error
      
      * fix circle import
      
      * fix layers.zeros
      
      * fix unittest

      * fix unittests

      * fix unittest
      
      * use paddle.full replace fill_constant in samplecode
      
      * fix sample code
      
      * recover xpu test

      * recover xpu test
      
      * fix circle import
      
      * fix utils import error
      
      * fix utils error
      
      * fix circle import
      
      * redo
      
      * fix circle import
      
      * fix prim fill constant import
      
      * fix type error
      
      * fix increase error
      
      * fix test error
      
      * fix fill_constant
      c985b1ac
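      The sample-code replacement mentioned above swaps fluid's fill_constant for paddle.full, its documented public equivalent:

      ```python
      import paddle

      # before: paddle.fluid.layers.fill_constant(shape=[2, 3], dtype='float32', value=1.0)
      x = paddle.full(shape=[2, 3], fill_value=1.0, dtype='float32')
      ```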
    • J
      support relu custom vjp (#51742) · 604b7a53
      Committed by Jiabin Yang
      604b7a53
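      A custom VJP supplies the backward rule directly instead of deriving it from a decomposition. A minimal NumPy sketch of what a relu VJP computes (illustrative; not Paddle's registration API):

      ```python
      import numpy as np

      def relu(x):
          return np.maximum(x, 0.0)

      def relu_vjp(x, grad_out):
          # pass the gradient through only where the input was positive
          return grad_out * (x > 0.0)
      ```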
  9. 17 Mar 2023 · 4 commits
  10. 16 Mar 2023 · 2 commits
  11. 15 Mar 2023 · 4 commits
    • K
      feat: add rsqrt composite rule (#51432) · c9ca7c35
      Committed by Kang Zhao
      * feat: add relu composite rule
      
      * feat: add relu composite rule, maximum op
      
      * feat: add relu composite rule, maximum op
      
      * feat: add relu composite rule, polish comments
      
      * feat: add relu composite rule, polish comments
      
      * feat: add relu composite rule, add python api of relu
      
      * feat: add relu composite rule, commit hook
      
      * fix: maximum type error & ban cinn test
      
      * fix: maximum input sequence bugs
      
      * resolve conflicts
      
      * fix: code style bugs
      
      * add: relu fp16 test
      
      * feat: add rsqrt composite rule
      
      * feat: add rsqrt composite rule
      
      * resolve conflicts of composite rule
      
      * fix: delete check eager
      c9ca7c35
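      Both rules named in the log above are one-liners once expressed with existing primitives; a NumPy sketch (illustrative):

      ```python
      import numpy as np

      def relu_composite(x):
          return np.maximum(x, 0.0)    # relu lowered to a maximum primitive

      def rsqrt_composite(x):
          return 1.0 / np.sqrt(x)      # rsqrt(x) = x ** (-1/2)
      ```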
    • J
      [Prim] Support amp logic for layer_norm and softmax (#51473) · 64076727
      Committed by Jiabin Yang
      * support amp logic for layer_norm and softmax
      
      * fix layer_norm amp
      
      * fix layernorm api and dropout fp16
      
      * fix layernorm api and dropout fp16
      
      * fix bn, ln dtype in float16
      
      * fix dropout fp16
      
      * fix comment
      64076727
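      The amp handling described above runs the decomposed computation in float32 when the input arrives as float16, then casts the result back. A NumPy sketch using softmax as the example (dtype plumbing illustrative):

      ```python
      import numpy as np

      def softmax_composite_amp(x):
          orig_dtype = x.dtype
          if orig_dtype == np.float16:
              x = x.astype(np.float32)   # upcast for numerical stability
          m = x.max(axis=-1, keepdims=True)
          e = np.exp(x - m)
          y = e / e.sum(axis=-1, keepdims=True)
          return y.astype(orig_dtype)    # restore the incoming dtype
      ```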
    • C
      [Prim] add pow composite rule (#51070) · 2d9e103e
      Committed by chenjian
      * add pow composite rule
      
      * fix test
      
      * fix unit test
      
      * update test
      
      * fix test
      
      * update
      2d9e103e
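      A sketch of the form a pow composite rule and its gradient take for a scalar exponent (NumPy stand-in; the registered rule emits Paddle prim ops):

      ```python
      import numpy as np

      def pow_composite(x, y):
          return x ** y

      def pow_vjp(x, y, grad_out):
          return grad_out * y * x ** (y - 1)   # d/dx x**y = y * x**(y-1)
      ```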
    • W
      refine amp scaler (#51340) · 1e232e27
      Committed by wanghuancoder
      * refine _found_inf
      1e232e27
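      For context, _found_inf is the flag an amp grad scaler uses to decide whether to skip the optimizer step after unscaling gradients. A minimal sketch of the check (illustrative names; not Paddle's internals):

      ```python
      import numpy as np

      def unscale_and_check(grads, loss_scale):
          found_inf = False
          for g in grads:
              g /= loss_scale                  # unscale in place
              if not np.isfinite(g).all():
                  found_inf = True
          return found_inf                     # caller skips step() when True
      ```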
  12. 14 Mar 2023 · 7 commits