1. 20 Mar 2023, 18 commits
    • [AMP OP&Test] Norm bf16 (#51083) · 90cb9a0d
      201716010711 committed
    • [CodeStyle][UP008] remove super call with parameters (#51812) · 81f3f6b5
      Ainavo committed
      * remove super call with parameters
      
      * fix bug
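For context, Ruff/pyupgrade rule UP008 flags the Python 2-style super(Class, self) call, which is redundant inside a class body in Python 3. A minimal before/after sketch (the class names are illustrative, not taken from the PR):

```python
class BaseLayer:
    def __init__(self):
        self.initialized = True

# Python 2-style call that UP008 rewrites (illustrative classes, not from the PR)
class OldStyleBlock(BaseLayer):
    def __init__(self):
        super(OldStyleBlock, self).__init__()

# Equivalent zero-argument form after the cleanup; behaves identically in Python 3
class NewStyleBlock(BaseLayer):
    def __init__(self):
        super().__init__()

assert OldStyleBlock().initialized and NewStyleBlock().initialized
```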
    • [prim] New layer_norm grad (#51750) · 802a81d0
      xiaoguoguo626807 committed
      * Add flatten composite rule
      
      * get the right xshape and pass func test
      
      * add cinn unit test
      
      * Remove cinn test, wait for it to be added after repair
      
      * add comp test to test_flatten_contiguous_range_op.py
      
      * remove func test on composite_ops
      
      * Add comments to maybe_wrap_dim func
      
      * remove commented code
      
      * fix the problem with 0D tensor case
      
      * add flatten split rule comment
      
      * fix syntax issues
      
      * block flatten on resnet_prim_cinn
      
      * init change
      
      * tmp commit
      
      * add layer_norm InferMeta check
      
      * cast type modify
      
* [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)

* [CINN]Enhance CacheKey hash logic by considering input dtypes

* add unittest

* fix typo

* fix typo

* fix map.at

* fix find

* fix test

* fix cinn cache key structure realize

* using ordered map for attributes

* add test by review advice

---------
Co-authored-by: jiangcheng <thisjiang@qq.com>

* [prim] enable dygraph_to_static to support custom_vjp

* Pr 50885 (#7)

* fix code in a dy2static-friendly way.

* [dystatic] add hooker for prim

---------
Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
Co-authored-by: jiangcheng <thisjiang@qq.com>
Co-authored-by: cxxly <chenxx_id@163.com>

* fix cast prim and vjp dtype mapping error bug

* recover

* big tol

* Cxx prim custom vjp (#8)

* [dy2static-ci] fix dy2static ci errors.
      
      * [Prim] enable whitelist and blacklist for custom_vjp
      
      * debug log
      
      * clear log
      
      * fix
      
      * nothing
      
      * less memory
      
      * recover utils
      
      * fix
      
      * modify threshold value
      
      * skip layer_norm for test_bert
      
      * back to bert success state
      
* add epsilon
      
      * delete unnecessary compute
      
      * modify amp dtype
      
      * modify * order
      
      * delete sqrt check and fp16
      
---------
Co-authored-by: xuyongsheng <xuyongsheng@baidu.com>
Co-authored-by: xysheng-baidu <121540080+xysheng-baidu@users.noreply.github.com>
Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
Co-authored-by: jiangcheng <thisjiang@qq.com>
Co-authored-by: cxxly <chenxx_id@163.com>
Co-authored-by: xiongkun <807377414@qq.com>
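For readers tracking the composite/prim work above: a layer_norm grad rule is derived from a decomposition of the forward pass into primitive ops. The sketch below shows only that forward decomposition, under the usual definition y = (x - mean) / sqrt(var + eps) * scale + bias; it is an illustration, not the rule added by this PR, and the helper name is made up.

```python
import paddle

def layer_norm_composite(x, scale, bias, epsilon=1e-5, begin_norm_axis=1):
    # Normalize over the trailing dims starting at begin_norm_axis,
    # expressed with primitive ops (mean, square, rsqrt, elementwise math).
    axes = list(range(begin_norm_axis, x.ndim))
    mean = paddle.mean(x, axis=axes, keepdim=True)
    var = paddle.mean(paddle.square(x - mean), axis=axes, keepdim=True)
    out = (x - mean) * paddle.rsqrt(var + epsilon)
    return out * scale + bias

x = paddle.randn([4, 8, 16])
scale = paddle.ones([8, 16])
bias = paddle.zeros([8, 16])
y = layer_norm_composite(x, scale, bias)
print(y.shape)  # [4, 8, 16]
```

Differentiating this decomposition with respect to x, scale, and bias is what a "new layer_norm grad" composite rule amounts to.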
    • [CodeStyle][UP004] remove useless object inheritance (#51771) · 9983892e
      Ainavo committed
      * add_up004_for_ruff
      
* modify the config file and remove object inheritance
      
      * fix md
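For context, Ruff rule UP004 drops the explicit object base class, which every Python 3 class already has; a before/after sketch with an illustrative class name:

```python
# Before: Python 2-style explicit inheritance from object
class ConfigHolder(object):
    pass

# After: equivalent in Python 3, where every class is already new-style
class ConfigHolder:
    pass
```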
    • [fluid clean] Move out layers and layers helper (#49415) · 1d5cad23
      GGBond8488 committed
* remove unused fluid beam_search_decoder
      
      * move Layer and related helper to paddle.nn.common
      
      * modify Layer references from dygraph.layers.Layer to paddle.nn.common.layers
      
* stash change
      
      * remove fluid layer_object_helper, layers.py
      
      * remove fluid layers init
      
* add setup

* fix unittest
      
      * delete layers in fluid.dygraph
      
* merge paddle.tensor.stat.py

* fix circular import

* fix circular import
      
      * remove redundant in_dygraph_mode import
      
* revoke paddle.nn.common.* in fluid.__init__

* recover nn.rnn

* paddle.frame uses lazy import of paddle.jit to avoid a circular import

* remove leftover dygraph.layers refs
      
      * merge develop
      
      * fix import error
      
      * fix test error
      
* fix merge error
      
      * fix test fluid.Layer
      
      * fix test error
      
      * fix test error
      
      * fix import error
      
      * fix import error
      
      * fix comments
      
* fix circular import

* fix rnn import error

* fix circular import
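As a small illustration of the public entry point that replaces the old fluid.dygraph location, user code keeps subclassing the same Layer class through paddle.nn (the tiny network below is illustrative, not code from the PR):

```python
import paddle

# Subclass the Layer base class via its public paddle.nn entry point,
# rather than the old fluid.dygraph location.
class TinyNet(paddle.nn.Layer):
    def __init__(self):
        super().__init__()
        self.fc = paddle.nn.Linear(4, 2)

    def forward(self, x):
        return self.fc(x)

net = TinyNet()
out = net(paddle.randn([3, 4]))
print(out.shape)  # [3, 2]
```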
    • add composite rules for squeeze op (#51539) · 89ff0d59
      warrentdrew committed
      * add composite rule for squeeze
      
      * fix pre commit
      
      * fix pre commit
      
      * simplify rules
      
      * arrange code
      
      * fix int axis
      
      * simplify squeeze axis rules
      
      * bugfix
      
      * fix pre commit
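A composite rule rewrites an op in terms of simpler primitives; for squeeze the usual approach is to compute the squeezed shape and fall back to reshape. The sketch below follows that idea and is illustrative only, not the rule added by this PR (the helper name and axis handling are assumptions):

```python
import paddle

def squeeze_composite(x, axes=None):
    """Illustrative squeeze-as-reshape rule: drop size-1 dims, then reshape."""
    ndim = x.ndim
    if not axes:  # empty axis list: drop every size-1 dim
        keep = [d for d in x.shape if d != 1]
    else:
        axes = [a + ndim if a < 0 else a for a in axes]  # normalize negative axes
        keep = [d for i, d in enumerate(x.shape) if not (i in axes and d == 1)]
    return paddle.reshape(x, keep)

x = paddle.zeros([2, 1, 3, 1])
print(squeeze_composite(x, [1]).shape)  # [2, 3, 1]
print(squeeze_composite(x).shape)       # [2, 3]
```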
    • Adjust tolerance with modi grad (#51791) · 2c543193
      Vvsmile committed
    • add sigmoid custom grad for prim (#51768) · ac47d003
      Weilong Wu committed
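A custom VJP for sigmoid typically reuses the forward output, since d sigmoid(x)/dx = sigmoid(x) * (1 - sigmoid(x)). A hedged sketch of such a rule, checked against autograd (illustrative only, not the code added by this PR):

```python
import paddle

def sigmoid_vjp(out, grad_out):
    # Gradient of sigmoid expressed through its forward output:
    # dx = grad_out * out * (1 - out)
    return grad_out * out * (1.0 - out)

x = paddle.to_tensor([0.0, 1.0, -2.0], stop_gradient=False)
out = paddle.nn.functional.sigmoid(x)
manual = sigmoid_vjp(out, paddle.ones_like(out))
auto, = paddle.grad(out, x, grad_outputs=paddle.ones_like(out))
print(paddle.allclose(manual, auto))  # True up to float tolerance
```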
    • [Hackathon NO.71] Add pad3d op for Paddle-TRT (#50986) · c36e3fd2
      Sonder committed
      * update codes about pad3d
      
      * add codes about Tensor type Padding
      
      * update
      
* update the unit test file

* format code style

* update 'and' to '&&'
      
      * rewrite codes about pad3d
      
      * add codes about converting paddle pad format to tensorrt pad format
      
      * fix some errors
      
* specify the TRT version range

* fix the dims initialization
      
      * fix code style
      
      * update test pad values
      
* specify the pad3d TRT version

* update the unit test file scope

* update the unit test file
      
      * update pad3d paddings convert codes
      
      * update pad3d
      
      * add static mode support
      
      * update test file
      
      * fix bugs about dynamic mode test codes
      
* fix bug and add limit in op_teller
      
* use a new padding conversion method (ITensor* paddings, using Slice to split the pre_pad and the post_pad)

* fix PADDLE_THROW grammar error
      
      * update test codes
      
* add a size check for Tensor paddings
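For orientation, Paddle's pad3d takes a 6-element paddings list, conventionally [left, right, top, bottom, front, back] for NCDHW input, while a TensorRT-style converter wants separate pre- and post-padding per spatial dim (D, H, W). A hedged sketch of that reordering (pure Python, illustrative names, not the converter's actual code, and the ordering is an assumption based on the public pad API):

```python
def pad3d_to_trt_pre_post(paddings):
    """Reorder a pad3d paddings list [left, right, top, bottom, front, back]
    into per-dimension (D, H, W) pre/post padding."""
    left, right, top, bottom, front, back = paddings
    pre_pad = (front, top, left)      # padding added before each spatial dim
    post_pad = (back, bottom, right)  # padding added after each spatial dim
    return pre_pad, post_pad

# Example: pad only the width dimension by 1 on each side.
print(pad3d_to_trt_pre_post([1, 1, 0, 0, 0, 0]))  # ((0, 0, 1), (0, 0, 1))
```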
    • fill_constant_batch_size_like support bf16 (#51396) · 2a0bd17c
      FormlessUnit committed
      shape support bf16
    • Add fp16 and bf16 to the checking dtype list of rand apis. (#51684) · b2385821
      Yiqun Liu committed
      * Add fp16 and bf16 to the checking dtype list of rand apis.
      
      * Remove the checking of raising TypeError.
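In practical terms the change is about the dtype whitelist check in the random-number APIs: a call like the one below is no longer rejected for its dtype. Whether the kernel then runs depends on the backend, so treat this as a hedged sketch rather than guaranteed behavior on every device:

```python
import paddle

# Request half precision directly from the rand API; before the change the
# dtype check would raise a TypeError for 'float16' / 'bfloat16'.
x = paddle.rand([2, 3], dtype='float16')
print(x.dtype)  # paddle.float16
```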
    • Fix unsqueeze with empty axis bug (#51828) · 7a79fd88
      zhouweiwei2014 committed
    • skip cpu test for test_index_select (#51741) · 6ac7cabe
      Roc committed
    • rm restrict of platform (#51806) · ca364e14
      cyber-pioneer committed
    • Fluid clean move out fill constant (#49511) · c985b1ac
      GGBond8488 committed
      * migrate fill_constant to paddle.tensor
      
* move fill_constant to paddle.tensor and replace the references
      
      * add missing fill_constant replacement
      
* fix typo
      
      * remove unused import fill_constant
      
      * fix zeros import error
      
* fix circular import
      
      * fix layers.zeros
      
* fix unittest

* fix unittests

* fix unittest
      
* use paddle.full to replace fill_constant in sample code
      
      * fix sample code
      
* recover xpu test

* recover xpu test

* fix circular import
      
      * fix utils import error
      
      * fix utils error
      
* fix circular import

* redo

* fix circular import
      
      * fix prim fill constant import
      
      * fix type error
      
      * fix increase error
      
      * fix test error
      
      * fix fill_constant
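The replacement pattern referred to in the message (fill_constant swapped for the public 2.x API in sample code) looks roughly like this; a minimal sketch of the mapping, with the old call shown only as a comment:

```python
import paddle

# Old fluid style removed from samples by this change:
#   out = fluid.layers.fill_constant(shape=[2, 3], dtype='float32', value=1.0)
# Public 2.x replacement:
out = paddle.full(shape=[2, 3], fill_value=1.0, dtype='float32')
print(out)
```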
    • 318c401e
    • support relu custom vjp (#51742) · 604b7a53
      Jiabin Yang committed
2. 17 Mar 2023, 14 commits
3. 16 Mar 2023, 8 commits