1. 08 Jan 2021 (1 commit)
    • Support pure fp16 training for AMP API. (#29544) · 7f7dfccf
      Committed by Zhen Wang
      * add cast ops before and after unsupported fp16 ops.
      
      * Keep partial net in FP32 pattern.
      
      * Support check_finite_and_unscale and update_loss_scaling for FP16 calculation mode.
      
      * Add fp16 support for adam op.
      
      * add multi precision attr for adam.
      
      * Fix the bug of test_multi_precision_fp16_train UT.
      
      * Code format for CI.
      
      * Fix the redefine error about MPTypeTrait on windows.
      
      * fix bugs of the _create_accumulators func in Momentum.
      
      * fix bug when inserting post cast op.
      
      * Add the update_loss_scaling op in allow_set of UnusedVarCheck.
      
      * Update for ci coverage.
      
      * Add some doc for OptimizerWithMixedPrecision.
      
      * Fix the code style.
      
      * Improve the doc of `amp_init`.
      
      * Change fp16 testing for the case where users define the inference program separately.
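      A minimal sketch of the pure-FP16 flow this commit describes, for orientation only. It assumes a Paddle 2.0-style static graph; `paddle.static.amp.decorate` and `amp_init` are the entry points the commit touches, but exact argument names and defaults are assumptions and may differ between versions.

          import paddle

          paddle.enable_static()
          main_prog, startup_prog = paddle.static.Program(), paddle.static.Program()
          with paddle.static.program_guard(main_prog, startup_prog):
              x = paddle.static.data(name='x', shape=[None, 784], dtype='float32')
              label = paddle.static.data(name='label', shape=[None, 1], dtype='int64')
              logits = paddle.static.nn.fc(x, size=10)
              loss = paddle.mean(paddle.nn.functional.cross_entropy(logits, label))

              optimizer = paddle.optimizer.Adam(learning_rate=1e-3)
              # use_pure_fp16=True casts the whole net to FP16, inserting cast ops
              # around unsupported ops and keeping a small FP32 pattern, as the
              # commit message describes.
              optimizer = paddle.static.amp.decorate(optimizer, use_pure_fp16=True)
              optimizer.minimize(loss)

          place = paddle.CUDAPlace(0)  # pure FP16 training targets GPU
          exe = paddle.static.Executor(place)
          exe.run(startup_prog)
          # amp_init (whose doc this commit improves) casts the FP32 parameters
          # created at startup to FP16 before training begins.
          optimizer.amp_init(place, scope=paddle.static.global_scope())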
  2. 05 Jan 2021 (1 commit)
  3. 21 Dec 2020 (1 commit)
    • Optimizer trans momentum (#29597) · a29006d1
      Committed by huangxu96
      * merge the AMP-related functionality of Momentum from paddle.fluid.contrib.optimizer into paddle.optimizer.
      
      * Add unittest for the 2.0 Momentum API.
      
      * fix some bugs in weight_decay.
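      A short, hedged sketch of the merged 2.0 Momentum API this commit produces. The multi_precision flag (FP32 master weights for FP16 parameters) and regularizer-style weight_decay are the AMP-related pieces being consolidated; treat the exact argument names as assumptions.

          import paddle

          linear = paddle.nn.Linear(10, 10)
          opt = paddle.optimizer.Momentum(
              learning_rate=0.1,
              momentum=0.9,
              parameters=linear.parameters(),
              weight_decay=paddle.regularizer.L2Decay(1e-4),
              multi_precision=True,  # keeps FP32 master weights; only matters for FP16 params
          )

          x = paddle.rand([4, 10])
          loss = paddle.mean(linear(x))
          loss.backward()
          opt.step()
          opt.clear_grad()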
  4. 01 Dec 2020 (2 commits)
  5. 24 Nov 2020 (1 commit)
    • Upgrade string literals to raw string (#28989) · 3815d7aa
      Committed by Leo Chen
      * upgrade comment string to raw string
      
      * fix string in
      
      * fix string with ' '
      
      * revert update on comments
      
      * upgrade only necessary
      
      * fix sample code checker
      
      * fix comments with '''
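      The migration above is mechanical but easy to get wrong, so a small illustration of why it matters: in a normal string, backslash sequences such as \s are treated as escape sequences (currently a warning in CPython, slated to become an error), while the raw-string prefix keeps them literal.

          # Problematic: '\s' is an invalid escape sequence in a normal string.
          doc = "Computes \sqrt{x} element-wise."   # emits a DeprecationWarning

          # Upgraded: the raw-string prefix keeps the backslash literal.
          doc = r"Computes \sqrt{x} element-wise."  # safe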
  6. 29 Aug 2020 (1 commit)
    • Adadelta Optimizer (#26590) · a1b99fae
      Committed by Jiawei Wang
      * add doc; notest
      
      * fix doc; notest
      
      * update doc; notest
      
      * refine optimizer && adam
      
      * refine optimizer; notest
      
      * add adam
      
      * fix doc
      
      * fix doc && add adamw; notest
      
      * add error message
      
      * bug fix
      
      * refine rmsprop && adamax
      
      * fix ci
      
      * bug fix
      
      * update comment
      
      * unify argument placement; notest
      
      * fix ut, test=develop
      
      * bug fix
      
      * fix conflicts, test=develop
      
      * add examples code
      
      * bug fix
      
      * fix comments
      
      * fix sample code
      
      * add sample code for Optimizer
      
      * add adamax ut, test=develop
      
      * fix rmsprop ut, test=develop
      
      * add ut for optimizer.py and adamw.py
      
      * first commit of adadelta optimizer
      
      * fix learning rate
      
      * fix adadelta doc and add sgd momentum
      
      * remove unused fluid
      
      * fix codestyle
      
      * Update test_adam_op.py
      
      * Update test_adam_op.py
      
      * fix SGD in 2 unittests
      
      * fix SGD in 2 unittests
      
      * fix ci
      
      * fix ut
      Co-authored-by: MRXLT <xlt2024@gmail.com>
      Co-authored-by: mapingshuo <mps2012@yeah.net>
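      For reference, a hedged sketch of the Adadelta optimizer this commit adds, via the paddle.optimizer API; the rho/epsilon values shown are assumptions about the defaults.

          import paddle

          linear = paddle.nn.Linear(10, 1)
          opt = paddle.optimizer.Adadelta(
              learning_rate=0.001,
              rho=0.95,       # decay rate of the running averages
              epsilon=1e-6,   # small constant for numerical stability
              parameters=linear.parameters(),
          )

          x = paddle.rand([4, 10])
          loss = paddle.mean(linear(x))
          loss.backward()   # compute gradients
          opt.step()        # apply the Adadelta update
          opt.clear_grad()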