1. 01 Apr, 2019 1 commit
  2. 31 Mar, 2019 2 commits
    • Add linear learning warmup method in learning rate scheduler. (#16563) · 1ebd7434
      Authored by qingqing01
      * Add linear learning warmup method
      
      This warmup lr can be combined with other learning rate strategies.
      For example:
                  decayed_lr = fluid.layers.linear_lr_warmup(
                      fluid.layers.piecewise_decay(boundaries, lr_steps),
                      warmup_steps, start_lr, end_lr)
      1ebd7434
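The warmup behaviour described in this commit can be sketched in plain Python. This is a minimal illustration, not PaddlePaddle's actual implementation: the function names echo the fluid API shown above, but the signatures used here (an explicit step argument, the wrapped schedule passed as a callable) are hypothetical simplifications.

```python
def linear_lr_warmup(decayed_lr, warmup_steps, start_lr, end_lr, step):
    """Minimal sketch of linear LR warmup: during the first `warmup_steps`
    steps the learning rate ramps linearly from `start_lr` to `end_lr`;
    afterwards the wrapped schedule (`decayed_lr`, a function of the
    step) takes over."""
    if step < warmup_steps:
        return start_lr + (end_lr - start_lr) * step / warmup_steps
    return decayed_lr(step)

def piecewise_decay(boundaries, values, step):
    """Piecewise-constant decay: returns values[i] for the interval
    the current step falls into."""
    for boundary, value in zip(boundaries, values[:-1]):
        if step < boundary:
            return value
    return values[-1]

# Example: warm up from 0.0 to 0.1 over 100 steps, then follow
# a piecewise decay schedule.
schedule = lambda step: piecewise_decay([1000, 2000], [0.1, 0.01, 0.001], step)
lr_at_50 = linear_lr_warmup(schedule, 100, 0.0, 0.1, 50)    # mid-warmup: 0.05
lr_at_500 = linear_lr_warmup(schedule, 100, 0.0, 0.1, 500)  # first decay interval: 0.1
```

In the real API the warmup wraps a schedule Variable rather than a callable, but the arithmetic per step is the same linear interpolation.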
    • Batch norm cudnn accurate (#16545) · 22b02bfa
      Authored by Wu Yi
      * fix cudnn batch norm accuracy test=develop
      
      * fix cudnn batch norm accuracy test=develop
      
      * disable failed test for later fix test=develop
      22b02bfa
  3. 30 Mar, 2019 1 commit
  4. 29 Mar, 2019 15 commits
  5. 28 Mar, 2019 6 commits
  6. 27 Mar, 2019 6 commits
  7. 26 Mar, 2019 8 commits
    • fix env variable setting bug · 78fb3a62
      Authored by sneaxiy
      test=develop
      78fb3a62
    • revert test_softmax_cudnn. test=develop · 7920e3be
      Authored by dengkaipeng
      7920e3be
    • Fix/test imperative ptb rnn (#16433) · 7c5319ba
      Authored by Jiabin Yang
      * test=develop, fix ptb rnn
      
      * test=develop, change cdn to bj to pass ci
      
      * test=develop, fix ci
      7c5319ba
    • add layer norm to Layers, add transformer test in imperative mode (#16092) · f735102e
      Authored by Jiabin Yang
      * add layer norm to Layers, add transformer prepare encoding
      
      * little change
      
      * finish encoder part
      
      * add decoder part
      
      * finish model part
      
      * add test case and part of data feed
      
      * add transformer test
      
      * add to_parameter, add remove in set_attr
      
      * test=develop, fix pos encoding bug, create_parameter with standard name
      
      * test=develop, rm dropout test in imperative
      
      * test=develop, fix cpu error
      
      * test=develop, fix minimize bug
      
      * test=develop, fix one hot not stop gradient
      
      * test=develop, fix one hot not stop gradient
      
      * test=develop, refine parameter name
      
      * test=develop, fix transformer test in imperative mode
      
      * test=develop, fix transformer test in imperative mode
      
      * test=develop, fix boost and mkl download error
      
      * test=develop, fix boost and mkl download error
      
      * test=develop, fix ci and refine code
      
      * test=develop, fix ci and refine code
      f735102e
    • polish · fd24ab47
      Authored by Xin Pan
      test=develop
      fd24ab47
    • update DeepCF model · 1f89249a
      Authored by Xin Pan
      test=develop
      1f89249a
    • fix some op grad maker · 7000ec85
      Authored by sneaxiy
      fix ctest eager deletion disable bug
      test=develop
      7000ec85
    • [slim] Add quantization strategy and distillation strategy. (#16408) · e9bec936
      Authored by whs
      * Add fsp operator.
      1. Add unit test.
      2. Add python API.
      3. Add layer test.
      
      * Add quantization strategy.
      1. Add API.
      2. Add unit test.
      
      * Add distillation strategy.
      
      * Add unit test config file for quantization
      
      * Fix Copyright
      test=develop
      
      * Fix setup.py
      
      * Fix document of layers.py.
      test=develop
      
      * Fix unit test in python3.
      test=develop
      
      * Fix documents.
      test=develop
      
      * 1. refine fsp op by batched gemm
      2. remove unused import
      test=develop
      
      * Fix test_dist_se_resnext.
      1. disable test distillation.
      2. reset framework.py
      test=develop
      
      * Enable unit test of distillation after fixing Block._clone_variable
      test=develop
      
      * Fix cdn issue.
      test=develop
      e9bec936
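The fsp operator added in this commit computes, per the knowledge-distillation literature, a matrix of channel-pair inner products between two feature maps, and the later "refine fsp op by batched gemm" item reduces that computation to a single batched matrix multiply. A minimal NumPy sketch, assuming that standard FSP definition (the function name and shapes here are illustrative, not Paddle's actual API):

```python
import numpy as np

def fsp_matrix(x, y):
    """Sketch of an FSP (flow of solution procedure) matrix: for feature
    maps x of shape (N, C1, H, W) and y of shape (N, C2, H, W), the FSP
    matrix is the (N, C1, C2) batch of channel-wise inner products
    averaged over the H*W spatial positions. Written as one batched
    GEMM, mirroring the batched-gemm refinement noted in the commit."""
    n, c1, h, w = x.shape
    c2 = y.shape[1]
    xf = x.reshape(n, c1, h * w)                           # (N, C1, H*W)
    yf = y.reshape(n, c2, h * w)                           # (N, C2, H*W)
    return np.matmul(xf, yf.transpose(0, 2, 1)) / (h * w)  # (N, C1, C2)

# Toy check: all-ones feature maps give an FSP matrix of all ones,
# since each entry is the mean of H*W products of 1 * 1.
x = np.ones((2, 3, 4, 4))
y = np.ones((2, 5, 4, 4))
g = fsp_matrix(x, y)
print(g.shape)  # (2, 3, 5)
```

In the distillation strategy, the loss would then be a mean-squared difference between the teacher's and student's FSP matrices; that loss term is not shown here.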
  8. 25 Mar, 2019 1 commit