1. May 15, 2020 (4 commits)
    • SamplingID Op fix error print (#24521) (#24552) · f6050dac
      Authored by Jiawei Wang
      * fix error print for sampling_id_op
      
      * fix spelling error
      
      * fix spelling error test=develop
    • fix error check (#24483) (#24531) · 6f65b078
      Authored by xujiaqi01
      * fix error check of stack and space_to_depth
      * test=develop
    • [cherry-pick][Dy2stat] training with @declarative decorator and save_inference_model (#24557) · a20ce3ee
      Authored by Aurelius84
      * [Dy2Stat] Add test for ptb model. (#24076)
      
      * [Dy2Stat] Add test for ptb model. test=develop
      
      * Simplify code for gast.If in is_control_flow_to_transform. test=develop
      
      * Move IsControlFlowVisitor to file utils. test=develop
      
      * Don't use convert_call for built-in func in CallTransformer. test=develop
      
      * Optimize api is_control_flow_to_transform. test=develop
      
      * Polish the document of IsControlFlowVisitor. test=develop
      
      * Use declarative instead of dygraph_to_static_func. test=develop
      
      * [dy2static] Add print transformer and unify print format (#24068)
      
      * add print transformer & unify print format, test=develop
      
      * remove using of dygraph_to_static_func, test=develop
      
      * remove python stdout capture, test=develop
      
      * fix compatibility problems for PY2, test=develop
      
      * fix detail error, test=develop
      
      * fix type analysis bug, test=develop
      
      * fix print tuple compatibility error in PY2, test=develop
      
      * replace get_func to declarative, test=develop
      
      * fix detail bug, test=develop
      
      * fix some detail problems, test=develop
      
      * change visit_call in print transformer, test=develop
      
      * [dy2static] Support for static graph training with @declarative decorator (#24259)
      
      * support training in static graph mode
      
      * support declarative as an independent decorator
      
      * remove in_dygraph_mode condition in ProgramTranslator
      
      * fix import param_guard and add train/eval test=develop
      
      * Modify into ShareVarsFromScope and rm __all__ in partial_program test=develop
      
      * [Dy2Stat] Optimize loop cond (#24049)
      
      * Simplify code for gast.If in is_control_flow_to_transform.
      * Move IsControlFlowVisitor to file utils. 
      * Don't use convert_call for built-in func in CallTransformer. 
      * Optimize api is_control_flow_to_transform. 
      * Polish the document of IsControlFlowVisitor.
      
      * revert modification from #24259
      
      * [dy2stat]Support save_inference_model in program_translator (#24353)
      
      * support save_inference_model in program_translator test=develop
      
      * fix compatibility with OrderedDict.values() in python3 test=develop
      
      * synchronized random_seed test=develop
      
      * Polish Error Message test=develop
      
      * Fix bug with `if Tensor` in is_control_flow (#24433)
      
      * fix bug with `if Tensor` in is_control_flow test=develop
      
      * remove continue test=develop
      
      * Revert "[dy2static] Add print transformer and unify print format (#24068)"
      
      This reverts commit 09dd0190.
      
      * fix sample code in save_inference_model test=develop
      Co-authored-by: liym27 <33742067+liym27@users.noreply.github.com>
      Co-authored-by: Chen Weihang <chenweihang@baidu.com>
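      The entry above (#24557) combines static-graph training via the @declarative decorator (#24259) with ProgramTranslator.save_inference_model (#24353). Below is a minimal sketch of using the two together, assuming the Paddle 1.8-era fluid API; the SimpleNet layer, shapes, and output directory are illustrative, not taken from the PRs:

        import numpy as np
        import paddle.fluid as fluid
        from paddle.fluid.dygraph import Linear, ProgramTranslator, declarative

        class SimpleNet(fluid.dygraph.Layer):
            def __init__(self):
                super(SimpleNet, self).__init__()
                self.linear = Linear(10, 3)

            @declarative  # transcribe this dygraph forward into a static graph
            def forward(self, x):
                return self.linear(x)

        with fluid.dygraph.guard():
            net = SimpleNet()
            sgd = fluid.optimizer.SGD(learning_rate=0.01,
                                      parameter_list=net.parameters())
            for _ in range(3):  # a few toy training steps on random data
                x = fluid.dygraph.to_variable(
                    np.random.rand(4, 10).astype('float32'))
                loss = fluid.layers.mean(net(x))
                loss.backward()
                sgd.minimize(loss)
                net.clear_gradients()

            # Export the transcribed static graph for inference (#24353).
            ProgramTranslator().save_inference_model('./saved_infer_model')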
  2. May 14, 2020 (14 commits)
  3. May 13, 2020 (17 commits)
  4. May 12, 2020 (2 commits)
  5. May 11, 2020 (2 commits)
  6. April 30, 2020 (1 commit)
    • Fix double_grad bug in static-graph (#24190) (#24286) · 0231f58e
      Authored by qingqing01
      * Rename internal gradient variables in repeated backward passes so that their names differ from those created by the previous backward pass. For example, given y = x * x and grad = fluid.gradients(fluid.gradients(y, x) + y * y, x), the gradient variables of the partial forward network (y * y) in the second backward pass could otherwise get the same names as those from the first fluid.gradients(y, x).
      
      test=develop
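      The fix gives gradient variables created by a second fluid.gradients call fresh names so they no longer clash with the first call's. A minimal static-graph sketch of the pattern from the commit message, assuming the Paddle 1.x fluid interface (note that fluid.gradients returns a list of Variables, so the inner result is indexed here):

        import paddle.fluid as fluid

        main_prog = fluid.Program()
        startup_prog = fluid.Program()
        with fluid.program_guard(main_prog, startup_prog):
            x = fluid.data(name='x', shape=[None, 1], dtype='float32')
            x.stop_gradient = False
            y = x * x

            # First backward pass: dy/dx.
            dx = fluid.gradients(y, x)

            # The second backward pass differentiates dx + y*y. It re-traverses
            # the partial forward network y*y, whose gradient variables need
            # fresh names to avoid colliding with the first pass's.
            grad = fluid.gradients(dx[0] + y * y, x)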