1. 03 April 2020, 1 commit
    • Fix learning rate scaling bug · 810ece8f
      Committed by Yang Zhang
      This bug is quite peculiar and hard to track down: when the learning rate for a
      parameter is scaled via `param_attr` and a learning rate scheduler is used,
      `append_optimizer_op` errors out complaining that its `LearningRate` input is null.
      
      It turns out learning rate scaling is done in `_create_param_lr`, which
      appends a scale op. The problem is that the op is appended to `orig_prog`
      (since the `global_learning_rate()` variable lives there), so the resulting
      scaled learning rate variable cannot be found in `train_prog`.
      
      The reason it worked previously without lr scaling is this:
      `clone()` creates a variable with the same name as the
      `global_learning_rate()` variable, and that same-named variable is what
      `append_optimizer_op` finds and uses.
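      Since the root cause is a name lookup across two programs, a toy sketch can
      model it (plain Python, not Paddle's real `Program` class; the variable
      names here are hypothetical stand-ins) to show why the scaled variable
      disappears while the unscaled one survives a `clone()`:

      ```python
      # Toy model of the bug (plain Python, NOT Paddle's real classes): two
      # "programs" hold variables by name, mirroring orig_prog / train_prog.
      class Program:
          def __init__(self):
              self.vars = {}

          def clone(self):
              # clone() re-creates variables with the same names
              cloned = Program()
              cloned.vars = dict(self.vars)
              return cloned

      orig_prog = Program()
      orig_prog.vars["learning_rate_0"] = 0.1   # the global_learning_rate() var

      train_prog = orig_prog.clone()            # also has "learning_rate_0"

      # without scaling: the optimizer looks up the lr by name and finds the
      # same-named variable in train_prog, so everything works
      assert train_prog.vars.get("learning_rate_0") is not None

      # with scaling: _create_param_lr appends a scale op to orig_prog (where
      # the global lr variable lives), so the scaled result exists only there
      orig_prog.vars["learning_rate_0_scaled"] = 0.5 * orig_prog.vars["learning_rate_0"]

      # append_optimizer_op then fails to find the scaled lr in train_prog,
      # which surfaces as the null `LearningRate` input
      assert train_prog.vars.get("learning_rate_0_scaled") is None
      ```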
  2. 02 April 2020, 2 commits
  3. 01 April 2020, 1 commit
  4. 31 March 2020, 1 commit
  5. 30 March 2020, 1 commit
  6. 27 March 2020, 2 commits
  7. 26 March 2020, 2 commits
  8. 25 March 2020, 4 commits
  9. 24 March 2020, 3 commits
  10. 23 March 2020, 5 commits
  11. 22 March 2020, 2 commits
  12. 21 March 2020, 2 commits
  13. 20 March 2020, 1 commit
  14. 18 March 2020, 1 commit
  15. 17 March 2020, 3 commits
  16. 16 March 2020, 1 commit
  17. 13 March 2020, 3 commits
  18. 11 March 2020, 1 commit
  19. 10 March 2020, 1 commit
  20. 09 March 2020, 1 commit
  21. 08 March 2020, 1 commit
  22. 05 March 2020, 1 commit