1. 22 Nov 2017, 1 commit
    • 07/Label semantic roles (#5798) · 53bd51e3
      Committed by Qiao Longfei (a sketch of the pre-trained embedding step follows this entry)
      * init label_semantic_roles.py
      
      * add linear_chain_crf and test
      
      * complete test_linear_chain_crf
      
      * correct last layer of db_lstm
      
      * update optimizer and initializer
      
      * update param_initializer of embedding_layer
      
* support loading pre-trained embedding
      
      * rm unused parameter
      
      * optimize code
      
      * clean code
      
      * fix test
      
      * add todo
      53bd51e3
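
      The "support loading pre-trained embedding" step above can be illustrated with a small plain-Python/numpy sketch. This is not the PaddlePaddle API; the vocab, the pretrained table, and the helper name are hypothetical stand-ins for the real data files and the embedding layer's param_initializer.

      # Minimal sketch: copy pre-trained vectors into an embedding table,
      # leaving out-of-vocabulary rows at their random initialization.
      import numpy as np

      def load_pretrained_embedding(vocab, pretrained, dim, scale=0.02, seed=0):
          """Build a (len(vocab), dim) matrix: rows for words found in
          `pretrained` are copied; the rest keep small random values."""
          rng = np.random.RandomState(seed)
          table = rng.uniform(-scale, scale, size=(len(vocab), dim)).astype("float32")
          for word, idx in vocab.items():
              vec = pretrained.get(word)
              if vec is not None:
                  table[idx] = np.asarray(vec, dtype="float32")
          return table

      # Hypothetical usage: the resulting array would be handed to the
      # embedding layer's parameter initializer.
      vocab = {"<unk>": 0, "mark": 1, "verb": 2}
      pretrained = {"mark": [0.1] * 32, "verb": [0.2] * 32}
      emb = load_pretrained_embedding(vocab, pretrained, dim=32)
      print(emb.shape)  # (3, 32)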
  2. 15 Nov 2017, 1 commit
  3. 14 Nov 2017, 1 commit
  4. 13 Nov 2017, 1 commit
  5. 10 Nov 2017, 1 commit
  6. 05 Nov 2017, 1 commit
  7. 02 Nov 2017, 1 commit
    • Optimizer use init program (#5275) · f48159ad
      Committed by Qiao Longfei (a sketch of the persistable accumulator pattern follows this entry)
      * optimizer use init_program
      
      * create persistable variable
      
      * add create_persistable_var to block
      
      * optimizer use create_persistable_var
      
      * fix prefix
      
      * move create_global_persistable_var from Block to LayerHelper
      
      * Polish Optimizer initialization code.
      
      * Using the LayerHelper to create initialize operator and variables
      
      * add_accumulator should use an independent data type
      
      * default use param data type for accumulator
      f48159ad
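
      A plain-Python sketch of the bookkeeping this commit describes: accumulators are created as persistable variables in a global/init block, defaulting to the parameter's own data type unless an independent dtype is requested. The class and method names below are hypothetical illustrations, not PaddlePaddle's.

      import numpy as np

      class GlobalBlock:
          """Stand-in for the init/global block that owns persistable variables."""
          def __init__(self):
              self.vars = {}

          def create_persistable_var(self, name, shape, dtype):
              self.vars[name] = np.zeros(shape, dtype=dtype)
              return self.vars[name]

      class ToyOptimizer:
          def __init__(self, block):
              self.block = block
              self._accumulators = {}  # accumulator name -> {param name -> array}

          def add_accumulator(self, name, param_name, param, dtype=None):
              # Default to the parameter's data type; allow an independent one
              # (e.g. an int64 step counter) when explicitly requested.
              dtype = dtype or param.dtype
              acc = self.block.create_persistable_var(
                  "%s_%s" % (param_name, name), param.shape, dtype)
              self._accumulators.setdefault(name, {})[param_name] = acc
              return acc

      block = GlobalBlock()
      opt = ToyOptimizer(block)
      w = np.ones((4, 2), dtype="float32")
      opt.add_accumulator("velocity", "w", w)             # float32, like w
      opt.add_accumulator("step", "w", w, dtype="int64")  # independent dtype
      print({k: v.dtype for k, v in block.vars.items()})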
  8. 28 Oct 2017, 1 commit
  9. 27 Oct 2017, 1 commit
  10. 26 Oct 2017, 2 commits
  11. 25 Oct 2017, 2 commits
  12. 21 Oct 2017, 1 commit
  13. 20 Oct 2017, 2 commits
  14. 18 Oct 2017, 1 commit
    • Impl optimizer (#4734) · df0946eb
      Committed by Qiao Longfei (a sketch of the optimizer skeleton follows this entry)
      * init parameter base class
      
      * optimize the Comments of optimizer
      
* basic implementation of optimizer
      
      * add test_optimizer
      
      * add no_grad_set to interface
      
      * update optimizer.py
      
      * python code can run
      
      * fix some problem
      
      * add sync_with_cpp to Python Program and Block
      
      * sync vars and ops in block from cpp
      
      * optimize code and add some comment
      
      * add more check for sync
      
      * update optimizer with return value of Backward
      
      * rm unused code
      
* infer shape when creating gradient variable
      
      * update test_optimizer
      
      * update test_program.py
      
      * update backward test
      
      * follow comment
      df0946eb
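
      The skeleton this commit introduces can be sketched in plain numpy: a base optimizer consumes the (parameter, gradient) pairs produced by the backward pass, skips anything in no_grad_set, and a concrete SGD subclass applies the update. All names below are illustrative assumptions, not PaddlePaddle's classes.

      import numpy as np

      class OptimizerBase:
          def create_optimization_pass(self, params_and_grads):
              for param_name, grad in params_and_grads:
                  if grad is None:  # parameter excluded from backward
                      continue
                  self._append_optimize_op(param_name, grad)

          def minimize(self, params, grads, no_grad_set=None):
              no_grad_set = no_grad_set or set()
              pairs = [(name, None if name in no_grad_set else grads.get(name))
                       for name in params]
              self.params = params
              self.create_optimization_pass(pairs)

          def _append_optimize_op(self, param_name, grad):
              raise NotImplementedError

      class SGD(OptimizerBase):
          def __init__(self, learning_rate):
              self.lr = learning_rate

          def _append_optimize_op(self, param_name, grad):
              self.params[param_name] -= self.lr * grad

      params = {"w": np.ones(3, dtype="float32"), "b": np.zeros(1, dtype="float32")}
      grads = {"w": np.full(3, 0.5, dtype="float32"), "b": np.full(1, 2.0, dtype="float32")}
      opt = SGD(learning_rate=0.1)
      opt.minimize(params, grads, no_grad_set={"b"})
      print(params["w"])  # updated; "b" is untouched because it is in no_grad_set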