    [AMP] Support pure fp16 training mode for dygraph (#35521) · adaeee4d
    Committed by zhangbo9674
    * add major pure fp16 functionality in auto_cast & tracer (see the training sketch after this list)
    
    * support master weight in dygraph for pure fp16
    
    * check mixed fp16 & fp32 dtypes for the check_finite_and_unscale op
    
    * change pure fp16 function name
    
    * fix some bugs in auto_cast
    
    * refine auto_cast interface logic
    
    * add param _casted_by_pure_fp16 to class Layer
    
    * support a state_dict hook in pure_fp16_decorator so the model can be saved in a user-appointed dtype (see the save_dtype sketch after this list)
    
    * refactor pure_fp16_decorator into a decorator
    
    * add unittest
    
    * add comment
    
    * add comment
    
    * support recompute
    
    * add comment for auto_cast and decorator
    
    * support to_static_state_dict for paddle.jit.save
    
    * remove the limit on the number of models and optimizers
    
    * add lookup_table to black_list
    
    * fix momentum and layer state_dict
    
    * fix bug in layer state_dict
    
    * fix bug in layer state_dict_helper
    
    * refine unittest
    
    * refine test_momentum_op
    
    * refine interface and some code
    
    * refine amp_decorator interface
    
    * refine pure fp16 interface
    
    * refine master weight interface
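    A minimal sketch of the pure fp16 (O2) dygraph training flow these commits build up, assuming the paddle.amp.decorate / paddle.amp.auto_cast / paddle.amp.GradScaler API shape the bullets name; the toy Linear model, data, and hyperparameters are illustrative only:

        import paddle

        model = paddle.nn.Linear(16, 16)
        optimizer = paddle.optimizer.Momentum(learning_rate=0.01,
                                              parameters=model.parameters())

        # Cast the model to fp16 and keep fp32 master weights in the optimizer.
        model, optimizer = paddle.amp.decorate(models=model, optimizers=optimizer,
                                               level='O2')

        # Loss scaling guards the fp16 backward pass against underflow.
        scaler = paddle.amp.GradScaler(init_loss_scaling=1024)

        for step in range(4):
            x = paddle.rand([4, 16])
            # Run ops in fp16; black-listed ops (e.g. lookup_table) stay fp32.
            with paddle.amp.auto_cast(level='O2'):
                loss = model(x).mean()
            scaled = scaler.scale(loss)
            scaled.backward()
            # Unscale grads, check for inf/nan, then apply the update.
            scaler.minimize(optimizer, scaled)
            optimizer.clear_grad()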
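    The state_dict hook mentioned above is exercised through a save dtype appointed at decoration time; a sketch, assuming decorate() accepts a save_dtype argument as the bullets suggest (the checkpoint path is illustrative):

        import paddle

        model = paddle.nn.Linear(16, 16)
        opt = paddle.optimizer.Momentum(learning_rate=0.01,
                                        parameters=model.parameters())

        # Live parameters become fp16, but the state_dict hook casts them
        # back to the appointed dtype when the model is saved.
        model, opt = paddle.amp.decorate(models=model, optimizers=opt,
                                         level='O2', save_dtype='float32')

        paddle.save(model.state_dict(), 'linear_fp32.pdparams')  # fp32 checkpoint

    Per the bullets, the same mechanism also covers to_static_state_dict so that paddle.jit.save produces checkpoints in the appointed dtype.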