    [DoubleGrad PR #8] Enabled triple grads for sigmoid and matmul (#41387) · d8a10977
    Committed by Zhanlue Yang
    * [Refactor] refactored eager_gen.py PR #2
    
    * [DoubleGrad PR #1] Decoupled code generation logics for Dygraph ForwardFunctions and GradNodes
    
    * Fixed minor issue
    
    * Adjusted logics of GenerateNodeCreationCodes and GenerateForwardDefinition
    
    * Fixed issues
    
    * Supported higher-order grad node generation
    
    * [DoubleGrad PR #4] Supported higher-order GradNode generation
    
    * [DoubleGrad #4] Bug Fixes to Double Grad Node Generation
    
    * Fixed yaml typo
    
    * Fixed yaml typo
    
    * Fixed minor issues
    
    * [DoubleGrad PR #5] Enabled gradient computations for grad_tensors passed to paddle.grad() (sketched after this list)
    
    * Fixed minor issue
    
    * Fixed CI-Inference issue
    
    * Fixed CI-inference issues
    
    * [DoubleGrad PR #7] Made paddle.grad() copy the backward graph before the backward run
    
    * Fixed minor issues
    
    * Fixed issue with backward graph construction logic
    
    * Fixed implementation issues with backward graph reconstruction
    
    * Fixed unittest issue
    
    * Fixed issues
    
    * [DoubleGrad PR #8] Enabled triple grads for sigmoid and matmul (sketched after this list)
    
    * Fixed issues with phi kernel
    
    * Added triple grad test case
    
    * Fixed minor issue
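
    As a rough illustration of the PR #5 and PR #7 items above (not code taken from this change), the sketch below passes a grad_outputs tensor to paddle.grad() and then differentiates through it; the tensor names and shapes are made up for the example.

    ```python
    import paddle

    x = paddle.rand([3])
    x.stop_gradient = False
    v = paddle.rand([3])
    v.stop_gradient = False  # the grad_tensor itself requires grad (PR #5)

    y = x * x

    # Weight dy/dx by v; create_graph=True keeps a differentiable backward graph
    # (the PR #7 item describes copying the backward graph before the backward run
    # so it can be traversed again).
    (dx,) = paddle.grad([y], [x], grad_outputs=[v], create_graph=True)

    # dx == 2 * x * v, so differentiating its sum w.r.t. v recovers 2 * x.
    (dv,) = paddle.grad([dx.sum()], [v])
    print(dv)  # approximately 2 * x
    ```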
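
    A minimal sketch of the headline feature, triple grads for sigmoid, obtained by chaining paddle.grad() calls with create_graph=True; this is an assumed usage pattern, not a test case from the PR.

    ```python
    import paddle
    import paddle.nn.functional as F

    x = paddle.rand([4])
    x.stop_gradient = False

    y = F.sigmoid(x)

    # First- and second-order grads keep the graph alive for the next order.
    (dx,) = paddle.grad([y], [x], create_graph=True)
    (ddx,) = paddle.grad([dx], [x], create_graph=True)

    # Third-order grad, relying on the sigmoid triple-grad kernel enabled here.
    (dddx,) = paddle.grad([ddx], [x])
    print(dddx.shape)  # [4]
    ```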