Unverified commit e8d296ef, authored by HydrogenSulfate, committed by GitHub

Add jacobian and hessian (#53331)

* add jacobian and hessian to paddle.autograd (see the usage sketch after this change list)

* disable the 'func_multi_input' unit test due to a bug in the high-order gradient of multiply

* add dimension checks

* add support for 0-D tensors

* change the return type of the hessian function from Jacobian to Hessian

* refine Jacobian _flatten function for single xs

* refine support for 0-D tensors

* 1. add the 'func_multi_input' unit test back now that the multiply_grad_kernel bug has been fixed
2. support non-inplace math operations via magic method overriding

* add a unit test for math operations and raise an error when a 0-D tensor is indexed

* add ndim checks on ys and xs according to is_batched, and add one unit test

* refine the docstrings of jacobian and hessian

* move paddle.incubate.autograd.Jacobian/Hessian to paddle.incubate.autograd.functional.Jacobian/Hessian

* remove the single_input unit test case because its numerical differentiation result is wrong

* remove 3 unit tests because their numerical (reference) results are wrong

* 1. rename autodiff.py to autograd.py
2. increase TIMEOUT to 100

* revert the modifications to the functional Jacobian/Hessian

* 1. use a tuple instead of a list as the return type
2. refine the docstring

* add more unit test cases to improve coverage

* remove 2 Hessian unit tests because their numerical results are wrong

* remove 1 Hessian unit test because its numerical result is wrong

* remove 1 Hessian unit test because its numerical result is wrong

* change the unit test to a shape check

* correct the docs and replace the incubate API with the stable API in _grad
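
For orientation, here is a minimal usage sketch of the new public API described above. It is not part of the original commit message; the exact signatures and the lazy, slice-to-materialize behavior of the returned objects are assumptions inferred from the change notes.

```python
import paddle

# Single input, elementwise output.
x = paddle.randn([3])
x.stop_gradient = False   # enable gradient tracking for x
y = x * x                 # shape [3]

# Assumed API: paddle.autograd.jacobian(ys, xs) returns a lazy Jacobian
# object that is materialized into a Tensor when sliced.
J = paddle.autograd.jacobian(y, x)
print(J[:].shape)         # expected [3, 3] (diagonal Jacobian of x * x)

# hessian is assumed to require a 0-D (scalar) output, which this commit
# adds support for; the result is likewise materialized by slicing.
H = paddle.autograd.hessian(y.sum(), x)
print(H[:].shape)         # expected [3, 3] (2*I for sum(x * x))
```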
Parent 6768c6ec
@@ -18,12 +18,15 @@ from ..fluid.dygraph.base import no_grad_ as no_grad # noqa: F401
from ..fluid.dygraph.base import is_grad_enabled # noqa: F401
from ..fluid.dygraph.base import set_grad_enabled # noqa: F401
from . import backward_mode # noqa: F401
from .autograd import jacobian, hessian # noqa: F401
from .backward_mode import backward # noqa: F401
from .py_layer import PyLayer # noqa: F401
from .py_layer import PyLayerContext # noqa: F401
from .saved_tensors_hooks import saved_tensors_hooks
__all__ = [ # noqa
'jacobian',
'hessian',
'backward',
'PyLayer',
'PyLayerContext',
This diff is collapsed.
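
The collapsed diff above presumably contains the new implementation file (autograd.py, per the rename noted in the change list). Based on the notes about tuple return types, ndim checks, and the multi-input test, a hedged sketch of the multi-input Hessian case might look like this; the nested-tuple indexing is an assumption.

```python
import paddle

x1 = paddle.randn([3])
x2 = paddle.randn([3])
x1.stop_gradient = False
x2.stop_gradient = False

# Scalar (0-D) objective, as a Hessian requires.
y = (x1 * x2).sum()

# With multiple inputs the change notes say results come back as tuples;
# H[i][j] is assumed to be the lazy block d2y / (dx_i dx_j).
H = paddle.autograd.hessian(y, (x1, x2))
print(H[0][1][:].shape)   # expected [3, 3] for the x1/x2 cross block
```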
@@ -15,6 +15,7 @@ foreach(TEST_OP ${TEST_OPS})
py_test_modules(${TEST_OP} MODULES ${TEST_OP} ENVS ${GC_ENVS})
endforeach()
set_tests_properties(test_autograd_dynamic PROPERTIES TIMEOUT 100)
set_tests_properties(test_autograd_functional_dynamic PROPERTIES TIMEOUT 200)
set_tests_properties(test_autograd_functional_static PROPERTIES TIMEOUT 160)
set_tests_properties(test_minimize PROPERTIES TIMEOUT 60)
This diff is collapsed.