AutoDiff (Automatic Differentiation) Module

ppsci.autodiff.ad
This module is adapted from https://github.com/lululxvi/deepxde
jacobian = Jacobians()  (module-attribute)

hessian = Hessians()  (module-attribute)
Jacobians
Compute multiple Jacobians.
A new instance will be created for a new pair of (output, input). For the (output, input) pair that has been computed before, it will reuse the previous instance, rather than creating a new one.
Source code in ppsci/autodiff/ad.py
__call__(ys, xs, i=0, j=None, retain_graph=None, create_graph=True)
Compute jacobians for given ys and xs.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| ys | Tensor | Output tensor. | required |
| xs | Tensor | Input tensor. | required |
| i | int | i-th output variable. Defaults to 0. | 0 |
| j | Optional[int] | j-th input variable. Defaults to None. | None |
| retain_graph | Optional[bool] | Whether to retain the forward graph used to calculate the gradient. When True, the graph is retained, so backward can be run twice on the same graph. When False, the graph is freed. Default None, which means it is equal to create_graph. | None |
| create_graph | bool | Whether to create the gradient graph during computation. When True, higher-order derivatives can be computed; when False, the gradient graph is discarded. Defaults to True. | True |
Returns:

| Type | Description |
|---|---|
| paddle.Tensor | Jacobian matrix of ys[i] to xs[j]. |
Examples:
>>> import paddle
>>> import ppsci
>>> x = paddle.randn([4, 1])
>>> x.stop_gradient = False
>>> y = x * x
>>> dy_dx = ppsci.autodiff.jacobian(y, x)
Source code in ppsci/autodiff/ad.py
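In the example above, `jacobian(y, x)` for `y = x * x` returns the elementwise derivative 2·x. A framework-free sanity check of that value using a central finite difference (illustrative only, not part of ppsci):

```python
# Central finite difference: (f(x+h) - f(x-h)) / (2h) approximates f'(x).
def central_diff(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: x * x  # same function as the jacobian example

# The derivative of x*x is 2x at every point.
for x in [-1.5, 0.0, 2.0]:
    assert abs(central_diff(f, x) - 2 * x) < 1e-6
```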
Hessians
Compute multiple Hessians.
A new instance will be created for a new pair of (output, input). For the (output, input) pair that has been computed before, it will reuse the previous instance, rather than creating a new one.
Source code in ppsci/autodiff/ad.py
__call__(ys, xs, component=None, i=0, j=0, grad_y=None, retain_graph=None, create_graph=True)
Compute hessian matrix for given ys and xs.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| ys | Tensor | Output tensor. | required |
| xs | Tensor | Input tensor. | required |
| component | Optional[int] | If ys has shape (batch_size, dim_y) with dim_y > 1, then ys[:, component] is used to compute the Hessian; if dim_y is 1, component must be None. Defaults to None. | None |
| i | int | i-th input variable. Defaults to 0. | 0 |
| j | int | j-th input variable. Defaults to 0. | 0 |
| grad_y | Optional[Tensor] | The gradient of y with respect to xs. Provide grad_y if it is already known, to avoid duplicated computation. Defaults to None. | None |
| retain_graph | Optional[bool] | Whether to retain the forward graph used to calculate the gradient. When True, the graph is retained, so backward can be run twice on the same graph. When False, the graph is freed. Default None, which means it is equal to create_graph. | None |
| create_graph | bool | Whether to create the gradient graph during computation. When True, higher-order derivatives can be computed; when False, the gradient graph is discarded. Defaults to True. | True |
Returns:

| Type | Description |
|---|---|
| paddle.Tensor | Hessian matrix. |
Examples:
>>> import paddle
>>> import ppsci
>>> x = paddle.randn([4, 3])
>>> x.stop_gradient = False
>>> y = (x * x).sin()
>>> dy_dxx = ppsci.autodiff.hessian(y, x, component=0)
Source code in ppsci/autodiff/ad.py
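For the example above with y = sin(x·x), the Hessian entries along the diagonal are the second derivatives d²y/dx² = 2·cos(x²) − 4x²·sin(x²). A framework-free numerical check of that formula using a second-order central difference (illustrative only, not part of ppsci):

```python
import math

# Central second-order finite difference:
# (f(x+h) - 2f(x) + f(x-h)) / h^2 approximates f''(x).
def second_diff(f, x, h=1e-4):
    return (f(x + h) - 2 * f(x) + f(x - h)) / (h * h)

f = lambda x: math.sin(x * x)  # same function as the hessian example

x = 0.7
exact = 2 * math.cos(x * x) - 4 * x * x * math.sin(x * x)
assert abs(second_diff(f, x) - exact) < 1e-4
```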
Created: November 6, 2023