Unverified commit 33ca455a authored by Zhong Hui, committed by GitHub

[DOC] Clarify the difference of paddle.norm and np.linalg.norm (#32530)

* [DOC] Clarify the difference between paddle.norm and np.linalg.norm
Parent 561dc719
...@@ -177,6 +177,12 @@ def norm(x, p='fro', axis=None, keepdim=False, name=None):
    Returns the matrix norm (Frobenius) or vector norm (the 1-norm, the Euclidean
    or 2-norm, and in general the p-norm for p > 0) of a given tensor.
    .. note::
        This norm API differs from `numpy.linalg.norm`.
        It supports high-order input tensors (rank >= 3); in that case, the axis along which to compute the norm must be specified explicitly. `numpy.linalg.norm`, by contrast, only accepts a 1-D vector or a 2-D matrix as input.
        For the p-order matrix norm, this API actually treats the matrix as a flattened vector and computes the vector norm, NOT a real matrix norm.
    Args:
        x (Tensor): The input tensor could be N-D tensor, and the input data
            type could be float32 or float64.
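
To make the note above concrete, here is a minimal sketch (assuming PaddlePaddle >= 2.0 and a recent NumPy; the tensor shapes and values are illustrative only):

```python
import numpy as np
import paddle

# 1) High-order input: paddle.norm accepts a rank-3 tensor once axis is given,
#    while numpy.linalg.norm with ord=2 rejects anything beyond 2-D.
x = paddle.rand([2, 3, 4])
print(paddle.norm(x, p=2, axis=1).shape)  # [2, 4]: vector 2-norms taken along axis 1
# np.linalg.norm(x.numpy(), ord=2) raises "Improper number of dimensions to norm".

# 2) p-order matrix norm: paddle flattens the matrix and takes the vector norm.
m = np.array([[1., 2.], [3., 4.]])
print(paddle.norm(paddle.to_tensor(m), p=2, axis=[0, 1]))  # sqrt(30) ~= 5.477 (Frobenius)
print(np.linalg.norm(m, ord=2))  # ~= 5.465, the spectral norm, i.e. a real matrix norm
```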
...@@ -344,6 +350,10 @@ def norm(x, p='fro', axis=None, keepdim=False, name=None):
        return reduce_out
    def p_matrix_norm(input, porder=1., axis=axis, keepdim=False, name=None):
"""
NOTE:
This function actually treats the matrix as flattened vector to calculate vector norm instead of matrix norm.
"""
        block = LayerHelper('norm', **locals())
        out = block.create_variable_for_type_inference(
            dtype=block.input_dtype())
...
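
The NOTE added to `p_matrix_norm` can be checked directly: for a p-order norm taken over both matrix axes, the result matches flattening the matrix and computing the vector p-norm. A minimal sketch (again assuming PaddlePaddle >= 2.0; `p=3` is an arbitrary illustrative order):

```python
import paddle

m = paddle.to_tensor([[1., 2.], [3., 4.]])
a = paddle.norm(m, p=3, axis=[0, 1])     # handled by the p_matrix_norm branch
b = paddle.norm(paddle.flatten(m), p=3)  # vector 3-norm of the flattened matrix
print(a.item(), b.item())  # both (1 + 8 + 27 + 64) ** (1/3) ~= 4.642
```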