Unverified commit 3c81d0ca, authored by jjyaoao, committed by GitHub

Fixed the dead link bug in the API documentation (#48969)

* first pr

* Revise nn.py

* Revise nn.py 2.0

* Revise rnn.py;test=document_fix

* test=document_fix
Co-authored-by: Ligoml <39876205+Ligoml@users.noreply.github.com>
Parent b0e7226e
@@ -36,7 +36,7 @@ def launch():
  Base Parameters:
- - ``--master``: The master/rendezvous server, support http:// and etcd://, default with http://. e.g., ``--master=127.0.0.1:8080``. Default ``--master=None``.
+ - ``--master``: The master/rendezvous server, support ``http://`` and ``etcd://``, default with ``http://``. e.g., ``--master=127.0.0.1:8080``. Default ``--master=None``.
  - ``--rank``: The rank of the node, can be auto assigned by master. Default ``--rank=-1``.
...
@@ -37,6 +37,7 @@ from ..data_feeder import (
      check_type,
      check_dtype,
  )
  from ..param_attr import ParamAttr
  from ..initializer import Normal, Constant, NumpyArrayInitializer
  from .. import unique_name
...
@@ -1384,7 +1384,7 @@ class Variable(metaclass=VariableMetaClass):
      shape=[-1, 23, 48],
      dtype='float32')
- In `Dygraph <../../user_guides/howto/dygraph/DyGraph.html>`_ Mode:
+ In Dygraph Mode:
  .. code-block:: python
@@ -1861,7 +1861,7 @@ class Variable(metaclass=VariableMetaClass):
  """
  Indicating if we stop gradient from current Variable
- **Notes: This Property has default value as** ``True`` **in** `Dygraph <../../user_guides/howto/dygraph/DyGraph.html>`_ **mode, while Parameter's default value is False. However, in Static Graph Mode all Variable's default stop_gradient value is** ``False``
+ **Notes: This Property has default value as** ``True`` **in** Dygraph **mode, while Parameter's default value is False. However, in Static Graph Mode all Variable's default stop_gradient value is** ``False``
  Examples:
  .. code-block:: python
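For context, the defaults this note describes can be checked directly; a minimal sketch, assuming the Paddle 2.x dygraph API (``paddle.to_tensor``, ``paddle.nn.Linear``):

.. code-block:: python

    import paddle  # dygraph mode is the default in Paddle 2.x

    x = paddle.to_tensor([1.0, 2.0, 3.0])
    print(x.stop_gradient)              # True: plain tensors skip autograd

    linear = paddle.nn.Linear(3, 1)
    print(linear.weight.stop_gradient)  # False: Parameters are trainable

    x.stop_gradient = False             # opt the tensor into the autograd graph
    loss = (linear(x) ** 2).sum()
    loss.backward()
    print(x.grad is not None)           # True: a gradient w.r.t. x was computed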
@@ -1903,7 +1903,7 @@ class Variable(metaclass=VariableMetaClass):
  **1. All Variable's persistable is** ``False`` **except Parameters.**
- **2. In** `Dygraph <../../user_guides/howto/dygraph/DyGraph.html>`_ **mode, this property should not be changed**
+ **2. In** Dygraph **mode, this property should not be changed**
  Examples:
  .. code-block:: python
@@ -1952,7 +1952,7 @@ class Variable(metaclass=VariableMetaClass):
  """
  Indicating name of current Variable
- **Notes: If it has two or more Varaible share the same name in the same** :ref:`api_guide_Block_en` **, it means these Variable will share content in no-** `Dygraph <../../user_guides/howto/dygraph/DyGraph.html>`_ **mode. This is how we achieve Parameter sharing**
+ **Notes: If it has two or more Varaible share the same name in the same** :ref:`api_guide_Block_en` **, it means these Variable will share content in no-** Dygraph **mode. This is how we achieve Parameter sharing**
  Examples:
  .. code-block:: python
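The Parameter-sharing behavior this note describes can be reproduced by giving two layers the same parameter name in a static-graph program; a minimal sketch, assuming the Paddle 2.x static API (``paddle.static.nn.fc``, ``paddle.ParamAttr``):

.. code-block:: python

    import paddle

    paddle.enable_static()
    x = paddle.static.data(name="x", shape=[None, 8], dtype="float32")

    # Both layers name their weight "shared_w", so the static graph
    # creates a single Parameter that both fc ops read.
    shared_w = paddle.ParamAttr(name="shared_w")
    y1 = paddle.static.nn.fc(x, size=4, weight_attr=shared_w)
    y2 = paddle.static.nn.fc(x, size=4, weight_attr=shared_w)

    params = paddle.static.default_main_program().all_parameters()
    print([p.name for p in params])  # "shared_w" appears only once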
@@ -1982,7 +1982,7 @@ class Variable(metaclass=VariableMetaClass):
  import paddle.fluid as fluid
  x = fluid.data(name="x", shape=[-1, 23, 48], dtype='float32')
- print(x.grad_name) # output is "x@GRAD"
+ print(x.grad_name) # output is ``x@GRAD``
  """
  return self.name + "@GRAD"
@@ -2043,7 +2043,7 @@ class Variable(metaclass=VariableMetaClass):
  **1. This is a read-only property**
- **2. Don't support this property in** `Dygraph <../../user_guides/howto/dygraph/DyGraph.html>`_ **mode, it's value should be** ``0(int)``
+ **2. Don't support this property in** Dygraph **mode, it's value should be** ``0(int)``
  Examples:
  .. code-block:: python
...
@@ -494,7 +494,6 @@ def dynamic_lstm(
      name=None,
  ):
  r"""
-     :api_attr: Static Graph
  **Note**:
  1. This OP only supports LoDTensor as inputs. If you need to deal with Tensor, please use :ref:`api_fluid_layers_lstm` .
@@ -684,12 +683,11 @@ def lstm(
      seed=-1,
  ):
  r"""
-     :api_attr: Static Graph
  **Note**:
  This OP only supports running on GPU devices.
- This OP implements LSTM operation - `Hochreiter, S., & Schmidhuber, J. (1997) <http://deeplearning.cs.cmu.edu/pdfs/Hochreiter97_lstm.pdf>`_ .
+ This OP implements LSTM operation - `Hochreiter, S., & Schmidhuber, J. (1997) <https://blog.xpgreat.com/file/lstm.pdf>`_ .
  The implementation of this OP does not include diagonal/peephole connections.
  Please refer to `Gers, F. A., & Schmidhuber, J. (2000) <ftp://ftp.idsia.ch/pub/juergen/TimeCount-IJCNN2000.pdf>`_ .
@@ -742,7 +740,6 @@ def lstm(
  If set None, default initializer will be used. Default: None.
  seed(int, optional): Seed for dropout in LSTM, If it's -1, dropout will use random seed. Default: 1.
  Returns:
  tuple ( :ref:`api_guide_Variable_en` , :ref:`api_guide_Variable_en` , :ref:`api_guide_Variable_en` ) :
@@ -757,7 +754,6 @@ def lstm(
  shape is :math:`[num\_layers, batch\_size, hidden\_size]` \
  if is_bidirec set to True, shape will be :math:`[num\_layers*2, batch\_size, hidden\_size]`
  Examples:
  .. code-block:: python
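In current Paddle releases the same computation is usually written with ``paddle.nn.LSTM``, which runs on CPU as well as GPU and returns states with the ``[num_layers, batch_size, hidden_size]`` shape described above; a minimal sketch with illustrative sizes:

.. code-block:: python

    import paddle

    lstm = paddle.nn.LSTM(input_size=16, hidden_size=32, num_layers=2)
    x = paddle.randn([4, 10, 16])  # [batch_size, seq_len, input_size]
    y, (h, c) = lstm(x)
    print(y.shape)  # [4, 10, 32]: per-step hidden states of the last layer
    print(h.shape)  # [2, 4, 32] = [num_layers, batch_size, hidden_size]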
@@ -875,7 +871,6 @@ def dynamic_lstmp(
      proj_clip=None,
  ):
  r"""
-     :api_attr: Static Graph
  **Note**:
  1. In order to improve efficiency, users must first map the input of dimension [T, hidden_size] to input of [T, 4 * hidden_size], and then pass it to this OP.
@@ -1100,7 +1095,6 @@ def dynamic_gru(
      origin_mode=False,
  ):
  r"""
-     :api_attr: Static Graph
  **Note: The input type of this must be LoDTensor. If the input type to be
  processed is Tensor, use** :ref:`api_fluid_layers_StaticRNN` .
@@ -1270,7 +1264,6 @@ def gru_unit(
      origin_mode=False,
  ):
  r"""
-     :api_attr: Static Graph
  Gated Recurrent Unit (GRU) RNN cell. This operator performs GRU calculations for
  one time step and it supports these two modes:
...
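The single-time-step GRU computation that ``gru_unit`` documents maps onto ``paddle.nn.GRUCell`` in the Paddle 2.x API; a minimal sketch with illustrative sizes:

.. code-block:: python

    import paddle

    cell = paddle.nn.GRUCell(input_size=16, hidden_size=32)
    x_t = paddle.randn([4, 16])     # input at one time step: [batch_size, input_size]
    h_prev = paddle.randn([4, 32])  # previous hidden state: [batch_size, hidden_size]
    h_t, _ = cell(x_t, h_prev)      # one GRU step
    print(h_t.shape)                # [4, 32]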