Unverified Commit a0fa0d9e authored by Chen Weihang, committed by GitHub

add optional for param attr args, test=document_fix (#31105) (#31115)

test=document_fix
Parent 0d780349
@@ -46,7 +46,7 @@ class ParamAttr(object):
 initializer (Initializer, optional): The method to initial this parameter. Default
     None, meaning that the weight parameter is initialized by Xavier initializer,
     and the bias parameter is initialized by 0.
-learning_rate (float): The parameter's learning rate. The learning rate when
+learning_rate (float, optional): The parameter's learning rate. The learning rate when
     optimize is the global learning rates times the parameter's learning rate times
     the factor of learning rate scheduler. Default 1.0.
 regularizer (WeightDecayRegularizer, optional): Regularization strategy. There are two method:
@@ -54,10 +54,13 @@ class ParamAttr(object):
     regularizer is also set in ``optimizer`` (such as :ref:`api_paddle_optimizer_SGD` ),
     that regularizer setting in optimizer will be ignored. Default None, meaning there is
     no regularization.
-trainable (bool): Whether this parameter is trainable. Default True.
-do_model_average (bool): Whether this parameter should do model average
+trainable (bool, optional): Whether this parameter is trainable. Default True.
+do_model_average (bool, optional): Whether this parameter should do model average
     when model average is enabled. Only used in ExponentialMovingAverage. Default True.
-need_clip (bool): Whether the parameter gradient need to be cliped in optimizer. Default is True.
+need_clip (bool, optional): Whether the parameter gradient need to be cliped in optimizer. Default is True.
+Returns:
+    ParamAttr Object.
Examples:
.. code-block:: python
......
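For context (not part of the diff itself): a minimal usage sketch of the arguments this docstring describes, assuming the paddle 2.x API. The concrete names and values (layer sizes, learning rate, decay coefficient) are illustrative only.

import paddle

# Every ParamAttr argument below is optional, matching the ", optional"
# annotations this commit adds to the docstring.
weight_attr = paddle.ParamAttr(
    name="fc_weight",
    initializer=paddle.nn.initializer.XavierNormal(),   # default None -> Xavier init for weights
    learning_rate=0.5,                                   # multiplied by the global learning rate
    regularizer=paddle.regularizer.L2Decay(coeff=1e-4),  # ignored if the optimizer also sets one
    trainable=True,
    do_model_average=True,                               # only used by ExponentialMovingAverage
    need_clip=True)                                      # gradient may be clipped by the optimizer

# Use the attribute when building a layer.
linear = paddle.nn.Linear(10, 2, weight_attr=weight_attr)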