Commit 3f00f7e0 authored by littletomatodonkey

fix doc

Parent db762aa6
@@ -28,7 +28,7 @@ class L1Decay(fluid.regularizer.L1Decay):
     in its ParamAttr, then the regularizer in Optimizer will be ignored. Otherwise the regularizer
     in Optimizer will be used.
-    In the implementation, the penalty of L1 Weight Decay Regularization is as follows:
+    In the implementation, the loss function of L1 Weight Decay Regularization is as follows:
     .. math::
@@ -90,7 +90,7 @@ class L2Decay(fluid.regularizer.L2Decay):
     in its ParamAttr, then the regularizer in Optimizer will be ignored. Otherwise the regularizer
     in Optimizer will be used.
-    In the implementation, the penalty of L2 Weight Decay Regularization is as follows:
+    In the implementation, the loss function of L2 Weight Decay Regularization is as follows:
     .. math::
......
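The docstrings above refer to the L1 and L2 weight decay terms added to the loss. As a minimal numerical sketch (plain Python, not the PaddlePaddle implementation; the 0.5 factor on L2 is a common convention), the two penalties can be written as:

```python
def l1_penalty(weights, coeff):
    """L1 weight decay term: coeff * sum(|w|)."""
    return coeff * sum(abs(w) for w in weights)

def l2_penalty(weights, coeff):
    """L2 weight decay term: coeff * 0.5 * sum(w**2)."""
    return coeff * 0.5 * sum(w * w for w in weights)

# Example: a toy weight vector with regularization coefficient 0.1.
w = [1.0, -2.0, 3.0]
print(l1_penalty(w, 0.1))  # 0.1 * (1 + 2 + 3)
print(l2_penalty(w, 0.1))  # 0.1 * 0.5 * (1 + 4 + 9)
```

In training, the chosen penalty is added to the data loss before computing gradients, which is what the revised wording ("loss function" rather than "penalty") emphasizes.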