Unverified commit 4556ad76, authored by Leo Chen, committed by GitHub

Upgrade string literals to raw strings, part 2 (#29217)

Parent 2b2cd186
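The motivation for adding the `r` prefix throughout these files can be sketched as follows: the docstrings contain LaTeX markup full of backslashes, and Python warns about unrecognized escape sequences (such as `\m` in `\mu`) in ordinary string literals. This is a minimal illustration, not the PR's own test; the docstring text is hypothetical.

```python
import warnings

# Compiling source whose docstring contains "\mu" without the r prefix
# triggers an "invalid escape sequence" warning (DeprecationWarning on
# older Pythons, SyntaxWarning on 3.12+). With r""" the backslash is
# taken literally and no warning is raised.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    compile('"""velocity = \\mu * velocity"""', "<doc>", "exec")
plain_warns = len(caught) > 0

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    compile('r"""velocity = \\mu * velocity"""', "<doc>", "exec")
raw_warns = len(caught) > 0

print(plain_warns, raw_warns)
```

With the plain triple-quoted docstring a warning is recorded; with the raw string none is, which is exactly why only the quote prefix changes in every hunk below.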
@@ -24,7 +24,7 @@ __all__ = ['Momentum']
 class Momentum(Optimizer):
-    """
+    r"""
     Simple Momentum optimizer with velocity state
...
@@ -2979,7 +2979,7 @@ class GroupNorm(layers.Layer):
 class SpectralNorm(layers.Layer):
-    """
+    r"""
     This interface is used to construct a callable object of the ``SpectralNorm`` class.
     For more details, refer to code examples. It implements the function of the Spectral Normalization Layer.
     This layer calculates the spectral normalization value of weight parameters of
...
@@ -1123,7 +1123,7 @@ def cross_entropy(input,
                   soft_label=False,
                   axis=-1,
                   name=None):
-    """
+    r"""
     This operator implements the cross entropy loss function with softmax. This function
     combines the calculation of the softmax operation and the cross entropy loss function
     to provide a more numerically stable gradient.
...
@@ -141,7 +141,7 @@ class BCEWithLogitsLoss(fluid.dygraph.Layer):
 class CrossEntropyLoss(fluid.dygraph.Layer):
-    """
+    r"""
     This operator implements the cross entropy loss function with softmax. This function
     combines the calculation of the softmax operation and the cross entropy loss function
     to provide a more numerically stable gradient.
@@ -623,7 +623,7 @@ class BCELoss(fluid.dygraph.Layer):
 class NLLLoss(fluid.dygraph.Layer):
-    """
+    r"""
     :alias_main: paddle.nn.NLLLoss
     :alias: paddle.nn.NLLLoss,paddle.nn.layer.NLLLoss,paddle.nn.layer.loss.NLLLoss
...
@@ -21,7 +21,7 @@ __all__ = ["Lamb"]
 class Lamb(Optimizer):
-    """
+    r"""
     LAMB (Layer-wise Adaptive Moments optimizer for Batching training) Optimizer.
     LAMB Optimizer is designed to scale up the batch size of training without losing
...
@@ -34,7 +34,7 @@ def yolo_loss(x,
               use_label_smooth=True,
               name=None,
               scale_x_y=1.):
-    """
+    r"""
     This operator generates YOLOv3 loss based on given predict result and ground
     truth boxes.
@@ -242,7 +242,7 @@ def yolo_box(x,
              clip_bbox=True,
              name=None,
              scale_x_y=1.):
-    """
+    r"""
     This operator generates YOLO detection boxes from output of YOLOv3 network.
...