Commit 74fa0cc2 authored by tianyi1997, committed by HydrogenSulfate

Modify docstring

Parent: fad8563e
@@ -99,9 +99,9 @@ class MetaBNNeck(nn.Layer):
     def setup_opt(self, opt):
         """
-        enable_inside_update: enable inside updating for `gate` in MetaBIN
-        lr_gate: learning rate of `gate` during meta-train phase
-        bn_mode: control the running stats & updating of BN
+        Arg:
+            opt (dict): Optional setting to change the behavior of MetaBIN during training.
+                It includes three settings which are `enable_inside_update`, `lr_gate` and `bn_mode`.
         """
         self.check_opt(opt)
         self.opt = copy.deepcopy(opt)
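For illustration, a call to setup_opt might receive a dict along these lines. The three key names come from the docstring above; the values shown (and the string used for `bn_mode`) are hypothetical placeholders, not taken from the PaddleClas source:

```python
# Hypothetical opt dict for MetaBNNeck.setup_opt; the key names match the
# docstring, but these values are illustrative only — check_opt() in the
# actual implementation defines what is accepted.
opt = {
    "enable_inside_update": True,   # enable inside updating for `gate`
    "lr_gate": 0.01,                # learning rate of `gate` during meta-train
    "bn_mode": "general",           # controls running stats & updating of BN
}
```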
@@ -257,31 +257,31 @@ class Cyclic(LRBase):
     """Cyclic learning rate decay
     Args:
-        epochs (int): Total epoch(s)
-        step_each_epoch (int): Number of iterations within an epoch
+        epochs (int): Total epoch(s).
+        step_each_epoch (int): Number of iterations within an epoch.
         base_learning_rate (float): Initial learning rate, which is the lower boundary in the cycle. The paper recommends
             that set the base_learning_rate to 1/3 or 1/4 of max_learning_rate.
         max_learning_rate (float): Maximum learning rate in the cycle. It defines the cycle amplitude as above.
             Since there is some scaling operation during process of learning rate adjustment,
             max_learning_rate may not actually be reached.
-        warmup_epoch (int): Number of warmup epoch(s)
-        warmup_start_lr (float): Start learning rate within warmup
+        warmup_epoch (int): Number of warmup epoch(s).
+        warmup_start_lr (float): Start learning rate within warmup.
         step_size_up (int): Number of training steps, which is used to increase learning rate in a cycle.
             The step size of one cycle will be defined by step_size_up + step_size_down. According to the paper, step
             size should be set as at least 3 or 4 times steps in one epoch.
         step_size_down (int, optional): Number of training steps, which is used to decrease learning rate in a cycle.
-            If not specified, it's value will initialize to `` step_size_up `` . Default: None
+            If not specified, it's value will initialize to `` step_size_up `` . Default: None.
         mode (str, optional): One of 'triangular', 'triangular2' or 'exp_range'.
-            If scale_fn is specified, this argument will be ignored. Default: 'triangular'
-        exp_gamma (float): Constant in 'exp_range' scaling function: exp_gamma**iterations. Used only when mode = 'exp_range'. Default: 1.0
+            If scale_fn is specified, this argument will be ignored. Default: 'triangular'.
+        exp_gamma (float): Constant in 'exp_range' scaling function: exp_gamma**iterations. Used only when mode = 'exp_range'. Default: 1.0.
         scale_fn (function, optional): A custom scaling function, which is used to replace three build-in methods.
             It should only have one argument. For all x >= 0, 0 <= scale_fn(x) <= 1.
-            If specified, then 'mode' will be ignored. Default: None
+            If specified, then 'mode' will be ignored. Default: None.
         scale_mode (str, optional): One of 'cycle' or 'iterations'. Defines whether scale_fn is evaluated on cycle
-            number or cycle iterations (total iterations since start of training). Default: 'cycle'
+            number or cycle iterations (total iterations since start of training). Default: 'cycle'.
         last_epoch (int, optional): The index of last epoch. Can be set to restart training. Default: -1, means initial learning rate.
-        by_epoch (bool): Learning rate decays by epoch when by_epoch is True, else by iter
-        verbose: (bool, optional): If True, prints a message to stdout for each update. Defaults to False
+        by_epoch (bool): Learning rate decays by epoch when by_epoch is True, else by iter.
+        verbose: (bool, optional): If True, prints a message to stdout for each update. Defaults to False.
     """

     def __init__(self,
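The 'triangular' mode that this docstring describes can be sketched as follows. This is a minimal re-derivation of the cyclical-LR formula from Smith's paper (assuming equal step_size_up and step_size_down), not the actual Paddle/PaddleClas implementation:

```python
import math

def triangular_lr(it, base_lr, max_lr, step_size):
    """Triangular cyclic schedule: the learning rate rises linearly from
    base_lr to max_lr over `step_size` iterations, then falls back
    symmetrically, repeating every 2 * step_size iterations."""
    # Which cycle the current iteration `it` falls into (1-based).
    cycle = math.floor(1 + it / (2 * step_size))
    # Distance from the cycle peak, normalized to [0, 1].
    x = abs(it / step_size - 2 * cycle + 1)
    return base_lr + (max_lr - base_lr) * max(0.0, 1 - x)
```

For example, with base_lr=0.001, max_lr=0.01 and step_size=100, the rate starts at 0.001, peaks at 0.01 at iteration 100, and returns to 0.001 at iteration 200.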