s920243400 / PaddleDetection (forked from PaddlePaddle / PaddleDetection)
Commit 1755a2b2 (unverified)

add w/o weight decay params groups (#4337)

Authored on Oct 21, 2021 by Wenyu; committed via GitHub on Oct 21, 2021.
Parent commit: 1bf6d854
Showing 3 changed files with 22 additions and 3 deletions (+22 −3):

configs/faster_rcnn/_base_/optimizer_swin_1x.yml  (+1 −0)
ppdet/engine/trainer.py  (+1 −2)
ppdet/optimizer.py  (+20 −1)
configs/faster_rcnn/_base_/optimizer_swin_1x.yml

```diff
@@ -15,3 +15,4 @@ OptimizerBuilder:
   optimizer:
     type: AdamW
     weight_decay: 0.05
+    without_weight_decay_params: ['absolute_pos_embed', 'relative_position_bias_table', 'norm']
```
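The new `without_weight_decay_params` keys are matched against parameter names by plain substring containment, so a short key such as `'norm'` covers every LayerNorm weight and bias in the Swin backbone. A minimal sketch of that matching rule (the helper name is hypothetical; the containment test mirrors the one in the patch):

```python
# Keys from the config; a parameter skips weight decay when any key
# occurs anywhere in its dotted name (plain substring containment).
keys = ['absolute_pos_embed', 'relative_position_bias_table', 'norm']

def skips_weight_decay(param_name, keys=keys):
    """Return True if any configured key is a substring of the name."""
    return any(k in param_name for k in keys)

print(skips_weight_decay('backbone.layers.0.blocks.0.norm1.weight'))     # True
print(skips_weight_decay('backbone.layers.0.blocks.0.attn.qkv.weight'))  # False
```

Note the breadth of substring matching: `'norm'` also catches any parameter whose name merely contains that string, not only the backbone's LayerNorm layers.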
ppdet/engine/trainer.py

```diff
@@ -115,8 +115,7 @@ class Trainer(object):
         if self.mode == 'train':
             steps_per_epoch = len(self.loader)
             self.lr = create('LearningRate')(steps_per_epoch)
-            self.optimizer = create('OptimizerBuilder')(self.lr,
-                                                        self.model.parameters())
+            self.optimizer = create('OptimizerBuilder')(self.lr, self.model)
         self._nranks = dist.get_world_size()
         self._local_rank = dist.get_rank()
```
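The call-site change matters because a bare parameter list (the old `self.model.parameters()` argument) carries no layer names, while the model itself exposes `named_parameters()`, which the builder needs in order to route each parameter into the right group. A toy illustration with a hypothetical stand-in class (not PaddlePaddle API):

```python
# TinyLayer is a hypothetical stand-in mimicking the two accessors a
# Paddle Layer provides; only the name information differs between them.
class TinyLayer:
    def __init__(self):
        self._params = {'norm.weight': [1.0], 'fc.weight': [2.0]}

    def parameters(self):
        # Old call site passed this: values only, names discarded.
        return list(self._params.values())

    def named_parameters(self):
        # New call site passes the layer, so the builder can read names.
        return list(self._params.items())

layer = TinyLayer()
assert layer.parameters() == [[1.0], [2.0]]  # no names left to filter on
assert [n for n, _ in layer.named_parameters()] == ['norm.weight', 'fc.weight']
```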
ppdet/optimizer.py

```diff
@@ -225,7 +225,7 @@ class OptimizerBuilder():
         self.regularizer = regularizer
         self.optimizer = optimizer

-    def __call__(self, learning_rate, params=None):
+    def __call__(self, learning_rate, model=None):
         if self.clip_grad_by_norm is not None:
             grad_clip = nn.ClipGradByGlobalNorm(
                 clip_norm=self.clip_grad_by_norm)
@@ -244,6 +244,25 @@ class OptimizerBuilder():
         if optim_type != 'AdamW':
             optim_args['weight_decay'] = regularization
         op = getattr(optimizer, optim_type)

+        if 'without_weight_decay_params' in optim_args:
+            keys = optim_args['without_weight_decay_params']
+            params = [{
+                'params': [
+                    p for n, p in model.named_parameters()
+                    if any([k in n for k in keys])
+                ],
+                'weight_decay': 0.
+            }, {
+                'params': [
+                    p for n, p in model.named_parameters()
+                    if all([k not in n for k in keys])
+                ]
+            }]
+            del optim_args['without_weight_decay_params']
+        else:
+            params = model.parameters()
+
         return op(learning_rate=learning_rate,
                   parameters=params,
                   grad_clip=grad_clip,
```
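Putting the pieces together, the new logic in `ppdet/optimizer.py` builds two parameter groups: one with `weight_decay: 0.` for parameters whose names contain any configured key, and one (inheriting the optimizer-level decay) for the rest. A framework-free sketch under the assumption that the model exposes `named_parameters()` as name/parameter pairs (`DummyModel` and `build_param_groups` are illustrative names, not PaddleDetection API):

```python
class DummyModel:
    """Hypothetical model exposing named_parameters() like a Paddle Layer."""
    def __init__(self):
        self._params = {
            'absolute_pos_embed': [0.1],
            'blocks.0.norm1.weight': [0.2],
            'blocks.0.mlp.fc1.weight': [0.3],
        }

    def named_parameters(self):
        return list(self._params.items())

def build_param_groups(model, keys):
    # Group 1: names matching any key -> weight decay disabled.
    no_decay = [p for n, p in model.named_parameters()
                if any(k in n for k in keys)]
    # Group 2: everything else -> inherits the optimizer's weight_decay.
    decay = [p for n, p in model.named_parameters()
             if all(k not in n for k in keys)]
    return [{'params': no_decay, 'weight_decay': 0.0}, {'params': decay}]

groups = build_param_groups(DummyModel(), ['absolute_pos_embed', 'norm'])
# groups[0] holds the pos-embed and norm parameters with decay turned off;
# groups[1] holds the remaining MLP weight.
```

Passing such a group list as `parameters=` is how AdamW-style optimizers apply a per-group `weight_decay` override.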