曾经的那一瞬间 / Models

Commit
a7894f9e
Authored on Jan 26, 2022 by Chen Qian
Committed by A. Unique TensorFlower on Jan 26, 2022

Internal change

PiperOrigin-RevId: 424391275

Parent: 885fda09
Showing 3 changed files with 29 additions and 4 deletions:

  official/modeling/optimization/configs/optimization_config.py  (+2, -0)
  official/modeling/optimization/configs/optimizer_config.py     (+20, -0)
  official/modeling/optimization/optimizer_factory.py            (+7, -4)
official/modeling/optimization/configs/optimization_config.py

```diff
@@ -45,6 +45,8 @@ class OptimizerConfig(oneof.OneOfConfig):
   """
   type: Optional[str] = None
   sgd: opt_cfg.SGDConfig = opt_cfg.SGDConfig()
+  sgd_experimental: opt_cfg.SGDExperimentalConfig = (
+      opt_cfg.SGDExperimentalConfig())
   adam: opt_cfg.AdamConfig = opt_cfg.AdamConfig()
   adamw: opt_cfg.AdamWeightDecayConfig = opt_cfg.AdamWeightDecayConfig()
   lamb: opt_cfg.LAMBConfig = opt_cfg.LAMBConfig()
```
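`OptimizerConfig` above is a oneof-style container: exactly one optimizer sub-config is active, selected by the `type` field, and this commit registers `sgd_experimental` as a new choice. A minimal, self-contained sketch of that selection pattern, using plain dataclasses as hypothetical stand-ins for `oneof.OneOfConfig` and the `opt_cfg` classes (the `get` helper here is illustrative, not the real `OneOfConfig` API):

```python
from dataclasses import dataclass, field, asdict
from typing import Optional

# Hypothetical stand-ins for opt_cfg.SGDConfig / opt_cfg.SGDExperimentalConfig.
@dataclass
class SGDConfig:
    momentum: float = 0.0

@dataclass
class SGDExperimentalConfig:
    momentum: float = 0.0
    nesterov: bool = False
    jit_compile: bool = False

@dataclass
class OptimizerConfig:
    """Oneof-style config: `type` names the single active sub-config."""
    type: Optional[str] = None
    sgd: SGDConfig = field(default_factory=SGDConfig)
    sgd_experimental: SGDExperimentalConfig = field(
        default_factory=SGDExperimentalConfig)

    def get(self):
        # Return the sub-config selected by `type`.
        return getattr(self, self.type)

cfg = OptimizerConfig(type='sgd_experimental')
cfg.sgd_experimental.momentum = 0.9
print(asdict(cfg.get()))
```

With this shape, adding an optimizer is just one more field plus one registry entry in the factory, which is exactly what the diff does.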
official/modeling/optimization/configs/optimizer_config.py

```diff
@@ -54,6 +54,26 @@ class SGDConfig(BaseOptimizerConfig):
   momentum: float = 0.0


+# TODO(b/216129465): Merge this config with SGDConfig after the experimental
+# optimizer graduates.
+@dataclasses.dataclass
+class SGDExperimentalConfig(BaseOptimizerConfig):
+  """Configuration for SGD optimizer.
+
+  The attributes for this class matches the arguments of
+  `tf.keras.optimizer.experimental.SGD`.
+
+  Attributes:
+    name: name of the optimizer.
+    nesterov: nesterov for SGD optimizer.
+    momentum: momentum for SGD optimizer.
+  """
+  name: str = "SGD"
+  nesterov: bool = False
+  momentum: float = 0.0
+  jit_compile: bool = False
+
+
 @dataclasses.dataclass
 class RMSPropConfig(BaseOptimizerConfig):
   """Configuration for RMSProp optimizer.
```
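Since `SGDExperimentalConfig` is a plain dataclass, its defaults and dict form are easy to inspect. A small sketch replicating the fields from the diff on a bare dataclass (dropping `BaseOptimizerConfig`, whose definition is not shown here):

```python
import dataclasses

@dataclasses.dataclass
class SGDExperimentalConfig:
    """Mirrors the fields added in optimizer_config.py."""
    name: str = "SGD"
    nesterov: bool = False
    momentum: float = 0.0
    jit_compile: bool = False

# Override only what differs from the defaults, as a YAML/flag override would.
cfg = SGDExperimentalConfig(momentum=0.9, nesterov=True)
print(dataclasses.asdict(cfg))
# {'name': 'SGD', 'nesterov': True, 'momentum': 0.9, 'jit_compile': False}
```

The dict form matters because the factory forwards it as keyword arguments to the optimizer class, so the field names must match the `tf.keras.optimizers.experimental.SGD` constructor arguments.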
official/modeling/optimization/optimizer_factory.py

```diff
@@ -18,7 +18,6 @@ from typing import Callable, Optional, Union, List, Tuple
 import gin
 import tensorflow as tf
 import tensorflow_addons.optimizers as tfa_optimizers
 from official.modeling.optimization import slide_optimizer
 from official.modeling.optimization import adafactor_optimizer
 from official.modeling.optimization import ema_optimizer
@@ -29,6 +28,7 @@ from official.nlp import optimization as nlp_optimization
 OPTIMIZERS_CLS = {
     'sgd': tf.keras.optimizers.SGD,
+    'sgd_experimental': tf.keras.optimizers.experimental.SGD,
     'adam': tf.keras.optimizers.Adam,
     'adamw': nlp_optimization.AdamWeightDecay,
     'lamb': tfa_optimizers.LAMB,
@@ -178,7 +178,8 @@ class OptimizerFactory:
         takes an optimizer and returns an optimizer.

     Returns:
-      tf.keras.optimizers.Optimizer instance.
+      `tf.keras.optimizers.Optimizer` or
+      `tf.keras.optimizers.experimental.Optimizer` instance.
     """
     optimizer_dict = self._optimizer_config.as_dict()
@@ -201,8 +202,10 @@ class OptimizerFactory:
           optimizer, **self._ema_config.as_dict())
     if postprocessor:
       optimizer = postprocessor(optimizer)
-    assert isinstance(optimizer, tf.keras.optimizers.Optimizer), (
-        'OptimizerFactory.build_optimizer returning a non-optimizer object: '
-        '{}'.format(optimizer))
+    assert isinstance(
+        optimizer, (tf.keras.optimizers.Optimizer,
+                    tf.keras.optimizers.experimental.Optimizer)
+    ), ('OptimizerFactory.build_optimizer returning a non-optimizer object: '
+        '{}'.format(optimizer))
     return optimizer
```
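The factory changes follow a simple registry pattern: a string key maps to an optimizer class, and the final `assert` is widened to accept either optimizer base class, since `tf.keras.optimizers.experimental.Optimizer` does not subclass the legacy `tf.keras.optimizers.Optimizer`. A minimal sketch of the pattern with hypothetical stand-in classes in place of the Keras optimizers:

```python
# Hypothetical stand-ins for tf.keras.optimizers.Optimizer and
# tf.keras.optimizers.experimental.Optimizer (two unrelated base classes).
class Optimizer:
    pass

class ExperimentalOptimizer:
    pass

class SGD(Optimizer):
    pass

class ExperimentalSGD(ExperimentalOptimizer):
    pass

# Registry pattern used by OPTIMIZERS_CLS: string key -> optimizer class.
OPTIMIZERS_CLS = {
    'sgd': SGD,
    'sgd_experimental': ExperimentalSGD,
}

def build_optimizer(optimizer_type: str):
    optimizer = OPTIMIZERS_CLS[optimizer_type]()
    # The widened check: either legacy or experimental base class passes.
    assert isinstance(optimizer, (Optimizer, ExperimentalOptimizer)), (
        'build_optimizer returning a non-optimizer object: '
        '{}'.format(optimizer))
    return optimizer

opt = build_optimizer('sgd_experimental')
print(type(opt).__name__)  # ExperimentalSGD
```

Passing a tuple of classes to `isinstance` is the idiomatic way to accept either hierarchy; with the old single-class assert, building `sgd_experimental` would have failed this check even though a valid optimizer was returned.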