s920243400 / PaddleDetection (forked from PaddlePaddle / PaddleDetection)

Commit 332194c8
Authored Sep 23, 2016 by Haichao-Zhang
Committed by emailweixu on Sep 23, 2016

add type compatible check for ParamAttr (#113)

* add type compatible check for ParamAttr

Parent: 77ed98d1
Showing 2 changed files with 58 additions and 8 deletions (+58 −8)

python/paddle/trainer_config_helpers/attrs.py                        +46 −6
python/paddle/trainer_config_helpers/tests/layers_test_config.py     +12 −2
python/paddle/trainer_config_helpers/attrs.py

@@ -17,6 +17,42 @@ __all__ = ['ParamAttr', 'ExtraAttr', 'ParameterAttribute',
            'ExtraLayerAttribute'
            ]
 
+
+def convert_and_compare(x, Type):
+    """
+    Convert x to be the same type as Type and then convert back to
+    check whether there is a loss of information
+    :param x: object to be checked
+    :param Type: target type to check x over
+    """
+    return type(x)(Type(x)) == x
+
+
+def is_compatible_with(x, Type):
+    """
+    Check if x has a type compatible with Type
+    :param x: object to be checked
+    :param Type: target type to check x over
+    """
+    if type(x) == Type:
+        return True
+    try:
+        if float == Type or int == Type:
+            # avoid those types that can be converted to float/int but not very
+            # meaningful and could potentially lead to error
+            # i.e., str and bool typed value should not be used for initializing float/int variable
+            if not isinstance(x, str) and not isinstance(x, bool):
+                return convert_and_compare(x, Type)
+        elif bool == Type:
+            # should not use string type to initialize bool variable
+            if not isinstance(x, str):
+                return convert_and_compare(x, Type)
+        else:
+            return False
+    except:
+        return False
+
+
 class ParameterAttribute(object):
     """
     Parameter Attributes object. To fine-tuning network training process, user
@@ -65,14 +101,18 @@ class ParameterAttribute(object):
         elif initial_std is None and initial_mean is None and initial_max \
                 is None and initial_min is None:
             self.attr = {'initial_smart': True}
-        elif isinstance(initial_std, float) or isinstance(initial_mean, float):
+        elif is_compatible_with(initial_std, float) or \
+             is_compatible_with(initial_mean, float):
             self.attr = dict()
             if initial_std is not None:
                 self.attr['initial_std'] = initial_std
             if initial_mean is not None:
                 self.attr['initial_mean'] = initial_mean
             self.attr['initial_strategy'] = 0  # Gauss Random
-        elif isinstance(initial_max, float) and isinstance(initial_min, float):
+        elif is_compatible_with(initial_max, float) and \
+             is_compatible_with(initial_min, float):
             initial_max = initial_max
             initial_min = initial_min
             assert initial_min < initial_max
             initial_mean = (initial_max + initial_min) / 2
             initial_std = initial_mean - initial_min
@@ -83,16 +123,16 @@ class ParameterAttribute(object):
         else:
             raise RuntimeError("Unexpected branch.")
 
-        if not is_static and isinstance(l1_rate, float):
+        if not is_static and is_compatible_with(l1_rate, float):
             self.attr['decay_rate_l1'] = l1_rate
 
-        if not is_static and isinstance(l2_rate, float):
+        if not is_static and is_compatible_with(l2_rate, float):
             self.attr['decay_rate'] = l2_rate
 
-        if not is_static and isinstance(learning_rate, float):
+        if not is_static and is_compatible_with(learning_rate, float):
             self.attr['learning_rate'] = learning_rate
 
-        if not is_static and isinstance(momentum, float):
+        if not is_static and is_compatible_with(momentum, float):
             self.attr['momentum'] = momentum
 
         if name is not None:
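The practical effect of is_compatible_with is to relax the earlier strict isinstance(..., float) checks: a value that round-trips through the target type without loss of information (for example an integer used for a float attribute) is now accepted, while str and bool values are still rejected. A minimal sketch of the expected behavior, assuming the paddle package from this commit is importable:

from paddle.trainer_config_helpers.attrs import is_compatible_with

# Integers round-trip through float without loss, so they are accepted.
assert is_compatible_with(0, float)
assert is_compatible_with(5, float)

# Lossy conversions are rejected: int(0.5) == 0, which no longer equals 0.5.
assert not is_compatible_with(0.5, int)

# str and bool are explicitly excluded from initializing float/int attributes.
assert not is_compatible_with("1", float)
assert not is_compatible_with(True, float)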
python/paddle/trainer_config_helpers/tests/layers_test_config.py

@@ -39,10 +39,20 @@ print_layer(input=[out])
 outputs(classification_cost(out, data_layer(name="label", size=num_classes)))
 
 dotmul = mixed_layer(input=[dotmul_operator(x=x1, y=y1),
-                            dotmul_projection(input=y1)])
+                            dotmul_projection(input=y1)])
+
+proj_with_attr_init = mixed_layer(input=full_matrix_projection(input=y1,
+                                                               param_attr=ParamAttr(learning_rate=0,
+                                                                                    initial_mean=0,
+                                                                                    initial_std=0)),
+                                  bias_attr=ParamAttr(initial_mean=0, initial_std=0,
+                                                      learning_rate=0),
+                                  act=LinearActivation(),
+                                  size=5,
+                                  name='proj_with_attr_init')
 
 # for ctc
-tmp = fc_layer(input=[x1, dotmul],
+tmp = fc_layer(input=[x1, dotmul, proj_with_attr_init],
                size=num_classes + 1,
                act=SoftmaxActivation())
 ctc = ctc_layer(input=tmp,
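The new proj_with_attr_init layer in this test exercises exactly that relaxation: learning_rate, initial_mean, and initial_std are passed as the integer 0. Under the old isinstance(..., float) checks this configuration would have fallen through to the "Unexpected branch." error (and an integer learning_rate was silently ignored); with is_compatible_with the integer zeros are accepted. A hedged, standalone sketch of that difference, again assuming the package from this commit is installed:

from paddle.trainer_config_helpers.attrs import ParamAttr

# Integer zeros are now treated as compatible with float, so the attribute
# dict is populated instead of the constructor raising "Unexpected branch."
attr = ParamAttr(learning_rate=0, initial_mean=0, initial_std=0)
print(attr.attr)  # expected to contain initial_mean, initial_std and learning_rate entries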