Crayon鑫 / Paddle (forked from PaddlePaddle / Paddle)

Commit 71ab4df3
Authored on Mar 08, 2017 by Yu Yang
Follow comments, remove reader/batch_size in interface.
Parent: 5905d0e8
Showing 2 changed files with 9 additions and 25 deletions (+9, -25)
demo/mnist/api_train_v2.py      +2, -1
python/paddle/v2/inference.py   +7, -24
demo/mnist/api_train_v2.py

@@ -132,7 +132,8 @@ def main():
     # output is a softmax layer. It returns probabilities.
     # Shape should be (100, 10)
-    probs = paddle.infer(output=predict, parameters=parameters, input=test_data)
+    probs = paddle.infer(
+        output_layer=predict, parameters=parameters, input=test_data)
     print probs.shape
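For orientation, here is a minimal sketch of how this call might be driven in the demo after the rename. The construction of test_data is not part of this commit: the MNIST test reader, the 100-sample batch, and the single-field tuples are assumptions chosen to match the "(100, 10)" comment above, and `predict` and `parameters` are taken to be the demo's softmax output layer and its trained parameters.

import paddle.v2 as paddle

# Illustrative sketch, not from the commit: collect 100 test images into one
# batch (dropping the labels) and run inference with the renamed keyword.
test_reader = paddle.dataset.mnist.test()    # assumed reader of (image, label)
test_data = []
for image, label in test_reader():
    test_data.append((image, ))              # keep only the data-layer field
    if len(test_data) == 100:
        break

# `predict` and `parameters` are assumed to be defined earlier in the demo.
probs = paddle.infer(
    output_layer=predict, parameters=parameters, input=test_data)
print(probs.shape)                           # expected: (100, 10)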
python/paddle/v2/inference.py

@@ -9,8 +9,8 @@ __all__ = ['infer']


 class Inference(object):
-    def __init__(self, output, parameters):
-        topo = topology.Topology(output)
+    def __init__(self, output_layer, parameters):
+        topo = topology.Topology(output_layer)
         gm = api.GradientMachine.createFromConfigProto(
             topo.proto(), api.CREATE_MODE_TESTING, [api.PARAMETER_VALUE])
         for param in gm.getParameters():
@@ -70,13 +70,7 @@ class Inference(object):
         return retv


-def infer(output,
-          parameters,
-          input=None,
-          batch_size=None,
-          reader=None,
-          feeding=None,
-          field='value'):
+def infer(output_layer, parameters, input=None, feeding=None, field='value'):
     """
     Infer a neural network by given neural network output and parameters. The
     user should pass either a batch of input data or reader method.
@@ -89,19 +83,13 @@ def infer(output,
                              batch_size=32)
         print result

-    :param output: output of the neural network that would be inferred
-    :type output: paddle.v2.config_base.Layer
+    :param output_layer: output of the neural network that would be inferred
+    :type output_layer: paddle.v2.config_base.Layer
     :param parameters: parameters of the neural network.
     :type parameters: paddle.v2.parameters.Parameters
     :param input: input data batch. Should be a python iterable object, and each
                   element is the data batch.
     :type input: collections.Iterable
-    :param batch_size: the batch size when perform inference. Default is the
-                       length of input.
-    :type batch_size: int
-    :param reader: input data reader creator in batch. If this field is set, the
-                   `input` and `batch_size` will be ignored.
-    :type reader: callable
     :param feeding: Reader dictionary. Default could generate from input
                     value.
     :param field: The prediction field. It should in [`value`, `ids`]. `value`
@@ -112,10 +100,5 @@ def infer(output,
     :rtype: numpy.ndarray
     """
-    inferer = Inference(output=output, parameters=parameters)
-    return inferer.infer(
-        field=field,
-        input=input,
-        batch_size=batch_size,
-        reader=reader,
-        feeding=feeding)
+    inferer = Inference(output_layer=output_layer, parameters=parameters)
+    return inferer.infer(field=field, input=input, feeding=feeding)
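Because `infer` no longer accepts `reader` or `batch_size`, callers that previously let it batch data internally now batch the data themselves and pass each batch through `input`. Below is a hedged migration sketch, assuming `paddle.batch` and `paddle.dataset.mnist.test` are available as in the demo, and that `predict` and `parameters` are a trained output layer and its parameters; none of this code is part of the commit.

import numpy as np
import paddle.v2 as paddle

# Migration sketch under the assumptions above: the caller now owns batching,
# and infer() is simply called once per materialized batch.
batch_reader = paddle.batch(paddle.dataset.mnist.test(), batch_size=32)

all_probs = []
for batch in batch_reader():
    images_only = [(image, ) for image, label in batch]   # drop the labels
    all_probs.append(
        paddle.infer(
            output_layer=predict, parameters=parameters, input=images_only))

probs = np.concatenate(all_probs)   # stack the per-batch (N, 10) arrays
print(probs.shape)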