Commit e23bc324 ("Update the tutorial of the end-to-end demo"), written on Sep 13, 2017 by Helin Wang. Parent: 0f7f0244.

Showing 1 changed file (mnist-client/README.md) with 57 additions and 10 deletions.
# MNIST classification by PaddlePaddle
Forked from https://github.com/sugyan/tensorflow-mnist


## Usage

This MNIST classification demo consists of two parts: a PaddlePaddle inference server and a JavaScript front end. We will start them separately.

We will use Docker to run the demo. If you are not familiar with Docker, please check out this [tutorial](https://github.com/PaddlePaddle/Paddle/wiki/TLDR-for-new-docker-user).
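Since every step below runs inside Docker, a quick preflight check can confirm that Docker is installed and the daemon is reachable. This is a minimal sketch, not part of the demo itself:

```shell
# Check for the docker CLI first, then ask the daemon for its version.
if command -v docker >/dev/null 2>&1; then
  docker version --format '{{.Server.Version}}' 2>/dev/null \
    || echo "docker CLI found, but the daemon is not reachable"
else
  echo "Docker is not installed; see the tutorial linked above"
fi
```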
### Start the Inference Server

The inference server can be used to run inference on any model trained by PaddlePaddle. Please see [here](TODO) for more details.
1. Download the MNIST inference model topology and parameters to the current working directory.

   ```bash
   wget https://s3.us-east-2.amazonaws.com/models.paddlepaddle/inference_topology.pkl
   wget https://s3.us-east-2.amazonaws.com/models.paddlepaddle/param.tar
   ```

1. Run the following command to start the inference server:

   ```bash
   docker run --name paddle_serve -v `pwd`:/data -d -p 8000:80 -e WITH_GPU=0 paddlepaddle/book:serve
   ```

   The above command mounts the current working directory to the `/data` directory inside the Docker container; the inference server will load the model topology and parameters we just downloaded from there.

   After you are done with the demo, you can run `docker stop paddle_serve` to stop the container.
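Before starting the server container, it can help to sanity-check the bind mount. This minimal sketch prints the exact ``-v `pwd`:/data`` argument Docker receives and warns if either model file is missing from the directory to be mounted (file names taken from the download step above):

```shell
# `pwd` expands (via command substitution) to the absolute path of the
# current directory, so Docker receives e.g. "-v /home/user/demo:/data".
echo "-v `pwd`:/data"

# Warn if either downloaded model file is missing from the directory
# that will be mounted at /data inside the container.
for f in inference_topology.pkl param.tar; do
  [ -f "$f" ] || echo "missing: $f"
done
```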
### Start the Front End

1. Run the following command:

   ```bash
   docker run -it -p 5000:5000 paddlepaddle/book:mnist
   ```

1. Visit http://localhost:5000 and you will see the PaddlePaddle MNIST demo.
## Build

We have already prepared the pre-built Docker image `paddlepaddle/book:mnist`; here are the commands if you want to build the Docker image again.

1. Download `inference_topology.pkl` and `param.tar` to the current directory.
1. Run the following commands:

   ```bash
   docker run -v `pwd`:/data -d -p 8000:80 -e WITH_GPU=0 paddlepaddle/book:serve
   docker build -t paddlepaddle/book:mnist .
   docker run -it -p 5000:5000 paddlepaddle/book:mnist
   ```

1. Visit http://localhost:5000
## Acknowledgement
Thanks to the great project https://github.com/sugyan/tensorflow-mnist. Most of the code in this project comes from there.