# MNIST classification by PaddlePaddle

![screencast](https://cloud.githubusercontent.com/assets/80381/11339453/f04f885e-923c-11e5-8845-33c16978c54d.gif)

## Usage

This MNIST classification demo consists of two parts: a PaddlePaddle
inference server and a JavaScript front end. We will start them
separately.

We will use Docker to run the demo. If you are not familiar with
Docker, please check out this
[tutorial](https://github.com/PaddlePaddle/Paddle/wiki/TLDR-for-new-docker-user).

### Start the Inference Server

The inference server can run inference on any model trained by
PaddlePaddle. Please see [here](../serve/README.md) for more details.

1. Download the MNIST inference model topology and parameters to the
   current working directory.

    ```bash
    wget https://s3.us-east-2.amazonaws.com/models.paddlepaddle/end-to-end-mnist/inference_topology.pkl
    wget https://s3.us-east-2.amazonaws.com/models.paddlepaddle/end-to-end-mnist/param.tar
    ```

1. Run the following command to start the inference server:

    ```bash
    docker run --name paddle_serve -v `pwd`:/data -d -p 8000:80 -e WITH_GPU=0 paddlepaddle/book:serve
    ```

    The above command mounts the current working directory to the
    `/data` directory inside the Docker container. The inference
    server will load the model topology and parameters that we just
    downloaded from that directory.

    After you are done with the demo, you can run
    `docker stop paddle_serve` to stop this Docker container.
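
    While the server is running, you can check that the container is
    up and inspect its logs with standard Docker commands (a quick
    sanity check):

    ```bash
    # List running containers; paddle_serve should appear in the output.
    docker ps --filter name=paddle_serve

    # Show the server logs to verify the model loaded without errors.
    docker logs paddle_serve
    ```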

### Start the Front End

1. Run the following command:
   ```bash
   docker run -it -p 5000:5000 -e BACKEND_URL=http://localhost:8000/ paddlepaddle/book:mnist
   ```

   `BACKEND_URL` in the above command specifies the inference server
   endpoint. If you started the inference server on another machine,
   or want to visit the front end remotely, you may need to change
   its value; see the example after this list.

1. Visit http://localhost:5000 and you will see the PaddlePaddle MNIST demo.
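
If the inference server runs on a different host, you can point the
front end at it by changing `BACKEND_URL`. Here is a minimal sketch,
where `192.168.1.100` is only a placeholder for the server's actual
address:

```bash
docker run -it -p 5000:5000 -e BACKEND_URL=http://192.168.1.100:8000/ paddlepaddle/book:mnist
```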


## Build

We have already prepared the pre-built Docker image
`paddlepaddle/book:mnist`. Here is the command if you want to build
the image again:

```bash
docker build -t paddlepaddle/book:mnist .
```
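
To confirm the image was built and tagged as expected, you can list
it with the standard Docker command:

```bash
docker images paddlepaddle/book:mnist
```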


## Acknowledgement

Thanks to the great project
[tensorflow-mnist](https://github.com/sugyan/tensorflow-mnist). Most
of the code in this project comes from there.