# Encryption Model Prediction

([简体中文](README_CN.md)|English)

## Get Origin Model

This example uses the model file from the fit_a_line example as the original model.

```
sh get_data.sh
```

## Encrypt Model

The `paddlepaddle` package is used in this example, so you may need to install it first (`pip install paddlepaddle`).

[python encrypt.py](./encrypt.py)

[//file]:#encrypt.py
``` python
from paddle_serving_client.io import inference_model_to_serving

def serving_encryption():
    inference_model_to_serving(
        dirname="./uci_housing_model",
        params_filename=None,
        serving_server="encrypt_server",
        serving_client="encrypt_client",
        encryption=True)

if __name__ == "__main__":
    serving_encryption()
```
`dirname` is the folder path where the model is located. If the parameters are stored as separate files, there is no need to specify `params_filename`; if they are combined into a single file, set `params_filename="__params__"`.

The key is stored in the `key` file. The encrypted model and the server-side configuration files are stored in the `encrypt_server` directory, and the client-side configuration files are stored in the `encrypt_client` directory.
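
Conceptually, `encryption=True` generates a random key, encrypts the serialized model parameters with it, and saves the key so the client can supply it at prediction time. As a toy illustration only (this is not Paddle's actual cipher, and the helper names below are invented for the sketch), a symmetric encrypt/decrypt round trip looks like:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom byte stream from key + nonce (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(data: bytes, key: bytes, nonce: bytes) -> bytes:
    """XOR the data with the keystream; applying it twice decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, nonce, len(data))))

key = secrets.token_bytes(32)     # analogous to the generated `key` file
nonce = secrets.token_bytes(16)
params = b"serialized model parameters"
ciphertext = xor_cipher(params, key, nonce)
assert xor_cipher(ciphertext, key, nonce) == params  # decryption round-trips
```

The point of the sketch is the deployment split: the server stores only the encrypted artifacts, while the key travels separately with the client.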

**Notice:** When encrypted prediction is used, the model configuration and parameter folders loaded by the server and client should be `encrypt_server/` and `encrypt_client/` respectively.
## Start Encryption Service
CPU Service
```
python -m paddle_serving_server.serve --model encrypt_server/ --port 9300 --use_encryption_model
```
GPU Service
```
python -m paddle_serving_server.serve --model encrypt_server/ --port 9300 --use_encryption_model --gpu_ids 0
```

## Prediction
```
python test_client.py encrypt_client/serving_client_conf.prototxt
```
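
The command above passes the client configuration file to `test_client.py`. Its client-side flow is roughly the following sketch (the feed/fetch names assume the fit_a_line model, and the endpoint matches the service started above; verify the API against your installed `paddle_serving_client` version):

```python
import sys
from paddle_serving_client import Client

client = Client()
client.load_client_config(sys.argv[1])   # encrypt_client/serving_client_conf.prototxt
client.use_key("./key")                  # key generated by encrypt.py
client.connect(["127.0.0.1:9300"], encryption=True)

# fit_a_line takes a 13-dimensional feature vector and predicts "price".
data = [0.0] * 13
fetch_map = client.predict(feed={"x": data}, fetch=["price"])
print(fetch_map)
```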