# Pruning of image classification model - sensitivity
In this tutorial, you will learn how to use the [sensitivity API of PaddleSlim](https://paddlepaddle.github.io/PaddleSlim/api/prune_api/#sensitivity) through a demo of a MobileNetV1 model on the MNIST dataset.
This tutorial follows the workflow below:
1. Import dependency
2. Build model
3. Define data reader
4. Define function for test
5. Training model
6. Get names of parameters
7. Compute sensitivities
8. Pruning model
## 1. Import dependency
PaddleSlim depends on Paddle 1.7. Please ensure that you have installed Paddle correctly. Import Paddle and PaddleSlim as below:
```python
import paddle
import paddle.fluid as fluid
import paddleslim as slim
```
## 2. Build model
This section builds a classification model based on `MobileNetV1` for the MNIST task. The shape of the input is `[1, 28, 28]` and the number of output classes is 10.
To keep the code simple, we use a helper function in the package `paddleslim.models` to build the classification model.
>Note: The functions in `paddleslim.models` are only intended for tutorials and demos.
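A minimal sketch of this step is shown below. It assumes the `slim.models.image_classification` helper used in other PaddleSlim tutorials, which returns an executor, the train and eval programs, and the input/output variables; running on CPU is also an assumption and can be replaced by `fluid.CUDAPlace(0)`.
```python
# Build MobileNetV1 for MNIST with the helper from paddleslim.models.
# NOTE: the helper name and its return values are assumptions based on
# other PaddleSlim tutorials; adjust them to your installed version.
exe, train_program, val_program, inputs, outputs = slim.models.image_classification(
    "MobileNet", [1, 28, 28], 10, use_gpu=False)
place = fluid.CPUPlace()  # assumption: run on CPU; use fluid.CUDAPlace(0) for GPU
```
The variables `exe`, `val_program`, `inputs`, `outputs`, and `place` returned here are reused by the data feeder, the test function, and the sensitivity analysis below.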
## 3. Define data reader
The MNIST dataset is used so that the demo can be executed quickly. The package `paddle.dataset.mnist` provides functions for downloading and reading the MNIST dataset, shown as below:
```python
import paddle.dataset.mnist as reader
train_reader = paddle.batch(
    reader.train(), batch_size=128, drop_last=True)
test_reader = paddle.batch(
    reader.test(), batch_size=128, drop_last=True)
data_feeder = fluid.DataFeeder(inputs, place)
```
## 4. Define test function
To measure the performance of the model on the test dataset after pruning a convolution layer, we define a test function as below:
```python
import numpy as np

def test(program):
    acc_top1_ns = []
    acc_top5_ns = []
    for data in test_reader():
        acc_top1_n, acc_top5_n, _ = exe.run(
            program,
            feed=data_feeder.feed(data),
            fetch_list=outputs)
        acc_top1_ns.append(np.mean(acc_top1_n))
        acc_top5_ns.append(np.mean(acc_top5_n))
    print("Final eva - acc_top1: {}; acc_top5: {}".format(
        np.mean(np.array(acc_top1_ns)), np.mean(np.array(acc_top5_ns))))
    return np.mean(np.array(acc_top1_ns))
```
## 5. Training model
Sensitivity analysis depends on a pretrained model, so we should train the model defined in section 2 for some epochs. One epoch of training is enough for this simple demo, while more epochs may be necessary for other models. Alternatively, you can load a pretrained model from the filesystem.
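A minimal sketch of one training epoch is shown below; it assumes the `exe`, `train_program`, and `outputs` produced by the model-building sketch in section 2.
```python
# One pass over the training data (one epoch).
# exe, train_program and outputs are assumptions carried over from the
# model-building sketch in section 2.
for data in train_reader():
    acc1, acc5, loss = exe.run(
        train_program,
        feed=data_feeder.feed(data),
        fetch_list=outputs)
print(np.mean(acc1), np.mean(acc5), np.mean(loss))
```
## 6. Get names of parameters
Sensitivity analysis needs the names of the convolution weight parameters to be analyzed. A possible way to collect them is sketched below; the `_sep_weights` name filter is an assumption based on the parameter naming of the MobileNetV1 model in `paddleslim.models`, so adjust it for your own network.
```python
# Collect the names of the convolution weight parameters to analyze.
# The "_sep_weights" filter is an assumption tied to MobileNetV1's
# parameter naming; change it to match your model.
params = []
for param in train_program.global_block().all_parameters():
    if "_sep_weights" in param.name:
        params.append(param.name)
print(params)
```
The resulting `params` list is passed to the sensitivity API below.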
## 7. Compute sensitivities
### 7.1 Computing sensitivity
Apply sensitivity analysis to the pretrained model by calling the [sensitivity API](https://paddlepaddle.github.io/PaddleSlim/api/prune_api/#sensitivity).
The sensitivities will be appended to the file given by the option `sensitivities_file` during computation.
The information already stored in this file won't be computed again.
Remove the file `sensitivities_0.data` in the current directory first:
```python
!rm -rf sensitivities_0.data
```
Apart from the parameters to be analyzed, the API also supports setting the ratios by which each convolution layer will be pruned.
For example, if a model loses 90% of its accuracy on the test dataset when a single convolution layer is pruned by 40%, we can set `pruned_ratios` to `[0.1, 0.2, 0.3, 0.4]`.
A finer granularity of `pruned_ratios` gives more reasonable sensitivities, but it also slows down the computation.
```python
sens_0 = slim.prune.sensitivity(
val_program,
place,
params,
test,
sensitivities_file="sensitivities_0.data",
pruned_ratios=[0.1, 0.2])
print(sens_0)
```
### 7.2 Expand sensitivities
We can expand `pruned_ratios` to `[0.1, 0.2, 0.3]` based on the sensitivities generated in section 7.1.
```python
sens_0 = slim.prune.sensitivity(
val_program,
place,
params,
test,
sensitivities_file="sensitivities_0.data",
pruned_ratios=[0.3])
print(sens_0)
```
### 7.3 Computing sensitivity in multiple processes
The time cost of computing sensitivities depends on the number of parameters and the speed of model evaluation on the test dataset. We can speed up the computation with multiple processes:
split `pruned_ratios` across the processes, then merge the sensitivities they produce.
#### 7.3.1 Computing in each process
We have already computed the sensitivities for `pruned_ratios=[0.1, 0.2, 0.3]` and saved them into the file named `sensitivities_0.data`.
Then, in another process, we start a task with `pruned_ratios=[0.4]` and save the result into a file named `sensitivities_1.data`. Show as below:
```python
sens_1 = slim.prune.sensitivity(
val_program,
place,
params,
test,
sensitivities_file="sensitivities_1.data",
pruned_ratios=[0.4])
print(sens_1)
```
#### 7.3.2 Load sensitivity files generated by multiple processes
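A possible sketch of this step is shown below, assuming the `slim.prune.load_sensitivities` and `slim.prune.merge_sensitive` helpers documented in the same prune API; verify the names against your installed PaddleSlim version.
```python
# Load the sensitivities saved by each process and merge them into one result.
# load_sensitivities/merge_sensitive are assumed to be available in
# slim.prune, matching the sensitivity API referenced above.
s_0 = slim.prune.load_sensitivities("sensitivities_0.data")
s_1 = slim.prune.load_sensitivities("sensitivities_1.data")
merged = slim.prune.merge_sensitive([s_0, s_1])
print(merged)
```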