# Installing PaddleDetection

---
## Table of Contents

- [Introduction](#introduction)
- [PaddlePaddle](#paddlepaddle)
- [Other Dependencies](#other-dependencies)
- [PaddleDetection](#paddledetection)
- [Datasets](#datasets)


## Introduction

This document covers how to install PaddleDetection, its dependencies (including PaddlePaddle), and the COCO and PASCAL VOC datasets.

For general information about PaddleDetection, please see [README.md](../README.md).


## PaddlePaddle

Running PaddleDetection requires PaddlePaddle Fluid v1.5 or later. Please follow the instructions in the [installation document](http://www.paddlepaddle.org/documentation/docs/en/1.4/beginners_guide/install/index_en.html).

Please make sure your PaddlePaddle installation was successful and that the installed version is not lower than the required version. You can verify the installation with the following commands:

```
# To check if PaddlePaddle installation was successful
python -c "import paddle.fluid as fluid; fluid.install_check.run_check()"

# To print PaddlePaddle version
python -c "import paddle; print(paddle.__version__)"
```

### Requirements:

- Python2 or Python3
- CUDA >= 8.0
- cuDNN >= 5.0
- nccl >= 2.1.2
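
If you are unsure which CUDA, cuDNN, or NCCL versions are installed, the commands below are one way to check (a minimal sketch; the header locations `/usr/local/cuda/include/cudnn.h` and `/usr/include/nccl.h` are assumptions and may differ on your system):

```
# CUDA toolkit version (assumes nvcc is on PATH)
nvcc --version

# cuDNN version (header location is an assumption)
grep "#define CUDNN" /usr/local/cuda/include/cudnn.h

# NCCL version (header location is an assumption)
grep "#define NCCL" /usr/include/nccl.h
```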


## Other Dependencies

**Install the [COCO-API](https://github.com/cocodataset/cocoapi):**

To train the model, COCO-API is needed. Installation is as follows:

    git clone https://github.com/cocodataset/cocoapi.git
    cd cocoapi/PythonAPI
    # if cython is not installed
    pip install Cython
    # Install into global site-packages
    make install
    # Alternatively, if you do not have permissions or prefer
    # not to install the COCO API into global site-packages
    python setup.py install --user
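
After installation, a quick way to confirm the COCO API is importable (a sanity check added here, not part of the upstream instructions):

```
python -c "from pycocotools.coco import COCO; print('pycocotools OK')"
```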


## PaddleDetection

**Clone Paddle models repository:**

You can clone the Paddle models repository and change to the PaddleDetection directory with the following commands:

```
cd <path/to/clone/models>
git clone https://github.com/PaddlePaddle/models
cd models/PaddleCV/object_detection
```

**Install Python module requirements:**

Other Python module requirements are listed in [requirements.txt](../requirements.txt); you can install them with the following command:

```
pip install -r requirements.txt
```

**Check that the PaddleDetection architecture tests pass:**

```
export PYTHONPATH=`pwd`:$PYTHONPATH
python ppdet/modeling/tests/test_architectures.py
```


## Datasets

PaddleDetection supports training, evaluation, and inference with the [MSCOCO](http://cocodataset.org) and [PASCAL VOC](http://host.robots.ox.ac.uk/pascal/VOC/) datasets. You can set up the datasets as follows.

**Create symlinks for datasets:**

The default dataset paths in PaddleDetection config files are `dataset/coco` and `dataset/voc`. You can create symlinks to your COCO/COCO-like or VOC/VOC-like datasets with the following commands:

```
ln -sf <path/to/coco> $PaddleDetection/dataset/coco
ln -sf <path/to/voc> $PaddleDetection/dataset/voc
```
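
For reference, a COCO-style dataset linked at `dataset/coco` typically looks like the sketch below (the standard MS-COCO 2017 layout; the exact annotation file and image folder names are assumptions that depend on the configs you use):

```
dataset/coco/
├── annotations/
│   ├── instances_train2017.json
│   └── instances_val2017.json
├── train2017/
│   └── *.jpg
└── val2017/
    └── *.jpg
```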

If you do not have the datasets locally, you can download them as follows:

- MS-COCO

```
cd dataset/coco
./download.sh
```

- PASCAL VOC

```
cd dataset/voc
./download.sh
```

**Auto download datasets:**

If `dataset/coco` or `dataset/voc` is not found when you set up a model, PaddleDetection will automatically download the datasets from [MSCOCO-2017](http://images.cocodataset.org) and [VOC2012](http://host.robots.ox.ac.uk/pascal/VOC). The decompressed datasets will be placed in `~/.cache/paddle/dataset/` and will be discovered automatically the next time you set up a model.
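
To see which datasets have already been cached (a minimal check using the cache path noted above):

```
ls ~/.cache/paddle/dataset/
```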


**NOTE:** For further information on the datasets, please see [DATASET.md](DATA.md).