# PaddlePaddle

English | [简体中文](./README_cn.md)

[![Build Status](https://travis-ci.org/PaddlePaddle/Paddle.svg?branch=develop)](https://travis-ci.org/PaddlePaddle/Paddle)
[![Documentation Status](https://img.shields.io/badge/docs-latest-brightgreen.svg?style=flat)](http://www.paddlepaddle.org.cn/documentation/docs/en/1.6/beginners_guide/index_en.html)
[![Documentation Status](https://img.shields.io/badge/中文文档-最新-brightgreen.svg)](http://www.paddlepaddle.org.cn/documentation/docs/zh/1.6/beginners_guide/index_cn.html)
[![Release](https://img.shields.io/github/release/PaddlePaddle/Paddle.svg)](https://github.com/PaddlePaddle/Paddle/releases)
[![License](https://img.shields.io/badge/license-Apache%202-blue.svg)](LICENSE)

Welcome to the PaddlePaddle GitHub.

PaddlePaddle (PArallel Distributed Deep LEarning) is an easy-to-use,
efficient, flexible, and scalable deep learning platform, originally
developed by Baidu scientists and engineers to apply deep learning to
many products at Baidu.

Our vision is to enable deep learning for everyone via PaddlePaddle.
Please refer to our [release announcement](https://github.com/PaddlePaddle/Paddle/releases) to track the latest features of PaddlePaddle.

### Latest PaddlePaddle Release: [v1.6](https://github.com/PaddlePaddle/Paddle/tree/release/1.6)
### Install Latest Stable Release:
```
# Linux CPU
pip install paddlepaddle

# Linux GPU cuda10cudnn7
pip install paddlepaddle-gpu

# Linux GPU cuda9cudnn7
pip install paddlepaddle-gpu==1.6.2.post97

# For installation on other platforms, refer to http://paddlepaddle.org/
```
Developers can now access Tesla V100 online computing resources for free. If you create a program on AI Studio, you get 12 hours per day to train models online; keep this up for five consecutive days and you receive an extra 48 hours. [Click here to start](http://ai.baidu.com/support/news?action=detail&id=981).

## Features

- **Flexibility**

    PaddlePaddle supports a wide range of neural network architectures and
    optimization algorithms. It is easy to configure complex models such as
    a neural machine translation model with an attention mechanism or complex
    memory connections.

- **Efficiency**

    In order to unleash the power of heterogeneous computing resources,
    optimization occurs at different levels of PaddlePaddle, including
    computing, memory, architecture, and communication. The following are some
    examples:

      - Optimized math operations through SSE/AVX intrinsics, BLAS libraries
      (e.g. MKL, OpenBLAS, cuBLAS) or customized CPU/GPU kernels.
      - Optimized CNN networks through the MKL-DNN library.
      - Highly optimized recurrent networks which can handle **variable-length**
      sequences without padding.
      - Optimized local and distributed training for models with high-dimensional
      sparse data.

- **Scalability**

    With PaddlePaddle, it is easy to use many CPUs/GPUs and machines to speed
    up your training. PaddlePaddle can achieve high throughput and performance
    via optimized communication.

- **Connected to Products**

    In addition, PaddlePaddle is also designed to be easily deployable. At Baidu,
    PaddlePaddle has been deployed into products and services with a vast number
    of users, including ad click-through rate (CTR) prediction, large-scale image
    classification, optical character recognition (OCR), search ranking, computer
    virus detection, and recommendation. It is widely utilized in products at
    Baidu and has achieved a significant impact. We hope you can also explore
    the capability of PaddlePaddle to make an impact on your product.

## Installation

It is recommended to read [this doc](http://www.paddlepaddle.org.cn/documentation/docs/en/1.6/beginners_guide/index_en.html) on our website.

## Documentation

We provide [English](http://www.paddlepaddle.org.cn/documentation/docs/en/1.6/beginners_guide/index_en.html) and
[Chinese](http://www.paddlepaddle.org.cn/documentation/docs/zh/1.6/beginners_guide/install/index_cn.html) documentation.

- [Deep Learning 101](https://github.com/PaddlePaddle/book)

  You might want to start from this online interactive book that can run in a Jupyter Notebook.

- [Distributed Training](http://paddlepaddle.org.cn/documentation/docs/en/1.6/user_guides/howto/training/multi_node_en.html)

  You can run distributed training jobs on MPI clusters.

- [Python API](http://paddlepaddle.org.cn/documentation/docs/en/1.6/api/index_en.html)

  Our new API enables much shorter programs.

- [How to Contribute](http://paddlepaddle.org.cn/documentation/docs/en/1.6/advanced_usage/development/contribute_to_paddle/index_en.html)

  We appreciate your contributions!

## Communication

- [Github Issues](https://github.com/PaddlePaddle/Paddle/issues): bug reports, feature requests, install issues, usage issues, etc.
- QQ discussion group: 796771754 (PaddlePaddle).
- [Forums](http://ai.baidu.com/forum/topic/list/168?pageNo=1): discuss implementations, research, etc.

## Copyright and License
PaddlePaddle is provided under the [Apache-2.0 license](LICENSE).