pip install"paddleocr>=2.0.1"# Recommend to use version 2.0.1+
pip install"paddleocr>=2.0.1"# Recommend to use version 2.0.1+
...

build own whl package and install
```bash
python3 setup.py bdist_wheel
pip3 install dist/paddleocr-x.x.x-py3-none-any.whl # x.x.x is the version of paddleocr
```
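As a quick sanity check after either install path, the installed version and the import can be verified like this (the version shown will depend on what you installed):
```bash
# Optional check: confirm the installed version and that the package imports
pip3 show paddleocr
python3 -c "from paddleocr import PaddleOCR; print('paddleocr imports OK')"
```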
## 2 Use
### 2.1 Use by code
The paddleocr whl package will automatically download the ppocr lightweight model as the default model, which can be customized and replaced according to section 3, **Use custom model**.
* detection, angle classification and recognition
```python
from paddleocr import PaddleOCR, draw_ocr
# Paddleocr supports Chinese, English, French, German, Korean and Japanese.
...
```

...

Output will be a list, each item contains classification result and confidence
```
['0', 0.99999964]
```
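For reference, a minimal sketch of the full Python flow elided above; the image path is only a placeholder, and the model files are downloaded automatically on first use:
```python
from paddleocr import PaddleOCR

# Sketch only: enable the angle classifier and select the English model.
ocr = PaddleOCR(use_angle_cls=True, lang='en')  # downloads and loads models on first run

img_path = 'PaddleOCR/doc/imgs_en/img_12.jpg'   # placeholder path, use your own image
result = ocr.ocr(img_path, cls=True)
for line in result:
    print(line)  # each item: [bounding box points, (recognized text, confidence)]
```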
### 2.2 Use by command line
show help information
```bash
...
```

...

Output will be a list, each item contains classification result and confidence
```
['0', 0.99999964]
```
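As a sketch of typical command-line invocations elided above (flag names as in the 2.x CLI; image paths are placeholders):
```bash
# detection + angle classification + recognition
paddleocr --image_dir ./imgs_en/img_12.jpg --use_angle_cls true --lang en
# detection only
paddleocr --image_dir ./imgs_en/img_12.jpg --rec false
# recognition only
paddleocr --image_dir ./imgs_en/img_12.jpg --det false
```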
## 3 Use custom model
When the built-in model cannot meet your needs, you need to use your own trained model.
First, refer to the first section of [inference_en.md](./inference_en.md) to convert your det and rec models to inference models, and then use them as follows:
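A minimal sketch of passing custom inference models to the whl package; the directory and image paths below are placeholders for wherever your exported det/rec/cls models live:
```python
from paddleocr import PaddleOCR

# Placeholder directories: point these at your own exported inference models
ocr = PaddleOCR(det_model_dir='./inference/det/',
                rec_model_dir='./inference/rec/',
                cls_model_dir='./inference/cls/',
                use_angle_cls=True)

result = ocr.ocr('your_image.jpg', cls=True)  # placeholder image path
for line in result:
    print(line)
```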