{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Text recognition practice\n", "\n", "In the theoretical part of the previous chapter, the main methods in the field of text recognition were introduced. Among them, CRNN was proposed earlier and is currently more widely used in the industry. This chapter will introduce in detail how to build, train, evaluate and predict the CRNN text recognition model based on PaddleOCR. The data set is icdar 2015, in which there are 4468 pieces in the training set and 2077 pieces in the test set.\n", "\n", "\n", "Through the study of this chapter, you can master:\n", "\n", "1. How to use PaddleOCR whl package to quickly complete text recognition prediction\n", "\n", "2. The basic principles and network structure of CRNN\n", "\n", "3. The necessary steps and parameter adjustment methods for model training\n", "\n", "4. Use a custom data set to train the network\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 1 Quick experience\n", "\n", "### 1.1 Install related dependencies and whl packages\n", "\n", "First confirm that paddle and paddleocr are installed. If they have been installed, ignore this step." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple\n", "Requirement already satisfied: paddlepaddle-gpu in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (2.1.2.post101)\n", "Requirement already satisfied: protobuf>=3.1.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddlepaddle-gpu) (3.14.0)\n", "Requirement already satisfied: numpy>=1.13; python_version >= \"3.5\" and platform_system != \"Windows\" in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddlepaddle-gpu) (1.20.3)\n", "Requirement already satisfied: Pillow in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddlepaddle-gpu) (7.1.2)\n", "Requirement already satisfied: six in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddlepaddle-gpu) (1.15.0)\n", "Requirement already satisfied: requests>=2.20.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddlepaddle-gpu) (2.22.0)\n", "Requirement already satisfied: astor in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddlepaddle-gpu) (0.8.1)\n", "Requirement already satisfied: decorator in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddlepaddle-gpu) (4.4.2)\n", "Requirement already satisfied: gast<=0.4.0,>=0.3.3; platform_system != \"Windows\" in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddlepaddle-gpu) (0.3.3)\n", "Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests>=2.20.0->paddlepaddle-gpu) (1.25.6)\n", "Requirement already satisfied: certifi>=2017.4.17 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests>=2.20.0->paddlepaddle-gpu) (2019.9.11)\n", "Requirement already satisfied: idna<2.9,>=2.5 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests>=2.20.0->paddlepaddle-gpu) (2.8)\n", "Requirement already satisfied: chardet<3.1.0,>=3.0.2 in 
/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests>=2.20.0->paddlepaddle-gpu) (3.0.4)\n", "Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple\n", "Collecting pip\n", "\u001b[?25l Downloading https://pypi.tuna.tsinghua.edu.cn/packages/a4/6d/6463d49a933f547439d6b5b98b46af8742cc03ae83543e4d7688c2420f8b/pip-21.3.1-py3-none-any.whl (1.7MB)\n", "\u001b[K |████████████████████████████████| 1.7MB 8.4MB/s eta 0:00:01\n", "\u001b[?25hInstalling collected packages: pip\n", " Found existing installation: pip 19.2.3\n", " Uninstalling pip-19.2.3:\n", " Successfully uninstalled pip-19.2.3\n", "Successfully installed pip-21.3.1\n", "Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple\n", "Collecting paddleocr\n", " Downloading https://pypi.tuna.tsinghua.edu.cn/packages/e1/b6/5486e674ce096667dff247b58bf0fb789c2ce17a10e546c2686a2bb07aec/paddleocr-2.3.0.2-py3-none-any.whl (250 kB)\n", " |████████████████████████████████| 250 kB 3.3 MB/s \n", "\u001b[?25hCollecting lmdb\n", " Downloading https://pypi.tuna.tsinghua.edu.cn/packages/2e/dd/ada2fd91cd7832979069c556607903f274470c3d3d2274e0a848908272e8/lmdb-1.2.1-cp37-cp37m-manylinux2010_x86_64.whl (299 kB)\n", " |████████████████████████████████| 299 kB 12.8 MB/s \n", "\u001b[?25hCollecting lxml\n", " Downloading https://pypi.tuna.tsinghua.edu.cn/packages/7b/01/16a9b80c8ce4339294bb944f08e157dbfcfbb09ba9031bde4ddf7e3e5499/lxml-4.7.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl (6.4 MB)\n", " |████████████████████████████████| 6.4 MB 52.4 MB/s \n", "\u001b[?25hCollecting python-Levenshtein\n", " Downloading https://pypi.tuna.tsinghua.edu.cn/packages/2a/dc/97f2b63ef0fa1fd78dcb7195aca577804f6b2b51e712516cc0e902a9a201/python-Levenshtein-0.12.2.tar.gz (50 kB)\n", " |████████████████████████████████| 50 kB 1.6 MB/s \n", "\u001b[?25h Preparing metadata (setup.py) ... 
\u001b[?25ldone\n", "\u001b[?25hCollecting scikit-image\n", " Downloading https://pypi.tuna.tsinghua.edu.cn/packages/9a/44/8f8c7f9c9de7fde70587a656d7df7d056e6f05192a74491f7bc074a724d0/scikit_image-0.19.1-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (13.3 MB)\n", " |████████████████████████████████| 13.3 MB 56.1 MB/s \n", "\u001b[?25hRequirement already satisfied: numpy in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddleocr) (1.20.3)\n", "Collecting imgaug==0.4.0\n", " Downloading https://pypi.tuna.tsinghua.edu.cn/packages/66/b1/af3142c4a85cba6da9f4ebb5ff4e21e2616309552caca5e8acefe9840622/imgaug-0.4.0-py2.py3-none-any.whl (948 kB)\n", " |████████████████████████████████| 948 kB 62.9 MB/s \n", "\u001b[?25hCollecting opencv-contrib-python==4.4.0.46\n", " Downloading https://pypi.tuna.tsinghua.edu.cn/packages/08/51/1e0a206dd5c70fea91084e6f43979dc13e8eb175760cc7a105083ec3eb68/opencv_contrib_python-4.4.0.46-cp37-cp37m-manylinux2014_x86_64.whl (55.7 MB)\n", " |████████████████████████████████| 55.7 MB 44 kB/s 0:01\n", "\u001b[?25hCollecting premailer\n", " Downloading https://pypi.tuna.tsinghua.edu.cn/packages/b1/07/4e8d94f94c7d41ca5ddf8a9695ad87b888104e2fd41a35546c1dc9ca74ac/premailer-3.10.0-py2.py3-none-any.whl (19 kB)\n", "Collecting shapely\n", " Downloading https://pypi.tuna.tsinghua.edu.cn/packages/ae/20/33ce377bd24d122a4d54e22ae2c445b9b1be8240edb50040b40add950cd9/Shapely-1.8.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (1.1 MB)\n", " |████████████████████████████████| 1.1 MB 14.5 MB/s \n", "\u001b[?25hRequirement already satisfied: visualdl in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddleocr) (2.2.0)\n", "Collecting fasttext==0.9.1\n", " Downloading https://pypi.tuna.tsinghua.edu.cn/packages/10/61/2e01f1397ec533756c1d893c22d9d5ed3fce3a6e4af1976e0d86bb13ea97/fasttext-0.9.1.tar.gz (57 kB)\n", " |████████████████████████████████| 57 kB 9.0 MB/s \n", "\u001b[?25h Preparing metadata (setup.py) ... 
\u001b[?25ldone\n", "\u001b[?25hRequirement already satisfied: cython in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddleocr) (0.29)\n", "Requirement already satisfied: openpyxl in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddleocr) (3.0.5)\n", "Collecting pyclipper\n", " Downloading https://pypi.tuna.tsinghua.edu.cn/packages/c5/fa/2c294127e4f88967149a68ad5b3e43636e94e3721109572f8f17ab15b772/pyclipper-1.3.0.post2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (603 kB)\n", " |████████████████████████████████| 603 kB 7.6 MB/s \n", "\u001b[?25hRequirement already satisfied: tqdm in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddleocr) (4.36.1)\n", "Collecting pybind11>=2.2\n", " Using cached https://pypi.tuna.tsinghua.edu.cn/packages/a8/3b/fc246e1d4c7547a7a07df830128e93c6215e9b93dcb118b2a47a70726153/pybind11-2.8.1-py2.py3-none-any.whl (208 kB)\n", "Requirement already satisfied: setuptools>=0.7.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from fasttext==0.9.1->paddleocr) (56.2.0)\n", "Requirement already satisfied: Pillow in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from imgaug==0.4.0->paddleocr) (7.1.2)\n", "Requirement already satisfied: imageio in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from imgaug==0.4.0->paddleocr) (2.6.1)\n", "Requirement already satisfied: scipy in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from imgaug==0.4.0->paddleocr) (1.6.3)\n", "Requirement already satisfied: opencv-python in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from imgaug==0.4.0->paddleocr) (4.1.1.26)\n", "Requirement already satisfied: matplotlib in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from imgaug==0.4.0->paddleocr) (2.2.3)\n", "Requirement already satisfied: six in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from imgaug==0.4.0->paddleocr) (1.15.0)\n", "Collecting PyWavelets>=1.1.1\n", " Downloading https://pypi.tuna.tsinghua.edu.cn/packages/a1/9c/564511b6e1c4e1d835ed2d146670436036960d09339a8fa2921fe42dad08/PyWavelets-1.2.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (6.1 MB)\n", " |████████████████████████████████| 6.1 MB 3.8 MB/s \n", "\u001b[?25hRequirement already satisfied: packaging>=20.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from scikit-image->paddleocr) (20.9)\n", "Requirement already satisfied: networkx>=2.2 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from scikit-image->paddleocr) (2.4)\n", "Collecting tifffile>=2019.7.26\n", " Downloading https://pypi.tuna.tsinghua.edu.cn/packages/d8/38/85ae5ed77598ca90558c17a2f79ddaba33173b31cf8d8f545d34d9134f0d/tifffile-2021.11.2-py3-none-any.whl (178 kB)\n", " |████████████████████████████████| 178 kB 7.1 MB/s \n", "\u001b[?25hRequirement already satisfied: et-xmlfile in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from openpyxl->paddleocr) (1.0.1)\n", "Requirement already satisfied: jdcal in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from openpyxl->paddleocr) (1.4.1)\n", "Requirement already satisfied: requests in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from premailer->paddleocr) (2.22.0)\n", "Collecting cssselect\n", " Downloading 
https://pypi.tuna.tsinghua.edu.cn/packages/3b/d4/3b5c17f00cce85b9a1e6f91096e1cc8e8ede2e1be8e96b87ce1ed09e92c5/cssselect-1.1.0-py2.py3-none-any.whl (16 kB)\n", "Collecting cssutils\n", " Downloading https://pypi.tuna.tsinghua.edu.cn/packages/24/c4/9db28fe567612896d360ab28ad02ee8ae107d0e92a22db39affd3fba6212/cssutils-2.3.0-py3-none-any.whl (404 kB)\n", " |████████████████████████████████| 404 kB 134 kB/s \n", "\u001b[?25hRequirement already satisfied: cachetools in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from premailer->paddleocr) (4.0.0)\n", "Requirement already satisfied: pre-commit in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddleocr) (1.21.0)\n", "Requirement already satisfied: Flask-Babel>=1.0.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddleocr) (1.0.0)\n", "Requirement already satisfied: flask>=1.1.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddleocr) (1.1.1)\n", "Requirement already satisfied: flake8>=3.7.9 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddleocr) (3.8.2)\n", "Requirement already satisfied: shellcheck-py in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddleocr) (0.7.1.1)\n", "Requirement already satisfied: pandas in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddleocr) (1.1.5)\n", "Requirement already satisfied: bce-python-sdk in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddleocr) (0.8.53)\n", "Requirement already satisfied: protobuf>=3.11.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddleocr) (3.14.0)\n", "Requirement already satisfied: pyflakes<2.3.0,>=2.2.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flake8>=3.7.9->visualdl->paddleocr) (2.2.0)\n", "Requirement already satisfied: pycodestyle<2.7.0,>=2.6.0a1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flake8>=3.7.9->visualdl->paddleocr) (2.6.0)\n", "Requirement already satisfied: importlib-metadata in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flake8>=3.7.9->visualdl->paddleocr) (0.23)\n", "Requirement already satisfied: mccabe<0.7.0,>=0.6.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flake8>=3.7.9->visualdl->paddleocr) (0.6.1)\n", "Requirement already satisfied: itsdangerous>=0.24 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flask>=1.1.1->visualdl->paddleocr) (1.1.0)\n", "Requirement already satisfied: Jinja2>=2.10.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flask>=1.1.1->visualdl->paddleocr) (2.11.0)\n", "Requirement already satisfied: click>=5.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flask>=1.1.1->visualdl->paddleocr) (7.0)\n", "Requirement already satisfied: Werkzeug>=0.15 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flask>=1.1.1->visualdl->paddleocr) (0.16.0)\n", "Requirement already satisfied: pytz in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from Flask-Babel>=1.0.0->visualdl->paddleocr) (2019.3)\n", "Requirement already satisfied: Babel>=2.3 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from Flask-Babel>=1.0.0->visualdl->paddleocr) (2.8.0)\n", 
"Requirement already satisfied: decorator>=4.3.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from networkx>=2.2->scikit-image->paddleocr) (4.4.2)\n", "Requirement already satisfied: pyparsing>=2.0.2 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from packaging>=20.0->scikit-image->paddleocr) (2.4.2)\n", "Requirement already satisfied: pycryptodome>=3.8.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from bce-python-sdk->visualdl->paddleocr) (3.9.9)\n", "Requirement already satisfied: future>=0.6.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from bce-python-sdk->visualdl->paddleocr) (0.18.0)\n", "Requirement already satisfied: kiwisolver>=1.0.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from matplotlib->imgaug==0.4.0->paddleocr) (1.1.0)\n", "Requirement already satisfied: python-dateutil>=2.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from matplotlib->imgaug==0.4.0->paddleocr) (2.8.0)\n", "Requirement already satisfied: cycler>=0.10 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from matplotlib->imgaug==0.4.0->paddleocr) (0.10.0)\n", "Requirement already satisfied: aspy.yaml in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->paddleocr) (1.3.0)\n", "Requirement already satisfied: virtualenv>=15.2 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->paddleocr) (16.7.9)\n", "Requirement already satisfied: nodeenv>=0.11.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->paddleocr) (1.3.4)\n", "Requirement already satisfied: pyyaml in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->paddleocr) (5.1.2)\n", "Requirement already satisfied: cfgv>=2.0.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->paddleocr) (2.0.1)\n", "Requirement already satisfied: identify>=1.0.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->paddleocr) (1.4.10)\n", "Requirement already satisfied: toml in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->paddleocr) (0.10.0)\n", "Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests->premailer->paddleocr) (1.25.6)\n", "Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests->premailer->paddleocr) (3.0.4)\n", "Requirement already satisfied: idna<2.9,>=2.5 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests->premailer->paddleocr) (2.8)\n", "Requirement already satisfied: certifi>=2017.4.17 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests->premailer->paddleocr) (2019.9.11)\n", "Requirement already satisfied: MarkupSafe>=0.23 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from Jinja2>=2.10.1->flask>=1.1.1->visualdl->paddleocr) (1.1.1)\n", "Requirement already satisfied: zipp>=0.5 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from importlib-metadata->flake8>=3.7.9->visualdl->paddleocr) (3.6.0)\n", "Building wheels for collected packages: fasttext, python-Levenshtein\n", " Building 
wheel for fasttext (setup.py) ... \u001b[?25ldone\n", "\u001b[?25h Created wheel for fasttext: filename=fasttext-0.9.1-cp37-cp37m-linux_x86_64.whl size=2584156 sha256=acb4d4fde73d31c7dfdd2ae3de0da25a558c34c672d4904e6a5c4279185fe5af\n", " Stored in directory: /home/aistudio/.cache/pip/wheels/a1/cb/b3/a25a8ce16c1a4ff102c1e40d6eaa4dfc9d5695b92d57331b36\n", " Building wheel for python-Levenshtein (setup.py) ... \u001b[?25ldone\n", "\u001b[?25h Created wheel for python-Levenshtein: filename=python_Levenshtein-0.12.2-cp37-cp37m-linux_x86_64.whl size=171687 sha256=56b4a2de4349a05004121050df68b488ffd253dcc59187ca07b89b62d40c0218\n", " Stored in directory: /home/aistudio/.cache/pip/wheels/38/b9/a4/3729726160fb103833de468adb5ce019b58543ae41d0b0e446\n", "Successfully built fasttext python-Levenshtein\n", "Installing collected packages: tifffile, PyWavelets, shapely, scikit-image, pybind11, lxml, cssutils, cssselect, python-Levenshtein, pyclipper, premailer, opencv-contrib-python, lmdb, imgaug, fasttext, paddleocr\n", "Successfully installed PyWavelets-1.2.0 cssselect-1.1.0 cssutils-2.3.0 fasttext-0.9.1 imgaug-0.4.0 lmdb-1.2.1 lxml-4.7.1 opencv-contrib-python-4.4.0.46 paddleocr-2.3.0.2 premailer-3.10.0 pybind11-2.8.1 pyclipper-1.3.0.post2 python-Levenshtein-0.12.2 scikit-image-0.19.1 shapely-1.8.0 tifffile-2021.11.2\n" ] } ], "source": [ "# Install PaddlePaddle GPU version\n", "!pip install paddlepaddle-gpu\n", "# Install PaddleOCR whl package\n", "!pip install -U pip\n", "!pip install paddleocr" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 1.2 Quickly predict text content\n", "\n", "The PaddleOCR whl package will automatically download the ppocr lightweight model as the default model\n", "\n", "The following shows how to use the whl package for recognition prediction:\n", "\n", "Test picture:\n", "\n", "![](https://ai-studio-static-online.cdn.bcebos.com/531d9b3aff45449893b33bcb5dd13971057fcb4038f045578b3abd99fa3a96f2)" ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[2021/12/23 20:28:44] root WARNING: version 2.1 not support cls models, use version 2.0 instead\n", "download https://paddleocr.bj.bcebos.com/PP-OCRv2/chinese/ch_PP-OCRv2_det_infer.tar to /home/aistudio/.paddleocr/2.2.1/ocr/det/ch/ch_PP-OCRv2_det_infer/ch_PP-OCRv2_det_infer.tar\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/skimage/morphology/_skeletonize.py:241: DeprecationWarning: `np.bool` is a deprecated alias for the builtin `bool`. To silence this warning, use `bool` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.bool_` here.\n", "Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations\n", " 0, 1, 1, 0, 0, 1, 0, 0, 0], dtype=np.bool)\n", "/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/skimage/morphology/_skeletonize.py:256: DeprecationWarning: `np.bool` is a deprecated alias for the builtin `bool`. To silence this warning, use `bool` by itself. Doing this will not modify any behavior and is safe. 
If you specifically wanted the numpy scalar type, use `np.bool_` here.\n", "Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations\n", " 0, 0, 0, 0, 0, 0, 0, 0, 0], dtype=np.bool)\n" ] } ], "source": [ "# Use the whl package for recognition-only prediction on the test image shown above\n", "# (a minimal example; the default lightweight models are downloaded automatically on first use)\n", "from paddleocr import PaddleOCR\n", "\n", "ocr = PaddleOCR()\n", "result = ocr.ocr(\"/home/aistudio/work/word_1.png\", det=False)\n", "print(result)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 2.2 Detailed algorithm\n", "\n", "The CRNN network structure is shown below. From bottom to top, it consists of three parts: the convolutional layers, the recurrent layers, and the transcription layer:\n",
\n", "\n", "1. Backbone:\n", "\n", "As the underlying backbone network, the convolutional network is used to extract feature sequences from the input image. Since `conv`, `max-pooling`, `elementwise` and activation functions all act on the local area, they are translation invariant. Therefore, each column of the feature map corresponds to a rectangular area (called a receptive field) of the original image, and these rectangular areas are in the same order from left to right as their corresponding columns on the feature map. Since CNN needs to scale the input image to a fixed size to meet its fixed input dimensionality, it is not suitable for sequence objects that vary greatly in length. In order to better support variable length sequences, CRNN sends the feature vector output from the last layer of backbone to the RNN layer and converts it into sequence features.\n", "\n", "
\n", "\n", "2. neck:\n", "\n", "The recursive layer builds a recursive network on the basis of the convolutional network, converts image features into sequence features, and predicts the label distribution of each frame.\n", "RNN has a strong ability to capture sequence context information. Image-based sequence recognition using contextual cues is more effective than processing each pixel individually. Taking scene text recognition as an example, wide characters may require several consecutive frames to be fully described. In addition, some ambiguous characters are easier to distinguish when observing their context. Second, RNN can back-propagate the error differential back to the convolutional layer, so that the network can be trained uniformly. Third, RNN can operate on sequences of any length, which solves the problem of text images becoming longer. CRNN uses double-layer LSTM as the recursive layer to solve the problem of gradient disappearance and gradient explosion in the training process of long sequences.\n", "\n", "
\n", "\n", "\n", "3. head:\n", "\n", "The transcription layer converts the prediction of each frame into the final label sequence through the fully connected network and the softmax activation function. Finally, CTC Loss is used to complete the joint training of CNN and RNN without sequence alignment. CTC has a special merging sequence mechanism. After LSTM outputs the sequence, it needs to be classified in time sequence to obtain the prediction result. There may be multiple time steps corresponding to the same category, so the same results need to be combined. In order to avoid merging the repeated characters that exist, CTC introduced a `blank` character to be inserted between the repeated characters.\n", "\n", "
\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 2.3 Code Implementation\n", "\n", "The entire network structure is very concise, and the code implementation is relatively simple. Modules can be built in sequence following the forecasting process. This section needs to be completed: data input, backbone construction, neck construction, head construction.\n", "\n", "**[Data Input]**\n", "\n", "The data needs to be scaled to a uniform size (3,32,320) before being sent to the network, and normalization is completed. The data enhancement part required during training is omitted here, and a single image is used as an example to show the necessary steps of preprocessing [source code location](https://github.com/PaddlePaddle/PaddleOCR/blob/release/2.3/ppocr/data/imaug/rec_img_aug.py#L126):\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "import cv2\n", "import math\n", "import numpy as np\n", "\n", "def resize_norm_img(img):\n", " \"\"\"\n", " Data scaling and normalization\n", " :param img: input picture\n", " \"\"\"\n", "\n", " # Default input size\n", " imgC = 3\n", " imgH = 32\n", " imgW = 320\n", "\n", " # The real height and width of the picture\n", " h, w = img.shape[:2]\n", " # Picture real aspect ratio\n", " ratio = w / float(h)\n", "\n", " # Scaling\n", " if math.ceil(imgH * ratio) > imgW:\n", " # If greater than the default width, the width is imgW\n", " resized_w = imgW\n", " else:\n", " # If it is smaller than the default width, the actual width of the picture shall prevail\n", " resized_w = int(math.ceil(imgH * ratio))\n", " # Zoom\n", " resized_image = cv2.resize(img, (resized_w, imgH))\n", " resized_image = resized_image.astype('float32')\n", " # Normalized\n", " resized_image = resized_image.transpose((2, 0, 1)) / 255\n", " resized_image -= 0.5\n", " resized_image /= 0.5\n", " # For the position with insufficient width, add 0\n", " padding_im = np.zeros((imgC, imgH, imgW), dtype=np.float32)\n", " padding_im[:, :, 0:resized_w] = resized_image\n", " # Transpose the image after padding for visualization\n", " draw_img = padding_im.transpose((1,2,0))\n", " return padding_im, draw_img\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).\n" ] }, { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAXQAAADbCAYAAAB9XmrcAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJztvWmMJdl1JvadePGWfLlUVlZW1tq19UKySZFNuqdFLRBkamhTHMM9HghjcowRfxDogS3CkiHDwxkBhgQY8MiQJXsAQYP2iB7OQBDFoehhQ6BHpqjWaCRITTa3Xtnd1dVLVXXtWbm8fPkyXry4/nFvvPNFVUQuXVmZlU/nA7LqZmTEjbtEnLjnu2cR5xwMBoPBsPcR7XYDDAaDwbA9MIFuMBgMIwIT6AaDwTAiMIFuMBgMIwIT6AaDwTAiMIFuMBgMIwIT6AaDwTAi2BaBLiKfEJFXROSsiHx+O+o0GAwGw9Ygd+pYJCI1AK8C+DiACwC+DeDTzrmX7rx5BoPBYNgs4m2o4zEAZ51z5wBARL4E4HEAlQJ930TdHZ5plfyFPy5SfrgUUlrkX4TqEOHz6Rw631W0JQrnS6THBmlWcX8FfzeLH9GKPlO7nMtKj5chElW6Cv2ke2aFxpTfnk/hO3K/3fBcPZnvGRXqc1TOSo8zpHJO3x1kg3Hzbam8muqpOn/9h3Rz9y+fjKpLy56jwrk8zoWaK57zqi5UNqDi/GH7Nj55M+NSaGJVnfwe5+8ovQuFd6jYgvK28PtC9xFHY1fV9OIk3NZWfkf5eFYy/1fmu1jsrG36DdgOgX4MwHn6/QKAH731JBF5AsATAHBofxNP/vJHwh/oRRcadEeTUTIXmatx3VQmgQY9p0bnx7WGlmMdAq4ny2hwI62z2WwCABoNrWPh5pKeWysXqIPBYFhO+1TOtHMx3cfF2t5ev19af+Ty5lH7Ym1XvV7X+ug+a2tr2q6UHjSqZ5CVC+l66D+gAiUZpMNjjbqe24y1vmyg90ySZFhOU72WUfgwROXMIAu0/JwqAcF1SOFjqXUMCvKx/Pyqa7Ps9oe0qg+bqWMz/ednKnN+HBsxHaMx5+eZ6yv0Py0XurWaPossxQof6ZJ3dEADym3lvsU1fUar+N9MLy3UU1hIUBsbDf+M8vPPz7yj+9ciff+5nzwX/HxLRu80ywtavcRNEqt5452+w02SHSz/1ui9yOXb537jz7AV7NimqHPuSefco865R/dN1De+wGAwGAxbwnas0C8CuI9+Px6ObQ4FvWUz35fN699CX/A6rcR5MdVZ0S/3zeXOsDy/sDgsr/X065qvLvjrPzczPSy3Wkoltalci3llLaXlwgqRfmm1xvQ4ra7ykiNNxJGG0lnVlcX8/PywfGN+YVg+f/6CXkur0l6vNyyz1jM2Pj4s52PQHpsYHps9sG9YPn360LBcozZKTcclAq0oaWKywupPy7WIV7R6bSPML6/gBjz/TZqLSOcuHeg9azV9Ror9p/kiFZmnK29i1SqbV7C84uTnKBvQKp5ehWadtKKBPospaW45FZb2qT+krdVAK86+9o3BmqvQu9inMYqpLYOU5o4anPcpprEYsFZY0Ep1zGN6/njl3hjTuYsbtBikd6fPGnCgQPnY1KQ+l91ud1guaJ90z7VU29sg7b7Z0v6zVrBKYxFFtNIPWpJEOrbjdX7+dA4HTJEFTXyrbON2rNC/DeBBETktIg0AnwLw1DbUazAYDIYt4I5X6M65VEQ+B+CPAdQAfME59+Idt8xgMBgMW8J2UC5wzn0dwNe3dE2JMlG5EV2yQVXj3elNWMSwKhyRmrecrA7L//HZ7w/Lr72htARp68j38GJq6+FpVcP+wd//e8Myq5yrKyvD8sTUpN5/Ve8vFVQAEqIc6kR/TEwBALqrqh6+cvbNYfn1N5VOuXZNKZf5JaWW1hLaLE2UonEZqb91VRcjLGu7goJXJ9V+fFz7cN/Zt4blY0f2D8snT54elg8cUFqG6YTuirYxog2lRoM2Wvt0Tl4Ht47m+a033xmWf/iatqtHG4EDunp8QmmuWqRj9MiPvH9YnppU+qlWC5RHTze2WmPtYfkczUXS1/vcXNA+MM0Q11VtT3tKEcxM67Nz4ujhYbkRzl/p6hw2mTZZ0zEcb2u7mX9kKiImmi8lyuHmTX2O37pwWe/f1PPbY77+5SWlLacmdSyOHdF283O+r6XtunbpujZxoM9cOzzzQFGGEOOBOGyKRrG+lys8zwOiooijWCO6yhH9NejTe0HUUb2h9Sf0HrXaKjDGQhuErVzWiIoiOpfYR7Ta/lmItki6mKeowWAwjAhMoBsMBsOIYFsoly3DSYFG0eMVfEnBEqaEqqlwbGEUrR+03B/o+WtUXiXGIyXb09yet05/H4gOI6t+NVKz4gpbYra4KTgF0U752KSq2RlZv1wJNMoLL782PPbCy68My4srSuf0iUJZ6qialxWcL8hunyw+aqR+shVF0gv1O1VDx5a1fQvzl4bli5dU5b50Te32T59W+uXo4SPD8sSYqtbiyD53oH2qk8VFHNrryMlrNdF+3pjXe56/qFTBIk10WqDltDwxqVTI0aNHh+VGS1X3Vm65QDp8QnPVIzrrP/zlt4blzgrNBT3+dZr/Bln2HJ5V6qpPdZ457Q3NGg0d5xr5BLCNOVtFRdA6IiF780zLffKb+OGrSle99EN97ro6RRgfb4W69Z5nTh4bltsTahU1VtM5fOGC1v36q68Py/PzN7VPRKMwRdIlqgth7CKiMJvkP1Er2KxruU5zV2P/GLBVDp1DlB6xMqhTG3OrrFpN6xuje0bkN7D/gL7nx07652xQyUOXw1boBoPBMCIwgW4wGAwjgt2hXO4AOb1SRa04omf4FHZmYKcNVmiywvlajstiX5B2ys4+CXEuZKhQsGAZsPswO7+QKujqRHnUddf8yk3d/X/5lbMAgO+9oFai5y+p08i4shZojakVQkQqdI3oooToioQcKwaJ1pmShUaulTbJ2YPdntdIC75xQy0o+skbwzJbH3E4hfYxtYSoE/3TJ4ufmNYjLqcLqD/8AHDfVla1YV012kBK88/OP1FNaRF2shF+fcL9M3aNp3v21lQnX+1pmVgDkCEKIqIF2ZP8yoJaX83d1PLp++/3hZjd4ak/YPqFrDn4JSEKwQk7x1BoB5ojbu+CsiLorfrnpU3+cDwuDXqea2RCdnFFLYFevHZtWH7nkt6oPaGUW0QUSa9H1Fled40o1Kv03NB4MhNaiPdUHvmjEJ+Iz+EwF/WCQ9kgHNNz2bKGy2fOHBiWxw54Woqp4s3AVugGg8EwIjCBbjAYDCOCe4py2VQo1ZJzK+kX5lOiuLQsNY7BwOdQPRxtL+ygC0eGJA4n5WhwHEuDdC6Ot8J6cUyqKFMxlyn2ynef16jEL778MgBgcZmsTNSAAHWKgcH8z9iEHmcqKCariYQsRAoxVlgDDMedK7/uyAF1FOmvqaq8Ss4Zly5f1Xa1tfExRZs8RPFh6g0ao57yJUPLDR5z0ok5Bg
lbEEXkIdYkykvIyoXjd9RE6x+Q5VDucJbRsYiooojipHCMlSZRaxlxVDE9IzGp80tdHcc3LqgV0dwRHz7pxFG1FCrGFCGLmwFTbhzWmKMgMv1CJhzUrgL9EJWU+R1y5ZRPj96/LlV4g2ieeY6DQ68oO0glNYrgGWixjBtAIWBSppP42abIn3y8xhFOWRYQLeaIdmTqrh7uxXVkFEuH2DesUp+zcL6rkG1VsBW6wWAwjAjuqRV6VcR4DsKvq3G2nwb9Xcu8QC+s6GnFn9HXesBlurbPG63Du0vp39do02xQsCvnjRKKx0yrNcT6lV9Z1k3El147Nyx/+7kXhuXLl/0Xfb/upUBoJXBtXlcC9YaW9x/Q3dIzZ+4flotx1bUthbjqNI5Li961++JFDa554/KNYbm3pn3IaDOV84HcXFS37jff1k0xKcSsVlv140e0s31a0Q7C88Ab2xnHo6dd7B7tRHZWefeL47rr4ZhshXtr5RudcP4CXrVLg1bopH0t06ZskmkdvUTvMzamWsE4bUT2Ul2hX7immtv06z4lwez+2eGxJoVsqJG9d5rqszCgVWkh8qDoALC2styh1SVtehd061CNI7+OlEOJ8iqfNJfWlEYtrVN4DCyqDwFPV7+nY9HhjeagJLJhA49nOiiPwV+niIgSU9tJi+EVMNu5X1zUZ52mHe1QHqPXvMaCiQTNSkqhLxKvxWZmh24wGAx/M2EC3WAwGEYE9xTlsqmNzqEd+sZ1uEK6KD2H6RK2j03J95pMteHYnz+cE0Xl6iRvxAhtsg5ItWZ3XqnxOeTWf12pixd/qG7Q1+ZJRQ6XcmS8VYreyHbgswfVJfx9731oWH7Pe7RcI8olpqiODWojuz7n0fmuHlc1/+pltR9+hTZwOe0eu7inNBecYEQuaHTEMYpeN0HlmFziJVA6hdSBUj4XvBE9oA1dtivu6b4tyGu8sOnpCnbo9dvqc0Qz9chou0mbeT2iMAb8LFLdfaYXKcJfKjqm74QwEFdvaITDCYr2WKeQAPxu9VKlCpg3KdjT0/2bLa1nrK1t6ZHfQr5byu/FoLDhqteNtbW+QV/raBPnRYEvi2l3aVNyokkJMSZ8/Y5TTVJ5uaOb6Rw+gelPHiNOmcjhOZrjHGWUU9nRpnPgfVpE4bToQatR2IzxSTUKKIuMshnYCt1gMBhGBCbQDQaDYUSwe5RLhUXL8M8F+qW2wd85wUV5vcJ5N6mckdrKFi+cAL0QETHUT9o0BhwagGx/QQkL+mSfzeEBoqaqbQmp5RcuXtHyeY0OWGM73BDtL00KNgZDPHhaI/N9gBIz3H/mhJ7E6h/ZG2cUPq5PbtUgm+jxoBafOqqUy5EZtRknL3Scf0ttpq+Ry3qNVOGMKLKFRVWLL76jtuqTlNP06GG1eMkj4gnZNnHShVqD3c35ONmykzpdI1osqrENO6vlZAkS1kZMwxTc5zkhC1lHJAOlPNKC5ZS2ZYHy3jItwhTR+cuecnn1rIZVmJ5QOmNuVi1IJNaxyECRLOlBL9BJNKbc/4yic6acpyG8PMT4obuq1NJKV/sTNZUifOSBB4blg2TlstLVa6Ma5xSl+aI5nWj7Z1Do2XrjDY3kePZ1HaN3LilFmBUS5VCCC6pn//TMsHz0pIanOPOgRpN0EeUJDdxdLdNjY0QVRpn2bXJMx/zonL9PnazNNgNboRsMBsOIwAS6wWAwjAjuKSuXKmzG+qUcZOVAXIWQGhNxpgqmS4oeBNqWcLorOCfxuaRykyUMR+lb61OOzJidRlRJvXZDc4AuUtKIfdOqWsbBiWhxQWkDMnjBA6dODcsPndZyg3bckzW9dozonzpZGQxYnyZrnTyqXNykPKcU3P/973vPsNzpaN+uUsKClBxbYrLgWCNaanFFaYkb5GRy4sR92q5AnWWcyIT90SO2bNHDHIWR81uOkUs+J/VgaxWmyGphrpnaaxLNw0kd2OInobqjBs0tRSFkK4sa0zxEF64seguh1986Pzx28rjSAFNT6kzWJAsmppky4nNcxdjxGPVp7Nj9pR6erzpFOxwj65g6hVLgCKODRJ/FuTF6R4k64iQcqzT+TBHFgTrkHLnvowQbS1eUwrzeZ6swSmRBMmKC2v7gcQ2t8MEPfXBYHp+m9taIugymZkLWT216/7KC8x3l6x3m161I+lOBTa/QReQLInJVRF6gYzMi8g0ReS38v3+9OgwGg8Fw97AVyuVfAfjELcc+D+CbzrkHAXwz/G4wGAyGXcCmKRfn3J+LyKlbDj8O4KdD+YsA/gzAP96oLpFi3BBFxfdlA4uYYuV0WcGygJJQEOXBUQ2ZWmHHojrdvxmiAGZUB1stLHdUhTpAjgJ12oV3tFOfkkp16bKqgpeuqJUL+W9gktS/pWVPXfA++IeI5niYrAbGOZYMUSgtihMSUWAJToLRKET7p2JQC6OC35U+UpOkKt93QlXeRXIgunBJ+8ltYSrmyhVN6nF47uCwfJWcr44dnfP3p8ZwTk+huCYRRV4cOFVzmy2KTkkOShyzg+PQtCeUxnDBsYafa3by4v40OfPDTaWQOGxHxFkY6Nq1AdN1eq9W27eRc+Q+96I6dk3tU6uR+08qVZV1dC5SzrXLOXVp0scpv22dPK6aLUogsuJphsYEOaFR4JdCjCXqT5OeP459kmZ6LTsLNjiCKdcq/vwxygzSWVGHqzblqB2nwCrEBBWSZ6QUP8f19P2eIuoqpiQwklFymDB3UcZxfyhmUCFqK8mrnKrJ7hLlUoFDzrncHu0ygEN3WJ/BYDAY3iW2zcrF+Z3Lys+JiDwhIs+KyLMLlHXeYDAYDNuDO7VyuSIiR5xzl0TkCICrVSc6554E8CQAvPfE1Nb0iG0AW8cUElZQmdVljqXA5+QKEgf3zyoYIbYaSAtOG4o+WWWw9QvnEmxyGFaikVrBomRqTI8d3K/70lOU4KJODeavuJAqGJHVRI2dSeieEauI+THuUIE30DvNzqhDxoED6hD0zhV17FijxBeNQswSrefGgqrOa+z8FOgVTqRQyBfLFk9STsXx/PM4VzmuFY6Hsqvw/KlxKN9+Qpfxc0mxT3j+idqYmdD5rZP1yfJNP47drlIo80tavnxV6alDh1SRbo6po1YGpRbYgqVOVNigML7lyTZzQxRX4hB4G6jPQpRLzHFl6Zljy7FCvBk6J3f4GlDdnOul2aJ5iSn2TiGqLluocYhjpejWelqeIIos4orC+5URtRTTM8LvHMfSdcPjO0u5PAXgM6H8GQBfu8P6DAaDwfAusRWzxd8H8FcA3iMiF0TkswD+GYCPi8hrAP52+N1gMBgMu4CtWLl8uuJPP7P12wpKvyUVOUXLG7S1XHusflfRLMUynb/BZy8j2oLPHZCqNsjYCULB1g+9Xq/0OGcPEhqjPO/kfnIaObhfqY1xoi1cpup0RNYMINW+QLmwNu2YiqG8myVOXoU0rvTbvn3UxjmN/VJ/TfvWo3i/LQoxK+R8skA0QmdFxyt3IolqrIZzqFtuGMUAobpjUpszCubDar5U0HUiefheHauCvxk77XBcD
8r1usY0Hs1/m+b3+PHj2l7q37lVr/4vLWqcnMVlreP8RY2lc/SYOsccOaT0V8T0H1lCNSnbEtMcGT2LBSIubxYxLpmUU458XUq/ZRRXyBGn5wphfRVlc81xmhxZpAwoA9WArFwoGRUyUfqPAgyjQxYsnD0qypSWapCDYJbLAM5RyhnYQP3k9ob+y1ZkIsz132AwGEYGJtANBoNhRLArsVycc4X4LDt+/woLFlbXSRMvJF6W3KSFVciKvnA4UrZaiKjyPmV6WSWnBU5kG0dKnXBcj8EgOFC0VN3jjD41UlXX1tiygmgB3pHnfhQSb5dbYuS0VyRFcmH4Z+pznWLWTJHDEbeXY2/wPQtUCI3pcldjf/SC5ciYEM0Ulc9LIQE1zQWXbzF50MNs/SO3W7Rw+F4ew5ieLQrlgVaL3MJSirFDljCtWNt1+ODcsFyniq6+8zYA4CaxaRy+9vINdc46T0m9p/cptRVT/J4+xYdOiP5hLq6QkJvulVu8sAVRVcJ2tjhjHxqeOzasGuB2K6vb6gnPN89+j2iuNaJN1ugkpryY0e2TvFijbEQJJRXPChQwZ88ehD4w5cQWZHycaNmC9cvmYSt0g8FgGBGYQDcYDIYRwZ4In7tVi5YyZGSpURWOt9LKhZ0fohLLDlKnOAJvRmo7Wz/EjXIKgR1rOEsSZwlKOTxs3iemh+hcVk+Z8qnFZE3AY8GWGGSjwcqfFBxr8pPLrUlqFFZ3jXb+YxrDcaJfFlco9gk5YmSFwdDO5kmqAaAfnLLGyLKHUT3neo4U5pyvpaxWbNFUQr8M6NyYqZ2IqRgKu0sjVmMLjgGfo9dOjWscmHGiq2ZnfEaiyw2aH6p7mWK2vHn+7WH5wIxa0LD1S4M8cdYoZG6tziGmq5y1iv8DtzyLhSeKkqSz9QezWVn5s8hwRZsXf664W47c3pZ+4TjVQG1PqZw4toohxza6NqLfslArO03VCqGR3G3n+vbmtJFZuRgMBsPfSJhANxgMhhHBrlEu5ZYhd06tVIEpByflmU6KyXD1aC263cql0HpS7QrxYHg3nx2CiJdxpMJlKVuiUP2F5DGc7Nq33RHNIgXHGqqk6jihYD/AFEVV/JIwFqzacqLjel2tJrI1dQJip6UWxSnheCd9jmtD5YjuxY5YRVrmdhQSIFfMS2WZVGtXCIPKdQbVmi2IaOLIrwWOLJiylGkxPYfpFxBdF1F5kuiX40ePAgAuvK3JkFc7GmuEnbYuX9WQS2/Q+e1JtXiZPaDxXgZEucQU10XIyqbMcqwQS4fGn8sFqoaolVrG70g5dcJgijA/h99szlLGWZJqFSKHLWvY4Imfy1XKNrS2T6lDkEVN7lAlVCG3K6qiPF0em6i8fVWwFbrBYDCMCEygGwwGw4hg96xctsFyZStgOmNAYT2zrPybVoiMyupkrhYVrCDYgoIrKQ+lGtfL46EUaAMyJuBMMqDkxbVAaQg5nhTijpBKzM5RacE6g/pBTS86StBfOAtQPjBshcD0E3EIWYWjDluNsJNVo6kqrCO9uEdqLse7yeeAKS+2moBjKoTGk+ORMC1A7WUqpBiThywXSqxcCs5RNC9N6k+XMyPRkHOclmygz25vVa1V4kidjOZCJqczZ84Mj517/bVheYEol+6q9u3tCxeG5f37p4flqUkN08sJqxtEuRQcsQg5zVBFswxQTrnERLPEFfNSc+XvVIFGDM9ARscalLGqJfoOUVTdQvJwfr9jfi0pefmAnr+E49CQ81FursYxWQY05yyA2ZqJqditwFboBoPBMCLYpRW63LICXB8bfXWqauLrOBdkxgu0im0WvjbiTcFQrmq/0IZnIQA/22rTCoE3ZQcZ2/VSbkraRKuRe3ae4MBFvPpWOOrFoLhDNSzGhZ6WR8QrjAZ1ezDcuKHVFC+KKUxB1aYlbzL1KWFFa5w0ESlfofPqfnjPQnfK55ZXP4VclLy6LmyKsoZCq8usZBOLtBVKr1pYQTYKG4sU4oEXdhw9kN3WKYGFkNY51fZ1njqptuSXLp0bljsahLGw0XZjXsMnXLysSTCO3KcbzpMNzY2b1bTtTmjTuxDtsOSNrVhlF1aoNHacYCQqvEflER75bdT3jq/jTWZ6/2jMo4oKWQ+J6QGL6X3lNhYewmF7yzd2s2LD6fbvjsGwFbrBYDCMCEygGwwGw4hgVyiXKIrRnvSB9ZeX1VZ2bEztahvs7s4RBkMUukaDqA3OLZmoqhjVtHtpn2gGsn3OKKr9eF034lqRqpZJV+vP25hGqm46UuEWVrStE+PqVu3IJnt+STf/0oFeOzauyQaWV94clgdjtFlElE4SNss6ND7dNVXP0wFtOIGSZLA2R4kMeEORwXRJn+mH4MNcoIFo86lLduI1GvN0maLdEZ3kaON2jaIN8mbpGIUKyEjpTgb+nP5A78mJMWoNUo/ZKLx2u7s1AMQcKZE20WVAzyVFJMw3MeucVIQGOiGqJmrqGGVOKSQex5SoKKFQAWP0xq5156ntvv6jh9SW/NABHavuotIpFNQTy0tafvuC1jdzSMuzqb6X8x0doxUKQ7iS6Fg0o9BI2mWMqP/7W2SznVCIg0K0Sy6SfwY/fsL0y+2bjj3aZC1c15wclnv0jmRMXSba3iY1a9CjxB/0XDT7Wo6IinF5rluic4R+4fARmWMfkxJ/l03AVugGg8EwIjCBbjAYDCOCXaFcMpdhZdWrxpeuXhsej+j7wrbafXIbR6BXpvfpzvu+KVWh6g21LE1p57nGqjXvlJMtKduVUkrDQtS83FWXkzdwfQXagkMvsv0JnROT+h3XymmRHkVhbNA5UaArVlbUUqGzTPr0LFkWEP006Gl9TW5jVdIOzsHJ+R0DzTAgFZIpL97Br5NlT5X98oDyeLKNOdNlrTqp9kTj5OPOlifCCS4KLtZZ6XFGRLYNBQsGTuDBiRfyw4MCJ0C3KbfO4Ah/PCq80mL6h0MLxBydMISzYLf2U6ePDcsrFG3xwts39Z6UX2O5o3P3xluagzRqq306Yn2/YvIV4PfOBYpsQPRXn57hpKcUSoP4jIyoMEd+ANWrThpTNmIJ8zUouNJzxEa954BpDrZOKVi80PvNMoVDNbCFTsZ+LoPwP9XBFiyFhMVsfne7pc5msOkVuojcJyJPi8hLIvKiiPxiOD4jIt8QkdfC//s3qstgMBgM24+tUC4pgF92zj0M4KMAfkFEHgbweQDfdM49COCb4XeDwWAw7DA2Tbk45y4BuBTKyyLyMoBjAB4H8NPhtC8C+DMA/3i9ugaDDJ1AE1x4R1W7t85p4P2Y3Nn7CUWbC/+fOK5uzw+/9z3D8v79qiAknJczLlf565HuWtdJu2nSyPRp17oR5eocO0qoCtmMKI9mjSgMUpU5euN4S9sy0dabkhEPaqTbc7ty54N+TxNDrHbUbIEjvI21lJ7oJUrLsEs6O1llFS7WtYJqG1y8B7xTz8kj2GpD+8/5TQc0txzhjum3OqmlbHEy1VbarRUcXiIOTcDJI5iKSfm43jMuSV6yHorOUqHMneCxqnBsK5xecFQqv2dB5S/k
epXbbnTi+PFheZkciG7eUPolIyunZaLrzp8/PyzvO3hwWG5Pz+g9abxqZN2UBcqFnfkSnn/6wwQ9l4UojOxMxGNXoJn4nJKxq3BIqgKfwhRO1ap3q1E7dwLvalNURE4B+DCAZwAcCsIeAC4DOFRxmcFgMBjuIrYs0EVkAsAfAvgl59wS/835z1HpJ0lEnhCRZ0Xk2aWVpOwUg8FgMNwBtmTlIiJ1eGH+e865r4bDV0TkiHPukogcAXC17Frn3JMAngSA+49NuTSoSMsdpQsWVBMsRMRLOWRHODy5T3fQBzV1fGhNqHNOSpHpslW9T0x0SYOoGC5zIoWI80gGNa5W2Pkm5wi2QqBySs4REcXAiMnhZZy8Rph+WaXoeM3odl0w40h6S4t6TzremFKHkz5RPpzrtJhTk9VVTsLBySxuj3BYo0okVnX6Js3ztavXh+XlJZp0QjMZwzfLAAAgAElEQVRWFT4tRKfTdu2bVOumVjDXKMTmoPpipoKY8mDjo4JVSgUVVZEoYziMFUukrCKUn6swuGEDnc1kOcjP76/pOLfaaoVy+JDSJlcOqtPQyvLFYZkeF9TI++jaNbVEa6yqtcoqvVOFaIqhLXWix1LHMY7IyYscsbKUrNmqEqyAQLxIVohamt+nPMJhwYClItxoMYLqxg53G1Eu1Uk6NqJqtkbZbMXKRQD8LoCXnXO/SX96CsBnQvkzAL62pRYYDAaDYVuwlRX6TwD4hwCeF5Hvh2P/FMA/A/BlEfksgLcA/P3tbaLBYDAYNoOtWLn8Baqt3H9mKzcViRDXvSNCo6V0yfikUhED2gmvkWNJEmJGrCSqwq0m2o21TFW45VWl+JM1sj5pkfpNDi+cVICtBdKULVduj80QUdzNYvIEVU+RkTpJ59dq2t7xttIMB2Y0Dsz8eVWRU1KFJdfzycpk8ebN0vL0mDp+CMes4FguBYWNVFiyFuqTtUKu5tbIJKdByRDiMbVCeeuS0iyXrqgK31kuj71S5fzlUj1nuq1j1AgOUmy1IaTyc9/YsgQFioZpAY7xUk65cJ7afN457C7TBhxW906Su3D9BauYnAIkCq27onGS9u/bNyyfPKEORxcuKkNar1GcEoo9dOXKlWE5WtQ6O+SsFBWogeDkRXk8Hc1Fn2kmsoQa0PsiJSGrgVvCyrIjXCEOrb8/02ZsWcTEBOd9ZQc5ro8TxTDYcikthAdmq68gLwrUCnWBu1N6l63BXP8NBoNhRGAC3WAwGEYEuxLLRaII9eBQML1frVJ++Io6FlEU2IIyl2vU86SqX1vU8r5VVXdWE/1etSjuREzWF6yLObbQADsfUejd3BKG1G1WTyPO+jPg/JPsHUGmFRR6d3xcKZejhw8PyxeuaroZDo+bO0LVSCVcIcuDK9c0ZOrMlKrcLQ4fTKGHb0mIqm0nqxyO6zJMKcrxYMiaoUehRq/PK/0zP6/9IQYHMdXdWyXnI+rzgWntB4dbzi1t+jQvNC2I6VGPXDn9wk47rEKDrJUKeTLZWiZUFG1ChWaHoEJmLM6Aw9mrSjIj3dqWvCNsqbNG2Z2mKOzwkUPqlHdoVq1fVntKrbDj3M15fY4a42t0DsetKcnSE1H4ajq1SyFz1xKyICs4lhWS9uo5BeszPYVz2Q4zaXGu26jcWsQVrLm2ZlGyJSuXghQrv6dRLgaDwWAYwgS6wWAwjAh2h3KBqsj7aPed44dm5ORQJ905t9DoZ3qsl3KcVorvMcmxUVTNG9B9uj3dtV/u6k3TQmYeigMTduUHoqpnq6WWHYyqxMisZrGlRLOlzj9Hjmiy3+m31BKhd41Cn4ax4DCy3VXt59sXNE7O1KSO89E5VbPrlAA4JYqo0HRyuKrXtZwFHZn70O2pCn327TeH5VfOaXmenImkwdl4ia6hLEzjNL4nT58elhvN2x/fghrMFBLRQpzEOGNrHqqHKZeIuIAC5VII8Zo7kGgdUmE1xHQeO7YUnFwIBbXcVZjoBP6Hab46zZsjmm98TJ+X+8+cHJZ57s5fpWxINL+c1JszLDHysctIvPA4J2TBxvFeagXLFqqQrVXYQoiHt8QpqJgXvfxdrELRsah8jhxZQmXFBpTUU+5YVDHl7xq2QjcYDIYRgQl0g8FgGBHsCuXioCpQneiCVlOtFpI+JXumuB5JSMy7Sll35pd0N3+xq8dZVY4bWkePMiBdW9CwovMUTKbTJdW1zqF3vVpKEWtRI0qIQ4qCY1aQY0VB+SP9q1HXsZia1jDA4+NKi8h1jdWS+xM5msbFjlJIg74680ztU2eeWl3H+djRo9ou4rkSyrAUk7VCneiPKKj8PXIw6fR0PJ978eVh+dxFtZRgWmKMnKnY+WQQ6Ty2pzVmy/0PnhmW2REpCzr6oCr2BRth0BwNWD0uhPXIysvscFSIz5GXObsV375cba/x+azalze9aIkxYHop0AxEibTGdZ4dUyWiz9nJY0rtvfOOPiNvnFear9kg6oQoH3YmYopE6Q0eW46NQ+8FZ6/ihM0F559yVMVHyemd8vkpjn/lilbKKRrZMPZKRVu5fcwy8jnbEGrXVugGg8EwIjCBbjAYDCOCXaFc4BwkqK4H9qv1xYFZpRk6XXVyWFxWNb6WJ6MlJ6BXzr0+LPdInzlxXOmEQV9pgcUltRQ5/46qlhxjxBHN48hpYhD08kZbjx05qplhauS0VKe4Jmy1w0ly1yh5riOao9HUOCUPv/+Dw/L1eUr2e8FTKhQlFdMzasFyjWJw/PV3nhuWL9/QGDcnrqiTzzEar4OUpYZxs6P01o0Fbwlx7ty54bGz517Tcyl8MfkbFcL0Doj+SCiTUp2ShB8hWqBJzlcJJaROQ4hhR6pyRHX0iMKTWBvQpgTjKVEYzTGi6LqUeJucuJptmt++5+BYbebMWAWapcKZhc9pt/T+HBKXE6ZP79O2Lyz5Z6FOmb4GfX3O+Pkbo7g++/drHSdOEP1C78KVBX1eVgfkWLRGGbHIyasdxo6f7VlqK4fdXe7qu90kq506Z/iqYCIytj6i4zmlw2GdWdA1m0qRcswejhnUpkxKHD8nJeurCaK0ChZK9IBHga6UijjJBQqrJGOWu1tJog0Gg8Fwb8MEusFgMIwIdodyEQy3xUkrwtysxnU5e+6dYZmigOLAEa9SNlpq+bFIVi7Pv/CDYfm1s68My2OUjDkhtWmFrGVS9nJgiwtyMuqFuCI1UvJmKB5Ns6HqbEoqXD/htEuU4YfUrJSsYthZZ3ZGE/O+//3vH5br9VcBAIvzavmSrOl9WhNK23DGnHNv69i+/oYmAx6neB8NsgrifhQca4KzCqvQix3KrqRaNvbvJ3qC+j+/oHPHGufJU0pjnTyp4V6JUUBCPMLQcYbVWUpH1CfHmtVE29uleCesNtc5Tgj57/T7+rwkVJZwUq1gKUPJzelB7xKF019jyoUstMiypEHlOGZrEYqDEso8V3GdKR+9f0IUTkZc2IF9+rwcO6zxXpa6en6TnotVCsTjaC6yLFjUpHRsoOUaUV5TZMHV6ygVylQE0xVS9DjSkrDFjYRz9Tqmf5hCiSu
ck9jJrsFOaQNOqq2C6eD0rNZfiEPj/2NrqqjCmqXMsWyrfke2QjcYDIYRgQl0g8FgGBHsUiwXhzjoIrzLfOrkfcPy629pKN2lRaUUlpf8rniDqAXOv8uJYZc7ahGQpGS1UrBE0CGQGsWH6dG1lD13qu3PObh/enhs7qCqW22yjnCkkvPuuJAFTZ2cdgYUY9RRlqSZaaVcHjqjsUzS4N30AxqfRXKOGp9kCkV37QeU1afW1LZ0qJ/pitIC7FjDDjJ5KOH6mMagObxP/77cVQuahKwzupS8iS0YHrhf5/9vfUQtew7N6Fh3lymWDanxWUjUzZYlFMoEMceMIQsKjvzLzmecdaZB1A3TBYOMsj3lzzFRNRz3JCNVPaY6aFoKdTcpC1Sb6Zc6q/McY3pwW5tqxP+w01JCNCNHct5PoYlPnVKa69K8OqhdodDHa0QXNcYoVHUYRw4vnK6pNQvPYaej94yZzqOB5CxghTJnBCfk9JKj+po00C2KQdMmS6VCnBpa6jYpkXujQNFQG/n4FsLGFOPElBzfIudiK3SDwWAYEeySHbpuYvEmB9vEfviDt2/+AcArr/nVwuUrusqYUvN1TExqHQUbb95AK3wKOcSjfqFTsuHljZP7jvvEE488/ODwWLupdTTp3IQ3gtjelJYuGX3aebVacDfv66ZUo64nnQhtcZTz89ybbw3LV6+ru/0q5SJtjZHhOmHAyzWalwaFPuBNx7WwuZbQxmK9SSt4Gmbe2OYgiWceUtv3H/mAzvlh8k9wqS7pY6dj2oxpAzTYkPMmc8wr8ez21eyt5YjOH9CKusbdp43IPj0j+cqtkIylEL1Sr+ONywFvvvZu3+QEgBaFahgMuB56vsL9eV+f79OKOWKontRPKMTChLb+8KxqhccOq09ClzZUB5QnN6aHd6wehbZqW4TGVjLaoOTnnJ8/6j+/C7zRybIjo3IenqEQPoGu41yrXQqVQQv0QkTIAY2RcGBV0pAKURjpGSxTIgrZVzkMQcSr9bx8l+zQRaQlIt8SkR+IyIsi8mvh+GkReUZEzorIH4hIY6O6DAaDwbD92ArlsgbgY865DwF4BMAnROSjAH4dwG855x4AcBPAZ7e/mQaDwWDYCJumXJzfScx33OrhxwH4GIB/EI5/EcCvAvid9WvLgOBCnJIdbJ02tI4dUdvuuPnwsNwaewMAcPZNtZ9eI7WGXfyXl8n2k4O60fm1WFW7yUndLDl2SFX+GaKCPvCg35R834Mnhsf6K7pRxPp5TBs4jYhUTqJZOJFGRIk6WnXKr7mkm1Ix1X/skN8sPHxQNw0PHVb+6QWKdnj+wuVheY3olwa7mBOFUEjUQO11xBHk5uesVo719cJjR7St983p2B4/rhu7DzzwgF5L7uOrC+p6HlPYwoPTek5G6r8b5Go+bawRFZSR7XlM9AvTQkyFsZofUx4H3sRPUq2zHjaXhezNOTIj24QztcDjTAwJaswQOaZuyPWeqKha4Fw4dypv8vGb3qBNwQHRNmurFGKDnrPjR9QmvUvPjnMaWmKF8seuhg31AW1+tyh6ZsQhLmhu0woqjOkaUFKRLOKNU6ZR/XGh8JnjFO4goz6zWftUm3xCaMOX/SMmxnTspqfUht5VbtDmm5sVUTULG6G3X3dX7dBFpCYi3wdwFcA3ALwOYMG54ShfAHCs6nqDwWAw3D1sSaA75wbOuUcAHAfwGID3bvZaEXlCRJ4VkWcXO/2NLzAYDAbDlvCurFyccwsi8jSAHwMwLSJxWKUfB3Cx4ponATwJAA8dn3TZ0Baa1E9K9rlGduDTk6pmP/a3PgQAOHlKKY9L15SSWFpR1a5g5UK2xGxjHkdqnzo3p6rloUOHhuX9k6paTQUTjYxykbo1ypEpqpLFhV17Nk5l9ZsizGUcEkDrYWsJpo4i1wv3pAiTR9QmfmryPxmWz1N+0fPvEP1CY9RZJZd0slXmYP+sxrZbXi1tkgo/Na7j+cj71BLoINmSzxygnKaU4KRPbYnJFKbGLvTsbp+yGu/buFbw0ycLDqKw7jumc3uwT0kyyPjYkTrP9uEH59T6o0FR+2qBu+H8mxy9kX0fHnjg/mH5ENn7r7J7Or0XkxNKi3GkRMeWU6EbNU4SwckjCjlS6Tmjc9hXwNHzN0PUwtycPl8c54Cfo9yiJKP3bJLymE7QM9InC66okJCiPMFIreD6X0655BEKB/TcLi8pLbpKER45Lep4W/0p+nV6/mmcx4iiLFJnhWSy2qf83aSYFYXokRX9ebfYipXLQRGZDuUxAB8H8DKApwH8XDjtMwC+dsetMhgMBsOWsZUV+hEAXxSRGvyH4MvOuT8SkZcAfElE/hcA3wPwu3ehnQaDwWDYAFuxcnkOwIdLjp+D59O3hpJd4ZTUrwFbJTRVFZqa9hHh6qT6TB9Qy44B0RkRqYR9Sp7AEQMjcr3nnKbjZHHBKr8LOTN7K0qzjLEFCzuwkPpdc+W73C7iiHh0PohmIct+VyPqKIxRjyI51hrqNHRkVqPnTe9TtfkIUQ51yoeakFVEP+HoeeUqbz24pI+RW/VES8dzktT5MbLy4MhzyQqpv2QhMjFBzjT0LKx2Fuh8siIK9TuygqiRM9l+iiT5njOn9JyGWjD1iYro18hahuocp3406rdTGsVolDTndOpD71HLHnZgKTgcEV1Ui3ReJtqcKYScX8J92Tms0eAEK3ou5+uNKdzFgNqbFJyWdE5PHdckGLMH9b3j56iehy0gh5w6zflEk2lJ7U5aCDGB0jK75RRymhbOCE5mNJ8z5Kj2/offMywfnNN3YWJKaUGmHFs0zzWiwmYPqiXeMj2XHAZgaK1CpjKca5bpJPY4yimcrWYZNdd/g8FgGBGYQDcYDIYRwe7EcoFDbt3CVhOrXVUtx1qqcjcp9shasMToUdD9ep1Uftq2Luy8E20ysU8pnAZdy9YvKzfVEiSiXe6JoE6Ok4NBjaIkcvD8PnuKEOqi9+S4GgNW1Th5AVnCCKm/k8HiYWyMxpDol5VlVQMd6YEHJ1UVX6MkBeNEOdVrStGwas/qYs6QFPIlkqruiLbp97WNA7DzDecU1fN7q9qPOpkFjI/r3LGVS07/9DmYCdEPNYpIMUVRKMfaSrl06do10WdxQNZHfE+O8aHKsfaHHXhaNR3zLkW1bNLzN0aWIJFoG4Xi19SEnb+oHOg9zos51tb7r5BD0BpZMzWJZuA8rv0VjkNEllhNpiL1WSxYXwVaLqOHpVGgHHRse5RfN6axKNh+0PNX9LQpj4MyjNVElVy7ou9zc1z7fHBGaaPxSaUomXKZmNB3YXVZ86t2lihAEaqchUJ+U1470zvPr06fDWUqkmBsBFuhGwwGw4jABLrBYDCMCMS9y6X9Hd1U5BqAFQDXNzp3BDAL6+eo4W9KX62fu4+TzrmDG5/msSsCHQBE5Fnn3KO7cvMdhPVz9PA3pa/Wz70Ho1wMBoNhRGAC3WAwGEYEuynQn9zFe+8krJ+jh78pfbV+7jHsGoduMBgMhu2FUS4Gg8EwIjCBbjAYDCOCXRHoIvIJEXlFRM6KyO
d3ow13AyJyn4g8LSIviciLIvKL4fiMiHxDRF4L/+/fqK69gJCS8Hsi8kfh99Mi8kyY1z8QIX/7PQoRmRaRr4jID0XkZRH5sVGcTxH5H8Iz+4KI/L6ItEZhPkXkCyJyVUReoGOl8yce/zz09zkR+cjutfzdYccFeoin/tsAfhbAwwA+LSIPr3/VnkEK4Jedcw8D+CiAXwh9+zyAbzrnHgTwzfD7KOAX4ZOc5Ph1AL/lnHsAwE0An92VVm0v/k8A/945914AH4Lv70jNp4gcA/DfA3jUOfcB+JRan8JozOe/AvCJW45Vzd/PAngw/DyBDZPd33vYjRX6YwDOOufOOR9d6EsAHt+Fdmw7nHOXnHPfDeVl+Jf/GHz/vhhO+yKAv7s7Ldw+iMhxAH8HwL8MvwuAjwH4Sjhlz/dTRPYB+CmEpC3OucQ5t4ARnE/4QH1jIhIDaAO4hBGYT+fcnwOYv+Vw1fw9DuBfO4+/hk+veQR7CLsh0I8BOE+/XwjHRgoicgo+IcgzAA455/KknpcBHKq4bC/h/wDwP0ETIR4AsOA0w8QozOtpANcA/N+BWvqXIjKOEZtP59xFAL8B4G14Qb4I4DsYvfnMUTV/e1422aboXYCITAD4QwC/5Jxb4r85bye6p21FReS/AHDVOfed3W7LXUYM4CMAfsc592H4+EMFemVE5nM//Or0NICjAMZxO00xkhiF+WPshkC/COA++v14ODYSEJE6vDD/PefcV8PhK7nqFv6/ulvt2yb8BID/UkTehKfMPgbPNU8HlR0YjXm9AOCCc+6Z8PtX4AX8qM3n3wbwhnPumnOuD+Cr8HM8avOZo2r+9rxs2g2B/m0AD4Yd9Ab85stTu9CObUfgkX8XwMvOud+kPz0F4DOh/BkAX9vptm0nnHP/xDl33Dl3Cn7+/tQ5998AeBrAz4XTRqGflwGcF5E8CeXPAHgJIzaf8FTLR0WkHZ7hvJ8jNZ+Eqvl7CsDPB2uXjwJYJGpmb8A5t+M/AD4J4FUArwP4ld1ow13q10/Cq2/PAfh++PkkPL/8TQCvAfgTADO73dZt7PNPA/ijUD4D4FsAzgL4twCau92+bejfIwCeDXP67wDsH8X5BPBrAH4I4AUA/wZAcxTmE8Dvw+8L9OE1rs9WzR982qHfDnLpeXirn13vw1Z+zPXfYDAYRgS2KWowGAwjAhPoBoPBMCIwgW4wGAwjAhPoBoPBMCIwgW4wGAwjAhPoBoPBMCIwgW4wGAwjAhPoBoPBMCIwgW4wGAwjAhPoBoPBMCIwgW4wGAwjAhPoBoPBMCIwgW4wGAwjAhPoBoPBMCIwgW4wGAwjAhPoBoPBMCIwgW4wGAwjAhPoBoPBMCIwgW4wGAwjAhPoBoPBMCIwgW4wGAwjAhPoBoPBMCIwgW4wGAwjAhPoBoPBMCIwgW4wGAwjAhPoBoPBMCIwgW4wGAwjAhPoBoPBMCIwgW4wGAwjAhPoBoPBMCIwgW4wGAwjAhPoBoPBMCIwgW4wGAwjgjsS6CLyCRF5RUTOisjnt6tRBoPBYNg6xDn37i4UqQF4FcDHAVwA8G0An3bOvbR9zTMYDAbDZhHfwbWPATjrnDsHACLyJQCPA6gU6OM1cdN1YPgJccDweyKAIPyIPxSJLzvnr6nVgFoEJAmGJ7sMyDL/dxHACZA5rS8Sfw0ADAb+3HC7YTuiGt0j8tc4BwwyABm1LxyXUPetx5HX6bQP/L0UQQGhmcN+uNBeOP1bfrnL9L75PfL/+FqhyvM+iQBRBEgEZANt8/BUmodM/Hn5n6MwdvkYX1mDwWDYeVx3zh3c6KQ7EejHAJyn3y8A+NFbTxKRJwA8AQD7asA/OgpkEYAgiDP4/+PI8z+tGGgEIdLpAmcvAi8AmPeXoAXgwwBO7QNmp4CoAaQJ0EuBLPYnxC0gin19SQZ0loA33wB+CODlsp70/X+TAD4I4LHTQDv2wizLgDT1/zfC7+2GPz9JgcaEb1gvCXU1gE7H92ei5ducJECWAoj0vJzrarf9uUnP3y8G0GgBSIFeT09sRKGuMGa9HhDH/qfX9fW3G8BUDDTgy+0pf2038W1IUqA9ASwt+etyYd3rAI0YaLWBbgakcRjsyI9xmgLzS0DWAH6tdAANBsNdxlubOelOBPqm4Jx7EsCTAHC0KS5DEG5ZsQG5gMuyIPABLPSALwO4dVH4CoD/dQZopUBnAZia9UIJEZAlvtJGwwuyv3geeHqTbV0G8JcA/vIN4DSAH98PnDgOIAHSnhd6cUqDFgHPPg98raSuOoCfPwIcPRwEdgrMLwDfugKcA7ACoBnOPQrgp5rAqRPAUgeYiHx/ZmdCfwAsLAFRG7i6AHx/0X+cgrxGG8CjdeDwNDCVAa9eAhbC3zMAPXghH4dxzsLv+Zin4f+W7yogwPSEF/oL80C3B/QyYPboJgfSYDDsCu5EoF8EcB/9fjwcWx9MY4RVIOAFeRR5Yd8Lf++mtwvzHL3EC6VGHFbRmV+VN1p+Ffr2O8C/Pw+8/i46BgBvAHjjJvCjN4GPnAZmpoEoAaLMfyyyxGsQb1Zc3wdw7hIwPeO1jjQFlrrA83RO3rc3AHTXgL+34O/TaPkVe5qp0I3g7/nOIvCdcGyF6pruA40u8NzK5j9glXDwX7eAGoCHAHzs6p1WbDAY7ibuxMrl2wAeFJHTItIA8CkAT2366iC0c047L6eZXzGm0JV6GTq98D2I/TUZPGWAFvDmdeDf3YEwZzwD4Nzb/gOSQu+FCEAcVrQVyP+WBtqm16s+9wqAywtAktebeSqll/ofROuPRwy/gj+7hb5tFgN4qurtxbtQucFg2Da8a4HunEsBfA7AH8O/7192zr24mWtz7px/94WwgM+F1zr6QzcB0giIWv7cuOH59HeuA199fTOqwubxpwPgpVe95pDF4WPT8AJ0PRUnRaArwmp7I33o633g6rynZ5ABSz3Px3c6vq4oLo4bI4EX/N2td2/TuJt1GwyGO8cdcejOua8D+PpWr8sy5XIj+H9yoZ6QxFpv9dsNm6AJ/Oo3agCdBPjWW361uxEEnnueCG24tM65fQB/PgCmrwNHp/3GYRQ0iSoBC3gB2O0CWSus0jf4fK4BOHcNmMtpl67/cAGeXmpFyneXodXyfbqx/m3eNZbuUr0Gg2F7cNc3RQtwGHLoGQJnjuKGaG79AqwvvNKwgu8FCxc0gMtXgR9s0IQP7/f/T034+zcibxHS6wLvnPc2l/2S61YAPHsDeDQL1jVRsBRZ515DagZ+1d1d7wsV8B8BnLoKHD/sJycNy+IsBeKs+n49+DadGgPOr258n62iDuDwPgBGuxgM9yx2VqDfiiDN8xV7/nuUt6qxzqWx/+l0vKCMG8Cb6yzNmwB+/KAXwgDQDgI5S7ygnGsDhx8ETlwH/uQmUCYTXwNwagmYmPIr9NYGAr0Hby4Yx75PrRbKvxa34IVFYHbWXzf8qJH1TxlS+L48NAdMXAWWVr1des/5/6cmvFbz0ipwc5173wdg5hCQ9fwm8HQMRD0/XifmYALdYLiHYbFcDAaDYUSwoyv03GMxz
h1XcvBnJdaV6Hr89FInbIqGupY63ja7Cj85Dsy29PdW7DlpNPz1reDcFE8Djy0B/2FQXs87A+B4BLQafgW+HnoIWkBKWscm8D0AJy4AH3hIefcsW39PAfBj206BMxNA2g5tSNQaqAu/kn+m4vqDAB69z2sSWTBKnwAQt/2Fke2KGgz3NHaccomi4KGY74oScl49od+r0B0EOiLQHkvz1ZuhH4LfzGxF2uE4AxqptifKgCj19zw6B+y/VE5NvA3gA0nYtMzWbyMANNqeCsk9TjeL764Bp3JPVPjN1SQFoiZKjfPzqmP4PYEk9R+SVkO5/FYMLCzffi3X0U6Cw1LPt7uRBSekCMhMoBsM9zR2h0O/VQpmtxTDqnSjVW0SNkTTzDv5VOFUM3DB8Lww4FfjjcgL9jj/yAQBPdXyjjRlK9lleK/NRjCXXE9GRwgfC/iP1Fb4rfMAzl0ATh33v/cyz4FHMUoFeoZgJx95zSCLwj5DXg5/i2vwhuUluAkvxKfgG5wL9DjSegwGw72LHRfo2UZLWqDoHrnOKWka4qwA6FQEjdwHb5WSeytFmV6fx3vJhW4e2ySKgLkDqLT/u74MTE17U8n1kCEIw8iv/rc62N9aBubyD1Ab6HaKZp2M3IQyd8hih6v8w5MC6GxwzyRYGsVZoIpyE9NcqzIYDPcsdt/KpeRQmgvd9QR6TQV6tA6f3QC852XqV5ssEOPUr1rzsANxsMNIod8AAA7iSURBVLLJzRHrKDdKWUAQdo3SLhT6Mvw2ZRp0rAz3NYHzt6y8rwB4Kbi7PvSgF+pphSaSC/RcE4hSPZZrPQk25uE78Fx7/uFrZP4h4Rg7BoPh3sSOCnTnNl6h50IaWP/cOMRwyYL54HrURxY2TjNaoWfwwj0KwhyBT08bPlYLUu+kU2al14MK8nXvCz0xt3mvwtFZ4PLF2z8gfxX+n74OHD3u47UUgrgE5II6AtBIoLb+UKHeioCpBsptMgMWUmCppR/TVqT275vSrgwGw65h51foGwiFLPWRDYH1PSujnMoIAbnWXTw2vOBOUBTouSaARIV6CiBO1hdeKdbXHnKwkN1ohR4DeHgf8IMKO+/nbgKzcz7wGK6VtwmRj0DZChuijdzOPzhhpTEws4FA76bAQuy1jyjzVjIt+B8T6AbDvY2dXaGD6BQUhXAeuiXLQgILhLjcFcjjmuRUQNVKOYH/W5KFELj0t6F8ioLFSljFZ10NElaGzW4QRvA3H8Z9j4pJKwroAkdngOuL5XFoLgK4MB9MCEsQQsF7iiRoHlnoQ4TAp28UqwDe8ocjYuabqUlqFLrBcK9jRwV6HPs4KDPTnibpdeCpjZbnhntZMPELUjdLfejWMqOMt5eBw3NeEL+9TljXRXgLmLk8cQZZuQCe3umG0AGNCf9/p+cDb3VLqA3AUy7dxAcG22gAkxBHvdHyH5SqhH+9Jc97f/AkcLEilP3T14ADFdfPAWgFj9n8Y5drBSm8NdBS6vu2HqYGwFSYlyyEC44BpG5jE02DwbC72FGB3k+Bv74CpFf8jXvwjivvBXDikOeuczNCAN50rqKufLMT8EKsYuEKAOguAY05v0ptkyVLlvk6GsGZKUmD/XakySDKPiYt6Cp4o1gubHueriMRW/BtazV8KqiqaJFVgbeW4D8Y6S3xcQDSUqCx5qvQgF+lZymQDkI9sv41BoPh3sCOC/S/Kjl+GcAnloDp2bB5F1bo7cwL6rKF8oT4NHFZ5D8Ec+MVJ8JHLZxoe4EX5yngoiBgg/BOA82Qe1UuLVWaa2MOgdq4hcIpw1Cgb+BYlNvCZwnw6IPAxdc2qPgWhEW1bsSmt2yK5tTPRpvSUOrJYDDsLdwTtOg7ALK2pzmSnHZJvGCfqLimgeAYBG+JMTPl6ZkyzDsAadEyJgvSLk29wO/1/Kq73fCC9c11kiHPwps8Rsn6Aj1FqD/k80zXsxkMK/6k49vxI+ucWoYEABqB7w7mmWnJT1LF+XCbMyBzZNvuCrS6wWC4R7GjAp2zyTNW4LP1XO0ACx0VuPlGaRnyDcdc6rRawImKc88CuHpV7bQTqMDLUkriHBIlZ/Bp4aow3fTCPErXF+gJfN1p+ECtu+odYOj4NH8dOHXQJ63eLNbg6ZQyQT78iKXrm1kCKtCZprHFusGwN7CjAj2qVQvdCwOgkwLxhN+cbEz41WaVR3+EICQB79kI4FTFuSsAXrjpk06nwXwvifxP2gDiltcOUgDXl4DvrhOG9yC85pCGD8lmVui5N+t6yGkRZKqdfGCDa27FUjd8QPLVeC7UQQJ+gzqGK3TYitxg2GvYUKCLyH0i8rSIvCQiL4rIL4bjvyoiF0Xk++Hnk5u54VzF8asAerEX6EnD/1yeXyd2d+DAhyZ5ETB3yHt3luEZAK9eA5JYf5Yyf5/WLNAKmYh+eMPHPa/CKcDbdAPDGClVGJpVhl82GuzcBn+q5YX67KS3aqmybLkVvCIvrNDTgjKzfhvgKZZbrBdNuBsMewCb2RRNAfyyc+67IjIJ4Dsi8o3wt99yzv3GVm54uMKf/g0A6UUv8HPLuvUSlEbBozPPBJQEn/czAF6puOZpAJdDrrlTdWBm1tezlPpcpN9dWz/5gwCYGPdmjkkGTGyQJBrw5ooINtxxldkMgImaOivFEZB2/Ebuo01/7I/X4fRzdAbAVJDEMdmR5x+W9SisHDEACBAR127C3GDYG9hQoDvnLiGk3HTOLYvIy/CWdQaDwWC4h7Als0UROQXgw/AMxk8A+JyI/DyAZ+FX8bctcEXkCQBPAMBkDZg7jEoj6/PhZyMcQnCBD7RLmvgNwTTyeS/PLlabHL6c/9/H+pmhS3AKnmbpdOGXrfH60Qsdih6l0Xor9Lby3/kGapYAU8HAft/axtnf8giUecTECH5M0kh/38xqO48+mWW+E0O79ko3V4PBcC9g05uiIjIB4A8B/JJzbgnA7wC4H8Aj8KLxfy+7zjn3pHPuUefco+2a9xB9zx02+mEAs9PeC7PX8QKoFWKxTDSAH9+KecgmcQDA7Lg3cUwG3qwvamw8gL0kbIwm69uh5/HYkz6Gwndp2ZsxJp3NbZD2oA5BqQs/wSY9CkJ+oy94HhMG8H0ctk/I4ctgMNyT2NQrKiJ1eGH+e865rwKAc+6Kc27gnMsA/F8AHttMXQ0Aj+5/l60NOHUMmG6p6WAcBFUcrE5mp4H339ktCjgI4AMHQ5CqLPwfPk4bORb1+l4wJqlP2FyFJAQIyxDCBYSZSfr+5+i+jXmuBCU24y4IZud58Y3aGwG3pQGM4G344x11QzMYDFvFZqxcBMDvAnjZOfebdPwInfZfAXhhw7s5v1KdnQb+Tg0YexcN/q8PBi/NkE1nIvIepXHiowzGPSBdAE7tA/7T5uYtRKrwfgDvbQKYB2Zjn8pubgyYjoCotzGF0QOQycbBvJKeF5itmtcAogho12hTswcc3+BeLdw+oRwGIBa/+boeOIRCJPQTmUA3GO51bOYV/QkA/xDA8yLy/XDsnwL4tIg8As+qvgngH21Ukcu8YIpib2f90xnwzhrw
g0004j4A7637RM/JPIYJkdm1PcrtrDNvWz7VBtpd4NyicuebwTh8fJkHDgDtto8Fk/S8EG8ECd5Lgagb0rVVYD+AhvhYM1ni27Yft1vSTEJXwY1IaZJGBESBc++ueYF9BNXU/62TyREtI/iV+kLV5gIhQ+DRGyGfKDYXLthgMOwuNmPl8hcod/D8+pbvFlzwe10gXQWm9wOHjwNH571DTwbP/3JAwOk6cHgWmGh5Adq5DkxFnm5BiNWdx/xOMv+xmIj9Jun8ghdGHzgEnEmBV2+oY003/CTwQqsN79J/vA6cmAKm2xpydzqEJejM+/u2G4HiSYATdeDDZIbZhbb/qPgPQpYoFfLecM9OOC8GcBghGmPmefZ2yEyUuNA/qHnk0dDeDkK43PAzB2CGxi0X5vkEZwDadeCBKSC5AVxAMazCBLwG0B739E8ch1U5tAKLh24w3NvYeSU69VYiM5PezX5hHj4W+JQXQN1usGABMD3jV8bz14H5xK/OZ6d8mNgoA7o9L3QmWt5RKKcXssw750xMAEs9YL7jheVD+/wmZd7zdlh+xlkQYIG6iXsh/gq8gO2kfkNwouU1i1aQqFnmN2NPiW8rAFxPQuTDEOsl7QbhL8B0sJvvDYCZpqa6y1fBWc8L8bkWcH3F13M0fEobEdAdeMHbFs/HN4LVSQyvKeTp50LCpQLVgshz8VdvAIdrwOEIaAf1Is58/Z0FoLfiPXqjLGyCxl776WYYJgcxGAz3JsS5nbNDE5FlVPv97FXMAri+243YRlh/7n2MWp+sPxvjpHPu4EYn7fQK/RXn3KM7fM+7ChF5dpT6ZP259zFqfbL+bB9sq8tgMBhGBCbQDQaDYUSw0wL9yR2+305g1Ppk/bn3MWp9sv5sE3Z0U9RgMBgMdw9GuRgMBsOIYMcEuoh8QkReEZGzIvL5nbrvdkJE3hSR50NCj2fDsRkR+YaIvBb+v8NINXcXIvIFEbkqIi/QsdI+iMc/D3P2nIh8ZPdaXo6K/lQmXxGRfxL684qI/Oe70+pqrJNQZk/O0btJkLMH5qglIt8SkR+EPv1aOH5aRJ4Jbf8DEWmE483w+9nw91N3rXHOubv+A5+/+XX4/BMNeG//h3fi3tvcjzcBzN5y7H8D8PlQ/jyAX9/tdm7Qh58C8BEAL2zUBwCfBPD/wnsKfxTAM7vd/k3251cB/I8l5z4cnr0mgNPhmaztdh9uaeMRAB8J5UkAr4Z278k5Wqc/e3mOBMBEKNfhw4l/FMCXAXwqHP8XAP7bUP7vAPyLUP4UgD+4W23bqRX6YwDOOufOOecSAF8C8PgO3ftu43EAXwzlLwL4u7vYlg3hnPtzAPO3HK7qw+MA/rXz+GsA07cEZdt1VPSnCo8D+JJzbs059wZ8/vBNRQndKTjnLjnnvhvKy/BhiI5hj87ROv2pwl6YI+ecy1Mh1MOPA/AxAF8Jx2+do3zuvgLgZ0LQw23HTgn0YyjmrriAvZn1yAH4/0TkOyFxBwAccj6rEwBchs+/sddQ1Ye9PG+fCxTEF4gG21P9uSWhzJ6fo1v6A+zhORKRWghWeBXAN+A1iQXnXB4uits97FP4+yLuPBBsKWxTdGv4SefcRwD8LIBfEJGf4j86r1PtabOhUegDNpl85V5GSUKZIfbiHL3bBDn3KpzPBfEIfEy7x+Dj7u06dkqgX4SPgJvjOCoT0d27cM5dDP9fBfD/wE/klVzFDf9f3b0WvmtU9WFPzpurTr6yJ/pTllAGe3iOtpgg557vD8M5twCff/7H4OmuPJwKt3vYp/D3fQBu3I327JRA/zaAB8MucAN+Y+CpHbr3tkBExkVkMi8D+M/gk3o8BeAz4bTPAPja7rTwjlDVh6cA/HywpPgogEVS++9ZSHXylacAfCpYHZwG8CCAb+10+9ZD4FZvSyiDPTpHVf3Z43N0UESmQ3kMwMfh9waeBvBz4bRb5yifu58D8KdBy9p+7ODO8Cfhd7hfB/ArO3XfbWz/Gfjd9x8AeDHvAzwX9k0ArwH4EwAzu93WDfrx+/Aqbh+e5/tsVR/gd/N/O8zZ8wAe3e32b7I//ya09zn4l+kInf8roT+vAPjZ3W5/SX9+Ep5OeQ7A98PPJ/fqHK3Tn708Rx8E8L3Q9hcA/M/h+Bn4j89ZAP8WQDMcb4Xfz4a/n7lbbTNPUYPBYBgR2KaowWAwjAhMoBsMBsOIwAS6wWAwjAhMoBsMBsOIwAS6wWAwjAhMoBsMBsOIwAS6wWAwjAhMoBsMBsOI4P8HQw6G7YaZIk4AAAAASUVORK5CYII=", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "import matplotlib.pyplot as plt\n", "# Read the picture\n", "raw_img = cv2.imread(\"/home/aistudio/work/word_1.png\")\n", "plt.figure()\n", "plt.subplot(2,1,1)\n", "# Visualize the original image\n", "plt.imshow(raw_img)\n", "# Scale and normalize\n", "padding_im, draw_img = resize_norm_img(raw_img)\n", "plt.subplot(2,1,2)\n", "# Visual network input diagram\n", "plt.imshow(draw_img)\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**[Network Structure]**\n", "\n", "* backbone\n", "\n", "PaddleOCR uses MobileNetV3 as the backbone network. The networking sequence is consistent with the network structure. First, define the public modules in the network ([source code location](https://github.com/PaddlePaddle/PaddleOCR/blob/release/2.3/ppocr/modeling/backbones/rec_mobilenet_v3.py)): `ConvBNLayer`, `ResidualUnit`, and `make_divisible`." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "import paddle\n", "import paddle.nn as nn\n", "import paddle.nn.functional as F\n", "\n", "class ConvBNLayer(nn.Layer):\n", " def __init__(self,\n", " in_channels,\n", " out_channels,\n", " kernel_size,\n", " stride,\n", " padding,\n", " groups=1,\n", " if_act=True,\n", " act=None):\n", " \"\"\"\n", " Convolutional BN layer\n", " :param in_channels: number of input channels\n", " :param out_channels: Number of output channels\n", " :param kernel_size: Convolution kernel size\n", " :parma stride: stride size\n", " :param padding: padding size\n", " :param groups: the number of groups of the two-dimensional convolutional layer\n", " :param if_act: whether to add activation function\n", " :param act: activation function\n", " \"\"\"\n", " super(ConvBNLayer, self).__init__()\n", " self.if_act = if_act\n", " self.act = act\n", " self.conv = nn.Conv2D(\n", " in_channels=in_channels,\n", " out_channels=out_channels,\n", " kernel_size=kernel_size,\n", " stride=stride,\n", " padding=padding,\n", " groups=groups,\n", " bias_attr=False)\n", "\n", " self.bn = nn.BatchNorm(num_channels=out_channels, act=None)\n", "\n", " def forward(self, x):\n", " # conv layer\n", " x = self.conv(x)\n", " # batchnorm layer\n", " x = self.bn(x)\n", " # Whether to use activation function\n", " if self.if_act:\n", " if self.act == \"relu\":\n", " x = F.relu(x)\n", " elif self.act == \"hardswish\":\n", " x = F.hardswish(x)\n", " else:\n", " print(\"The activation function({}) is selected incorrectly.\".\n", " format(self.act))\n", " exit()\n", " return x\n", "\n", "class SEModule(nn.Layer):\n", " def __init__(self, in_channels, reduction=4):\n", " \"\"\"\n", " SE module\n", " :param in_channels: number of input channels\n", " :param reduction: channel zoom ratio\n", " \"\"\" \n", " super(SEModule, self).__init__()\n", " self.avg_pool = nn.AdaptiveAvgPool2D(1)\n", " self.conv1 = nn.Conv2D(\n", " in_channels=in_channels,\n", " out_channels=in_channels // reduction,\n", " kernel_size=1,\n", " stride=1,\n", " padding=0)\n", " self.conv2 = nn.Conv2D(\n", " in_channels=in_channels // reduction,\n", " out_channels=in_channels,\n", " kernel_size=1,\n", " stride=1,\n", " padding=0)\n", "\n", " def forward(self, inputs):\n", " # Average pooling\n", " outputs = self.avg_pool(inputs)\n", " # First convolutional layer\n", " outputs = self.conv1(outputs)\n", " # relu activation function\n", " outputs = F.relu(outputs)\n", " # The second convolutional 
layer\n", " outputs = self.conv2(outputs)\n", " # hardsigmoid activation function\n", " outputs = F.hardsigmoid(outputs, slope=0.2, offset=0.5)\n", " return inputs * outputs\n", "\n", "\n", "class ResidualUnit(nn.Layer):\n", " def __init__(self,\n", " in_channels,\n", " mid_channels,\n", " out_channels,\n", " kernel_size,\n", " stride,\n", " use_se,\n", " act=None):\n", " \"\"\"\n", " Residual layer\n", " :param in_channels: number of input channels\n", " :param mid_channels: number of intermediate channels\n", " :param out_channels: Number of output channels\n", " :param kernel_size: Convolution kernel size\n", " :parma stride: stride size\n", " :param use_se: whether to use se module\n", " :param act: activation function\n", " \"\"\" \n", " super(ResidualUnit, self).__init__()\n", " self.if_shortcut = stride == 1 and in_channels == out_channels\n", " self.if_se = use_se\n", "\n", " self.expand_conv = ConvBNLayer(\n", " in_channels=in_channels,\n", " out_channels=mid_channels,\n", " kernel_size=1,\n", " stride=1,\n", " padding=0,\n", " if_act=True,\n", " act=act)\n", " self.bottleneck_conv = ConvBNLayer(\n", " in_channels=mid_channels,\n", " out_channels=mid_channels,\n", " kernel_size=kernel_size,\n", " stride=stride,\n", " padding=int((kernel_size - 1) // 2),\n", " groups=mid_channels,\n", " if_act=True,\n", " act=act)\n", " if self.if_se:\n", " self.mid_se = SEModule(mid_channels)\n", " self.linear_conv = ConvBNLayer(\n", " in_channels=mid_channels,\n", " out_channels=out_channels,\n", " kernel_size=1,\n", " stride=1,\n", " padding=0,\n", " if_act=False,\n", " act=None)\n", "\n", " def forward(self, inputs):\n", " x = self.expand_conv(inputs)\n", " x = self.bottleneck_conv(x)\n", " if self.if_se:\n", " x = self.mid_se(x)\n", " x = self.linear_conv(x)\n", " if self.if_shortcut:\n", " x = paddle.add(inputs, x)\n", " return x\n", "\n", "\n", "def make_divisible(v, divisor=8, min_value=None):\n", " \"\"\"\n", " Make sure to be divisible by 8\n", " \"\"\" \n", " if min_value is None:\n", " min_value = divisor\n", " new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)\n", " if new_v < 0.9 * v:\n", " new_v += divisor\n", " return new_v\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Use public modules to build backbone networks:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "class MobileNetV3(nn.Layer):\n", " def __init__(self,\n", " in_channels=3,\n", " model_name='small',\n", " scale=0.5,\n", " small_stride=None,\n", " disable_se=False,\n", " **kwargs):\n", " super(MobileNetV3, self).__init__()\n", " self.disable_se = disable_se\n", " \n", " small_stride = [1, 2, 2, 2]\n", "\n", " if model_name == \"small\":\n", " cfg = [\n", " # k, exp, c, se, nl, s,\n", " [3, 16, 16, True, 'relu', (small_stride[0], 1)],\n", " [3, 72, 24, False, 'relu', (small_stride[1], 1)],\n", " [3, 88, 24, False, 'relu', 1],\n", " [5, 96, 40, True, 'hardswish', (small_stride[2], 1)],\n", " [5, 240, 40, True, 'hardswish', 1],\n", " [5, 240, 40, True, 'hardswish', 1],\n", " [5, 120, 48, True, 'hardswish', 1],\n", " [5, 144, 48, True, 'hardswish', 1],\n", " [5, 288, 96, True, 'hardswish', (small_stride[3], 1)],\n", " [5, 576, 96, True, 'hardswish', 1],\n", " [5, 576, 96, True, 'hardswish', 1],\n", " ]\n", " cls_ch_squeeze = 576\n", " else:\n", " raise NotImplementedError(\"mode[\" + model_name +\n", " \"_model] is not implemented!\")\n", "\n", " supported_scale = [0.35, 0.5, 0.75, 
1.0, 1.25]\n", " assert scale in supported_scale, \\\n", " \"supported scales are {} but input scale is {}\".format(supported_scale, scale)\n", "\n", " inplanes = 16\n", " # conv1\n", " self.conv1 = ConvBNLayer(\n", " in_channels=in_channels,\n", " out_channels=make_divisible(inplanes * scale),\n", " kernel_size=3,\n", " stride=2,\n", " padding=1,\n", " groups=1,\n", " if_act=True,\n", " act='hardswish')\n", " i = 0\n", " block_list = []\n", " inplanes = make_divisible(inplanes * scale)\n", " for (k, exp, c, se, nl, s) in cfg:\n", " se = se and not self.disable_se\n", " block_list.append(\n", " ResidualUnit(\n", " in_channels=inplanes,\n", " mid_channels=make_divisible(scale * exp),\n", " out_channels=make_divisible(scale * c),\n", " kernel_size=k,\n", " stride=s,\n", " use_se=se,\n", " act=nl))\n", " inplanes = make_divisible(scale * c)\n", " i += 1\n", " self.blocks = nn.Sequential(*block_list)\n", "\n", " self.conv2 = ConvBNLayer(\n", " in_channels=inplanes,\n", " out_channels=make_divisible(scale * cls_ch_squeeze),\n", " kernel_size=1,\n", " stride=1,\n", " padding=0,\n", " groups=1,\n", " if_act=True,\n", " act='hardswish')\n", "\n", " self.pool = nn.MaxPool2D(kernel_size=2, stride=2, padding=0)\n", " self.out_channels = make_divisible(scale * cls_ch_squeeze)\n", "\n", " def forward(self, x):\n", " x = self.conv1(x)\n", " x = self.blocks(x)\n", " x = self.conv2(x)\n", " x = self.pool(x)\n", " return x\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "At this point, the definition of the backbone network is completed, and the entire network structure can be visualized through the paddle.summary structure:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "-------------------------------------------------------------------------------\n", " Layer (type) Input Shape Output Shape Param # \n", "===============================================================================\n", " Conv2D-1 [[1, 3, 32, 320]] [1, 8, 16, 160] 216 \n", " BatchNorm-1 [[1, 8, 16, 160]] [1, 8, 16, 160] 32 \n", " ConvBNLayer-1 [[1, 3, 32, 320]] [1, 8, 16, 160] 0 \n", " Conv2D-2 [[1, 8, 16, 160]] [1, 8, 16, 160] 64 \n", " BatchNorm-2 [[1, 8, 16, 160]] [1, 8, 16, 160] 32 \n", " ConvBNLayer-2 [[1, 8, 16, 160]] [1, 8, 16, 160] 0 \n", " Conv2D-3 [[1, 8, 16, 160]] [1, 8, 16, 160] 72 \n", " BatchNorm-3 [[1, 8, 16, 160]] [1, 8, 16, 160] 32 \n", " ConvBNLayer-3 [[1, 8, 16, 160]] [1, 8, 16, 160] 0 \n", "AdaptiveAvgPool2D-1 [[1, 8, 16, 160]] [1, 8, 1, 1] 0 \n", " Conv2D-4 [[1, 8, 1, 1]] [1, 2, 1, 1] 18 \n", " Conv2D-5 [[1, 2, 1, 1]] [1, 8, 1, 1] 24 \n", " SEModule-1 [[1, 8, 16, 160]] [1, 8, 16, 160] 0 \n", " Conv2D-6 [[1, 8, 16, 160]] [1, 8, 16, 160] 64 \n", " BatchNorm-4 [[1, 8, 16, 160]] [1, 8, 16, 160] 32 \n", " ConvBNLayer-4 [[1, 8, 16, 160]] [1, 8, 16, 160] 0 \n", " ResidualUnit-1 [[1, 8, 16, 160]] [1, 8, 16, 160] 0 \n", " Conv2D-7 [[1, 8, 16, 160]] [1, 40, 16, 160] 320 \n", " BatchNorm-5 [[1, 40, 16, 160]] [1, 40, 16, 160] 160 \n", " ConvBNLayer-5 [[1, 8, 16, 160]] [1, 40, 16, 160] 0 \n", " Conv2D-8 [[1, 40, 16, 160]] [1, 40, 8, 160] 360 \n", " BatchNorm-6 [[1, 40, 8, 160]] [1, 40, 8, 160] 160 \n", " ConvBNLayer-6 [[1, 40, 16, 160]] [1, 40, 8, 160] 0 \n", " Conv2D-9 [[1, 40, 8, 160]] [1, 16, 8, 160] 640 \n", " BatchNorm-7 [[1, 16, 8, 160]] [1, 16, 8, 160] 64 \n", " ConvBNLayer-7 [[1, 40, 8, 160]] [1, 16, 8, 160] 0 \n", " ResidualUnit-2 [[1, 8, 16, 160]] [1, 16, 
8, 160] 0 \n", " Conv2D-10 [[1, 16, 8, 160]] [1, 48, 8, 160] 768 \n", " BatchNorm-8 [[1, 48, 8, 160]] [1, 48, 8, 160] 192 \n", " ConvBNLayer-8 [[1, 16, 8, 160]] [1, 48, 8, 160] 0 \n", " Conv2D-11 [[1, 48, 8, 160]] [1, 48, 8, 160] 432 \n", " BatchNorm-9 [[1, 48, 8, 160]] [1, 48, 8, 160] 192 \n", " ConvBNLayer-9 [[1, 48, 8, 160]] [1, 48, 8, 160] 0 \n", " Conv2D-12 [[1, 48, 8, 160]] [1, 16, 8, 160] 768 \n", " BatchNorm-10 [[1, 16, 8, 160]] [1, 16, 8, 160] 64 \n", " ConvBNLayer-10 [[1, 48, 8, 160]] [1, 16, 8, 160] 0 \n", " ResidualUnit-3 [[1, 16, 8, 160]] [1, 16, 8, 160] 0 \n", " Conv2D-13 [[1, 16, 8, 160]] [1, 48, 8, 160] 768 \n", " BatchNorm-11 [[1, 48, 8, 160]] [1, 48, 8, 160] 192 \n", " ConvBNLayer-11 [[1, 16, 8, 160]] [1, 48, 8, 160] 0 \n", " Conv2D-14 [[1, 48, 8, 160]] [1, 48, 4, 160] 1,200 \n", " BatchNorm-12 [[1, 48, 4, 160]] [1, 48, 4, 160] 192 \n", " ConvBNLayer-12 [[1, 48, 8, 160]] [1, 48, 4, 160] 0 \n", "AdaptiveAvgPool2D-2 [[1, 48, 4, 160]] [1, 48, 1, 1] 0 \n", " Conv2D-15 [[1, 48, 1, 1]] [1, 12, 1, 1] 588 \n", " Conv2D-16 [[1, 12, 1, 1]] [1, 48, 1, 1] 624 \n", " SEModule-2 [[1, 48, 4, 160]] [1, 48, 4, 160] 0 \n", " Conv2D-17 [[1, 48, 4, 160]] [1, 24, 4, 160] 1,152 \n", " BatchNorm-13 [[1, 24, 4, 160]] [1, 24, 4, 160] 96 \n", " ConvBNLayer-13 [[1, 48, 4, 160]] [1, 24, 4, 160] 0 \n", " ResidualUnit-4 [[1, 16, 8, 160]] [1, 24, 4, 160] 0 \n", " Conv2D-18 [[1, 24, 4, 160]] [1, 120, 4, 160] 2,880 \n", " BatchNorm-14 [[1, 120, 4, 160]] [1, 120, 4, 160] 480 \n", " ConvBNLayer-14 [[1, 24, 4, 160]] [1, 120, 4, 160] 0 \n", " Conv2D-19 [[1, 120, 4, 160]] [1, 120, 4, 160] 3,000 \n", " BatchNorm-15 [[1, 120, 4, 160]] [1, 120, 4, 160] 480 \n", " ConvBNLayer-15 [[1, 120, 4, 160]] [1, 120, 4, 160] 0 \n", "AdaptiveAvgPool2D-3 [[1, 120, 4, 160]] [1, 120, 1, 1] 0 \n", " Conv2D-20 [[1, 120, 1, 1]] [1, 30, 1, 1] 3,630 \n", " Conv2D-21 [[1, 30, 1, 1]] [1, 120, 1, 1] 3,720 \n", " SEModule-3 [[1, 120, 4, 160]] [1, 120, 4, 160] 0 \n", " Conv2D-22 [[1, 120, 4, 160]] [1, 24, 4, 160] 2,880 \n", " BatchNorm-16 [[1, 24, 4, 160]] [1, 24, 4, 160] 96 \n", " ConvBNLayer-16 [[1, 120, 4, 160]] [1, 24, 4, 160] 0 \n", " ResidualUnit-5 [[1, 24, 4, 160]] [1, 24, 4, 160] 0 \n", " Conv2D-23 [[1, 24, 4, 160]] [1, 120, 4, 160] 2,880 \n", " BatchNorm-17 [[1, 120, 4, 160]] [1, 120, 4, 160] 480 \n", " ConvBNLayer-17 [[1, 24, 4, 160]] [1, 120, 4, 160] 0 \n", " Conv2D-24 [[1, 120, 4, 160]] [1, 120, 4, 160] 3,000 \n", " BatchNorm-18 [[1, 120, 4, 160]] [1, 120, 4, 160] 480 \n", " ConvBNLayer-18 [[1, 120, 4, 160]] [1, 120, 4, 160] 0 \n", "AdaptiveAvgPool2D-4 [[1, 120, 4, 160]] [1, 120, 1, 1] 0 \n", " Conv2D-25 [[1, 120, 1, 1]] [1, 30, 1, 1] 3,630 \n", " Conv2D-26 [[1, 30, 1, 1]] [1, 120, 1, 1] 3,720 \n", " SEModule-4 [[1, 120, 4, 160]] [1, 120, 4, 160] 0 \n", " Conv2D-27 [[1, 120, 4, 160]] [1, 24, 4, 160] 2,880 \n", " BatchNorm-19 [[1, 24, 4, 160]] [1, 24, 4, 160] 96 \n", " ConvBNLayer-19 [[1, 120, 4, 160]] [1, 24, 4, 160] 0 \n", " ResidualUnit-6 [[1, 24, 4, 160]] [1, 24, 4, 160] 0 \n", " Conv2D-28 [[1, 24, 4, 160]] [1, 64, 4, 160] 1,536 \n", " BatchNorm-20 [[1, 64, 4, 160]] [1, 64, 4, 160] 256 \n", " ConvBNLayer-20 [[1, 24, 4, 160]] [1, 64, 4, 160] 0 \n", " Conv2D-29 [[1, 64, 4, 160]] [1, 64, 4, 160] 1,600 \n", " BatchNorm-21 [[1, 64, 4, 160]] [1, 64, 4, 160] 256 \n", " ConvBNLayer-21 [[1, 64, 4, 160]] [1, 64, 4, 160] 0 \n", "AdaptiveAvgPool2D-5 [[1, 64, 4, 160]] [1, 64, 1, 1] 0 \n", " Conv2D-30 [[1, 64, 1, 1]] [1, 16, 1, 1] 1,040 \n", " Conv2D-31 [[1, 16, 1, 1]] [1, 64, 1, 1] 1,088 \n", " SEModule-5 [[1, 64, 4, 160]] [1, 64, 4, 
160] 0 \n", " Conv2D-32 [[1, 64, 4, 160]] [1, 24, 4, 160] 1,536 \n", " BatchNorm-22 [[1, 24, 4, 160]] [1, 24, 4, 160] 96 \n", " ConvBNLayer-22 [[1, 64, 4, 160]] [1, 24, 4, 160] 0 \n", " ResidualUnit-7 [[1, 24, 4, 160]] [1, 24, 4, 160] 0 \n", " Conv2D-33 [[1, 24, 4, 160]] [1, 72, 4, 160] 1,728 \n", " BatchNorm-23 [[1, 72, 4, 160]] [1, 72, 4, 160] 288 \n", " ConvBNLayer-23 [[1, 24, 4, 160]] [1, 72, 4, 160] 0 \n", " Conv2D-34 [[1, 72, 4, 160]] [1, 72, 4, 160] 1,800 \n", " BatchNorm-24 [[1, 72, 4, 160]] [1, 72, 4, 160] 288 \n", " ConvBNLayer-24 [[1, 72, 4, 160]] [1, 72, 4, 160] 0 \n", "AdaptiveAvgPool2D-6 [[1, 72, 4, 160]] [1, 72, 1, 1] 0 \n", " Conv2D-35 [[1, 72, 1, 1]] [1, 18, 1, 1] 1,314 \n", " Conv2D-36 [[1, 18, 1, 1]] [1, 72, 1, 1] 1,368 \n", " SEModule-6 [[1, 72, 4, 160]] [1, 72, 4, 160] 0 \n", " Conv2D-37 [[1, 72, 4, 160]] [1, 24, 4, 160] 1,728 \n", " BatchNorm-25 [[1, 24, 4, 160]] [1, 24, 4, 160] 96 \n", " ConvBNLayer-25 [[1, 72, 4, 160]] [1, 24, 4, 160] 0 \n", " ResidualUnit-8 [[1, 24, 4, 160]] [1, 24, 4, 160] 0 \n", " Conv2D-38 [[1, 24, 4, 160]] [1, 144, 4, 160] 3,456 \n", " BatchNorm-26 [[1, 144, 4, 160]] [1, 144, 4, 160] 576 \n", " ConvBNLayer-26 [[1, 24, 4, 160]] [1, 144, 4, 160] 0 \n", " Conv2D-39 [[1, 144, 4, 160]] [1, 144, 2, 160] 3,600 \n", " BatchNorm-27 [[1, 144, 2, 160]] [1, 144, 2, 160] 576 \n", " ConvBNLayer-27 [[1, 144, 4, 160]] [1, 144, 2, 160] 0 \n", "AdaptiveAvgPool2D-7 [[1, 144, 2, 160]] [1, 144, 1, 1] 0 \n", " Conv2D-40 [[1, 144, 1, 1]] [1, 36, 1, 1] 5,220 \n", " Conv2D-41 [[1, 36, 1, 1]] [1, 144, 1, 1] 5,328 \n", " SEModule-7 [[1, 144, 2, 160]] [1, 144, 2, 160] 0 \n", " Conv2D-42 [[1, 144, 2, 160]] [1, 48, 2, 160] 6,912 \n", " BatchNorm-28 [[1, 48, 2, 160]] [1, 48, 2, 160] 192 \n", " ConvBNLayer-28 [[1, 144, 2, 160]] [1, 48, 2, 160] 0 \n", " ResidualUnit-9 [[1, 24, 4, 160]] [1, 48, 2, 160] 0 \n", " Conv2D-43 [[1, 48, 2, 160]] [1, 288, 2, 160] 13,824 \n", " BatchNorm-29 [[1, 288, 2, 160]] [1, 288, 2, 160] 1,152 \n", " ConvBNLayer-29 [[1, 48, 2, 160]] [1, 288, 2, 160] 0 \n", " Conv2D-44 [[1, 288, 2, 160]] [1, 288, 2, 160] 7,200 \n", " BatchNorm-30 [[1, 288, 2, 160]] [1, 288, 2, 160] 1,152 \n", " ConvBNLayer-30 [[1, 288, 2, 160]] [1, 288, 2, 160] 0 \n", "AdaptiveAvgPool2D-8 [[1, 288, 2, 160]] [1, 288, 1, 1] 0 \n", " Conv2D-45 [[1, 288, 1, 1]] [1, 72, 1, 1] 20,808 \n", " Conv2D-46 [[1, 72, 1, 1]] [1, 288, 1, 1] 21,024 \n", " SEModule-8 [[1, 288, 2, 160]] [1, 288, 2, 160] 0 \n", " Conv2D-47 [[1, 288, 2, 160]] [1, 48, 2, 160] 13,824 \n", " BatchNorm-31 [[1, 48, 2, 160]] [1, 48, 2, 160] 192 \n", " ConvBNLayer-31 [[1, 288, 2, 160]] [1, 48, 2, 160] 0 \n", " ResidualUnit-10 [[1, 48, 2, 160]] [1, 48, 2, 160] 0 \n", " Conv2D-48 [[1, 48, 2, 160]] [1, 288, 2, 160] 13,824 \n", " BatchNorm-32 [[1, 288, 2, 160]] [1, 288, 2, 160] 1,152 \n", " ConvBNLayer-32 [[1, 48, 2, 160]] [1, 288, 2, 160] 0 \n", " Conv2D-49 [[1, 288, 2, 160]] [1, 288, 2, 160] 7,200 \n", " BatchNorm-33 [[1, 288, 2, 160]] [1, 288, 2, 160] 1,152 \n", " ConvBNLayer-33 [[1, 288, 2, 160]] [1, 288, 2, 160] 0 \n", "AdaptiveAvgPool2D-9 [[1, 288, 2, 160]] [1, 288, 1, 1] 0 \n", " Conv2D-50 [[1, 288, 1, 1]] [1, 72, 1, 1] 20,808 \n", " Conv2D-51 [[1, 72, 1, 1]] [1, 288, 1, 1] 21,024 \n", " SEModule-9 [[1, 288, 2, 160]] [1, 288, 2, 160] 0 \n", " Conv2D-52 [[1, 288, 2, 160]] [1, 48, 2, 160] 13,824 \n", " BatchNorm-34 [[1, 48, 2, 160]] [1, 48, 2, 160] 192 \n", " ConvBNLayer-34 [[1, 288, 2, 160]] [1, 48, 2, 160] 0 \n", " ResidualUnit-11 [[1, 48, 2, 160]] [1, 48, 2, 160] 0 \n", " Conv2D-53 [[1, 48, 2, 160]] [1, 288, 2, 160] 13,824 
\n", " BatchNorm-35 [[1, 288, 2, 160]] [1, 288, 2, 160] 1,152 \n", " ConvBNLayer-35 [[1, 48, 2, 160]] [1, 288, 2, 160] 0 \n", " MaxPool2D-1 [[1, 288, 2, 160]] [1, 288, 1, 80] 0 \n", "===============================================================================\n", "Total params: 259,056\n", "Trainable params: 246,736\n", "Non-trainable params: 12,320\n", "-------------------------------------------------------------------------------\n", "Input size (MB): 0.12\n", "Forward/backward pass size (MB): 44.38\n", "Params size (MB): 0.99\n", "Estimated Total Size (MB): 45.48\n", "-------------------------------------------------------------------------------\n", "\n" ] }, { "data": { "text/plain": [ "{'total_params': 259056, 'trainable_params': 246736}" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# Define the network input shape\n", "IMAGE_SHAPE_C = 3\n", "IMAGE_SHAPE_H = 32\n", "IMAGE_SHAPE_W = 320\n", "\n", "\n", "# Visual network structure\n", "paddle.summary(MobileNetV3(),[(1, IMAGE_SHAPE_C, IMAGE_SHAPE_H, IMAGE_SHAPE_W)])" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "backbone output: [1, 288, 1, 80]\n" ] } ], "source": [ "# Picture input backbone network\n", "backbone = MobileNetV3()\n", "# Convert numpy data to Tensor\n", "input_data = paddle.to_tensor([padding_im])\n", "# Backbone network output\n", "feature = backbone(input_data)\n", "# View the latitude of the feature map\n", "print(\"backbone output:\", feature.shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "* neck\n", "\n", "The neck part converts the visual feature map output by the backbone into a 1-dimensional vector input and sends it to the LSTM network, and outputs the sequence feature ([source code location](https://github.com/PaddlePaddle/PaddleOCR/blob/release/2.3/ppocr/modeling/necks/rnn.py)):" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "class Im2Seq(nn.Layer):\n", " def __init__(self, in_channels, **kwargs):\n", " \"\"\"\n", " Image feature is converted to sequence feature\n", " :param in_channels: number of input channels\n", " \"\"\" \n", " super().__init__()\n", " self.out_channels = in_channels\n", "\n", " def forward(self, x):\n", " B, C, H, W = x.shape\n", " assert H == 1\n", " x = x.squeeze(axis=2)\n", " x = x.transpose([0, 2, 1]) # (NWC)(batch, width, channels)\n", " return x\n", "\n", "class EncoderWithRNN(nn.Layer):\n", " def __init__(self, in_channels, hidden_size):\n", " super(EncoderWithRNN, self).__init__()\n", " self.out_channels = hidden_size * 2\n", " self.lstm = nn.LSTM(\n", " in_channels, hidden_size, direction='bidirectional', num_layers=2)\n", "\n", " def forward(self, x):\n", " x, _ = self.lstm(x)\n", " return x\n", "\n", "\n", "class SequenceEncoder(nn.Layer):\n", " def __init__(self, in_channels, hidden_size=48, **kwargs):\n", " \"\"\"\n", " Sequence encoding\n", " :param in_channels: number of input channels\n", " :param hidden_size: hidden layer size\n", " \"\"\" \n", " super(SequenceEncoder, self).__init__()\n", " self.encoder_reshape = Im2Seq(in_channels)\n", "\n", " self.encoder = EncoderWithRNN(\n", " self.encoder_reshape.out_channels, hidden_size)\n", " self.out_channels = self.encoder.out_channels\n", "\n", " def forward(self, x):\n", " x = 
self.encoder_reshape(x)\n", " x = self.encoder(x)\n", " return x\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "sequence shape: [1, 80, 96]\n" ] } ], "source": [ "neck = SequenceEncoder(in_channels=288)\n", "sequence = neck(feature)\n", "print(\"sequence shape:\", sequence.shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "* head\n", "The prediction header part is composed of a fully connected layer and softmax, which is used to calculate the label probability distribution on the sequence feature time step. This example only supports the model to recognize 36 categories of lowercase English letters and numbers (26+10) ([source code location](https://github.com/PaddlePaddle/PaddleOCR/blob/release/2.3/ppocr/modeling/heads/rec_ctc_head.py)):" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "class CTCHead(nn.Layer):\n", " def __init__(self,\n", " in_channels,\n", " out_channels,\n", " **kwargs):\n", " \"\"\"\n", " CTC prediction layer\n", " :param in_channels: number of input channels\n", " :param out_channels: Number of output channels\n", " \"\"\" \n", " super(CTCHead, self).__init__()\n", " self.fc = nn.Linear(\n", " in_channels,\n", " out_channels)\n", " \n", " # Thinking: How much should out_channels be equal to?\n", " self.out_channels = out_channels\n", "\n", " def forward(self, x):\n", " predicts = self.fc(x)\n", " result = predicts\n", "\n", " if not self.training:\n", " predicts = F.softmax(predicts, axis=2)\n", " result = predicts\n", "\n", " return result" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In the case of random initialization of the network, the output results are disordered. 
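A quick aside on the \"Thinking\" prompt inside `CTCHead` above: the head must output one score per dictionary character plus one extra class for the CTC blank. A minimal check, using the 36-category digit-and-lowercase set this example supports (the variable names below are only illustrative):\n",
"\n",
"```python\n",
"charset = \"0123456789abcdefghijklmnopqrstuvwxyz\"  # 10 digits + 26 lowercase letters\n",
"num_classes = len(charset) + 1                      # +1 for the CTC blank class\n",
"print(num_classes)                                  # 37, which matches out_channels=37 below\n",
"```\n",
"\n",
"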
After SoftMax, the prediction result with the highest probability at each time step can be obtained, where: `pred_id` represents the predicted tag ID, and `pre_scores` represents the predicted result Confidence:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "predict shape: [1, 80, 37]\n", "pred_id: Tensor(shape=[1, 80], dtype=int64, place=CUDAPlace(0), stop_gradient=False,\n", " [[23, 28, 23, 23, 23, 23, 23, 23, 23, 23, 23, 30, 30, 30, 31, 23, 23, 23, 23, 23, 23, 23, 31, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 5 ]])\n", "pred_scores: Tensor(shape=[1, 80], dtype=float32, place=CUDAPlace(0), stop_gradient=False,\n", " [[0.03683758, 0.03368053, 0.03604801, 0.03504696, 0.03696444, 0.03597261, 0.03925638, 0.03650934, 0.03873367, 0.03572492, 0.03543066, 0.03618268, 0.03805700, 0.03496549, 0.03329032, 0.03565763, 0.03846950, 0.03922413, 0.03970327, 0.03638541, 0.03572393, 0.03618102, 0.03565401, 0.03636984, 0.03691722, 0.03718850, 0.03623354, 0.03877943, 0.03731697, 0.03563465, 0.03447339, 0.03365586, 0.03312979, 0.03285240, 0.03273271, 0.03269565, 0.03269779, 0.03271412, 0.03273287, 0.03274929, 0.03276210, 0.03277146, 0.03277802, 0.03278249, 0.03278547, 0.03278742, 0.03278869, 0.03278949, 0.03279000, 0.03279032, 0.03279052, 0.03279064, 0.03279071, 0.03279077, 0.03279081, 0.03279087, 0.03279094, 0.03279106, 0.03279124, 0.03279152, 0.03279196, 0.03279264, 0.03279363, 0.03279509, 0.03279718, 0.03280006, 0.03280392, 0.03280888, 0.03281487, 0.03282148, 0.03282760, 0.03283087, 0.03282646, 0.03280647, 0.03275031, 0.03263619, 0.03242587, 0.03194289, 0.03122442, 0.02986610]])\n" ] } ], "source": [ "ctc_head = CTCHead(in_channels=96, out_channels=37)\n", "predict = ctc_head(sequence)\n", "print(\"predict shape:\", predict.shape)\n", "result = F.softmax(predict, axis=2)\n", "pred_id = paddle.argmax(result, axis=2)\n", "pred_socres = paddle.max(result, axis=2)\n", "print(\"pred_id:\", pred_id)\n", "print(\"pred_scores:\", pred_socres)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "* Post-processing\n", "\n", "The final result returned by the recognition network is the maximum index value at each time step, and the final expected output is the corresponding text result. Therefore, the post-processing of CRNN is a decoding process. The main logic is as follows:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "def decode(text_index, text_prob=None, is_remove_duplicate=False):\n", " \"\"\" convert text-index into text-label. 
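Index 0 is the CTC blank and is always skipped; when is_remove_duplicate is True, consecutive repeated indices are also merged before the remaining indices are mapped to characters.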
\"\"\"\n", " character = \"-0123456789abcdefghijklmnopqrstuvwxyz\"\n", " result_list = []\n", " # Ignore tokens [0] represents the blank bit in ctc\n", " ignored_tokens = [0]\n", " batch_size = len(text_index)\n", " for batch_idx in range(batch_size):\n", " char_list = []\n", " conf_list = []\n", " for idx in range(len(text_index[batch_idx])):\n", " if text_index[batch_idx][idx] in ignored_tokens:\n", " continue\n", " # Combine the same characters between blank\n", " if is_remove_duplicate:\n", " # only for predict\n", " if idx > 0 and text_index[batch_idx][idx - 1] == text_index[\n", " batch_idx][idx]:\n", " continue\n", " # Store the decoded result in char_list\n", " char_list.append(character[int(text_index[batch_idx][\n", " idx])])\n", " # Record confidence\n", " if text_prob is not None:\n", " conf_list.append(text_prob[batch_idx][idx])\n", " else:\n", " conf_list.append(1)\n", " text = ''.join(char_list)\n", " # Output result\n", " result_list.append((text, np.mean(conf_list)))\n", " return result_list" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Take the predicted result from the random initialization of the head part as an example, and decode it to get:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Tensor(shape=[1, 80], dtype=int64, place=CUDAPlace(0), stop_gradient=False,\n", " [[23, 28, 23, 23, 23, 23, 23, 23, 23, 23, 23, 30, 30, 30, 31, 23, 23, 23, 23, 23, 23, 23, 31, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 23, 5 ]])\n", "decode out: [('mrmmmmmmmmmtttummmmmmmummmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm4', 0.034180813)]\n" ] } ], "source": [ "pred_id = paddle.argmax(result, axis=2)\n", "pred_socres = paddle.max(result, axis=2)\n", "print(pred_id)\n", "decode_out = decode(pred_id, pred_socres)\n", "print(\"decode out:\", decode_out)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Quick test:** If the index of the input model is trained, is the decoding result correct?" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "out: [('pain', 1.0)]\n" ] } ], "source": [ "# Replace the predicted result of the model\n", "right_pred_id = paddle.to_tensor([['xxxxxxxxxxxxx']])\n", "tmp_scores = paddle.ones(shape=right_pred_id.shape)\n", "out = decode(right_pred_id, tmp_scores)\n", "print(\"out:\",out)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The above steps complete the construction of the network and also realize a simple forward prediction process.\n", "\n", "The untrained network cannot predict the result correctly. Therefore, it is necessary to define the loss function and optimization strategy to run the entire network. The network training principle will be described in detail below.\n", "\n", "\n", "## 3. 
Detailed training principle\n", "### 3.1 Prepare training data\n", "PaddleOCR supports two data formats:\n", " -`lmdb` is used to train the data set (LMDBDataSet) stored in lmdb format;\n", " -`General Data` is used to train a data set (SimpleDataSet) stored in a text file;\n", " \n", " This time only introduces general data format reading\n", "\n", "The default storage path of training data is `./train_data`, execute the following command to decompress the data:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "!cd /home/aistudio/work/train_data/ && tar xf ic15_data.tar " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "After the decompression is complete, the training images are in the same folder, and there is a txt file (rec_gt_train.txt) that records the path and label of the image. The contents of the txt file are as follows:\n", "\n", "```\n", "\"Image file name Image annotation information\"\n", "\n", "train/word_1.png Genaxis Theatre\n", "train/word_2.png [06]\n", "...\n", "```\n", "\n", "**Note:** In the txt file, the picture path and picture label are divided by \\t by default. If they are divided by other methods, it will cause training errors.\n", "\n", "\n", "The data set should have the following file structure:\n", "```\n", "|-train_data\n", " |-ic15_data\n", " |- rec_gt_train.txt\n", " |- train\n", " |- word_001.png\n", " |- word_002.jpg\n", " |- word_003.jpg\n", " | ...\n", " |- rec_gt_test.txt\n", " |- test\n", " |- word_001.png\n", " |- word_002.jpg\n", " |- word_003.jpg\n", " | ...\n", "```\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Confirm whether the data path in the configuration file is correct, take [rec_icdar15_train.yml](https://github.com/PaddlePaddle/PaddleOCR/blob/release/2.3/configs/rec/rec_icdar15_train.yml) as an example:\n", "```yaml\n", "Train:\n", " dataset:\n", " name: SimpleDataSet\n", " # Training data root directory\n", " data_dir: ./train_data/ic15_data/\n", " # Training data label\n", " label_file_list: [\"./train_data/ic15_data/rec_gt_train.txt\"]\n", " transforms:\n", " - DecodeImage: # load image\n", " img_mode: BGR\n", " channel_first: False\n", " - CTCLabelEncode: # Class handling label\n", " - RecResizeImg:\n", " image_shape: [3, 32, 100] # [3,32,320]\n", " - KeepKeys:\n", " keep_keys: ['image', 'label', 'length'] # dataloader will return list in this order\n", " loader:\n", " shuffle: True\n", " batch_size_per_card: 256\n", " drop_last: True\n", " num_workers: 8\n", " use_shared_memory: False\n", "\n", "Eval:\n", " dataset:\n", " name: SimpleDataSet\n", " # Evaluate the data root directory\n", " data_dir: ./train_data/ic15_data\n", " # Evaluation data label\n", " label_file_list: [\"./train_data/ic15_data/rec_gt_test.txt\"]\n", " transforms:\n", " - DecodeImage: # load image\n", " img_mode: BGR\n", " channel_first: False\n", " - CTCLabelEncode: # Class handling label\n", " - RecResizeImg:\n", " image_shape: [3, 32, 100]\n", " - KeepKeys:\n", " keep_keys: ['image', 'label', 'length'] # dataloader will return list in this order\n", " loader:\n", " shuffle: False\n", " drop_last: False\n", " batch_size_per_card: 256\n", " num_workers: 4\n", " use_shared_memory: False\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 3.2 Data preprocessing\n", "\n", "The training data sent to the network needs to ensure that the dimensions within a batch are consistent. 
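As noted in the label format description above, every line of the label file is the image path and its transcription separated by a tab; a minimal reader sketch for that file (the helper name `load_rec_labels` and the commented-out path are only illustrative):\n",
"\n",
"```python\n",
"def load_rec_labels(label_path):\n",
"    \"\"\"Read a SimpleDataSet-style label file: one tab-separated image path / transcription pair per line.\"\"\"\n",
"    samples = []\n",
"    with open(label_path, \"r\", encoding=\"utf-8\") as f:\n",
"        for line_no, line in enumerate(f, 1):\n",
"            line = line.rstrip(\"\\n\")\n",
"            if not line:\n",
"                continue\n",
"            parts = line.split(\"\\t\")\n",
"            if len(parts) != 2:\n",
"                # A wrong separator here is exactly what causes the training error mentioned above.\n",
"                raise ValueError(\"line {} is not tab-separated: {}\".format(line_no, line))\n",
"            samples.append((parts[0], parts[1]))\n",
"    return samples\n",
"\n",
"# samples = load_rec_labels(\"./train_data/ic15_data/rec_gt_train.txt\")\n",
"# print(len(samples), samples[0])\n",
"```\n",
"\n",
"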
At the same time, in order to have a certain numerical comparison of the features between different dimensions, the data needs to be uniformly scaled **zoom** and **normalized**.\n", "\n", "In order to increase the robustness of the model, suppress over-fitting and improve generalization performance, a certain **data augmentation** needs to be implemented.\n", "\n", "* Scaling and normalization\n", "\n", "Related content has been introduced in the second section, which is the last step before the picture is sent to the network. Call `resize_norm_img` to complete image scaling, padding and normalization.\n", "\n", "* Data augmentation\n", "\n", "A variety of data augmentation methods are implemented in PaddleOCR, such as: color inversion, random cutting, affine change, random noise, etc., here is a simple random cutting as an example, more augmentation methods can be referred to: [rec_img_aug.py](https://github.com/PaddlePaddle/PaddleOCR/blob/release/2.3/ppocr/data/imaug/rec_img_aug.py)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "def get_crop(image):\n", " \"\"\"\n", " random crop\n", " \"\"\"\n", " import random\n", " h, w, _ = image.shape\n", " top_min = 1\n", " top_max = 8\n", " top_crop = int(random.randint(top_min, top_max))\n", " top_crop = min(top_crop, h - 1)\n", " crop_img = image.copy()\n", " ratio = random.randint(0, 1)\n", " if ratio:\n", " crop_img = crop_img[top_crop:h, :, :]\n", " else:\n", " crop_img = crop_img[0:h - top_crop, :, :]\n", " return crop_img\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXQAAADyCAYAAABd/T4iAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJzsvWuMHdl137tWVZ13P/luksPXzGikkazXVRwZvggM2caVkyAKAsOwHCTzQYC+2IgcCIiV5MsNcD/YgGEnAQwDA0uxcmH4EVmIBMNIoDuR4ySIZY1seTRPkcMZDptskj1s9rvPq2rfD6dOrd8eVg27h2RzeGb/gcHsLlbt2q+qU+u/1/ovdc5JQEBAQMDDj+hBNyAgICAg4N4gvNADAgICJgThhR4QEBAwIQgv9ICAgIAJQXihBwQEBEwIwgs9ICAgYEIQXugBAQEBE4J78kJX1U+r6iuqekFVv3Qv6gwICAgI2Bv0bgOLVDUWkR+KyE+LyKKIfFdEPuuce/HumxcQEBAQsFsk96COHxWRC865iyIiqvoHIvIZEal8oc9O1dyxA82Sf+GPi5YfLoWWFvmHog5Vno9zcL6raEuUn6+RHUuHWcX9Dfzd9H9EK/qMdjmXlR4vQ6RmdHn9xD0zrzHlt+cpvCP77Ypz7WTeM/LqcyhnpccJrZzTdwa9w7iN2lJ5NeqpOv/tF+nu7l8+GVWXlq0j71yOs1dzxTqv6kJlAyrOL9p355N3My5eE6vq5HM8fkbxLHjPkN+C8rbwecF91GHsqpruT8JtbeUzyuNZyfxfX9mWtc3erp+Ae/FCPyEil/H3ooj87be74NiBpjz9xY+P/sCb1ikG3WEySuYic3FRVm8y8UITOyfG+Ulct3JiQ8B6sgyDG1mdjUZDRETqdatj9da6nRuXv1DTNC3KwwHKmXUuwX1cYu3tDgal9Udu3Dy0L7F21Wo1qw/36fV61q4hFhrqSbPyl3Qt77+IvVD66bA4Vq/ZuY3E6stSu2e/3y/Kw6FdS3g/DFE5M8gX2vicqhcE61Dvx9LqSL33Y/n5Vddm2e2LtKoPu6ljN/3nmsrcaBzrCY5hzLmeWZ/X/2H5SzeObS3yLeb9SJc8oykGlG1l35LY1mgV/5vZpV493ocE2livj9Yo1z/XvMP948ief/aTc8H1rRmeab4v8PWSNPBaHTfe2TPcwLuD778enovx++2Xfv3PZC/Yt01RVf28qj6rqs+ubQ7ufEFAQEBAwJ5wL77Qr4jII/j7ZH7Mg3PuaRF5WkTkiVPT/CzHWbv5fdm9/a34Ba/hS5wfU5tb9st9a2OzKK+srhXlXtd+gMZfF/z1P3Jgrig3m0YltVGOE35Za2nZ+0LEH81my47j62pccrBEHCyUzR37slhZWSnKN1dWi/Lly4t2Lb5Ku91uUabV0+p0ivJ4DNqtqeLYoYOzRfns2aNFOUYbNbZxiQRflJiYzPv6s3Ic8YvWrq3n88svuJTz38BcRDZ3w9TuGce2Rvz+Y75gInO6xk2s+srmFyy/OLmOshRf8XgUGjVYRamtxSEstzEVNhygP7DWYsEX58D6RtByVTyLA4xRgrakQ8wdGjzuU4KxSGkVelapjXmC9ccv93rL5i6p23G+Owa0gHMKlMdmpm1dbm9vF2XP+sQ9e0Nrbx3WfaNp/adVsIOxiCJ86edWkkY2tp0a15/NYUqKLLfE98o23osv9O+KyOOqelZV6yLy8yLyzXtQb0BAQEDAHnDXX+jO
uaGq/pKI/FcRiUXkK865F+66ZQEBAQEBe8K9oFzEOfenIvKne7qmxJio3Igu2aCKuTu9C48YmsIRzLyN/k5R/h/Pfr8on3/NaAlY6zLew0vQ1mNzZob9ws/9o6JMk3Nna6soT81M2/137P5aQQVIH5RDDfTH1IyIiGzvmHn4yoXXi/KrrxudsrxslMvKulFLvT42S/tG0bgM5m/NzMVINqxduYFXg2nf6VgfHrlwqSifWJgvyqdPny3KBw8aLUM6YXvL2hhhQ6lex0brAOeM62DrMM+XXr9alF8+b+3qYiMwxdWdKaO54sjG6KM/8sGiPDNt9FMc55RH1za2mq12Ub6IuegP7D63Vq0PpBmSmpntw65RBAfmbO2cOn6sKNfz87e2bQ4bpE16NoadtrWb/COpiAQ03xCUw61bto4vLV6z+zfs/HZrVP/GutGWM9M2FicWrN1c57NNa9fy0pvWxNTWXDtf8yL+OwSMhyT5pmiU2HO5xXlOQUWBo+iBrnKgv9IBngtQR7W61d/Hc9Rs2wujlbdB6eXSAxUFOhfsozTbo7UQ7ZF0CZGiAQEBAROC8EIPCAgImBDcE8plz3Dq0Sh2vIIv8TxhSqiaisAWwvd+sPIgtfN7KO+A8RjC93Tsz1vDv6dqw0jTL4aZlVT4EtPjxgsKwk55a9rM7AzeL9dzGuX5l84Xx55/6ZWivLZldM4AFMo63EYzL/gCfvvw+IhhftKLot/N63dmhrY2rH2rK0tF+cqSmdxLy+a3f/as0S/Hjy0U5amWmdbq4J+bWp9q8LhI8vY6BHnt9K2fN1fsnpevGFWwhokeerSclaemjQo5fvx4Ua43zXRvjj0XYMP3MVdd0Fn//X/9ZVHe3MJcYPnXMP91ePYcO2TU1QB1njs7cjSr122cY8QE0MecXlGRWB2Rwt88s/IAcRMv/9DoqhdftnW3bVMknU4zr9vuee70iaLcnjKvqFZsc/j8otX96g9fLcorK7esT6BRSJFsg+qSfOwiUJgNxE/Ens+6lWuYu5jxMUKvHJwDSg+sjNTQxrFXVhxbfS3cM0LcwPxBe85PnB6ts7SShy5H+EIPCAgImBCEF3pAQEDAhODBUC53gTG9UkWtONAzPIXBDAzaoEGTeedbOSnTvoB1ymCfPjgXOCp4Hiwpw4cZ/AJT0NVAedRs1/z6Ldv9f+mVCyIi8tfPm5fo5SULGukYayHNlnkhRDChY9BFfdAVfQRWpH2rcwgPjbFV2kCwB8Oee7CCb940D4pB/7WiTO8jyim0T5gnRA30zwAePwm+R9yYLkB/uADYt60da9i2OW3IEPPP4J8oNlqEQTbKxye/f8bQeNyz2zObfKdrZbAGAkcUiUALMpL8+qp5Xx25ZeWzjz46KiQMh0d/hPQLvDn4kIBCcMrgGEg7YI7Y3lVjRaS7M1ovbcTDcVzqWM8xXMiubJkn0AvLy0X56pLdqD1llFsEiqTbBXU2rjsGhXoD6wbjSSbU03sqV/7w9Il4DmUual5AWZofs3PpWcPyuXMHi3Lr4IiWIlW8G4Qv9ICAgIAJQXihBwQEBEwI3lWUy66kVEvOraRfyKdESWlZY2ow8BzUQ7W9fAddqQwJDmdINThqacDmot4K7eIEpiipmGvQXvmrH5gq8QsvvSQiImsb8DIxBwKpQQOD/E9ryo6TCkrgNdGHh4insUILMD/uXPl1CwctUGTQM1N5B8EZS9duWLva1vgEapNHoQ9Tq2OMusaXFJ4bHHPYxNQgoQdRhAixBigvhZcL9TtitfpTeA6NA84yHItAFUXQSaHGSgPUWgaOKsEaSWDOr2/bOL62aF5ERxZG8kmnjpunkK8pAo+blJQbZY2pgkj6BS4caJdHP0QlZT5Drpzy6eL520aFN0HzrFAHB48oA6T6MRQ8c1osYwMgATMkncS1DeVPHo+pcMp3AWgxB9qR1F0tvxfryKClA/ZNdtDnLD/fVbzbqhC+0AMCAgImBO+qL/QqxXiK8NvXOP2nBf9uZX6ge1/0+OLP8GudsoxrB9xoLe6upf/ew6ZZ6vmVc6MEesz4WpPEfuW3NmwT8cXzF4vyd597vihfuzb6RZ+3vRRRfAksr9iXQK1u5fmDtlt67tyjRdnXVbe2eLrqGMf1tVFo95UrJq5589rNotztWR8ybKYyH8itNQvrfv0N2xRTT7PafNVPLlhnB/iiTfP1wI3tjHr02MXuYidyc4e7X9R1t8MJfIW7vfKNTnGjC/jVrnV8ocP62sCmbD+zOrp9u0+rZVZBBxuR3aF9oS8um+U29+ooJcGh+UPFsQYkG2L4ew+HthZSfJV6yoNqA0BrZWMTX5fY9PZs67wah7iOIaVE+ZUPy6U5Y6qlNchjyJrFEHC6Bl0bi01uNOdGIh0bOJ7DtFyDvwZFRE3Qdlgx/AKmn/uVNVvrmHZp5+UWHvOYLya8aLaGkL7oj6zYLPihBwQEBLw3EV7oAQEBAROCdxXlsquNzsIP/c51OC9dlJ1DuoT+sUPEXsNVWxzj+fNzoqjcnORGjGKTNYVpzXBejXkOwvrfNOrihZctDHp5BSZyfimV8Xag3kg/8EOHLST8A+9/X1F+4gkrx6BcEqg61tFGhj6P1flunDQz/8Y18x9+BRu4TLvHEPch5oIJRnTR1BFbUK+bQjlBSLzmlI6XOlDL54Ib0Sk2dOlX3LV9W0HUuLfp6Tw/9Npt9TnQTF04bTewmdcFhZFyLaLuAelFKPwN1cb0ai4DceOmKRxOQe2xBkkAPlvdoVEF5E08f3rcv9G0elpta0sXcQvj3VI+F6m34WrXtdpWXzqwOtrgvCB86afdxabkVAMJMaZG9TummkR5Y9M20ymfQPqTY8SUiZTnaHSoMspUdth0znmfJiicJhZaDNmMzrQ5BZQpo+wG4Qs9ICAgYEIQXugBAQEBE4IHR7lUeLQU/+zRL/Ed/p0JLsrrVebdRDmD2UqPFyZA9xQR8/phTUtKaQD4/goSFgzgn015gKhhZlsfZvniletWvmzqgDH9cHO1v2Hf8zEo8PhZU+b7EBIzPHrulJ1E8w/+xhnk4wYIqxb4RHdys/jMcaNcFg6Yzzii0OXyJfOZXkbIegxTOANFtrpmZvGVq+arPo2cpsePmcfLWBFP4dvEpAtxneHmPA5fdpjTMWixKKYPO81yeILk30akYbzweSZkgXdEPzXKY+h5TllbVpH3lrQIKaLL10aUyw8vmKzC3JTRGUcOmQeJJjYWmUDJEgvdo5Mwpux/BnXOIfM05A8PGD/Z3jFqaWvb+hM1jCL86GOPFeXD8HLZ2rZro5g5RTFfmNOp9mgNKtbWa6+ZkuOFV22Mri4ZRZh5iXKQ4AL1zM8dKMrHT5s8xbnHTU3SRcgTmnN3cWbHWqAKo8z6Nt2yMT9+ZHSfGrzNdoPwhR4QEBAwIQgv9ICAgIAJwbvKy6UKu/F+KQe8HMBVKMyYiJkqSJf4EQTWlvx05wUn8VyY3PCEoUpfb4AcmQmDRsxIXb5pOUDXkDRids5MyyQPIlpbNdoADi/y2JkzRfl9Z61cx457v2f
XtkD/1OBlkNKehrfOWFUuaSDPKcT9P/iBJ4ry5qb17QYSFgwR2JLAg6MHWmpty2iJmwgyOXXqEWtXTp1lTGTCePSIni12mCqMzG/ZQkg+k3rQW4UUWZzPNam9BmgeJnWgx08fdUd1zC1UCOllEZPmAV24tTbyEHr10uXi2OmTRgPMzFgwWQMeTKSZMvA5rmLsOEYDjB3DX2r5+qpB7bAF75gapBSoMJr2bS0eaeEZBXXEJBw7GH9SRElOHTJH7geQYGP9ulGYbw7oFYZEFnhHTKHtj580aYUPf+TDRbkzh/bGoC5zVzOF91Mbz1/mBd8hX2+RX7ci6U8Fdv2FrqpfUdUbqvo8jh1Q1W+p6vn8//NvV0dAQEBAwP3DXiiX3xWRT7/l2JdE5Bnn3OMi8kz+d0BAQEDAA8CuKRfn3J+r6pm3HP6MiPxEXv6qiPyZiPzKnepS9XVDDBW/L3fwiPErx2WeZwGSUIDyoKohqRUGFtVw/0auApihDnotbGyaCXUQgQI17MI77NQPYVItXTNTcOm6ebkgfkOmYf6tb4yoC+6DfwQ0x5PwGuhQSwYUShM6IRGEJZgEo+6p/aOYm4WRF3dlS2oapvIjp8zkXUMA0eKS9ZNtIRVz/bol9Th25HBRvoHgqxPHj4zuj8Ywp6dC1ySC8mLqzMxtNKFOiQAlanZQh6Y9ZTSGywNruK4Z5MX+NJj54ZZRSJTtiJiFAdf2UtJ1dq9me9RG5sh97gUL7JqZNa+RR08bVZVt2lwMmWuXOXUx6R3kt60h4qrRRAKRrRHNUJ9CEBqEXzyNJfSngfVH7ZNhZtcyWLBOBVPWqqPzW8gMsrllAVdt5KjtQFgFTJCXPGMI/RzXted7BtRVgiQwmiE5TD53UUbdH2gGeaqteF+NqZrsPlEuFTjqnBv7o10TkaN3WV9AQEBAwDvEPfNycaOdy8qfE1X9vKo+q6rPriLrfEBAQEDAvcHderlcV9UF59ySqi6IyI2qE51zT4vI0yIi7z81szc74h6A3jFewgqUaS5TS4HnjA0kivtnFYwQvQaGXtCGYQCvDHq/MJdggzKsoJGauUfJTMuOHZ63fekZJLioocH8FVeYghG8JmIGk+CeEU3E8TF2yOMN7E6HDlhAxsGDFhB09boFdvSQ+KLuaZZYPTdXzXTuMfgpp1eYSMHLF0uPJy2n4jj/HOeqwDXveF52FZE/MaV8B31cxnUJ7RPOP6iNA1M2vzV4n2zcGo3j9rZRKCvrVr52w+ipo0fNkG60LFArE6MW6MFSAxWWeuNbnmxz7IjiSgICbwP6rKBcEurKYs3Rc8zTm8E544CvFHUz10ujiXlJoL3jqerSQ40Sx0bR9bpWngJFFrGi/PnKQC0lWCN85qil64rj+0u5fFNEnsrLT4nIN+6yvoCAgICAd4i9uC3+voj8bxF5QlUXVfVzIvKrIvLTqnpeRH4q/zsgICAg4AFgL14un634p5/c+21VSn9LKnKKljdob7n2aH5X0Sx+Geff4WcvA23Bc1OYamnGIAgDvR+63W7pcWYPUozROO/kPIJGDs8btdEBbeEyM6cjeDMITHuPcqE17UjFIO9mSZCXl8YVf83Ooo1HTPuldt761oXebxMSs4rgk1XQCJtbNl7jIJIophlOqVs2DBogqDuB2ZxBzIdmvlbQdapj+V4bKy/ejEE71PVArtceaTzMfxvze/LkSWsv+ndxZ2T+r6+ZTs7ahtVx+Ypp6Rw/YcExC0eN/opI/8ETqoFsS6Q5MqxFj4gbNwuMS6bllCOvG+KvDLpCDpye82R9DWVzTZ0mB4+UFBmoUni5IBmVZGr0HwSGZRMeLMweFWVGS9URIJiN3wHMUcoMbIJ+sr15/3Uv70QJof8BAQEBE4PwQg8ICAiYEDwQLRfnnKfPsu/3r/BgobkOS9xLvKxjlxaakBV9oRwpvRYiVD5AppcdBC0wkW0SGXVCXY80zQMommbuMaNPDFO116NnBWgB7sizH17i7XJPjDHtFalPLhT/jD7XoFkzg4AjtpfaG7ynR4VgTDe2Tfujm3uOtBQ0U1Q+L14CaswFy29xebDD9P7R2z1aKN/LMUywtiDlIc0mwsKG0NiBJ0wzsXYdO3ykKNdQ0Y2rb4iIyC2waZSvvXbTgrMuI6n33KxRWwn0ewbQh+6D/iEX5yXkxr3GHi/0IKpK2E6PM8bQcO7oWJXK7V5Wt9WTr2/Ofhc0Vw+0SQ8nkfIiozvA+6KHbER9JBXPPAqY2bPTvA+knOhBxuOgZT3vl90jfKEHBAQETAjCCz0gICBgQvBQyOfu1aOlDBk8NarkeCu9XBj8EJV4dsCcogJvBrOd3g9JvZxCYGANsyQxS9CQ8rDjPpEewrk0T0n5xAm8CTgW9MSAjwaNP/UCa8Ynl3uTxJDV7WHnP8EYdkC/rG1B+wSBGJk3GNbZcZJqEZFBHpTVgmcPUT3ndo56c85rkdWKHk0l9EuKcxNSOxGpGMjuYsRienCkPMeunemYDkwHdNWhA6OMRNfqmB/UvQHNltcvv1GUDx4wDxp6v9QRidODZG5co8R0VbCW/3+Rt6xFb0UhSTq9P8hmZeVrkXC+z8voXHVvOXJ7WwbecdSAtg9R7jt6xSCwDddG+CvLa2XQVOxJI7nbzh21d0wbBS+XgICAgPckwgs9ICAgYELwwCiXcs+Qu6dWqkDKwWl5phM/Ga4djaPbvVy81sO08/RguJvPgCDwMg4mXDakJwrq95LHMNn1qO0ONIt6gTWopOo44PkPkKKo0i/Jx4KmLRMd12rmNZH1LAiIQUtN6JRQ72RAXRuUI9yLgVg+LXM7vATIFfNSWYZp7TwZVNaZm9b0IMLEIa5FHDyYsiFpMTuH9IuArotQngb9cvL4cRERWXzDkiHvbJrWCIO2rt0wyaXXcH572jxeDh00vZcUlEsCXReFl02Z55inpYPxZ9mjakCtxBmfkXLqhCBFOD6HTzazlDFLUlzxyqFnDR2euC53kG2oN2vUocCjZhxQpaiQ7YqqKE831iYqb18Vwhd6QEBAwIQgvNADAgICJgQPzsvlHniu7AWkM1LIemZZ+W+ap4xKc3JsFnleEPSgYCXlUqpJrVwPxaMN4EzATDKC5MVxTmkoAk883RGYxAyOGnreGegHmu4HSuBfmAVoPDD0QiD9BA4hqwjUodcIg6zqDTNhHeziLsxc6t2M54CUF70mxJEKwXhSj4S0ANpLKsTX5IHnQomXixcchXlpoD/bzIyEIadOS5ba2u3umLdKElmQ0ZE8k9O5c+eKYxdfPV+UV0G5bO9Y395YXCzK8/NzRXlm2mR6mbC6DsrFC8QCxjRDFc2SSjnlkoBmSSrmJXblz5RHI+ZrIMOxOjJWNdWeIajqesnD+XwnfCyRvDzF+utThwbBR2N3NWqypJhzvoDpzUQqdi8IX+gBAQEBE4IH9IWub/kCfHvc6VenqiZex1yQGT/QKrZZeG3ETcG8XNV+xYanJ8BPX218IXBTNs3o14vclNhEixGePU5w4CJ+fRscepH6O1RFMfF6Wq6I540Gup0WGz
f4muJHMWQKqjYtuck0QMKKZgeWiJZ/ofPrvrin153yueXXj5eLkl/X3qYoLRR8XWYlm1iwVpBe1fuCrHsbi5B44Icd1QMZto4EFgqrc6Y9qvPMafMlX1q6WJQ3TYTR22i7uWLyCVeuWRKMhUdsw3m6brlxs9ja7hSb3p7aYckTW/GV7X2hYuyYYCTynqNyhUc+jfbc8TpuMuP5w5hHFRXSDkmwwBI8r2yjtwiL9pZv7GZ+w3H7d8ZghC/0gICAgAlBeKEHBAQETAgeCOUSRYm0p0fC+hsb5ivbaplfbZ3h7lQYzFXo6nVQG8wt2TdTMYqte8MBaAb4PmdQte/UbCOuGZlp2d+2+sdtHEZmbjqYcKtb1tapjoVVO/hkr6zb5t8wtWtbHUs2sLH1elFOW9gsAqXTzzfLNjE+2z0zz4cpNpwESTJozSGRATcUCdIlA9IPeQyzRwNh82kbfuIxxny4AbU70EkOG7c9qA1ys7QFqYAMRnc/HZ0zSO2eTIwR12Ee0yk8vj3cWkQkoVIiNtE1xbqEIuF4E7PGpCIY6D6omqhhY5Q5o5A4jkNQUQqpgBae2N72Cto+qv/4UfMlP3rQxmp7zegUiHrKxrqV31i0+g4ctfKhoT2XK5s2RluQIdzq21g0oryR2GWM0P/5Jny2+5A48NQuWUR8Bpefkn65fdOxi01W77rGdFHu4hnJSF32rb0NNCvtIvEH1kVjYOUIVIwb57oFnaP4g/IRmWOMSUm8yy4QvtADAgICJgThhR4QEBAwIXgglEvmMtnaGZnGSzeWi+MRfl/oqz1A2Ljk9MrcrO28z86YCVWrm2fpEDvPMU1r7pTDl5R+pUhp6KnmjUN1mbyB9Xm0BaUX6X+CcxKY30lcTot0ocJYxzlRTldsbZmnwuYG7OlD8CwA/ZR2rb4G21iVtIM5OJnfMacZUpiQpLy4g1+DZ0+V/3KKPJ70MSdd1qzBtAeNMx53ep4oE1x4IdZZ6XEigm+D58HABB5MvDA+nHqcAG5T7p1BhT+OCr+0SP9QWiChOmEuZ8Gw9jNnTxTlLagtLr5xy+6J/BobmzZ3r12yHKRR2/zTJbHnK0GsAJ87l1NkKeivAdZwv2sUSh18RgYqzCEOoPqrE2NKJ5Z8vlIvlJ6KjXbPlDQHvVM8jxc833ynUKqBHjoZ41zS/P+ogx4sXsJiut/d7qmzG+z6C11VH1HVb6vqi6r6gqp+IT9+QFW/parn8//P36mugICAgIB7j71QLkMR+aJz7kkR+aSI/KKqPikiXxKRZ5xzj4vIM/nfAQEBAQH7jF1TLs65JRFZyssbqvqSiJwQkc+IyE/kp31VRP5MRH7l7epK00w2c5pg8aqZdpcumvB+gnD2QR9qc/n/T520sOcn3/9EUZ6fNwOhz7ycSbnJX4ts17oG66aBkRlg17oejc05BkqYCdmIkEczBoUBU5nqjZ2mtWWqbTeFE4/EsO3ZrnHwwaBriSF2Ns1tgQpvrabRE92+0TIMSWeQVVYRYh17pm0e4p1yp57JI+i1Yf1nftMUc0uFO9JvNZil9DiZaRvt1swDXiJKEzB5BKmYIY/bPZOS5CVvBz9YKi+zExyrisA273QvUKn8np7J7+V61dtudOrkyaK8gQCiWzeNfsng5bQBuu7y5ctFefbw4aLcnjtg98R4xfBuynLKhcF8fc4//mEK69JTYWQwEcfOo5l4TsnYVQQkVYGnkMKp+urdq2rnfuAdbYqq6hkR+ZiIfEdEjuYvexGRayJytOKygICAgID7iD2/0FV1SkT+WER+2Tm3zn9zo5+j0p8kVf28qj6rqs+ub/XLTgkICAgIuAvsyctFVWsyepn/nnPu6/nh66q64JxbUtUFEblRdq1z7mkReVpE5NETM26Ym0gbm0YXrJol6CniDSnZkR+enrUd9DS2wIfmlAXnDKFMl+3YfRLQJXVQMSwzkULEPJK5GRd7O98IjqAXAspDBEdE0MBIEPDSQdQI6ZcdqOM1otttwYxKeutrdk8cr89YwMkAlA9znfo5NWmuMgkHk1ncrnAYoxJNzJy+hXlevvFmUd5Yx6QDjcRM+KGnTmftmp0276Zm7q7haXOgvoRUECkPOh95XikVVFRFooxiGCs+kbIKKT9X4XBDB53dZDkYnz/o2Tg32+aFcuyo0SbXD1vQ0NbGlaKM5SIxoo+Wl80Trb5j3io7eKY8NcW8LTXQY0NHjSMEeSEQKxvCm6156UtRAAAgAElEQVQqwYoA4EUyT7V0fJ9yhUPPgaVCbtRXUL1zwN2dKJfqJB13omr2RtnsxctFReTLIvKSc+438E/fFJGn8vJTIvKNPbUgICAgIOCeYC9f6D8uIv9ERH6gqt/Pj/0rEflVEfkjVf2ciFwSkZ+7t00MCAgICNgN9uLl8j+l2sv9J/dyU9VIktooEKHeNLqkM21URIqd8BiBJf1cM2KrbybcTt+60cvMhNvYMYq/34P3SRPmNwJemFSA3gLDIT1XbtdmiKC76SdPMPNUMpiTOD+Orb2dttEMBw+YDszKZTORhzCFdWznw8tk7dat0vJcywI/lJoV1HLxDDaYsPAWGsBbYWzmxnDJqSMZQtIyL5RLS0azLF03E35zo1x7pSr4yw3tnLm2jVE9D5Ci14bC5Gff6FkiHkVDWoAaL+WUC/PUjuedsrukDSirezfJXVi/5xUzpgBBoW1vmU7S/OxsUT59ygKOFq8YQ1qLoVMC7aHr168X5WjN6txEsFLkUQN5kBfyeDrMxYA0EzyhUjwvWiJZLfIWWVkGwnk6tKP7kzajZxGJCeZ9ZYAc62OiGIKeS0NPHpheX/n7wqNW0AV2p/Que0MI/Q8ICAiYEIQXekBAQMCE4IFouWgUSS0PKJibN6+Ul1+xwCKowHrG3NiiXoGpvrxm5dkdM3d2+vZ71YTuRALvC9pijh4awuAjSO+OPWFgbtM8jZj1J2X+SUZHwLUC0rudjlEux48dK8qLNyzdDOVxx4FQMUzCLXgeXF82ydQDM2ZyNykfDOnhtyREtbbDK4e6LkVKUerBwJuhC6nRN1eM/llZsf6AwZEEdXd3EHyEPh+cs35QbnnsaTPAvGBaJMFSj1w5/cKgHZrQAm8lL08mvWXyiqJdmNAMCPIyYzEDDrNXlWRGemtbxh2hp04P2Z1mIDu8cNSC8o4eMu+Xna5RKwycu7Vi66je6eEc6taUZOmJIF+NU7chmdvrw4PMCyzzkvbaOZ73mZ3CXLZFJi3muo3KvUWc5821N4+SPXm5eG+x8nsGyiUgICAgoEB4oQcEBARMCB4M5SJmIs9i9536oRmCHGqwncceGoPMjnWH1GmFvsc0tVHMzEtxn+2u7dpvbNtNh15mHujA5LvyqZrp2WyaZwdRlRiZZhY9JRpNC/5ZWLBkv3OXzBOhuwzp03wsKCO7vWP9fGPRdHJmpm2cjx8xM7uGBMBDUERe0xFwVatZOcttZPZhu2sm9IU3Xi/Kr1y08gqCibTObLyga5CFqYPxPX32bFGuN25fvp4ZTAoJtBCTGGf05kE9pFwicAEe5eJJv
I4DSKwOrfAaIp3HwBYvyAXwzHJX4aKT8z+k+WqYNwear9Oy9fLoudNFmXN3+QayIWF+mdSbGZaI8dhleL1wnPvwYKPeS+x5tqBCeqvQQ4jDWxIU5OdFL38Wq+AHFpXPkYMnVOY3oKSe8sCiiil/xwhf6AEBAQETgvBCDwgICJgQPBDKxYmZQDXQBc2GeS30B0j2DF2Pfp6YdwdZd1bWbTd/bduO01RO6lZHFxmQlldNVnQFYjKb2zBda5TeHZmlUKyVGJQQJUWFmhUIrPCMP9hf9ZqNxcycyQB3OkaL6Jum1TKOJ3KYxrVNo5DSgQXzzMxaME9cs3E+cfy4tQs8Vx8ZlhJ4K9RAf0S5yd9FgMlm18bzuRdeKsoXr5inBGmJFoKpGHySRjaP7TnTbHn08XNFmYFIWW6jp1XaF3TCwBylNI89WY+svMyAI0+fY1xmdivevtxsj3k+TfvypvueGCnppZxmACXS7Ng8O1Ilauvs9Amj9q5etTXy2mWj+Rp1UCegfBhMRIrE6A2OLbVx8FwwexUTNnvBP+Wo0kcZ0zvl8+OPf+UXrZZTNHpH7ZWKtrJ9ZBl5zj2Q2g1f6AEBAQETgvBCDwgICJgQPBDKRZwTzU3Xg/PmfXHwkNEMm9sW5LC2YWZ8PE5GiyCgVy6+WpS7sGdOnTQ6IR0YLbC2bp4il6+aaUmNEQeaxyFoIs3t8nrbji0ct8wwMYKWatA1odcOk+T2kDzXgeaoN0yn5MkPfrgov7mCZL+LI0oFKqkyd8A8WJahwfEX33uuKF+7aRo3p65bkM8JjNdhZKkhbm0avXVzdeQJcfHixeLYhYvn7VzIFyPeyJPpTUF/9JFJqYYk4QugBRoIvuojIfUwlxh2MJUj1NEFhaeJNaCNBONDUBiNFii6bSTeRhBXo435HYw4OJrNzIzl0SwVwSw8p920+1MSlwnT52at7avro7VQQ6avdGDrjOuvBV2f+Xmr49Qp0C94Fq6v2nrZSRFY1ENGLAR5tfOx49o+hLZSdndj257tBrx2aszwVcFEZPQ+wvExpUNZZ77oGg2jSKnZQ82gNjIpUT9nCO+rKVBanocSFniU05VaoZPsUVglGbPc/UoSHRAQEBDw7kZ4oQcEBARMCB4M5aJSbIvDKpIjh0zX5cLFq0UZKqBycGFkUtab5vmxBi+XHzz/N0X5/IVXinILyZj7MJu24C0zZJQDPS4QZNTNdUViGHkHoEfTqJs5O4QJN+gz7RIy/MDMGsIrhsE6hw5YYt4PfvCDRblW+6GIiKytmOdLv2f3aU4ZbcOMORffsLF99TVLBtyB3kcdXkHshxdYkwer0IRe20R2JbOyZX4e9AT6v7Jqc0eL8/QZo7FOnza5VzAK0gePUATO0JxFOqIBAmt2+tbebeid0GyuUScE8TuDga2XPsqanxR7njJIbo6Fvg0KZ9Aj5QIPLXiW1FFOEnqLQAclL3OukhopH7t/HxROBi7s4KytlxPHTO9lfdvOb2Bd7ECIx2Eusiz3qBniWGrlGJTXDDy4uptGhZKKIF2hfsSRlZQeN5qfa9eR/iGFklQEJzHIrs6gtJRJte3FdHjukNXv6dCM/kdvqqjCm6UssGyvcUfhCz0gICBgQhBe6AEBAQETggek5eIkyW0R7jKfOf1IUX71kknprq8ZpbCxPtoVr4NaYP5dJobd2DSPgP4QXiueJ4INgcbQh+niWmTPnWmPzjk8P1ccO3LYzK02vCMcTHLujis8aGoI2kmhMeqQJenAnFEu7ztnWibDPLrpbzA+awiO6kyTQrFd+xRZfeKGtWUT/RxuGS3AwBoGyIylhGst06A5Nmv/vrFtHjR9eGdsI3kTPRgee9Tm/2993Dx7jh6wsd7egJYNzPgsT9RNzxJImUhCzRh4UFD5l8FnzDpTB3VDuiDNkO1pvI5B1VD3JIOpnqAOTItXdwNZoNqkX2o056kxnd7Wphj8D4OW+qAZqeQ8D2niM2eM5lpasQC165A+7oEuqrcgVZ2PI+WFhz3zZuEcbm7aPRPSeRhIZgHzyswIDozpJYf6GhjoJjRo2vBU8nRq8KnbQCL3ukfRoI08vgfZGF8npuT4HjmX8IUeEBAQMCF4QH7otonFTQ76xH7sw7dv/omIvHJ+9LVw7bp9ZcyY+7pMTVsdno83N9C8n0JKPNov9BA+vNw4eeTkKPHER598vDjWblgdDZzb50YQ/U3x6ZLhp51fq164+cA2peo1O+lU3haHnJ8XX79UlG+8aeH2O8hF2mzBcR1I+bmGealD+oCbjr18c62PjcVaA1/wGGZubFMk8dz7zPf9Rz5kc34M8QluaJ/0ibMxbSTYAM19yLnJnPBLPLv9a/at5Qjnp/iijtl9bEQOsEbGX25eMhZPvdKu48Zlys3X7u2bnCIiTUg1pCnrwfrK7899fd6nmVAx1E4a9CGxMGWtP3bIrMITxywmYRsbqiny5CZYvK1alLfV2qIYW82wQcl1zvWH/vNZ4EYn3x0ZymN5Bk8+Adcx1+o2pDLwge4pQqYYI6WwKiwkT4URa7DMiPCyr1KGIOLX+rh8n/zQVbWpqn+pqn+jqi+o6r/Jj59V1e+o6gVV/UNVrd+proCAgICAe4+9UC49EfmUc+4jIvJREfm0qn5SRH5NRH7TOfeYiNwSkc/d+2YGBAQEBNwJu6Zc3GgncbzjVsv/cyLyKRH5hfz4V0Xk/xaR33772jKRPIR4CD/YGja0TiyYb3fSeLIoN1uviYjIhdfNf7oHs4Yh/hsb8P2kqBvOjxMz7aanbbPkxFEz+Q+ACvrQ46NNyQ88fqo4NtiyjSLa5wk2cOoRTE7QLEykESFRR7OG/JrrtimVoP4TR0ebhccO26bh0WPGPz0PtcPLi9eKcg/0S50h5qAQvEQNaK8DRzB2P6dZ2RrYhScWrK2PHLGxPXnSNnYfe+wxuxbh4zurFnqeQLbw8Jydk8H8d+nYzMfGGqigDL7nCegX0kKkwmjmJ8jjwE38/tDqrOWbywp/cyoz0iec1ALHGQyJxGSIHKkbhN6DiopzzoW5U7nJxye9jk3BFLRNbwcSG1hnJxfMJ30ba8c5k5bYQv7YnXxDPcXmdxPqmRElLjC3wwoqjHSNIKlIFnHjlDTq6LhCPrMDuYMMfaZb+0wbMSHY8GV8xFTLxm5uxnzoXeUG7Xhzs0JV09sIvf26++qHrqqxqn5fRG6IyLdE5FURWXWuGOVFETlRdX1AQEBAwP3Dnl7ozrnUOfdRETkpIj8qIu/f7bWq+nlVfVZVn13bHNz5goCAgICAPeEdebk451ZV9dsi8mMiMqeqSf6VflJErlRc87SIPC0i8r6T0y4rfKFhfiLZZw9+4HPTZmb/6N/6iIiInD5jlMfSslES61tm2nleLvAlpo95Epl/6pEjZloePXq0KM9Pm2k1k7toZMhF6nrIkalmkiXerj2dU2l+Q2EuoySA1UNvCVJHkevm94TC5IL5xM9M/x9F+TLyi16+CvoFY7S5g5B0+CpT7J9mbLs5MksbMOFnOjaeH/2AeQIdhi/5gYPIaYoEJwO0
JYErTMwQeobbD2nGj9rY8+L04cEBCuuREza3hwdIkgHnYwdznv7hh4+Y90cdqn1xzt0w/ybVGxn78Nhjjxblo/D332F4Op6L6SmjxaiU6Og5lXcjZpIIJo/wcqRineEcxgo4rL8DoBaOHLH1RZ0DrqOxR0mG52waeUynsEYG8OCKvIQU5QlGYi/0v5xyGSsUpli3G+tGi+5A4ZFpUTtti6cY1LD+Mc4tUJQ+deYlk7U+jZ9NaFZ46pEV/Xmn2IuXy2FVncvLLRH5aRF5SUS+LSI/m5/2lIh8465bFRAQEBCwZ+zlC31BRL6qqrGMfgj+yDn3J6r6ooj8gar+PyLy1yLy5fvQzoCAgICAO2AvXi7PicjHSo5flBGfvjeU7AoPYX6l9EpomCk0MzdShKvB9Jk7aJ4dKeiMCCbhAMkTqBgYIfSeOU078Ligye/ynJndLaNZWvRgYQALzO/Yle9yu4iKeDhfQLPAs9/FoI7yMepCyTGuW9DQwiFTz5ubNbN5AZRDDflQ+/CKGPSpnldu8tbykPQWwqqnmjae0zDnW/DyoPJcfwvmLzxEpqYQTIO1sLO5ivPhRZTX7+AFESOYbB5Kkk+cO2Pn1M2DaQAqYhDDWwZ1dtCPeu12SsNXo8Sc49T3PWGePQxg8QKOQBfFkc3LVJuZQhD8kt+XwWH1OhOs2LnM15tA7iJFe/te0JLN6ZmTlgTj0GF77riOamPZAgTk1DDnUw3SktadoScxIaVlhuV4OU29M/IgM8znAQSqffDJJ4ry4SP2LEzNGC1IyrGJeY5BhR06bJ54G1iXlAEovFXgKsNcs6STGHE0pnD2mmU0hP4HBAQETAjCCz0gICBgQvBgtFzEydi7hV4TO9tmWraaZnI3oD3Syz0xuhDdr9Vg8mPb2tt5B20yNWsUTh3X0vtl65Z5gkTY5Z7KzckOAgxiqCRSPH/ASBGgpnZP6mqkNNWYvACeMArzdzr3eGi1MIagX7Y2zAx0sAMPT5sp3kOSgg4op1psFA1Ne5qLY4bEy5cIU92BthkMrI2pMPiGOUXt/O6O9aMGt4BOx+aOXi5j+mdAMRPQDzEUKWagQtlqG+WyjWt7amsxhfcR70mNDzOOrT8M4GnGNubbULVsYP214AkSqbVRoV8TK4O/UM7pPebFbLXt/lsICOrBm6kBmoF5XAdb1CGCJ1aDVKStRc/7KqflMiyWukc52Nh2kV83wVh4vh9Yf36kTbkOSqHVhEqWr9vz3OhYnw8fMNqoM20UJSmXqSl7FnY2LL/q5joEiqQqWCjPb8pvZzzzfHQGdJSpSIJxJ4Qv9ICAgIAJQXihBwQEBEwI1L3DT/u7uqnqsohsicibdzp3AnBIQj8nCe+Vfoq8d/r6bu7naefc4TufNsIDeaGLiKjqs865TzyQm+8jQj8nC++Vfoq8d/o6Sf0MlEtAQEDAhCC80AMCAgImBA/yhf70A7z3fiL0c7LwXumnyHunrxPTzwfGoQcEBAQE3FsEyiUgICBgQhBe6AEBAQETgn1/oavqp1X1FVW9oKpf2u/730+o6iOq+m1VfVFVX1DVL+THD6jqt1T1fP7/+TvV9W5Hno7wr1X1T/K/z6rqd/J5/UNVxNo/xFDVOVX9mqq+rKovqeqPTeh8/vN8zT6vqr+vqs1JmFNV/Yqq3lDV53GsdP50hH+f9/c5Vf34g2v5O8O+vtBzLfXfEpGfEZEnReSzqvrk21/1UGEoIl90zj0pIp8UkV/M+/clEXnGOfe4iDyT//2w4wsySnAyxq+JyG865x4TkVsi8rkH0qp7j38nIv/FOfd+EfmIjPo8UfOpqidE5J+JyCeccx+SUUqtn5fJmNPfFZFPv+VY1fz9jIg8nv/3ebljsvt3H/b7C/1HReSCc+6iGykL/YGIfGaf23Df4Jxbcs79VV7ekNHDf0JGffxqftpXReQfPpgW3huo6kkR+Xsi8jv53yoinxKRr+WnPPR9FBFR1VkR+TuSJ21xzvWdc6syYfOZIxGRlqomItIWkSWZgDl1zv25iKy85XDV/H1GRP6jG+EvZJRec0EeIuz3C/2EiFzG34v5sYmDqp6RUUKQ74jIUefcOKnnNRE5WnHZw4J/KyL/QiwJ4kERWXWWXWJS5vWsiCyLyH/I6aXfUdWOTNh8OueuiMivi8gbMnqRr4nI92Qy51Skev4e+vdT2BS9D1DVKRH5YxH5ZefcOv/NjfxEH1pfUVX9+yJywzn3vQfdln1AIiIfF5Hfds59TEb6Qx698rDPp4hIziF/RkY/YMdFpCO30xQTiUmYP2K/X+hXROQR/H0yPzYxUNWajF7mv+ec+3p++PrYdMv/f+NBte8e4MdF5B+o6usyosw+JSOeeS4310UmZ14XRWTROfed/O+vyegFP0nzKSLyUyLymnNu2Tk3EJGvy2ieJ3FORarn76F/P+33C/27IvJ4vntel9HGyzf3uQ33DTmX/GUReck59xv4p2+KyFN5+SkR+cZ+t+1ewTn3L51zJ51zZ2Q0f//NOfePReTbIvKz+WkPdR/HcM5dE5HLqjpOQvmTIvKiTNB85nhDRD6pqu18DY/7OXFzmqNq/r4pIv8093b5pIisgZp5OOCc29f/ROTvisgPReRVEfnX+33/+9y3/1NG5ttzIvL9/L+/KyOO+RkROS8i/5+IHHjQbb1H/f0JEfmTvHxORP5SRC6IyH8SkcaDbt896uNHReTZfE7/s4jMT+J8isi/EZGXReR5Efl/RaQxCXMqIr8vo32BgYwsrs9VzZ+M0g79Vv5u+oGMvH4eeB/28l8I/Q8ICAiYEIRN0YCAgIAJQXihBwQEBEwIwgs9ICAgYEIQXugBAQEBE4LwQg8ICAiYEIQXekBAQMCEILzQAwICAiYE4YUeEBAQMCEIL/SAgICACUF4oQcEBARMCMILPSAgIGBCcFcv9EnODxoQEBDwsOEdi3Pl+UF/KCI/LSMVs++KyGedcy/eu+YFBAQEBOwWyZ1PqUSRH1RERFXH+UErX+izUzV37ECz5F/4o6LlhyuhJUU7pqhjJPVc/IHLrOwq2hLx/MjK6TArO90DfzP9H9CSe+E+zrHuisqBSM3g8vqKe2ZeY26//Vvby7uy314v8gt4z0hv//dROSs9TmjpnN4d9A7jV/1dg3mvGKPdLNQ73X9UZ/mEVF1atpa8cznWXs0V672qG5UNqDjfa+OdL7jj2HjPT8U/8Pn0yvZMeM+T34LytvC5wb3UYfzKmu5PQmmZzyqPZxVr4Pzl1Tedc4fLWk/czQu9LP/e337rSar6eRll0Jaj8w15+osfz/8BD7lioB0moGL8MxezfpRH16rYv8c4N4nrVk4SXGd1ZBkGNLK2NBqNolyvWz2rt5BhLi5/oaZpWpSHA5Qz62CS38sl1t7uYFBad+Q1EW1MrF21Wq0oO9yn1+tZu4ZYYKgnxRiwHzWMAV8m/XSUdrJes3MbidWXpXbPfr9flIfDoZTB+2GIyllB3p/nVL0cqs4Z15N6z135uWXXiYhkWflCrerHburZzRhwXWV56s96gmMYd65r1sf
7cz0QcWxrkm8w74e64llNMbBsL/uXxKO1WsX/ZnaZVwfvr2hjvW7rlM8B177D/ePI3gXsK+djvMZFRDTDs53Xk+ELJmngtcrGO3ueG3iH8F3Yw/PB99z/9YX/fEl2gbt5oe8KzrmnReRpEZEnTk3zLY6zdkPl7/5TTTHRNby4+dxtbtnk3trYLMorq2tFude1CfBebFgkRw7MFeVm06yPNspxwpexlpaLdY8HoNlsWYPxEPq/7TbpDr/6mzu2AFdWLOn5zZXVonz58qJdi5dYt9styvzCaXU6RZlj0G5NiYjIoYOzxbGzZy1vcow2amzjEglePpiczHtRWDmO+PKza+uYYz7wKddBA/MRWduH6ei+cWx1+P3HfOFLiq8+fgNUvZT5wuMzznHMUrz0McmNGn5IU1uTQ/zgj62n4cBuFOMHPha8nAbWP4IfPIpVNkjx4YG2pEPMHxrMPiUYj5QfE97HzGjsE6zB8UteRKTesrlL6nac75ABP5pgNfP4zLStz+3t7aLsPdu4b29o7a3j5dpo2hiMf0h2MBZRhB8F/KhqZOPbqXEN2jymtKri3bwXfdzNpuhDn38vICAgYJJwNy/0ic4PGhAQEPCw4R1TLs65oar+koj8VxGJReQrzrkX7nhdCXVSuVdRwWXG3MS4wyYqTeAIXNlGf6co/49nv1+Uz79mlASsdCHdm6C9x+bM/PqFn/tHOMdO2tnaKspTM9PWhh1rg+Y0wBDmrfRBN9RAfUzNFOXtHTPpXrnwelF+9XWjU5aXjXJZWTd6qdcHt963DroMpm/NzMRINqxt+Bao5aZ9p2Nm5CMXjPI7sTBflE+fPluUDx40WoZUwvaWtTEC71ivg5cf4By0ikuA833p9atF+eXz1rZuzhunuLIzZVRXHNkYffRHPliUZ6aNfopjW4PDrnGgzVa7KF/EfPQHdq9bq9YP0gxJzcz2YdfogQNztn5OHT9WlOv5+VvbNo8N0iY9G8dO29pOHpI0RAK6bwi64dYtW8uXFq/Z/Rt2frtl9W+sG4U5M23jcWLB2j5e87NNu2556U1rYmrrro21z3cJGA9JwKFHiZW3huTzQUdhAfVAWTlQYOkAzweoo1p+rz6epWbbXhwt3F+5KdoDFQVqFyykNNugl3aJu+LQnXN/KiJ/ejd1BAQEBATcG4RI0YCAgIAJwX33cvHg1KNR7HiFQ6vnCVPu5eK5LpW4rPleD1YepHZuD+UdMB5DuCfR7auGc1K1IaTZF8O8Sirczuh1M/ZLVeyQt6bNvM7g/XIdFMrzL51H+ZWivLZldM4AFMr6ppl3meejC1dPeHzEMDvpQdHvWv3iRiZoa8PauLqyVJSvLJmpvbRsbp5nzxr9cvzYQlGeaplZrQ5uXKndswZviwTtdfBw2OlbX2+u2H0vXzGqYC2f8KFHzVl5atrM3uPHjxfletNM9iY8Fmi/9zFnXVBa//1//WVR3tzCfOAxqGEd1OHdc+yQ0VcD1Hnu7Mg/oV63sY7hRkqXRHpGRWJ1RAr3xMzKA7javvxDo6tefNnW3rZNk3Q69GSy+547faIot6eminIrHs3l84tW96s/fLUor6zcsj6BwiA9sg2qSzB2ETya6Hoce26OVq5h/mK6Vgu9cnBOTuuBkZEa2kjPrDi2cgv3jOBqOn/QnvkTp2297RbhCz0gICBgQhBe6AEBAQETgv2lXO4Cd6JWRudo/u92jAEMDNagY03mnW/lpCpEGpYpA3764FzgpFB4sIzqZzARaJzcBHQ10B01M12v37Jd/5deuVCU//p5cyy6vGQBIx1jLaTZMg+ECOZzDLqoD6qij4CKtG91DuGdQUmFRh7swQi5HizgmzfNe2LQf60o0wOJ0bftE+YBUQOdMoDXT4JvEUeqAH3iQmD/tnascdu508YQa4CBP1FslAgDbJSPDu6fMZIS9+z2zCbf6VoZrIHAEUUi0IMMPLy+al5YR25Z+eyjj44KCaMn7bpMSL/Ak4MPC7yCOKYZoiQ5T2zvqrEi0t2xNdNGbBzHpo61HefuZFe2zBPoheXlonx1yW7UnjLaLQI90u1anxh/TA+knRtYPxhTMqKeVEh50LgnazE+h5HRNS+gDO8EXEfPGpbPnTtYlFsHjZbaLcIXekBAQMCEILzQAwICAiYE7wrKZVeqexXnl9EvXqBSlJSWNab2As9BPZ7CIgXBqIIILxAKB1FHA7YW9VZoEye5CUoa5hp0V/7qByZi+cJLLxXltQ2jR1qw0GrQvyD/05qy46SCEnhM9OEd4mmsgIqgi47LhaF43cJBCxIZ9MxM3kFQxtK1G9autjU+gUDZUejD1OrW9rRrAS702hCOO2xmapBwLqM8SqwBrweFlwt1O2K1ulN4DjHoLMPxCHRRBJ0Uaqw0QLFl4KkSrJME5vz6to3la4vmSXRkYaS6ceq4eQv5miLwuElJu1EJk4JZpF/QQbTLox6i8rL3PLly2qebP4vbqPAmaJ4V6uDgUWWAVD+G6BuosfpZ2PgAACAASURBVIwNgDPSkJQS1zgmk8djiuPxvZBTYw4aM6TvakrvMQbGGS0FBk520O9sn7VcAgICAgLeRQgv9ICAgIAJwbuCcilXifc9S3xqhQExgnNG/yfj4tEzoHAymEIpy7h2gHb5d9fSc3ow9VImm2BiCHi50PSWZGSybW2YR8iL5y8W5e8+93xRvnbNzLJ52xQXhYm2vGImXa3exfnm/nLu3KPWJ09X3cxHT1cdY7m+ZhodV66MzP2b124Wx7o960cG7xjmA7m1Zhodr79hHg6ebk9kwUcnF6yzA9ATKdYDPZYyatKDlunCtWRzJ29bBE8OmOYJgj66vXKvFXH0YsK810G5wKtjA142/czq6fbtXq2WUT0deJZ0h0a5LC4bJTf36ig1waH5Q8WxBnR44tjqGw5tPaSgGDwZWbU+kYLa2ARVAE8mjzSlvBKC9oYUnSd1k9NRzRmToq5B80jWLChsB1UMENy2Sc8hSqNgvXFMh2m5Fn8NEreaoO2gpjxGKactr6zZese0SxvlFh73mC8pvHS2htA06vdkrwhf6AEBAQETgvBCDwgICJgQvCsol90FDfGct6/HeSmi7N9JlTDIYQgRDcTdiKM4i5fCq9yM5I66wnMmhVnNjCQa85xRPdffNNrihZdNz2J5BaYxZo0ypzuQ42Vgz6HDpu/xgfe/ryg/8YSVY1AuCaR662gjNSwotXrj5MjMv3HNgkFegVcO0+5Rr2SI+WDWKF00qdsWpEinUE6gkaGgdLwUglo+H/QuSnMPHQaRdGHpQv7D82BxXmARKBcHmVW0q4sonAa8M7qgMFKuSdQ/IMUIadih2rhezfV9btw0KmwK8r01aLzwGesObR7Jm3gBUrh/o2n1tNrWli6C0aTi+Ug9Lxq7ttUe1ZkOrI42eC+oGXt0jiL4aqqBDEdTyPDEtJMob2yalxQ1cUiDcpyYOpH5QBudsXQ009jBiwicTxMUThMLLnbW7860eXuVyV7dCeELPSAgIGBCEF7oAQEBAROC/adcKjxain/2qJV4F+dwS/
32upVJlFHOYK7S44WJzz15WyakxTkp9V4QyCHIPjNAwA31XqKGmXf93CRfvHK9OLZ42WReYwZUQLp12Pf8Cwo8ftZkVj+ETDuPnjtlJ9HsQ/BIBi3QATQyBAEuHZjEZ46PKJeFAxYEBEkRuXzJAmCWoT8SwwTOQJOtrpk5fOWqBR9NI0n18WPm8UJpU0WZWXTiOrVDeHx0LwZ8xaDFopgBSTTH4QUi5VSMp4fCLFsI6OqnRnkMPe8pq3MVCc1Ji5AmunxtRLn88IJp5cxNGT1y5JB5kGhiY5EJ5IkZLOa9GqiHhDmD5PKQniV4iMD8yTY0Xra2rU9RY0QVfvSxx4pjh+HlsrVt10V4rshJcE6n2rYOFWvstddMnvfCqzZOV5eMKsw8Fx1kLEI983MHivLx0yPdoXOPn8BlSPoM/i7O7HgLlGGUWf+mWzbux4/YfXaL8IUeEBAQMCEIL/SAgICACcG7wsulCrvxfikHvBvAVSgCZiKmHSJV4kcNWFtwuvMClHg+zG3s9FN2tYeoh3rCgJGRcbp807IRrSED0OycmZQJAojWVo2egMOLPHbmTFF+31kr17HT3u/ZtS3QPzXQKSltaXjrUCI0aYzaw4S4H/zAE0V5c9MM7xvIPjNEUAsT+/ZAS61tGSVxEwEmp049Yu0ifcYMVRQViejZYofHsrrU/2hBX4VZmuip0kc5xlyT4muA5mGGHnr99FF/VMccI0M5PSxiUj3o99bayEvo1UuXi2OnTxoNMDNjAWUNeDGRasrA57iKseM4DYbUTDHUsMZqkK9twUOmBo2csXR02rf1eKSFZxXUEbMq7WAOSBEloA+Z/PwDyJi0ft3ozDcH9A5DZiK8L6bQ9sdPml7Ohz/yYRER6cyhvTHoS7ibKTyg2hijzAvCQyJ2JE7fLe74ha6qX1HVG6r6PI4dUNVvqer5/P/zb1dHQEBAQMD9x24ol98VkU+/5diXROQZ59zjIvJM/ndAQEBAwAPEHSkX59yfq+qZtxz+jIj8RF7+qoj8mYj8yp3qUvU1QwwVvyt38Ii5/Qb5ZZ5HATIKge6gTC2pFQYW1XD/BiRdM9RDj4WNTTOdDiJAoIYdeIdd+iFM3KVrIxNw6bp5uSBuQ6Zh8q1vGG2BPX/5CGiOJ+Ex0KGWDCiUJjRCIghKMKtR3UvbgiLMwSg/PXO2nKZhJj9yykzdNQQQLS5ZX9kWUjHXr1umpmNHDhflGwjAOnH8SFHOItAZ4FYUuiYRpHRTNzJxG03IDXtBZJgv8ArtKaMwHIJquL4Z6MU+NZjG55bRSKT1IqbUwbW9lJSd3avZHrWTyc+fe8GCu2ZmzWvk0dNGV2WbNh9DJlFnsnRMfAeJy2uIumo0kRFqy2iG+hSC0SD+4ukt5X1qYA1S92SY2XUMGqxTlpo1qp3fQrqnzS0Lumoj+XgHwipggryMSENo6LiuPeczOX2VILOXZsj4hbmLMur/QDvIk+PGu4vRgbvEO90UPeqcG/uiXRORo1UnqurnVfVZVX12dXPvnFBAQEBAwO5w114ubrRzWe4MPfr3p51zn3DOfWJuqlZ1WkBAQEDAXeKderlcV9UF59ySqi6IyI07XvGAQO8YLwMRyjSTqZ/Ac7wkSJ5Ea/l96TEw9AI2DAN4ZIy9X1Ica1B+FTRSE94kMy07fnh+HseNQqihwfwFV5iAETwmYi9QB9fSNASKGAmPM7A7HTpgARIHD1pA0NXrFtDRQyajuqdXYvXcXDWTucfgJ1ArzIrjJQKn55PeTslxDXCsqwLXvONcYxWRPzGlfAfwfPDWJ3RPuA5AbRyYsjmuwftk49ZoLLe3jUJZWbfytRtGUR09agZ1o2XBWpkYrUAPlhrosNQb3/LMyYwHdBXBgR7yfisol4Saslh39B7z9GZwDoO+Uowpk3g1mpibBPo7nqouPdUoc2yeKL3uqDwFiixiJXjGMtBlCdYJnz1q6Trv+O7wTr/QvykiT+Xlp0TkG++wnoCAgICAe4TduC3+voj8bxF5QlUXVfVzIvKrIvLTqnpeRH4q/zsgICAg4AFiN14un634p5/c++1USn9DKpJEV2IP3i80u6toFr+M83dhv2QwqXh+ChMtzRgAYaDnQ7fbve0YMwcpxogJhOcRMHJ43qiNDmgLl5kpHcGTQWDWe5QLLWlHKoaJsm+fA4+Wwl+zs2jjEcuoUztv/etiR78JeVlF8uZVUAibW+ZVwOCRKKYZTo0VNg4aIHn9CUzmDFokNPG1grJTmPUZMmn7Wa5QJzU9kMC7RyoP66CNOT558mRR5jq4uDMy/dfXTCtnbcPquHzF9HSOn7DAmIWjRoFFpADhDdVAtiXSHBnWpEfG8bkB45JpOfU4PjpELRmCg1xUTqmxjqq5pmaTQ0BVikxUKbxckJRKMjUaECLDsgkvlnEGqSgzWqo+hDcPMyMx6TSzsQn6yvaqx//sCiH0PyAgIGBCEF7oAQEBAROCfdVycc55+iz7jSoPFprpsPC9pMtKdxaajhX9oRQpPRYi3GCAgJSdPFiByWuTyGgT6nmkKQInIKXLjD4xzNQe9SRg0lFbwqO9vMTb5V4YpL2i4hyYvehzDZo1Mwg4YntJm/CeHhWCMd3YNt2PLrxGWgqqKSqfGy8JdT4fMSfelZu6nkyvlnuzUL6X45hgjUHGQ5pNuPIOobODPjUR1HbssAVRUTPlxtU3RETkFhg1hqVcu2kBWpfzpN4iInOzRm8l0PAZQCO6D/qHfJyXkBv38jxeQG9VJW8fe58x2RTnjs5VqZR7WtGDzaNZcE4XVFcPtEkPJ5H2IrM7wLujh4xE/Ty5eObRwMyejefN83Yrl3x2Anp2H71cAgICAgLeZXhXqy3uOfS/BBl+IavUGys3RfnlFZW3hV+NFG3M8JXHjbKkXv7FOfbDZlINJpQYUkWQG5u0KHA+v2RoIcQJNp44Hty0w3YevxHU88PGP+SmD7+G4oa1pYdNogTj2MHX+toWwuThr5t5A2KdZU7TAaQYWtgMJqrnPj/mzTuvg5IjN8ArvtY5NwktgYhf7lBqxKjF3PBLeY5dO9Mx2YAOLJxDB0YJLK7VqfxodW8gxP/1y28U5YMHbMOVm6V1OG33oLAY16hMWuXbL6Vlb016KyvJj2GOaABl5euRcP4WqZ2vruSo35aBdxy1oO1DlPuOm6jD2+qI8Fcm3CTHc+gparjS811lb6sRvtADAgICJgThhR4QEBAwIdh3yqV8E/HuqZUqkG5wWi6I7+dMtKNxVL4p6vUAJp0nIcBNH/qQg5dxMN2ynJbwqAwvxwBzoyKRBmgW9XywUVHVccDbaiI9URXuTjM4HwPmxKzVbIMt65nPOH3cmwhpZ2g8KZQU5QhjPfbbF3krLVMOL19mydxUzZcDzeI8tTzWB7Oam86YQLhAi8PGdzYkNWbnkH4RUHYRytOgX04ePy4iIotvWN7MnU0LUaef/7UbptTxGs5vT9sG6aGDJg+QgnJJIAOg2JTdjWQG54Dl8fl8xuKMz0k5bUKQJuQ5fMqZ4KaG5ymueP1wM5b75FyfO3lyi
t6s0YeCzdfMkyTAOOI+URX1+Q4cSMIXekBAQMCEILzQAwICAiYE++/lcg88V/aCDB4WKZTfsqz8t8wT0aMZ6TlnW9H3nmBF5X7KSa08fL4wQeFEwIQDgjyXMegMhY+yF6YOc5j+9EPPOwP9QNN9n1r8CxNGcHByU5keBQwjzyr8uuk1Qp/8esPMVwd7uIvci5RI4ByQ9qLXhOcPTO+B/FqPDkB7SYP4Eg7wWKjwcvH86TE3DfRpm8k0MOwM688Qd9DdMW+VJDKf9CN58o9z584Vxy6+er4or4Jy2d6x/r2xuFiU5+fnivLMtKk6Mr9pHZSL57sPkGaoollSuZ1ySUCzJJQVYMi8K3+uPCqRvu84XkeSk6ba8wQRRi/fLJ9zOIdJjJy34wQWfcoWwE+dbmuU70gx73wJ06OJlOxuEb7QAwICAiYE4YUeEBAQMCHYZ8pF32LOvz1282tTVhuvYx7IjJZ2xX45r43o4eGpzJX3QeHB4onvwwXAwdSjt0qaU0CkMnbgDREjLJuJDVxEOsXg0JPUdzUoionX23IlPG9E0O20xCvEC92GVEGVFwq9BQZIWNHsgF7ScsqFdA3hvC6VzzFN2SIEm1SJ5+VCygnmc1bujcCcl0ix6lECdc9TBFIPtNSpHsiQdSSwUNCJM+1RnWdOW3DQ0tLForxpIoyeysPNFZNQuHLNkmAsPGJeRNN1y4+bxdZ2h0QSXnurntwK6mQ8TDHGjklGIu9ZKld45BPpP5+8lp5DeA4x7lFFpSSXEiyyJH9u2UZvEXqeKuXeOpnfeNx+7/R0+EIPCAgImBCEF3pAQEDAhGBfKZcoSqQ9PRLU39iwoIdWywIk6tQvocIg1OfqdVAbzC3ZH5mJUWzdGg5AMSCQJYOSfadmXhXNyEzK/rbVzTYOIzM1HUy31S1r41THNDIcvFJW1s2bY5jata1OPi5brxfH0hZ2/UGz9OH1sIkx2u6ZaT5M4WEh8IqhFYckBvQOIUiXDEg/QIxiTAfV4EWwjcCfGOM+3IDKHSglB0+cHuaa3i8taL9kMLj7qZ0zSO2+TI4R16HXwyif3COBXisJVRLhGaUp1ibUCOmRUmNiEQx2H3RN1LBxypzRSKTVhqCjFNovLTyxve0Vu1c+HsePWnDQ0YM2XttrRqd0ka1hY93KbyxafQeOWvnQ0Nb+yqaN0xYkCLf6Nh6NCI2E20iEMZhvIhCnP+qfr3jJItRCuQSZWETKPUi68Jzxrm1M2zl4VjJSmH1rbwNNS7tI/pGvj8bATojgQeeY7xZ0juIP6gJljgGEgXIJCAgIeM8ivNADAgICJgT7SrlkLpOtnZFJvHRjuTge4XeFgTcDaIAIqJW5Wdt1n50x06lWH4UIDLHjHMPJ39shR2BKijLT+FH+lHoLTN5QFdTiaenS/wTnJDC9k3hkbpMS6fYQbBObOR6BqtjaMi+FzQ3Y0ofgUQAKKu1anQ22sSppB/NvMrcjaIY0Nx/HlJeIv3Nfg2dPVTBKijyeDBoiZdaswawHjRN5mjuUPKbdXpWE4nZPmAg+DZ73AhN4MOkCP4tSjxPAbcq9MyiQypFhlaSAqBWTUG421ymiRsmZsyeK8hbkcxffuGX3RH6NjU2bv9cuWQ7SqG0BR5JYGE6CALDxsyci4kCTpaDABljP/S4o1JzPyECFOQR3VX91YkzpwII5Sz1tFB63cUpJc9BDxfN4wXPO90s+HZ6HTsYARsovow7SKV7yYrri3QfKRVUfUdVvq+qLqvqCqn4hP35AVb+lqufz/8/fqa6AgICAgPuH3VAuQxH5onPuSRH5pIj8oqo+KSJfEpFnnHOPi8gz+d8BAQEBAQ8Id6RcnHNLIrKUlzdU9SUROSEinxGRn8hP+6qI/JmI/Mrb1ZWmmWzmFMHiVTPpLl20DCoJtEkGfciGop5TJ03D4sn3P1GU5+dHRkKfeTmTcnO/FtlOdQ2WTQMjMsBudT2iucYACTMdGxHyaMagMGAmU5K307T2TLVHN4YDj8Sw69kuBhwMupbpZ2fTXBYo1dlqGj3R7RstQ30RBlplFXoZsWfWYmc+916gqRtF9Niw/jO/aYr5pVQpKbgazFF6nMy0jXZrItglot4MswGRihnyeF53RUaqKvjBUnRfYJnno5xWnO4FK5Xf1zP3vXyvetuNTp08WZQ3EEB066bRLxk8nTZA2V2+fLkozx4+XJTbcwfsnhizGB5OGSgXBvb1uQ7wD1P5+vQkdRlMxLHzaCaeUzF2FUFJVeAppHGqvnzvKL+8zzmU97QpqqpnRORjIvIdETmav+xFRK6JyNGKywICAgIC9gG7fqGr6pSI/LGI/LJzbp3/5kY/Q6U/Rar6eVV9VlWfXd/ql50SEBAQEHAPsCsvF1Wtyehl/nvOua/nh6+r6oJzbklVF0TkRtm1zrmnReRpEZFHT8y4YW4abWwaVbBqFqAnbTqkXAfM1OlZ2zlPYwt6aE6NgnOGkBjNduw+CUzjOqgYlpkVJ2JSYM/U5Y43kvnSAwHlIQIjIuhfJAh26eQRI2PqRURkBzKnjajc/ssoi7q+ZvfE8fqMBZsMKC0K299PkkwzlVmV4ImC3++xl0mMSjQxmucW5nr5xptFeWMdEw80EjPfh57MqLVrdtq8m5pw1fB0OVBnQjqIlEc+xJHnkVJBRVVkPSKFUiljUqHL6iocbiKPNriz2T4+f9CzsW62zQvl2FGjTa4ftqChrY0rRRlLRmJEHy0vm0dafcfolB08W540LppbA002dNQ6QqBXHoyVDeHVVpU1SwBwIpknRY1TKiRrPQeWCh1pXxr77QPvdkO5VGdeund0zW68XFREviwiLznnfgP/9E0ReSovPyUi39jz3QMCAgIC7hl284X+4yLyT0TkB6r6/fzYvxKRXxWRP1LVz4nIJRH5ufvTxICAgICA3WA3Xi7/U6qzOP/kXm6mGklSGwUg1JtGlXSmzWROsfsdI6ikD62Irb6Zbjt960IvG5luGztG8fd78DxpwuxGsEviZd+xew6H9Fop12SIoLfpZ8Ix01QymJI4P46Nlujk8qcHD5gGzMplM42HMIGVNj70MdZu3Sotz7Us6EOpVUEtF89Yg/kKj6EBvBRo4sa5a04dmW2SlnmhXFoymmXpupnvmxvluitVAWBuaOfMtW2c6pQtRp8U5j77R8+S8ZQxmCgDb1NFuTD5OOed0rukDSirezdZu1i/5xUzpgGxNra3TC9pfna2KJ8+ZQFHi1eMKa3F0CiB/tD169eLcrRmdW4iWCnySARSb6DDMB8DUk25R1SKZ0YrpKs9SVkGw3katAxQ4lqW0nOYzDt15XUyAxgxXhJDTxqY2iykWco9cbwsaaV32T1C6H9AQEDAhCC80AMCAgImBPuq5aJRJLU8iGBu/mBx/OVXLLAICrCeEQdLWlZgqi+vWXl2Z2Tq7PTtd6oJvYkEnhe0v6hLosLgI0jvJpQEtWtpmkbM+pMyoTC33eG6A+ndTmdEuRw/dqw4tnjD0sxQGpeBUDFMwS14HVxfNrnUAzNmbjcpIQx9nLdkuLa2wyuH
ui4eSzWmPODF0IXE6JsrRv+srFifwOBIgrq7Owg+Qr8Pzlk/KGdMLZcB5gZTIwmWeuRup18YsEPzWeCt5CU8pqcMTP9oF+YzA4K8DFkIZGO5KjuSF9yUt4HeOj1keJqB9PDCUQvMO3rIvF92ukatMHju1oqtpXoHOi18QiF/7LvrQMoah7f7yMLUH7ILo8v8TOxFUT0vNDsl9RJ1l1NdaVTuNeI8r669eZbsycvFe6OV3zNQLgEBAQEBIhJe6AEBAQETg/2lXMTM41nsulM3NENwQw02M70zBpkd7w6p0zryrGhOI2gIO/cp7rPdtd36jW276dDLygMdGGiTpGpmZ7Npnh1EVWJkmlf0lGg0R8E/CwuW5HfuknkgdJcheYqxoIzs9o719Y1F08qZmbaxPn7ETOwakv8OQRF5TQfVVKtZOdPbPT62u0bhXHjj9aL8ykUrryCYSOvkjkDXIAtTB+N7+uzZolxvlC9dL/iHNBITa+M7JsuNXHaZlEuEfnqUiyfvWuG9UOE5RFqPQS1egAvgmeSuxEVHpOCASPXVMHcOVF+nZWvm0XOnizLn7/INZEPCOmVib2ZYIjh+GV4xGcagD2+2cTH2PFtQYVWCdg5vRUCQnx+9/Jmsgh9YVD5PLveIyvwGlNZRlSS6YtrfEcIXekBAQMCEILzQAwICAiYE+0q5ODGzpwaqoNkwj4X+wLxWImh69JGUdwdZd1bWbSd/bXt0nGZyUrc6usiAtLxqcqIrEJPZ3IbJWqP0rpmjUKyVGLQQ5USFWhUIqPCMPthd9dpoPGbmLE9Ip2OUiL5pOi2IJRKHKVzbNBopHVgwz8ysBfPENRvrE8ePW7vAdfWRYSmBl0IN9EcEc7+bB5hsdm1Mn3vhpaJ88Yp5SdACbbVtbhh0kkY2v+0502x59PFzRZmBSBk9HMo14rxPF87T2COCljy9krwyA4524b1Apwp6bdBkj3k+zfrypvsmfEp6aXQ8AyXS7NhcO1Ilas/e6RNG8V29auvktctG9zXqoE1A+ZBqIkXiUxsMfqJGDp6P8VwyWbMX+FOOKm2UrMKDhPOkFZ5GHrScotES7ZVdSfOyjWQbec5dyu2GL/SAgICACUF4oQcEBARMCPaVchHnRHOz9eC8eV4cPGQ0w+a2BTesbZgJHyMJLaNaXrn4alHu5nbMqZNGJaQDo1PW1s1T5PJVMympL+JA8zgES6SwyettO75w3LLCxAhcqkHXhJ47TI7bQ9Jcl9Mc9YZplDz5wQ8X5TdXkOR30egUKKTK3AHzYFmG/sZffO+5onztpuncnLpuQT4nMGaHkaGGuLVp9NbNVfOCuHjxooiIXLh43s6FhDHijbyApBTURx+ZlGpIFL4ASqDRAQWHhNRDyAw7Bvmgni6oPE2sEe08yfgQ9EWjBZpuG4m3EcTVaGN+B8bB0WRmhiyPZqkIZOE57aa1gZK4TJw+N2t01Or6aE3UkPErHdha4xpsQdtnft7qOHUK9AueieurtmZ2UgQW9ZAVC4FebYwf1/ghtJfSuxvbo+e8ARqvxixfFSwEPa18kgd0DpOy45xGw6hSavdQO6iNTF/U0BnCC2sqp7U8DyXl/e2uWqGV7NFYe8yc9VaEL/SAgICACUF4oQcEBARMCHQ/k5iq6rKIbInIm3c6d0JwSEJfJxHvlb6+V/op8u7v62nnXDkXCuzrC11ERFWfdc59Yl9v+oAQ+jqZeK/09b3ST5HJ6WugXAICAgImBOGFHhAQEDAheBAv9KcfwD0fFEJfJxPvlb6+V/opMiF93XcOPSAgICDg/iBQLgEBAQETgn19oavqp1X1FVW9oKpf2s9730+o6iOq+m1VfVFVX1DVL+THD6jqt1T1fP7/+TvV9bBAVWNV/WtV/ZP877Oq+p18bv9QVet3quNhgKrOqerXVPVlVX1JVX9sUudVVf95vn6fV9XfV9XmpMyrqn5FVW+o6vM4VjqPOsK/z/v8nKp+/MG1fG/Ytxe6qsYi8lsi8jMi8qSIfFZVn9yv+99nDEXki865J0XkkyLyi3nfviQizzjnHheRZ/K/JwVfEJGX8PevichvOuceE5FbIvK5B9Kqe49/JyL/xTn3fhH5iIz6PHHzqqonROSficgnnHMfkpEI4M/L5Mzr74rIp99yrGoef0ZEHs//+7yI/PY+tfGusZ9f6D8qIheccxedc30R+QMR+cw+3v++wTm35Jz7q7y8IaOH/oSM+vfV/LSvisg/fDAtvLdQ1ZMi8vdE5Hfyv1VEPiUiX8tPmYi+quqsiPwdEfmyiIhzru+cW5UJnVcZyZ20VDURkbaILMmEzKtz7s9FZOUth6vm8TMi8h/dCH8hInOquiAPAfbzhX5CRC7j78X82ERBVc+IyMdE5DsictQ5N84Fd01Ejj6gZt1r/FsR+RdimkgHRWTVuSIv26TM7VkRWRaR/5DTS7+jqh2ZwHl1zl0RkV8XkTdk9CJfE5HvyWTO6xhV8/jQvqvCpug9hKpOicgfi8gvO+fW+W9u5E700LsUqerfF5EbzrnvPei27AMSEfm4iPy2c+5jMpKt8OiVCZrXeRl9mZ4VkeMi0pHbKYqJxaTM436+0K+IyCP4+2R+bCKgqjUZvcx/zzn39fzw9bGplv//RtX1DxF+XET+gaq+LiPa7FMy4pnnclNdZHLmdlFEFp1z38n//pqMXvCTOK8/JSKvOeeWnXMDEfm6jOZ6Eud1jKp5fGjfVfv5Qv+uiDye75rXZbTh8s19vP99Q84hf1lEXnLO/Qb+6Zsi8lReisgRwgAAAS1JREFUfkpEvrHfbbvXcM79S+fcSefcGRnN4X9zzv1jEfm2iPxsftqk9PXa/9/e3aNEDIRhAH5SLdjpEWxsLbewEOz2EDYew8pDeAILCxsRS72AWIiKiD+VJ7C2WIsZYZsFi3VXPt4HBkKazMcXXshkQvAxDMNWP7WHJwX7qi21jIdhWOv380+t5fo6Y14fL7Dfd7uM8TmzNPO/TafTpQ1M8IJ3HC7z2n9c1472uHaPuz4m2tryNV5xhY1Vz3XBde/ish9v4gZvOMNo1fNbUI3buO29Pcd61b7iCM94xAlGVfqKU+3dwJf25HUwr4/ab0yPe049aDt/Vl7Db0a+FI2IKCIvRSMiikigR0QUkUCPiCgigR4RUUQCPSKiiAR6REQRCfSIiCIS6BERRXwDNksVm26UhvcAAAAASUVORK5CYII=", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "# Read the picture\n", "raw_img = cv2.imread(\"/home/aistudio/work/word_1.png\")\n", "plt.figure()\n", "plt.subplot(2,1,1)\n", "# Visualize the original image\n", "plt.imshow(raw_img)\n", "# Random cut\n", "crop_img = get_crop(raw_img)\n", "plt.subplot(2,1,2)\n", "# Visual augmentation graph\n", "plt.imshow(crop_img)\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 3.3 Main training program\n", "\n", "The entry code for model training is [train.py](https://github.com/PaddlePaddle/PaddleOCR/blob/release/2.3/tools/train.py), which shows the various modules required in training: `build dataloader`, `build post process`, `build model`, `build loss`, `build optim`, `build metric`, after connecting all parts in series, you can start training:\n", "\n", "* Build dataloader\n", "\n", "The training model requires the data to be formed into a specified number of batches, which are sequentially yielded during the training process. In this example, the [SimpleDataSet](https://github.com/PaddlePaddle/PaddleOCR/blob/release/2.3) implemented in PaddleOCR is called /ppocr/data/simple_dataset.py)\n", "\n", "Based on the original code slightly modified, the main logic of returning a single piece of data is as follows:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "def __getitem__(data_line, data_dir):\n", " import os\n", " mode = \"train\"\n", " delimiter = '\\t'\n", " try:\n", " substr = data_line.strip(\"\\n\").split(delimiter)\n", " file_name = substr[0]\n", " label = substr[1]\n", " img_path = os.path.join(data_dir, file_name)\n", " data = {'img_path': img_path, 'label': label}\n", " if not os.path.exists(img_path):\n", " raise Exception(\"{} does not exist!\".format(img_path))\n", " with open(data['img_path'], 'rb') as f:\n", " img = f.read()\n", " data['image'] = img\n", " # Pre-processing operation, comment out first\n", " # outs = transform(data, self.ops)\n", " outs = data\n", " except Exception as e:\n", " print(\"When parsing line {}, error happened with msg: {}\".format(\n", " data_line, e))\n", " outs = None\n", " return outs" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Suppose the current input label is `train/word_1.png Genaxis Theatre`, the path of the training data is `/home/aistudio/work/train_data/ic15_data/`, the result of parsing is a dictionary containing `img_path` `label` `image` three fields:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "{'img_path': '/home/aistudio/work/train_data/ic15_data/train/word_1.png', 'label': 'Genaxis Theatre', 'image': b'\\x89PNG\\r\\n\\x1a\\n\\x00\\x00\\x00\\rIHDR\\x00\\x00\\x00Y\\x00\\x00\\x00\\x0e\\x08\\x02\\x00\\x00\\x00\\xcb\\xe2\\'\\xb7\\x00\\x00\\x00\\x01sRGB\\x00\\xae\\xce\\x1c\\xe9\\x00\\x00\\x00\\x04gAMA\\x00\\x00\\xb1\\x8f\\x0b\\xfca\\x05\\x00\\x00\\x00 
cHRM\\x00\\x00z&\\x00\\x00\\x80\\x84\\x00\\x00\\xfa\\x00\\x00\\x00\\x80\\xe8\\x00\\x00u0\\x00\\x00\\xea`\\x00\\x00:\\x98\\x00\\x00\\x17p\\x9c\\xbaQ<\\x00\\x00\\x0bmIDATHK\\x8d\\x96\\xf9S[\\xd7\\x15\\x80\\x01\\xa7\\x93\\xa4\\xfd1\\x99L\\xea\\x80\\xc4\\xa2]B\\x0bb\\xdf\\x84\\x04\\x18\\x8c\\x01\\xb3\\x8aE\\xec\\x12\\x02\\t\\xb4KhC\\xfb\\xbe=\\xed\\xbb\\x04\\xc2l&N\\xd2\\xb4\\x93i\\x9bv\\xa6\\x7fL\\xdb\\xe9d\\xe2N\\xd3dy\\xf2\\xe4\\xeb\\xaf\\xbf\\xbe\\xbc\\xbc|\\xedWU\\xaf]\\xb7\\xab\\xab\\xab\\xca\\xca\\xab\\x8a\\x8a\\x8a\\xca\\xca\\xca\\xeb\\x08\\x97\\xb7\\xa0sY\\xf1\\xda\\xf7\\xdf?\\x83vUYQUU\\x05wa\\xd6w\\xcf\\x9e\\xdd\\xbau\\xeb\\xf5\\xd7_G\\xa1Po\\xfe\\xe6\\xd7\\x9f}\\xf6\\x19,\\x08+|\\xf9\\xe5\\x97U\\xe5\\xd9\\xcf\\xdb\\x1bo\\xbc\\xf1\\xf6\\xdbo\\xdf\\xbe}\\x1b\\xf6\\xf0\\xce;\\xef\\xc0\"\\xdf~\\xfb\\xedw\\xdf}\\xf7\\xf4\\xe9\\xd3\\xaf\\xbe\\xfa\\xea\\x8b/\\xbex\\xf2\\xe4\\xbf\\xffy\\xfc\\xaf[\\x95\\xcf`eh\\xcf\\xa7]\\x96\\xb7\\x01\\x8f\\x95\\xb7q\\xf5\\xe3~*\\xe16l\\xa6\\xb2\\xf2\\xcd\\xab\\x8a[\\xb0\\rh\\xd7\\xfb)?\\x00\\rV\\xae\\xac*O\\xfc\\xe1\\x08\\x97\\x15\\x15U\\x95W\\x97\\xf0W\\x05\\xf3LJ]&\\x14\\x07r\\xa1D\\xdc\\x1d\\xd2\\x8a\\x15\\x0b\\xe3\\\\J-\\x89\\x88&\\xe0\\xaa\\xb1\\xb8j\\x0c\\xbe\\x06K@\\xe1\\x08h\\x0c\\x1e\\xd5@@\\xd7\\x03xT\\x1d\\x1e\\x85\\xc6\\xd5\\xa0p\\xa8jlu\\r\\xb6\\x1aM\\xc1R\\xb0\\xb5x\\xf4\\xed\\xba\\x9aw\\xd1\\x10\\xd1\\xd5\\r5\\xb7\\xeb\\xde{\\x17M\\xc0R\\x18\\xf4\\xd6\\xd9\\x99\\xc5\\x95UAw\\x0f\\xa7\\x91\\xca\\xa4\\xd2\\x9a\\xd1\\xb5\\xd8\\xea\\xf7jQ\\xa82ht\\x1d\\x99\\xdc8<<\\xa2P\\xa8\\xc2\\xe1\\xe8\\xe9\\xe9\\xf9\\xe1\\xe1Q:\\x9d\\x8dF\\xe3\\x1e\\x8f\\x0f\\x06gg\\xe7\\xfazY4\\n\\x9eN\\xc6\\xd2H\\x18\\x80Jl\\x00h\\x84z\\x80\\xd4P\\x03P\\x1aP@#\\x06\\rP\\x1b\\x80\\xba\\x16\\x12\\x9d\\x81\\xa7Q\\xeaI\\xb0\\xf3\\xf2\\xfe\\xd1\\x18r=\\xbe\\x11C\\xa4bIT,\\x81\\x86#\\xde@\\xc7\\x13\\xe8x\\x12\\x03Oh&\\x90\\x80\\n\\x97\\xc1\\x1a\\xb0\\xba5b\\xb9f[\\xa6\\x12J\\x16F\\xa79m\\xacNz\\x07\\xa7\\xbd\\x7f|\\xe0>orq}nmy\\x9a73256p\\xf7\\x1egh\\x84=0\\xd4\\xcb\\x1e\\xe8\\xeaf\\xb7w\\xf4\\xb40\\xbb\\x98\\x8c\\xf6\\xa6\\xa6\\xb1\\xe1\\xf1;\\xec\\xe1\\xde\\x8e\\xbe\\xae\\xd6\\x9e\\xaevVO\\'\\xbb\\xbb\\xa3\\xaf\\xb3\\xad\\xb7\\xbd\\xad\\xa7\\x8f5\\xb8\\xc1\\xdf\\xde\\x16I9\\xfd\\xc3\\xadm\\xdd}\\xec;\\x10{{8,\\x16\\xbb\\xb3\\xb3\\x9b\\xc1`R\\xa9\\xf4\\xbe>\\xce\\xc6\\x86\\xc0d\\xb2 
H\\x04\\x14X\\xadv\\x8b\\xc5\\x06\\x972\\x99\\x82\\xcb\\x9d\\xe7@c\\xc3\\x84vVw[w\\x1b\\xa3\\x9dIke\\x90\\x9bi\\xe4\\x16*\\x81\\x8ak\\xa0\\xe2\\xeb\\xe88\\x0c\\x1d_\\x0f\\x91Ah`\\xe0\\xb1\\x0c\\x87K\\xabP\\xcdO\\xcd\\x88\\xf8|\\xbdJ\\x15\\xf2\\xf8b\\xa1\\x84\\xdb\\xea\\x97l)\\xf6\\xf7\\xac\\xd2\\x1d\\xb5\\x90\\xbf;9>\\'X\\x17\\xabU\\xc6M\\xc1\\xae\\\\\\xae\\xdb\\xda\\x92\\t\\x85R\\x91Ha\\xb7\\xfbuz\\xb3i\\xdff1;dR\\xd5\\xf8\\xd8T#\\x85\\x81\\xaa\\xa9\\xef\\xec\\xe8\\x15\\x8b\\xa4>oH.S//\\xadK%J$\\x14\\x8b\\xc7\\xd20\\xe2\\xf3\\x05\\x90P`\\x89\\xc7\\x9d\\x9d\\x990\\x194\\xbb\\xdb\\x9bF\\x9d\\xca\\xa8U\\x8b\\x04k3\\xf7GW\\x17\\xb9\\x06\\xa5\\xdc\\xaa\\xd3l\\xaf\\xadl,\\xce9\\x8c\\xfa\\xb3\\x83\\xc2I\\xbe\\xe0w:\\xb5J\\x99d{sG\\xc8WJ\\xc4R\\xd1&\\x8f;\\x05S\\xb4J\\xa9A#S\\xcbD\"\\xc1\\xca\\xf4\\xf8p?\\xab\\xf5No\\x17\\xb8`\\x82\\x0b\\xcf\\xbe\\x1d\\\\\\xec\\xac\\t\\xf9\\xdc\\xe5\\xd5\\xa9\\x85\\xe5\\x899!\\x8f\\xef4\\x94E\\xe4\\xe3\\xa5\\x93\\xc2E.V4i\\xacz\\xc5~,\\x10?H\\x1f\\x94\\xb2\\xa5\\\\\"\\x93\\x8a$T\\x12\\x85Eo\\n\\xfb\\x02\\xa5\\xdc\\xc1\\xf1\\xc1\\xe9\\xc9\\xc1C\\xc4\\x1b\\x8b\\x87\\xd2\\x01W8\\xe0\\x8d\\x1a\\xb4V\\x9b\\xd9c6:\\x95\\n\\xbd\\xd3\\x110\\x1a\\x1dz\\xbdM\\xa7\\xb3\\xc6\\xe3\\xf9P(\\x19\\xf0G\\xcc&\\xfb\\x96pgrb\\xb6\\xa5\\xb9\\xa3\\xa6\\xba\\x8eAoY[\\x15\\xd8mn\\x10\\x04.\\x14rM&]8*\\x9d\\xa6S\\xf9L&\\x97\\xcf\\xe6\\xe42\\x89T\\xb2\\x93I\\'\\x11\\xbf\\xaf\\x90Ig\\x12q\\xa7\\xd5bP\\xab=6[>\\x99<+\\x1e\\xc4\\xfd\\x01\\x9dL\\xae\\x95\\xca\\x02\\x0e\\xdb\\x87\\x0fO\\x83\\x1e\\xe7\\xbeV\\xe5uX\\xd21$\\x9f\\x8a\\xa5\\xa2\\x81\\xa0\\xc7\\x96\\x8e\\x05\\x8f\\n\\xa9\\xd3\\xa3\\xccI)\\x9dM\\x06\\xf7\\x94\\xa2\\xd9\\xc9;c\\xc3\\x1cH\\x19&\\x81Zv\\x818|z\\xa9\\xda\\xb1g\\x8a\\xb9\\x82~\\x93\\xd3\\xa2\\xde\\xcf\"\\x85\\x83\\xd4\\xe9i\\xf1\\xfd\\x8b\\xa3\\x8f\\x8e\\xb2g\\x01G$\\xe8\\x8c\\x9e\\x1f\\xbe\\x0f9\\xb2\\xaf19M\\xae|\\xa2`T\\x1b\\xdd\\x16w.\\x9e;/\\x9d?:\\xfe\\xe8\\xcf\\x1f\\xff\\xf5({\\x02:\\x82\\xee\\xc8\\x83\\xc2Y!UJ\\xc5\\x0b\\xfbz\\xbbVm\\x8a\\x84R~_\\xd4a\\xf7[\\xcc\\xee|\\xeeA6S\\x8aG\\xd2&\\x83M\\xb0\\xbe=39\\x0f\\xd5\\x04\\xdf\\x17\\n\\x91\\xbe\\xc0]v\\xd9}\\n\\xa9fyq].Q\\xa7\\xe2\\xb9D4c\\xb7\\xb8\\xddN_<\\x9eT\\xa9T:\\x9d\\xee\\xe4\\xe4\\xa4\\x98/\\\\\\x9c?<:,\\x85C\\x88`m]\\xbe+1\\x1b\\x8c\\xc9p\\xf4(_\\x0c\\xb8<6\\xa3)\\x1b\\x8f?:;\\xd6\\xab\\xe5\\xfc\\x95E\\xbbI\\x9fKF\\x11\\x9f\\xcb\\xb8\\'\\xd7\\xa9$`$\\x9b@\\x8a\\xd9\\xc8\\xc5i\\xe1\\xfc8g7kxs\\xa3\\xf7G\\x06\\x9a\\x88\\x94&\\xe2\\xb5\\x8b\\xa0\\xcd\\xa3\\x10\\xee\\x82\\x8b\\x07\\xc9\\xc2a,\\x8b8\\x02\\xb9p1\\x1b-%C\\xb9\\x98?\\xed\\xb7\\x87M\\x1a\\xbb\\xd7\\x1a:?\\xfc\\xc0m\\x0e\\xac/\\x08w\\x05\\x8a\\\\\\xec\\xd0at\\x07\\x9d\\xe1\\xc3\\xf4\\x83\\x8b\\xa3\\x0f\\xc0Q1Yr\\x9b}|\\x1e\\xdc\\x95\\xa5\"9H\\x93b\\xfaH.V\\xef\\x8a\\x94>w\\xd8i\\xf3C\\xa6\\xc8eZ$\\x90\\x005\\xf1H\\xd6\\xebB\\xd4R\\xdd\\x02w\\xb5\\xa7\\x8d]\\x87\\xc2\\xd3\\x88\\xcc\\xd9)\\x9e\\xdf\\x89@\\x89-L/+\\xa4\\xdad8\\x0bSV\\x17\\xf8\\xbb\"y*\\x99\\xb3Y]n\\x97\\xff\\xe2\\xe1\\x87\\xb9L\\xb1\\x98/E\\x90\\xb8\\xdd\\xea\\xda\\\\\\x13*%*\\x9dJ\\x1f\\xf6G\\xe0\\x95 \\x1eD\\xaf\\xd4G\\xfc\\xc8\\xe9\\xe1!\\x94\\xc6\\xd4\\xe8\\xa8A\\xa3LE\\x11\\x9dJ>?=\\xb18;Y\\xcc$R\\xd1P4\\xe8\\xcd%\\xc3\\x80F.\\x86\\xa4`w\\xb7\\x97]\\x10\\xe8\\x15P 
^\\x93C\\xbc\\xbai\\x94j\\n\\xe1\\x14\\xb8\\x88{#p\\xfe\\xa0;\\xe1\\xb1\\x85\\xad\\x06\\xafFf\\x12\\xf1\\x15*\\x891\\x16\\xcc\\x01K\\\\\\xc1\\xea\\xfcv\"Tt\\xec\\x07=\\xd6H,\\x90/$\\x8e\\xddF\\x1f\\xa4\\x92\\xcf\\x1c\\x14\\xf2D:\\x89\\xf1A\\xee\\xfc\\xd3\\xdf\\xff\\xed\\x93\\x0f>U\\x8a\\xb5\\xbbB\\xa5\\xc7\\x16\\xb2\\xef{\\xb5\\n\\x93xK\\xe9s\\x84C\\x9ex1}|z\\xf8\\x08D\\x0bV\\xc4\\xdd\\xcdl\\\\-\\xa5\\x99\\xd21yo\\x0e\\xb2O\\xb1\\xa3]\\x9e\\xdd\\xb0\\x1a\\xdcg\\x07\\x8fLZ\\x07\\xf4\\x8d\\x1a\\x1b\\xa4R \\x18\\xf5\\x07\"\\xa5\\xa33\\x9f\\x17q\\xd8\\xbd*\\xa5nG,\\x97\\x88\\x15\\x0e\\xab\\'\\x12\\x88\\x9d\\x1c\\x9e}\\xf2\\xbb?F\\x83q\\xc9\\xb6L\\xab\\xd4\\xe7S\\x05\\xa9H2?\\xbd`1\\x98\\x0f\\xb2E\\x93n\\x7f~zne\\x91\\x97\\x8a\\xc5\\xa3\\xa1`\\xd0\\xebID\\xc2\\xf10\\xa2\\x94JF\\x87\\x87\\xd8=\\xddt\\x12\\x95N\\xa4W\\x98U\\xfa\\x98\\'\\x04:R\\xbeH)\\x9eCl\\xde}\\xa8\\xf0}_\\xc4\\x9fI\\x86\\x8ba_\\xdaf\\xf4I\\xb64\\n\\xb1\\xc1g\\x8f\\x85\\xdc\\xa9\\xb9\\x89\\x8d\\xa9{\\xcb\\x1eK\\xc2\\xbc\\x170i\\xfcvC8\\xee/\\xfa-\\xe1\\x847\\x15\\xb4\"R\\xbe\\xc2\\xa4\\xb4\\x06m\\xe1|\\xe4 \\x85\\xe4\\x1dF\\xafFb\\x00\\xb3N\\xb3\\x1f\\x8e\\x04\\xe7\\x0c\\xfb\\x92\\xf1P\\xf6A\\xee\\xe2\\xe3\\x87\\x7f\\xca\\xc5\\x8e\\xf8\\xab\\xb3\\x0fD\\x94]\\xf0\\xe7W@\\x07\\xfcg\\x85\\x8f\\x85\\x07jxn\\xe5\\xfe\\xc0\\xa8`I\\xa4W\\xd9 /\\xdcVD+\\xb7n\\xf0v\\xf8KR\\xa3\\xda\\xad\\x91Z\\xc7\\x06\\x17\\xef\\xf4Nk$N\\xd9\\x96y{M+\\xde\\xd0{\\xcc\\xf1U\\xcb\\xb3kC\\xac\\x11\\x1c\\x8a\\xdc\\xc9\\xe0\\xac\\xcf\\xef\\x00Sw\\x17a{\\xd3#\\xbc\\x0e:\\xbb\\x85\\xd2C\\xc5\\xb6\\xc2\\x88pE\\xb2\\xbe\\xb0=w\\x7f\\x052\\xb1\\x95\\xdaC\\'\\xb44\\x11\\x98M\\x04F\\x05\\xa9\\x96\\xf0\"d4\\x11.\\xf1\\xb5\\xf8k\\xa0\\xf3\\x02h\\x12\\xbe\\x0c\\xe5\\x07\\xa8x\\xf4\\r\\x14\\x1c\\x8a\\x88C\\xe3_\\t\\xfc\\x1e}\\x19b\\x1d\\x85P\\xdb\\x08\\xdc,u\\xd3\\'\\xd6Qo:7\\xfd\\x1bH\\xf5\\xf0\\xf3\\xb1\\x91\\x8e\\xa1\\xd2\\xb1\\x8d\\xd7\\x90_\\x8c\\x0c\\x1c\\x05h\\xc2\\x91^\\x84\\x81\\xa3\\xd21\\xf0F[\\xaei\\xfe!B\\xa7\\x99\\x8emy\\x15\\xcc&\\\\S\\x13\\x8e^A\\xa8\\xfb\\xd1\\x05Xx\\xce\\xab]\\xfc\\xe8\\xe5\\xb9\\x94\\x1b5e0h\\x1c\\x06\\x8d}9^[\\xc0\\xbd\\x14\\xf1\\x84:\"X~n\\xf0\\xda#,\\x02\\x83\\x84Z2\\xb1\\x9e\\x04\\xa6 \\x92\\xea\\x1bI\\rd\\x88\\x94z2\\x03\\xd3\\xc8\\xc4\\x90\\x99\\x18\\xe2\\xab\\xc031?\\x83\\x81%2\\xb04\\x06\\x96\\xc1\\xc02\\x7fadb\\x9b\\x98/\\xba\\x80\\x8c\\xb8\\x01\\x8c\\xfc\\xfc\\xfd\\xdf$\\xc2O\\'\\x7f!5\\x9e\\xdf\\xc2\\xa0\\x08e\\x11/E,\\x9aX\\x16\\xf1R,\\x8b@\\xe3\\xb0(\\x00{\\x13\\xcb:j\\xf1e\\x11\\xf5\\xc4\\xb2\\x88\\x06\\x80L\\xc6\\x00\\x94\\xc6\\x06\\xc8\\x05\\xd0\\xf1\\n\\xe8\\r\\xa4\\x17\\xf9\\xe9\\x99\\xeb|\\x81\\x04\\xf9e\\x91\\n\"\\x80\\xff\\x03\\x99\\xa0+\\x94\\xbd\\xf0X\\xa1\\x00\\x00\\x00\\x00IEND\\xaeB`\\x82'}\n" ] } ], "source": [ "data_line = \"train/word_1.png\tGenaxis Theatre\"\n", "data_dir = \"/home/aistudio/work/train_data/ic15_data/\"\n", "\n", "item = __getitem__(data_line, data_dir)\n", "print(item)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "After realizing the return logic of a single piece of data, call `padde.io.Dataloader` to combine the data into a batch. For details, please refer to [build_dataloader](https://github.com/PaddlePaddle/PaddleOCR/blob/95c670faf6cf4551c841764cde43a4f4d9d5e634/ppocr/data/__init__.py#L52).\n", "\n", "\n", "* build model\n", "\n", " The build model is to build the main network structure. The specific details are as described in \"2.3 Code Implementation\". This section will not introduce too much. 
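As a bridge between the description and the real code, the snippet below gives a minimal, self-contained sketch of the backbone -> neck -> head chain (the real model in this chapter uses a MobileNetV3 backbone, an LSTM SequenceEncoder neck and a CTCHead, as the training logs later show); every class name, channel count and shape here is an illustrative placeholder, not the actual PaddleOCR implementation.\n", "\n", "```python\n", "import paddle\n", "import paddle.nn as nn\n", "\n", "class TinyCRNN(nn.Layer):\n", "    # Illustrative only: conv backbone -> BiLSTM neck -> per-step classification head\n", "    def __init__(self, num_classes=37, hidden_size=96):\n", "        super().__init__()\n", "        # Backbone: shrink the feature map height to 1 so each column becomes a time step\n", "        self.backbone = nn.Sequential(\n", "            nn.Conv2D(3, 32, 3, stride=2, padding=1), nn.ReLU(),\n", "            nn.Conv2D(32, 64, 3, stride=2, padding=1), nn.ReLU(),\n", "            nn.AdaptiveAvgPool2D([1, 25]))           # N x 64 x 1 x 25\n", "        # Neck: read the width axis as a sequence with a bidirectional LSTM\n", "        self.neck = nn.LSTM(64, hidden_size, direction='bidirect')\n", "        # Head: scores over the character set (plus the CTC blank) at every time step\n", "        self.head = nn.Linear(2 * hidden_size, num_classes)\n", "\n", "    def forward(self, x):                            # x: N x 3 x 32 x 100\n", "        feat = self.backbone(x)                      # N x 64 x 1 x 25\n", "        feat = feat.squeeze(2).transpose([0, 2, 1])  # N x 25 x 64\n", "        seq, _ = self.neck(feat)                     # N x 25 x 2*hidden_size\n", "        return self.head(seq)                        # N x 25 x num_classes\n", "\n", "model = TinyCRNN()\n", "print(model(paddle.randn([1, 3, 32, 100])).shape)    # [1, 25, 37]\n", "```\n", "\n", "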
For the code of each module, please refer to [modeling](https://github.com/PaddlePaddle/PaddleOCR/tree/release/2.3/ppocr/modeling)\n", "\n", "* build loss\n", " \n", " The loss function of the CRNN model is CTC loss, and the flying paddle integrates the commonly used Loss function. You only need to call the implementation:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "import paddle.nn as nn\n", "class CTCLoss(nn.Layer):\n", " def __init__(self, use_focal_loss=False, **kwargs):\n", " super(CTCLoss, self).__init__()\n", " # blank is a meaningless connector for ctc\n", " self.loss_func = nn.CTCLoss(blank=0, reduction='none')\n", "\n", " def forward(self, predicts, batch):\n", " if isinstance(predicts, (list, tuple)):\n", " predicts = predicts[-1]\n", " # Transpose the prediction results of the head layer of the model, arranged along the channel layer\n", " predicts = predicts.transpose((1, 0, 2)) #[80,1,37]\n", " N, B, _ = predicts.shape\n", " preds_lengths = paddle.to_tensor([N] * B, dtype='int64')\n", " labels = batch[1].astype(\"int32\")\n", " label_lengths = batch[2].astype('int64')\n", " # Calculate the loss function\n", " loss = self.loss_func(predicts, labels, preds_lengths, label_lengths)\n", " loss = loss.mean()\n", " return {'loss': loss}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "* build post process\n", "\n", "The specific details are also introduced in \"2.3 Code Implementation\", and the implementation logic is the same as before." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "* build optim\n", "\n", "The optimizer uses `Adam` and also calls the flying paddle API: `paddle.optimizer.Adam`\n", "\n", "* build metric\n", "\n", "The metric part is used to calculate model indicators. In PaddleOCR's text recognition, the prediction of the entire sentence is judged to be correct. 
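Besides this exact-match accuracy, the training and evaluation logs also report a softer indicator named `norm_edit_dis`, which gives partial credit to predictions that are only a few characters off. The sketch below shows how such a value can be computed with the `python-Levenshtein` package installed earlier; it illustrates the idea only and is not PaddleOCR's exact implementation.\n", "\n", "```python\n", "import Levenshtein\n", "\n", "def norm_edit_dis(preds, labels):\n", "    # 1 - mean(normalized Levenshtein distance); 1.0 means every prediction matches exactly\n", "    total = 0.0\n", "    for pred, target in zip(preds, labels):\n", "        total += Levenshtein.distance(pred, target) / max(len(pred), len(target), 1)\n", "    return 1 - total / len(preds)\n", "\n", "print(norm_edit_dis(['aaa', 'bbb'], ['aaa', 'bbc']))  # 1 - (0 + 1/3) / 2 = 0.8333...\n", "```\n", "\n", "The headline indicator, however, is still the exact-match accuracy. 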
Therefore, the main logic of the accuracy rate calculation is as follows:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "def metric(preds, labels): \n", " correct_num = 0\n", " all_num = 0\n", " norm_edit_dis = 0.0\n", " for (pred), (target) in zip(preds, labels):\n", " pred = pred.replace(\" \", \"\")\n", " target = target.replace(\" \", \"\")\n", " if pred == target:\n", " correct_num += 1\n", " all_num += 1\n", " correct_num += correct_num\n", " all_num += all_num\n", " return {\n", " 'acc': correct_num / all_num,\n", " }" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "acc: {'acc': 0.6}\n" ] } ], "source": [ "preds = [\"aaa\", \"bbb\", \"ccc\", \"123\", \"456\"]\n", "labels = [\"aaa\", \"bbb\", \"ddd\", \"123\", \"444\"]\n", "acc = metric(preds, labels)\n", "print(\"acc:\", acc)\n", "# Among the five prediction results, 3 are completely correct, so the accuracy rate should be 0.6" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Combining the above parts is the complete training process:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "\n", "def main(config, device, logger, vdl_writer):\n", " # init dist environment\n", " if config['Global']['distributed']:\n", " dist.init_parallel_env()\n", "\n", " global_config = config['Global']\n", "\n", " # build dataloader\n", " train_dataloader = build_dataloader(config, 'Train', device, logger)\n", " if len(train_dataloader) == 0:\n", " logger.error(\n", " \"No Images in train dataset, please ensure\\n\" +\n", " \"\\t1. The images num in the train label_file_list should be larger than or equal with batch size.\\n\"\n", " +\n", " \"\\t2. 
The annotation file and path in the configuration file are provided normally.\"\n", " )\n", " return\n", "\n", " if config['Eval']:\n", " valid_dataloader = build_dataloader(config, 'Eval', device, logger)\n", " else:\n", " valid_dataloader = None\n", "\n", " # build post process\n", " post_process_class = build_post_process(config['PostProcess'],\n", " global_config)\n", "\n", " # build model\n", " # for rec algorithm\n", " if hasattr(post_process_class, 'character'):\n", " char_num = len(getattr(post_process_class, 'character'))\n", " if config['Architecture'][\"algorithm\"] in [\"Distillation\",\n", " ]: # distillation model\n", " for key in config['Architecture'][\"Models\"]:\n", " config['Architecture'][\"Models\"][key][\"Head\"][\n", " 'out_channels'] = char_num\n", " else: # base rec model\n", " config['Architecture'][\"Head\"]['out_channels'] = char_num\n", "\n", " model = build_model(config['Architecture'])\n", " if config['Global']['distributed']:\n", " model = paddle.DataParallel(model)\n", "\n", " # build loss\n", " loss_class = build_loss(config['Loss'])\n", "\n", " # build optim\n", " optimizer, lr_scheduler = build_optimizer(\n", " config['Optimizer'],\n", " epochs=config['Global']['epoch_num'],\n", " step_each_epoch=len(train_dataloader),\n", " parameters=model.parameters())\n", "\n", " # build metric\n", " eval_class = build_metric(config['Metric'])\n", " # load pretrain model\n", " pre_best_model_dict = load_model(config, model, optimizer)\n", " logger.info('train dataloader has {} iters'.format(len(train_dataloader)))\n", " if valid_dataloader is not None:\n", " logger.info('valid dataloader has {} iters'.format(\n", " len(valid_dataloader)))\n", "\n", " use_amp = config[\"Global\"].get(\"use_amp\", False)\n", " if use_amp:\n", " AMP_RELATED_FLAGS_SETTING = {\n", " 'FLAGS_cudnn_batchnorm_spatial_persistent': 1,\n", " 'FLAGS_max_inplace_grad_add': 8,\n", " }\n", " paddle.fluid.set_flags(AMP_RELATED_FLAGS_SETTING)\n", " scale_loss = config[\"Global\"].get(\"scale_loss\", 1.0)\n", " use_dynamic_loss_scaling = config[\"Global\"].get(\n", " \"use_dynamic_loss_scaling\", False)\n", " scaler = paddle.amp.GradScaler(\n", " init_loss_scaling=scale_loss,\n", " use_dynamic_loss_scaling=use_dynamic_loss_scaling)\n", " else:\n", " scaler = None\n", "\n", " # start train\n", " program.train(config, train_dataloader, valid_dataloader, device, model,\n", " loss_class, optimizer, lr_scheduler, post_process_class,\n", " eval_class, pre_best_model_dict, logger, vdl_writer, scaler)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 4. 
Complete training task\n", "\n", "### 4.1 Start training\n", "\n", "PaddleOCR recognition task is similar to detection task, which transmits parameters through configuration files.\n", "\n", "To perform a complete model training, you first need to download the entire project and install related dependencies:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple\n", "Requirement already satisfied: shapely in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from -r requirements.txt (line 1)) (1.8.0)\n", "Collecting scikit-image==0.17.2\n", " Downloading https://pypi.tuna.tsinghua.edu.cn/packages/d7/ee/753ea56fda5bc2a5516a1becb631bf5ada593a2dd44f21971a13a762d4db/scikit_image-0.17.2-cp37-cp37m-manylinux1_x86_64.whl (12.5 MB)\n", " |████████████████████████████████| 12.5 MB 8.4 MB/s \n", "\u001b[?25hRequirement already satisfied: imgaug==0.4.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from -r requirements.txt (line 3)) (0.4.0)\n", "Requirement already satisfied: pyclipper in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from -r requirements.txt (line 4)) (1.3.0.post2)\n", "Requirement already satisfied: lmdb in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from -r requirements.txt (line 5)) (1.2.1)\n", "Requirement already satisfied: tqdm in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from -r requirements.txt (line 6)) (4.36.1)\n", "Requirement already satisfied: numpy in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from -r requirements.txt (line 7)) (1.20.3)\n", "Requirement already satisfied: visualdl in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from -r requirements.txt (line 8)) (2.2.0)\n", "Requirement already satisfied: python-Levenshtein in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from -r requirements.txt (line 9)) (0.12.2)\n", "Requirement already satisfied: opencv-contrib-python==4.4.0.46 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from -r requirements.txt (line 10)) (4.4.0.46)\n", "Requirement already satisfied: lxml in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from -r requirements.txt (line 11)) (4.7.1)\n", "Requirement already satisfied: premailer in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from -r requirements.txt (line 12)) (3.10.0)\n", "Requirement already satisfied: openpyxl in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from -r requirements.txt (line 13)) (3.0.5)\n", "Requirement already satisfied: imageio>=2.3.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from scikit-image==0.17.2->-r requirements.txt (line 2)) (2.6.1)\n", "Requirement already satisfied: matplotlib!=3.0.0,>=2.0.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from scikit-image==0.17.2->-r requirements.txt (line 2)) (2.2.3)\n", "Requirement already satisfied: tifffile>=2019.7.26 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from scikit-image==0.17.2->-r requirements.txt (line 2)) (2021.11.2)\n", "Requirement already satisfied: PyWavelets>=1.1.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from scikit-image==0.17.2->-r 
requirements.txt (line 2)) (1.2.0)\n", "Requirement already satisfied: pillow!=7.1.0,!=7.1.1,>=4.3.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from scikit-image==0.17.2->-r requirements.txt (line 2)) (7.1.2)\n", "Requirement already satisfied: networkx>=2.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from scikit-image==0.17.2->-r requirements.txt (line 2)) (2.4)\n", "Requirement already satisfied: scipy>=1.0.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from scikit-image==0.17.2->-r requirements.txt (line 2)) (1.6.3)\n", "Requirement already satisfied: six in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from imgaug==0.4.0->-r requirements.txt (line 3)) (1.15.0)\n", "Requirement already satisfied: opencv-python in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from imgaug==0.4.0->-r requirements.txt (line 3)) (4.1.1.26)\n", "Requirement already satisfied: flask>=1.1.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->-r requirements.txt (line 8)) (1.1.1)\n", "Requirement already satisfied: requests in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->-r requirements.txt (line 8)) (2.22.0)\n", "Requirement already satisfied: pre-commit in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->-r requirements.txt (line 8)) (1.21.0)\n", "Requirement already satisfied: shellcheck-py in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->-r requirements.txt (line 8)) (0.7.1.1)\n", "Requirement already satisfied: pandas in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->-r requirements.txt (line 8)) (1.1.5)\n", "Requirement already satisfied: Flask-Babel>=1.0.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->-r requirements.txt (line 8)) (1.0.0)\n", "Requirement already satisfied: bce-python-sdk in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->-r requirements.txt (line 8)) (0.8.53)\n", "Requirement already satisfied: flake8>=3.7.9 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->-r requirements.txt (line 8)) (3.8.2)\n", "Requirement already satisfied: protobuf>=3.11.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->-r requirements.txt (line 8)) (3.14.0)\n", "Requirement already satisfied: setuptools in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from python-Levenshtein->-r requirements.txt (line 9)) (56.2.0)\n", "Requirement already satisfied: cssutils in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from premailer->-r requirements.txt (line 12)) (2.3.0)\n", "Requirement already satisfied: cachetools in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from premailer->-r requirements.txt (line 12)) (4.0.0)\n", "Requirement already satisfied: cssselect in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from premailer->-r requirements.txt (line 12)) (1.1.0)\n", "Requirement already satisfied: jdcal in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from openpyxl->-r requirements.txt (line 13)) (1.4.1)\n", "Requirement already satisfied: et-xmlfile in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from openpyxl->-r requirements.txt (line 13)) 
(1.0.1)\n", "Requirement already satisfied: pycodestyle<2.7.0,>=2.6.0a1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flake8>=3.7.9->visualdl->-r requirements.txt (line 8)) (2.6.0)\n", "Requirement already satisfied: pyflakes<2.3.0,>=2.2.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flake8>=3.7.9->visualdl->-r requirements.txt (line 8)) (2.2.0)\n", "Requirement already satisfied: importlib-metadata in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flake8>=3.7.9->visualdl->-r requirements.txt (line 8)) (0.23)\n", "Requirement already satisfied: mccabe<0.7.0,>=0.6.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flake8>=3.7.9->visualdl->-r requirements.txt (line 8)) (0.6.1)\n", "Requirement already satisfied: click>=5.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flask>=1.1.1->visualdl->-r requirements.txt (line 8)) (7.0)\n", "Requirement already satisfied: Jinja2>=2.10.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flask>=1.1.1->visualdl->-r requirements.txt (line 8)) (2.11.0)\n", "Requirement already satisfied: Werkzeug>=0.15 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flask>=1.1.1->visualdl->-r requirements.txt (line 8)) (0.16.0)\n", "Requirement already satisfied: itsdangerous>=0.24 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flask>=1.1.1->visualdl->-r requirements.txt (line 8)) (1.1.0)\n", "Requirement already satisfied: pytz in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from Flask-Babel>=1.0.0->visualdl->-r requirements.txt (line 8)) (2019.3)\n", "Requirement already satisfied: Babel>=2.3 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from Flask-Babel>=1.0.0->visualdl->-r requirements.txt (line 8)) (2.8.0)\n", "Requirement already satisfied: cycler>=0.10 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image==0.17.2->-r requirements.txt (line 2)) (0.10.0)\n", "Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image==0.17.2->-r requirements.txt (line 2)) (2.4.2)\n", "Requirement already satisfied: python-dateutil>=2.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image==0.17.2->-r requirements.txt (line 2)) (2.8.0)\n", "Requirement already satisfied: kiwisolver>=1.0.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image==0.17.2->-r requirements.txt (line 2)) (1.1.0)\n", "Requirement already satisfied: decorator>=4.3.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from networkx>=2.0->scikit-image==0.17.2->-r requirements.txt (line 2)) (4.4.2)\n", "Requirement already satisfied: pycryptodome>=3.8.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from bce-python-sdk->visualdl->-r requirements.txt (line 8)) (3.9.9)\n", "Requirement already satisfied: future>=0.6.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from bce-python-sdk->visualdl->-r requirements.txt (line 8)) (0.18.0)\n", "Requirement already satisfied: cfgv>=2.0.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from 
pre-commit->visualdl->-r requirements.txt (line 8)) (2.0.1)\n", "Requirement already satisfied: virtualenv>=15.2 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->-r requirements.txt (line 8)) (16.7.9)\n", "Requirement already satisfied: nodeenv>=0.11.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->-r requirements.txt (line 8)) (1.3.4)\n", "Requirement already satisfied: toml in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->-r requirements.txt (line 8)) (0.10.0)\n", "Requirement already satisfied: pyyaml in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->-r requirements.txt (line 8)) (5.1.2)\n", "Requirement already satisfied: aspy.yaml in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->-r requirements.txt (line 8)) (1.3.0)\n", "Requirement already satisfied: identify>=1.0.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->-r requirements.txt (line 8)) (1.4.10)\n", "Requirement already satisfied: idna<2.9,>=2.5 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests->visualdl->-r requirements.txt (line 8)) (2.8)\n", "Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests->visualdl->-r requirements.txt (line 8)) (3.0.4)\n", "Requirement already satisfied: certifi>=2017.4.17 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests->visualdl->-r requirements.txt (line 8)) (2019.9.11)\n", "Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests->visualdl->-r requirements.txt (line 8)) (1.25.6)\n", "Requirement already satisfied: MarkupSafe>=0.23 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from Jinja2>=2.10.1->flask>=1.1.1->visualdl->-r requirements.txt (line 8)) (1.1.1)\n", "Requirement already satisfied: zipp>=0.5 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from importlib-metadata->flake8>=3.7.9->visualdl->-r requirements.txt (line 8)) (3.6.0)\n", "Installing collected packages: scikit-image\n", " Attempting uninstall: scikit-image\n", " Found existing installation: scikit-image 0.19.1\n", " Uninstalling scikit-image-0.19.1:\n", " Successfully uninstalled scikit-image-0.19.1\n", "Successfully installed scikit-image-0.17.2\n" ] } ], "source": [ "# Clone PaddleOCR code\n", "#!git clone https://gitee.com/paddlepaddle/PaddleOCR\n", "# Modify the default directory where the code runs to /home/aistudio/PaddleOCR\n", "import os\n", "os.chdir(\"/home/aistudio/PaddleOCR\")\n", "# Install PaddleOCR third-party dependencies\n", "!pip install -r requirements.txt" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Create a soft link and place the training data under the PaddleOCR project:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "!ln -s /home/aistudio/work/train_data/ /home/aistudio/PaddleOCR/" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Download the pre-trained model:\n", "\n", "In order to speed up the convergence speed, it is recommended to download the trained model and perform finetune 
on the icdar2015 data" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "--2021-12-22 15:39:39-- https://paddleocr.bj.bcebos.com/dygraph_v2.0/en/rec_mv3_none_bilstm_ctc_v2.0_train.tar\n", "Resolving paddleocr.bj.bcebos.com (paddleocr.bj.bcebos.com)... 182.61.200.195, 182.61.200.229, 2409:8c04:1001:1002:0:ff:b001:368a\n", "Connecting to paddleocr.bj.bcebos.com (paddleocr.bj.bcebos.com)|182.61.200.195|:443... connected.\n", "HTTP request sent, awaiting response... 200 OK\n", "Length: 51200000 (49M) [application/x-tar]\n", "Saving to: ‘./pretrain_models/rec_mv3_none_bilstm_ctc_v2.0_train.tar’\n", "\n", "rec_mv3_none_bilstm 100%[===================>] 48.83M 15.5MB/s in 3.6s \n", "\n", "2021-12-22 15:39:42 (13.7 MB/s) - ‘./pretrain_models/rec_mv3_none_bilstm_ctc_v2.0_train.tar’ saved [51200000/51200000]\n", "\n" ] } ], "source": [ "!cd PaddleOCR/\n", "# Download the pre-trained model of MobileNetV3\n", "!wget -nc -P ./pretrain_models/ https://paddleocr.bj.bcebos.com/dygraph_v2.0/en/rec_mv3_none_bilstm_ctc_v2.0_train.tar\n", "# Decompress model parameters\n", "!tar -xf pretrain_models/rec_mv3_none_bilstm_ctc_v2.0_train.tar && rm -rf pretrain_models/rec_mv3_none_bilstm_ctc_v2.0_train.tar" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Starting the training command is very simple, just specify the configuration file. In addition, in the command line, you can use `-o` to modify the parameter values in the configuration file. Start the training command as shown below\n", "\n", "in:\n", "\n", "* `Global.pretrained_model`: Loaded pretrained model path\n", "* `Global.character_dict_path`: dictionary path (only 26 lowercase letters + numbers are supported here)\n", "* `Global.eval_batch_step`: evaluation frequency\n", "* `Global.epoch_num`: total number of training rounds\n", "\n" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/skimage/morphology/_skeletonize.py:241: DeprecationWarning: `np.bool` is a deprecated alias for the builtin `bool`. To silence this warning, use `bool` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.bool_` here.\n", "Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations\n", " 0, 1, 1, 0, 0, 1, 0, 0, 0], dtype=np.bool)\n", "/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/skimage/morphology/_skeletonize.py:256: DeprecationWarning: `np.bool` is a deprecated alias for the builtin `bool`. To silence this warning, use `bool` by itself. Doing this will not modify any behavior and is safe. 
If you specifically wanted the numpy scalar type, use `np.bool_` here.\n", "Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations\n", " 0, 0, 0, 0, 0, 0, 0, 0, 0], dtype=np.bool)\n", "[2021/12/23 20:28:15] root INFO: Architecture : \n", "[2021/12/23 20:28:15] root INFO: Backbone : \n", "[2021/12/23 20:28:15] root INFO: model_name : large\n", "[2021/12/23 20:28:15] root INFO: name : MobileNetV3\n", "[2021/12/23 20:28:15] root INFO: scale : 0.5\n", "[2021/12/23 20:28:15] root INFO: Head : \n", "[2021/12/23 20:28:15] root INFO: fc_decay : 0\n", "[2021/12/23 20:28:15] root INFO: name : CTCHead\n", "[2021/12/23 20:28:15] root INFO: Neck : \n", "[2021/12/23 20:28:15] root INFO: encoder_type : rnn\n", "[2021/12/23 20:28:15] root INFO: hidden_size : 96\n", "[2021/12/23 20:28:15] root INFO: name : SequenceEncoder\n", "[2021/12/23 20:28:15] root INFO: Transform : None\n", "[2021/12/23 20:28:15] root INFO: algorithm : CRNN\n", "[2021/12/23 20:28:15] root INFO: model_type : rec\n", "[2021/12/23 20:28:15] root INFO: Eval : \n", "[2021/12/23 20:28:15] root INFO: dataset : \n", "[2021/12/23 20:28:15] root INFO: data_dir : ./train_data/ic15_data\n", "[2021/12/23 20:28:15] root INFO: label_file_list : ['./train_data/ic15_data/rec_gt_test.txt']\n", "[2021/12/23 20:28:15] root INFO: name : SimpleDataSet\n", "[2021/12/23 20:28:15] root INFO: transforms : \n", "[2021/12/23 20:28:15] root INFO: DecodeImage : \n", "[2021/12/23 20:28:15] root INFO: channel_first : False\n", "[2021/12/23 20:28:15] root INFO: img_mode : BGR\n", "[2021/12/23 20:28:15] root INFO: CTCLabelEncode : None\n", "[2021/12/23 20:28:15] root INFO: RecResizeImg : \n", "[2021/12/23 20:28:15] root INFO: image_shape : [3, 32, 100]\n", "[2021/12/23 20:28:15] root INFO: KeepKeys : \n", "[2021/12/23 20:28:15] root INFO: keep_keys : ['image', 'label', 'length']\n", "[2021/12/23 20:28:15] root INFO: loader : \n", "[2021/12/23 20:28:15] root INFO: batch_size_per_card : 256\n", "[2021/12/23 20:28:15] root INFO: drop_last : False\n", "[2021/12/23 20:28:15] root INFO: num_workers : 4\n", "[2021/12/23 20:28:15] root INFO: shuffle : False\n", "[2021/12/23 20:28:15] root INFO: use_shared_memory : False\n", "[2021/12/23 20:28:15] root INFO: Global : \n", "[2021/12/23 20:28:15] root INFO: cal_metric_during_train : True\n", "[2021/12/23 20:28:15] root INFO: character_dict_path : ppocr/utils/ic15_dict.txt\n", "[2021/12/23 20:28:15] root INFO: character_type : EN\n", "[2021/12/23 20:28:15] root INFO: checkpoints : None\n", "[2021/12/23 20:28:15] root INFO: debug : False\n", "[2021/12/23 20:28:15] root INFO: distributed : False\n", "[2021/12/23 20:28:15] root INFO: epoch_num : 40\n", "[2021/12/23 20:28:15] root INFO: eval_batch_step : [0, 200]\n", "[2021/12/23 20:28:15] root INFO: infer_img : doc/imgs_words_en/word_19.png\n", "[2021/12/23 20:28:15] root INFO: infer_mode : False\n", "[2021/12/23 20:28:15] root INFO: log_smooth_window : 20\n", "[2021/12/23 20:28:15] root INFO: max_text_length : 25\n", "[2021/12/23 20:28:15] root INFO: pretrained_model : rec_mv3_none_bilstm_ctc_v2.0_train/best_accuracy\n", "[2021/12/23 20:28:15] root INFO: print_batch_step : 10\n", "[2021/12/23 20:28:15] root INFO: save_epoch_step : 3\n", "[2021/12/23 20:28:15] root INFO: save_inference_dir : ./\n", "[2021/12/23 20:28:15] root INFO: save_model_dir : ./output/rec/ic15/\n", "[2021/12/23 20:28:15] root INFO: save_res_path : ./output/rec/predicts_ic15.txt\n", "[2021/12/23 20:28:15] root INFO: use_gpu : True\n", 
"[2021/12/23 20:28:15] root INFO: use_space_char : False\n", "[2021/12/23 20:28:15] root INFO: use_visualdl : False\n", "[2021/12/23 20:28:15] root INFO: Loss : \n", "[2021/12/23 20:28:15] root INFO: name : CTCLoss\n", "[2021/12/23 20:28:15] root INFO: Metric : \n", "[2021/12/23 20:28:15] root INFO: main_indicator : acc\n", "[2021/12/23 20:28:15] root INFO: name : RecMetric\n", "[2021/12/23 20:28:15] root INFO: Optimizer : \n", "[2021/12/23 20:28:15] root INFO: beta1 : 0.9\n", "[2021/12/23 20:28:15] root INFO: beta2 : 0.999\n", "[2021/12/23 20:28:15] root INFO: lr : \n", "[2021/12/23 20:28:15] root INFO: learning_rate : 0.0005\n", "[2021/12/23 20:28:15] root INFO: name : Adam\n", "[2021/12/23 20:28:15] root INFO: regularizer : \n", "[2021/12/23 20:28:15] root INFO: factor : 0\n", "[2021/12/23 20:28:15] root INFO: name : L2\n", "[2021/12/23 20:28:15] root INFO: PostProcess : \n", "[2021/12/23 20:28:15] root INFO: name : CTCLabelDecode\n", "[2021/12/23 20:28:15] root INFO: Train : \n", "[2021/12/23 20:28:15] root INFO: dataset : \n", "[2021/12/23 20:28:15] root INFO: data_dir : ./train_data/ic15_data/\n", "[2021/12/23 20:28:15] root INFO: label_file_list : ['./train_data/ic15_data/rec_gt_train.txt']\n", "[2021/12/23 20:28:15] root INFO: name : SimpleDataSet\n", "[2021/12/23 20:28:15] root INFO: transforms : \n", "[2021/12/23 20:28:15] root INFO: DecodeImage : \n", "[2021/12/23 20:28:15] root INFO: channel_first : False\n", "[2021/12/23 20:28:15] root INFO: img_mode : BGR\n", "[2021/12/23 20:28:15] root INFO: CTCLabelEncode : None\n", "[2021/12/23 20:28:15] root INFO: RecResizeImg : \n", "[2021/12/23 20:28:15] root INFO: image_shape : [3, 32, 100]\n", "[2021/12/23 20:28:15] root INFO: KeepKeys : \n", "[2021/12/23 20:28:15] root INFO: keep_keys : ['image', 'label', 'length']\n", "[2021/12/23 20:28:15] root INFO: loader : \n", "[2021/12/23 20:28:15] root INFO: batch_size_per_card : 256\n", "[2021/12/23 20:28:15] root INFO: drop_last : True\n", "[2021/12/23 20:28:15] root INFO: num_workers : 8\n", "[2021/12/23 20:28:15] root INFO: shuffle : True\n", "[2021/12/23 20:28:15] root INFO: use_shared_memory : False\n", "[2021/12/23 20:28:15] root INFO: train with paddle 2.1.2 and device CUDAPlace(0)\n", "[2021/12/23 20:28:15] root INFO: Initialize indexs of datasets:['./train_data/ic15_data/rec_gt_train.txt']\n", "[2021/12/23 20:28:15] root INFO: Initialize indexs of datasets:['./train_data/ic15_data/rec_gt_test.txt']\n", "W1223 20:28:15.851713 306 device_context.cc:404] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 11.0, Runtime API Version: 10.1\n", "W1223 20:28:15.857080 306 device_context.cc:422] device: 0, cuDNN Version: 7.6.\n", "[2021/12/23 20:28:19] root INFO: loaded pretrained_model successful from rec_mv3_none_bilstm_ctc_v2.0_train/best_accuracy.pdparams\n", "[2021/12/23 20:28:19] root INFO: train dataloader has 17 iters\n", "[2021/12/23 20:28:19] root INFO: valid dataloader has 9 iters\n", "[2021/12/23 20:28:19] root INFO: During the training process, after the 0th iteration, an evaluation is run every 200 iterations\n", "[2021/12/23 20:28:19] root INFO: Initialize indexs of datasets:['./train_data/ic15_data/rec_gt_train.txt']\n", "[2021/12/23 20:28:23] root INFO: epoch: [1/40], iter: 10, lr: 0.000500, loss: 9.336592, acc: 0.203125, norm_edit_dis: 0.674909, reader_cost: 0.27284 s, batch_cost: 0.40185 s, samples: 2816, ips: 700.75290\n", "[2021/12/23 20:28:24] root INFO: epoch: [1/40], iter: 16, lr: 0.000500, loss: 6.955496, acc: 0.210938, norm_edit_dis: 0.678930, 
reader_cost: 0.00008 s, batch_cost: 0.05430 s, samples: 1536, ips: 2828.80514\n", "[2021/12/23 20:28:24] root INFO: save model in ./output/rec/ic15/latest\n", "[2021/12/23 20:28:24] root INFO: Initialize indexs of datasets:['./train_data/ic15_data/rec_gt_train.txt']\n", "[2021/12/23 20:28:28] root INFO: epoch: [2/40], iter: 20, lr: 0.000500, loss: 6.402417, acc: 0.246094, norm_edit_dis: 0.695874, reader_cost: 0.24180 s, batch_cost: 0.34361 s, samples: 1024, ips: 298.00945\n", "[2021/12/23 20:28:29] root INFO: epoch: [2/40], iter: 30, lr: 0.000500, loss: 4.007382, acc: 0.412109, norm_edit_dis: 0.743064, reader_cost: 0.00013 s, batch_cost: 0.08982 s, samples: 2560, ips: 2849.98954\n", "[2021/12/23 20:28:29] root INFO: epoch: [2/40], iter: 33, lr: 0.000500, loss: 3.906031, acc: 0.458984, norm_edit_dis: 0.770415, reader_cost: 0.00004 s, batch_cost: 0.02684 s, samples: 768, ips: 2861.80304\n", "^C\n", "main proc 306 exit, kill process group 306\n" ] } ], "source": [ "!python3 tools/train.py -c configs/rec/rec_icdar15_train.yml \\\n", " -o Global.pretrained_model=rec_mv3_none_bilstm_ctc_v2.0_train/best_accuracy \\\n", " Global.character_dict_path=ppocr/utils/ic15_dict.txt \\\n", " Global.eval_batch_step=[0,200] \\\n", " Global.epoch_num=40" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "According to the `save_model_dir` field set in the configuration file, the following parameters will be saved:\n", "\n", "```\n", "output/rec/ic15\n", "├── best_accuracy.pdopt \n", "├── best_accuracy.pdparams \n", "├── best_accuracy.states \n", "├── config.yml \n", "├── iter_epoch_3.pdopt \n", "├── iter_epoch_3.pdparams \n", "├── iter_epoch_3.states \n", "├── latest.pdopt \n", "├── latest.pdparams \n", "├── latest.states \n", "└── train.log\n", "```\n", "Among them, `best_accuracy.*` is the best model on the evaluation set; `iter_epoch_x.*` is the model saved at intervals of `save_epoch_step`; `latest.*` is the model of the last epoch.\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Summarize:**\n", "\n", "If you need to train your own data, you need to modify:\n", "\n", "1. Training and evaluation data path (required)\n", "2. Dictionary path (required)\n", "3. Pre-trained model (optional)\n", "4. 
Learning rate, image shape, network structure (optional)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 4.2 Model Evaluation\n", "\n", "\n", "The evaluation data set can be modified by `configs/rec/rec_icdar15_train.yml` to modify the `label_file_path` setting in Eval.\n", "\n", "The evaluation set of icdar2015 is used by default here, and the weights of the newly trained model are loaded:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[2021/12/23 14:27:51] root INFO: Architecture : \n", "[2021/12/23 14:27:51] root INFO: Backbone : \n", "[2021/12/23 14:27:51] root INFO: model_name : large\n", "[2021/12/23 14:27:51] root INFO: name : MobileNetV3\n", "[2021/12/23 14:27:51] root INFO: scale : 0.5\n", "[2021/12/23 14:27:51] root INFO: Head : \n", "[2021/12/23 14:27:51] root INFO: fc_decay : 0\n", "[2021/12/23 14:27:51] root INFO: name : CTCHead\n", "[2021/12/23 14:27:51] root INFO: Neck : \n", "[2021/12/23 14:27:51] root INFO: encoder_type : rnn\n", "[2021/12/23 14:27:51] root INFO: hidden_size : 96\n", "[2021/12/23 14:27:51] root INFO: name : SequenceEncoder\n", "[2021/12/23 14:27:51] root INFO: Transform : None\n", "[2021/12/23 14:27:51] root INFO: algorithm : CRNN\n", "[2021/12/23 14:27:51] root INFO: model_type : rec\n", "[2021/12/23 14:27:51] root INFO: Eval : \n", "[2021/12/23 14:27:51] root INFO: dataset : \n", "[2021/12/23 14:27:51] root INFO: data_dir : ./train_data/ic15_data\n", "[2021/12/23 14:27:51] root INFO: label_file_list : ['./train_data/ic15_data/rec_gt_test.txt']\n", "[2021/12/23 14:27:51] root INFO: name : SimpleDataSet\n", "[2021/12/23 14:27:51] root INFO: transforms : \n", "[2021/12/23 14:27:51] root INFO: DecodeImage : \n", "[2021/12/23 14:27:51] root INFO: channel_first : False\n", "[2021/12/23 14:27:51] root INFO: img_mode : BGR\n", "[2021/12/23 14:27:51] root INFO: CTCLabelEncode : None\n", "[2021/12/23 14:27:51] root INFO: RecResizeImg : \n", "[2021/12/23 14:27:51] root INFO: image_shape : [3, 32, 100]\n", "[2021/12/23 14:27:51] root INFO: KeepKeys : \n", "[2021/12/23 14:27:51] root INFO: keep_keys : ['image', 'label', 'length']\n", "[2021/12/23 14:27:51] root INFO: loader : \n", "[2021/12/23 14:27:51] root INFO: batch_size_per_card : 256\n", "[2021/12/23 14:27:51] root INFO: drop_last : False\n", "[2021/12/23 14:27:51] root INFO: num_workers : 4\n", "[2021/12/23 14:27:51] root INFO: shuffle : False\n", "[2021/12/23 14:27:51] root INFO: use_shared_memory : False\n", "[2021/12/23 14:27:51] root INFO: Global : \n", "[2021/12/23 14:27:51] root INFO: cal_metric_during_train : True\n", "[2021/12/23 14:27:51] root INFO: character_dict_path : ppocr/utils/ic15_dict.txt\n", "[2021/12/23 14:27:51] root INFO: character_type : EN\n", "[2021/12/23 14:27:51] root INFO: checkpoints : output/rec/ic15/best_accuracy\n", "[2021/12/23 14:27:51] root INFO: debug : False\n", "[2021/12/23 14:27:51] root INFO: distributed : False\n", "[2021/12/23 14:27:51] root INFO: epoch_num : 72\n", "[2021/12/23 14:27:51] root INFO: eval_batch_step : [0, 2000]\n", "[2021/12/23 14:27:51] root INFO: infer_img : doc/imgs_words_en/word_10.png\n", "[2021/12/23 14:27:51] root INFO: infer_mode : False\n", "[2021/12/23 14:27:51] root INFO: log_smooth_window : 20\n", "[2021/12/23 14:27:51] root INFO: max_text_length : 25\n", "[2021/12/23 14:27:51] root INFO: pretrained_model : None\n", "[2021/12/23 14:27:51] root INFO: print_batch_step : 10\n", 
"[2021/12/23 14:27:51] root INFO: save_epoch_step : 3\n", "[2021/12/23 14:27:51] root INFO: save_inference_dir : ./\n", "[2021/12/23 14:27:51] root INFO: save_model_dir : ./output/rec/ic15/\n", "[2021/12/23 14:27:51] root INFO: save_res_path : ./output/rec/predicts_ic15.txt\n", "[2021/12/23 14:27:51] root INFO: use_gpu : True\n", "[2021/12/23 14:27:51] root INFO: use_space_char : False\n", "[2021/12/23 14:27:51] root INFO: use_visualdl : False\n", "[2021/12/23 14:27:51] root INFO: Loss : \n", "[2021/12/23 14:27:51] root INFO: name : CTCLoss\n", "[2021/12/23 14:27:51] root INFO: Metric : \n", "[2021/12/23 14:27:51] root INFO: main_indicator : acc\n", "[2021/12/23 14:27:51] root INFO: name : RecMetric\n", "[2021/12/23 14:27:51] root INFO: Optimizer : \n", "[2021/12/23 14:27:51] root INFO: beta1 : 0.9\n", "[2021/12/23 14:27:51] root INFO: beta2 : 0.999\n", "[2021/12/23 14:27:51] root INFO: lr : \n", "[2021/12/23 14:27:51] root INFO: learning_rate : 0.0005\n", "[2021/12/23 14:27:51] root INFO: name : Adam\n", "[2021/12/23 14:27:51] root INFO: regularizer : \n", "[2021/12/23 14:27:51] root INFO: factor : 0\n", "[2021/12/23 14:27:51] root INFO: name : L2\n", "[2021/12/23 14:27:51] root INFO: PostProcess : \n", "[2021/12/23 14:27:51] root INFO: name : CTCLabelDecode\n", "[2021/12/23 14:27:51] root INFO: Train : \n", "[2021/12/23 14:27:51] root INFO: dataset : \n", "[2021/12/23 14:27:51] root INFO: data_dir : ./train_data/ic15_data/\n", "[2021/12/23 14:27:51] root INFO: label_file_list : ['./train_data/ic15_data/rec_gt_train.txt']\n", "[2021/12/23 14:27:51] root INFO: name : SimpleDataSet\n", "[2021/12/23 14:27:51] root INFO: transforms : \n", "[2021/12/23 14:27:51] root INFO: DecodeImage : \n", "[2021/12/23 14:27:51] root INFO: channel_first : False\n", "[2021/12/23 14:27:51] root INFO: img_mode : BGR\n", "[2021/12/23 14:27:51] root INFO: CTCLabelEncode : None\n", "[2021/12/23 14:27:51] root INFO: RecResizeImg : \n", "[2021/12/23 14:27:51] root INFO: image_shape : [3, 32, 100]\n", "[2021/12/23 14:27:51] root INFO: KeepKeys : \n", "[2021/12/23 14:27:51] root INFO: keep_keys : ['image', 'label', 'length']\n", "[2021/12/23 14:27:51] root INFO: loader : \n", "[2021/12/23 14:27:51] root INFO: batch_size_per_card : 256\n", "[2021/12/23 14:27:51] root INFO: drop_last : True\n", "[2021/12/23 14:27:51] root INFO: num_workers : 8\n", "[2021/12/23 14:27:51] root INFO: shuffle : True\n", "[2021/12/23 14:27:51] root INFO: use_shared_memory : False\n", "[2021/12/23 14:27:51] root INFO: train with paddle 2.1.2 and device CUDAPlace(0)\n", "[2021/12/23 14:27:51] root INFO: Initialize indexs of datasets:['./train_data/ic15_data/rec_gt_test.txt']\n", "W1223 14:27:51.861889 5192 device_context.cc:404] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 10.1, Runtime API Version: 10.1\n", "W1223 14:27:51.865501 5192 device_context.cc:422] device: 0, cuDNN Version: 7.6.\n", "[2021/12/23 14:27:56] root INFO: resume from output/rec/ic15/best_accuracy\n", "[2021/12/23 14:27:56] root INFO: metric in ckpt ***************\n", "[2021/12/23 14:27:56] root INFO: acc:0.48531535869041886\n", "[2021/12/23 14:27:56] root INFO: norm_edit_dis:0.7895228681338454\n", "[2021/12/23 14:27:56] root INFO: fps:3266.1877400927865\n", "[2021/12/23 14:27:56] root INFO: best_epoch:24\n", "[2021/12/23 14:27:56] root INFO: start_epoch:25\n", "eval model:: 100%|████████████████████████████████| 9/9 [00:02<00:00, 3.32it/s]\n", "[2021/12/23 14:27:59] root INFO: metric eval ***************\n", "[2021/12/23 14:27:59] root INFO: 
acc:0.48531535869041886\n", "[2021/12/23 14:27:59] root INFO: norm_edit_dis:0.7895228681338454\n", "[2021/12/23 14:27:59] root INFO: fps:4491.015930181665\n" ] } ], "source": [ "!python tools/eval.py -c configs/rec/rec_icdar15_train.yml -o Global.checkpoints=output/rec/ic15/best_accuracy \\\n", " Global.character_dict_path=ppocr/utils/ic15_dict.txt\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "After the evaluation, you can see the accuracy of the trained model on the validation set.\n", "\n", "PaddleOCR supports alternating training and evaluation. You can adjust how often evaluation runs via `eval_batch_step` in `configs/rec/rec_icdar15_train.yml`; by default, evaluation runs every 2000 iterations. During evaluation, the model with the best accuracy so far is saved as `output/rec/ic15/best_accuracy` by default.\n", "\n", "If the validation set is large, evaluation will be time-consuming. In that case it is recommended to lower the evaluation frequency, or to evaluate only after training finishes." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 4.3 Prediction\n", "\n", "With the model trained by PaddleOCR, you can use the following script to make quick predictions.\n", "\n", "Prediction image:\n", "![](https://raw.githubusercontent.com/PaddlePaddle/PaddleOCR/release/2.3/doc/imgs_words_en/word_19.png)\n", "\n", "The default prediction image is specified by `infer_img`, and the trained weights are loaded through `-o Global.checkpoints`:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[2021/12/23 14:29:19] root INFO: Architecture : \n", "[2021/12/23 14:29:19] root INFO: Backbone : \n", "[2021/12/23 14:29:19] root INFO: model_name : large\n", "[2021/12/23 14:29:19] root INFO: name : MobileNetV3\n", "[2021/12/23 14:29:19] root INFO: scale : 0.5\n", "[2021/12/23 14:29:19] root INFO: Head : \n", "[2021/12/23 14:29:19] root INFO: fc_decay : 0\n", "[2021/12/23 14:29:19] root INFO: name : CTCHead\n", "[2021/12/23 14:29:19] root INFO: Neck : \n", "[2021/12/23 14:29:19] root INFO: encoder_type : rnn\n", "[2021/12/23 14:29:19] root INFO: hidden_size : 96\n", "[2021/12/23 14:29:19] root INFO: name : SequenceEncoder\n", "[2021/12/23 14:29:19] root INFO: Transform : None\n", "[2021/12/23 14:29:19] root INFO: algorithm : CRNN\n", "[2021/12/23 14:29:19] root INFO: model_type : rec\n", "[2021/12/23 14:29:19] root INFO: Eval : \n", "[2021/12/23 14:29:19] root INFO: dataset : \n", "[2021/12/23 14:29:19] root INFO: data_dir : ./train_data/ic15_data\n", "[2021/12/23 14:29:19] root INFO: label_file_list : ['./train_data/ic15_data/rec_gt_test.txt']\n", "[2021/12/23 14:29:19] root INFO: name : SimpleDataSet\n", "[2021/12/23 14:29:19] root INFO: transforms : \n", "[2021/12/23 14:29:19] root INFO: DecodeImage : \n", "[2021/12/23 14:29:19] root INFO: channel_first : False\n", "[2021/12/23 14:29:19] root INFO: img_mode : BGR\n", "[2021/12/23 14:29:19] root INFO: CTCLabelEncode : None\n", "[2021/12/23 14:29:19] root INFO: RecResizeImg : \n", "[2021/12/23 14:29:19] root INFO: image_shape : [3, 32, 100]\n", "[2021/12/23 14:29:19] root INFO: KeepKeys : \n", "[2021/12/23 14:29:19] root INFO: keep_keys : ['image', 'label', 'length']\n", "[2021/12/23 14:29:19] root INFO: loader : \n", "[2021/12/23 14:29:19] root INFO: batch_size_per_card : 256\n", "[2021/12/23 14:29:19] root INFO: drop_last : False\n", "[2021/12/23 14:29:19] root INFO: num_workers : 4\n", 
"[2021/12/23 14:29:19] root INFO: shuffle : False\n", "[2021/12/23 14:29:19] root INFO: use_shared_memory : False\n", "[2021/12/23 14:29:19] root INFO: Global : \n", "[2021/12/23 14:29:19] root INFO: cal_metric_during_train : True\n", "[2021/12/23 14:29:19] root INFO: character_dict_path : ppocr/utils/ic15_dict.txt\n", "[2021/12/23 14:29:19] root INFO: character_type : EN\n", "[2021/12/23 14:29:19] root INFO: checkpoints : output/rec/ic15/best_accuracy\n", "[2021/12/23 14:29:19] root INFO: debug : False\n", "[2021/12/23 14:29:19] root INFO: distributed : False\n", "[2021/12/23 14:29:19] root INFO: epoch_num : 72\n", "[2021/12/23 14:29:19] root INFO: eval_batch_step : [0, 2000]\n", "[2021/12/23 14:29:19] root INFO: infer_img : doc/imgs_words_en/word_19.png\n", "[2021/12/23 14:29:19] root INFO: infer_mode : False\n", "[2021/12/23 14:29:19] root INFO: log_smooth_window : 20\n", "[2021/12/23 14:29:19] root INFO: max_text_length : 25\n", "[2021/12/23 14:29:19] root INFO: pretrained_model : None\n", "[2021/12/23 14:29:19] root INFO: print_batch_step : 10\n", "[2021/12/23 14:29:19] root INFO: save_epoch_step : 3\n", "[2021/12/23 14:29:19] root INFO: save_inference_dir : ./\n", "[2021/12/23 14:29:19] root INFO: save_model_dir : ./output/rec/ic15/\n", "[2021/12/23 14:29:19] root INFO: save_res_path : ./output/rec/predicts_ic15.txt\n", "[2021/12/23 14:29:19] root INFO: use_gpu : True\n", "[2021/12/23 14:29:19] root INFO: use_space_char : False\n", "[2021/12/23 14:29:19] root INFO: use_visualdl : False\n", "[2021/12/23 14:29:19] root INFO: Loss : \n", "[2021/12/23 14:29:19] root INFO: name : CTCLoss\n", "[2021/12/23 14:29:19] root INFO: Metric : \n", "[2021/12/23 14:29:19] root INFO: main_indicator : acc\n", "[2021/12/23 14:29:19] root INFO: name : RecMetric\n", "[2021/12/23 14:29:19] root INFO: Optimizer : \n", "[2021/12/23 14:29:19] root INFO: beta1 : 0.9\n", "[2021/12/23 14:29:19] root INFO: beta2 : 0.999\n", "[2021/12/23 14:29:19] root INFO: lr : \n", "[2021/12/23 14:29:19] root INFO: learning_rate : 0.0005\n", "[2021/12/23 14:29:19] root INFO: name : Adam\n", "[2021/12/23 14:29:19] root INFO: regularizer : \n", "[2021/12/23 14:29:19] root INFO: factor : 0\n", "[2021/12/23 14:29:19] root INFO: name : L2\n", "[2021/12/23 14:29:19] root INFO: PostProcess : \n", "[2021/12/23 14:29:19] root INFO: name : CTCLabelDecode\n", "[2021/12/23 14:29:19] root INFO: Train : \n", "[2021/12/23 14:29:19] root INFO: dataset : \n", "[2021/12/23 14:29:19] root INFO: data_dir : ./train_data/ic15_data/\n", "[2021/12/23 14:29:19] root INFO: label_file_list : ['./train_data/ic15_data/rec_gt_train.txt']\n", "[2021/12/23 14:29:19] root INFO: name : SimpleDataSet\n", "[2021/12/23 14:29:19] root INFO: transforms : \n", "[2021/12/23 14:29:19] root INFO: DecodeImage : \n", "[2021/12/23 14:29:19] root INFO: channel_first : False\n", "[2021/12/23 14:29:19] root INFO: img_mode : BGR\n", "[2021/12/23 14:29:19] root INFO: CTCLabelEncode : None\n", "[2021/12/23 14:29:19] root INFO: RecResizeImg : \n", "[2021/12/23 14:29:19] root INFO: image_shape : [3, 32, 100]\n", "[2021/12/23 14:29:19] root INFO: KeepKeys : \n", "[2021/12/23 14:29:19] root INFO: keep_keys : ['image', 'label', 'length']\n", "[2021/12/23 14:29:19] root INFO: loader : \n", "[2021/12/23 14:29:19] root INFO: batch_size_per_card : 256\n", "[2021/12/23 14:29:19] root INFO: drop_last : True\n", "[2021/12/23 14:29:19] root INFO: num_workers : 8\n", "[2021/12/23 14:29:19] root INFO: shuffle : True\n", "[2021/12/23 14:29:19] root INFO: use_shared_memory : False\n", 
"[2021/12/23 14:29:19] root INFO: train with paddle 2.1.2 and device CUDAPlace(0)\n", "W1223 14:29:19.803710 5290 device_context.cc:404] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 10.1, Runtime API Version: 10.1\n", "W1223 14:29:19.807695 5290 device_context.cc:422] device: 0, cuDNN Version: 7.6.\n", "[2021/12/23 14:29:25] root INFO: resume from output/rec/ic15/best_accuracy\n", "[2021/12/23 14:29:25] root INFO: infer_img: doc/imgs_words_en/word_19.png\n", "pred idx: Tensor(shape=[1, 25], dtype=int64, place=CUDAPlace(0), stop_gradient=True,\n", " [[29, 0 , 0 , 0 , 22, 0 , 0 , 0 , 25, 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 0 , 33]])\n", "[2021/12/23 14:29:25] root INFO: \t result: slow\t0.8795223\n", "[2021/12/23 14:29:25] root INFO: success!\n" ] } ], "source": [ "!python tools/infer_rec.py -c configs/rec/rec_icdar15_train.yml -o Global.checkpoints=output/rec/ic15/best_accuracy Global.character_dict_path=ppocr/utils/ic15_dict.txt" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Get the prediction result of the input image:\n", "\n", "```\n", "infer_img: doc/imgs_words_en/word_19.png\n", " result: slow\t0.8795223\n", "```\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Assignment\n", "\n", "**[Question 1]**\n", "\n", "Visualize the [Data Enhancement](https://github.com/PaddlePaddle/PaddleOCR/blob/release/2.4/ppocr/data/imaug/rec_img_aug.py) results implemented in PaddleOCR: noise, jitter, and explain the effect in language .\n", "\n", "Optional test picture:\n", "\n", "![](https://raw.githubusercontent.com/PaddlePaddle/PaddleOCR/release/2.4/doc/imgs_words/ch/word_1.jpg)\n", "\n", "![](https://raw.githubusercontent.com/PaddlePaddle/PaddleOCR/release/2.4/doc/imgs_words/ch/word_2.jpg)\n", "\n", "![](https://raw.githubusercontent.com/PaddlePaddle/PaddleOCR/release/2.4/doc/imgs_words/ch/word_3.jpg)\n", "\n", "\n", "**[Question 2]**\n", "\n", "Replace the backbone in the configs/rec/rec_icdar15_train.yml configuration with [ResNet34_vd] in PaddleOCR (https://github.com/PaddlePaddle/PaddleOCR/blob/6ee301be36eb54d91dc437842f754593dce13967/ppocr/modeling/#resbones/whendre.py) When the input image shape is (3, 32, 100), what is the final output feature size of the Head layer?\n", "\n", "\n", "**[Question 3]**\n", "\n", "Download the 10W Chinese data set [rec_data_lesson_demo](https://paddleocr.bj.bcebos.com/dataset/rec_data_lesson_demo.tar), modify the configs/rec/rec_icdar15_train.yml configuration file to train a recognition model and provide the training log.\n", "\n", "Loadable pre-training model: https://paddleocr.bj.bcebos.com/dygraph_v2.0/en/rec_mv3_none_bilstm_ctc_v2.0_train.tar\n", "\n", "\n", "## Summarize\n", "\n", "At this point, a CRNN-based text recognition task has been completed. For more functions and codes, please refer to [PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR).\n", "\n", "If you have any questions or questions about the project, please leave a message in the comment area." ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "py35-paddle1.2.0" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.4" } }, "nbformat": 4, "nbformat_minor": 4 }