diff --git a/tutorials/notebook/README.md b/tutorials/notebook/README.md
index fb999501e7ac9982e36a878b76981277fc0dca92..f7eac5dd4b49ceb37ad71ce574f11d0165e53954 100644
--- a/tutorials/notebook/README.md
+++ b/tutorials/notebook/README.md
@@ -57,6 +57,7 @@
| Data Processing and Augmentation | [data_loading_enhancement.ipynb](https://gitee.com/mindspore/docs/blob/master/tutorials/notebook/data_loading_enhance/data_loading_enhancement.ipynb) | Usage Guide | - Learn the data processing and augmentation methods in MindSpore
- Demonstrate the practical operation of data processing and augmentation methods
- Compare the effects before and after data processing
- Explain the significance of data processing and augmentation
| Natural Language Processing Application | [nlp_application.ipynb](https://gitee.com/mindspore/docs/blob/master/tutorials/notebook/nlp_application.ipynb) | Application Practice | - Demonstrate the application of MindSpore to natural language processing
- Demonstrate dataset-specific preprocessing methods in natural language processing
- Demonstrate how to define the LSTM-based SentimentNet network
| Computer Vision Application | [computer_vision_application.ipynb](https://gitee.com/mindspore/docs/blob/master/tutorials/notebook/computer_vision_application.ipynb) | Application Practice | - Learn the process of applying MindSpore convolutional neural networks to computer vision
- Learn to download the CIFAR-10 dataset and set up the running environment
- Learn to build a convolutional neural network with ResNet-50
- Learn to build the optimizer and loss function with Momentum and SoftmaxCrossEntropyWithLogits
- Learn to tune parameters, train the model, and evaluate model accuracy
+| Synchronized Model Training and Validation | [synchronization_training_and_evaluation.ipynb](https://gitee.com/mindspore/docs/blob/master/tutorials/notebook/synchronization_training_and_evaluation.ipynb) | Application Practice | - Understand how to run model training and validation synchronously
- Learn how to set the parameters for synchronized training and validation
- Use a plotting function to pick the optimal model from the saved checkpoints
| Training and Debugging Neural Networks in PyNative Mode | [debugging_in_pynative_mode.ipynb](https://gitee.com/mindspore/docs/blob/master/tutorials/notebook/debugging_in_pynative_mode.ipynb) | Model Tuning | - Walk through the full data flow of fetching a single sample from the dataset and training it for a single step on the GPU platform
- Understand the debugging methods available in PyNative mode
- Visualize how the image data changes during training
- Understand how to build a function that computes weight gradients
- Show the weight changes and the data within a single step
| Customized Debugging Information | [customized_debugging_information.ipynb](https://gitee.com/mindspore/docs/blob/master/tutorials/notebook/customized_debugging_information.ipynb) | Model Tuning | - Understand MindSpore's customized debugging operators
- Learn to use the customized debugging operator Callback to set up timed training
- Learn to set the metrics operator to output the corresponding model accuracy information
- Learn to set logging environment variables to control the logs output by glog
| Model Lineage and Data Lineage with MindInsight | [mindinsight_model_lineage_and_data_lineage.ipynb](https://gitee.com/mindspore/docs/blob/master/tutorials/notebook/mindinsight/mindinsight_model_lineage_and_data_lineage.ipynb) | Model Tuning | - Understand how training data is collected and displayed in MindSpore
- Learn to record data with SummaryRecord
- Learn to collect data with the callback function SummaryCollector
- Visualize data with MindInsight
- Understand how to use data lineage and model lineage
diff --git a/tutorials/notebook/synchronization_training_and_evaluation.ipynb b/tutorials/notebook/synchronization_training_and_evaluation.ipynb
new file mode 100644
index 0000000000000000000000000000000000000000..486fd4cf2f762b193054db536c9576b5bdc5512f
--- /dev/null
+++ b/tutorials/notebook/synchronization_training_and_evaluation.ipynb
@@ -0,0 +1,510 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Synchronized Model Training and Validation Experience"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Overview"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Complex networks often require dozens or even hundreds of epochs of training, and before training starts it is hard to know at which epoch the model will reach a satisfactory accuracy. A common practice is therefore to validate the model at a fixed epoch interval during training and save the corresponding checkpoints; once training finishes, the change in accuracy across these checkpoints makes it easy to pick the relatively optimal model. This tutorial demonstrates the approach with the LeNet network."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The overall process is as follows:\n",
+ "1. Prepare the dataset.\n",
+ "2. Build the neural network.\n",
+ "3. Define the callback function EvalCallBack.\n",
+ "4. Define the training network and execute it.\n",
+ "5. Define a plotting function and draw a line chart of the model accuracy at different epochs."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Data Preparation"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Downloading the Dataset"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Training dataset download links: {\"http://yann.lecun.com/exdb/mnist/train-images-idx3-ubyte.gz\", \"http://yann.lecun.com/exdb/mnist/train-labels-idx1-ubyte.gz\"}.\n",
+ "\n",
+ "Test dataset download links: {\"http://yann.lecun.com/exdb/mnist/t10k-images-idx3-ubyte.gz\", \"http://yann.lecun.com/exdb/mnist/t10k-labels-idx1-ubyte.gz\"}.\n",
+ "Place the datasets under *Jupyter working directory+\\MNIST_Data\\*, with the following directory structure:"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "```\n",
+ "MNIST_Data\n",
+ "├── test\n",
+ "│ ├── t10k-images-idx3-ubyte\n",
+ "│ └── t10k-labels-idx1-ubyte\n",
+ "└── train\n",
+ " ├── train-images-idx3-ubyte\n",
+ " └── train-labels-idx1-ubyte \n",
+ "```"
+ ]
+ },
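+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "If the files are not yet in place, the cell below is a minimal download sketch: it fetches the four MNIST archives and unpacks them into the directory structure shown above. The test-set (t10k) URLs are assumed to follow the same pattern as the training-set URLs; skip this cell if the dataset has already been prepared manually."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Minimal sketch: download MNIST and unpack it into ./MNIST_Data/{train, test}.\n",
+ "# The t10k (test) URLs are assumed to follow the same pattern as the train URLs.\n",
+ "import os\n",
+ "import gzip\n",
+ "import shutil\n",
+ "from urllib.request import urlretrieve\n",
+ "\n",
+ "base_url = \"http://yann.lecun.com/exdb/mnist/\"\n",
+ "files = {\n",
+ "    \"train\": [\"train-images-idx3-ubyte\", \"train-labels-idx1-ubyte\"],\n",
+ "    \"test\": [\"t10k-images-idx3-ubyte\", \"t10k-labels-idx1-ubyte\"]\n",
+ "}\n",
+ "\n",
+ "for split, names in files.items():\n",
+ "    target_dir = os.path.join(\"MNIST_Data\", split)\n",
+ "    os.makedirs(target_dir, exist_ok=True)\n",
+ "    for name in names:\n",
+ "        target = os.path.join(target_dir, name)\n",
+ "        if os.path.exists(target):\n",
+ "            continue\n",
+ "        gz_file = target + \".gz\"\n",
+ "        urlretrieve(base_url + name + \".gz\", gz_file)\n",
+ "        # decompress the .gz archive into the raw idx file expected by MnistDataset\n",
+ "        with gzip.open(gz_file, \"rb\") as fin, open(target, \"wb\") as fout:\n",
+ "            shutil.copyfileobj(fin, fout)\n",
+ "        os.remove(gz_file)"
+ ]
+ },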
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Dataset Augmentation"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The downloaded dataset needs to be processed by `mindspore.dataset` into data usable by the MindSpore framework, and then augmented with a series of tools provided by the framework to meet the data processing requirements of the LeNet network."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 1,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import os\n",
+ "import mindspore.dataset as ds\n",
+ "import mindspore.dataset.transforms.vision.c_transforms as CV\n",
+ "import mindspore.dataset.transforms.c_transforms as C\n",
+ "from mindspore.dataset.transforms.vision import Inter\n",
+ "from mindspore.common import dtype as mstype\n",
+ "\n",
+ "def create_dataset(data_path, batch_size=32, repeat_size=1,\n",
+ "                   num_parallel_workers=1):\n",
+ "    # define dataset\n",
+ "    mnist_ds = ds.MnistDataset(data_path)\n",
+ "\n",
+ "    # define map operations\n",
+ "    resize_op = CV.Resize((32, 32), interpolation=Inter.LINEAR)\n",
+ "    rescale_nml_op = CV.Rescale(1 / 0.3081, -1 * 0.1307 / 0.3081)\n",
+ "    rescale_op = CV.Rescale(1 / 255.0, 0.0)\n",
+ "    hwc2chw_op = CV.HWC2CHW()\n",
+ "    type_cast_op = C.TypeCast(mstype.int32)\n",
+ "\n",
+ "    # apply map operations on images\n",
+ "    mnist_ds = mnist_ds.map(input_columns=\"label\", operations=type_cast_op, num_parallel_workers=num_parallel_workers)\n",
+ "    mnist_ds = mnist_ds.map(input_columns=\"image\", operations=[resize_op, rescale_op, rescale_nml_op, hwc2chw_op],\n",
+ "                            num_parallel_workers=num_parallel_workers)\n",
+ "\n",
+ "    # apply DatasetOps\n",
+ "    buffer_size = 10000\n",
+ "    mnist_ds = mnist_ds.shuffle(buffer_size=buffer_size)\n",
+ "    mnist_ds = mnist_ds.batch(batch_size, drop_remainder=True)\n",
+ "    mnist_ds = mnist_ds.repeat(repeat_size)\n",
+ "\n",
+ "    return mnist_ds"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Building the Neural Network"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "LeNet is a 7-layer neural network involving convolutional layers, fully connected layers, activation functions, and so on; the corresponding operators are already provided in MindSpore and only need to be imported. The code below first initializes the convolution, fully connected, and weight helpers, then defines the neural network in LeNet5 and builds it in `construct`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 2,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import mindspore.nn as nn\n",
+ "from mindspore.common.initializer import TruncatedNormal\n",
+ "\n",
+ "\n",
+ "def conv(in_channels, out_channels, kernel_size, stride=1, padding=0):\n",
+ "    \"\"\"Conv layer weight initial.\"\"\"\n",
+ "    weight = weight_variable()\n",
+ "    return nn.Conv2d(in_channels, out_channels,\n",
+ "                     kernel_size=kernel_size, stride=stride, padding=padding,\n",
+ "                     weight_init=weight, has_bias=False, pad_mode=\"valid\")\n",
+ "\n",
+ "def fc_with_initialize(input_channels, out_channels):\n",
+ "    \"\"\"Fc layer weight initial.\"\"\"\n",
+ "    weight = weight_variable()\n",
+ "    bias = weight_variable()\n",
+ "    return nn.Dense(input_channels, out_channels, weight, bias)\n",
+ "\n",
+ "def weight_variable():\n",
+ "    \"\"\"Weight initial.\"\"\"\n",
+ "    return TruncatedNormal(0.02)\n",
+ "\n",
+ "class LeNet5(nn.Cell):\n",
+ "    \"\"\"Lenet network structure.\"\"\"\n",
+ "    # define the operator required\n",
+ "    def __init__(self):\n",
+ "        super(LeNet5, self).__init__()\n",
+ "        self.conv1 = conv(1, 6, 5)\n",
+ "        self.conv2 = conv(6, 16, 5)\n",
+ "        self.fc1 = fc_with_initialize(16 * 5 * 5, 120)\n",
+ "        self.fc2 = fc_with_initialize(120, 84)\n",
+ "        self.fc3 = fc_with_initialize(84, 10)\n",
+ "        self.relu = nn.ReLU()\n",
+ "        self.max_pool2d = nn.MaxPool2d(kernel_size=2, stride=2)\n",
+ "        self.flatten = nn.Flatten()\n",
+ "\n",
+ "    # use the preceding operators to construct networks\n",
+ "    def construct(self, x):\n",
+ "        x = self.conv1(x)\n",
+ "        x = self.relu(x)\n",
+ "        x = self.max_pool2d(x)\n",
+ "        x = self.conv2(x)\n",
+ "        x = self.relu(x)\n",
+ "        x = self.max_pool2d(x)\n",
+ "        x = self.flatten(x)\n",
+ "        x = self.fc1(x)\n",
+ "        x = self.relu(x)\n",
+ "        x = self.fc2(x)\n",
+ "        x = self.relu(x)\n",
+ "        x = self.fc3(x)\n",
+ "        return x"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Defining the Callback Function EvalCallBack"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Idea: validate the model accuracy every n epochs. This is implemented in a custom callback; for the detailed usage of custom callbacks, see the [API description](https://www.mindspore.cn/api/zh-CN/master/api/python/mindspore/mindspore.train.html?highlight=callback#mindspore.train.callback.Callback).\n",
+ "\n",
+ "Core implementation: set the validation point inside the callback's `epoch_end` method, as follows:\n",
+ "\n",
+ "`cur_epoch % eval_per_epoch == 0`: the model accuracy is validated at the end of every `eval_per_epoch` epochs.\n",
+ "\n",
+ "- `cur_epoch`: the epoch number of the current training process.\n",
+ "- `eval_per_epoch`: a user-defined value, i.e. the validation frequency.\n",
+ "\n",
+ "Other parameters:\n",
+ "\n",
+ "- `model`: the MindSpore `Model` instance.\n",
+ "- `eval_dataset`: the validation dataset.\n",
+ "- `epoch_per_eval`: a dictionary that records the validation accuracy and the corresponding epoch number, in the form `{\"epoch\": [], \"acc\": []}`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 3,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import matplotlib.pyplot as plt\n",
+ "from mindspore.train.callback import Callback\n",
+ "\n",
+ "class EvalCallBack(Callback):\n",
+ "    def __init__(self, model, eval_dataset, eval_per_epoch):\n",
+ "        self.model = model\n",
+ "        self.eval_dataset = eval_dataset\n",
+ "        self.eval_per_epoch = eval_per_epoch\n",
+ "\n",
+ "    def epoch_end(self, run_context):\n",
+ "        cb_param = run_context.original_args()\n",
+ "        cur_epoch = cb_param.cur_epoch_num\n",
+ "        if cur_epoch % self.eval_per_epoch == 0:\n",
+ "            acc = self.model.eval(self.eval_dataset, dataset_sink_mode=True)\n",
+ "            epoch_per_eval[\"epoch\"].append(cur_epoch)\n",
+ "            epoch_per_eval[\"acc\"].append(acc[\"Accuracy\"])\n",
+ "            print(acc)\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Defining and Running the Training Network"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "In `CheckpointConfig`, which controls checkpoint saving, the number of steps in a single epoch must be known so that the checkpoint interval matches the desired validation frequency.\n",
+ "In this example there are 1875 steps per epoch (60,000 training images / batch size 32), and the model is validated every two epochs, so `save_checkpoint_steps=eval_per_epoch*1875`,\n",
+ "where the variable `eval_per_epoch` equals 2.\n",
+ "\n",
+ "Parameter descriptions:\n",
+ "\n",
+ "- `train_data_path`: path to the training dataset.\n",
+ "- `eval_data_path`: path to the validation dataset.\n",
+ "- `train_data`: the training dataset.\n",
+ "- `eval_data`: the validation dataset.\n",
+ "- `net_loss`: the loss function.\n",
+ "- `net_opt`: the optimizer.\n",
+ "- `config_ck`: the checkpoint saving configuration.\n",
+ "    - `save_checkpoint_steps`: save a checkpoint every this many steps.\n",
+ "    - `keep_checkpoint_max`: the maximum number of checkpoints to keep.\n",
+ "- `ckpoint_cb`: the checkpoint callback defining the checkpoint name prefix and directory.\n",
+ "- `model`: the model.\n",
+ "- `model.train`: the model training function.\n",
+ "- `epoch_per_eval`: the dictionary that collects the epoch numbers and the corresponding model accuracy."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 4,
+ "metadata": {
+ "scrolled": true
+ },
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "epoch: 1 step: 375, loss is 2.3058078\n",
+ "epoch: 1 step: 750, loss is 2.3073978\n",
+ "epoch: 1 step: 1125, loss is 2.3103657\n",
+ "epoch: 1 step: 1500, loss is 0.65018296\n",
+ "epoch: 1 step: 1875, loss is 0.07800862\n",
+ "epoch: 2 step: 375, loss is 0.010344766\n",
+ "epoch: 2 step: 750, loss is 0.052723818\n",
+ "epoch: 2 step: 1125, loss is 0.08183526\n",
+ "epoch: 2 step: 1500, loss is 0.007430988\n",
+ "epoch: 2 step: 1875, loss is 0.0076965275\n",
+ "{'Accuracy': 0.9753605769230769}\n",
+ "epoch: 3 step: 375, loss is 0.11964749\n",
+ "epoch: 3 step: 750, loss is 0.04522314\n",
+ "epoch: 3 step: 1125, loss is 0.018271001\n",
+ "epoch: 3 step: 1500, loss is 0.006928641\n",
+ "epoch: 3 step: 1875, loss is 0.15374172\n",
+ "epoch: 4 step: 375, loss is 0.12120275\n",
+ "epoch: 4 step: 750, loss is 0.122824214\n",
+ "epoch: 4 step: 1125, loss is 0.0023852547\n",
+ "epoch: 4 step: 1500, loss is 0.018273383\n",
+ "epoch: 4 step: 1875, loss is 0.08102103\n",
+ "{'Accuracy': 0.9821714743589743}\n",
+ "epoch: 5 step: 375, loss is 0.12944886\n",
+ "epoch: 5 step: 750, loss is 0.0010141768\n",
+ "epoch: 5 step: 1125, loss is 0.0054096584\n",
+ "epoch: 5 step: 1500, loss is 0.0022614016\n",
+ "epoch: 5 step: 1875, loss is 0.07229582\n",
+ "epoch: 6 step: 375, loss is 0.0025749032\n",
+ "epoch: 6 step: 750, loss is 0.06261393\n",
+ "epoch: 6 step: 1125, loss is 0.021273317\n",
+ "epoch: 6 step: 1500, loss is 0.011360342\n",
+ "epoch: 6 step: 1875, loss is 0.12855275\n",
+ "{'Accuracy': 0.9853766025641025}\n",
+ "epoch: 7 step: 375, loss is 0.09330422\n",
+ "epoch: 7 step: 750, loss is 0.002063415\n",
+ "epoch: 7 step: 1125, loss is 0.0047940286\n",
+ "epoch: 7 step: 1500, loss is 0.0052507296\n",
+ "epoch: 7 step: 1875, loss is 0.018066114\n",
+ "epoch: 8 step: 375, loss is 0.08678668\n",
+ "epoch: 8 step: 750, loss is 0.02440551\n",
+ "epoch: 8 step: 1125, loss is 0.0017507032\n",
+ "epoch: 8 step: 1500, loss is 0.02957578\n",
+ "epoch: 8 step: 1875, loss is 0.0023948685\n",
+ "{'Accuracy': 0.9863782051282052}\n",
+ "epoch: 9 step: 375, loss is 0.012376097\n",
+ "epoch: 9 step: 750, loss is 0.029711302\n",
+ "epoch: 9 step: 1125, loss is 0.017438065\n",
+ "epoch: 9 step: 1500, loss is 0.015443239\n",
+ "epoch: 9 step: 1875, loss is 0.0031764025\n",
+ "epoch: 10 step: 375, loss is 0.0005294987\n",
+ "epoch: 10 step: 750, loss is 0.0015696918\n",
+ "epoch: 10 step: 1125, loss is 0.019949459\n",
+ "epoch: 10 step: 1500, loss is 0.004248183\n",
+ "epoch: 10 step: 1875, loss is 0.07389321\n",
+ "{'Accuracy': 0.9824719551282052}\n"
+ ]
+ }
+ ],
+ "source": [
+ "from mindspore.train.serialization import load_checkpoint, load_param_into_net\n",
+ "from mindspore.train.callback import ModelCheckpoint, CheckpointConfig, LossMonitor\n",
+ "from mindspore.train import Model\n",
+ "from mindspore import context\n",
+ "from mindspore.nn.metrics import Accuracy\n",
+ "from mindspore.nn.loss import SoftmaxCrossEntropyWithLogits\n",
+ "\n",
+ "if __name__ == \"__main__\":\n",
+ "    context.set_context(mode=context.GRAPH_MODE, device_target=\"GPU\")\n",
+ "    train_data_path = \"./MNIST_Data/train\"\n",
+ "    eval_data_path = \"./MNIST_Data/test\"\n",
+ "    ckpt_save_dir = \"./lenet_ckpt\"\n",
+ "    epoch_size = 10\n",
+ "    eval_per_epoch = 2\n",
+ "    repeat_size = 1\n",
+ "    network = LeNet5()\n",
+ "\n",
+ "    train_data = create_dataset(train_data_path, repeat_size=repeat_size)\n",
+ "    eval_data = create_dataset(eval_data_path, repeat_size=repeat_size)\n",
+ "\n",
+ "    # define the loss function\n",
+ "    net_loss = SoftmaxCrossEntropyWithLogits(is_grad=False, sparse=True, reduction='mean')\n",
+ "    # define the optimizer\n",
+ "    net_opt = nn.Momentum(network.trainable_params(), learning_rate=0.01, momentum=0.9)\n",
+ "    config_ck = CheckpointConfig(save_checkpoint_steps=eval_per_epoch*1875, keep_checkpoint_max=15)\n",
+ "    ckpoint_cb = ModelCheckpoint(prefix=\"checkpoint_lenet\", directory=ckpt_save_dir, config=config_ck)\n",
+ "    model = Model(network, net_loss, net_opt, metrics={\"Accuracy\": Accuracy()})\n",
+ "\n",
+ "    epoch_per_eval = {\"epoch\": [], \"acc\": []}\n",
+ "    eval_cb = EvalCallBack(model, eval_data, eval_per_epoch)\n",
+ "\n",
+ "    model.train(epoch_size, train_data, callbacks=[ckpoint_cb, LossMonitor(375), eval_cb],\n",
+ "                dataset_sink_mode=True)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "In the same directory you can find the `lenet_ckpt` folder, which holds 5 checkpoint files and one computational-graph file, with the following structure:"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "```\n",
+ "lenet_ckpt\n",
+ "├── checkpoint_lenet-10_1875.ckpt\n",
+ "├── checkpoint_lenet-2_1875.ckpt\n",
+ "├── checkpoint_lenet-4_1875.ckpt\n",
+ "├── checkpoint_lenet-6_1875.ckpt\n",
+ "├── checkpoint_lenet-8_1875.ckpt\n",
+ "└── checkpoint_lenet-graph.meta\n",
+ "```"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Plotting Model Accuracy at Different Epochs"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Define the plotting function `eval_show`, pass `epoch_per_eval` into it, and draw a line chart of the model's validation accuracy at different epochs."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 5,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAAY4AAAEWCAYAAABxMXBSAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjMsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+AADFEAAAgAElEQVR4nO3debzWY/7H8ddb0SLZygyisgxiZEiWkMk+wxhb9m1sQxnrjzBmyJKxm9Ega3ZjnRgVmsrUYDohIhQmEsqWpRHV5/fH9T26O85yn5z7fM/yfj4e53Hu+/4u9+c+y/25r+v6Xp9LEYGZmVmxlso7ADMza1ycOMzMrFacOMzMrFacOMzMrFacOMzMrFacOMzMrFacOKzkJHWRFJJaFrHvEZLG1UdczYWkgyU98QOOHy7p8LqMqYbnK/rvxfLhxGGLkfRfSd9I6lDh8Rezf+Yu+URmSyoi7oqInYvZV9J5ku6scPxuETG0NNHVj+xvd52842gqnDisMm8DB5bfkfRToE1+4TQMjfETcGOMuS4199dfKk4cVpk7gMMK7h8O3F64g6TlJd0uabak6ZJ+L2mpbFsLSZdL+kjSW8AvKzn2ZknvS3pP0oWSWhQTmKT7JX0gaY6kpyVtWLCtjaQrsnjmSBonqU22bRtJ/5b0maR3JR2RPT5G0tEF51isqyz7pNpP0lRgavbYNdk5Ppc0UdK2Bfu3kHS2pDclfZFtX0PSYElXVHgtj0o6uZLXeL2kyys89ndJp2a3BxSc/1VJe1WIf7ykqyR9ApxXyWuqNH5JuwJnA/tL+lLSpIo/I0lLZb/r6ZJmZX8Dy2fbyruYDpf0Tvb7P6ea32WVv6/MwZWdR1JPSc9kv8v3JV0raZmqfmeSns42Tcpe1/5VxWRFigh/+eu7L+C/wI7A68AGQAvgXaAzEECXbL/bgb8DywFdgDeAo7JtvwVeA9YAVgJGZ8e2zLY/AtwALAusAvwHOC7bdgQwrpr4fpM9ZyvgauDFgm2DgTHA6lncW2f7rQl8QWpFLQ2sDGySHTMGOLrgHIs9fxb3k9nraJM9dkh2jpbAacAHQOts2/8BLwPrAQK6Z/v2BGYCS2X7dQDmAj+q5DVul/3Mld1fEfgfsFp2fz9gNdIHv/2Br4BVC+KfD5yYxdemktdUXfznAXdWiOe7n1H2858GrAW0Ax4C7si2dcl+Xjdmz9sdmAdsUMXvsqrfV7XnATYDtszi7wJMAU6u4XcWwDp5/381la/cA/BXw/piUeL4PTAI2DX7J2yZ/fN1yf7J5wHdCo47DhiT3f4n8NuCbTtnx7YEfpQd26Zg+4HA6Oz2Ym9yNcS6Qnbe5bM30f8B3SvZ7yzg4SrO8d2bYmXPn52/Tw1xfFr+vKSEu2cV+00Bdspu9wcer2I/Ae8A22X3jwH+Wc3zv1j+nFn871TYXu3PtEL851F94hgFnFCwbT3g24I38QA6FWz/D3BAJc9Z3e+r6PNk204u/P1W9jvDiaNOv9xVZVW5AziI9KZze4VtHYBlgOkFj00nfXKE9Gn43QrbynUmfep/P+tq+IzU+lilpoCybqBLsm6az0lJrjyeDkBr4M1KDl2jiseLVfhakHSapClZ98pnpMRVfjFBdc81lPRpn+z7HZXtFOmd7l4WjTMdBNxV8PyHKV2sUP7z26jg+b8Xb0U1xF+T1fj+7738A0G5DwpuzyW1TCqq7vdV7Xkk/UTSY1mX5efAxZXEX+3PwH4YJw6rVERMJw2S/4LUHVHoI9KnzM4Fj60JvJfdfp/0Blq4rdy7pBZHh4hYIftqHxEbUrODgD1JLaLlSZ9MIX1C/wj4Gli7kuPereJxSN08bQvu/7iSfb4rIZ2NB5wJ9AVWjIgVgDlZDDU9153AnpK6k7oBH6liP4B7gH0ldQa2AB7Mnr8zqQunP7By9vyTC55/sXgrKiL+msplz+T7v/f5wIc1HFdRdb+vmlxH6gpdNyLak8ZlVGEfl/0uIScOq85RpCb/V4UPRsQC4G/ARZKWy97MTiW9MZJt+52kTpJWBAYUHPs+8ARwhaT22WDr2pJ6FxHPcqSk8zHpzf7igvMuBG4BrpS0WtY62UpSK9Kn9R0l9ZXUUtLKkjbJDn0R2FtSW6XLNY8qIob5wGygpaQ/AO0Ltt8EXCBpXSUbS1o5i3EGMIHU0ngwIv5X1ZNExAvZc9wEjIyIz7JNy5LeFGcDSDqS1OIoVk3xfwh0UXahQyXuAU6R1FVSO9Lv4L6ImF+LGGr6fRXzGj4HvpS0PnB8Ecd8SBqXsTrgxGFViog3I6Ksis0nkj6tvwWMA+4mvRFA+kQ8EpgEPM/3WyyHkbq6XiX1rz8ArFpESLeTukbey459tsL200kD0xOAT4A/kQaj3yG1nE7LHn+RNOAKcBXwDemNZSgFXUJVGAkMJ10MMJ30qbmwW+RKUuJ8gvTmdjOLX8o8FPgpVXRTVXAPqXV1d/kDEfEqcAXwTBbzT4HxRZyr2Pjvz75/LOn5So6/JYv9aVKL9GvS38KSqPT3VeRxB5EueLgRuK+IY84Dhmbde32XKFr7TvlVG2ZWDyRtR2qZdck+dZs1Om5xmNUTSUsDJwE3OWlYY+bEYVYPJG0AfEbqkrs653DMfhB3VZmZWa24xWFmZrXSLAqAdejQIbp06ZJ3GGZmjcrEiRM/ioiOFR9vFomjS5culJVVdVWpmZlVRtL0yh53V5WZmdWKE4eZmdWKE4eZmdWKE4eZmdWKE4eZmdWKE4eZmdWKE4eZmdVKs5jHYWaN3MyZMG4cfPUVHHYYtGiRd0TNmhOHmTUsEfD66ylR/Otf6ftbby3a/thjcNdd0Lp1fjE2c04cZpavb7+FF15YPFF89FHa1rEjbLMN9O+fvo8bB6eeCr/8JTzyCCy3XL6xN1NOHGZWv778Ep59dlGiePZZmDs3bVt77ZQUtt02JYqf/ARUsJz45ptDhw5w5JHQpw88/nhKLlavnDjMrLQ+/BDGj1/UmnjhBViwAJZaCrp3h6OPTkmiVy9YbbWaz3foobDiirDffinBPPEErLlm6V+HfaekiUPSrsA1QAvSqmeXVNjembSGcUfSmsOHRMSMbNulwC9JV349CZwUESFpGeBaYHtgIXBORDxYytdhZkWKgDffXLzb6Y030rbWrWGLLeCss1Ki2GoraN9+yZ5n991Twthjj5RwnngCNtig7l6HVatkiUNSC2AwsBMwA5ggaVhEvFqw2+XA7RExVFIfYBBwqKStgV7Axtl+44DewBjgHGBWRPxE0lLASqV6DWZWg/nz4aWXFiWJcePggw/StpVWSm/q5S2KzTaDZZapu+fedlsYOxZ22SXdfvxx6Nmz7s5vVSpli6MnMC0i3gKQdC+wJ1CYOLoBp2S3RwOPZLcDaA0sAwhYGvgw2/YbYH2AbN3mj0r3EsxsMXPnwn/+syhRPPMMfPFF2ta5M+y4Y0oS224L66+fuqNKqXv31A22005pzOORR1IMVlKlTByrA+8W3J8BbFFhn0nAPqTurL2
A5SStHBHPSBoNvE9KHNdGxBRJK2THXSBpe+BNoH9EfFjhvEg6FjgWYE33f5otmY8/Xnx8YuLEdBWUBD/9aRpv2Gab9LXGGvnEuPbaKcZddoFf/ALuvhv23TefWJqJUiYOVfJYxQXOTweulXQE8DTwHjBf0jrABkCnbL8nJW1Haq10AsZHxKmSTiV1dx36vSeKGAIMAejRo4cXVjerSQRMn74oSfzrXzBlStq2zDKpG+i001JrYqut0gB1Q7Hqqqnbao89oG9fuO46OO64vKNqskqZOGYAhR9BOgEzC3eIiJnA3gCS2gH7RMScrLXwbER8mW0bDmwJ/AuYCzycneJ+4KgSvgazpmvBApg8edHYxL/+Be+9l7Ytv3wanzj00JQoevRo+BPuVlwxDZLvtx/89reptXTWWYtfzmt1opSJYwKwrqSupJbEAcBBhTtI6gB8ko1VnEW6wgrgHeAYSYNILZfewNXZVVWPkq6o+iewA4uPmZhZVb7+GiZMWJQk/v1vmDMnbVt99ZQgyudPbLhh4yzr0bZtGuc48kg455w0kfDyy0s/1tLMlCxxRMR8Sf2BkaTLcW+JiFckDQTKImIYKQEMkhSkrqp+2eEPAH2Al0ndWyMi4tFs25nAHZKuBmYDR5bqNZg1ap99lvr+yxPFhAnwzTdpW7dusP/+ixJF585N55P50kvD7benq7quuiq1PG66KT1udUIRTb/7v0ePHlFWVpZ3GGal9e67i8+fmDw5jVu0bJm6msqvdtp66zT7uqmLgAsvhD/8IY193HcftGmTd1SNiqSJEdGj4uOeOW7WGC1cmAauCxPF9OlpW7t2KTn07ZuSRc+eqQunuZHg3HNTkuzXL1119eijafzGfhAnDrPG4Jtv0qWw5Yli/Hj45JO07cc/Tgni1FPT9403Tq0MS44/PnVbHXoo9O4NI0akn5ktMf91mTVEn3+eJteVtyaeey4NbkMq/LfXXou6ntZaq+mMT5TK/vvDCivA3nunn9uTT0LXrnlH1Wg5cZg1BO+/v3i306RJqTuqRQv42c/Sp+byiXarrJJ3tI3TLrvAqFFpkmCvXjByZJrEaLXmxGGWl4kT4S9/SYnizTfTY23bpsl1556bksSWW6YxC6sbW26ZkvPOO8N228E//pHGg6xWnDjM6lsEXHttmoXdrh1svz2ccELqdtpkE182WmobbpjGiHbeOdW1euCB1AqxojlxmNWnOXNStdgHHkiXiN52Wxq4tfrVpUtq6e26K+y5JwwdCgcdVONhlng6pVl9eeGFVFr84Yfhssvg73930sjTKqvAmDFpvOPgg1O3oRXFicOs1CLghhvS2MXXX6difKef7iuhGoL27dPlub/+Nfzud/DHP6bfl1XLicOslL78Eg45JBXd23771Oro1SvvqKxQ69Zw//2pvtXAgdC/f7qizarkMQ6zUnn55VSpdepUuOgiGDDAxfYaqpYt4eab0yzzyy5LkyuHDq3bFQubECcOs1K49dZU5mL55dPcge23zzsiq4kEl16akseZZ8Knn8KDD8Kyy+YdWYPjjz9mdWnu3NTl8ZvfpPkBL77opNHYnHFGqqb75JNpSdry0i72HScOs7oyZUoqKDh0aBpkHTkSfvSjvKOyJXHUUemS6YkT00TB8gWuDHDiMKsbd94Jm28Os2alhHHeeY1zISRbZK+90hVX06enWfxTp+YdUYPhxGH2Q/zvf3Dssany6mabpa6pnXbKOyqrKz//OYwena6O22abdFWcOXGYLbGpU9PcjBtvTGtbjxoFq62Wd1RW13r0SLPMW7VK41Vjx+YdUe6cOMyWxN/+lloY776bCuVdfLHXwGjK1lsv1bdaffVUZXfYsLwjypUTh1ltzJuXJojtvz9stFHqmnKBvOZhjTVSZd3u3dO6HrfdlndEuXHiMCvW22+nWd+DB6fKtmPHpjcTaz5WXjl1Sfbpky67vuKKvCPKhROHWTEeeSQtqPTmm+n25Ze7/Hlz1a5dWrt8v/1SzbGzzmp29a3cKWtWnW++SaVCrroqXW57331ectTSQPk996TqxpdcAh99BNdf32wuwXbiMKvKO+9A375pve8TT0w1jFq1yjsqayhatIDrroOOHeHCC9MM87vuSkUTmzgnDrPK/OMfcNhh8O23qXLqvvvmHZE1RBJccEEa+zjlFPjlL1NX5nLL5R1ZSXmMw6zQ/Pmpa2r33WHNNeH55500rGYnnwy3354umOjTB2bPzjuiknLiMCv33ntppvCf/gTHHQfPPAPrrJN3VNZYHHpoam1MnpzWj3/nnbwjKhknDjOAJ56ATTZJJSXuvjsNdDaDvmqrY7vvnv6WPvggXbo9ZUreEZVESROHpF0lvS5pmqQBlWzvLGmUpJckjZHUqWDbpZJekTRF0p+lxdfZlDRM0uRSxm/NwIIF8Ic/wK67wo9/DGVlcOCBeUdljdm226Yuq2+/TbcnTMg7ojpXssQhqQUwGNgN6AYcKKlbhd0uB26PiI2BgcCg7NitgV7AxsBGwOZA74Jz7w18WarYrZn44INUkPCCC+CII9LVU+uvn3dU1hR0755KlLRvn7o/n3oq74jqVClbHD2BaRHxVkR8A9wL7Flhn27AqOz26ILtAbQGlgFaAUsDHwJIagecClxYwtitqRs9OnVNPftsKh1xyy3Qtm3eUVlTsvbaKXmstVa62uqBB/KOqM6UMnGsDrxbcH9G9lihScA+2e29gOUkrRwRz5ASyfvZ18iIKO8svAC4Aphb3ZNLOlZSmaSy2U38CgerhYUL0zX3O+4IK64I//kPHH543lFZU7XqqqnbavPN05ygIUPyjqhOlDJxqJLHKs7LPx3oLekFUlfUe8B8SesAGwCdSMmmj6TtJG0CrBMRD9f05BExJCJ6RESPjh07/qAXYk3E7Nmw225w7rlpHGPChFSo0KyUVlwxDZjvtlu6Wm/QoEZfoqSUEwBnAIUV4DoBMwt3iIiZwN7wXRfUPhExR9KxwLMR8WW2bTiwJfAFsJmk/2axryJpTERsX8LXYU3BuHFwwAGpNMSQIXD00Wnylll9aNs2Xap75JFw9tnpQ8zll8NSjfPC1lJGPQFYV1JXScsABwCLFbGX1EFSeQxnAbdkt98htURaSlqa1BqZEhHXRcRqEdEF2AZ4w0nDqrVwYZqXsf320KZNGtM45hgnDat/Sy+dJgmeeGKqfXbkkenKq0aoZC2OiJgvqT8wEmgB3BIRr0gaCJRFxDBge2CQpACeBvplhz8A9AFeJnVvjYiIR0sVqzVRH3+cxi/+8Y/Uv3zjjekqF7O8LLUUXHNNqm/1hz/Ap5+mwplt2uQdWa0oGnlfWzF69OgRZWVleYdh9enZZ9NiSx98kD7dHX+8WxnWsFx3HfTrl9Yyf/RRWH75vCP6HkkTI6JHxccbZwebWVUiUqLYdttUvXT8eDjhBCcNa3iOPz6VZn/2WejdO33IaSScOKzp+Owz2GcfOPXUVPrh+eehx/c+LJk1HPvvn1obU6emlsfbb+cdUVGcOKxpmDgRNt00/RNedRU89BCssELeUZ
nVbJdd0nK0n3yS6lu9/HLeEdXIicMat4i0BvjWW6eS6P/6Vypx7a4pa0y23DL97Uqw3Xbw73/nHVG1nDis8fr88zQ3o3//VHPqhRfSP6BZY7ThhmlMrmPHVNlg+PC8I6qSE4c1TpMmpfGLBx9M8zSGDUursJk1Zl26pMmq668Pv/pVGjxvgJw4rHGJSPMxttgCvvoKxoyBM85otDNwzb5nlVXS33WvXnDwwXDttXlH9D3+b7PG48sv0zrgxx6bLl984YV0JYpZU9O+PYwYAXvumWaan3deg6pv5cRhjcMrr6QKo3ffndbPGD48fTIza6pat4b770+lSc4/PyWQhQvzjgoobZFDs7oxdGiaLNW+fVoQ5+c/zzsis/rRsiXcfDN06ACXXZYu2b3tNlhmmXzDyvXZzaozd276lHXLLalI4T33pOVdzZoTCS69NCWPM89M9a0eeACWXTa3kNxVZQ3T66+nAfBbb03rZzz1lJOGNW9nnAE33ZTW9thpp9T6yIkThzU8d98Nm22WaveMGAEDB6a6U2bN3VFHpdbGxIlpouB77+USRo2JI1t+tZ+kFesjIGvGvv4afvvbdAniz34GL74IO++cd1RmDctee6UPVNOnp6sKp06t9xCKaXEcAKwGTJB0r6RdJNdzsDo2bRpstRXccEPqxx09GlavuES9mQHpApHRo9Ml6ttsky5Nr0c1Jo6ImBYR5wA/Ae4mrdL3jqTzJa1U6gCtGXjwwdQ1NX16KlJ4ySXpahIzq1qPHmmWeatW6eKRsWPr7amLGuOQtDFwBXAZ8CCwL/A58M/ShWZN3rx58Lvfwb77QrduqWtq993zjsqs8VhvvVTfavXVU5XdYcNqPqYOFDPGMRG4irSG+MYR8buIeC4irgDeKnWA1kS9/XZabOkvf4FTTkmfltZcM++ozBqfNdZIlXW7d4e9907znkqsmP6A/SKi0gQREXvXcTzWHAwbltYCj0jrZuy1V94RmTVuK6+c1vTYe2844gj4+OO0oFmJFNNVdbSk71bEkbSipAtLFpE1Xd9+C6efnurvrL12WqHPScOsbrRrl8YI99sPTjsNzj67ZPWtikkcu0XEZ+V3IuJT4BclicaarnffTYUJr7gC+vVL/bJrrZV3VGZNS6tWqcLCccfBoEHp+4IFdf40xSSOFpJald+R1AZoVc3+ZosbPjzNy5g8Ge67L5WJbuU/IbOSaNECrrsOfv/7VHnh+efr/CmKSRx3AqMkHSXpN8CTQOlHX6zxmz8/NZd/8Yt01UdZGfTtm3dUZk2flKpIT56cqkrXsRoHxyPiUkkvAzsAAi6IiJF1Hok1LTNnwoEHwtNPwzHHwDXXQJs2eUdl1ryst15JTlvULKuIGA403AVwrWF56ik46KC0Qt8dd8Ahh+QdkZnVoWLmcWwpaYKkLyV9I2mBpM/rIzhrZBYsSCuV7bxzWmSprMxJw6wJKmaM41rgQGAq0AY4GvhLKYOyRuqEE9JKZYcdBs89BxtskHdEZlYCRZUciYhpQIuIWBARtwJFLcEmaVdJr0uaJmlAJds7Sxol6SVJYyR1Kth2qaRXJE2R9GclbSX9Q9Jr2bZLin2hVmJjx8KQIWnS0W235brIjJmVVjGJY66kZYAXszfzU4Aa3xUktQAGA7sB3YADJXWrsNvlwO0RsTEwEBiUHbs10AvYGNgI2BzoXX5MRKwP/AzoJWm3Il6DldK8eakceteu6UoOM2vSikkch2b79Qe+AtYA9iniuJ7AtIh4KyK+Ae4F9qywTzdgVHZ7dMH2AFoDy5DmjCwNfBgRcyNiNEB2zueBTli+Lr0UXnsNBg+Gtm3zjsbMSqzaxJG1Gi6KiK8j4vOIOD8iTs26rmqyOvBuwf0Z2WOFJrEoCe0FLCdp5Yh4hpRI3s++RkbElAqxrQDswaLEUzH2Y7NFqMpmz55dRLi2RKZOhYsuSvMzdnPjz6w5qDZxRMQCoGPWVVVblS32VLFwyulAb0kvkLqi3gPmS1oH2IDUmlgd6CNpu+9OLLUE7gH+XE0BxiER0SMienTs2HEJwrcaRaQB8Vat4Oqr847GzOpJMfM4/guMlzSM1FUFQERcWcNxM0jdWuU6ATMLd4iImcDeAJLaAftExBxJxwLPRsSX2bbhwJbA09mhQ4CpEeF3qzzdfXeaszF4MKy6at7RmFk9KWaMYybwWLbvcgVfNZkArCupa9ZiOQBYbJURSR0klcdwFml1QYB3SC2RlpKWJrVGpmTHXAgsD5xcRAxWKp98kq6g6tkzFVIzs2ajmJIj5y/JiSNivqT+wEigBXBLRLwiaSBQFhHDgO2BQZKC1Jrolx3+ANAHeJnUvTUiIh7NLtc9B3gNeD5b+vzaiLhpSWK0H2DAgFTzf+TIVFTNzJoNRQ312iWN5vtjE0REn1IFVdd69OgRZWVleYfRdIwfD9tsk2r+X3553tGYWYlImhgRPSo+XswYx+kFt1uTroKaX1eBWSPz7bdpzsYaa6TyImbW7BTTVTWxwkPjJY0tUTzW0F1xRSrVPGxYWnHMzJqdGhOHpJUK7i4FbAb8uGQRWcP11lswcGBa7nWPPfKOxsxyUkxX1UTSGIdIXVRvA0eVMihrgCLSkq8tWsCf/5x3NGaWo2K6qrrWRyDWwN1/P4wYkSb6dXKVF7PmrJj1OPpl5T3K768o6YTShmUNypw5cNJJsOmm0L9/3tGYWc6KmQB4TER8Vn4nIj4FjildSNbgnH02zJqVyqZ7zoZZs1dM4lhK2Uw7+K7w4ZLUrrLG6Lnn4LrrUktjs83yjsbMGoBiBsdHAn+TdD1pkPy3wIiSRmUNw/z5qZzIqqt6nQ0z+04xieNM4FjgeNKVVU8ALvHRHFxzDUyaBA8+CO3b5x2NmTUQxSSONsCNEXE9fNdV1QqYW8rALGfTp8Mf/gC7757mbZiZZYoZ4xhFSh7l2gBPlSYcaxAi4MQT0+1rrwVVtrSKmTVXxbQ4WpeviwEQEV9K8vqgTdkjj8Cjj8Jll0HnznlHY2YNTDEtjq8kbVp+R9JmwP9KF5Ll6osvUmtj443T3A0zswqKaXGcDNwvqXz1vlWB/UsXkuXq3HNh5kx44AFYeum8ozGzBqiYkiMTJK0PrEe6quq1iPi25JFZ/Zs4Ef7yl1Q2fcst847GzBqoYlockJJGN9J6HD+TRETcXrqwrN4tWJDmbKyyClx8cd7RmFkDVkxZ9T+SlnjtBjwO7AaMA5w4mpLBg1OL4957YYUVat7fzJqtYgbH9wV2AD6IiCOB7qR5HNZUzJgB55wDu+wCffvmHY2ZNXDFJI7/RcRCYL6k9sAsYK3ShmX16qSTUnmRv/7VczbMrEbFjHGUZWXVbyQt6vQl8J+SRmX159FH4aGH0rjGWv48YGY1U0QUv7PUBWgfES+VKqBS6NGjR5SVleUdRsPz1VfQrRsstxw8/zws46LHZraIpIkR0aPi48VeVQVARPy3ziKy/J13HrzzDowb56RhZkUrZozDmqJJk+Cqq+Doo6FXr7yjMbNGxImjOSqfs7HSSvCnP+UdjZk1MlV2VUlaq
boDI+KTug/H6sWQIWllvzvuSMnDzKwWqhvjmEha8a+y6zMDX5LbOL3/PgwYADvsAAcfnHc0ZtYIVdlVFRFdI2Kt7HvFr6KShqRdJb0uaZqkAZVs7yxplKSXJI2R1Klg26WSXpE0RdKfy9c9l7SZpJezc373uBXplFNg3ry0jrh/dGa2BGoc41ByiKRzs/trSupZxHEtgMGkEiXdgAMldauw2+XA7RGxMTAQGJQduzXQC9gY2AjYHOidHXMdaSnbdbOvXWuKxTIjRsB998HZZ8O66+YdjZk1UsUMjv8V2Ao4KLv/BSkh1KQnMC0i3oqIb4B7gT0r7NONtMIgwOiC7UEqqLgMqbzJ0sCHklYlzSN5JtIElNuBXxcRi82dCyecAOutB2eemXc0ZtaIFZM4toiIfsDXABHxKekNvSarA+8W3J+RPVZoErBPdnsvYDlJK0fEM6RE8n72NTIipmTHzyDc73oAABM2SURBVKjhnABIOlZSmaSy2bNnFxFuE3fhhfD223D99dDKpcbMbMkVkzi+zbqdAkBSR2BhEcdVNahe6HSgt6QXSF1R75FqYq0DbAB0IiWGPpK2K/Kc6cGIIRHRIyJ6dOzYsYhwm7DJk9MysEccAdtvn3c0ZtbIFZM4/gw8DKwi6SJSSfViFmyYAaxRcL8TMLNwh4iYGRF7R8TPgHOyx+aQWh/PRsSX2Xrnw4Ets3N2qu6cVsHChWlhpuWXT8nDzOwHqjFxRMRdwBmkgev3gV9HxP1FnHsCsK6krpKWAQ4AhhXuIKmDpPIYzgJuyW6/Q2qJtJS0NKk1MiUi3ge+kLRldjXVYcDfi4il+br5Zhg/PiWNDh3yjsbMmoBiJwDOAu4p3FbTBMCImC+pPzASaAHcEhGvSBoIlEXEMNICUYMkBfA00C87/AGgD/AyqStqREQ8mm07HrgNaENqiQwv7qU2Q7NmpYHw7bZL3VRmZnWgyuq4kt5m0QTANYFPs9srAO9ERNf6CvKHarbVcQ89NF1++9JLsP76eUdjZo1MVdVxa5wASGox7BERHSJiZWB34KHShWp14qmn4M470yxxJw0zq0PFDI5vHhGPl9+JiOEsmoxnDdHXX8Pxx8M666TJfmZmdaiY9Tg+kvR74E5S19UhwMcljcp+mIsvhmnT4MknoXXrvKMxsyammBbHgUBH0iW5jwCrZI9ZQ/Taa3DJJamA4Y475h2NmTVBNbY4squnTpLUHliYzauwhigizdlYdlm44oq8ozGzJqqYIoc/zWZ2vwy8ImmipI1KH5rV2tChMHYsXHop/OhHeUdjZk1UMV1VNwCnRkTniOgMnAYMKW1YVmsffQSnn56WgT3qqLyjMbMmrJjEsWxEjC6/ExFjgGVLFpEtmf/7P5gzJxUxXMorAptZ6RRzVdVb2Vocd2T3DwHeLl1IVmtjx8Jtt6U5Gxu5F9HMSquYj6a/IV1V9RDpyqqOwJGlDMpqYd68NCDetSuce27e0ZhZM1DMVVWfAr+rh1hsSVx6aboE9/HHoW3bvKMxs2aguiKHw6raBhARv6r7cKxWpk6Fiy6Cvn1ht93yjsbMmonqWhxbkVbwuwd4jsoXUbK8RKSyIq1awdVX5x2NmTUj1SWOHwM7kWaJHwT8A7gnIl6pj8CsBnffDaNGweDBsOqqeUdjZs1IddVxF0TEiIg4nLT63jRgjKQT6y06q9wnn8Cpp0LPnnDccXlHY2bNTLWD45JaAb8ktTq6kJaRdUn1vA0YAB9/DCNHQosWeUdjZs1MdYPjQ4GNSCvsnR8Rk+stKqva+PFw441w2mmwySZ5R2NmzVB1KwAuBL7K7hbuJCAion2JY6szTWYFwG++gU03hc8/h1dfhXbt8o7IzJqwqlYArLLFERGuW9HQXHklvPIKDBvmpGFmuXFyaCzeegsGDoS99oI99sg7GjNrxpw4GoMI6NcvDYT/+c95R2NmzVwxRQ4tb/ffDyNGpIl+nTrlHY2ZNXNucTR0n30GJ52UBsX79887GjMztzgavHPOgVmz4LHHPGfDzBoEtzgasueeg+uuSy2NzTbLOxozM8CJo+GaPz+VE1ltNbjggryjMTP7TkkTh6RdJb0uaZqkAZVs7yxplKSXJI2R1Cl7/OeSXiz4+lrSr7NtO0h6Pnt8nKR1SvkacnPNNTBpUrqKqn2jmWtpZs1AlTPHf/CJpRbAG6QKuzOACcCBEfFqwT73A49FxFBJfYAjI+LQCudZiVRgsVNEzJX0BrBnREyRdALQMyKOqC6WRjdzfPp06NYN+vRJk/3kivZmVv+qmjleyhZHT2BaRLwVEd8A9wJ7VtinGzAquz26ku0A+wLDI2Judj+A8o/gywMz6zTqvEXAiVkB4muvddIwswanlIljddJCUOVmZI8VmgTsk93eC1hO0soV9jmAtJhUuaOBxyXNAA4FLqmziBuCRx6BRx+F88+Hzp3zjsbM7HtKmTgq+6hcsV/sdKC3pBeA3sB7wPzvTiCtCvwUGFlwzCnALyKiE3ArcGWlTy4dK6lMUtns2bOX/FXUpy++SK2N7t3T3A0zswaolPM4ZgBrFNzvRIVupYiYCewNIKkdsE9EzCnYpS/wcER8m+3TEegeEc9l2+8DRlT25BExBBgCaYzjB7+a+nDuuTBzJjz4ICy9dN7RmJlVqpQtjgnAupK6SlqG1OU0rHAHSR0klcdwFnBLhXMcyOLdVJ8Cy0v6SXZ/J2BKnUeeh4kT4S9/SeuIb7FF3tGYmVWpZC2OiJgvqT+pm6kFcEtEvCJpIFAWEcOA7YFBkgJ4GuhXfrykLqQWy9gK5zwGeDBbL+RT4Deleg31ZsGCNGdjlVXg4ovzjsbMrFolLTkSEY8Dj1d47A8Ftx8AHqji2P/y/cF0IuJh4OE6DTRvgwenFse998Lyy+cdjZlZtTxzPG8zZqR6VLvuCn375h2NmVmNnDjydtJJqbzI4MGes2FmjYKr4+bp0UfhoYdg0CBYa628ozEzK4pbHHn56qtU9XbDDeG00/KOxsysaG5x5OW88+Cdd2DcOM/ZMLNGxS2OPEyaBFddBcccA7165R2NmVmtOHHUt/I5GyutBJc0rTJbZtY8uKuqvt1wQ1rZ7847U/IwM2tk3OKoT++/D2edBTvuCAcdlHc0ZmZLxImjPp1yCsybB3/9q+dsmFmj5cRRX0aMgPvuS7PE110372jMzJaYE0d9mDsXTjgB1lsPzjgj72jMzH4QD47XhwsugLffhjFjoFWrvKMxM/tB3OIotcmT4fLL4YgjoHfvvKMxM/vBnDhKaeFC+O1vU6n0yy7LOxozszrhrqpSuvlmGD8ebr0VOnTIOxozszrhFkepfPhhGgjv3RsOPzzvaMzM6owTR6mcdlqqgHv99Z6zYWZNihNHKTz1FNx1FwwYAOuvn3c0ZmZ1yomjrn39NRx/PKyzDpx9dt7RmJnVOQ+O17WLL4Zp0+DJJ6F167yjMTOrc25x1KXXXkul0g8+OBUyNDNrgpw46kpEmrPRrh1ceWXe0ZiZlYy7qurK0KEwdiwMGQKrrJJ3NGZmJeMWR1346CM4
/fS0DOxRR+UdjZlZSTlx1IX/+z+YMyfN2VjKP1Iza9r8LvdDjRkDt92WWhwbbZR3NGZmJVfSxCFpV0mvS5omaUAl2ztLGiXpJUljJHXKHv+5pBcLvr6W9OtsmyRdJOkNSVMk/a6Ur6Fa8+alAfGuXeHcc3MLw8ysPpVscFxSC2AwsBMwA5ggaVhEvFqw2+XA7RExVFIfYBBwaESMBjbJzrMSMA14IjvmCGANYP2IWCgpv5HoSy+F11+H4cOhbdvcwjAzq0+lbHH0BKZFxFsR8Q1wL7BnhX26AaOy26Mr2Q6wLzA8IuZm948HBkbEQoCImFXnkRdj6lS46CLo2xd23TWXEMzM8lDKxLE68G7B/RnZY4UmAftkt/cClpO0coV9DgDuKbi/NrC/pDJJwyVVuoC3pGOzfcpmz569xC+iUhGprEirVnD11XV7bjOzBq6UiaOykrBR4f7pQG9JLwC9gfeA+d+dQFoV+CkwsuCYVsDXEdEDuBG4pbInj4ghEdEjInp07NhxyV9FZe6+G0aNgkGDYNVV6/bcZmYNXCknAM4gjUWU6wTMLNwhImYCewNIagfsExFzCnbpCzwcEd9WOO+D2e2HgVvrOO7qffIJnHIKbLEFHHdcvT61mVlDUMoWxwRgXUldJS1D6nIaVriDpA6SymM4i++3Hg5k8W4qgEeAPtnt3sAbdRp1TQYMSMnjhhugRYt6fWozs4agZIkjIuYD/UndTFOAv0XEK5IGSvpVttv2wOuS3gB+BFxUfrykLqQWy9gKp74E2EfSy6SrsI4u1Wv4nvHj4cYb4eSToXv3entaM7OGRBEVhx2anh49ekRZWdkPO8k338Cmm8IXX8Arr6RihmZmTZikidl48mJc5LBYV16ZEsawYU4aZtasueRIMd56C84/H/beG/bYI+9ozMxy5cRRkwjo1w9atoRrrsk7GjOz3Lmrqib33w8jRqSJfp065R2NmVnu3OKozmefwUknwWabQf/+eUdjZtYguMVRnXPOgVmz4LHHPGfDzCzjFkd1unaFM85ILQ4zMwPc4qje6afnHYGZWYPjFoeZmdWKE4eZmdWKE4eZmdWKE4eZmdWKE4eZmdWKE4eZmdWKE4eZmdWKE4eZmdVKs1jISdJsYPoSHt4B+KgOw6krjqt2HFftOK7aaapxdY6IjhUfbBaJ44eQVFbZClh5c1y147hqx3HVTnOLy11VZmZWK04cZmZWK04cNRuSdwBVcFy147hqx3HVTrOKy2McZmZWK25xmJlZrThxmJlZrThxVEHSGpJGS5oi6RVJJ+UdE4Ck1pL+I2lSFtf5ecdUTlILSS9IeizvWApJ+q+klyW9KKks73jKSVpB0gOSXsv+zrZqADGtl/2cyr8+l3Ry3nEBSDol+5ufLOkeSa3zjglA0klZTK/k+bOSdIukWZImFzy2kqQnJU3Nvq9YF8/lxFG1+cBpEbEBsCXQT1K3nGMCmAf0iYjuwCbArpK2zDmmcicBU/IOogo/j4hNGti19tcAIyJifaA7DeBnFxGvZz+nTYDNgLnAwzmHhaTVgd8BPSJiI6AFcEC+UYGkjYBjgJ6k3+HuktbNKZzbgF0rPDYAGBUR6wKjsvs/mBNHFSLi/Yh4Prv9BemfevV8o4JIvszuLp195X6Fg6ROwC+Bm/KOpTGQ1B7YDrgZICK+iYjP8o3qe3YA3oyIJa26UNdaAm0ktQTaAjNzjgdgA+DZiJgbEfOBscBeeQQSEU8Dn1R4eE9gaHZ7KPDrunguJ44iSOoC/Ax4Lt9IkqxL6EVgFvBkRDSEuK4GzgAW5h1IJQJ4QtJEScfmHUxmLWA2cGvWvXeTpGXzDqqCA4B78g4CICLeAy4H3gHeB+ZExBP5RgXAZGA7SStLagv8Algj55gK/Sgi3of0YRhYpS5O6sRRA0ntgAeBkyPi87zjAYiIBVlXQiegZ9Zczo2k3YFZETExzziq0SsiNgV2I3U5bpd3QKRPz5sC10XEz4CvqKNuhLogaRngV8D9eccCkPXN7wl0BVYDlpV0SL5RQURMAf4EPAmMACaRurmbNCeOakhampQ07oqIh/KOp6Ksa2MM3+/XrG+9gF9J+i9wL9BH0p35hrRIRMzMvs8i9df3zDciAGYAMwpaiw+QEklDsRvwfER8mHcgmR2BtyNidkR8CzwEbJ1zTABExM0RsWlEbEfqKpqad0wFPpS0KkD2fVZdnNSJowqSROp/nhIRV+YdTzlJHSWtkN1uQ/qHei3PmCLirIjoFBFdSN0b/4yI3D8NAkhaVtJy5beBnUndC7mKiA+AdyWtlz20A/BqjiFVdCANpJsq8w6wpaS22f/mDjSAiwkAJK2SfV8T2JuG9XMbBhye3T4c+HtdnLRlXZykieoFHAq8nI0nAJwdEY/nGBPAqsBQSS1Iif9vEdGgLn9tYH4EPJzea2gJ3B0RI/IN6TsnAndl3UJvAUfmHA8AWV/9TsBxecdSLiKek/QA8DypK+gFGk6ZjwclrQx8C/SLiE/zCELSPcD2QAdJM4A/ApcAf5N0FCn57lcnz+WSI2ZmVhvuqjIzs1px4jAzs1px4jAzs1px4jAzs1px4jAzs1px4jCrQ5K2z7M6sKQjJF2b1/Nb8+DEYWbfyeYHmVXLicOaHUmHZGuavCjphvI3S0lfSrpC0vOSRknqmD2+iaRnJb0k6eHyNQ0krSPpqWxtlOclrZ09RbuCdTbuymY6V4xhjKQ/ZXG8IWnb7PHFWgySHpO0fUF8f8qKNT4lqWd2nrck/arg9GtIGiHpdUl/LPJ1D5T0HJD7miDW8DlxWLMiaQNgf1Lhw02ABcDB2eZlSfWZNiWVxy5/070dODMiNgZeLnj8LmBwtjbK1qSqrZAqKZ8MdCNVwe1VRTgtI6Jntu8fq9in0LLAmIjYDPgCuJA0w3svYGDBfj2z17QJsJ+kHkW87skRsUVEjCsiDmvmXHLEmpsdSAsUTcgaAm1YVPhtIXBfdvtO4CFJywMrRMTY7PGhwP1Z/avVI+JhgIj4GiA7538iYkZ2/0WgC1DZG3J54cyJ2T41+YZUgRVSApsXEd9KernC8U9GxMfZ8z8EbEMq01HV615AKuZpVhQnDmtuBAyNiLOK2Le6ejzf634qMK/g9gKq/j+bV8k+81m8J6BwedRvY1GNoIXlx0fEwmxxo6riDqp/3V9HxIIqYjT7HndVWXMzCti3oKLpSpI6Z9uWAvbNbh8EjIuIOcCn5WMQpMKXY7O1WWZI+nV2nlZZccAf6r/AJpKWkrQGS1YCfqfsdbUhrfg2nupft1mtuMVhzUpEvCrp96QVAZciq2gKTCctprShpInAHNKYAKRy1NdniaGwiu2hwA2SBmbnqYvKo+OBt0ldUZNJ1WBraxxwB7AOqRpwGUA1r9usVlwd1ywj6cuIaJd3HGYNnbuqzMysVtziMDOzWnGLw8zMasWJw8zMasWJw8zMasWJw8zMasWJw8zMauX/AWFVESTQvY98AAAAAElFTkSuQmCC\n",
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {
+ "needs_background": "light"
+ },
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "def eval_show(epoch_per_eval):\n",
+ "    plt.xlabel(\"epoch number\")\n",
+ "    plt.ylabel(\"Model accuracy\")\n",
+ "    plt.title(\"Model accuracy variation chart\")\n",
+ "    plt.plot(epoch_per_eval[\"epoch\"], epoch_per_eval[\"acc\"], \"red\")\n",
+ "    plt.show()\n",
+ "\n",
+ "eval_show(epoch_per_eval)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "From the chart above, the optimal model can be picked out at a glance."
+ ]
+ },
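+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As a possible follow-up (a sketch rather than part of the original workflow), the cell below reloads one of the saved checkpoints with `load_checkpoint` and `load_param_into_net` and validates it again. `checkpoint_lenet-8_1875.ckpt` is chosen only because epoch 8 gave the highest accuracy in the run above; replace the file name with the best epoch of your own run."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Minimal sketch: reload the selected checkpoint and re-validate it.\n",
+ "from mindspore.train.serialization import load_checkpoint, load_param_into_net\n",
+ "\n",
+ "best_ckpt = \"./lenet_ckpt/checkpoint_lenet-8_1875.ckpt\"  # pick the file matching the best epoch in your run\n",
+ "best_net = LeNet5()\n",
+ "param_dict = load_checkpoint(best_ckpt)    # read the parameters saved in the checkpoint file\n",
+ "load_param_into_net(best_net, param_dict)  # load them into a fresh LeNet5 instance\n",
+ "best_model = Model(best_net, net_loss, metrics={\"Accuracy\": Accuracy()})\n",
+ "print(best_model.eval(eval_data, dataset_sink_mode=True))"
+ ]
+ },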
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Summary"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "This example trains the convolutional neural network LeNet5 on the MNIST dataset, focusing on how to use a callback function to validate the model while it is being trained, save the checkpoint for each corresponding `epoch`, and pick the optimal model from them."
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 3",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.7.6"
+ },
+ "toc": {
+ "base_numbering": 1,
+ "nav_menu": {},
+ "number_sections": true,
+ "sideBar": true,
+ "skip_h1_title": false,
+ "title_cell": "Table of Contents",
+ "title_sidebar": "Contents",
+ "toc_cell": false,
+ "toc_position": {},
+ "toc_section_display": true,
+ "toc_window_display": true
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/tutorials/source_zh_cn/advanced_use/images/synchronization_training_and_evaluation.png b/tutorials/source_zh_cn/advanced_use/images/synchronization_training_and_evaluation.png
new file mode 100644
index 0000000000000000000000000000000000000000..cbecb6c9739eaf047c89ea79f9d596a2793e6283
Binary files /dev/null and b/tutorials/source_zh_cn/advanced_use/images/synchronization_training_and_evaluation.png differ
diff --git a/tutorials/source_zh_cn/advanced_use/synchronization_training_and_evaluation.md b/tutorials/source_zh_cn/advanced_use/synchronization_training_and_evaluation.md
new file mode 100644
index 0000000000000000000000000000000000000000..6e6932f1894f5a6caa018e5a7684e738a323f294
--- /dev/null
+++ b/tutorials/source_zh_cn/advanced_use/synchronization_training_and_evaluation.md
@@ -0,0 +1,174 @@
+# Synchronizing Model Training and Validation
+
+
+
+- [Synchronizing Model Training and Validation](#synchronizing-model-training-and-validation)
+    - [Overview](#overview)
+    - [Defining the Callback Function EvalCallBack](#defining-the-callback-function-evalcallback)
+    - [Defining and Running the Training Network](#defining-and-running-the-training-network)
+    - [Defining a Function to Plot Model Accuracy at Different Epochs](#defining-a-function-to-plot-model-accuracy-at-different-epochs)
+    - [Summary](#summary)
+
+
+
+
+
+
+
+## Overview
+
+Complex networks often require dozens or even hundreds of epochs of training, and before training starts it is hard to know at which epoch the model will reach a satisfactory accuracy. A common practice is therefore to validate the model at fixed epoch intervals during training and save the corresponding checkpoints; once training finishes, the change in accuracy across these checkpoints makes it easy to pick the relatively optimal model. This tutorial demonstrates the approach with the LeNet network.
+
+The process is as follows:
+1. Define the callback function EvalCallBack to run training and validation synchronously.
+2. Define the training network and execute it.
+3. Plot the model accuracy at different epochs as a line chart and pick the optimal model.
+
+For the complete example, see the [notebook](https://gitee.com/mindspore/docs/blob/master/tutorials/notebook/synchronization_training_and_evaluation.ipynb).
+
+## Defining the Callback Function EvalCallBack
+
+Idea: validate the model accuracy every n epochs. This is implemented in a custom callback; for the detailed usage, see the [API description](https://www.mindspore.cn/api/zh-CN/master/api/python/mindspore/mindspore.train.html?highlight=callback#mindspore.train.callback.Callback).
+
+Core implementation: set the validation point inside the callback's `epoch_end` method, as follows:
+
+`cur_epoch % eval_per_epoch == 0`: the model accuracy is validated at the end of every `eval_per_epoch` epochs.
+
+- `cur_epoch`: the epoch number of the current training process.
+- `eval_per_epoch`: a user-defined value, i.e. the validation frequency.
+
+Other parameters:
+
+- `model`: the MindSpore `Model` instance.
+- `eval_dataset`: the validation dataset.
+- `epoch_per_eval`: records the validation accuracy and the corresponding epoch number, in the form `{"epoch": [], "acc": []}`.
+
+```python
+import matplotlib.pyplot as plt
+from mindspore.train.callback import Callback
+
+class EvalCallBack(Callback):
+    def __init__(self, model, eval_dataset, eval_per_epoch):
+        self.model = model
+        self.eval_dataset = eval_dataset
+        self.eval_per_epoch = eval_per_epoch
+
+    def epoch_end(self, run_context):
+        cb_param = run_context.original_args()
+        cur_epoch = cb_param.cur_epoch_num
+        if cur_epoch % self.eval_per_epoch == 0:
+            acc = self.model.eval(self.eval_dataset, dataset_sink_mode=True)
+            epoch_per_eval["epoch"].append(cur_epoch)
+            epoch_per_eval["acc"].append(acc["Accuracy"])
+            print(acc)
+
+```
+
+## Defining and Running the Training Network
+
+In `CheckpointConfig`, which controls checkpoint saving, the number of steps in a single epoch must be known so that the checkpoint interval matches the desired validation frequency. In this example there are 1875 steps per epoch (60,000 training images / batch size 32), and the model is validated every two epochs, so `save_checkpoint_steps=eval_per_epoch*1875`, where the variable `eval_per_epoch` equals 2.
+
+Parameter descriptions:
+
+- `config_ck`: the checkpoint saving configuration.
+    - `save_checkpoint_steps`: save a checkpoint every this many steps.
+    - `keep_checkpoint_max`: the maximum number of checkpoints to keep.
+- `ckpoint_cb`: the checkpoint callback defining the checkpoint name prefix and directory.
+- `model`: the model.
+- `model.train`: the model training function.
+- `epoch_per_eval`: the dictionary that collects the epoch numbers and the corresponding model accuracy.
+
+```python
+from mindspore.train.serialization import load_checkpoint, load_param_into_net
+from mindspore.train.callback import ModelCheckpoint, CheckpointConfig, LossMonitor
+from mindspore.train import Model
+from mindspore import context
+from mindspore.nn.metrics import Accuracy
+from mindspore.nn.loss import SoftmaxCrossEntropyWithLogits
+
+if __name__ == "__main__":
+    context.set_context(mode=context.GRAPH_MODE, device_target="GPU")
+    ckpt_save_dir = "./lenet_ckpt"
+    eval_per_epoch = 2
+
+    ... ...
+
+    # need to calculate how many steps are in each epoch; in this example, 1875 steps per epoch
+    config_ck = CheckpointConfig(save_checkpoint_steps=eval_per_epoch*1875, keep_checkpoint_max=15)
+    ckpoint_cb = ModelCheckpoint(prefix="checkpoint_lenet", directory=ckpt_save_dir, config=config_ck)
+    model = Model(network, net_loss, net_opt, metrics={"Accuracy": Accuracy()})
+
+    epoch_per_eval = {"epoch": [], "acc": []}
+    eval_cb = EvalCallBack(model, eval_data, eval_per_epoch)
+
+    model.train(epoch_size, train_data, callbacks=[ckpoint_cb, LossMonitor(375), eval_cb],
+                dataset_sink_mode=True)
+```
+
+Output:
+
+    epoch: 1 step: 375, loss is 2.298612
+    epoch: 1 step: 750, loss is 2.075152
+    epoch: 1 step: 1125, loss is 0.39205977
+    epoch: 1 step: 1500, loss is 0.12368304
+    epoch: 1 step: 1875, loss is 0.20988345
+    epoch: 2 step: 375, loss is 0.20582482
+    epoch: 2 step: 750, loss is 0.029070046
+    epoch: 2 step: 1125, loss is 0.041760832
+    epoch: 2 step: 1500, loss is 0.067035824
+    epoch: 2 step: 1875, loss is 0.0050643035
+    {'Accuracy': 0.9763621794871795}
+
+    ... ...
+
+    epoch: 9 step: 375, loss is 0.021227183
+    epoch: 9 step: 750, loss is 0.005586236
+    epoch: 9 step: 1125, loss is 0.029125651
+    epoch: 9 step: 1500, loss is 0.00045874066
+    epoch: 9 step: 1875, loss is 0.023556218
+    epoch: 10 step: 375, loss is 0.0005807788
+    epoch: 10 step: 750, loss is 0.02574059
+    epoch: 10 step: 1125, loss is 0.108463734
+    epoch: 10 step: 1500, loss is 0.01950589
+    epoch: 10 step: 1875, loss is 0.10563098
+    {'Accuracy': 0.979667467948718}
+
+
+In the same directory you can find the `lenet_ckpt` folder, which holds 5 checkpoint files and one computational-graph file, with the following structure:
+
+```
+lenet_ckpt
+├── checkpoint_lenet-10_1875.ckpt
+├── checkpoint_lenet-2_1875.ckpt
+├── checkpoint_lenet-4_1875.ckpt
+├── checkpoint_lenet-6_1875.ckpt
+├── checkpoint_lenet-8_1875.ckpt
+└── checkpoint_lenet-graph.meta
+```
+
+## Defining a Function to Plot Model Accuracy at Different Epochs
+
+Define the plotting function `eval_show`, pass `epoch_per_eval` into it, and draw a line chart of the model's validation accuracy at different epochs.
+
+
+```python
+def eval_show(epoch_per_eval):
+    plt.xlabel("epoch number")
+    plt.ylabel("Model accuracy")
+    plt.title("Model accuracy variation chart")
+    plt.plot(epoch_per_eval["epoch"], epoch_per_eval["acc"], "red")
+    plt.show()
+
+eval_show(epoch_per_eval)
+```
+
+Output:
+
+![png](images/synchronization_training_and_evaluation.png)
+
+From the chart above, the optimal model can be picked out at a glance.
+
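+As a possible follow-up (a sketch rather than part of the original tutorial code), the snippet below reloads the checkpoint selected from the chart with `load_checkpoint` and `load_param_into_net` and validates it again. `checkpoint_lenet-8_1875.ckpt` is only an illustrative choice, and `LeNet5`, `net_loss`, `eval_data` and the other objects come from the full training script in the notebook.
+
+```python
+# Minimal sketch: reload the selected checkpoint and re-validate it.
+from mindspore.train.serialization import load_checkpoint, load_param_into_net
+
+best_ckpt = "./lenet_ckpt/checkpoint_lenet-8_1875.ckpt"  # pick the file matching the best epoch in your run
+best_net = LeNet5()
+param_dict = load_checkpoint(best_ckpt)    # read the parameters saved in the checkpoint file
+load_param_into_net(best_net, param_dict)  # load them into a fresh LeNet5 instance
+best_model = Model(best_net, net_loss, metrics={"Accuracy": Accuracy()})
+print(best_model.eval(eval_data, dataset_sink_mode=True))
+```
+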
+## Summary
+
+This tutorial trains the convolutional neural network LeNet5 on the MNIST dataset, focusing on how to validate the model while it is being trained, save the checkpoint for each corresponding `epoch`, and pick the optimal model from them.
diff --git a/tutorials/source_zh_cn/index.rst b/tutorials/source_zh_cn/index.rst
index 33171ea18ab9b7a45a08c40f6aeea694c4cc7efa..bb6bdfdedde13c651bd3abbfd4a147572aa853fc 100644
--- a/tutorials/source_zh_cn/index.rst
+++ b/tutorials/source_zh_cn/index.rst
@@ -33,6 +33,7 @@ MindSpore教程
advanced_use/computer_vision_application
advanced_use/nlp_application
advanced_use/second_order_optimizer_for_resnet50_application
+ advanced_use/synchronization_training_and_evaluation
.. toctree::
:glob: