diff --git a/tutorials/source_zh_cn/advanced_use/bert_poetry.md b/tutorials/source_zh_cn/advanced_use/bert_poetry.md
index e4fc7f37ca8902a1b3d9a24647a95dc4c9cd08f8..6cdb95fe9c99617bf9603cb2a3c5309bb77b751e 100644
--- a/tutorials/source_zh_cn/advanced_use/bert_poetry.md
+++ b/tutorials/source_zh_cn/advanced_use/bert_poetry.md
@@ -81,7 +81,7 @@ BERT采用了Encoder结构，`attention_mask`为全1的向量，即每个token
 
 ![Teaser image](images/finetune.PNG)
 
-图2：训练流程示意图
+图3：训练流程示意图
 
 ## 样例代码