Commit 5afbe471 authored by liyukun01

update paper arxiv link

Parent a75ba9d4
@@ -3,7 +3,6 @@ English | [简体中文](./README.zh.md)
## ERNIE 2.0: A Continual Pre-training Framework for Language Understanding
* [Continual Pre-training Framework for Language Understanding](#continual-pre-training-framework-for-language-understanding)
* [Pre-training Tasks](#pre-training-tasks)
* [Word-aware Tasks](#word-aware-tasks)
* [Knowledge Masking Task](#knowledge-masking-task)
@@ -21,7 +20,11 @@ English | [简体中文](./README.zh.md)
* [Results on Chinese Datasets](#results-on-chinese-datasets)
### Continual Pre-training Framework for Language Understanding
![ernie2.0_paper](.metas/ernie2.0_paper.png)
<div align="center"><i>arXiv: ERNIE 2.0: A Continual Pre-training Framework for Language Understanding</i>, <a href="https://arxiv.org/abs/1907.12412v1" target="_blank"><i>link</i></a></div>
---
**[ERNIE 2.0](https://arxiv.org/abs/1907.12412v1) is a continual pre-training framework for language understanding** in which pre-training tasks can be incrementally built and learned through multi-task learning. In this framework, different customized tasks can be introduced incrementally at any time. For example, tasks such as named entity prediction, discourse relation recognition, and sentence order prediction are leveraged to enable the models to learn language representations, as sketched below.
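As a rough illustration of this incremental multi-task setup, the sketch below shows how a new pre-training task could join an ongoing training loop while earlier tasks remain in the sampling mix. This is a minimal, framework-agnostic Python sketch; the names `Task`, `ContinualTrainer`, and `train_step` are illustrative assumptions, not the actual ERNIE 2.0 / PaddlePaddle API.

```python
# Hypothetical sketch of continual multi-task pre-training; not the
# actual ERNIE 2.0 implementation.
import random
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Task:
    name: str
    next_batch: Callable[[], object]    # supplies a batch of task data
    loss_fn: Callable[[object], float]  # loss over the shared encoder output


@dataclass
class ContinualTrainer:
    tasks: List[Task] = field(default_factory=list)

    def add_task(self, task: Task) -> None:
        # New tasks can be introduced at any point; earlier tasks stay
        # in the mix so previously learned knowledge is retained.
        self.tasks.append(task)

    def train_step(self) -> float:
        # Multi-task learning: each step samples one task (here at random),
        # so the shared encoder keeps receiving updates from every task.
        task = random.choice(self.tasks)
        return task.loss_fn(task.next_batch())


trainer = ContinualTrainer()
trainer.add_task(Task("knowledge_masking", lambda: "batch", lambda b: 0.0))
# Later in pre-training, a new task joins without restarting from scratch:
trainer.add_task(Task("sentence_order_prediction", lambda: "batch", lambda b: 0.0))
loss = trainer.train_step()
```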
@@ -3,7 +3,6 @@
## ERNIE 2.0: A Continual Pre-training Framework for Language Understanding
* [Continual Pre-training Framework for Language Understanding](#持续学习语义理解框架)
* [Pre-Training Tasks](#pre-training-任务)
* [Word-aware Tasks](#word-aware-tasks)
* [Knowledge Masking Task](#knowledge-masking-task)
@@ -21,7 +20,11 @@
* [Results on English Datasets](#英文效果验证)
### Continual Pre-training Framework for Language Understanding
![ernie2.0_paper](.metas/ernie2.0_paper.png)
<div align="center"><i>arXiv: ERNIE 2.0: A Continual Pre-training Framework for Language Understanding</i>, <a href="https://arxiv.org/abs/1907.12412v1" target="_blank"><i>link</i></a></div>
---
**[ERNIE 2.0](https://arxiv.org/abs/1907.12412v1)** is a continual-learning-based pre-training framework for language understanding that uses multi-task learning to build pre-training tasks incrementally. In **[ERNIE 2.0](https://arxiv.org/abs/1907.12412v1)**, newly constructed pre-training tasks can be seamlessly added to the training framework, so that language understanding is learned continually. Through newly added semantic tasks such as entity prediction, sentence causal-relation classification, and document sentence-structure reconstruction, the **[ERNIE 2.0](https://arxiv.org/abs/1907.12412v1)** pre-trained model captures lexical, syntactic, and semantic information from the training data, greatly strengthening its general semantic representation capability.
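To make one of these tasks concrete, the toy sketch below shows how a sentence-structure (reordering) example can be constructed: a paragraph's sentences are shuffled, and the example is labeled with the permutation that was applied, so the model learns document structure by classifying the original order. This is an illustrative assumption about the data construction, not the repository's actual pipeline.

```python
# Toy sentence-reordering data construction; an assumption for illustration,
# not the repo's actual preprocessing code.
import itertools
import random
from typing import List, Tuple


def make_reordering_example(sentences: List[str]) -> Tuple[List[str], int]:
    """Shuffle a paragraph's sentences; the label is the index of the
    permutation applied, which the model must recover as a k-class
    classification over all permutations."""
    perms = list(itertools.permutations(range(len(sentences))))
    label = random.randrange(len(perms))             # which permutation was applied
    shuffled = [sentences[i] for i in perms[label]]
    return shuffled, label


example, label = make_reordering_example([
    "ERNIE 2.0 is a continual pre-training framework.",
    "New tasks are added incrementally.",
    "Multi-task learning keeps earlier knowledge.",
])
```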