diff --git a/.metas/ernie2.0_paper.png b/.metas/ernie2.0_paper.png
new file mode 100644
index 0000000000000000000000000000000000000000..5bc38eacd4a955afd4a13756a2e9d06aca1a245c
Binary files /dev/null and b/.metas/ernie2.0_paper.png differ
diff --git a/README.md b/README.md
index d5b3f9989e09fcc540b91a622e46b94dc3c7892f..334cb4a124cd93a002d5db39e2f4d12dede1c99e 100644
--- a/README.md
+++ b/README.md
@@ -3,7 +3,6 @@ English | [简体中文](./README.zh.md)
 
 ## ERNIE 2.0: A Continual Pre-training Framework for Language Understanding
 
-* [Continual Pre-training Framework for Language Understanding](#continual-pre-training-framework-for-language-understanding)
 * [Pre-training Tasks](#pre-training-tasks)
   * [Word-aware Tasks](#word-aware-tasks)
     * [Knowledge Masking Task](#knowledge-masking-task)
@@ -21,7 +20,11 @@
 * [Results on Chinese Datasets](#results-on-chinese-datasets)
 
-### Continual Pre-training Framework for Language Understanding
+![ernie2.0_paper](.metas/ernie2.0_paper.png)
+
+arxiv: ERNIE 2.0: A Continual Pre-training Framework for Language Understanding, link
+
+---
 
 **[ERNIE 2.0](https://arxiv.org/abs/1907.12412v1) is a continual pre-training framework for language understanding** in which pre-training tasks can be incrementally built and learned through multi-task learning. In this framework, different customized tasks can be introduced incrementally at any time. For example, tasks such as named entity prediction, discourse relation recognition, and sentence order prediction are leveraged to enable the models to learn language representations.
diff --git a/README.zh.md b/README.zh.md
index 330548db165bffdf6e46531f24f74957e7123905..e42e06bbc813ba16da3518f39bc644b42da5b6f0 100644
--- a/README.zh.md
+++ b/README.zh.md
@@ -3,7 +3,6 @@
 
 ## ERNIE 2.0: A Continual Pre-training Framework for Language Understanding
 
-* [持续学习语义理解框架](#持续学习语义理解框架)
 * [Pre-Training 任务](#pre-training-任务)
   * [Word-aware Tasks](#word-aware-tasks)
     * [Knowledge Masking Task](#knowledge-masking-task)
@@ -21,7 +20,11 @@
 * [英文效果验证](#英文效果验证)
 
-### 持续学习语义理解框架
+![ernie2.0_paper](.metas/ernie2.0_paper.png)
+
+arxiv: ERNIE 2.0: A Continual Pre-training Framework for Language Understanding, link
+
+---
 
 **[ERNIE 2.0](https://arxiv.org/abs/1907.12412v1)** 是基于持续学习的语义理解预训练框架，使用多任务学习增量式构建预训练任务。**[ERNIE 2.0](https://arxiv.org/abs/1907.12412v1)** 中，新构建的预训练任务类型可以无缝地加入训练框架，持续地进行语义理解学习。通过新增的实体预测、句子因果关系判断、文章句子结构重建等语义任务，**[ERNIE 2.0](https://arxiv.org/abs/1907.12412v1)** 语义理解预训练模型从训练数据中获取了词法、句法、语义等多个维度的自然语言信息，极大地增强了通用语义表示能力。
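The README text in the diff above describes the core idea of the framework: a shared model trained with multi-task learning, where new pre-training tasks can be introduced at any time while previously introduced tasks stay in the training mix. A minimal Python sketch of that scheduling idea follows (illustrative only — the `ContinualTrainer` class and its bookkeeping are invented for this example and are not ERNIE 2.0's actual training code):

```python
# Toy sketch of continual multi-task pre-training scheduling.
# Task names follow the README; the parameter update is a placeholder
# standing in for a real gradient step on a shared encoder.
import random

class ContinualTrainer:
    def __init__(self):
        self.tasks = []           # pre-training tasks introduced so far
        self.shared_params = 0.0  # stand-in for shared encoder parameters

    def add_task(self, name):
        # New tasks can be introduced incrementally at any time;
        # earlier tasks remain in the pool rather than being replaced.
        self.tasks.append(name)

    def train_round(self, steps=10):
        # Multi-task learning: each step samples one task from the pool,
        # so old tasks keep contributing training signal alongside new ones.
        counts = {t: 0 for t in self.tasks}
        for _ in range(steps):
            task = random.choice(self.tasks)
            counts[task] += 1
            self.shared_params += 0.01  # placeholder shared update
        return counts

trainer = ContinualTrainer()
trainer.add_task("knowledge masking")
trainer.train_round()
trainer.add_task("sentence reordering")    # introduced later, seamlessly
trainer.add_task("discourse relation")
counts = trainer.train_round(steps=30)
print(counts)  # all three tasks appear as keys in the later round
```

The point of the sketch is the schedule, not the model: because sampling draws from the full task pool, the first task continues to be trained after the later ones are added, which is the "continual" aspect the README describes.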