Datasets: ''
Example: null
IfOnlineDemo: 0
IfTraining: 0
Language: Chinese
License: apache-2.0
Model_Info:
  description: Please use 'Bert' related functions to load this model!
  description_en: Please use 'Bert' related functions to load this model!
  from_repo: https://huggingface.co/hfl/chinese-roberta-wwm-ext-large
  icon: https://paddlenlp.bj.bcebos.com/models/community/transformer-layer.png
  name: hfl/chinese-roberta-wwm-ext-large
Paper:
- title: Pre-Training with Whole Word Masking for Chinese BERT
  url: http://arxiv.org/abs/1906.08101v3
- title: Revisiting Pre-Trained Models for Chinese Natural Language Processing
  url: http://arxiv.org/abs/2004.13922v2
Publisher: hfl
Task:
- sub_tag: 槽位填充
  sub_tag_en: Fill-Mask
  tag: 自然语言处理
  tag_en: Natural Language Processing
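
The description above says to load this checkpoint with 'Bert'-related functions. Below is a minimal loading sketch, assuming PaddleNLP is installed; the model name comes from the `name` field above, and the input sentence is only an illustrative placeholder.

```python
import paddle
from paddlenlp.transformers import BertModel, BertTokenizer

# Per the card's description, BERT-class loaders are used for this
# RoBERTa-wwm-ext checkpoint (the `name` field in the metadata above).
MODEL_NAME = "hfl/chinese-roberta-wwm-ext-large"

tokenizer = BertTokenizer.from_pretrained(MODEL_NAME)
model = BertModel.from_pretrained(MODEL_NAME)

# Placeholder input; wrap each field in a list to add a batch dimension.
encoded = tokenizer("欢迎使用PaddleNLP!")
encoded = {k: paddle.to_tensor([v]) for k, v in encoded.items()}

sequence_output, pooled_output = model(**encoded)
print(sequence_output.shape)  # [1, seq_len, 1024] for the large model
```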