description: Please use 'Bert' related functions to load this model!
description_en: Please use 'Bert' related functions to load this model!
from_repo: https://huggingface.co/hfl/chinese-roberta-wwm-ext-large
icon: https://paddlenlp.bj.bcebos.com/models/community/transformer-layer.png
name: hfl/chinese-roberta-wwm-ext-large
Paper:
- title: Pre-Training with Whole Word Masking for Chinese BERT
  url: http://arxiv.org/abs/1906.08101v3
- title: Revisiting Pre-Trained Models for Chinese Natural Language Processing
  url: http://arxiv.org/abs/2004.13922v2
Publisher: hfl
Task:
- sub_tag: 槽位填充
  sub_tag_en: Fill-Mask
  tag: 自然语言处理
  tag_en: Natural Language Processing
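A minimal usage sketch (not part of the original card): the description says to load this checkpoint with 'Bert' related functions, so the snippet below uses PaddleNLP's `BertTokenizer` and `BertModel`. The sample sentence and the shape comment are illustrative assumptions, not from the card.

```python
# Minimal sketch: load the checkpoint via PaddleNLP's Bert classes,
# as the card's description instructs. Assumes `paddlenlp` is installed.
import paddle
from paddlenlp.transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-roberta-wwm-ext-large")
model = BertModel.from_pretrained("hfl/chinese-roberta-wwm-ext-large")

# Encode a short Chinese sentence (illustrative example) and run a forward pass.
encoded = tokenizer("欢迎使用PaddleNLP!")
input_ids = paddle.to_tensor([encoded["input_ids"]])
token_type_ids = paddle.to_tensor([encoded["token_type_ids"]])

sequence_output, pooled_output = model(input_ids, token_type_ids=token_type_ids)
print(sequence_output.shape)  # expected [1, seq_len, 1024] for a -large config
```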