description: Chinese BERT with Whole Word Masking
description_en: Chinese BERT with Whole Word Masking
from_repo: https://huggingface.co/hfl/chinese-bert-wwm-ext
Paper:
  - title: Pre-Training with Whole Word Masking for Chinese BERT
    url: http://arxiv.org/abs/1906.08101v3
  - title: Revisiting Pre-Trained Models for Chinese Natural Language Processing
    url: http://arxiv.org/abs/2004.13922v2
Publisher: hfl
Task:
  - sub_tag: 槽位填充
    sub_tag_en: Fill-Mask
    tag: 自然语言处理
    tag_en: Natural Language Processing