description: BERT large model (cased) whole word masking
description_en: BERT large model (cased) whole word masking
from_repo: https://huggingface.co/bert-large-cased-whole-word-masking
icon: https://paddlenlp.bj.bcebos.com/models/community/transformer-layer.png
name: bert-large-cased-whole-word-masking
Paper:
- title: 'BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding'
  url: http://arxiv.org/abs/1810.04805v2
Publisher: huggingface
Task:
- sub_tag: 槽位填充
  sub_tag_en: Fill-Mask
  tag: 自然语言处理
  tag_en: Natural Language Processing
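The "whole word masking" in this model's name means that during pre-training, when any WordPiece of a word is chosen for masking, all sub-pieces of that word are masked together. The sketch below is an illustrative assumption of that idea, not code from this repository: the `whole_word_mask` function and the tiny `##`-prefixed token list are hypothetical, standing in for a real WordPiece tokenizer's output.

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, seed=0):
    """Mask entire words: subword pieces (marked with a '##'
    prefix, WordPiece-style) are grouped with the preceding
    token and masked together as one unit."""
    # Group token indices into whole words.
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)   # continuation piece joins the current word
        else:
            words.append([i])     # a new word starts here
    rng = random.Random(seed)
    out = list(tokens)
    for word in words:
        if rng.random() < mask_prob:
            for i in word:        # mask every piece of the word, not just one
                out[i] = "[MASK]"
    return out

# Hypothetical WordPiece output for "the unbelievable story".
tokens = ["the", "un", "##believ", "##able", "story"]
print(whole_word_mask(tokens, mask_prob=1.0))
```

With `mask_prob=1.0` every word is masked, so `un`, `##believ`, and `##able` are all replaced together; per-token (non-whole-word) masking could instead mask `##believ` alone, which is exactly what this variant avoids.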