description: BERT base model (uncased)
description_en: BERT base model (uncased)
from_repo: https://huggingface.co/bert-base-uncased
icon: https://paddlenlp.bj.bcebos.com/models/community/transformer-layer.png
name: bert-base-uncased
Paper:
- title: 'BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding'
  url: http://arxiv.org/abs/1810.04805v2
Publisher: huggingface
Task:
- sub_tag: 槽位填充
  sub_tag_en: Fill-Mask
  tag: 自然语言处理
  tag_en: Natural Language Processing