description: BERT large model (cased) whole word masking finetuned on SQuAD
description_en: BERT large model (cased) whole word masking finetuned on SQuAD
from_repo: https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad
icon: https://paddlenlp.bj.bcebos.com/models/community/transformer-layer.png
name: bert-large-cased-whole-word-masking-finetuned-squad
Paper:
  - title: 'BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding'
    url: http://arxiv.org/abs/1810.04805v2
Publisher: huggingface
Task:
  - sub_tag: 回答问题
    sub_tag_en: Question Answering
    tag: 自然语言处理
    tag_en: Natural Language Processing