Commit f9b80afd authored by yuchaojie

add transformer in model_zoo/README

Parent 391f5e44
@@ -32,6 +32,7 @@ In order to facilitate developers to enjoy the benefits of MindSpore framework a
- [Natural Language Processing](#natural-language-processing)
- [BERT](#bert)
- [MASS](#mass)
- [Transformer](#transformer)
# Announcements
@@ -301,6 +302,26 @@ In order to facilitate developers to enjoy the benefits of MindSpore framework a
| Model for inference | |
| Scripts | |
#### [Transformer](#table-of-contents)
| Parameters | Transformer |
| -------------------------- | -------------------------------------------------------------- |
| Published Year | 2017 |
| Paper                      | [Attention Is All You Need](https://arxiv.org/abs/1706.03762) |
| Resource | Ascend 910 |
| Features                   | • Multi-device training support on Ascend |
| MindSpore Version | 0.5.0-beta |
| Dataset                    | WMT English-German |
| Training Parameters | epoch=52, batch_size=96 |
| Optimizer | Adam |
| Loss Function | Softmax Cross Entropy |
| BLEU Score | 28.7 |
| Speed | 410ms/step (8pcs) |
| Loss | 2.8 |
| Params (M) | 213.7 |
| Checkpoint for inference | 2.4G (.ckpt file) |
| Scripts | https://gitee.com/mindspore/mindspore/tree/master/model_zoo/Transformer |
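For orientation, the following is a minimal sketch of the training configuration summarized in the table (Adam optimizer, softmax cross-entropy loss, epoch=52, batch_size=96). `TransformerModel` and `create_wmt_dataset` are hypothetical placeholders, and the learning rate is an assumed value; the maintained, runnable scripts are at the URL in the Scripts row.

```python
# Minimal sketch only; TransformerModel and create_wmt_dataset are
# hypothetical stand-ins for the network and dataset pipeline in the
# actual model_zoo scripts.
import mindspore.nn as nn
from mindspore import Model

net = TransformerModel()                                    # hypothetical network class
loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True)        # softmax cross entropy, per the table
opt = nn.Adam(net.trainable_params(), learning_rate=1e-4)   # Adam; learning rate is assumed
dataset = create_wmt_dataset(batch_size=96)                 # hypothetical WMT English-German loader

model = Model(net, loss_fn=loss, optimizer=opt)
model.train(52, dataset)                                    # epoch=52, per the table
```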
#### License
[Apache License 2.0](https://github.com/mindspore-ai/mindspore/blob/master/LICENSE)