From f9b80afdc07b82846be328ad19da2e2a7f4c3b21 Mon Sep 17 00:00:00 2001
From: yuchaojie
Date: Fri, 3 Jul 2020 11:01:02 +0800
Subject: [PATCH] add transformer in model_zoo/README

---
 model_zoo/README.md | 21 +++++++++++++++++++++
 1 file changed, 21 insertions(+)

diff --git a/model_zoo/README.md b/model_zoo/README.md
index 24be683b2..2dde98567 100644
--- a/model_zoo/README.md
+++ b/model_zoo/README.md
@@ -32,6 +32,7 @@ In order to facilitate developers to enjoy the benefits of MindSpore framework a
   - [Natural Language Processing](#natural-language-processing)
     - [BERT](#bert)
     - [MASS](#mass)
+    - [Transformer](#transformer)
 
 # Announcements
 
@@ -301,6 +302,26 @@ In order to facilitate developers to enjoy the benefits of MindSpore framework a
 | Model for inference | |
 | Scripts | |
 
+#### [Transformer](#table-of-contents)
+
+| Parameters | Transformer |
+| -------------------------- | -------------------------------------------------------------- |
+| Published Year | 2017 |
+| Paper | [Attention Is All You Need](https://arxiv.org/abs/1706.03762) |
+| Resource | Ascend 910 |
+| Features | • Multi-GPU training support with Ascend |
+| MindSpore Version | 0.5.0-beta |
+| Dataset | WMT English-German |
+| Training Parameters | epoch=52, batch_size=96 |
+| Optimizer | Adam |
+| Loss Function | Softmax Cross Entropy |
+| BLEU Score | 28.7 |
+| Speed | 410ms/step (8pcs) |
+| Loss | 2.8 |
+| Params (M) | 213.7 |
+| Checkpoint for inference | 2.4G (.ckpt file) |
+| Scripts | https://gitee.com/mindspore/mindspore/tree/master/model_zoo/Transformer |
+
 #### License
 
 [Apache License 2.0](https://github.com/mindspore-ai/mindspore/blob/master/LICENSE)
--
GitLab
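The table added by this patch lists Softmax Cross Entropy as the training loss. As a quick reference only (this is a plain NumPy sketch, not the MindSpore implementation used by the Transformer scripts), the loss for one example reduces to the negative log-probability the softmax assigns to the true class:

```python
import numpy as np

def softmax_cross_entropy(logits, label):
    """Cross-entropy of a softmax over `logits` against an integer class `label`."""
    # Subtract the max logit before exponentiating for numerical stability.
    shifted = logits - np.max(logits)
    log_probs = shifted - np.log(np.sum(np.exp(shifted)))
    return -log_probs[label]

# Uniform logits over 4 classes: every class has probability 1/4,
# so the loss is ln(4) regardless of which label is correct.
print(softmax_cross_entropy(np.array([0.0, 0.0, 0.0, 0.0]), 2))  # → 1.3862943611198906
```

In practice the framework computes this jointly with the softmax (as the table's "Softmax Cross Entropy" name suggests) precisely so the log-sum-exp can be stabilized as above.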