PaddlePaddle / DeepSpeech
Unverified commit f8c7b107, authored on Sep 26, 2021 by Jackwaterveg and committed via GitHub on Sep 26, 2021.

Update released_model.md

Parent: 4b225b76
Showing 1 changed file with 13 additions and 13 deletions.
docs/src/released_model.md
# Released Models
## Acoustic Model Released in paddle 2.X
Before this change:

Acoustic Model | Training Data | Token-based | Size | Descriptions | CER or WER | Hours of speech
:-------------:| :------------:| :-----: | -----: | :----------------- | :---------- | :---------
[Ds2 Online Aishell Model](https://deepspeech.bj.bcebos.com/release2.1/aishell/s0/aishell.s0.ds_online.5rnn.debug.tar.gz) | Aishell Dataset | Char-based | 345 MB | 2 Conv + 5 LSTM layers with only forward direction | 0.0824 | 151 h
[Ds2 Offline Aishell Model](https://deepspeech.bj.bcebos.com/release2.1/aishell/s0/aishell.s0.ds2.offline.cer6p65.release.tar.gz) | Aishell Dataset | Char-based | 306 MB | 2 Conv + 3 bidirectional GRU layers | 0.065 | 151 h
[Conformer Online Aishell Model](https://deepspeech.bj.bcebos.com/release2.1/aishell/s1/aishell.chunk.release.tar.gz) | Aishell Dataset | Char-based | 283 MB | Encoder:Conformer, Decoder:Transformer, Decoding method: Attention + CTC | 0.0594 | 151 h
[Conformer Offline Aishell Model](https://deepspeech.bj.bcebos.com/release2.1/aishell/s1/aishell.release.tar.gz) | Aishell Dataset | Char-based | 284 MB | Encoder:Conformer, Decoder:Transformer, Decoding method: Attention | 0.0547 | 151 h
[Conformer Librispeech Model](https://deepspeech.bj.bcebos.com/release2.1/librispeech/s1/conformer.release.tar.gz) | Librispeech Dataset | Word-based | 287 MB | Encoder:Conformer, Decoder:Transformer, Decoding method: Attention | 0.0325 | 960 h
[Transformer Librispeech Model](https://deepspeech.bj.bcebos.com/release2.1/librispeech/s1/transformer.release.tar.gz) | Librispeech Dataset | Word-based | 195 MB | Encoder:Conformer, Decoder:Transformer, Decoding method: Attention | 0.0544 | 960 h
After this change:

Acoustic Model | Training Data | Token-based | Size | Descriptions | CER | WER | Hours of speech
:-------------:| :------------:| :-----: | -----: | :----------------- | :--------- | :---------- | :---------
[Ds2 Online Aishell Model](https://deepspeech.bj.bcebos.com/release2.1/aishell/s0/aishell.s0.ds_online.5rnn.debug.tar.gz) | Aishell Dataset | Char-based | 345 MB | 2 Conv + 5 LSTM layers with only forward direction | 0.0824 | - | 151 h
[Ds2 Offline Aishell Model](https://deepspeech.bj.bcebos.com/release2.1/aishell/s0/aishell.s0.ds2.offline.cer6p65.release.tar.gz) | Aishell Dataset | Char-based | 306 MB | 2 Conv + 3 bidirectional GRU layers | 0.065 | - | 151 h
[Conformer Online Aishell Model](https://deepspeech.bj.bcebos.com/release2.1/aishell/s1/aishell.chunk.release.tar.gz) | Aishell Dataset | Char-based | 283 MB | Encoder:Conformer, Decoder:Transformer, Decoding method: Attention + CTC | 0.0594 | - | 151 h
[Conformer Offline Aishell Model](https://deepspeech.bj.bcebos.com/release2.1/aishell/s1/aishell.release.tar.gz) | Aishell Dataset | Char-based | 284 MB | Encoder:Conformer, Decoder:Transformer, Decoding method: Attention | 0.0547 | - | 151 h
[Conformer Librispeech Model](https://deepspeech.bj.bcebos.com/release2.1/librispeech/s1/conformer.release.tar.gz) | Librispeech Dataset | Word-based | 287 MB | Encoder:Conformer, Decoder:Transformer, Decoding method: Attention | - | 0.0325 | 960 h
[Transformer Librispeech Model](https://deepspeech.bj.bcebos.com/release2.1/librispeech/s1/transformer.release.tar.gz) | Librispeech Dataset | Word-based | 195 MB | Encoder:Conformer, Decoder:Transformer, Decoding method: Attention | - | 0.0544 | 960 h
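Each entry above links to a gzipped tarball with the released checkpoint. As a minimal sketch of fetching and unpacking one of them with only the Python standard library (the local directory name and the choice of the Ds2 Offline Aishell release are arbitrary examples; running inference with the extracted checkpoint still goes through the DeepSpeech tooling, which is not shown here):

```python
import tarfile
import urllib.request
from pathlib import Path

# URL copied from the "Ds2 Offline Aishell Model" row of the table above.
MODEL_URL = ("https://deepspeech.bj.bcebos.com/release2.1/aishell/s0/"
             "aishell.s0.ds2.offline.cer6p65.release.tar.gz")
DEST = Path("ds2_offline_aishell")  # arbitrary local directory, not part of the release

archive = Path(MODEL_URL.rsplit("/", 1)[-1])
if not archive.exists():
    urllib.request.urlretrieve(MODEL_URL, archive)  # ~306 MB download

DEST.mkdir(exist_ok=True)
with tarfile.open(archive, "r:gz") as tar:
    tar.extractall(DEST)

# Inspect the extracted contents.
for path in sorted(DEST.rglob("*")):
    print(path)
```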
## Acoustic Model Transformed from paddle 1.8
Before this change:

Acoustic Model | Training Data | Token-based | Size | Descriptions | CER or WER | Hours of speech
:-------------:| :------------:| :-----: | -----: | :----------------- | :---------- | :---------
[Ds2 Offline Aishell model](https://deepspeech.bj.bcebos.com/mandarin_models/aishell_model_v1.8_to_v2.x.tar.gz) | Aishell Dataset | Char-based | 234 MB | 2 Conv + 3 bidirectional GRU layers | 0.0804 | 151 h |
[Ds2 Offline Librispeech model](https://deepspeech.bj.bcebos.com/eng_models/librispeech_v1.8_to_v2.x.tar.gz) | Librispeech Dataset | Word-based | 307 MB | 2 Conv + 3 bidirectional sharing weight RNN layers | 0.0685 | 960 h |
[Ds2 Offline Baidu en8k model](https://deepspeech.bj.bcebos.com/eng_models/baidu_en8k_v1.8_to_v2.x.tar.gz) | Baidu Internal English Dataset | Word-based | 273 MB | 2 Conv + 3 bidirectional GRU layers | 0.0541 | 8628 h |
After this change:

Acoustic Model | Training Data | Token-based | Size | Descriptions | CER | WER | Hours of speech
:-------------:| :------------:| :-----: | -----: | :----------------- | :---------- | :--------- | :---------
[Ds2 Offline Aishell model](https://deepspeech.bj.bcebos.com/mandarin_models/aishell_model_v1.8_to_v2.x.tar.gz) | Aishell Dataset | Char-based | 234 MB | 2 Conv + 3 bidirectional GRU layers | 0.0804 | - | 151 h |
[Ds2 Offline Librispeech model](https://deepspeech.bj.bcebos.com/eng_models/librispeech_v1.8_to_v2.x.tar.gz) | Librispeech Dataset | Word-based | 307 MB | 2 Conv + 3 bidirectional sharing weight RNN layers | - | 0.0685 | 960 h |
[Ds2 Offline Baidu en8k model](https://deepspeech.bj.bcebos.com/eng_models/baidu_en8k_v1.8_to_v2.x.tar.gz) | Baidu Internal English Dataset | Word-based | 273 MB | 2 Conv + 3 bidirectional GRU layers | - | 0.0541 | 8628 h |
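For reference, the CER and WER columns are the character error rate and word error rate: the Levenshtein edit distance between the decoded hypothesis and the reference transcript, divided by the reference length, counted over characters for the char-based models and over words for the word-based ones. A minimal illustration in plain Python follows; this is not the evaluation code that produced the numbers above.

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences (single rolling row)."""
    dp = list(range(len(hyp) + 1))       # distance of empty ref prefix to hyp[:j]
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,        # deletion
                        dp[j - 1] + 1,    # insertion
                        prev + (r != h))  # substitution (free if tokens match)
            prev = cur
    return dp[-1]

def error_rate(ref_tokens, hyp_tokens):
    """Edit distance normalized by reference length (CER or WER, depending on tokens)."""
    return edit_distance(ref_tokens, hyp_tokens) / max(len(ref_tokens), 1)

ref = "the quick brown fox"
hyp = "the quick browne fox"
print("WER:", error_rate(ref.split(), hyp.split()))  # 1 substitution / 4 words = 0.25
print("CER:", error_rate(list(ref), list(hyp)))      # 1 insertion / 19 chars ≈ 0.053
```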
...