- 14 April 2021, 1 commit
  Committed by Hui Zhang
- 12 April 2021, 1 commit
  Committed by Hui Zhang
- 07 April 2021, 1 commit
  Committed by Hui Zhang
- 31 March 2021, 1 commit
  Committed by Hui Zhang
- 22 March 2021, 1 commit
  Committed by Hui Zhang
  * When the loss is divided by the batch size, with an adjusted learning rate and more epochs, the loss decreases further and the CER is lower than before.
  * Since the loss decreases more when divided by the batch size, a smaller LM alpha works better.
  * A smaller LM alpha reduces the CER further.
  * alpha 2.2: CER 0.077478
  * alpha 1.9: CER 0.077249
  * Use a larger LibriSpeech learning rate for the batch-averaged CTC loss.
  * Since the loss is lower and the model is more confident, a smaller LM alpha is appropriate (see the sketch after this list).
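The commit notes above tie the batch-averaged CTC loss to a smaller LM weight. Below is a minimal Python/NumPy sketch of both ideas; the function names and the beta default are illustrative assumptions, not the repository's actual API, and alpha=1.9 is simply the value quoted above.

```python
import numpy as np


def batch_average_ctc_loss(per_utt_loss):
    """Average the summed per-utterance CTC losses over the batch, so the
    loss scale (and gradient magnitude) no longer grows with the batch size."""
    per_utt_loss = np.asarray(per_utt_loss, dtype=np.float64)
    return per_utt_loss.sum() / per_utt_loss.shape[0]


def shallow_fusion_score(log_p_acoustic, log_p_lm, word_count,
                         alpha=1.9, beta=0.3):
    """DeepSpeech2-style beam-search scoring with an external LM:
    score = log P_acoustic + alpha * log P_LM + beta * word_count.
    A lower, better-calibrated acoustic loss means the acoustic term is more
    reliable, so a smaller alpha (e.g. 1.9 instead of 2.2) scores better.
    (alpha/beta defaults here are illustrative assumptions.)"""
    return log_p_acoustic + alpha * log_p_lm + beta * word_count


# Example: average a batch of 4 utterance losses, then score one hypothesis.
print(batch_average_ctc_loss([152.3, 98.7, 203.1, 120.4]))
print(shallow_fusion_score(log_p_acoustic=-42.0, log_p_lm=-18.5, word_count=7))
```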
- 11 March 2021, 1 commit
  Committed by Hui Zhang
  * Add activations, refactor CTC, add positional embedding (see the sketch after this list)
  * Fix export and the data loader time logging
  * Fix egs
  * Fix the LibriSpeech README
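The "add positional embedding" item presumably refers to the standard Transformer-style sinusoidal positional embedding; the NumPy sketch below is a generic illustration under that assumption, not the repository's implementation.

```python
import numpy as np


def sinusoidal_position_embedding(max_len: int, d_model: int) -> np.ndarray:
    """Standard sinusoidal positional embedding (Vaswani et al., 2017):
    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    Assumes d_model is even."""
    pos = np.arange(max_len, dtype=np.float64)[:, None]      # (max_len, 1)
    i = np.arange(0, d_model, 2, dtype=np.float64)[None, :]  # (1, d_model/2)
    angle = pos / np.power(10000.0, i / d_model)              # (max_len, d_model/2)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angle)   # even dimensions
    pe[:, 1::2] = np.cos(angle)   # odd dimensions
    return pe


# Example: a 6-frame sequence with model dimension 8.
print(sinusoidal_position_embedding(6, 8).shape)  # (6, 8)
```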