1. 14 Apr 2021 — 1 commit
  2. 12 Apr 2021 — 1 commit
  3. 07 Apr 2021 — 1 commit
  4. 31 Mar 2021 — 1 commit
  5. 22 Mar 2021 — 1 commit
      batch average ctc loss (#567) · e0a87a5a
      Committed by Hui Zhang
      * When the loss is divided by batch size, with an adjusted learning rate and more epochs, the loss decreases further and the CER is lower than before.
      
      * Since the loss decreases more when divided by batch size, a smaller LM alpha works better.
      
      * Smaller LM alpha, larger CER reduction.
      
      * alpha 2.2, CER 0.077478
      
      * alpha 1.9, CER 0.077249
      
      * Use a larger LibriSpeech learning rate for batch-average CTC loss.
      
      * Since the loss decreases and the model is more confident, a smaller LM alpha is needed.
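The commit message above describes two related tuning decisions: averaging the per-utterance CTC loss over the batch (rather than summing), and shrinking the language-model weight alpha in the shallow-fusion decoding score once the acoustic model becomes more confident. A minimal sketch of both ideas is below; the helper names are hypothetical (not from the repo), and the scoring formula is the standard DeepSpeech-style `score = log P_ctc + alpha * log P_lm + beta * word_count`, assumed here to match what the decoder uses.

```python
def batch_average_ctc_loss(per_utt_losses):
    """Hypothetical helper: average per-utterance CTC losses over the batch
    instead of summing them. With a sum reduction the loss (and gradient)
    magnitude grows with batch size; averaging removes that dependence,
    which is why the commit also retunes the learning rate."""
    return sum(per_utt_losses) / len(per_utt_losses)

def shallow_fusion_score(log_p_ctc, log_p_lm, word_count, alpha, beta):
    """Standard shallow-fusion decoding score used by DeepSpeech-style
    beam search: acoustic log-probability plus an LM term weighted by
    alpha and a word-insertion bonus weighted by beta. A more confident
    acoustic model warrants a smaller alpha, as the commit observes."""
    return log_p_ctc + alpha * log_p_lm + beta * word_count

# Sum vs. batch-average reduction for illustrative per-utterance losses:
losses = [2.0, 4.0, 6.0]
avg_loss = batch_average_ctc_loss(losses)   # 4.0, independent of batch size

# Same hypothesis scored with the two alphas mentioned in the commit
# (log-probabilities and word count here are made-up illustrative values):
score_a22 = shallow_fusion_score(-1.0, -2.0, 3, alpha=2.2, beta=0.5)
score_a19 = shallow_fusion_score(-1.0, -2.0, 3, alpha=1.9, beta=0.5)
```

With a fixed LM log-probability, lowering alpha from 2.2 to 1.9 raises the hypothesis score less aggressively for LM-favored candidates, letting the now-better-calibrated acoustic scores dominate the beam ranking.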
  6. 11 Mar 2021 — 1 commit