<dd><p>A layer integrating the open-source <a class="reference external" href="https://github.com/baidu-research/warp-ctc">warp-ctc</a> library, which is used in
<a class="reference external" href="https://arxiv.org/pdf/1512.02595v1.pdf">Deep Speech 2: End-to-End Speech Recognition in English and Mandarin</a>, to compute Connectionist Temporal
Classification (CTC) loss. In addition, another <a class="reference external" href="https://github.com/gangliao/warp-ctc">warp-ctc</a> repository, forked from
the official one, is maintained to enable more compiling options. During the
build process, PaddlePaddle will clone the source code, build it, and
install it to the <code class="code docutils literal"><span class="pre">third_party/install/warpctc</span></code> directory.</p>
<p>To use the warp_ctc layer, you need to specify the path of <code class="code docutils literal"><span class="pre">libwarpctc.so</span></code>
using one of the following methods:</p>
<p>1. Set it in <code class="code docutils literal"><span class="pre">paddle.init</span></code> (Python API) or <code class="code docutils literal"><span class="pre">paddle_init</span></code> (C API),
such as <code class="code docutils literal"><span class="pre">paddle.init(use_gpu=True,</span>
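Since <code class="code docutils literal"><span class="pre">libwarpctc.so</span></code> is an ordinary shared library, a generic alternative (not one of the methods documented here) is to let the dynamic loader find it via <code class="code docutils literal"><span class="pre">LD_LIBRARY_PATH</span></code>. A minimal sketch, assuming a hypothetical build-tree location:

```shell
# Hypothetical build location -- adjust to where your PaddlePaddle
# build tree actually lives; this path is an assumption, not taken
# from the document.
PADDLE_BUILD_DIR="$HOME/paddle/build"

# third_party/install/warpctc is the install directory named above;
# its lib/ subdirectory is assumed to hold libwarpctc.so.
export LD_LIBRARY_PATH="$PADDLE_BUILD_DIR/third_party/install/warpctc/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
```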
<p>More details of CTC can be found by referring to <a class="reference external" href="http://machinelearning.wustl.edu/mlpapers/paper_files/icml2006_GravesFGS06.pdf">Connectionist Temporal
Classification: Labelling Unsegmented Sequence Data with Recurrent
Neural Networks</a>.</p>
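To make the loss concrete, here is a minimal, self-contained sketch of CTC in plain Python (independent of PaddlePaddle and warp-ctc): the forward "alpha" recursion over the label sequence extended with blanks, which is the same quantity the warp-ctc library computes efficiently. The helper names are illustrative, not part of any API.

```python
import itertools
import math

def ctc_loss(probs, label, blank=0):
    """Negative log-likelihood of `label` under CTC, via the
    standard forward (alpha) recursion.

    probs: list of T probability vectors over (num_classes + 1)
           symbols, where index `blank` is the CTC blank.
    label: list of class indices (without blanks).
    """
    # Extended label: blanks interleaved around every symbol.
    ext = [blank]
    for c in label:
        ext += [c, blank]
    S, T = len(ext), len(probs)

    # alpha[s] = total probability of all frame prefixes whose
    # collapsed output ends at extended-label position s.
    alpha = [0.0] * S
    alpha[0] = probs[0][ext[0]]
    if S > 1:
        alpha[1] = probs[0][ext[1]]
    for t in range(1, T):
        new = [0.0] * S
        for s in range(S):
            a = alpha[s]                       # stay on the same state
            if s > 0:
                a += alpha[s - 1]              # advance by one state
            # Skipping a state is allowed unless it would merge two
            # identical labels or jump over a required blank.
            if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                a += alpha[s - 2]
            new[s] = a * probs[t][ext[s]]
        alpha = new
    # Valid paths end on the last label or the trailing blank.
    return -math.log(alpha[S - 1] + (alpha[S - 2] if S > 1 else 0.0))

def collapse(path, blank=0):
    """Collapse a frame-level path: merge repeats, then drop blanks."""
    out = []
    for i, c in enumerate(path):
        if c != blank and (i == 0 or c != path[i - 1]):
            out.append(c)
    return out
```

For short sequences the result can be checked by brute force: summing the probability of every frame-level path that collapses to the target label gives the same likelihood as the recursion.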
<div class="admonition note">
<p class="first admonition-title">Note</p>
<ul class="last simple">
<li>Let num_classes represent the number of categories. Considering the ‘blank’
label needed by CTC, you need to use (num_classes + 1) as the input size.
Thus, the size of both the warp_ctc layer and the ‘input’ layer should be set to
num_classes + 1.</li>
<li>You can set ‘blank’ to any value in the range [0, num_classes], which
should be consistent with the value used in your labels.</li>
<li>As a native ‘softmax’ activation is integrated into the warp-ctc library,