<li><strong>x</strong> (<em>Variable|list</em>) – The input tensor from which the data will be read.</li>
<li><strong>i</strong> (<em>Variable|list</em>) – The subscript index into the tensor array that points to
the place from which the data will be read.</li>
<li><strong>array</strong> (<em>Variable|list</em>) – The variable that the data will be read into, if
it is assigned.</li>
<li><strong>i</strong> (<em>Variable|list</em>) – The index of the output LOD_TENSOR_ARRAY, pointing to
the position to which the input tensor will be
written.</li>
<li><strong>array</strong> (<em>Variable|list</em>) – The output LOD_TENSOR_ARRAY to which the input
tensor will be written. If this parameter is
None, a new LOD_TENSOR_ARRAY will be created and
returned.</li>
</ul>
</td>
</tr>
<trclass="field-even field"><thclass="field-name">Returns:</th><tdclass="field-body"><pclass="first">The tensor type variable that has the data written to it.</p>
<trclass="field-even field"><thclass="field-name">Returns:</th><tdclass="field-body"><pclass="first">The output LOD_TENSOR_ARRAY where the input tensor is written.</p>
"comment":"ReorderLoDTensorByRankTable\n\nReorder the input X by the rank of `RankTable`. If `RankTable` is ordered by\nindex [3, 0, 2, 1]. Input X will reorder its sequence, the third sequence of\nX will be the first sequence of Output.\n\nNOTE: The RankTable does not need to be calculated by X.\n\nFor example:\nThe X = [Seq0, Seq1, Seq2, Seq3]. The indices of RankTable are [3, 0, 2, 1].\n\nThe Out = [Seq3, Seq0, Seq2, Seq1] with correct LoD information.\n",
"comment":"ReorderLoDTensorByRankTable operator.\n\nInput(X) is a batch of sequences. Input(RankTable) stores new orders of the\ninput sequence batch. The reorder_lod_tensor_by_rank operator reorders the\nInput(X) according to the information provided by Input(RankTable).\n\nFor example:\n\nIf the indices stored in the Input(RankTable) are [3, 0, 2, 1], the\nInput(X) will be reordered that the fourth sequence in Input(X) will become the\nfirst one, and then followed by the original first, third, and the second one.\n\nThis is:\nX = [Seq0, Seq1, Seq2, Seq3]. The indices in RankTable are [3, 0, 2, 1].\nOut = [Seq3, Seq0, Seq2, Seq1] with a new LoD information.\n\nIf the LoD information of Input(X) is empty, this means Input(X) is not sequence\ndata. This is also identical to a batch of sequences where each sequence has a\nfixed length 1. In this case, the reorder_lod_tensor_by_rank operator reorders\neach slice of Input(X) along the first axis according to Input(RankTable).\n\nThis is:\nX = [Slice0, Slice1, Slice2, Slice3] and its LoD information is empty. The\nindices in RankTable are [3, 0, 2, 1].\nOut = [Slice3, Slice0, Slice2, Slice1] with no LoD information is appended.\n\nNOTE: This operator sorts Input(X) according to a given LoDRankTable which does\nnot need to be calculated according to Input(X). It can be calculated according\nto another different sequence, and then this operator sorts Input(X) according\nto the given LoDRankTable.\n\n",
"inputs":[
{
"name":"X",
"comment":"(LoDTensor) the input lod tensor need to be reordered.",
"comment":"(LoDTensor), the input lod tensor to be reordered according to Input(RankTable).",
"duplicable":0,
"intermediate":0
},{
"name":"RankTable",
"comment":"(LoDRankTable) the rank table that input need follow",
"comment":"(LoDRankTable), the rank table according to which Input(X) is reordered.",
"duplicable":0,
"intermediate":0
}],
"outputs":[
{
"name":"Out",
"comment":"(LoDTensor) reordered lod tensor",
"comment":"(LoDTensor), the reordered lod tensor.",
"duplicable":0,
"intermediate":0
}],
...
...
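As a plain-Python illustration of the semantics in the comment above (not the operator implementation), the rank-table indices act as a permutation over the batch of sequences; the helper name is hypothetical.

```python
# Plain-Python sketch of the reordering semantics of
# reorder_lod_tensor_by_rank: the rank-table indices permute the sequences.
def reorder_by_rank(sequences, rank_indices):
    """Return the sequences permuted by the rank-table indices."""
    return [sequences[idx] for idx in rank_indices]

X = ["Seq0", "Seq1", "Seq2", "Seq3"]
rank_table_indices = [3, 0, 2, 1]

print(reorder_by_rank(X, rank_table_indices))
# ['Seq3', 'Seq0', 'Seq2', 'Seq1'], matching the documented example
```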
}]
},{
"type":"shrink_rnn_memory",
"comment":"\n In dynamic RNN, we are able to handle sequences of different lengths.\n Because of the multiple lengths, the size of each step input can be\n different, which may lead to a mismatching between the input of\n the current step and the memory generated by the previous one. This\n operator shrinks memory according to the size of the next step input,\n to make sure that they can match each other.\n",
"comment":"\nThis operator is used to shrink output batch of memory defined in dynamic RNN.\n\nDynamic RNN is able to handle variable-length sequences, in which, sequences in\na mini-batch are sorted by their lengths first. After that, the longest sequence\nbecomes the first one in the sorted batch, followed by the second longest, the\nthird longest, and so on. Dynamic RNN then slices a batch input timestep by\ntimestep from the sorted input. Once any sequence in the input batch reaches its\nend, memory defined in dynamicRNN has to shrink its outputs to adapt to the input\nbatch size for the next time step.\n",
<li><strong>x</strong> (<em>Variable|list</em>) – The input tensor from which the data will be read.</li>
<li><strong>i</strong> (<em>Variable|list</em>) – The subscript index into the tensor array that points to
the place from which the data will be read.</li>
<li><strong>array</strong> (<em>Variable|list</em>) – The variable that the data will be read into, if
it is assigned.</li>
<li><strong>i</strong> (<em>Variable|list</em>) – The index of the output LOD_TENSOR_ARRAY, pointing to
the position to which the input tensor will be
written.</li>
<li><strong>array</strong> (<em>Variable|list</em>) – The output LOD_TENSOR_ARRAY to which the input
tensor will be written. If this parameter is
None, a new LOD_TENSOR_ARRAY will be created and
returned.</li>
</ul>
</td>
</tr>
<trclass="field-even field"><thclass="field-name">返回:</th><tdclass="field-body"><pclass="first">The tensor type variable that has the data written to it.</p>
<trclass="field-even field"><thclass="field-name">返回:</th><tdclass="field-body"><pclass="first">The output LOD_TENSOR_ARRAY where the input tensor is written.</p>