# RNN roadmap for refactoring
Created by: Superjomn
RNN is a big concept to support in our refactoring. Based on the current support of our framework, it is better to implement RNNs in the following stages:
- `recurrent_op`, an RNN operator that takes a tensor as input.
- `dynamic_recurrent_op`, an RNN that takes variable-length sequences as input and produces sequences as output; it should replace `recurrent_op`.
- built-in beam search, which will be a method in `dynamic_recurrent_op` and make `dynamic_recurrent_op` an equivalent of the old `RecurrentGradientMachine`.
- dynamic RNN based on `while_loop` and some other conditional operators; once we support this, the infrastructure might apply to other dynamic models such as Tree-LSTM (a runnable sketch of this loop structure follows the list).
- dynamic beam search, that is, a beam search built on `while_loop` and other conditional operators.
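To make the `while_loop` stage concrete, below is a minimal sketch of a dynamic RNN driven by loop and array primitives like the `pd.while_loop`, `pd.less_than`, and `pd.TensorArray` operators listed under the milestones. The primitives here are stand-ins implemented in plain Python so the example runs; their real signatures are assumptions for illustration, not the actual operator API.

```python
import numpy as np

# Hypothetical stand-ins for the planned pd.* primitives; the real
# operators would build graph nodes, these just execute eagerly.
class TensorArray:
    """Write-once array of tensors, indexed by time step."""
    def __init__(self, size):
        self._data = [None] * size
        self.size = size

    def write(self, i, value):
        self._data[i] = value

    def read(self, i):
        return self._data[i]

def less_than(i, n):
    return i < n

def while_loop(cond, body, loop_vars):
    """Repeatedly apply `body` to `loop_vars` while `cond` holds."""
    while cond(*loop_vars):
        loop_vars = body(*loop_vars)
    return loop_vars

def dynamic_rnn(inputs, hidden_dim, W_x, W_h):
    """Run a vanilla RNN over one variable-length sequence.

    `inputs` is a TensorArray holding one time step per slot, so the
    loop bound is the sequence's own length and no padding is needed.
    """
    states = TensorArray(inputs.size)
    h0 = np.zeros(hidden_dim)

    def step(t, h):
        x_t = inputs.read(t)
        h_next = np.tanh(x_t @ W_x + h @ W_h)  # one RNN cell step
        states.write(t, h_next)
        return t + 1, h_next

    while_loop(lambda t, h: less_than(t, inputs.size), step, (0, h0))
    return states

# Usage: a length-3 sequence of 4-dim inputs, hidden size 5.
rng = np.random.default_rng(0)
seq = TensorArray(3)
for t in range(3):
    seq.write(t, rng.standard_normal(4))
W_x, W_h = rng.standard_normal((4, 5)), rng.standard_normal((5, 5))
final = dynamic_rnn(seq, 5, W_x, W_h)
print(final.read(2))  # hidden state after the last step
```

Because the loop bound is read at run time from the sequence itself, no padding to a fixed maximum length is needed, which is the point of the dynamic variants over `recurrent_op`.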
## Milestones
- support the neural machine translation model
  - `dynamic_recurrent_op` with a built-in beam search module should be ready at that point.
- a text classification model with dynamic RNNs; the required operators, not limited to the following ones, should be ready:
  - `pd.while_loop`
  - `pd.equals`
  - `pd.TensorArray`
  - `pd.less_than`
- a machine translation model based on dynamic beam search (maybe wrapped as a `generator`)
  - this needs more dynamic operators to be ready (see the beam search sketch after this list).