The word "recurrent" is misspelled in the sentiment-analysis subsection.
Created by: xiaotao321
LSTM enhances an RNN's ability to handle long-range dependencies by adding a memory cell and control gates. A similar improvement along the same lines is the Gated Recurrent Unit (GRU) [8], which has a more concise design. Although these variants differ internally, their macro-level description is the same as that of a simple recurrent neural network (as shown in Figure 2): the hidden state is updated from the current input and the hidden state at the previous time step, and this process repeats until the input has been fully processed:
$$ h_t = \mathrm{Recurrent}(x_t, h_{t-1}) $$
Here, $\mathrm{Recurrent}$ can stand for a simple RNN, a GRU, or an LSTM.
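The update rule above can be sketched in a few lines. Below is a minimal illustration, assuming a simple tanh RNN cell stands in for the abstract $\mathrm{Recurrent}$ function; a GRU or LSTM cell could be substituted without changing the outer loop. The function and variable names (`rnn_cell`, `run_recurrent`, the weight matrices) are hypothetical, chosen only for this sketch.

```python
import numpy as np

def rnn_cell(x_t, h_prev, W_xh, W_hh, b_h):
    # One step of a simple recurrent unit: the new hidden state
    # depends on the current input x_t and the previous state h_prev.
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

def run_recurrent(xs, h0, W_xh, W_hh, b_h):
    # h_t = Recurrent(x_t, h_{t-1}), repeated until the input sequence
    # is fully consumed; only the final hidden state is returned here.
    h = h0
    for x_t in xs:
        h = rnn_cell(x_t, h, W_xh, W_hh, b_h)
    return h

rng = np.random.default_rng(0)
d_in, d_h, T = 3, 4, 5          # input size, hidden size, sequence length
xs = rng.normal(size=(T, d_in))  # a toy input sequence of T steps
h = run_recurrent(xs, np.zeros(d_h),
                  rng.normal(size=(d_in, d_h)),
                  rng.normal(size=(d_h, d_h)),
                  np.zeros(d_h))
print(h.shape)  # (4,)
```

Swapping `rnn_cell` for a gated cell changes only how `h` is computed inside the loop, which is exactly why the macro-level description in the text applies to RNN, GRU, and LSTM alike.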
In the sentiment-analysis subsection, both occurrences of the word "Recurrent" were misspelled.