Commit a651c0b5 authored by sheqiZ, committed by ruri

fix typo:recurrent, test=develop (#762)

Parent 23371ce8
@@ -95,7 +95,7 @@ LSTM, by adding memory and control gates to the simple recurrent neural network, enh...
 <img src="https://github.com/PaddlePaddle/book/blob/develop/06.understand_sentiment/image/formula_recrurent.png?raw=true" width = "50%" align="center"/><br/>
 </p>
-Here, $Recrurent$ can represent a simple recurrent neural network, GRU, or LSTM.
+Here, $Recurrent$ can represent a simple recurrent neural network, GRU, or LSTM.
 ### Stacked Bidirectional LSTM
@@ -79,9 +79,9 @@ Figure 3. LSTM for time $t$ [7]
 LSTM enhances its ability to handle long-range dependencies by adding memory and control gates to RNN. A similar principle improvement is Gated Recurrent Unit (GRU)\[[8](#References)\], which is more concise in design. **These improvements are different, but their macro descriptions are the same as simple recurrent neural networks (as shown in Figure 2). That is, the hidden state changes according to the current input and the hidden state of the previous moment, and this process is continuous until the input is processed:**
-$$ h_t=Recrurent(x_t,h_{t-1})$$
+$$ h_t=Recurrent(x_t,h_{t-1})$$
-Among them, $Recrurent$ can represent a RNN, GRU or LSTM.
+Among them, $Recurrent$ can represent a RNN, GRU or LSTM.
@@ -137,7 +137,7 @@ LSTM, by adding memory and control gates to the simple recurrent neural network, enh...
 <img src="https://github.com/PaddlePaddle/book/blob/develop/06.understand_sentiment/image/formula_recrurent.png?raw=true" width = "50%" align="center"/><br/>
 </p>
-Here, $Recrurent$ can represent a simple recurrent neural network, GRU, or LSTM.
+Here, $Recurrent$ can represent a simple recurrent neural network, GRU, or LSTM.
 ### Stacked Bidirectional LSTM
@@ -121,9 +121,9 @@ Figure 3. LSTM for time $t$ [7]
 LSTM enhances its ability to handle long-range dependencies by adding memory and control gates to RNN. A similar principle improvement is Gated Recurrent Unit (GRU)\[[8](#References)\], which is more concise in design. **These improvements are different, but their macro descriptions are the same as simple recurrent neural networks (as shown in Figure 2). That is, the hidden state changes according to the current input and the hidden state of the previous moment, and this process is continuous until the input is processed:**
-$$ h_t=Recrurent(x_t,h_{t-1})$$
+$$ h_t=Recurrent(x_t,h_{t-1})$$
-Among them, $Recrurent$ can represent a RNN, GRU or LSTM.
+Among them, $Recurrent$ can represent a RNN, GRU or LSTM.
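For readers skimming the commit: the corrected symbol names the generic recurrence $h_t = Recurrent(x_t, h_{t-1})$, in which the new hidden state depends only on the current input and the previous hidden state. Below is a minimal sketch of that contract as a plain simple-RNN step in NumPy; the function and weight names are illustrative assumptions, not code from the PaddlePaddle book, and a GRU or LSTM would swap in gated update logic while keeping the same signature.

```python
# Minimal sketch of h_t = Recurrent(x_t, h_{t-1}) as a simple RNN step.
# Illustrative only -- names and dimensions are assumptions, not from the repo.
import numpy as np

def simple_recurrent_step(x_t, h_prev, W_x, W_h, b):
    """One recurrence step: the new hidden state is a function of only
    the current input x_t and the previous hidden state h_prev."""
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

# Toy dimensions: 4-dim inputs, 3-dim hidden state.
rng = np.random.default_rng(0)
W_x = rng.standard_normal((3, 4))   # input-to-hidden weights
W_h = rng.standard_normal((3, 3))   # hidden-to-hidden weights
b = np.zeros(3)

h = np.zeros(3)                          # h_0, the initial hidden state
for x_t in rng.standard_normal((5, 4)):  # a length-5 input sequence
    h = simple_recurrent_step(x_t, h, W_x, W_h, b)
print(h)  # final hidden state after the whole sequence is processed
```

The loop makes the formula's point concrete: processing continues step by step until the input is consumed, and swapping the step function for a GRU or LSTM changes only the internals, not the $h_t = Recurrent(x_t, h_{t-1})$ interface.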