Recurrent Neural Networks

https://zh-v2.d2l.ai/chapter_recurrent-neural-networks/rnn.html

In the first paragraph:

对于一个足够强大的函数 f( (8.4.2) ),隐变量模型不是近似值。
(For a sufficiently powerful function f( (8.4.2) ), the latent variable model is not an approximation.)

Is this a typo? It looks like it should be:

对于一个足够强大的函数 f (8.4.2) ,隐变量模型不是近似值。
(For a sufficiently powerful function f (8.4.2), the latent variable model is not an approximation.)

With the extra pair of parentheses, it reads as if (8.4.2) were an argument of f.
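For context (quoting the formula from memory, so please double-check against the section), (8.4.2) should be the hidden-state update, so f already has its own arguments and the equation number is only a reference, not something passed to f:

$$h_t = f(x_t, h_{t-1})$$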

Personally, I find Fig. 8.4.1 a little confusing. The three RNN neurons shown are actually the same neuron, applied to the elements of a sequence X in order. Unlike the networks discussed earlier, the RNN neuron takes two inputs: the data input and the hidden state (or "persistent memory"). We can also see this hidden state as a property of the RNN neuron while it works through a sequence. More complicated variants such as LSTM and GRU come later, but this is the core idea: attaching state to the neurons, whereas the neurons we saw before were stateless. What a beautiful idea.
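A minimal sketch of that reading (plain NumPy with my own variable names, not the book's code): a single cell with fixed weights is applied step by step, and at each step it takes two inputs, the current element and the previous hidden state.

```python
import numpy as np

def rnn_cell(x_t, h_prev, W_xh, W_hh, b_h):
    """One step of a vanilla RNN: the same weights are reused at every step;
    the two inputs are the current element x_t and the previous hidden state h_prev."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

# Toy dimensions and random weights, purely for illustration.
num_steps, input_dim, hidden_dim = 5, 3, 4
rng = np.random.default_rng(0)
X = rng.normal(size=(num_steps, input_dim))        # the sequence, one row per time step
W_xh = rng.normal(size=(input_dim, hidden_dim))
W_hh = rng.normal(size=(hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

h = np.zeros(hidden_dim)   # the "persistent memory", initialized before the sequence starts
for x_t in X:              # same neuron applied to each element in order
    h = rnn_cell(x_t, h, W_xh, W_hh, b_h)
print(h.shape)             # (4,)
```

Unrolling this loop over the time steps gives exactly the three-copies picture in Fig. 8.4.1; the copies share one set of weights.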

I personally prefer the figure below. That tutorial and its model setup are also very clear about the nature of RNN neurons.

[Figure: basic RNN neuron diagram]