Recurrent Neural Networks

https://d2l.ai/chapter_recurrent-neural-networks/rnn.html

Answers:

  1. The prediction at each time step is a single character, but the output itself has to be a vector whose dimension equals the vocabulary size, since the model scores every possible next character; with a batch, each per-step output is batch_size × vocab_size (see the first sketch below this list).
  2. Because the hidden state is a latent variable that is updated at every step from the previous hidden state and the current token, so it can in principle summarize all preceding tokens; the weights that decide what gets stored are learned during training.
  3. The gradient tends to vanish (or explode): backpropagation through time multiplies the gradient by the recurrent weight matrix once per time step, so over a long sequence it shrinks or grows geometrically (see the second sketch below this list).
  4. Since it is only a character-based language model, correlations between words are not modeled directly; they have to be learned implicitly across many character steps, which makes longer-range dependencies hard to capture.
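
To make answers 1 and 2 concrete, here is a minimal NumPy sketch of the character-level RNN forward pass, following the usual d2l recurrence H_t = tanh(X_t W_xh + H_{t-1} W_hh + b_h) and O_t = H_t W_hq + b_q; all sizes below are illustrative, not taken from the book's experiments.

```python
# Minimal character-level RNN forward pass (NumPy sketch, illustrative sizes).
import numpy as np

vocab_size, num_hiddens = 28, 32      # e.g. 26 letters + space + <unk>
batch_size, num_steps = 2, 5

rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.01, size=(vocab_size, num_hiddens))
W_hh = rng.normal(scale=0.01, size=(num_hiddens, num_hiddens))
b_h = np.zeros(num_hiddens)
W_hq = rng.normal(scale=0.01, size=(num_hiddens, vocab_size))
b_q = np.zeros(vocab_size)

# One-hot encoded input characters, shape (num_steps, batch_size, vocab_size).
X = np.eye(vocab_size)[rng.integers(0, vocab_size, size=(num_steps, batch_size))]

H = np.zeros((batch_size, num_hiddens))   # hidden state summarizing the prefix
outputs = []
for X_t in X:                                  # one time step at a time
    H = np.tanh(X_t @ W_xh + H @ W_hh + b_h)   # update the latent state
    O_t = H @ W_hq + b_q                       # scores over the whole vocabulary
    outputs.append(O_t)

# Each per-step output covers every character, not a single value:
print(outputs[0].shape)   # (batch_size, vocab_size) == (2, 28)
```

Applying a softmax to each `O_t` turns the scores into the conditional distribution over the next character given everything seen so far, which is why the required output dimension is the vocabulary size, and why the recurrently updated `H` is what lets the model condition on the whole prefix.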
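For answer 3, a small numerical sketch of why backpropagating through a long sequence is a problem: the gradient reaching an early hidden state has been multiplied by (roughly) the recurrent weight matrix once per step, so it decays or blows up geometrically. The spectral radii 0.5 and 1.5 and the length 100 are arbitrary choices for illustration.

```python
# Repeatedly multiplying a gradient by the recurrent weight matrix, as
# backpropagation through time does (ignoring the tanh-derivative factors),
# makes its norm shrink or grow geometrically depending on whether the
# matrix's spectral radius is below or above 1.
import numpy as np

rng = np.random.default_rng(0)
num_hiddens, T = 32, 100

for radius, label in [(0.5, "vanishing"), (1.5, "exploding")]:
    W_hh = rng.normal(size=(num_hiddens, num_hiddens))
    W_hh *= radius / np.abs(np.linalg.eigvals(W_hh)).max()  # set spectral radius
    grad = np.ones(num_hiddens)        # pretend gradient at the final time step
    for _ in range(T):                 # push it back through T time steps
        grad = W_hh.T @ grad
    print(label, np.linalg.norm(grad))  # roughly 0.5**100 vs 1.5**100 in size
```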