Gated Recurrent Units (GRU)

https://d2l.ai/chapter_recurrent-modern/gru.html

When will the PyTorch implementation of the source code be released? Thanks!

Now you have it!
@huangxiaoshuo

Correct me if I’m wrong.
Exercise 1: For t > t′, set Rt = 0 and Zt = 1, so that we just keep the hidden state from time step t′.

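To see why Zt = 1 remembers the state, here is a minimal NumPy sketch of the GRU update equation H_t = Z_t ⊙ H_{t-1} + (1 − Z_t) ⊙ H̃_t; the values of `h_prime` and the random candidates are made up for illustration:

```python
import numpy as np

def gru_update(h_prev, h_cand, z):
    """One GRU state update: H_t = Z_t * H_{t-1} + (1 - Z_t) * H_tilde_t."""
    return z * h_prev + (1 - z) * h_cand

h_prime = np.array([0.3, -0.7, 0.5])   # hidden state at time step t'
z = np.ones_like(h_prime)              # update gate pinned to 1

h = h_prime
for _ in range(5):                      # steps t' + 1 ... t' + 5
    h_cand = np.random.randn(3)         # arbitrary candidate states, ignored when z = 1
    h = gru_update(h, h_cand, z)

print(np.allclose(h, h_prime))          # True: the state at t' is carried forward
```

With the update gate at 1, every candidate state is multiplied by (1 − Z_t) = 0, so the reset gate's value no longer matters for those steps.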
I’m not 100% sure, but I would say:

  • set R_t ≈ 0 in order to reset the hidden state (which is a representation of the history of the sequence before t′) with the information carried by X_t;
  • set Z_t ≈ 0, which fully replaces H_t with the candidate state H̃_t (which then depends only on X_t, since R_t ≈ 0 ⟹ H̃_t = tanh(X_t W_xh + b_h)), and therefore removes the impact of the hidden state H_{t−1} from before t′ on the new hidden state H_t.
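The two bullets above can be checked numerically. A small NumPy sketch of the GRU equations (shapes and weight values are arbitrary; W_xh, W_hh, b_h follow the section's notation):

```python
import numpy as np

rng = np.random.default_rng(0)
X_t = rng.standard_normal((1, 4))       # current input at time step t'
H_prev = rng.standard_normal((1, 3))    # hidden state carrying the history before t'
W_xh = rng.standard_normal((4, 3))
W_hh = rng.standard_normal((3, 3))
b_h = np.zeros((1, 3))

R_t = np.zeros((1, 3))                  # reset gate ~ 0
Z_t = np.zeros((1, 3))                  # update gate ~ 0

# Candidate state: H_tilde = tanh(X_t W_xh + (R_t * H_prev) W_hh + b_h)
H_tilde = np.tanh(X_t @ W_xh + (R_t * H_prev) @ W_hh + b_h)
# Update: H_t = Z_t * H_prev + (1 - Z_t) * H_tilde
H_t = Z_t * H_prev + (1 - Z_t) * H_tilde

# With R_t = 0 the candidate reduces to tanh(X_t W_xh + b_h), and with
# Z_t = 0 the new state equals the candidate: H_prev has no influence.
print(np.allclose(H_t, np.tanh(X_t @ W_xh + b_h)))  # True
```

So the old history is both cut out of the candidate (via R_t) and dropped from the convex combination (via Z_t).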

Sure thing, that’s it.