October 11, 2017
by Karen Renner

Gated recurrent units

Gated recurrent units (GRUs) are a gating mechanism in recurrent neural networks, introduced by Kyunghyun Cho et al. in 2014. A GRU carries a hidden state from one time step to the next, and two learned gates, an update gate and a reset gate, control how much of that state is kept, discarded, or overwritten at each step. This lets the network retain information across long sequences, so that later predictions can depend on inputs seen much earlier. GRUs have performed comparably to LSTMs on tasks such as polyphonic music modelling and speech signal modelling.
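
To make the gating concrete, here is a minimal sketch of a single GRU step in NumPy. The function name gru_step, the parameter tuple, and the weight shapes are illustrative choices for this post, not from any particular library, and the state update follows one common convention (some papers swap the roles of z and 1 - z).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, params):
    """One GRU time step: the gates decide how much of the previous
    hidden state to keep versus overwrite with new content."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h_prev + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev) + bh)   # candidate state
    return (1 - z) * h_prev + z * h_tilde                # blended new state

rng = np.random.default_rng(0)
n_in, n_h = 4, 8
# Small random weights for each gate: the W matrices act on the input,
# the U matrices on the hidden state, and the b vectors are biases.
params = [rng.normal(scale=0.1, size=s)
          for s in [(n_h, n_in), (n_h, n_h), (n_h,)] * 3]

h = np.zeros(n_h)
for x in rng.normal(size=(10, n_in)):  # a toy sequence of 10 input vectors
    h = gru_step(x, h, params)
```

With this convention, an update gate near zero lets the previous state pass through almost unchanged, which is how information survives across many steps, while a gate near one largely replaces the state with the new candidate.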