Recurrent neural network

Case contributed by Andrew Murphy

Case Discussion

This is a diagram of a simplified recurrent neural network, with an input layer, hidden layers, and an output layer, and a recurrent loop within the hidden layers. The squares in the hidden layers represent neurons (nodes) and the lines represent weighted connections. The blue lines are weighted connections that could be seen in a variety of neural network architectures; the yellow line, however, represents a recurrent weighted connection, feeding a neuron's output back into a neuron in a previous hidden layer. Thus, within the hidden layers, information from the previous step is merged with the current input.
Weighted connections that loop back to previous layers are important because they allow a recurrent network to take context into account. For example, if you input the letter ‘r’, then the letter ‘a’, then the letter ‘d’, the network will take each input into account contextualised by the ones before it; over time, an RNN may conclude that r-a-d in some cases means "radiopaedia".
Many recurrent neural networks also allow for weighted connections between neurons in the same hidden layer.
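The merging of previous information with the current input can be sketched in a few lines of code. This is a minimal illustration, not the network in the diagram: the layer sizes, weight values, and variable names below are all assumptions chosen for clarity. The key point is the recurrent term, where the previous hidden state is combined with each new input.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = ['r', 'a', 'd']        # toy vocabulary for the r-a-d example
n_in, n_hidden = len(vocab), 4  # input and hidden sizes (arbitrary choices)

W_xh = rng.standard_normal((n_hidden, n_in)) * 0.1     # input -> hidden weights
W_hh = rng.standard_normal((n_hidden, n_hidden)) * 0.1  # recurrent weights (the "yellow line")
b_h = np.zeros(n_hidden)

def one_hot(ch):
    """Encode a character as a one-hot input vector."""
    v = np.zeros(n_in)
    v[vocab.index(ch)] = 1.0
    return v

h = np.zeros(n_hidden)  # hidden state starts empty
for ch in ['r', 'a', 'd']:
    # W_hh @ h carries the previous step's information into the current step,
    # so each input is contextualised by everything seen so far
    h = np.tanh(W_xh @ one_hot(ch) + W_hh @ h + b_h)

# h now encodes the whole r-a-d sequence, not just the final letter
print(h)
```

After the loop, the hidden state `h` depends on the order of the inputs, which is exactly what lets an RNN distinguish r-a-d from, say, d-a-r.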
Case information

rID: 69184
Published: 5th Jul 2019
Last edited: 14th Aug 2019
Inclusion in quiz mode: Included
Institution: BC Children's Hospital