Bidirectional RNNs and Deep RNNs

A typical RNN processes a sequence so that only past time steps influence the current output, while in reality an output can be influenced both by the inputs before it and by those after it. Bidirectional RNNs (BRNNs) account for this effect in their architecture.


A BRNN contains both forward and backward units. For the forward units, forward propagation runs from left to right and backpropagation runs from right to left. For the backward units, forward propagation runs from right to left and backpropagation runs from left to right. At each time step, the states of the two units are combined (typically by concatenation) to produce the output for that step.
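As a minimal sketch of the forward pass described above (all weight shapes and names here are hypothetical, using a simple tanh cell rather than any particular library's implementation): the same recurrence is run once left-to-right and once right-to-left with separate parameters, and the two state sequences are concatenated per time step.

```python
import numpy as np

def rnn_pass(x, Wx, Wh, b, reverse=False):
    """Run a simple tanh RNN over sequence x of shape (T, d_in).

    Returns the full (T, d_h) sequence of hidden states. If reverse=True,
    the recurrence runs from the last time step back to the first.
    """
    T, d_h = x.shape[0], Wh.shape[0]
    h = np.zeros(d_h)
    out = np.zeros((T, d_h))
    steps = range(T - 1, -1, -1) if reverse else range(T)
    for t in steps:
        h = np.tanh(x[t] @ Wx + h @ Wh + b)
        out[t] = h
    return out

rng = np.random.default_rng(0)
T, d_in, d_h = 5, 3, 4
x = rng.normal(size=(T, d_in))

# Forward and backward units have their own parameters (hypothetical init).
params_f = (rng.normal(size=(d_in, d_h)), rng.normal(size=(d_h, d_h)), np.zeros(d_h))
params_b = (rng.normal(size=(d_in, d_h)), rng.normal(size=(d_h, d_h)), np.zeros(d_h))

h_fwd = rnn_pass(x, *params_f)                 # runs left to right
h_bwd = rnn_pass(x, *params_b, reverse=True)   # runs right to left
h_bi = np.concatenate([h_fwd, h_bwd], axis=1)  # (T, 2*d_h): each step sees both directions
print(h_bi.shape)  # (5, 8)
```

Because `h_bi[t]` contains a state computed from `x[0..t]` and a state computed from `x[t..T-1]`, a prediction made at step `t` can use context from both sides of the sequence.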

Deep RNN: stacking multiple layers of RNNs, where the full hidden-state sequence of one layer serves as the input sequence to the layer above it.
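A short sketch of that stacking, under the same assumptions as before (hypothetical shapes, simple tanh cells): each layer consumes the previous layer's state sequence as its input sequence.

```python
import numpy as np

def rnn_layer(x, Wx, Wh, b):
    """Simple tanh RNN layer: x is (T, d_in); returns the (T, d_h) state sequence."""
    T, d_h = x.shape[0], Wh.shape[0]
    h, out = np.zeros(d_h), np.zeros((T, d_h))
    for t in range(T):
        h = np.tanh(x[t] @ Wx + h @ Wh + b)
        out[t] = h
    return out

rng = np.random.default_rng(1)
T, d_in, d_h, n_layers = 6, 3, 4, 3
x = rng.normal(size=(T, d_in))

seq = x
for layer in range(n_layers):
    d = seq.shape[1]  # layer 0 takes d_in inputs, deeper layers take d_h
    Wx = 0.5 * rng.normal(size=(d, d_h))
    Wh = 0.5 * rng.normal(size=(d_h, d_h))
    b = np.zeros(d_h)
    # The state sequence of this layer becomes the input sequence of the next.
    seq = rnn_layer(seq, Wx, Wh, b)

print(seq.shape)  # (6, 4)
```

The top layer's sequence `seq` is what a prediction head would read; each added layer lets the network build more abstract features of the sequence, at the cost of extra computation per time step.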


Written on February 18, 2018