Convolutional Recurrent Neural Network-Based Architecture



Today is 14th November, celebrated as Children's Day in India. We all know how curious children are to mix things up. So today's article is dedicated to that same curiosity: mixing CNN and RNN, the two architectures we discussed in the last two articles of our Deep Learning series.

A convolutional recurrent neural network (CRNN) combines the advantages of both CNNs and RNNs. The convolutional layers extract local features from the input, while the recurrent layers model the sequence: each output depends on past computations, because the hidden state acts as a memory carried from one time step to the next. This built-in memory makes RNNs useful for sequential data such as video frames, and variants such as long short-term memory (LSTM) networks extend it to longer-range dependencies. A CRNN is trained using gradient backpropagation (through time, for the recurrent part); unlike a plain feedforward network, a large CRNN model is difficult to optimize.
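To make the two stages concrete, here is a minimal forward-pass sketch of the idea, not any particular published CRNN: a 1-D convolution first extracts local features, then a vanilla RNN carries a hidden state (the "memory") across the resulting time steps. All sizes, weight names, and the use of `tanh` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

T, F = 20, 8          # sequence length, input features per step (hypothetical)
K, C = 3, 16          # conv kernel width, conv output channels
H = 32                # RNN hidden-state size

x = rng.standard_normal((T, F))   # toy input sequence

# --- CNN part: 1-D convolution (valid padding) along the time axis ---
W_conv = rng.standard_normal((K, F, C)) * 0.1
conv_out = np.stack([
    np.tanh(np.einsum('kf,kfc->c', x[t:t + K], W_conv))
    for t in range(T - K + 1)
])                                # shape: (T-K+1, C)

# --- RNN part: vanilla recurrence over the convolutional features ---
W_xh = rng.standard_normal((C, H)) * 0.1
W_hh = rng.standard_normal((H, H)) * 0.1
h = np.zeros(H)
states = []
for feat in conv_out:
    # each new hidden state depends on the previous one: this is the memory
    h = np.tanh(feat @ W_xh + h @ W_hh)
    states.append(h)
states = np.stack(states)         # shape: (T-K+1, H)

print(conv_out.shape, states.shape)
```

In a real implementation the convolution and recurrence would come from a deep learning framework, and the whole stack would be trained end-to-end with backpropagation; this sketch only shows how the feature-extraction and memory stages connect.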


The concept of LSTM will be discussed in detail in a later article.


Akshay Juneja has authored 15+ articles on Deep Learning for the INFO4EEE website.
