**Neural Network Tutorials - Herong's Tutorial Examples** - 1.20, by Dr. Herong Yang

RNN (Recurrent Neural Network)

This chapter provides introductions and tutorials on RNN (Recurrent Neural Network). Topics include an introduction to the classical RNN model, the LSTM (Long Short-Term Memory) model, and the GRU (Gated Recurrent Unit) model.

What Is RNN (Recurrent Neural Network)?

Takeaways:

- RNN (Recurrent Neural Network) uses a recursive function to carry a state vector from one sample to the next sample. This helps capture dependencies between samples in a sequential sample set.
- LSTM (Long Short-Term Memory) is an enhancement of the RNN model that uses two state vectors: s representing the short-term memory and l representing the long-term memory.
- GRU (Gated Recurrent Unit) is a simplified version of the LSTM that uses only one state vector and two gate vectors: a reset gate and an update gate.
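The state-carrying recursion described in the first takeaway can be sketched as a few lines of NumPy. This is a minimal illustration, not code from the book; the parameter names U, W, and b and the dimensions are illustrative assumptions.

```python
import numpy as np

def rnn_step(x_t, s_prev, U, W, b):
    # Classical RNN recursion: the new state vector is computed from
    # the current input sample and the state carried over from the
    # previous sample (illustrative parameter names U, W, b).
    return np.tanh(U @ x_t + W @ s_prev + b)

rng = np.random.default_rng(0)
input_dim, state_dim = 3, 4

# Illustrative weight matrices and bias (normally learned by training)
U = rng.standard_normal((state_dim, input_dim)) * 0.1
W = rng.standard_normal((state_dim, state_dim)) * 0.1
b = np.zeros(state_dim)

s = np.zeros(state_dim)                          # initial state vector
sequence = rng.standard_normal((5, input_dim))   # 5 sequential samples

for x_t in sequence:
    s = rnn_step(x_t, s, U, W, b)  # state is carried sample to sample

print(s.shape)  # final state vector has shape (state_dim,)
```

Because the same function and weights are reused at every step, the final state vector depends on the entire sequence, which is how the model captures dependencies between samples.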

Table of Contents

Deep Playground for Classical Neural Networks

Building Neural Networks with Python

Simple Example of Neural Networks

TensorFlow - Machine Learning Platform

PyTorch - Machine Learning Platform

CNN (Convolutional Neural Network)