The ↻ symbol shows the recurrent connection - the key to RNN memory!
Types of Memory in RNNs
📱 Short-term Memory
Hidden state that remembers recent inputs
🧠 Working Memory
Active processing and manipulation of information
🎯 Contextual Memory
Understanding based on accumulated context
🔍 Pattern Memory
Recognition of recurring temporal patterns
Key Architecture Components
⚖️ Weight Matrices
W_x, W_h, W_y - learned parameters
📈 Activation Functions
Tanh, ReLU, Sigmoid for non-linearity
➕ Bias Terms
Learned offsets for better fitting
🔄 Recurrent Connection
Feedback loop creating memory
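The four components above can be laid out as a concrete parameter set. This is a minimal NumPy sketch; the sizes (`input_size`, `hidden_size`, `output_size`) and the small random initialization are illustrative assumptions, not values from the text:

```python
import numpy as np

# Illustrative sizes (assumptions for this sketch)
input_size, hidden_size, output_size = 4, 8, 3

rng = np.random.default_rng(0)
W_x = rng.normal(0, 0.1, (hidden_size, input_size))   # input-to-hidden weights
W_h = rng.normal(0, 0.1, (hidden_size, hidden_size))  # recurrent (hidden-to-hidden) weights
W_y = rng.normal(0, 0.1, (output_size, hidden_size))  # hidden-to-output weights
b_h = np.zeros(hidden_size)   # hidden bias term
b_y = np.zeros(output_size)   # output bias term
```

The recurrent connection corresponds to `W_h`: it feeds the previous hidden state back into the next update, which is what creates memory.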
RNN Mathematical Foundation
Hidden State Update:
h_t = tanh(W_x x_t + W_h h_{t-1} + b_h)
Output Computation:
y_t = W_y h_t + b_y
💡 Key Insight:
The hidden state h_t depends on both the current input x_t and the previous memory h_{t-1}. This recurrence creates the network's ability to remember and build context!
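The two equations above can be sketched directly in NumPy. This is a minimal, self-contained example of a vanilla RNN forward pass; the sizes, random weights, and toy input sequence are illustrative assumptions:

```python
import numpy as np

# Illustrative sizes and initialization (assumptions for this sketch)
input_size, hidden_size, output_size = 4, 8, 3
rng = np.random.default_rng(0)
W_x = rng.normal(0, 0.1, (hidden_size, input_size))
W_h = rng.normal(0, 0.1, (hidden_size, hidden_size))
W_y = rng.normal(0, 0.1, (output_size, hidden_size))
b_h = np.zeros(hidden_size)
b_y = np.zeros(output_size)

def rnn_step(x_t, h_prev):
    # Hidden state update: h_t = tanh(W_x x_t + W_h h_{t-1} + b_h)
    return np.tanh(W_x @ x_t + W_h @ h_prev + b_h)

def rnn_output(h_t):
    # Output computation: y_t = W_y h_t + b_y
    return W_y @ h_t + b_y

# Unroll over a toy sequence of 5 inputs. The same weights are reused at
# every time step, and h carries accumulated context forward.
h = np.zeros(hidden_size)
for x_t in rng.normal(size=(5, input_size)):
    h = rnn_step(x_t, h)
    y_t = rnn_output(h)
```

Note that each `h` is computed from the previous `h`, so the final hidden state is a function of the entire input sequence; this is the recurrence the key insight describes.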