CS5720 - Week 6
Unfolding RNNs Through Time
[Diagram: RNN unfolded across time steps. At each step t = 0, 1, 2, …, n, the input xₜ enters the same RNN cell, which emits the output yₜ and passes its hidden state forward to the next step.]
Each time step shares the same RNN parameters but processes different inputs
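Concretely, the cell that the diagram repeats is usually the vanilla (Elman) recurrence; the weight names below are standard notation, not symbols defined on this slide:

h_t = \tanh(W_{xh} x_t + W_{hh} h_{t-1} + b_h), \qquad y_t = W_{hy} h_t + b_y

The same W_{xh}, W_{hh}, and W_{hy} appear at every step t, which is exactly the parameter sharing that the unfolded view makes visible.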
Key Unfolding Concepts
🔄 Parameter Sharing
Same weights used at every time step (see the sketch after this list)
⏱️ Sequential Processing
Information flows from past to future
🧠 Memory Flow
Hidden states carry information forward
📈 Computational Graph
Unfolding reveals the computation structure
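A minimal NumPy sketch of this unrolled forward pass; the sizes, the tanh nonlinearity, and the random initialization are illustrative assumptions, not values from the slide:

```python
import numpy as np

rng = np.random.default_rng(0)
T, d_in, d_h = 4, 3, 5                     # toy sequence length and sizes (assumed)

# One set of parameters, reused at every time step (parameter sharing)
W_xh = rng.normal(size=(d_h, d_in)) * 0.1
W_hh = rng.normal(size=(d_h, d_h)) * 0.1
W_hy = rng.normal(size=(d_in, d_h)) * 0.1
b_h, b_y = np.zeros(d_h), np.zeros(d_in)

x = rng.normal(size=(T, d_in))             # input sequence x_0 .. x_{T-1}
h = np.zeros(d_h)                          # initial hidden state (the "memory")

ys = []
for t in range(T):                              # sequential processing, past to future
    h = np.tanh(W_xh @ x[t] + W_hh @ h + b_h)   # memory flows forward via h
    ys.append(W_hy @ h + b_y)                   # output y_t at each step
```

Each loop iteration corresponds to one copy of the RNN cell in the unfolded diagram; only the hidden state h changes between copies.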
Benefits of Unfolding
👁️ Better Visualization
Makes temporal dependencies explicit
🔄 Backpropagation
Enables gradient computation through time (BPTT; see the sketch after this list)
🐛 Debugging
Easier to trace information flow
💡 Understanding
Clarifies how memory accumulates
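A minimal sketch of backpropagation through time, using PyTorch autograd on the unrolled loop; the sizes, the tanh cell, and the toy loss are assumptions for illustration:

```python
import torch

T, d_in, d_h = 5, 3, 4                        # toy sizes (assumed)
W_xh = torch.randn(d_h, d_in, requires_grad=True)
W_hh = torch.randn(d_h, d_h, requires_grad=True)
b_h  = torch.zeros(d_h, requires_grad=True)

x = torch.randn(T, d_in)                      # toy input sequence
h = torch.zeros(d_h)                          # initial hidden state

for t in range(T):                            # building the unfolded graph step by step
    h = torch.tanh(W_xh @ x[t] + W_hh @ h + b_h)

loss = h.pow(2).sum()                         # any scalar loss on the final state
loss.backward()                               # gradients flow back through all T steps

print(W_hh.grad.shape)                        # torch.Size([4, 4])
```

Because the same W_xh and W_hh tensors are reused at every step, their .grad fields accumulate contributions from all T steps of the unfolded graph.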
Folded vs Unfolded View
🔄 Folded RNN
Compact representation showing the recurrent loop; emphasizes the recursive nature
→→→ Unfolded RNN
Extended view across multiple time steps; shows temporal flow clearly
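The two views in code, as a toy sketch; step here is a hypothetical stand-in for the shared RNN cell:

```python
# Folded view: one cell plus a loop (the recurrent arrow in the diagram)
def run_folded(step, x_seq, h):
    for x_t in x_seq:
        h = step(x_t, h)
    return h

# Unfolded view: the same computation written out, one copy per time step
def run_unfolded_3(step, x0, x1, x2, h0):
    h1 = step(x0, h0)      # t = 0
    h2 = step(x1, h1)      # t = 1
    h3 = step(x2, h2)      # t = 2
    return h3
```

Both functions compute the same result; the unfolded form simply makes each time step an explicit node in the computational graph.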
Prepared by Dr. Gorkem Kar