CS5720 - Week 6
Slide 103 of 120

Limitations of Feedforward Networks for Sequences

The Core Problem

Feedforward networks process each input independently. They have no memory of previous inputs, making them unsuitable for sequential data where context and order matter.
Why This Matters:

• Understanding "bank" requires knowing if we're talking about money or rivers
• Predicting stock prices needs historical context
• Language translation depends on sentence structure
• Each input provides valuable context for the next
🔍 Real Example:
"The bank can guarantee deposits will eventually yield a profit" - Which bank? Financial or river bank? Context from earlier sentences is crucial!

Key Limitations

  • 🧠 No Memory: cannot remember previous inputs or learn from the history of the sequence
  • 📏 Fixed Input Size: requires same-length inputs; cannot handle variable-length sequences (see the sketch after this list)
  • 🔄 No Notion of Order: with pooled or bag-of-features inputs, [A,B,C] looks the same as [C,A,B], so sequential information is lost
  • 🎯 No Context Sharing: each prediction is independent; no information flows between steps
  • Poor Scalability: parameter count grows with sequence length, and weights are not shared across positions
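As a sketch of the fixed-input-size and scalability points (toy dimensions, random weights, and a hypothetical zero-padding scheme chosen only for illustration): a feedforward layer commits to one weight matrix sized for a maximum sequence length, so shorter inputs must be padded and longer ones truncated, while a recurrent cell reuses the same small weight set at every step and accepts any length.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4            # toy word-vector size
max_len = 6      # a feedforward network must commit to a maximum length up front

# Feedforward: the first weight matrix is tied to max_len * d input units.
W_ff = rng.normal(size=(8, max_len * d))

def feedforward(seq):
    # seq is a list of d-dimensional vectors; it must be padded or truncated to max_len.
    seq = (seq + [np.zeros(d)] * max_len)[:max_len]
    return np.tanh(W_ff @ np.concatenate(seq))

# RNN: one small set of weights, reused at every time step, so any length works.
W_x, W_h = rng.normal(size=(8, d)), rng.normal(size=(8, 8))

def rnn(seq):
    h = np.zeros(8)
    for x in seq:                      # the loop itself handles arbitrary length
        h = np.tanh(W_x @ x + W_h @ h)
    return h

short = [rng.normal(size=d) for _ in range(3)]
long_ = [rng.normal(size=d) for _ in range(10)]
print(feedforward(short).shape, feedforward(long_).shape)  # works, but long_ is silently cut to 6 words
print(rnn(short).shape, rnn(long_).shape)                   # the same weights handle both lengths intact
```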

Feedforward vs RNN: A Direct Comparison

❌ Feedforward Network
Sentence: "The movie was great"
⚠️ Each word is processed separately: no context flows between words, so the network cannot capture the meaning of the sentence as a whole.

✅ Recurrent Neural Network
Sentence: "The movie was great"
✅ Words are processed sequentially, building context step by step: "The" → "The movie" → "The movie was" → "The movie was great", so the complete sentence meaning is captured.
🚀 The Solution:
Recurrent Neural Networks (RNNs) solve these problems by introducing memory and sequential processing. They can handle variable-length sequences and maintain context!
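A minimal sketch of this sequential processing on the slide's example sentence (hypothetical toy embeddings and untrained random weights, not a full RNN implementation): the hidden state after each word summarizes everything read so far.

```python
import numpy as np

rng = np.random.default_rng(2)
words = ["The", "movie", "was", "great"]
vecs = {w: rng.normal(size=4) for w in words}    # hypothetical toy embeddings

W_x, W_h = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))

h = np.zeros(4)                                   # empty memory before reading anything
for w in words:
    h = np.tanh(W_x @ vecs[w] + W_h @ h)          # fold the new word into the running context
    print(f"after '{w}':", np.round(h, 2))

# The final h summarizes "The movie was great" as a whole; a downstream layer
# (e.g. a sentiment classifier) would read this vector rather than isolated words.
```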
Prepared by Dr. Gorkem Kar