CS5720 - Week 1
Slide 15 of 20

Why We Need Non-Linear Activations

🔢 Linear Functions

Stacking multiple linear layers still produces a single linear function
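A minimal NumPy sketch of this claim (weights and shapes are illustrative, not from the demo): composing two linear layers without an activation collapses algebraically into one linear layer with weight W2·W1 and bias W2·b1 + b2.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with no activation in between: y = W2 @ (W1 @ x + b1) + b2
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)

x = rng.normal(size=3)
two_layers = W2 @ (W1 @ x + b1) + b2

# The stack collapses to a single equivalent linear layer: y = W @ x + b
W = W2 @ W1
b = W2 @ b1 + b2
one_layer = W @ x + b

# Identical outputs: depth bought us nothing without a non-linearity
assert np.allclose(two_layers, one_layer)
```

The same collapse holds for any number of stacked linear layers, which is why depth alone adds no expressive power.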

🌊 Non-Linear Functions

Non-linear activations let the network represent complex, non-linear patterns

The Fundamental Problem

⚠️
Linear Limitations
No matter how many linear layers you stack, the result is still a single linear function
❌
XOR Problem
The classic task no linear model can solve: XOR's classes are not linearly separable
✨
Non-Linear Solution
Non-linear activation functions make networks universal approximators
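To make the XOR point concrete, here is a tiny two-unit ReLU network that computes XOR exactly. The weights below are hand-picked for illustration (not learned, and not part of the slide's demo); no single linear layer can achieve this, since XOR is not linearly separable.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

# Hand-picked weights (illustrative): two hidden ReLU units, one linear output
W1 = np.array([[1.0, 1.0],    # hidden unit 1: x1 + x2
               [1.0, 1.0]])   # hidden unit 2: x1 + x2 - 1 (after bias)
b1 = np.array([0.0, -1.0])
w2 = np.array([1.0, -2.0])    # output = h1 - 2*h2

# All four XOR inputs and their targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

out = relu(X @ W1.T + b1) @ w2
assert np.allclose(out, y)  # the ReLU network reproduces XOR exactly
```

The ReLU bends the input space so that (0,0) and (1,1) land on the same side, which is exactly what no linear map can do.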

Interactive Neural Network Demo

Try both modes to see how non-linear activations transform the network's capabilities

Prepared by Dr. Gorkem Kar