Network Components
🏗️ Layers
Organized groups of neurons that process information at different abstraction levels
🔮 Nodes (Neurons)
Basic computational units that receive, process, and transmit signals
🔗 Connections (Weights)
Learnable parameters that determine signal strength between neurons
⚖️ Biases
Threshold adjustments that shift activation functions
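A minimal sketch of how these four components combine in a single neuron: weighted inputs are summed, the bias shifts the threshold, and an activation function produces the output. The sigmoid activation here is an illustrative choice, not the only option.

```python
import math

def neuron(inputs, weights, bias):
    # Connections (weights) scale each incoming signal; the bias shifts the sum
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Sigmoid activation squashes the result into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

# With zero weights and zero bias, sigmoid(0) gives the midpoint output
print(neuron([1.0, 2.0], [0.0, 0.0], 0.0))  # 0.5

# A positive bias shifts the same neuron toward firing
print(neuron([1.0, 2.0], [0.0, 0.0], 2.0) > 0.5)  # True
```

A layer is then just a group of such neurons sharing the same inputs, each with its own weights and bias.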
Understanding Depth vs Width
Neural network architecture is defined by two key dimensions:
Depth: Number of layers (how deep)
More layers = More abstract representations
Width: Number of neurons per layer (how wide)
More neurons = More features per layer
Architecture Trade-offs:
- Deep & Narrow: parameter-efficient, but harder to train (gradients can vanish or explode through many layers)
- Shallow & Wide: easier to train, but needs many more neurons to match the expressiveness that depth provides
- Optimal: a problem-dependent balance between the two
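One concrete way to compare the two extremes is to count learnable parameters (weights plus biases) for fully connected layers. The layer sizes below are illustrative assumptions, not values from the text.

```python
def param_count(layer_sizes):
    # Each fully connected layer contributes fan_in * fan_out weights
    # plus one bias per output neuron
    return sum(a * b + b for a, b in zip(layer_sizes, layer_sizes[1:]))

# Hypothetical architectures: 10 inputs, 1 output
deep_narrow = [10, 16, 16, 16, 16, 1]  # four hidden layers, 16 neurons wide
shallow_wide = [10, 128, 1]            # one hidden layer, 128 neurons wide

print(param_count(deep_narrow))   # 1009
print(param_count(shallow_wide))  # 1537
```

Despite having four hidden layers, the deep narrow network uses fewer parameters than the single wide one, which is the efficiency trade-off described above.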