Neural Architecture Search (NAS) is the process of automatically designing neural network architectures with search algorithms, reducing the need for manual architecture engineering.
Why Use NAS?
• Automation - Reduces reliance on expert trial-and-error in architecture design
• Performance - Often finds architectures that outperform hand-designed ones
• Efficiency - Optimizes architectures for specific hardware or latency/memory constraints
• Innovation - Can surface novel architectural patterns humans might not try
💡 Key Insight
NAS has discovered architectures like EfficientNet and RegNet that achieve state-of-the-art performance while being computationally efficient.
NAS Approaches
🎮 Reinforcement Learning
Uses RL agents to propose architectures and receive rewards based on performance.
✓ NASNet • ENAS • Policy-gradient search
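As a concrete illustration, here is a minimal REINFORCE-style sketch in plain Python. The per-layer softmax policy stands in for the RNN controller used in NASNet-style methods, and `reward` is a hypothetical proxy for the validation accuracy of a trained candidate.

```python
import math
import random

# Toy search space: choose one operation per layer.
OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]
NUM_LAYERS = 3

# Controller: independent softmax logits per layer (a stand-in for
# the RNN controller used in NASNet-style methods).
logits = [[0.0] * len(OPS) for _ in range(NUM_LAYERS)]

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def sample_architecture():
    arch = []
    for layer in logits:
        probs = softmax(layer)
        arch.append(random.choices(range(len(OPS)), weights=probs)[0])
    return arch

def reward(arch):
    # Hypothetical proxy reward; in practice this is the validation
    # accuracy of the trained candidate network.
    return sum(1.0 for i in arch if OPS[i] == "conv3x3") / NUM_LAYERS

baseline, lr = 0.0, 0.5
for step in range(200):
    arch = sample_architecture()
    r = reward(arch)
    baseline = 0.9 * baseline + 0.1 * r           # moving-average baseline
    advantage = r - baseline
    for layer, chosen in zip(logits, arch):       # REINFORCE update
        probs = softmax(layer)
        for i in range(len(OPS)):
            grad = (1.0 if i == chosen else 0.0) - probs[i]
            layer[i] += lr * advantage * grad

best = [OPS[max(range(len(OPS)), key=lambda i: l[i])] for l in logits]
print("most likely architecture:", best)
```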
🧬 Evolutionary Methods
Evolves populations of architectures through mutation and crossover operations.
✓ AmoebaNet • AutoML-Zero • Population-based
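A minimal sketch of the evolutionary loop, with mutation, crossover, tournament selection, and aging-based replacement in the spirit of regularized evolution; `fitness` is again a hypothetical stand-in for validation accuracy.

```python
import random

OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]
NUM_LAYERS = 3

def fitness(arch):
    # Hypothetical proxy fitness; in practice, validation accuracy.
    return sum(1.0 for op in arch if op == "conv3x3") / NUM_LAYERS

def mutate(arch):
    child = list(arch)
    child[random.randrange(NUM_LAYERS)] = random.choice(OPS)
    return child

def crossover(a, b):
    cut = random.randrange(1, NUM_LAYERS)
    return a[:cut] + b[cut:]

# Initialize a random population, then evolve it.
population = [[random.choice(OPS) for _ in range(NUM_LAYERS)]
              for _ in range(20)]
for generation in range(50):
    # Tournament selection: pick each parent from a random subset.
    parents = [max(random.sample(population, 5), key=fitness)
               for _ in range(2)]
    child = mutate(crossover(*parents))
    # Replace the oldest individual (regularized-evolution-style aging).
    population.pop(0)
    population.append(child)

print("best found:", max(population, key=fitness))
```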
📈 Differentiable NAS
Makes architecture search differentiable, allowing gradient-based optimization.
✓ DARTS • PC-DARTS • Faster search
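The core trick can be shown in a short PyTorch sketch: a mixed operation outputs a softmax-weighted sum of all candidate ops, so the architecture logits (`alpha` here) receive gradients along with the network weights. The op set and tensor sizes are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """DARTS-style mixed operation: softmax-weighted sum of candidates."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.MaxPool2d(3, stride=1, padding=1),
            nn.Identity(),
        ])
        # Architecture parameters: one logit per candidate op.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

mixed = MixedOp(channels=8)
x = torch.randn(2, 8, 16, 16)
loss = mixed(x).mean()
loss.backward()  # gradients flow into both the weights and alpha
print("alpha grad:", mixed.alpha.grad)
# After search, the op with the largest alpha is kept (discretization).
```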
⚡ One-Shot NAS
Trains a supernet once, then efficiently searches for optimal subnetworks.
✓ SPOS • Once-for-All • Efficient search
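A minimal PyTorch sketch of single-path supernet training in the spirit of SPOS: each forward pass activates one randomly chosen candidate per layer, so all subnetworks share weights and can later be scored cheaply. The scoring metric below is a stand-in, not a real validation measure.

```python
import random
import torch
import torch.nn as nn

class SuperLayer(nn.Module):
    """One supernet layer: exactly one candidate op is active per forward."""
    def __init__(self, channels):
        super().__init__()
        self.choices = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.Identity(),
        ])

    def forward(self, x, choice):
        return self.choices[choice](x)

layers = nn.ModuleList([SuperLayer(8) for _ in range(3)])

def forward_path(x, path):
    for layer, choice in zip(layers, path):
        x = layer(x, choice)
    return x

# Supernet training step: sample one random path per batch so all
# candidate ops gradually accumulate shared trained weights.
x = torch.randn(2, 8, 16, 16)
path = [random.randrange(3) for _ in range(3)]
forward_path(x, path).mean().backward()

# Search phase: with shared weights, many subnetworks can be scored
# cheaply; the mean activation here only stands in for a real metric.
with torch.no_grad():
    candidates = [[random.randrange(3) for _ in range(3)] for _ in range(10)]
    best = max(candidates, key=lambda p: float(forward_path(x, p).mean()))
print("best sampled path:", best)
```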
NAS Workflow
1. Define Search Space
Specify the set of possible operations, layer types, and connections that can be used in architectures.
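For example, a layer-wise search space can be written down as plain data. The `SEARCH_SPACE` structure below is hypothetical, but counting its members shows why exhaustive evaluation is hopeless:

```python
# A hypothetical layer-wise search space encoded as plain data.
SEARCH_SPACE = {
    "operations": ["conv3x3", "conv5x5", "sep_conv3x3",
                   "maxpool3x3", "avgpool3x3", "identity"],
    "num_layers": range(4, 13),          # allowed network depths
    "skip_connections": [True, False],   # may a layer add a skip edge?
}

def count_architectures(space):
    # Size of the space, ignoring skip-connection wiring.
    return sum(len(space["operations"]) ** depth
               for depth in space["num_layers"])

print(f"{count_architectures(SEARCH_SPACE):,} candidate architectures")
```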
2. Choose Search Strategy
Select the algorithm (RL, evolutionary, differentiable) to explore the architecture space.
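Whichever strategy you choose, random search over the same space is the baseline it should beat. A self-contained sketch, with a toy `evaluate` standing in for trained validation accuracy:

```python
import random

SPACE = {"operations": ["conv3x3", "conv5x5", "maxpool", "identity"],
         "num_layers": range(2, 6)}

def sample_random(space):
    depth = random.choice(list(space["num_layers"]))
    return [random.choice(space["operations"]) for _ in range(depth)]

def toy_evaluate(arch):
    # Hypothetical stand-in for trained validation accuracy.
    return arch.count("conv3x3") / len(arch)

def random_search(space, evaluate, budget=100):
    """Random search: a baseline any fancier strategy should beat."""
    best_arch, best_score = None, float("-inf")
    for _ in range(budget):
        arch = sample_random(space)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

arch, score = random_search(SPACE, toy_evaluate)
print(f"score {score:.2f}: {arch}")
```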
3. Performance Estimation
Evaluate candidate architectures via full training, low-fidelity proxies (fewer epochs, smaller data), or weight-sharing techniques.
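One common low-fidelity proxy is to train each candidate briefly on a small slice of data and score it by loss improvement instead of full validation accuracy. A PyTorch sketch, with an illustrative model and random data standing in for a real loader:

```python
import torch
import torch.nn as nn

def proxy_score(model, loader, max_batches=20):
    """Low-fidelity estimate: train briefly on a data subset and report
    the training-loss drop instead of full validation accuracy."""
    opt = torch.optim.SGD(model.parameters(), lr=0.05)
    loss_fn = nn.CrossEntropyLoss()
    first = last = None
    for i, (x, y) in enumerate(loader):
        if i >= max_batches:
            break
        loss = loss_fn(model(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
        if first is None:
            first = loss.item()
        last = loss.item()
    return first - last   # bigger drop -> more promising candidate

# Toy usage: random tensors stand in for a real data loader.
data = [(torch.randn(16, 32), torch.randint(0, 4, (16,))) for _ in range(20)]
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 4))
print("proxy score:", proxy_score(model, data))
```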
4. Iterative Optimization
Use performance feedback to guide the search toward better architectures over iterations.
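In its simplest form, the feedback loop is hill climbing: mutate the current architecture and keep the change only when the (stand-in) evaluation improves.

```python
import random

OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]

def evaluate(arch):
    # Hypothetical stand-in for the performance-estimation step.
    return arch.count("conv3x3") / len(arch)

def mutate(arch):
    child = list(arch)
    child[random.randrange(len(child))] = random.choice(OPS)
    return child

# Hill climbing: keep a mutation only when feedback says it helped.
arch = [random.choice(OPS) for _ in range(6)]
score = evaluate(arch)
for step in range(200):
    candidate = mutate(arch)
    cand_score = evaluate(candidate)
    if cand_score >= score:           # feedback guides the search
        arch, score = candidate, cand_score

print(f"final score {score:.2f}: {arch}")
```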