What Makes PyTorch Dynamic?
🔄 Define-by-Run Philosophy
Build computational graphs on-the-fly as operations execute
🐛 Natural Debugging
Use standard Python debugging tools like pdb and print statements
🌊 Dynamic Control Flow
Use Python control flow (if, for, while) naturally in your models
🐍 Pythonic Design
Feels like writing regular Python code, not a domain-specific language
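The points above can be seen in a minimal sketch (module name and layer sizes are illustrative, not from the original): because PyTorch executes each operation eagerly, ordinary Python tools work mid-forward.

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        h = self.fc(x)
        # A plain print works in the middle of forward(); you could
        # equally drop in a breakpoint: import pdb; pdb.set_trace()
        print("hidden mean:", h.mean().item())
        return torch.relu(h)

net = TinyNet()
out = net(torch.randn(3, 4))
print(out.shape)  # torch.Size([3, 2])
```

No graph compilation step stands between you and the tensors, which is what makes `print` and `pdb` usable exactly as in any Python program.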
Dynamic vs Static Graphs
# PyTorch Dynamic Example
import torch
import torch.nn as nn

linear = nn.Linear(16, 16)  # shared layer, for illustration

def dynamic_net(x, n_layers):
    for i in range(n_layers):          # Dynamic! Loop count chosen at runtime
        if x.sum() > 0:                # Conditional on the data itself
            x = torch.relu(linear(x))
        else:
            x = torch.tanh(linear(x))
    return x

# Graph built during execution
output = dynamic_net(torch.randn(4, 16), 5)
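The claim that the graph is "built during execution" can be checked directly: each call records only the operations that actually ran, and `grad_fn` exposes the tail of that freshly built graph (layer sizes here are illustrative).

```python
import torch
import torch.nn as nn

linear = nn.Linear(8, 8)  # shared layer, for illustration

def dynamic_net(x, n_layers):
    for _ in range(n_layers):
        if x.sum() > 0:
            x = torch.relu(linear(x))
        else:
            x = torch.tanh(linear(x))
    return x

x = torch.randn(2, 8, requires_grad=True)
out = dynamic_net(x, 3)
# grad_fn is the last node of the graph recorded by *this* call
print(out.grad_fn)        # e.g. <ReluBackward0> or <TanhBackward0>
out.sum().backward()      # backprop traverses the graph just built
print(x.grad.shape)       # torch.Size([2, 8])
```

Calling `dynamic_net` again with a different `n_layers` records a different graph; nothing is fixed ahead of time.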
# TensorFlow Static Example (TF 1.x)
import tensorflow as tf

# Define the whole graph first...
x = tf.placeholder(tf.float32, shape=[None, 784])
y = tf.layers.dense(x, 128)
y = tf.nn.relu(y)
output = tf.layers.dense(y, 10)

# ...then execute it inside a session
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    result = sess.run(output, {x: data})