CS5720 - Week 1
Slide 12 of 20

Forward Propagation - Step by Step

What is Forward Propagation?

Forward propagation is the process of passing input data through the neural network to generate predictions. Signals flow from input to output, layer by layer.
The Journey of Data:

Input → Weighted Sum → Activation → Next Layer → ... → Output

Key Operations:
  • Matrix multiplication (weights × inputs)
  • Addition of bias terms
  • Application of activation functions
  • Sequential layer processing
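The key operations above can be sketched for a single layer; the shapes (3 inputs, 4 neurons) and random weights are illustrative, not from the slides:

```python
import numpy as np

# Hypothetical layer: 3 input features, 4 neurons.
rng = np.random.default_rng(0)
x = rng.normal(size=(3,))    # input vector
W = rng.normal(size=(4, 3))  # weight matrix (neurons x inputs)
b = np.zeros(4)              # bias vector

z = W @ x + b                # matrix multiplication + bias addition
a = np.maximum(0, z)         # activation function (ReLU here)
# a now has shape (4,), ready to feed the next layer
```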

The Forward Pass Steps

1 Input Reception

Receive and normalize input features

2 Weighted Sum

Compute the pre-activation z = Wx + b (one weighted sum per neuron in the layer)

3 Activation Function

Apply non-linear transformation a = f(z)

4 Layer Output

Pass activations to next layer or final output
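The four steps can be chained into a complete forward pass. A minimal sketch, assuming ReLU hidden activations and made-up layer sizes (2 → 4 → 1):

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def forward(x, layers):
    """Apply z = Wx + b then a = f(z) sequentially for each layer."""
    a = x
    for W, b in layers:
        z = W @ a + b   # step 2: weighted sum
        a = relu(z)     # step 3: non-linear activation
    return a            # step 4: output of the final layer

# Illustrative parameters: two layers with random weights.
rng = np.random.default_rng(1)
layers = [(rng.normal(size=(4, 2)), np.zeros(4)),
          (rng.normal(size=(1, 4)), np.zeros(1))]
out = forward(np.array([0.5, -0.2]), layers)
```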



Detailed Calculations

Layer 1 Computation

h₁ = σ(w₁₁x₁ + w₁₂x₂ + b₁)
h₂ = σ(w₂₁x₁ + w₂₂x₂ + b₂)
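In matrix form, the two hidden units are computed at once as h = σ(Wx + b). A worked sketch with made-up weights and inputs (not values from the lecture):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Illustrative numbers only.
x = np.array([1.0, 0.5])        # x1, x2
W = np.array([[0.2, -0.4],      # w11, w12
              [0.7,  0.1]])     # w21, w22
b = np.array([0.1, -0.3])       # b1, b2

h = sigmoid(W @ x + b)          # h = [h1, h2]
```

For example, the first unit's pre-activation is 0.2·1.0 + (−0.4)·0.5 + 0.1 = 0.1, so h₁ = σ(0.1).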


Activation Functions

ReLU: f(x) = max(0, x)
Sigmoid: f(x) = 1/(1+e⁻ˣ)
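Both activations are one-liners in NumPy:

```python
import numpy as np

def relu(z):
    # Element-wise max(0, z)
    return np.maximum(0, z)

def sigmoid(z):
    # Squashes any real number into (0, 1)
    return 1 / (1 + np.exp(-z))

z = np.array([-2.0, 0.0, 3.0])
relu(z)       # -> [0., 0., 3.]
sigmoid(0.0)  # -> 0.5
```

Note that ReLU zeroes out negative inputs while passing positives through unchanged, whereas sigmoid maps everything into (0, 1) with σ(0) = 0.5.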


Output Layer

y = softmax(Wₒᵤₜh + bₒᵤₜ)
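Softmax turns the output layer's pre-activations into a probability distribution. A minimal sketch, with hypothetical hidden activations h and output-layer parameters (2 hidden units → 3 classes):

```python
import numpy as np

def softmax(z):
    # Subtract the max before exponentiating for numerical
    # stability; the result is mathematically unchanged.
    e = np.exp(z - z.max())
    return e / e.sum()

# Illustrative values only.
h = np.array([0.5, 0.2])
W_out = np.array([[ 1.0, -1.0],
                  [ 0.5,  0.5],
                  [-0.3,  0.8]])
b_out = np.zeros(3)

y = softmax(W_out @ h + b_out)  # class probabilities, summing to 1
```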

Final prediction calculation

Prepared by Dr. Gorkem Kar