11.2 Backpropagation

Watch data flow forward through the network, then watch gradients flow backward. The network learns by adjusting its weights to minimize the loss.
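The weight adjustment described here is, written out, the standard gradient-descent update (η is the learning rate, shown as LR in the demo's metrics panel):

```latex
w \leftarrow w - \eta \, \frac{\partial L}{\partial w}
```

where the gradient ∂L/∂w is what the backward pass computes, layer by layer, via the chain rule.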

[Interactive demo: a 2-4-4-1 network (inputs x1, x2; two hidden layers of four units; output ŷ) cycles through four phases — Forward Pass, Loss Computation, Backward Pass, Weight Update — on samples such as [0, 0] → [0], plotting the loss history as it trains. A metrics panel tracks the loss, step count, learning rate (0.5), average gradient magnitude, and accuracy.]
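The four phases the demo animates can be sketched in a few lines of NumPy. This is a minimal illustration, not the demo's actual implementation: it assumes the demo's 2-4-4-1 architecture and 0.5 learning rate, and additionally assumes sigmoid activations, a half-mean-squared-error loss, and an XOR-style dataset consistent with the sample [0, 0] → [0].

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR-style data, consistent with the demo's sample [0, 0] -> [0] (an assumption).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

sizes = [2, 4, 4, 1]                      # 2-4-4-1, as in the demo's Arch readout
W = [rng.normal(0.0, 0.5, (a, b)) for a, b in zip(sizes, sizes[1:])]
b = [np.zeros((1, n)) for n in sizes[1:]]
lr = 0.5                                  # learning rate from the metrics panel

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

history = []                              # loss history, one entry per step
for step in range(5000):
    # 1. Forward pass: cache every layer's activation for the backward pass.
    acts = [X]
    for Wi, bi in zip(W, b):
        acts.append(sigmoid(acts[-1] @ Wi + bi))
    y_hat = acts[-1]

    # 2. Loss computation: half mean squared error (the 1/2 cancels the
    #    factor of 2 in the derivative below).
    history.append(0.5 * np.mean((y_hat - Y) ** 2))

    # 3. Backward pass: start from dL/dz at the output, then walk the
    #    chain rule back toward the input.
    delta = (y_hat - Y) * y_hat * (1 - y_hat)
    for i in reversed(range(len(W))):
        gW = acts[i].T @ delta / len(X)
        gb = delta.mean(axis=0, keepdims=True)
        if i > 0:                         # propagate through the pre-update weights
            delta = (delta @ W[i].T) * acts[i] * (1 - acts[i])
        # 4. Weight update: plain gradient descent.
        W[i] -= lr * gW
        b[i] -= lr * gb

print(f"loss after {len(history)} steps: {history[-1]:.4f}")
```

Note the ordering inside the backward loop: because the weight update shares the loop with gradient propagation, each layer's delta must be computed with the not-yet-updated weights, so the propagation line comes before the update.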