Back Propagation

  • 2024/08/03
  • ML Neural networks Backpropagation

Oi, listen up you mob! If you're keen to wrap your head around back-prop and how those bloody parameters get updated, I reckon this vid's the duck's nuts: Building makemore Part 4: Becoming a Backprop Ninja by that clever guru Andrej Karpathy. Fair dinkum, I actually managed to follow most of it!
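To get a feel for what the lecture drills into, here's a minimal sketch of hand-derived backprop for a single tanh neuron, checked against a numerical gradient. This is my own toy example (variable names and values are made up, not from the lecture) — the real lecture does this over a whole MLP, but the chain-rule bookkeeping is the same idea:

```python
import math

# Tiny neuron: y = tanh(w*x + b), squared-error loss L = (y - target)^2.
# Forward pass, then a hand-derived backward pass (chain rule).
def forward_backward(w, x, b, target):
    # forward
    z = w * x + b
    y = math.tanh(z)
    loss = (y - target) ** 2

    # backward, innermost derivative first
    dloss_dy = 2 * (y - target)   # d/dy of (y - target)^2
    dy_dz = 1 - y ** 2            # d/dz of tanh(z)
    dloss_dz = dloss_dy * dy_dz
    dloss_dw = dloss_dz * x       # z = w*x + b, so dz/dw = x
    dloss_db = dloss_dz * 1.0     # dz/db = 1
    return loss, dloss_dw, dloss_db

# Sanity check: compare the analytic gradient for w against
# a central finite difference. They should agree closely.
w, x, b, t = 0.5, 1.5, -0.25, 0.8
loss, dw, db = forward_backward(w, x, b, t)
eps = 1e-6
num_dw = (forward_backward(w + eps, x, b, t)[0]
          - forward_backward(w - eps, x, b, t)[0]) / (2 * eps)
print(abs(dw - num_dw) < 1e-6)
```

A gradient-descent update is then just `w -= lr * dloss_dw` — which is the "how the parameters get updated" bit the lecture makes you do by hand for every layer.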

While I was at it, I knocked up a quick sketch to help me get my head around it. Used it while I was glued to the screen, you know? Then I got that AI bot Claude to give us a hand writing out all them fancy math formulas in that MathJax lingo.

After giving it a burl, I'm proper curious how they've gone and chucked this back-prop business into PyTorch. Bet it's a bit of a dog's breakfast in the actual code, ay? Wouldn't mind having a gander at that!

This diagram helped me follow the lecture. Also, here is the notebook in which I added the MathJax expressions using Claude.

./images/backpropagation-1.svg