Day 5 of 5
⏱ ~60 minutes
Calculus for AI in 5 Days — Day 5

Gradient Descent & Backprop

SGD, learning rate, momentum, Adam, backpropagation derivation, autograd

What You'll Cover Today

Day 5 of Calculus for AI in 5 Days brings everything together. You'll synthesize what you've built across the week into a complete, working implementation. This is the hardest day — and the most satisfying.

ℹ️
Topics today: SGD, Adam, backpropagation. Each section has code you can copy and run immediately.

SGD

Understanding SGD is the core goal of Day 5. The mental model is simple: compute the loss on a small random mini-batch, take its gradient with respect to the parameters, and step each parameter a small amount in the opposite direction; the learning rate sets how far. Most confusion comes from skipping this mental model and jumping straight to implementation. Start with the model, then write the code.

SGD
# SGD — Working Example
# Minimal stochastic gradient descent for a 1-D linear model y = w*x + b.
# Study this pattern carefully before writing your own version.

import random


def predict(w: float, b: float, x: float) -> float:
    return w * x + b


def mse_gradients(w, b, batch):
    """Gradients of mean squared error over one mini-batch."""
    gw = gb = 0.0
    for x, y in batch:
        err = predict(w, b, x) - y
        gw += 2 * err * x   # d/dw of (w*x + b - y)^2
        gb += 2 * err       # d/db of (w*x + b - y)^2
    n = len(batch)
    return gw / n, gb / n


def sgd(data, lr=0.01, batch_size=4, epochs=200):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        random.shuffle(data)          # "stochastic": random mini-batch order
        for i in range(0, len(data), batch_size):
            gw, gb = mse_gradients(w, b, data[i:i + batch_size])
            w -= lr * gw              # step against the gradient
            b -= lr * gb
    return w, b


# Usage: recover w = 2, b = -1 from noiseless data
data = [(x / 10, 2 * (x / 10) - 1) for x in range(-20, 21)]
w, b = sgd(data)
print(f"w = {w:.3f}, b = {b:.3f}")   # converges to w = 2.000, b = -1.000
💡
Key insight: When working with SGD, always start with the simplest possible case that works end-to-end. Complexity is easier to add than simplicity is to recover.
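One way to act on that advice: before mini-batches, momentum, or anything else, verify the bare update rule on a one-parameter function you can solve by hand. This sketch (not tied to any particular library) minimizes f(w) = (w - 3)^2, whose minimum is at w = 3:

```python
# Plain gradient descent on f(w) = (w - 3)^2; df/dw = 2*(w - 3).
def grad(w):
    return 2 * (w - 3)

w = 0.0
lr = 0.1
for _ in range(100):
    w -= lr * grad(w)   # the same update rule SGD uses, no batching

print(round(w, 4))  # → 3.0
```

If this does not converge to 3, the bug is in your update rule, not in your batching or data pipeline. That isolation is exactly why the simplest end-to-end case comes first.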

Adam

Adam builds on the SGD update rather than replacing it: it keeps a running average of past gradients (momentum) and scales each parameter's step by a running average of squared gradients, giving every parameter its own adaptive learning rate. Once you understand the plain SGD update, Adam is the natural next step.

💡
Pro tip: When working with Adam, always read the official documentation for the exact version you're using. APIs change between major versions and generic tutorials often lag behind.
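To make the update concrete, here is a minimal from-scratch sketch of one Adam step for a list of parameters. The hyperparameter names and default values (lr, beta1, beta2, eps) follow the commonly cited defaults; in practice you would use your framework's built-in optimizer rather than this hand-rolled version.

```python
import math

def adam_step(params, grads, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update. m and v hold running first/second moment estimates."""
    new_params = []
    for i, (p, g) in enumerate(zip(params, grads)):
        m[i] = beta1 * m[i] + (1 - beta1) * g       # momentum (first moment)
        v[i] = beta2 * v[i] + (1 - beta2) * g * g   # squared gradient (second moment)
        m_hat = m[i] / (1 - beta1 ** t)             # bias correction for step t
        v_hat = v[i] / (1 - beta2 ** t)
        new_params.append(p - lr * m_hat / (math.sqrt(v_hat) + eps))
    return new_params

# Usage: minimize f(w) = w^2 starting from w = 5
params, m, v = [5.0], [0.0], [0.0]
for t in range(1, 2001):
    grads = [2 * params[0]]                         # df/dw = 2w
    params = adam_step(params, grads, m, v, t, lr=0.05)
print(round(params[0], 3))
```

Note how the early steps have size close to lr regardless of the gradient's magnitude: the second-moment scaling normalizes the step, which is why Adam is less sensitive to loss scaling than plain SGD.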

Backpropagation

Backpropagation rounds out today's lesson: it is the chain-rule procedure that computes the gradients SGD and Adam consume. Autograd systems automate exactly this bookkeeping. You'll use all three concepts together in the exercise below.
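Here is a hand-derived backward pass through a single tanh neuron with a squared loss, checked against a finite-difference gradient. The function names and the specific numbers are illustrative; the chain-rule structure is the point.

```python
import math

def forward(w, b, x, y):
    z = w * x + b           # linear layer
    a = math.tanh(z)        # activation
    loss = (a - y) ** 2     # squared error
    return z, a, loss

def backward(w, b, x, y):
    z, a, loss = forward(w, b, x, y)
    dloss_da = 2 * (a - y)          # dL/da
    da_dz = 1 - a ** 2              # tanh'(z) = 1 - tanh(z)^2
    dloss_dz = dloss_da * da_dz     # chain rule: dL/dz
    return dloss_dz * x, dloss_dz   # dL/dw, dL/db

# Finite-difference check: backprop should match the numeric gradient
w, b, x, y = 0.5, -0.2, 1.3, 0.7
gw, gb = backward(w, b, x, y)
h = 1e-6
num_gw = (forward(w + h, b, x, y)[2] - forward(w - h, b, x, y)[2]) / (2 * h)
print(abs(gw - num_gw))  # near zero if the derivation is right
```

This numeric check is worth keeping as a habit: it catches sign errors and dropped chain-rule factors immediately, long before a training run would reveal them.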

📝 Day 5 Exercise
Gradient Descent & Backprop — Hands-On
  1. Set up your environment for today's topic: install required tools and verify the basics work before writing any logic.
  2. Implement a minimal working version of SGD using the code example in this lesson as your starting point.
  3. Extend your implementation to incorporate Adam — this is where the two concepts connect.
  4. Test your implementation with both valid and invalid inputs. What happens at the boundaries?
  5. Review your code: is there anything you'd name differently? Any function doing more than one thing? Refactor one thing.

Day 5 Summary

  • SGD is the foundation of today's lesson: step each parameter against the gradient, scaled by the learning rate.
  • Adam adds momentum and per-parameter adaptive step sizes on top of the SGD update.
  • Backpropagation computes the gradients both optimizers consume, tying the day's concepts into a complete pattern.
  • Error handling and input validation belong in the first version, not as an afterthought.
  • Read error messages carefully — they usually tell you exactly what's wrong.
Challenge

Extend today's exercise by adding one feature that wasn't in the instructions. Document what you built in a comment at the top of the file. This habit of going one step further is what separates engineers who grow fast from those who stay stuck.
