API Development for AI · Day 4 of 5 · ~45 minutes

Day 4: Database Integration: Store and Retrieve AI Results

AI APIs are stateless — each call forgets everything before it. To build useful applications, you need to store results, conversations, and metadata. Learn how to connect your API to SQLite and PostgreSQL.

What You'll Build

An API with persistent storage: endpoints that save AI analysis results to a database, retrieve history by user, and support conversation threads with full context.

Section 1 · 10 min

When You Need a Database

Not every AI endpoint needs a database. A one-shot analysis endpoint that returns results immediately doesn't. But these cases do:

  • Conversation history: Multi-turn chat requires storing previous messages
  • Result caching: Don't re-run expensive AI calls for identical inputs
  • User data: Track usage, preferences, and personalization data
  • Audit trail: Record what was sent, received, and when for compliance

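For the caching case, the key idea is a deterministic lookup key: identical inputs must map to the same key so you can check the database before paying for another API call. A minimal sketch, using only the standard library (the function name and separator are assumptions, not part of the lesson's code):

```python
import hashlib

def cache_key(model: str, text: str) -> str:
    """Hash the model name and input text so identical requests share one key."""
    # \x00 separates the fields so ("ab", "c") and ("a", "bc") can't collide
    return hashlib.sha256(f"{model}\x00{text}".encode()).hexdigest()

key = cache_key("claude-sonnet-4-20250514", "Summarize this article.")
```

A column holding this hash (with an index) lets the endpoint look up prior results cheaply before calling the model.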
For development, use SQLite (file-based, no server required). For production, use PostgreSQL. With SQLAlchemy the code is nearly identical: you change only the connection string.

Section 2 · 15 min

SQLAlchemy Setup and Models

SQLAlchemy is the standard Python ORM — it maps Python classes to database tables so you never write raw SQL unless you want to.

database.py
from sqlalchemy import create_engine, Column, String, Text, DateTime, Integer
from sqlalchemy.orm import declarative_base, sessionmaker
from datetime import datetime, timezone

# SQLite for dev, swap to PostgreSQL URL for production
DATABASE_URL = "sqlite:///./ai_results.db"

# check_same_thread=False is SQLite-only; drop connect_args for PostgreSQL
engine = create_engine(DATABASE_URL, connect_args={"check_same_thread": False})
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()

class AnalysisResult(Base):
    __tablename__ = "results"
    id = Column(Integer, primary_key=True)
    user_id = Column(String, index=True)
    input_text = Column(Text)
    result = Column(Text)
    tokens_used = Column(Integer)
    created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))

Base.metadata.create_all(bind=engine)
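Before wiring this into FastAPI, it helps to see the session pattern on its own: open a session, add a record, commit, query it back. A standalone sketch that reuses the same model shape with an in-memory SQLite database (the sample values are made up for illustration):

```python
from datetime import datetime, timezone
from sqlalchemy import create_engine, Column, String, Text, DateTime, Integer
from sqlalchemy.orm import declarative_base, sessionmaker

# In-memory SQLite: nothing touches disk, handy for experiments and tests
engine = create_engine("sqlite:///:memory:")
SessionLocal = sessionmaker(bind=engine)
Base = declarative_base()

class AnalysisResult(Base):
    __tablename__ = "results"
    id = Column(Integer, primary_key=True)
    user_id = Column(String, index=True)
    input_text = Column(Text)
    result = Column(Text)
    tokens_used = Column(Integer)
    created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))

Base.metadata.create_all(bind=engine)

db = SessionLocal()
db.add(AnalysisResult(user_id="u1", input_text="hello",
                      result="greeting", tokens_used=12))
db.commit()

# Query it back: filter on the indexed user_id column
row = db.query(AnalysisResult).filter(AnalysisResult.user_id == "u1").first()
db.close()
```

The same add/commit/query calls appear inside the endpoint in the next section; only the session's lifetime changes.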
Section 3 · 20 min

CRUD Endpoints with Database

Now connect the database to your endpoints:

main.py
from fastapi import FastAPI, Depends
from pydantic import BaseModel
from sqlalchemy.orm import Session
from .database import SessionLocal, AnalysisResult
import anthropic

app = FastAPI()

class TextRequest(BaseModel):
    text: str

def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

@app.post("/analyze/{user_id}")
async def analyze_and_save(
    user_id: str,
    req: TextRequest,
    db: Session = Depends(get_db)
):
    client = anthropic.AsyncAnthropic()
    msg = await client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=512,
        messages=[{"role": "user", "content": req.text}]
    )
    result_text = msg.content[0].text
    record = AnalysisResult(
        user_id=user_id, input_text=req.text,
        result=result_text,
        tokens_used=msg.usage.input_tokens + msg.usage.output_tokens
    )
    db.add(record)
    db.commit()
    db.refresh(record)  # reload so the autogenerated id is populated
    return {"result": result_text, "id": record.id}

What You Learned Today

  • When AI applications need a database: conversations, caching, audit trails, user data
  • How SQLAlchemy maps Python classes to database tables
  • How FastAPI's dependency injection (Depends) provides database sessions to endpoints
  • How to save AI results to SQLite during development and swap to PostgreSQL for production
Your Challenge

Go Further on Your Own

  • Add a GET /history/{user_id} endpoint that returns the last 10 analysis results for a user, ordered by most recent
  • Add result caching: before calling Claude, check if the exact same input text was analyzed in the last 24 hours — if so, return the cached result
  • Add a DELETE /history/{id} endpoint that deletes a specific result record
Day 4 Complete

Nice work. Keep going.

Day 5 is ready when you are.

Continue to Day 5

