Building with AI – A Developer's Diary

Starlette Day 1 — Foundations

Build your first Starlette API from scratch: routing, JSON responses, POST requests, and the async model that makes it all work.

NoCo Interactive • Python · Starlette · ASGI • 15–20 minute read

If you've spent time with Flask, you already have most of the mental model you need. Routes map URLs to functions. Functions receive a request and return a response. That core loop is the same in Starlette. What changes is the foundation underneath—and the foundation changes everything.

Starlette is an ASGI framework. That acronym matters, but you don't need to fully understand it on day one. What you need to understand is the practical consequence: every endpoint you write is async, which means your server can handle many requests concurrently without spawning threads. For a backend that talks to databases, calls external APIs, or waits on slow I/O, that's a meaningful advantage.

This is Day 1 of a multi-day series. By the end of today you'll have a running local server, two working API endpoints, and a POST handler that reads incoming JSON. You'll also understand why the code is structured the way it is, not just that it works.


The mental model

Before writing any code, it helps to know exactly what two things you're working with:

  • Starlette is your application framework. It defines routes, receives requests, runs your handler functions, and sends responses.
  • Uvicorn is the server that runs your app. It listens on a port, accepts connections, and speaks the ASGI protocol that Starlette understands.

You could swap Uvicorn for another ASGI server and the same Starlette app would run fine. The two are deliberately separate—framework and server are not the same thing. Most tutorials blur this line; keeping it clear will save you confusion later.
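To make that separation concrete, here's what the ASGI protocol looks like with no framework at all. A bare ASGI application is just an async callable taking three arguments; the name raw_app and the plain-text body are illustrative, not anything Starlette generates:

```python
# A bare ASGI application: an async callable taking (scope, receive, send).
# This is the protocol Uvicorn speaks; Starlette is a framework built on top of it.
async def raw_app(scope, receive, send):
    assert scope["type"] == "http"
    # First message: status line and headers.
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    # Second message: the response body.
    await send({"type": "http.response.body", "body": b"Hello from raw ASGI"})
```

You could run this with uvicorn main:raw_app exactly like the Starlette app. Everything Starlette does for you ultimately bottoms out in messages like these.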

Setup

This takes about five minutes. Start with a fresh project folder and a virtual environment—it keeps your dependencies isolated and makes the project portable.

Shell
mkdir starlette-day1
cd starlette-day1
python3 -m venv venv
source venv/bin/activate   # Mac / Linux
# venv\Scripts\activate    # Windows

With the environment active, install the two packages you need:

Shell
pip install starlette uvicorn

Then create your app file:

Shell
touch main.py

Your first Starlette app

Open main.py and add this:

Python
from starlette.applications import Starlette
from starlette.responses import JSONResponse
from starlette.routing import Route

async def homepage(request):
    return JSONResponse({"message": "Hello, Starlette!"})

app = Starlette(routes=[
    Route("/", homepage)
])

Run the server:

Shell
uvicorn main:app --reload

You'll see output like Uvicorn running on http://127.0.0.1:8000. Open that URL in a browser and you'll get back:

JSON
{"message": "Hello, Starlette!"}

That's a working API. Let's break down exactly what you just wrote.

What each line does

async def homepage(request): — This is your endpoint function. It's async because Starlette is an ASGI framework; handlers are written as coroutines so they can await I/O. (Starlette will also accept a plain def function and run it in a thread pool, but async def is the idiomatic default and what this series uses throughout.) The request argument contains everything about the incoming HTTP request: method, headers, body, query parameters, path parameters.

return JSONResponse({...}) — This serializes your dictionary to JSON and returns a proper HTTP response with the right Content-Type header set automatically.
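Under the hood, JSONResponse does roughly what this stdlib sketch shows. The function name render_json is invented for illustration; the real class also handles status codes, custom headers, and compact separators:

```python
import json

def render_json(data: dict) -> tuple[bytes, str]:
    """Roughly what JSONResponse does: serialize the dict to UTF-8 bytes
    and pair the body with the right Content-Type value."""
    body = json.dumps(data).encode("utf-8")
    return body, "application/json"

body, content_type = render_json({"message": "Hello, Starlette!"})
# body is b'{"message": "Hello, Starlette!"}'
```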

Route("/", homepage) — This maps the URL path / to the homepage function. When a request comes in for /, Starlette finds this route and calls the function.

Request → Route → Function → Response. That's the entire core loop of a Starlette application. Every endpoint, no matter how complex, is just this pattern.
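That loop can be sketched in a few lines of plain Python. This is a simplification for intuition, not Starlette's actual implementation; handle, routes, and the dict-shaped request/response are all invented here:

```python
import asyncio

# Hypothetical mini-dispatcher illustrating the core loop:
# find the route whose path and method match, then await its handler.
async def handle(routes, method, path):
    for route_path, route_methods, handler in routes:
        if route_path == path and method in route_methods:
            return await handler({"method": method, "path": path})
    return {"status": 404}

async def homepage(request):
    return {"status": 200, "body": {"message": "Hello, Starlette!"}}

routes = [("/", {"GET"}, homepage)]
response = asyncio.run(handle(routes, "GET", "/"))
```

Starlette's real router adds path parameters, method-not-allowed handling, and middleware, but the shape is the same: match, call, return.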

Adding a second endpoint

A single route isn't an API. Let's simulate a real backend by adding a /tasks endpoint that returns structured data.

Replace your main.py with this:

Python
from starlette.applications import Starlette
from starlette.responses import JSONResponse
from starlette.routing import Route

# In-memory task list (we'll connect a real database in a later day)
tasks = [
    {"id": 1, "title": "Learn Starlette"},
    {"id": 2, "title": "Build an API"}
]

async def homepage(request):
    return JSONResponse({"message": "API is running"})

async def get_tasks(request):
    return JSONResponse(tasks)

app = Starlette(routes=[
    Route("/", homepage),
    Route("/tasks", get_tasks)
])

Because you started Uvicorn with --reload, the server restarts automatically when you save. Hit http://127.0.0.1:8000/tasks and you'll see:

JSON
[
  {"id": 1, "title": "Learn Starlette"},
  {"id": 2, "title": "Build an API"}
]

This is exactly how a mobile or web frontend talks to a backend: it makes an HTTP request to a URL and gets back JSON. The tasks list is in memory for now, but the shape of the API is real and already useful for front-end development.

Handling incoming data — POST requests

Reading data is step one. Step two is accepting it. Let's add a POST endpoint that creates a new task from a JSON body.

Add this import at the top:

Python
from starlette.requests import Request

Then add the handler function:

Python
async def create_task(request: Request):
    data = await request.json()

    new_task = {
        "id": len(tasks) + 1,
        "title": data.get("title")
    }

    tasks.append(new_task)

    return JSONResponse(new_task)

And register it as a route — notice the explicit methods argument:

Python
app = Starlette(routes=[
    Route("/", homepage),
    Route("/tasks", get_tasks),
    Route("/tasks", create_task, methods=["POST"])
])

Starlette allows multiple routes on the same path as long as the HTTP methods are different. The GET and POST for /tasks are handled separately, which is standard REST convention.

Test the POST endpoint with curl:

Shell
curl -X POST http://127.0.0.1:8000/tasks \
  -H "Content-Type: application/json" \
  -d '{"title": "New Task"}'

You should get back:

JSON
{"id": 3, "title": "New Task"}

Now hit /tasks in your browser and you'll see all three items—the two you started with and the one you just created. The list persists for the lifetime of the server process. On Day 3, we'll replace this with a real database so data survives restarts.

What await request.json() is doing

The await keyword is the practical face of async Python. Reading a request body is an I/O operation—the data arrives over a network connection. Rather than blocking the entire server process until that read completes, await suspends this coroutine and lets the event loop handle other requests in the meantime. When the read finishes, execution resumes exactly where it left off.

For a tutorial app with one user, this distinction is invisible. For a production API handling hundreds of concurrent connections, it's the difference between a server that scales and one that doesn't.

Understanding async — the core advantage

Every endpoint in this series is async. That's not optional ceremony; it's the design Starlette is built around. Here's the most practical way to think about it.

A traditional synchronous server (like a basic Flask or Django app without special configuration) handles one request at a time per worker process. When handler code does something slow—a database query, an HTTP call to a third-party API, reading a file—that worker sits idle, waiting. To handle concurrency, you add more workers, which means more memory and more operating system overhead.

An async server handles that wait differently. When your handler hits an await—waiting on a database, waiting on an external HTTP call—it hands control back to the event loop, which can immediately begin processing another request. The original handler resumes when its I/O resolves. One process, many in-flight requests.

The practical upshot: async doesn't make individual requests faster. It makes your server more efficient when requests spend time waiting. The more I/O-bound your API is, the bigger the advantage.
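You can see this with nothing but the standard library. Below, three fake I/O waits of 0.1 seconds each complete in roughly 0.1 seconds total when run concurrently on one event loop, not 0.3; fake_io is an invented stand-in for a database query or external HTTP call:

```python
import asyncio
import time

async def fake_io(name: str) -> str:
    # Stand-in for slow I/O: a database query, an external API call.
    await asyncio.sleep(0.1)
    return name

async def main() -> float:
    start = time.perf_counter()
    # All three "requests" wait concurrently on one event loop.
    await asyncio.gather(fake_io("a"), fake_io("b"), fake_io("c"))
    return time.perf_counter() - start

elapsed = asyncio.run(main())
# elapsed is roughly 0.1s, not 0.3s: the waits overlap.
```

Each await asyncio.sleep hands control back to the event loop, exactly as await request.json() does in a handler.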

You don't need to fully internalize async/await today. What matters is building the habit: write every handler as async def, and put an await in front of any call that does I/O—request.json(), database queries, HTTP client calls.

Inspecting requests

The request object is richer than it looks. Add a few debug prints to your get_tasks handler to see what's available:

Python
async def get_tasks(request):
    print("Method:", request.method)
    print("URL:", request.url)
    print("Headers:", dict(request.headers))

    return JSONResponse(tasks)

Save the file, hit /tasks, and check your terminal. You'll see the full request metadata. The most commonly useful properties are:

  • request.method — GET, POST, etc.
  • request.url — full URL including query string
  • request.headers — all request headers (dict-like)
  • request.query_params — URL query parameters
  • request.path_params — dynamic path segments (e.g. /tasks/{id})
  • await request.json() — parsed JSON body
  • await request.body() — raw bytes body

Remove the print statements when you're done—they're useful for exploration, noisy in production.
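A note on request.query_params: it's essentially the URL's query string parsed into a mapping. The stdlib sketch below shows the same parsing with urllib (the URL and parameter names are invented for illustration; note that parse_qs returns lists of values, whereas Starlette's QueryParams gives you single values with dict-style access):

```python
from urllib.parse import urlsplit, parse_qs

# What ?limit=10&done=true on a request URL becomes once parsed.
url = "http://127.0.0.1:8000/tasks?limit=10&done=true"
query = parse_qs(urlsplit(url).query)
# query == {"limit": ["10"], "done": ["true"]}
```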

The final code

Here's the complete Day 1 file, cleaned up and ready to build on:

Python
from starlette.applications import Starlette
from starlette.responses import JSONResponse
from starlette.routing import Route
from starlette.requests import Request

tasks = [
    {"id": 1, "title": "Learn Starlette"},
    {"id": 2, "title": "Build an API"}
]

async def homepage(request):
    return JSONResponse({"message": "API is running"})

async def get_tasks(request):
    return JSONResponse(tasks)

async def create_task(request: Request):
    data = await request.json()

    new_task = {
        "id": len(tasks) + 1,
        "title": data.get("title")
    }

    tasks.append(new_task)

    return JSONResponse(new_task)

app = Starlette(routes=[
    Route("/", homepage),
    Route("/tasks", get_tasks),
    Route("/tasks", create_task, methods=["POST"])
])

That's under 30 lines. It runs a real HTTP server, handles two URL patterns, serves structured JSON, and accepts incoming POST data. It's also a reasonable foundation for what comes next—because the structure you see here scales well.

What you actually learned

The checklist matters less than the mental model. Here's the version worth holding onto:

  • Starlette is the framework; Uvicorn is the ASGI server that runs it. They're separate by design.
  • Every endpoint is an async function that receives a request and returns a response.
  • Route(path, handler) maps a URL to a function. Add methods=["POST"] to restrict by HTTP method.
  • JSONResponse({...}) serializes a dict to a proper JSON HTTP response.
  • await request.json() reads and parses a JSON request body.
  • The --reload flag on Uvicorn watches your files and restarts the server automatically—leave it on during development.

Day 2 preview — Full CRUD and app structure

On Day 2, the tasks API gets the rest of the operations: GET by ID, update with PUT, delete with DELETE. You'll also break main.py into multiple files the way a production codebase would be organized—because writing everything in one file stops working the moment your app grows past the tutorial stage.

If something didn't work today, the most likely culprits are: virtual environment not activated (run source venv/bin/activate), missing uvicorn install, or a port conflict on 8000 (add --port 8001 to the Uvicorn command to use a different port).

← Back to Building with AI – A Developer's Diary