Building with AI – A Developer's Diary
Starlette Day 1 — Foundations
Build your first Starlette API from scratch: routing, JSON responses, POST requests, and the async model that makes it all work.
If you've spent time with Flask, you already have most of the mental model you need. Routes map URLs to functions. Functions receive a request and return a response. That core loop is the same in Starlette. What changes is the foundation underneath—and the foundation changes everything.
Starlette is an ASGI framework. That acronym matters, but you don't need to fully understand it on day one. What you need to understand is the practical consequence: every endpoint you write is async, which means your server can handle many requests concurrently without spawning threads. For a backend that talks to databases, calls external APIs, or waits on slow I/O, that's a meaningful advantage.
This is Day 1 of a multi-day series. By the end of today you'll have a running local server, two working API endpoints, and a POST handler that reads incoming JSON. You'll also understand why the code is structured the way it is, not just that it works.
The mental model
Before writing any code, it helps to know exactly what two things you're working with:
- Starlette is your application framework. It defines routes, receives requests, runs your handler functions, and sends responses.
- Uvicorn is the server that runs your app. It listens on a port, accepts connections, and speaks the ASGI protocol that Starlette understands.
You could swap Uvicorn for another ASGI server and the same Starlette app would run fine. The two are deliberately separate—framework and server are not the same thing. Most tutorials blur this line; keeping it clear will save you confusion later.
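To make the separation concrete, here's a sketch of what Starlette hides from you: a bare ASGI application is just an async callable that speaks the protocol Uvicorn expects. You'd never build an API this way, but Uvicorn can run it directly (`uvicorn raw_asgi:app`, assuming you save it as `raw_asgi.py` — the filename is my choice, not part of the tutorial):

```python
# A minimal framework-free ASGI app: an async callable that receives a
# connection scope plus receive/send channels. This is the protocol
# Starlette implements on your behalf.
async def app(scope, receive, send):
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({
        "type": "http.response.body",
        "body": b"Hello from raw ASGI",
    })
```

Everything Starlette does — routing, request parsing, response objects — ultimately compiles down to calls like these.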
Setup
This takes about five minutes. Start with a fresh project folder and a virtual environment—it keeps your dependencies isolated and makes the project portable.
```shell
mkdir starlette-day1
cd starlette-day1
python3 -m venv venv
source venv/bin/activate   # Mac / Linux
# venv\Scripts\activate    # Windows
```
With the environment active, install the two packages you need:
```shell
pip install starlette uvicorn
```
Then create your app file:
```shell
touch main.py
```
Your first Starlette app
Open main.py and add this:
```python
from starlette.applications import Starlette
from starlette.responses import JSONResponse
from starlette.routing import Route

async def homepage(request):
    return JSONResponse({"message": "Hello, Starlette!"})

app = Starlette(routes=[
    Route("/", homepage)
])
```
Run the server:
```shell
uvicorn main:app --reload
```
You'll see output like Uvicorn running on http://127.0.0.1:8000. Open that URL in a browser and you'll get back:
{"message": "Hello, Starlette!"}
That's a working API. Let's break down exactly what you just wrote.
What each line does
async def homepage(request): — This is your endpoint function. It's written as async def because Starlette is an ASGI framework and handlers are coroutines by default. (Starlette will also accept a plain def handler and run it in a threadpool, but async is the idiomatic choice.) The request argument contains everything about the incoming HTTP request: method, headers, body, query parameters, path parameters.
return JSONResponse({...}) — This serializes your dictionary to JSON and returns a proper HTTP response with the right Content-Type header set automatically.
Route("/", homepage) — This maps the URL path / to the homepage function. When a request comes in for /, Starlette finds this route and calls the function.
Adding a second endpoint
A single route isn't an API. Let's simulate a real backend by adding a /tasks endpoint that returns structured data.
Replace your main.py with this:
```python
from starlette.applications import Starlette
from starlette.responses import JSONResponse
from starlette.routing import Route

# In-memory task list (we'll connect a real database in a later day)
tasks = [
    {"id": 1, "title": "Learn Starlette"},
    {"id": 2, "title": "Build an API"}
]

async def homepage(request):
    return JSONResponse({"message": "API is running"})

async def get_tasks(request):
    return JSONResponse(tasks)

app = Starlette(routes=[
    Route("/", homepage),
    Route("/tasks", get_tasks)
])
```
Because you started Uvicorn with --reload, the server restarts automatically when you save. Hit http://127.0.0.1:8000/tasks and you'll see:
```json
[
    {"id": 1, "title": "Learn Starlette"},
    {"id": 2, "title": "Build an API"}
]
```
This is exactly how a mobile or web frontend talks to a backend: it makes an HTTP request to a URL and gets back JSON. The tasks list is in memory for now, but the shape of the API is real and already useful for front-end development.
Handling incoming data — POST requests
Reading data is step one. Step two is accepting it. Let's add a POST endpoint that creates a new task from a JSON body.
Add this import at the top:
```python
from starlette.requests import Request
```
Then add the handler function:
```python
async def create_task(request: Request):
    data = await request.json()
    new_task = {
        "id": len(tasks) + 1,
        "title": data.get("title")
    }
    tasks.append(new_task)
    return JSONResponse(new_task)
```
And register it as a route — notice the explicit methods argument:
```python
app = Starlette(routes=[
    Route("/", homepage),
    Route("/tasks", get_tasks),
    Route("/tasks", create_task, methods=["POST"])
])
```
Starlette allows multiple routes on the same path as long as the HTTP methods are different. The GET and POST for /tasks are handled separately, which is standard REST convention.
Test the POST endpoint with curl:
```shell
curl -X POST http://127.0.0.1:8000/tasks \
  -H "Content-Type: application/json" \
  -d '{"title": "New Task"}'
```
You should get back:
JSON{"id": 3, "title": "New Task"}
Now hit /tasks in your browser and you'll see all three items—the two you started with and the one you just created. The list persists for the lifetime of the server process. On Day 3, we'll replace this with a real database so data survives restarts.
What await request.json() is doing
The await keyword is the practical face of async Python. Reading a request body is an I/O operation—the data arrives over a network connection. Rather than blocking the entire server process until that read completes, await suspends this coroutine and lets the event loop handle other requests in the meantime. When the read finishes, execution resumes exactly where it left off.
For a tutorial app with one user, this distinction is invisible. For a production API handling hundreds of concurrent connections, it's the difference between a server that scales and one that doesn't.
Understanding async — the core advantage
Every endpoint we've written is async. This is not optional ceremony; it's the design Starlette is built around. Here's the most practical way to think about it.
A traditional synchronous server (like a basic Flask or Django app without special configuration) handles one request at a time per worker process. When handler code does something slow—a database query, an HTTP call to a third-party API, reading a file—that worker sits idle, waiting. To handle concurrency, you add more workers, which means more memory and more operating system overhead.
An async server handles that wait differently. When your handler hits an await—waiting on a database, waiting on an external HTTP call—it hands control back to the event loop, which can immediately begin processing another request. The original handler resumes when its I/O resolves. One process, many in-flight requests.
You don't need to fully internalize async/await today. What matters is building the habit: every Starlette handler is async def, and any call that does I/O—request.json(), database queries, HTTP client calls—gets an await in front of it.
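You can see the payoff of this model in plain asyncio, without a web server involved. The sketch below fakes three slow I/O calls of 0.1 seconds each; because each one awaits, the event loop overlaps them, and the total wall time stays close to 0.1s rather than the 0.3s a sequential server would spend:

```python
import asyncio
import time

async def fake_io(name, delay):
    # Stands in for a database query or an external HTTP call
    await asyncio.sleep(delay)
    return name

async def main():
    start = time.perf_counter()
    # Three concurrent "requests" -- the event loop interleaves the waits
    results = await asyncio.gather(
        fake_io("a", 0.1),
        fake_io("b", 0.1),
        fake_io("c", 0.1),
    )
    return results, time.perf_counter() - start

results, elapsed = asyncio.run(main())
```

This is exactly what happens inside Uvicorn when three requests arrive at once and each handler awaits some I/O.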
Inspecting requests
The request object is richer than it looks. Add a few debug prints to your get_tasks handler to see what's available:
```python
async def get_tasks(request):
    print("Method:", request.method)
    print("URL:", request.url)
    print("Headers:", dict(request.headers))
    return JSONResponse(tasks)
```
Save the file, hit /tasks, and check your terminal. You'll see the full request metadata. The most commonly useful properties are:
- `request.method` — GET, POST, etc.
- `request.url` — full URL including query string
- `request.headers` — all request headers (dict-like)
- `request.query_params` — URL query parameters
- `request.path_params` — dynamic path segments (e.g. `/tasks/{id}`)
- `await request.json()` — parsed JSON body
- `await request.body()` — raw bytes body
Remove the print statements when you're done—they're useful for exploration, noisy in production.
The final code
Here's the complete Day 1 file, cleaned up and ready to build on:
```python
from starlette.applications import Starlette
from starlette.responses import JSONResponse
from starlette.routing import Route
from starlette.requests import Request

tasks = [
    {"id": 1, "title": "Learn Starlette"},
    {"id": 2, "title": "Build an API"}
]

async def homepage(request):
    return JSONResponse({"message": "API is running"})

async def get_tasks(request):
    return JSONResponse(tasks)

async def create_task(request: Request):
    data = await request.json()
    new_task = {
        "id": len(tasks) + 1,
        "title": data.get("title")
    }
    tasks.append(new_task)
    return JSONResponse(new_task)

app = Starlette(routes=[
    Route("/", homepage),
    Route("/tasks", get_tasks),
    Route("/tasks", create_task, methods=["POST"])
])
```
That's under 30 lines. It runs a real HTTP server, handles two URL patterns, serves structured JSON, and accepts incoming POST data. It's also a reasonable foundation for what comes next—because the structure you see here scales well.
What you actually learned
The checklist matters less than the mental model. Here's the version worth holding onto:
- Starlette is the framework; Uvicorn is the ASGI server that runs it. They're separate by design.
- Every endpoint is an `async` function that receives a `request` and returns a response.
- `Route(path, handler)` maps a URL to a function. Add `methods=["POST"]` to restrict by HTTP method.
- `JSONResponse({...})` serializes a dict to a proper JSON HTTP response.
- `await request.json()` reads and parses a JSON request body.
- The `--reload` flag on Uvicorn watches your files and restarts the server automatically; leave it on during development.