Polylith Architecture
A component-first architecture where your code is organized into small, reusable bricks that can be composed into any number of deployable projects — all from a single codebase.
The problem
- Monoliths become unwieldy as they grow
- Microservices duplicate code across repos
- Shared libraries need versioning & publishing
- Refactoring across service boundaries is painful
The polylith answer
- One repo, many deployables
- Share code by reference, not by publishing
- Reuse components across any number of projects
- Refactor globally with full IDE support
Think in bricks
Imagine a bucket of LEGO. You don't decide up front which model each brick belongs to — you just make well-shaped bricks with consistent studs. Then you assemble them into whatever model you need. Polylith treats code the same way: make well-shaped bricks, and snap them together into different projects.
Workspace layout
A polylith workspace has exactly four top-level folders. Each has a distinct role:
The four roles at a glance
- bases/: entry points (HTTP, CLI, events); thin wiring, no business logic
- components/: reusable bricks of business logic
- projects/: deployable artifacts; pure configuration, no application code
- development/: a single environment containing every brick, for a monolith-like developer experience
Why it works
🧱 One codebase
All bricks live in a single monorepo. No internal packages to publish or version.
🔄 Reusability
Drop the same component into any project just by referencing it in that project's config.
⚡ Incremental testing
The poly tool tracks what changed and tests only
affected bricks.
🎯 Clear boundaries
Each brick exposes its public API through
__init__.py. Prefix names with _ to
keep them private.
🧩 Composable
Build a new deployable by picking a base and the components you need. That's it.
🔍 Global refactors
Rename a function and IDE finds every usage across every project at once.
Next step: click into bases/,
components/, or projects/ in the
sidebar to drill into each concept. When you're ready to play,
hit Build a Project.
Bases
Entry Points
A base is the bridge between the outside world and your business logic. HTTP requests, CLI invocations, Lambda events, message queue consumers — they all land in a base.
The crucial rule: bases contain no business logic. They parse the incoming request, call the right component(s), and format a response. If you find yourself checking user roles or calculating a discount inside a base, that code belongs in a component.
What a base looks like
```python
# bases/my_ns/api/src/my_ns/api/core.py
from fastapi import APIRouter, HTTPException
from my_ns.user import register_user, get_user
from my_ns.email import send_welcome

router = APIRouter()

@router.post("/users")
async def create_user(payload: dict):
    # 1. delegate to component — no logic here
    user = register_user(payload)
    # 2. orchestrate — still no logic
    send_welcome(user.email)
    return {"id": user.id}
```
Dependency direction
Bases depend on components. Components never depend on bases. This one-way flow is what makes components portable across projects.
Anti-pattern: importing from a base inside a
component. If your user component imports from
my_ns.api, the component is no longer reusable in a
CLI or worker project. Components must stay base-agnostic.
Tip: a base is often just a thin file of route definitions or command handlers, typically fewer than 100 lines. If a base grows large, suspect logic leakage.
bases/api/
REST API Base
A FastAPI-powered HTTP base. Exposes JSON endpoints and delegates every route to a component. It handles the protocol concerns — request parsing, status codes, OpenAPI docs — and nothing else.
Folder layout
```
bases/my_ns/api/
├── src/my_ns/api/
│   ├── __init__.py   # public surface (usually empty)
│   ├── core.py       # route wiring
│   └── schemas.py    # pydantic request/response models

# Tests live at the workspace root (loose theme, recommended):
test/bases/my_ns/api/
└── test_core.py      # integration tests w/ TestClient
```
Example route
```python
from fastapi import APIRouter
from my_ns.user import register_user
from .schemas import NewUser, UserOut

router = APIRouter(prefix="/users")

@router.post("", response_model=UserOut, status_code=201)
def create(payload: NewUser) -> UserOut:
    user = register_user(payload.model_dump())
    return UserOut(id=user.id, email=user.email)
```
Keep schemas (pydantic models) in the base — they describe the wire format. Keep domain models (user, order) in components — they describe your domain.
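To make the split concrete, here is a minimal, dependency-free sketch. The names (`User`, `to_wire`, `from_wire`) are hypothetical, and the wire side is written as plain functions so the example needs no pydantic; in a real base you would use pydantic models.

```python
from dataclasses import dataclass

# Domain model: lives in the user component and describes the domain.
# It knows nothing about JSON field names or HTTP.
@dataclass
class User:
    id: int
    email: str

# Wire-format helpers: live in the base and describe the payload shape.
def to_wire(user: User) -> dict:
    return {"id": user.id, "email": user.email}

def from_wire(payload: dict) -> User:
    return User(id=payload["id"], email=payload["email"])
```

Renaming a JSON field now touches only the base; the domain model, and every component that uses it, stays put.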
bases/cli/
CLI Base
A command-line entry point, typically built on Click or Typer. Great for admin tools, one-off scripts, and maintenance tasks that share logic with your web API.
Folder layout
```
bases/my_ns/cli/
├── src/my_ns/cli/
│   ├── __init__.py
│   └── core.py   # command definitions
└── tests/
```
Example commands
```python
import click
from my_ns.user import register_user, list_users
from my_ns.database import run_migrations

@click.group()
def cli():
    """Admin CLI for the platform."""

@cli.command()
@click.argument("email")
def create_user(email: str):
    user = register_user({"email": email})
    click.echo(f"created {user.id}")

@cli.command()
def migrate():
    run_migrations()
    click.echo("done")
```
Notice the CLI reuses the exact same
register_user component as the REST API. That's the
payoff — no duplicated user-creation logic.
bases/lambda/
AWS Lambda Base
An AWS Lambda (or any serverless function) entry point. The base translates the event payload into a component call and returns the serialized result.
Handler example
```python
# bases/my_ns/lambda_handler/src/my_ns/lambda_handler/core.py
import json
from my_ns.email import send_email

def handler(event: dict, context) -> dict:
    # 1. parse the event
    payload = json.loads(event["body"])
    # 2. call the component (send_email takes to, subject, body)
    send_email(to=payload["to"], subject=payload["subject"], body=payload["body"])
    # 3. format the response
    return {"statusCode": 200, "body": json.dumps({"ok": True})}
```
Serverless functions are a natural fit for polylith: each deployable Lambda is its own project, sharing components with the main API. You get fast cold-starts because each project only bundles the bricks it uses.
Components
Building Blocks
Components are the bricks that contain your actual business logic. They're small, focused, and reusable across any number of projects.
The discipline: each component does
one thing well and exposes a minimal public
interface through its __init__.py. Everything else
inside the component is private implementation detail that callers
must never import.
Anatomy of a component
```
components/my_ns/user/
├── src/my_ns/user/
│   ├── __init__.py      # PUBLIC: only what callers may use
│   │                    #   from .core import register_user, get_user
│   ├── core.py          # PRIVATE: orchestration, validation
│   ├── models.py        # PRIVATE: dataclasses, SQL models
│   └── _repository.py   # PRIVATE: persistence details

# Tests live at the workspace root (loose theme, recommended):
test/components/my_ns/user/
└── test_user.py
```
The public/private discipline
```python
# components/my_ns/user/src/my_ns/user/__init__.py
from .core import register_user, get_user, authenticate

# That's it. Callers import only these names:
#   from my_ns.user import register_user
# They MUST NOT reach into my_ns.user.core or .models.
#
# Pro tip: prefix a name with _ to keep it out of the interface:
#   from .core import _internal_helper  # polylith ignores this
```
Same components, different projects
This is where the architecture pays dividends. A single
user component is consumed by every project that needs
user logic — no duplication, no sync problems.
Golden rule: components may depend on other components, but never on a base. This one-way dependency is what keeps each component portable.
Available components in this workspace
👤 user
Registration, profile management, and authentication logic.
📧 email
Composing and sending notification emails via a provider.
💾 database
Connection pool, session management, and migrations.
💳 payments
Payment gateway integration, charges, and invoicing.
components/user/
Domain Logic
Owns everything about user identity: registration, profile updates, and authentication. Other components and bases interact with users exclusively through this brick.
Public interface
```python
# components/my_ns/user/src/my_ns/user/__init__.py
from .core import (
    register_user,
    get_user,
    update_profile,
    authenticate,
)
```
Implementation sketch
```python
# components/my_ns/user/src/my_ns/user/core.py
from my_ns.database import session_scope
from .models import User
from ._repository import insert_user, find_by_email

def register_user(data: dict) -> User:
    if not data.get("email"):
        raise ValueError("email required")
    with session_scope() as s:
        return insert_user(s, User(email=data["email"]))

def authenticate(email: str, password: str) -> User | None:
    with session_scope() as s:
        user = find_by_email(s, email)
        return user if user and user.verify(password) else None
```
Who uses it?
Every project in this workspace: api_service (for
REST), worker (for background jobs touching user
state), and admin_cli (for ops tasks).
components/email/
Infrastructure
Wraps an external email provider (SendGrid, Mailgun, SES) behind a
stable interface. Consumers call send_email without
caring which provider is behind it — swap providers by changing one
file.
Public interface
```python
# components/my_ns/email/src/my_ns/email/__init__.py
from .core import send_email, send_welcome, send_password_reset
```
Implementation
```python
# components/my_ns/email/src/my_ns/email/core.py
from ._provider import get_provider
from .templates import render

def send_email(to: str, subject: str, body: str) -> None:
    get_provider().deliver(to=to, subject=subject, body=body)

def send_welcome(email: str, name: str = "") -> None:
    html = render("welcome.html", name=name)
    send_email(to=email, subject="Welcome!", body=html)
```
Notice this component has zero knowledge of HTTP routes, CLI flags, or Lambda events. It only knows about emails. That's what makes it deployable from any base.
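The swap-providers-in-one-file claim can be sketched like so. This is a hedged illustration, not the real module: the provider classes and the `EMAIL_PROVIDER` environment variable are assumptions standing in for whatever `_provider.py` actually does.

```python
import os
from typing import Protocol

class Provider(Protocol):
    def deliver(self, to: str, subject: str, body: str) -> None: ...

class ConsoleProvider:
    """Local-development fallback: print instead of sending."""
    def deliver(self, to: str, subject: str, body: str) -> None:
        print(f"[email] to={to!r} subject={subject!r}")

class SendGridProvider:
    """Placeholder for a real HTTP integration."""
    def deliver(self, to: str, subject: str, body: str) -> None:
        raise NotImplementedError("call the SendGrid API here")

_PROVIDERS = {"console": ConsoleProvider, "sendgrid": SendGridProvider}

def get_provider() -> Provider:
    # Swapping providers is a one-line config change; callers never move.
    return _PROVIDERS[os.environ.get("EMAIL_PROVIDER", "console")]()
```

Because every caller goes through `get_provider()`, nothing outside this one private module knows which vendor is in play.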
components/database/
Infrastructure
Owns the database connection pool, session lifecycle, and
migrations. Other components depend on
my_ns.database to get a session — they don't create
their own engines.
Public interface
```python
# components/my_ns/database/src/my_ns/database/__init__.py
from .core import session_scope, init_engine, run_migrations
```
Session helper
```python
from contextlib import contextmanager
from sqlalchemy.orm import sessionmaker
from ._engine import engine

SessionLocal = sessionmaker(bind=engine, autoflush=False)

@contextmanager
def session_scope():
    """Provide a transactional scope around a series of operations."""
    session = SessionLocal()
    try:
        yield session
        session.commit()
    except Exception:
        session.rollback()
        raise
    finally:
        session.close()
```
Centralizing database access means one place to add connection
pooling, read replicas, or observability. The
user and payments components
automatically benefit.
components/payments/
Domain Logic
Handles charges, refunds, and invoicing through a payment gateway
(Stripe, etc). Pairs with user for customer IDs and
email for receipts.
Public interface
```python
# components/my_ns/payments/src/my_ns/payments/__init__.py
from .core import charge, refund, create_invoice
```
Orchestration example
```python
from my_ns.user import get_user
from my_ns.email import send_email
from ._gateway import stripe_charge

def charge(user_id: int, amount_cents: int) -> str:
    user = get_user(user_id)
    tx = stripe_charge(customer=user.stripe_id, amount=amount_cents)
    send_email(user.email, "Receipt", f"Charged ${amount_cents/100:.2f}")
    return tx.id
```
Components calling components is
perfectly fine and encouraged — just keep the
graph acyclic. Here payments → user and
payments → email, but user and
email never call back into payments.
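The acyclicity rule can be sketched in a few lines of graph code. This is a toy stand-in for what a dependency check conceptually verifies; the dict-of-lists model is my assumption, not the poly tool's internals.

```python
def has_cycle(deps: dict[str, list[str]]) -> bool:
    """Depth-first search with a 'currently visiting' set to spot back edges."""
    visiting: set[str] = set()
    done: set[str] = set()

    def visit(node: str) -> bool:
        if node in done:
            return False
        if node in visiting:
            return True  # back edge: we looped around to an ancestor
        visiting.add(node)
        if any(visit(d) for d in deps.get(node, [])):
            return True
        visiting.remove(node)
        done.add(node)
        return False

    return any(visit(n) for n in deps)

# payments → user and payments → email is fine: no one calls back.
print(has_cycle({"payments": ["user", "email"], "user": [], "email": []}))  # False
```

The moment `email` were to import from `payments`, the back edge would show up here, and in `poly check`.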
Projects
Deployables
A project is a deployable artifact. Each project picks exactly one base and the components it needs — nothing more, nothing less.
Projects themselves contain no application code.
They are pure configuration: a pyproject.toml that
declares which bricks to include, plus deployment glue like
Dockerfiles, environment files, and CI scripts.
What a project looks like
```
projects/api_service/
├── pyproject.toml   # declares which bricks are included
├── Dockerfile       # how to build the image
├── .env.example     # environment variables
└── README.md        # deployment notes
```
Project configuration
```toml
# projects/api_service/pyproject.toml
[project]
name = "api_service"
version = "1.0.0"

[tool.polylith.bricks]
# ONE base:
"../../bases/my_ns/api" = "my_ns/api"
# N components — pick what this deployable needs:
"../../components/my_ns/user" = "my_ns/user"
"../../components/my_ns/email" = "my_ns/email"
"../../components/my_ns/database" = "my_ns/database"
"../../components/my_ns/payments" = "my_ns/payments"
```
That's it. When you build this project, the poly tool
copies just these bricks into the wheel / Docker image. Other
projects get only the bricks they need.
Rule of thumb: exactly one base per project. If you find yourself wanting two, that's a signal to create a second project that shares the same components.
Try Build a Project to experiment with picking bases and components and see the generated config.
projects/api_service/
REST API
The primary deployable — a FastAPI service behind a load balancer.
Combines the api base with the full set of domain
components.
Brick selection
```toml
# projects/api_service/pyproject.toml
[tool.polylith.bricks]
"../../bases/my_ns/api" = "my_ns/api"  # base
"../../components/my_ns/user" = "my_ns/user"
"../../components/my_ns/email" = "my_ns/email"
"../../components/my_ns/database" = "my_ns/database"
"../../components/my_ns/payments" = "my_ns/payments"
```
What's inside the project folder?
```
projects/api_service/
├── pyproject.toml   # brick references + dependencies
├── Dockerfile       # how to build the image
├── README.md        # deployment notes
└── config/          # environment-specific settings
    └── settings.py

# Notice: no Python app code here. The entry point is the `api` base.
```
Run it locally
From the workspace root, using the development virtual environment:
```shell
$ uv run uvicorn my_ns.api.core:app --reload

# Or with the FastAPI CLI:
$ uv run fastapi dev bases/my_ns/api/core.py
```
The project folder has zero application code.
The FastAPI app lives in the api base. When you
build the project wheel, the build hook pulls in only the
referenced bricks.
projects/worker/
Background Worker
A long-running process that consumes a message queue and processes async jobs (send emails in bulk, reconcile payments, rebuild caches). Reuses the user/email/database components from the API with zero duplication.
Brick selection
```toml
# projects/worker/pyproject.toml
[tool.polylith.bricks]
"../../bases/my_ns/worker" = "my_ns/worker"  # queue consumer base
"../../components/my_ns/user" = "my_ns/user"
"../../components/my_ns/email" = "my_ns/email"
"../../components/my_ns/database" = "my_ns/database"
# note: no `payments` — this worker doesn't need it
```
What's inside the project folder?
```
projects/worker/
├── pyproject.toml   # brick references + dependencies
├── Dockerfile       # image for the worker container
└── README.md        # ops runbook

# The queue-consumer logic lives in the `worker` base — not here.
```
Run it locally
```shell
$ uv run python -m my_ns.worker.core

# Or however the worker base exposes its entry point:
$ uv run python -c "from my_ns.worker.core import run; run()"
```
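The worker base itself isn't shown in this tour, so here is a hedged sketch of what one might look like. The module path, `make_handler`, `drain`, and the in-memory queue are all illustrative assumptions; the point is that the base does the same parse-then-delegate dance as the API and Lambda bases.

```python
# Hypothetical sketch of bases/my_ns/worker/src/my_ns/worker/core.py.
import json
import queue

def make_handler(send_welcome):
    """Wire a component function in; the base adds no logic of its own."""
    def handle(raw: str) -> None:
        msg = json.loads(raw)       # 1. parse the message (protocol concern)
        send_welcome(msg["email"])  # 2. delegate to the email component
    return handle

def drain(q: "queue.Queue[str]", handle) -> int:
    """Consume until empty; a real worker would block on SQS/RabbitMQ."""
    n = 0
    while not q.empty():
        handle(q.get())
        n += 1
    return n
```

Swap the in-memory queue for a broker client and the structure stays identical: events in, component calls out.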
Reuse win: fixing a bug in
send_welcome (in the email component)
fixes it for every caller — API, worker, CLI — in a single
commit. The project just bundles the same bricks differently.
projects/admin_cli/
Admin Tooling
A lean command-line tool for operators: create users, run migrations, trigger one-off maintenance tasks. It pulls only the components it actually needs, so it starts instantly.
Brick selection
```toml
# projects/admin_cli/pyproject.toml
[tool.polylith.bricks]
"../../bases/my_ns/cli" = "my_ns/cli"
"../../components/my_ns/user" = "my_ns/user"
"../../components/my_ns/database" = "my_ns/database"
# no email, no payments — kept minimal
```
What's inside the project folder?
```
projects/admin_cli/
├── pyproject.toml   # brick references + dependencies
└── README.md        # usage notes

# The CLI commands live in the `cli` base — not here.
```
Run it locally
```shell
# During development (from workspace root):
$ uv run python -m my_ns.cli.core --help

# After building & installing the project wheel:
$ pip install ./projects/admin_cli
$ my-admin --help
$ my-admin create-user alice@example.com
$ my-admin migrate
```
Development
Monolithic Experience
While projects are sliced deployables, the
development/ folder pulls in
every brick into a single virtual environment — so
you can navigate, refactor, debug, and test across the whole
codebase as if it were a monolith.
Workspace config
```toml
# pyproject.toml at the workspace root
[tool.polylith]
namespace = "my_ns"

[tool.hatch.build]
dev-mode-dirs = ["components", "bases", "development", "."]
```
Setting up a new workspace
```shell
# Create the workspace structure (uses the recommended "loose" theme)
$ uv run poly create workspace --name my_ns --theme loose

# The tool scaffolds:
#   bases/  components/  development/  projects/  test/
#   pyproject.toml  workspace.toml
```
The poly command
Run via your package manager (shown with uv; use
hatch run poly, pdm run poly,
poetry poly, or rye run poly as needed):
```shell
$ uv run poly info       # overview: bases, components, projects
$ uv run poly check      # verify brick dependencies & interfaces
$ uv run poly diff       # list bricks changed since last tag
$ uv run poly test diff  # show affected bricks after changes
$ uv run poly deps       # render dependency graph between bricks
$ uv run poly sync       # keep projects in sync with imported bricks
```
Why development/ is the secret weapon
🔍 Global navigation
Jump to definition works across every brick — just like a regular monolith.
🐛 Unified debugging
Set breakpoints in base, component, or test code — all in one debugger session.
♻️ Atomic refactors
Rename a component's public function and your IDE updates every caller in every project.
⚡ Incremental tests
poly test runs only what changed, giving you
microservice-speed feedback on a single repo.
🧪 Real integration
Because every brick is importable, integration tests cross component boundaries freely.
📈 Shape-of-the-system
poly info prints the current architecture — great
for onboarding & reviews.
The development/ folder isn't a deployable. It's an
editor convenience — a single virtual env where IDEs see every
file, so tooling like autocomplete, go-to-definition, and
rename-across-workspace just works.
Golden Rules
Principles
Follow these and Polylith stays clean. Violate them and you're back to spaghetti.
- One base per project: a project exposes exactly one entry point. If you want a second, make it a second project that shares the same components.
- Components never import from bases: the dependency is one-way, base → component. The moment a component imports from a base, reusability dies.
- Bases contain no business logic: parse the request, call a component, return a response. That's the full job description of a base.
- Each brick does one thing: if you can't describe a component's purpose in one sentence, split it.
- Public interface = __init__.py: callers import only the names exported there. Internal modules are private — don't let consumers reach in.
- Components may call components: keep the dependency graph acyclic. Linting / poly check will flag cycles.
- Projects are configs, not code: a project contains no application code — only pyproject.toml and deployment glue (Dockerfile, CI scripts, env files). Real code lives in bases and components.
- Test bricks in isolation: each component and base ships its own tests. Integration tests live in development/ or in the project folder.
- Prefer pure functions: components that expose pure functions (data in, data out) compose most easily and test fastest.
- Refactor fearlessly: the development/ virtual env gives you a single-codebase view. Use it — global renames and moves are a superpower.
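The "prefer pure functions" point fits in three lines. This is a toy example; discount rules are not part of this workspace.

```python
def apply_discount(amount_cents: int, percent: int) -> int:
    """Pure: no session, no I/O, no globals — data in, data out."""
    return amount_cents - (amount_cents * percent) // 100
```

A function like this drops into any project unchanged, and its tests need no database or HTTP fixtures.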
Build a Project
Interactive
Pick a base and some components — see the project configuration that Polylith would generate. No wrong answers: play until the mental model clicks.
What you just did
You composed a deployable out of existing bricks — no new code, no
copy-paste. In a real polylith workspace, running
uv run poly create project --name my_thing scaffolds
exactly this. Need a second deployable with a different shape? Make
another project, pick different bricks. The components don't know
(and don't care).
Try: build an api project with
user + database, then a cli project
with user + database, then add
email to just the API. The
user component serves all of them from one source
of truth.