Python · FastAPI

Strict mypy, Protocol types, and the art of library design

How boundary validation with Pydantic and custom exception hierarchies changed how I build Python libraries.

30 March 2026 · 10 min read

Every Python library starts the same way: a few functions that solve an immediate problem. Then it grows. Then someone else uses it. Then the bugs arrive — not logic bugs, but integration bugs. The function expected a string but got bytes. The config dict was missing a key. The callback returned the wrong type.

This post documents the practices I’ve adopted for building Python libraries that are strict by default and pleasant to use.

Why strict mypy from day one

Running mypy in strict mode from the first commit is non-negotiable. Here’s what pyproject.toml looks like:

[tool.mypy]
strict = true
# strict = true already implies the flags below; listing them keeps the intent explicit
warn_return_any = true
warn_unused_configs = true
disallow_untyped_defs = true
disallow_any_generics = true

This catches entire categories of bugs at the type-checking stage rather than at runtime. The upfront cost is real — you’ll spend time annotating functions that feel obvious. But the payoff compounds over every module you add.
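A concrete (hypothetical) example of the payoff: with disallow_untyped_defs, an unannotated version of this helper fails type checking outright, and once annotated, strict mode rejects the bytes-for-str call from the intro before it ever runs:

```python
# Hypothetical helper, not from the post. Under strict mode, mypy
# rejects the unannotated version ("Function is missing a type
# annotation") and flags parse_price(b"101.25") with
# "Argument 1 has incompatible type 'bytes'; expected 'str'".
def parse_price(raw: str) -> float:
    """Parse a price string like '101.25' into a float."""
    return float(raw.strip())

assert parse_price(" 101.25 ") == 101.25
```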

Protocol-based typing over inheritance

Python’s Protocol class (from typing) enables structural subtyping — duck typing with a safety net:

from collections.abc import AsyncIterator
from typing import Protocol, runtime_checkable

# Order and OrderUpdate are the library's own model types
@runtime_checkable
class OrderSource(Protocol):
    def fetch_orders(self, symbol: str) -> list[Order]: ...
    def stream_updates(self) -> AsyncIterator[OrderUpdate]: ...

Any class that implements fetch_orders and stream_updates with the right signatures satisfies OrderSource — no inheritance required. This keeps your library decoupled from specific implementations while maintaining type safety.
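To make the mechanics concrete, here is a self-contained sketch with a simplified protocol (Order reduced to a plain string, and InMemorySource as a hypothetical implementation, neither from the post):

```python
from typing import Protocol, runtime_checkable

# Simplified stand-in for the OrderSource protocol above; orders are
# plain strings so the example runs on its own.
@runtime_checkable
class OrderSource(Protocol):
    def fetch_orders(self, symbol: str) -> list[str]: ...

class InMemorySource:  # note: no inheritance from OrderSource
    def __init__(self, orders: dict[str, list[str]]) -> None:
        self._orders = orders

    def fetch_orders(self, symbol: str) -> list[str]:
        return self._orders.get(symbol, [])

source = InMemorySource({"ES": ["buy 1 @ 4500"]})
# Structural check at runtime, enabled by @runtime_checkable
assert isinstance(source, OrderSource)
assert source.fetch_orders("ES") == ["buy 1 @ 4500"]
```

Because the check is structural, swapping InMemorySource for a live exchange client requires no changes to code typed against OrderSource.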

Pydantic at every boundary

Internal functions can pass typed dataclasses around. But at every boundary — API endpoints, config loading, external data ingestion — Pydantic validates:

from pydantic import BaseModel, Field, field_validator

class TradeConfig(BaseModel):
    symbol: str
    max_position_size: int = Field(gt=0, le=100)
    stop_ticks: int = Field(gt=0, le=50)
    target_ticks: int = Field(gt=0, le=200)

    @field_validator('symbol')
    @classmethod
    def symbol_must_be_uppercase(cls, v: str) -> str:
        return v.upper()

The validation happens once, at the boundary. Everything downstream receives a validated, typed object. No defensive checking scattered through the codebase.
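For illustration, here is the model above in action; the field values are invented for the example:

```python
from pydantic import BaseModel, Field, ValidationError, field_validator

class TradeConfig(BaseModel):
    symbol: str
    max_position_size: int = Field(gt=0, le=100)
    stop_ticks: int = Field(gt=0, le=50)
    target_ticks: int = Field(gt=0, le=200)

    @field_validator('symbol')
    @classmethod
    def symbol_must_be_uppercase(cls, v: str) -> str:
        return v.upper()

# Valid input is normalized once, at the boundary...
cfg = TradeConfig(symbol="es", max_position_size=2, stop_ticks=8, target_ticks=16)
assert cfg.symbol == "ES"

# ...and invalid input fails loudly instead of leaking downstream.
rejected = False
try:
    TradeConfig(symbol="es", max_position_size=0, stop_ticks=8, target_ticks=16)
except ValidationError:
    rejected = True
assert rejected
```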

Custom exception hierarchies

A well-designed library communicates failure modes through its exception hierarchy:

class TradeForgeError(Exception):
    """Base exception for all TradeForge errors."""

class ConfigurationError(TradeForgeError):
    """Raised when configuration is invalid or missing."""

class ConnectionError(TradeForgeError):  # shadows the builtin ConnectionError; import qualified to avoid ambiguity
    """Raised when a service connection fails."""

class ValidationError(TradeForgeError):
    """Raised when input data fails validation."""

class OrderRejectedError(TradeForgeError):
    """Raised when an order is rejected by the exchange."""
    def __init__(self, order_id: str, reason: str) -> None:
        self.order_id = order_id
        self.reason = reason
        super().__init__(f"Order {order_id} rejected: {reason}")

Users of your library can catch TradeForgeError to handle everything, or catch specific exceptions for granular control. The hierarchy is part of your public API — design it deliberately.
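A sketch of what that looks like from the caller's side, using a trimmed-down copy of the hierarchy and a hypothetical submit_order that always rejects:

```python
class TradeForgeError(Exception):
    """Base exception for all TradeForge errors."""

class OrderRejectedError(TradeForgeError):
    def __init__(self, order_id: str, reason: str) -> None:
        self.order_id = order_id
        self.reason = reason
        super().__init__(f"Order {order_id} rejected: {reason}")

def submit_order(order_id: str) -> None:
    # Stand-in for a real submission path that can fail
    raise OrderRejectedError(order_id, "insufficient margin")

# Granular: handle rejections specifically, using the attached context...
try:
    submit_order("A-17")
except OrderRejectedError as err:
    handled = f"retry later: {err.reason}"

# ...or coarse: one except clause covers every library failure.
try:
    submit_order("A-18")
except TradeForgeError as err:
    message = str(err)

assert handled == "retry later: insufficient margin"
assert message == "Order A-18 rejected: insufficient margin"
```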

The src layout

Every library uses the src layout:

my-library/
├── src/
│   └── my_library/
│       ├── __init__.py
│       ├── core.py
│       └── exceptions.py
├── tests/
│   └── test_core.py
├── pyproject.toml
└── README.md

The src layout prevents accidental imports of the local source tree during testing. It’s a small structural decision that eliminates an entire class of “works on my machine” bugs.
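If the build backend is setuptools, package discovery for this layout is one stanza; hatchling and flit have equivalents. The backend choice here is an assumption for illustration, not the post's actual config:

```toml
# Tell setuptools to discover packages under src/ rather than the repo root
[tool.setuptools.packages.find]
where = ["src"]
```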

Pre-commit hooks as guardrails

The .pre-commit-config.yaml runs on every commit:

  • ruff: Linting and formatting in one pass
  • mypy: Type checking in strict mode
  • pytest: Test suite with 100% coverage requirement

If any hook fails, the commit is rejected. This means the main branch is always in a known-good state. CI confirms what pre-commit already caught.
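A sketch of what such a config might look like. The revs are illustrative and should be pinned to real releases, and the local mypy/pytest hooks assume those tools are installed in the project environment:

```yaml
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.6.0  # illustrative; pin your own
    hooks:
      - id: ruff
      - id: ruff-format
  - repo: local
    hooks:
      - id: mypy
        name: mypy (strict)
        entry: mypy src
        language: system
        pass_filenames: false
      - id: pytest
        name: pytest (coverage gate)
        entry: pytest --cov=my_library --cov-fail-under=100
        language: system
        pass_filenames: false
```

Running mypy and pytest as local hooks over the whole tree (pass_filenames: false) trades commit speed for the guarantee that no partially-checked state reaches the branch.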

The compound effect

None of these practices is revolutionary on its own. Strict typing, boundary validation, structured exceptions, consistent layout, automated checks — they’re all well-known.

The value is in applying them together, consistently, from the start. A library built this way is a library that other developers (and AI agents) can work with confidently. And when your AI coding assistant can trust the type system, its output quality improves dramatically.