You have TLS everywhere, a reverse proxy that actually behaves, and yet access control still feels like duct tape. Every login form brings a new YAML file. Every internal service has its own “temporary” password rule that stuck around for three years. That is where pairing Caddy with OneLogin makes sense.
Caddy runs your web edge. It automates certificates, routes, and middleware with an ease Nginx never quite matched. OneLogin anchors identity. It gives you single sign‑on over SAML or OIDC, mapping users to roles so you decide who gets through long before a request hits your app. Bring them together, and you get a secure, self‑updating front door that knows who is walking in.
Here is the logic flow. A user hits Caddy. The proxy checks the request against a OneLogin token or session. If it is valid, Caddy passes traffic upstream with the right headers or JWT claims. If not, it redirects to the OneLogin auth page. OneLogin handles multifactor, user state, and group membership. Caddy enforces what you decide counts as “authorized.” The result feels like one continuous system, not two tools bolted together.
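That flow can be sketched in a Caddyfile. This is illustrative only and assumes the third-party caddy-security plugin is compiled in; the hostname `app.internal`, the policy name `admins_policy`, the upstream port, and the header names are placeholders, and the `{http.auth.user.*}` placeholders depend on the plugin version:

```caddyfile
# Global options: wire the auth handlers into Caddy's middleware chain
# so authorization runs before requests are proxied upstream.
{
	order authenticate before respond
	order authorize before reverse_proxy
}

app.internal {
	# Every request must carry a valid session or token; if not,
	# the plugin redirects the user to the OneLogin login flow.
	authorize with admins_policy

	# Pass identity to the upstream app as headers derived from JWT claims.
	reverse_proxy localhost:8080 {
		header_up X-Auth-User-Email {http.auth.user.email}
		header_up X-Auth-User-Roles {http.auth.user.roles}
	}
}
```

The upstream app never sees a password: it trusts the headers Caddy injects, which only exist after the token check passes.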
The cleanest setups use OIDC. Let OneLogin issue short‑lived tokens, and teach Caddy to validate them with an identity-aware plugin such as caddy-security (Caddy does not ship identity middleware in its standard build). Keep your user groups in OneLogin so DevOps never has to edit a proxy config just to add someone new. Rotate secrets often, store them in environment variables, and avoid hard‑coding anything. Once this is live, access management mostly runs itself.
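The OIDC side might look like the sketch below, again assuming the caddy-security plugin. The tenant subdomain, scope list, portal and policy names, and the 15-minute token lifetime are all assumptions to adapt; the client ID and secret come from the environment rather than the config file:

```caddyfile
{
	security {
		# OneLogin registered as a generic OIDC provider; replace the
		# subdomain with your tenant's actual discovery URL.
		oauth identity provider onelogin {
			driver generic
			client_id {env.ONELOGIN_CLIENT_ID}
			client_secret {env.ONELOGIN_CLIENT_SECRET}
			scopes openid email profile groups
			metadata_url https://yourtenant.onelogin.com/oidc/2/.well-known/openid-configuration
		}

		authentication portal onelogin_portal {
			enable identity provider onelogin
			# Short-lived sessions: 900 seconds, then re-validate.
			crypto default token lifetime 900
		}

		authorization policy admins_policy {
			set auth url https://auth.app.internal/
			# Only users whose OneLogin groups map to this role get through.
			allow roles admins
		}
	}
}
```

Note the division of labor: group membership lives in OneLogin, and the proxy config only names the role it accepts, so adding a person never touches the Caddyfile.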
Common pitfalls? Token expiration mismatches, missing callback URLs, and confusion over which roles map to which routes. The trick is to start small: protect one internal admin path, watch the flow, then expand. Each win builds muscle memory.
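Starting small could look like this: only the admin path is gated, so a misconfigured callback URL or token lifetime breaks one route instead of the whole site. The path, policy name, and upstream are placeholders, and `admins_policy` is assumed to be defined in a security block like the one above:

```caddyfile
app.internal {
	# Gate only the admin path first; watch the redirect and token
	# flow here before expanding coverage.
	route /admin/* {
		authorize with admins_policy
	}

	# Everything else stays open while you build confidence.
	reverse_proxy localhost:8080
}
```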