You start a new microservice. It runs fast on localhost, but you need a real web server that can handle requests, manage sessions, and not turn into a memory-eating monster under load. Enter Eclipse Jetty, the small, sharp, and very reliable web server hiding behind many of the apps you already use.
Jetty began as a lightweight servlet container, but it evolved into much more. It’s now a full HTTP server and client library designed for modern, modular Java applications. Unlike heavier stacks, Jetty thrives in embedded setups. You can drop it inside your app like any other dependency and spin it up programmatically. No external container, no deployment gymnastics. For developers who want control without the drama, Jetty feels like a precision tool rather than a platform.
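Here is what that embedded setup looks like in practice. This is a minimal sketch, assuming Jetty 11 (the `org.eclipse.jetty:jetty-server` artifact) is on the classpath; the class name and greeting text are illustrative.

```java
// Minimal embedded Jetty server: no external container, no WAR deployment.
import java.io.IOException;

import org.eclipse.jetty.server.Request;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.handler.AbstractHandler;

import jakarta.servlet.http.HttpServletRequest;
import jakarta.servlet.http.HttpServletResponse;

public class EmbeddedJetty {
    public static void main(String[] args) throws Exception {
        Server server = new Server(8080); // bind a connector to port 8080

        // A Handler is Jetty's basic unit of request processing.
        server.setHandler(new AbstractHandler() {
            @Override
            public void handle(String target, Request baseRequest,
                               HttpServletRequest request, HttpServletResponse response)
                    throws IOException {
                response.setContentType("text/plain;charset=utf-8");
                response.getWriter().println("Hello from embedded Jetty");
                baseRequest.setHandled(true); // tell Jetty this request is done
            }
        });

        server.start(); // the server lives inside your JVM, like any other object
        server.join();  // block the main thread until the server stops
    }
}
```

Because the `Server` is just an object you constructed, the whole lifecycle (start, stop, reconfigure) is under your program's control.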
At its core, Jetty manages HTTP connections efficiently. It implements the Servlet and WebSocket specifications, so it plays nicely with frameworks like Spring Boot. It supports HTTP/2 and asynchronous I/O, which means it can juggle thousands of idle connections without sweating. This makes it perfect for chat servers, APIs, or anything that values non-blocking performance.
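The asynchronous model is visible at the Servlet API level. The sketch below, assuming the `jakarta.servlet` APIs that Jetty 11 implements, shows the core move: detach the request from the container thread so that thread returns to the pool instead of blocking. The servlet name and the trivial "work" are illustrative.

```java
// Sketch of an asynchronous servlet: the container thread is released
// while the response is produced elsewhere.
import java.io.IOException;

import jakarta.servlet.AsyncContext;
import jakarta.servlet.http.HttpServlet;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.servlet.http.HttpServletResponse;

public class SlowTaskServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) {
        // Detach the request from the container thread.
        AsyncContext ctx = req.startAsync();
        ctx.setTimeout(10_000); // give up after 10s of silence

        ctx.start(() -> { // run on a container-managed background thread
            try {
                HttpServletResponse res = (HttpServletResponse) ctx.getResponse();
                res.setContentType("text/plain;charset=utf-8");
                res.getWriter().println("done");
            } catch (IOException ignored) {
                // client disconnected; nothing left to do
            } finally {
                ctx.complete(); // hand the response back to the container
            }
        });
    }
}
```

Note that async support must be enabled when the servlet is registered (for example, `ServletHolder.setAsyncSupported(true)` in Jetty's `ServletContextHandler`), or `startAsync()` will throw.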
So how does Eclipse Jetty fit into modern infrastructure? Teams often pair it with reverse proxies such as NGINX, Kubernetes Ingress controllers, or identity-aware proxies that wrap access policies around it. Jetty manages application logic while the proxy handles identity, audit logging, and TLS termination. In a secure setup, authentication flows through an OIDC provider such as Okta, or a cloud identity service like AWS IAM Identity Center, and the proxy passes verified headers downstream to Jetty. The result is consistent authentication everywhere without modifying your app's core code.
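On the Jetty side, the main configuration step behind a proxy is to trust the forwarded headers. `ForwardedRequestCustomizer` is Jetty's mechanism for honoring `X-Forwarded-For`, `X-Forwarded-Proto`, and related headers; the sketch below assumes Jetty 11, and any identity header your proxy injects (the exact name depends on your proxy setup) would be read with an ordinary `request.getHeader(...)` call in your handlers.

```java
// Configure Jetty to honor X-Forwarded-* headers set by a reverse proxy,
// so getRemoteAddr() and isSecure() reflect the original client.
import org.eclipse.jetty.server.ForwardedRequestCustomizer;
import org.eclipse.jetty.server.HttpConfiguration;
import org.eclipse.jetty.server.HttpConnectionFactory;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.ServerConnector;

public class BehindProxy {
    public static void main(String[] args) throws Exception {
        Server server = new Server();

        // The customizer rewrites each request's apparent address/scheme
        // from the proxy's forwarded headers. Only do this when Jetty is
        // actually behind a proxy you control, since clients could spoof
        // these headers otherwise.
        HttpConfiguration config = new HttpConfiguration();
        config.addCustomizer(new ForwardedRequestCustomizer());

        ServerConnector connector =
                new ServerConnector(server, new HttpConnectionFactory(config));
        connector.setPort(8080);
        server.addConnector(connector);

        server.start();
        server.join();
    }
}
```

With that in place, the application code stays proxy-agnostic: it sees the real client address and scheme, and reads whatever verified identity headers the proxy is configured to add.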
Common developer challenges—session affinity, rolling updates, and request throttling—are easier when Jetty runs embedded. You control the lifecycle from inside the JVM. Need to swap a handler or tune thread pools dynamically? Just script it. Errors that would require container restarts elsewhere become on-the-fly config changes here.
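Thread-pool tuning is a concrete example of that in-JVM control. A sketch, assuming Jetty 11: `QueuedThreadPool` is Jetty's standard pool, and its size can be adjusted at runtime on the live object; the pool name and the sizes chosen are illustrative.

```java
// Construct Jetty with an explicit thread pool, then resize it at runtime
// without restarting the server.
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.util.thread.QueuedThreadPool;

public class TunablePool {
    public static void main(String[] args) throws Exception {
        // Size the pool up front: at most 200 threads, 8 kept warm.
        QueuedThreadPool pool = new QueuedThreadPool(200, 8);
        pool.setName("app-http"); // shows up in thread dumps and metrics

        Server server = new Server(pool);
        // ... add connectors and handlers as usual ...
        server.start();

        // Later, from inside the JVM (say, an admin endpoint or a
        // load-driven hook): grow the pool on the fly.
        pool.setMaxThreads(400);

        server.stop();
    }
}
```

Because you hold a reference to the pool, the same pattern works for shrinking under memory pressure or wiring pool stats into your metrics pipeline.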