You can almost hear the sigh of an engineer waiting for another access approval. The notebook is ready, the model tuned, yet the data environment refuses to authenticate. Domino Data Lab Jetty exists to solve that daily irritation, transforming how infrastructure teams handle secure web access and service routing inside Domino.
Jetty is the lightweight web server and servlet container Domino uses under the hood to host user services, dashboards, and APIs. It deals with requests, sessions, and TLS so data scientists can focus on analysis instead of URLs and port bindings. Domino Data Lab wraps Jetty with enterprise controls like OIDC and role‑based access. Together, they form a clean pattern: Jetty handles HTTP flows, Domino orchestrates users, projects, and compute resources. When they align properly, approvals drop from minutes to seconds.
Here’s what that alignment looks like. Identity from Okta or Azure AD flows through Domino’s authentication chain, which Jetty respects as the source of truth. Permissions map to internal projects, so each containerized session runs with precise RBAC enforcement. Logs from Jetty feed Domino’s central auditing layer, allowing both DevOps and security to verify every session, token, and data call. You essentially get zero‑trust transport baked into your discovery workflows.
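To make that permission mapping concrete, here is a minimal sketch of group-to-project role resolution. The group names, role names, and `resolve_roles` helper are hypothetical illustrations, not Domino's actual API; real deployments configure this mapping through Domino's admin settings and your IdP.

```python
# Hypothetical sketch: map IdP groups (carried in the OIDC token) to
# per-project roles, the way Domino's RBAC layer conceptually resolves
# permissions. Group and role names here are invented for illustration.
GROUP_TO_ROLE = {
    "ds-team": {"project": "churn-model", "role": "Contributor"},
    "ml-platform": {"project": "churn-model", "role": "Owner"},
    "auditors": {"project": "churn-model", "role": "Viewer"},
}

def resolve_roles(idp_groups):
    """Return the set of (project, role) grants implied by the user's groups."""
    grants = set()
    for group in idp_groups:
        mapping = GROUP_TO_ROLE.get(group)
        if mapping:
            grants.add((mapping["project"], mapping["role"]))
    return grants

print(resolve_roles(["ds-team", "auditors"]))
```

Because every containerized session starts from this resolved set, there is nothing to re-authorize per request; Jetty simply trusts the identity Domino already verified.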
If Jetty seems misconfigured, check its keystore and context paths first. Domino deployments depend on correct servlet mounts for notebook servers and model endpoints, and an expired certificate or misplaced keystore alias causes most connection errors. Automate renewal, rotate secrets regularly, and keep Jetty’s thread pool sized to the concurrency you actually serve. Overprovisioning just slows startup and eats memory.
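The thread-pool advice can be turned into a small sizing heuristic. The headroom factor and floor/ceiling below are illustrative assumptions, not Jetty defaults; the resulting number is what you would feed into Jetty's thread pool configuration.

```python
def jetty_max_threads(expected_concurrent_requests, headroom=1.25):
    """Size Jetty's thread pool close to real concurrency plus modest headroom.

    The 1.25 headroom factor and the floor/ceiling below are illustrative
    assumptions, not Jetty defaults -- tune them to your own workload.
    """
    sized = int(expected_concurrent_requests * headroom)
    # A sane floor keeps small bursts from starving; a ceiling keeps an
    # overprovisioned pool from slowing startup and wasting memory.
    return max(16, min(sized, 500))

print(jetty_max_threads(120))  # 150
```

The point is not the exact formula but the discipline: derive pool size from measured concurrency rather than copying a large default forward.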
Benefits of Domino Data Lab Jetty done right
- Reliable identity propagation with OIDC and SAML.
- Faster notebook launches and reduced waiting time for authorization.
- Precise audit trails that meet SOC 2 and ISO controls.
- Clear separation between infrastructure and user workloads.
- Less human toil, more developer velocity, fewer “who‑owns‑this‑port” moments.
For developers, this integration feels like someone finally removed the invisible speed bump. When Jetty honors centralized identity, there’s no juggling tokens or re‑entering credentials. Onboarding new engineers takes hours instead of days. Debugging model endpoints stays in the browser, not buried in setup scripts.
AI workloads only multiply the impact. When generative tools start pulling project data, consistent access boundaries keep model prompts and private datasets from crossing wires. Automating that boundary with Jetty’s policy layer gives each AI agent its own sandbox, safe yet flexible.
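One way to picture that per-agent sandbox is a deny-by-default allow-list consulted before any dataset read. The agent names, dataset labels, and `can_read` helper are hypothetical; a real deployment would enforce this at the proxy and policy layer rather than in application code.

```python
# Hypothetical sketch of a per-agent data boundary: each AI agent may only
# read datasets explicitly granted to its sandbox. All names are illustrative.
AGENT_SANDBOXES = {
    "summarizer-bot": {"public-docs"},
    "forecast-agent": {"public-docs", "sales-history"},
}

def can_read(agent, dataset):
    """Deny by default: unknown agents and ungranted datasets are rejected."""
    return dataset in AGENT_SANDBOXES.get(agent, set())

print(can_read("summarizer-bot", "sales-history"))  # False: outside its sandbox
```

Keeping the grant explicit per agent is what stops a generative tool from quietly widening its reach as prompts and projects evolve.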
Platforms like hoop.dev turn those access rules into guardrails that enforce policy automatically, transforming manual Jetty mappings into reproducible infrastructure and reducing the chance that a risky endpoint sneaks past review.
How do I connect Domino Data Lab Jetty to my identity provider?
Configure Domino’s environment variables with your IdP’s OIDC discovery URL and client secrets, then ensure Jetty trusts Domino’s proxy certificates. Once synced, every request gets verified before reaching the workspace. That single step turns scattered credentials into consistent enterprise access.
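The discovery URL in that step follows the OpenID Connect Discovery spec: provider metadata lives at a well-known path under the issuer. The sketch below builds and sanity-checks that URL; the environment-variable name is a hypothetical placeholder, not Domino's actual configuration key.

```python
import os

def oidc_discovery_url(issuer):
    """Build the standard OIDC discovery URL from an issuer base URL.

    Per the OpenID Connect Discovery spec, provider metadata is served at
    <issuer>/.well-known/openid-configuration.
    """
    if not issuer.startswith("https://"):
        raise ValueError("OIDC issuers must use HTTPS")
    return issuer.rstrip("/") + "/.well-known/openid-configuration"

# "OIDC_ISSUER" is a placeholder name for illustration; check your Domino
# documentation for the actual configuration keys.
issuer = os.environ.get("OIDC_ISSUER", "https://idp.example.com")
print(oidc_discovery_url(issuer))
```

Fetching that document gives you the token, authorization, and JWKS endpoints, which is why one issuer URL is enough to wire the whole chain.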
Domino Data Lab Jetty is not magic, but when tuned correctly, it feels close. It replaces waiting with working and friction with flow.
See an Environment Agnostic Identity-Aware Proxy in action with hoop.dev. Deploy it, connect your identity provider, and watch it protect your endpoints everywhere—live in minutes.