The Simplest Way to Make Domino Data Lab Zscaler Work Like It Should

Every data scientist knows the pain of waiting for network access while their models gather dust. Security teams lock things down, researchers push for speed, and somewhere in between, a project stalls. That’s exactly the gap Domino Data Lab and Zscaler are built to close.

Domino Data Lab gives enterprises a centralized platform for developing and deploying machine learning models at scale. Zscaler provides zero-trust access and network-level security without the usual VPN headaches. When you pair them, your ML stack gains both agility and compliance — data moves efficiently, engineers stay productive, and policies remain airtight.

Think of the Domino-Zscaler integration as a handshake between science and governance. Domino handles workloads and permissions at the workspace level. Zscaler enforces identity-aware connectivity, inspecting and securing every request before it touches your infrastructure. Together, they build a secure channel for collaboration where one cloud-native platform (Domino) runs computation and another (Zscaler) guards every byte.

How do I connect Domino Data Lab and Zscaler?

You configure Domino to route outbound and inbound traffic through Zscaler’s tenant gateway. Authentication flows follow your identity provider — often Okta or Azure AD — to verify users before they access project endpoints. Domino admins then map user roles to Zscaler policies, ensuring each data scientist only reaches allowed resources. It’s zero-trust, but without the zero-productivity feeling.
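The role-to-policy mapping above can be sketched as a simple lookup. This is a minimal illustration only: the role names and policy-group identifiers are hypothetical placeholders, not real Domino roles or Zscaler policy names, and a production setup would drive this from your identity provider's group claims rather than a hard-coded table.

```python
# Hypothetical role-to-policy mapping; the role names and the
# "zs-*" policy groups are illustrative, not real identifiers.
ROLE_POLICY_MAP = {
    "Practitioner": ["zs-ml-workspaces", "zs-feature-store-ro"],
    "ResultsConsumer": ["zs-dashboards-ro"],
    "ProjectAdmin": ["zs-ml-workspaces", "zs-feature-store-rw", "zs-admin-console"],
}

def policies_for(role: str) -> list[str]:
    """Return the Zscaler policy groups a Domino role maps to.

    Unknown roles get an empty list (deny by default), matching the
    zero-trust posture described above.
    """
    return ROLE_POLICY_MAP.get(role, [])
```

The deny-by-default branch is the important design choice: an unmapped role reaches nothing until someone explicitly grants it a policy group.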

To keep the integration clean, manage credentials through Domino secrets, rotate tokens on schedule, and audit identity logs regularly. Zscaler’s inspection layer gives you visibility into endpoint usage while Domino’s workspace metadata provides context for who ran what and why. The two logs together create an unbeatable compliance trail.

Quick best practices

  • Align Domino project roles with Zscaler group policies.
  • Route all compute node traffic through Zscaler trusted connectors.
  • Use OIDC for single sign-on to simplify identity enforcement.
  • Store non-interactive keys in Domino’s environment settings.
  • Test connectivity in staged environments before granting global access.
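The last item, staged connectivity testing, can be a short smoke test that forces traffic through the connector and reports whether an endpoint answers. The proxy address below is a placeholder for your tenant's gateway, not a real Zscaler hostname; everything else is standard-library Python.

```python
import urllib.request

# Placeholder for your tenant's Zscaler connector, not a real address.
ZSCALER_PROXY = "http://gateway.zscaler.example:9480"

def proxied_opener(proxy_url: str) -> urllib.request.OpenerDirector:
    """Build an opener that routes HTTP(S) requests through the connector."""
    handler = urllib.request.ProxyHandler({"http": proxy_url, "https": proxy_url})
    return urllib.request.build_opener(handler)

def check_endpoint(opener: urllib.request.OpenerDirector,
                   url: str, timeout: float = 5.0) -> bool:
    """Return True when the endpoint answers through the proxy."""
    try:
        with opener.open(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except OSError:
        # DNS failure, refused connection, or timeout: treat as unreachable.
        return False
```

Run this from a staging workspace against each endpoint a project needs before granting global access; a `False` result means the policy or routing still needs work.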

Tangible benefits

  • Faster onboarding for new researchers with instant secure access.
  • Fewer manual approvals from IT thanks to automated identity routing.
  • Consistent audit logging for SOC 2 and GDPR reporting.
  • Simplified network architecture without traditional VPN sprawl.
  • Stronger isolation between sensitive datasets and public endpoints.

For developers, this setup feels lighter. You open Domino, spin up a workspace, and everything just works — private data, cloud access, compliance checks included. No more toggling between portals or approving ticket requests for network exceptions. Domino Data Lab Zscaler turns secure connectivity into a background service.

Platforms like hoop.dev take the same principle further. They turn those Zscaler-style access rules into guardrails that enforce policy automatically across different environments. That way, security isn’t an afterthought, it’s part of the workflow.

As AI tooling becomes deeply integrated, this model helps teams evaluate data risk before anyone tunes a prompt or fine-tunes a model. Whether guarding external REST endpoints or internal notebooks, zero-trust access meets reproducible AI development head-on.

The combination unlocks velocity and assurance at once, proof that strong governance doesn’t have to slow down innovation.

See an Environment Agnostic Identity-Aware Proxy in action with hoop.dev. Deploy it, connect your identity provider, and watch it protect your endpoints everywhere—live in minutes.
