
How to configure Domino Data Lab k3s for secure, repeatable access


You get a ping from a data scientist asking for GPU access on a test cluster. Then another wants admin rights “just to debug.” Before you know it, your Kubernetes playground looks like a casino. That is when teams start asking how Domino Data Lab k3s fits into the picture.

Domino Data Lab manages the lifecycle of machine learning workspaces, storage, and compute across hybrid or cloud environments. K3s, on the other hand, is a lightweight Kubernetes distribution ideal for edge nodes or ephemeral infrastructure. Together they become a fast way to run distributed training jobs with real governance, not a cloud free-for-all.

Domino uses Kubernetes as its substrate, and k3s makes that substrate easier to spin up anywhere. Smaller clusters for dev or staging environments, quick-fire experiments, or on-prem clusters with limited footprint all benefit. You keep the same control plane concepts—Pods, Services, RBAC—but shed the overhead of full K8s.
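That "spin up anywhere" claim is concrete: k3s installs from a single binary via its bootstrap script. A minimal sketch (SERVER_IP and the token are placeholders for your environment; in production you would pin a version rather than pipe the latest installer):

```shell
# Install the k3s server (control plane + kubelet in one binary)
curl -sfL https://get.k3s.io | sh -

# Read the join token so worker nodes can register
sudo cat /var/lib/rancher/k3s/server/node-token

# On each worker node, join the cluster (replace SERVER_IP and TOKEN)
curl -sfL https://get.k3s.io | K3S_URL=https://SERVER_IP:6443 K3S_TOKEN=TOKEN sh -

# Confirm the nodes came up
sudo k3s kubectl get nodes
```

The same Pods, Services, and RBAC objects you would use on full Kubernetes work unchanged from here.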

Here’s the logic: Domino coordinates workloads and users, while k3s provides the compute runtime. With identity centralized through SSO or OIDC, you can map Domino users to Kubernetes service accounts automatically. That means permission boundaries from Okta or AWS IAM propagate down to container workloads. No more shared kubeconfigs or rogue tokens sitting on laptops.
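One way that identity plumbing can look in practice: k3s lets you pass OIDC flags through to its embedded kube-apiserver via its config file, and a RoleBinding then grants an IdP group rights inside a single project namespace. The issuer URL, client ID, group name, and namespace below are all placeholders for your own setup:

```yaml
# /etc/rancher/k3s/config.yaml -- forwards OIDC settings to the embedded kube-apiserver
kube-apiserver-arg:
  - "oidc-issuer-url=https://your-idp.example.com/oauth2/default"  # placeholder issuer
  - "oidc-client-id=k3s"
  - "oidc-username-claim=email"
  - "oidc-groups-claim=groups"
---
# Bind an IdP group to the built-in "edit" ClusterRole within one project namespace
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: data-scientists-edit
  namespace: domino-project-a          # placeholder project namespace
subjects:
  - kind: Group
    name: data-scientists              # group claim issued by your IdP
    apiGroup: rbac.authorization.k8s.io
roleRef:
  kind: ClusterRole
  name: edit
  apiGroup: rbac.authorization.k8s.io
```

With this in place, a revoked group membership in Okta or your IdP revokes cluster access on the next token refresh, with no kubeconfig cleanup required.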

Quick Answer: You configure Domino Data Lab to point at a k3s cluster by registering it as a compute environment, integrating your identity provider, and inheriting Domino’s workspace-level access controls into Kubernetes service accounts for consistent, auditable job execution.


Best practices for the pairing

  • Use k3s when your workloads need consistent but lightweight orchestration.
  • Rotate Domino-issued service account tokens through your identity provider.
  • Map each Domino project to its own namespace so workloads, secrets, and quotas stay isolated between projects.
  • Keep cluster upgrades fast by using k3s’s single-binary model and standard Helm charts.
  • Monitor node resources directly from Domino to prevent noisy-neighbor issues.
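The namespace-per-project practice above can be sketched as a pair of manifests: a labeled Namespace plus a ResourceQuota that caps what any one project can consume, which also addresses the noisy-neighbor concern. The project name and resource limits are illustrative, not recommendations:

```yaml
apiVersion: v1
kind: Namespace
metadata:
  name: domino-project-fraud-model       # placeholder project name
  labels:
    project: fraud-model                 # label for policy and reporting tools
---
apiVersion: v1
kind: ResourceQuota
metadata:
  name: project-quota
  namespace: domino-project-fraud-model
spec:
  hard:
    requests.cpu: "16"
    requests.memory: 64Gi
    requests.nvidia.com/gpu: "2"         # cap GPU consumption per project
```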

The benefits become obvious

  • Faster environment creation for data scientists and MLOps teams.
  • Cleaner credential flow, fully aligned with corporate identity.
  • Predictable upgrades and smaller cluster footprints.
  • Easier policy enforcement and compliance reporting.
  • Consistent user context across workloads, logs, and audit trails.

Developers notice the difference right away. Launching a test notebook or job takes seconds instead of waiting on IT to whitelist a cluster. Debugging is straightforward because every pod has the correct identity baked in. Result: higher developer velocity, less toil, and fewer frantic Slack messages.

Platforms like hoop.dev take this further, turning those identity mappings into policy guardrails that enforce authentication and authorization automatically. Instead of managing kubeconfigs, teams use OAuth-based access that adjusts dynamically as roles change.

How do I connect Domino Data Lab to a k3s cluster?

Set up your k3s control plane, create a service account with cluster-admin privileges, and point Domino Data Lab’s compute configuration at the cluster’s API endpoint. Domino handles the rest, issuing jobs and notebooks as Kubernetes workloads that inherit project-level policies automatically.
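As a sketch of those steps (namespace and account names are placeholders; note that `kubectl create token` requires Kubernetes 1.24 or later, and you may want to scope the role down from cluster-admin once things work):

```shell
# Create a dedicated namespace and service account for Domino
kubectl create namespace domino-platform
kubectl create serviceaccount domino -n domino-platform

# Grant it cluster-admin, as described above
kubectl create clusterrolebinding domino-admin \
  --clusterrole=cluster-admin \
  --serviceaccount=domino-platform:domino

# Issue a short-lived token to paste into Domino's compute configuration
kubectl create token domino -n domino-platform --duration=24h
```

Point Domino at the k3s API endpoint (port 6443 by default) with that token, and it can begin scheduling workloads.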

How does this help with AI workloads?

AI training pipelines often generate dynamic pods and nodes. With Domino Data Lab on k3s, you can scale those jobs fast and still meet auditing and compliance standards. It keeps experimentation agile while ensuring each model run is linked to a real user identity.

The result: less chaos, more control, and fewer late-night restarts. That’s how secure, repeatable access should feel.

See an Environment Agnostic Identity-Aware Proxy in action with hoop.dev. Deploy it, connect your identity provider, and watch it protect your endpoints everywhere—live in minutes.
