
The Simplest Way to Make Cloud Run Port Work Like It Should



Every engineer who’s deployed a container on Google Cloud Run has hit the same quiet mystery: what port is this thing actually listening on? The app runs locally on 8080, but Cloud Run keeps whispering something about $PORT. Get that detail wrong and your container boots fine, then disappears into a black hole of failed health checks.

Cloud Run Port defines the single inbound port your container must listen on for HTTP traffic from the platform. You cannot pick two, you cannot ignore it, and you definitely should not hardcode a different one in production. The service injects the right value at runtime, and your app must respect it. Simple rule, deceptively easy to forget.

When you deploy, Cloud Run assigns each revision an environment variable named PORT, usually set to 8080 but not guaranteed. Your web server grabs this value to start listening. The platform then routes requests through its load balancer to that port. One container, one port, built for consistency across languages, frameworks, and CI/CD pipelines.
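That contract is easy to honor in a few lines. Here is a minimal sketch in Python using only the standard library; the `Handler` class and the `RUN_SERVER` guard are illustrative choices of ours, not part of Cloud Run's API:

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A minimal 200 response so Cloud Run's health checks succeed.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"ok")


def get_port() -> int:
    # Cloud Run injects PORT at runtime; fall back to 8080 for local runs.
    return int(os.environ.get("PORT", "8080"))


# Guarded behind an extra env var so the module stays importable in tests;
# a real container entrypoint would simply call serve_forever() directly.
if __name__ == "__main__" and os.environ.get("RUN_SERVER"):
    HTTPServer(("", get_port()), Handler).serve_forever()
```

The only Cloud Run-specific part is `get_port()`: read the variable, never hardcode the number.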

So why does it matter? Modern infrastructure thrives on predictable interfaces. The moment you introduce dynamic routing or background jobs that bind to random ports, scaling breaks. Cloud Run Port removes that guesswork. It locks your container into a contract: receive here, respond fast, shut down gracefully.

Common Pitfalls and Quick Fixes

If your service starts but returns connection errors, check that your framework respects environment variables. In Node.js, bind to process.env.PORT; in Python, read os.environ["PORT"]. For Go or Rust, read the variable at startup (for example, os.Getenv("PORT") in Go). Forget that step and you’ll be debugging “connection refused” messages that make you question reality.
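When chasing those “connection refused” errors locally, it helps to confirm something is actually accepting connections on the port Cloud Run would route to. A small diagnostic sketch in Python (the helper name `port_is_listening` is ours, not a Cloud Run or stdlib utility):

```python
import os
import socket


def port_is_listening(port: int, host: str = "127.0.0.1",
                      timeout: float = 1.0) -> bool:
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Example check during local debugging, assuming PORT is set the way
# Cloud Run would set it (falling back to 8080 otherwise):
# port_is_listening(int(os.environ.get("PORT", "8080")))
```

If this returns False while your app claims to be running, the app is almost certainly bound to a different port than the one in $PORT.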


Avoid running multiple servers inside one Cloud Run instance. Only the process listening on $PORT is reachable. Background metrics agents or gRPC endpoints belong somewhere else, like Cloud Tasks or Pub/Sub.

Benefits of Doing Cloud Run Port Right

  • Predictable scaling under load
  • Cleaner health checks and monitoring
  • Faster container startup and routing convergence
  • Reduced configuration drift across environments
  • More consistent logs and audit trails

When teams wire up continuous delivery, automation becomes effortless. The same container image works in staging, prod, or local Docker, because the port logic follows a contract instead of tribal memory. That’s how developer velocity stays high and onboarding stays boring, which is a compliment in ops.

Platforms like hoop.dev turn those access rules into guardrails that enforce policy automatically. With identity-aware access control baked in, your services run safely inside Cloud Run without slowing down deploys or adding bespoke proxy configs. The platform tags, authenticates, and forwards only the requests you actually intend.

Quick Answer: What Port Does Cloud Run Use?

By default, Cloud Run expects your container to listen on the port defined by the $PORT environment variable. Most often it’s 8080, but the value can differ if the service’s container port is configured otherwise. Always reference that variable, never hardcode a number.

Cloud Run Port is a small detail with a big footprint. Get it right once and every deploy after just works. Get it wrong and none of the magic boots up.

See an Environment Agnostic Identity-Aware Proxy in action with hoop.dev. Deploy it, connect your identity provider, and watch it protect your endpoints everywhere—live in minutes.
