Imagine this: a data engineer joins a new project and asks for access to a few BigQuery datasets. Two tickets and three days of waiting later, she finally gets credentials—then discovers the Redash dashboards are broken because of expired tokens. Most teams know this pain. BigQuery is fast and Redash is flexible, but the connection between them often lives in permission limbo.
BigQuery is Google Cloud’s serverless data warehouse built for analytics at scale. Redash is an open-source query and visualization platform many teams use for lightweight BI. They pair nicely when you need SQL-driven insights on top of petabyte-scale tables. The catch is authentication, which, if handled casually, becomes a security and compliance headache.
The goal is simple: connect BigQuery to Redash once, control who runs queries with identity-based access, and stop rotating static credentials by hand. Instead of distributing static JSON key files, use service accounts tied to your identity provider, such as Okta or Google Workspace, with narrowly scoped IAM roles. Redash should query BigQuery using OAuth or delegated tokens so that each user action can be audited directly in Cloud Audit Logs.
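As a concrete sketch of that audit trail, a Cloud Logging filter along these lines surfaces completed BigQuery jobs together with the identity that ran them (field names follow the BigQuery audit log schema; the exact filter you need may vary by project and log version):

```
protoPayload.serviceName="bigquery.googleapis.com"
protoPayload.methodName="jobservice.jobcompleted"
```

Each matching entry carries `protoPayload.authenticationInfo.principalEmail`, which is the identity you want to see behind every dashboard query.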
A clean setup usually follows this flow. First, create a dedicated Redash service identity in Google Cloud and grant it minimal roles: roles/bigquery.dataViewer on the datasets it reads, plus roles/bigquery.jobUser so it can actually run queries (dataViewer alone grants reads, not job execution). Next, configure Redash to use OAuth for that connection and make sure its proxy or backend holds no long-lived secrets. Finally, map Redash users to Google identities so that when a dashboard executes, you know who pressed “Run.” Access is visible, revocable, and logged.
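Sketched with the gcloud CLI, the first step might look like this. The project ID, service-account name, and dataset are placeholders, and the role split assumes the dataViewer-plus-jobUser pairing described above:

```shell
# Create a dedicated service identity for Redash (names are placeholders).
gcloud iam service-accounts create redash-bi \
    --project=my-project \
    --display-name="Redash BI connector"

# Let it run query jobs in the project...
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:redash-bi@my-project.iam.gserviceaccount.com" \
    --role="roles/bigquery.jobUser"

# ...and read data. For tighter scoping, grant dataViewer on the
# individual datasets Redash needs rather than project-wide.
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:redash-bi@my-project.iam.gserviceaccount.com" \
    --role="roles/bigquery.dataViewer"
```

Notice that no key file is created anywhere in this flow; the identity exists in IAM, and tokens for it should be minted on demand.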
If dashboards start to break, check first for scope mismatches or expired OAuth tokens and clients. Avoid broad roles like Editor; limit access by dataset or project instead. Rotate any remaining credentials automatically with CI jobs, or better, rely on short-lived ephemeral tokens. These small controls keep key files from sprawling across laptops and build servers.
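The ephemeral-token idea reduces to a small piece of logic: cache a short-lived token and refresh it a bit before it expires, so nothing long-lived is ever written to disk. A minimal sketch, assuming a `fetch` callable that would, in a real deployment, hit your token endpoint (for example, service-account impersonation); `TokenCache` and `stub_fetch` are illustrative names, not Redash or Google APIs:

```python
import time
from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class TokenCache:
    """Caches a short-lived access token and refreshes it before expiry."""
    fetch: Callable[[], Tuple[str, float]]  # returns (token, lifetime in seconds)
    skew: float = 300.0                     # refresh this many seconds early
    _token: str = ""
    _expires_at: float = 0.0

    def get(self) -> str:
        now = time.time()
        # Refresh when empty or inside the skew window before expiry.
        if not self._token or now >= self._expires_at - self.skew:
            token, lifetime = self.fetch()
            self._token = token
            self._expires_at = now + lifetime
        return self._token

# Stub fetcher standing in for the real token endpoint.
calls = 0
def stub_fetch() -> Tuple[str, float]:
    global calls
    calls += 1
    return f"token-{calls}", 3600.0

cache = TokenCache(fetch=stub_fetch)
first = cache.get()
second = cache.get()  # still fresh, so served from cache without a second fetch
```

The skew window matters in practice: refreshing a few minutes early means a dashboard query never races a token that expires mid-request.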