# BigQuery

## Prerequisites

You’ll need to have a [Google Cloud Platform](https://cloud.google.com/) account with a [project](https://cloud.google.com/storage/docs/projects) you would like Dot to use. Consult the Google Cloud Platform documentation for how to [create and manage a project](https://cloud.google.com/resource-manager/docs/creating-managing-projects). This project should have a BigQuery dataset for Dot to connect to.

## 1 Create a service account

**Create a service account that you manage in your Google Cloud account.** This account should be provisioned with the following read-only roles:

* `bigquery.dataViewer`
* `bigquery.jobUser`
* `bigquery.readSessionUser`

You'll need to provide the service account's email, a [JSON-formatted key](https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating), and the location of your BigQuery instance.

<details>

<summary>Create a service account step by step.</summary>

1. **Navigate to Service Accounts**:
   * Go to the [Google Cloud Console](https://console.cloud.google.com/).
   * In the **Navigation menu**, select **IAM & Admin >** [**Service Accounts**](https://console.cloud.google.com/iam-admin/serviceaccounts).
2. **Create a New Service Account**:
   * Click on **Create Service Account** at the top.
   * Assign a **Name** and optional **Description** (e.g., `dot-service-account` for identification).
   * Click **Create and Continue**.
3. **Assign Required Roles**:
   * In the **Grant this service account access to project** section, add the following roles:
     * **BigQuery Data Viewer** (`roles/bigquery.dataViewer`)
     * **BigQuery Job User** (`roles/bigquery.jobUser`)
     * **BigQuery Read Session User** (`roles/bigquery.readSessionUser`)
   * Click **Continue** to finalize the role assignments.
4. **Create a JSON Key**:
   * Under **Create key (optional)**, select **JSON** and click **Create**.
   * This downloads a JSON file with the service account credentials. Store this file securely; it contains sensitive information.
5. **Service Account Details Needed for Dot**:
   * **Service Account Email**: Visible in the **Email** column on the Service Accounts page.
   * **JSON Key**: The file downloaded in step 4.
   * **BigQuery Location**: The regional or multi-regional setting for your BigQuery instance (e.g., `us-central1`). Find this in the BigQuery console under **BigQuery > Settings**.

</details>
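If you prefer the command line, the console steps above can be sketched with `gcloud`. The account name `dot-service-account` is illustrative, and `{{PROJECT_ID}}` is a placeholder for your project ID:

```bash
# Create the service account (name is illustrative)
gcloud iam service-accounts create dot-service-account \
  --project={{PROJECT_ID}} \
  --display-name="Dot service account"

# Download a JSON key for it; store this file securely
gcloud iam service-accounts keys create dot-key.json \
  --iam-account=dot-service-account@{{PROJECT_ID}}.iam.gserviceaccount.com
```

The role bindings themselves are covered in step 2 below.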

## 2 Grant permissions

Next, grant the service account the read-only roles listed above, plus a small custom role for Dot-specific permissions.

The easiest way to grant these roles is through the [Google Cloud Shell](https://console.cloud.google.com/home/dashboard?cloudshell=true).

First, we'll create a custom role for Dot-related permissions and then bind it to the service account that you're using. We'll also bind read-only BigQuery roles to the service account.

**A) Create a Dot custom role**

```bash
gcloud iam roles create DotMonitor \
  --project={{PROJECT_ID}} \
  --title=DotMonitor \
  --description="Dot specific permissions" \
  --permissions=bigquery.jobs.listAll,bigquery.jobs.list
```

*Note that the* `{{PROJECT_ID}}` *placeholder needs to be replaced with your project ID.*

**B) Bind the custom role to a service account and apply read-only BQ roles**

```bash
gcloud projects add-iam-policy-binding {{PROJECT_ID}} \
  --member="serviceAccount:{{SERVICE_ACCOUNT}}" \
  --role="projects/{{PROJECT_ID}}/roles/DotMonitor"

gcloud projects add-iam-policy-binding {{PROJECT_ID}} \
  --member="serviceAccount:{{SERVICE_ACCOUNT}}" \
  --role="roles/bigquery.dataViewer"

gcloud projects add-iam-policy-binding {{PROJECT_ID}} \
  --member="serviceAccount:{{SERVICE_ACCOUNT}}" \
  --role="roles/bigquery.jobUser"

gcloud projects add-iam-policy-binding {{PROJECT_ID}} \
  --member="serviceAccount:{{SERVICE_ACCOUNT}}" \
  --role="roles/bigquery.readSessionUser"
```

*Note that the* `{{SERVICE_ACCOUNT}}` *and* `{{PROJECT_ID}}` *placeholders need to be replaced with your service account email and project ID, respectively.*

**Example values**

* PROJECT\_ID: `super-position-123456`
* SERVICE\_ACCOUNT: `dot-101@super-position-123456.iam.gserviceaccount.com`
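To confirm the bindings took effect, you can list the roles granted to the service account (placeholders as above):

```bash
gcloud projects get-iam-policy {{PROJECT_ID}} \
  --flatten="bindings[].members" \
  --filter="bindings.members:serviceAccount:{{SERVICE_ACCOUNT}}" \
  --format="table(bindings.role)"
```

The output should include the three `roles/bigquery.*` roles and the `DotMonitor` custom role.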

## Per-User Access (Optional)

By default, all Dot users in your organization share the same service account when querying BigQuery. If you need each user to only see the data they have access to in BigQuery — based on their individual IAM roles, row-level security policies, or column-level policy tags — you can enable **per-user access** via domain-wide delegation.

When enabled, Dot runs each query as the logged-in user's Google Workspace identity instead of the shared service account. BigQuery enforces access controls natively, so you manage permissions in GCP — not in Dot.

### Prerequisites

* **Google Workspace** — per-user access uses [domain-wide delegation](https://developers.google.com/identity/protocols/oauth2/service-account#delegatingauthority), which requires a Google Workspace domain.
* **Recommended: Google SSO** configured in Dot (see [Google SSO setup](https://docs.getdot.ai/integrations/sso/google)). SSO guarantees that the user's Dot email matches their Google Workspace identity.
* Users who sign in with a password are not delegated by default. If your password-login users have Dot emails that match their Google Workspace emails, you can enable the **Include non-SSO users** sub-toggle.

### Step 1: Enable domain-wide delegation for the service account

1. Go to the [Google Cloud Console](https://console.cloud.google.com/) > **IAM & Admin** > **Service Accounts**.
2. Click on the Dot service account.
3. Under **Show domain-wide delegation**, check **Enable Google Workspace Domain-wide Delegation**.
4. Note the **Client ID** shown (you'll need it in the next step).

### Step 2: Authorize the service account in Google Workspace

1. Go to [Google Workspace Admin Console](https://admin.google.com/) > **Security** > **API Controls** > **Manage Domain Wide Delegation**.
2. Click **Add new**.
3. Enter the **Client ID** from Step 1.
4. Enter the following OAuth scope: `https://www.googleapis.com/auth/bigquery`
5. Click **Authorize**.

### Step 3: Grant BigQuery access to your users

Each user who will use Dot needs BigQuery permissions on the relevant projects and datasets. At minimum:

```bash
gcloud projects add-iam-policy-binding {{PROJECT_ID}} \
  --member="user:alice@yourcompany.com" \
  --role="roles/bigquery.dataViewer"

gcloud projects add-iam-policy-binding {{PROJECT_ID}} \
  --member="user:alice@yourcompany.com" \
  --role="roles/bigquery.jobUser"
```

For managing permissions at scale, use [Google Groups](https://cloud.google.com/iam/docs/groups-in-cloud-console) — grant roles to a group and add users to it.
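For example, granting the two roles to a group instead of individual users might look like this (the group address `data-team@yourcompany.com` is hypothetical):

```bash
gcloud projects add-iam-policy-binding {{PROJECT_ID}} \
  --member="group:data-team@yourcompany.com" \
  --role="roles/bigquery.dataViewer"

gcloud projects add-iam-policy-binding {{PROJECT_ID}} \
  --member="group:data-team@yourcompany.com" \
  --role="roles/bigquery.jobUser"
```

New Dot users then only need to be added to the group, with no further IAM changes.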

{% hint style="info" %}
You can also use BigQuery [row-level security policies](https://cloud.google.com/bigquery/docs/row-level-security-intro) and [column-level security (policy tags)](https://cloud.google.com/bigquery/docs/column-level-security-intro) for fine-grained access control. These are enforced automatically when per-user access is enabled.
{% endhint %}

### Step 4: Enable per-user access in Dot

1. Go to **Settings** > **Connections** > **BigQuery**.
2. Click **Edit**.
3. Enable the **Per-user BigQuery access** toggle.
4. Optionally enable **Include non-SSO users** if your password-login users should also be impersonated.
5. Click **Connect** to save.

Once enabled, every query a user runs in Dot will execute as their Google identity. If a user doesn't have access to a table or column in BigQuery, they'll see a clear error message instead of the data.

### How it works

| Scenario                       | Who runs the query                                                                      |
| ------------------------------ | --------------------------------------------------------------------------------------- |
| User logged in via Google SSO  | The user's Google identity                                                              |
| User logged in with password   | The shared service account (or user's identity if **Include non-SSO users** is enabled) |
| Scheduled queries and alerts   | The schedule creator's Google identity                                                  |
| Data sync and model operations | The shared service account                                                              |

{% hint style="warning" %}
**Scheduled queries run as the creator.** If a user's Google account is deactivated (e.g., they leave the company), their scheduled queries will fail. After 3 consecutive failures, the schedule is automatically paused and the owner is notified via email. To fix this, reassign the schedule to an active user.
{% endhint %}

### Known limitations

* **Table metadata is shared.** Data sync (schema discovery, AI-generated descriptions) always runs as the shared service account. This means all users can see table names, column names, and AI-generated descriptions for all synced tables — even tables they cannot query. No actual data values are exposed, but the table structure and descriptions are visible to all users in the organization.
* **BigQuery row-level security returns empty results.** If a user has table access but is restricted by BigQuery row-level security policies, queries return empty results rather than an error. Dot will inform the user that no data was found but cannot distinguish between "no matching rows" and "access restricted."

## Allow Dot IPs

If your organization restricts BigQuery access with a network policy, allowlist the following addresses; Dot connects to your BigQuery instance only from these IPs:

* `3.229.110.216`
* `3.122.135.165`
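If you enforce network restrictions with [VPC Service Controls](https://cloud.google.com/vpc-service-controls), one way to allow these addresses is an access level built from an IP condition. A sketch, assuming you already have an access policy (`{{POLICY_ID}}` is a placeholder):

```bash
# conditions.yaml — allow only Dot's egress IPs
cat > conditions.yaml <<'EOF'
- ipSubnetworks:
    - 3.229.110.216/32
    - 3.122.135.165/32
EOF

gcloud access-context-manager levels create dot_ips \
  --policy={{POLICY_ID}} \
  --title="Dot IPs" \
  --basic-level-spec=conditions.yaml
```

You would then reference the `dot_ips` access level from the ingress rules of the perimeter that protects BigQuery.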
