
Azure Blob Storage

Overview

Exporting transformed data to Azure Blob Storage requires a BigQuery connection to Azure. The connection lets BigQuery authenticate with Azure, write to your Blob Storage container using the EXPORT DATA statement, and read the data back through an external table in BigQuery.
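As a sketch of what the connection enables, an export through it looks roughly like the following. The connection ID, storage account, container, path, and table names are all placeholders for your own values:

```sql
-- Sketch: export query results to Azure Blob Storage through the connection.
-- All identifiers below are placeholders.
EXPORT DATA
  WITH CONNECTION `project_id.azure-eastus2.my_azure_connection`
  OPTIONS (
    uri = 'azure://my_storage_account.blob.core.windows.net/my-container/export/*',
    format = 'PARQUET'
  )
AS
SELECT * FROM `project_id.dataset_name.table_name`;
```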

Requirements

Google Cloud

The executing user or service account must have:

  • BigQuery Connection Admin (roles/bigquery.connectionAdmin) — to create the Azure connection

Azure

The Azure AD application used by the connection must have the following role assigned on the target storage account or container:

  • Storage Blob Data Contributor — grants both read and write access to blob data

Connection Setup

1. Register an application in Azure AD

In the Azure portal, go to App registrations → New registration. Select "Accounts in this organizational directory only" and complete the registration. Note the Application (client) ID and your Tenant ID for later steps.

2. Create the BigQuery connection

In the Google Cloud console, navigate to BigQuery → Add data → Azure Blob Storage. Select BigLake on Azure (via BigQuery Omni) as the connection type and enter:

  • Connection ID: a name for the connection
  • Location: the Azure region, e.g. azure-eastus2
  • Azure tenant ID: from step 1
  • Federated application client ID: the Application (client) ID from step 1

After saving, note the BigQuery Google identity value displayed (a service account email).
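If you prefer scripting, the same connection can also be created with the bq CLI. This is a hedged sketch: the flag names below may vary across bq versions, so verify them with `bq mk --help` before relying on this.

```shell
# Sketch: create an Azure connection with federated identity via the bq CLI.
# Flag names may differ across bq versions -- check `bq mk --help`.
# <azure-tenant-id> and <application-client-id> are placeholders from step 1.
bq mk --connection \
  --connection_type='Azure' \
  --location='azure-eastus2' \
  --tenant_id='<azure-tenant-id>' \
  --federated_azure=true \
  --federated_app_client_id='<application-client-id>' \
  my_azure_connection
```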

3. Add a federated credential in Azure

In the Azure portal, open your app registration → Certificates & secrets → Federated credentials → Add credential. Select Other issuer and set:

  • Issuer: https://accounts.google.com
  • Subject identifier: the BigQuery Google identity from step 2
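The same federated credential can be added with the Azure CLI. In this sketch, <application-id> and <bigquery-google-identity> are placeholders, and the audience value is Azure's default token-exchange audience, which is an assumption here; confirm it against the portal flow above.

```shell
# Sketch: add a federated credential to the app registration via the Azure CLI.
# <application-id> and <bigquery-google-identity> are placeholders;
# the audience below is Azure's default token-exchange audience (assumed).
az ad app federated-credential create \
  --id '<application-id>' \
  --parameters '{
    "name": "bigquery-connection",
    "issuer": "https://accounts.google.com",
    "subject": "<bigquery-google-identity>",
    "audiences": ["api://AzureADTokenExchange"]
  }'
```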

4. Assign the Storage Blob Data Contributor role

In the Azure portal, navigate to your storage account (or specific container) → Access control (IAM) → Add role assignment. Assign the Storage Blob Data Contributor role to the Azure AD application registered in step 1.
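The role assignment can also be made with the Azure CLI; subscription, resource group, and storage account names below are placeholders:

```shell
# Sketch: grant the app registration read/write access to blob data.
# Subscription, resource group, and account names are placeholders.
az role assignment create \
  --assignee '<application-client-id>' \
  --role 'Storage Blob Data Contributor' \
  --scope '/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>'
```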

Example

The following install script installs Decode GA4, runs the transformation, and exports all transformed data to Azure Blob Storage:

DECLARE options JSON;

SET options = JSON '''
{
    "ga4_dataset_id": "project_id.ga4_dataset_name",
    "transform_config_template": "events_external",
    "azure_storage_account": "my_storage_account",
    "azure_container_name": "my-container",
    "connection_id": "project_id.azure-eastus2.my_azure_connection"
}
''';

EXECUTE IMMEDIATE (
    SELECT `project_id.decode_ga4_europe_west2.deploy_installer`(options)
);

CALL `project_id.ga4_dataset_name.install_decode_ga4`();

CALL `project_id.decode_ga4_dataset_name.RUN`(NULL);

Replace project_id, ga4_dataset_name, and the connection details with your actual values. The connection_id must match the connection created in the setup steps above, in the format project_id.azure_region.connection_name.
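For reference, reading the exported files back into BigQuery through a BigLake external table over the same container looks roughly like this; dataset, table, and path names are placeholders:

```sql
-- Sketch: read exported Parquet files back through a BigLake external table.
-- Dataset, table, and path names are placeholders.
CREATE EXTERNAL TABLE `project_id.dataset_name.my_external_table`
WITH CONNECTION `project_id.azure-eastus2.my_azure_connection`
OPTIONS (
  format = 'PARQUET',
  uris = ['azure://my_storage_account.blob.core.windows.net/my-container/export/*']
);
```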

Further Reading