Power BI
Overview
Power BI is Microsoft's business intelligence platform for building interactive reports and dashboards. It supports two connection paths for Decode GA4 data:
- BigQuery — query the `events` table directly via the native BigQuery connector
- Azure Blob Storage — read the Parquet files produced by the `events_external` template directly, without going through BigQuery
Via BigQuery
Requirements
- Power BI Desktop or a Power BI Service account
- A Decode GA4 installation with the `events` table accessible in BigQuery
- A Google account or service account with the BigQuery Data Viewer and BigQuery Job User roles on the project
1. Connect to BigQuery
In Power BI Desktop, select Get Data → Google BigQuery and sign in with your Google account.
For a service account, select Advanced options and provide your service account JSON key path.
2. Select the events table
In the Navigator, expand your GCP project and locate the dataset you configured as `destination_dataset_id` in your Decode GA4 installation. Select the `events` table and click Load (for import mode), or Transform Data to open the Power Query editor first.
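For reference, the navigation step generated in the Power Query Advanced Editor typically looks like the sketch below. The project and dataset names (`my-gcp-project`, `decode_ga4`) are placeholders — substitute your own project and `destination_dataset_id`:

```m
let
    // Connect via the native Google BigQuery connector
    Source = GoogleBigQuery.Database(),
    // Placeholder project and dataset names — use your own values
    Project = Source{[Name = "my-gcp-project"]}[Data],
    Dataset = Project{[Name = "decode_ga4", Kind = "Schema"]}[Data],
    Events = Dataset{[Name = "events", Kind = "Table"]}[Data]
in
    Events
```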
3. Filter in Power Query
To avoid importing the full table, add a filter on `partition_date` in the Power Query editor before loading:
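As a sketch, a rolling 90-day filter step might look like this in the Advanced Editor. Here `Events` stands for whichever step loaded the table in your query — adjust the step name and the window to suit your reports:

```m
// Keep only the last 90 days. Because partition_date is a Date column,
// this filter can fold to BigQuery and prune partitions rather than
// scanning the whole table.
FilteredEvents = Table.SelectRows(
    Events,
    each [partition_date] >= Date.AddDays(Date.From(DateTime.LocalNow()), -90)
)
```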
4. Build reports
Key fields to use in your reports:
| Field | Type | Notes |
|---|---|---|
| `partition_date` | Date | Use as your date axis |
| `event_name` | Text | Filter to specific events |
| `event_param.page_location` | Text | Page URL |
5. Schedule refresh (Power BI Service)
After publishing, configure a scheduled refresh under Dataset Settings → Scheduled Refresh and provide your BigQuery credentials.
Via Azure Blob Storage
If you are exporting Decode GA4 data to Azure Blob Storage (using the `events_external` template with an Azure destination), Power BI can read the Parquet files directly — no BigQuery required.
Requirements
- Power BI Desktop or a Power BI Service account
- Decode GA4 data exported to Azure Blob Storage using the `events_external` template (see Azure Blob Storage)
- Your Azure Storage account name and account key
1. Connect to Azure Blob Storage
In Power BI Desktop, select Get Data → Azure → Azure Blob Storage. Enter your storage account name and click OK, then provide your account key when prompted.
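The resulting connection step in the Advanced Editor looks roughly like this. The storage account and container names are placeholders; the account key itself is supplied through the credential prompt, not in the query:

```m
let
    // List the containers in the storage account (placeholder name)
    Source = AzureStorage.Blobs("mystorageaccount"),
    // Drill into the container holding the Decode GA4 export (placeholder name)
    Container = Source{[Name = "decode-ga4"]}[Data]
in
    Container
```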
2. Navigate to the Parquet files
In the Navigator, browse to the container and folder where Decode GA4 writes its Parquet files. The path will follow the pattern:
Select the folder and click Transform Data to open the Power Query editor.
3. Filter and combine the Parquet files
In the Power Query editor, Power BI will show the folder listing. Click Combine Files to merge all Parquet partitions into a single table. Then add a date filter to limit the data loaded:
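Combine Files generates a set of helper queries behind the scenes. A hand-written equivalent, shown here as a sketch with placeholder account, container, and column names (it assumes each Parquet file carries a `partition_date` column), reads each partition with `Parquet.Document` and filters after combining:

```m
let
    Source = AzureStorage.Blobs("mystorageaccount"),
    Container = Source{[Name = "decode-ga4"]}[Data],
    // Ignore any non-Parquet blobs in the folder listing
    ParquetOnly = Table.SelectRows(Container,
        each Text.EndsWith([Name], ".parquet")),
    // Parse each blob's binary content as a Parquet table
    Parsed = Table.AddColumn(ParquetOnly, "Data",
        each Parquet.Document([Content])),
    // Stack all partitions into one table
    Combined = Table.Combine(Parsed[Data]),
    // Limit the loaded data to the last 90 days
    Filtered = Table.SelectRows(Combined,
        each [partition_date] >= Date.AddDays(Date.From(DateTime.LocalNow()), -90))
in
    Filtered
```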
4. Build reports
The schema matches the BigQuery `events` table. Use the same key fields:
| Field | Type | Notes |
|---|---|---|
| `partition_date` | Date | Use as your date axis |
| `event_name` | Text | Filter to specific events |
| `event_param.page_location` | Text | Page URL |
5. Schedule refresh (Power BI Service)
After publishing, configure a scheduled refresh under Dataset Settings → Scheduled Refresh. Select Azure Blob Storage as the data source and provide your storage account credentials.