# Export Data to Google BigQuery
Set up continuous export of assets, vulnerabilities, and scan results from Mondoo to a Google BigQuery dataset using a service account key or Workload Identity Federation (WIF).
Export your Mondoo security data to Google BigQuery so you can run SQL queries across your assets, vulnerabilities, and scan results. BigQuery is ideal for building custom dashboards, running trend analysis, or joining Mondoo data with other datasets in your data warehouse. Once configured, Mondoo automatically exports data approximately every 24 hours.
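Once data lands in the dataset, you can query it like any other BigQuery table. The sketch below is hypothetical: the table and column names (`vulnerabilities`, `asset_name`, `severity`) are illustrative only, since the actual schema is whatever Mondoo creates in your dataset. Inspect the tables after the first export and adjust the query to match.

```shell
# Hypothetical query — table and column names are examples, not the real
# Mondoo export schema. Replace PROJECT_ID and DATASET_ID with your own.
QUERY='SELECT asset_name, severity, COUNT(*) AS findings
FROM `PROJECT_ID.DATASET_ID.vulnerabilities`
GROUP BY asset_name, severity
ORDER BY findings DESC'

# Run it with the bq CLI (requires an authenticated gcloud session):
# bq query --use_legacy_sql=false "$QUERY"
echo "$QUERY"
```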
## Requirements

- A GCP project with the BigQuery API enabled
- The GCP CLI (`gcloud`) installed
- Editor or Owner access to the Mondoo space from which you want to export data
## Create a BigQuery dataset
Your BigQuery integration needs a dataset to which Mondoo exports data. To learn about BigQuery datasets, read Introduction to datasets in the Google documentation.
Create a new GCP BigQuery dataset for the Mondoo integration to use. For instructions, read Creating datasets in the Google documentation.
Note the dataset ID (in the format `PROJECT_ID.DATASET_ID`). You need it when you configure the integration.
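If you prefer the CLI to the console, a dataset can also be created with the `bq` tool. This is a sketch with placeholder names; the `bq` call is commented out because it requires an authenticated gcloud session.

```shell
# Placeholder values — substitute your own project and dataset names.
PROJECT_ID="my-project"
DATASET_ID="mondoo_export"

# Create the dataset (pick the location that fits your data residency needs):
# bq mk --dataset --location=US "${PROJECT_ID}:${DATASET_ID}"

# The value Mondoo asks for during setup is PROJECT_ID.DATASET_ID:
echo "${PROJECT_ID}.${DATASET_ID}"
```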
## Set up authentication
Mondoo supports two ways to authenticate with GCP for BigQuery exports:

- **Workload Identity Federation (WIF)** eliminates the need to store and rotate service account keys. Mondoo authenticates to GCP using short-lived OIDC tokens that GCP validates and exchanges for temporary credentials. This is the recommended approach.
- **Service account key** uses a static JSON key file. This is simpler to set up but requires you to securely store and regularly rotate the key.
With WIF, Mondoo acts as an OIDC identity provider. When it's time to export, Mondoo issues a short-lived OIDC token and presents it to GCP. GCP validates the token against Mondoo's public signing keys (fetched from Mondoo's OIDC discovery endpoint), then issues temporary GCP credentials that Mondoo uses to write data to BigQuery. No static keys are stored or transmitted.
### Step 1: Create a GCP service account

Create a GCP service account that Mondoo will impersonate to write data to BigQuery. Do not create a key for this service account; WIF replaces the need for static keys.
1. In the GCP Console, navigate to **IAM & Admin > Service Accounts**.
2. Select **Create Service Account**.
3. Give the service account a name (for example, `mondoo-bq-export`) and select **Create and Continue**.
4. Grant the service account the following roles:

   - BigQuery Data Editor (`roles/bigquery.dataEditor`)
   - BigQuery Job User (`roles/bigquery.jobUser`)

   For dataset-level permissions instead of project-level, grant BigQuery Data Editor on the specific dataset. To learn how, read Grant access to a dataset in the Google documentation.

5. Select **Done**. Note the service account email address (for example, `mondoo-bq-export@PROJECT_ID.iam.gserviceaccount.com`).
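The console steps above can also be done with gcloud. This is a sketch with placeholder values; the gcloud calls are commented out because they require an authenticated session with permissions on the project.

```shell
# Placeholder values — substitute your own project and service account name.
PROJECT_ID="my-project"
SA_NAME="mondoo-bq-export"
SA_EMAIL="${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"

# Create the service account (no key — WIF does not need one):
# gcloud iam service-accounts create "$SA_NAME" --project="$PROJECT_ID"

# Grant the BigQuery roles at the project level:
# gcloud projects add-iam-policy-binding "$PROJECT_ID" \
#   --member="serviceAccount:${SA_EMAIL}" --role="roles/bigquery.dataEditor"
# gcloud projects add-iam-policy-binding "$PROJECT_ID" \
#   --member="serviceAccount:${SA_EMAIL}" --role="roles/bigquery.jobUser"

# This email is what you enter in Mondoo later:
echo "$SA_EMAIL"
```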
### Step 2: Create a Workload Identity Pool
A Workload Identity Pool is a GCP resource that manages external identities. You create one pool and add Mondoo as an OIDC provider within it.
1. In the GCP Console, navigate to **IAM & Admin > Workload Identity Federation**.
2. Select **Create Pool**.
3. Enter a name for the pool (for example, `mondoo-export-pool`) and an optional description.
4. Make sure the pool is **Enabled** and select **Continue**.
### Step 3: Add Mondoo as an OIDC provider to the pool
Within the Workload Identity Pool, add Mondoo as a trusted OpenID Connect (OIDC) identity provider. GCP uses Mondoo's OIDC discovery endpoint to fetch the public keys it needs to validate the tokens Mondoo presents during export.
1. In the pool you just created, select **Add Provider**.
2. For **Select a provider**, choose **OpenID Connect (OIDC)**.
3. Enter a provider name (for example, `mondoo-provider`).
4. Set the **Issuer (URL)** to the Mondoo API endpoint for your environment:

   | Environment | Issuer URL |
   |---|---|
   | Mondoo (US) | https://api.mondoo.com |
   | Mondoo (EU) | https://eu.api.mondoo.com |
   | Mondoo Edge | https://api.edge.mondoo.com |
   | Dedicated deployment | https://api.mondoo.CUSTOMER.com |

   GCP automatically fetches the OIDC discovery document from `<issuer>/.well-known/openid-configuration` to obtain Mondoo's signing keys. To learn more about Mondoo's OIDC endpoints, see Mondoo as an OIDC identity provider.

5. Under **Audiences**, select **Allowed audiences** and leave the default value. The audience is the full provider resource name, which you note in the next step.
6. Under **Attribute Mapping**, add this mapping:

   | Google attribute | OIDC attribute |
   |---|---|
   | `google.subject` | `assertion.sub` |

7. Select **Save**.
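Steps 2 and 3 can also be done with gcloud. This is a sketch with placeholder names; the gcloud calls are commented out because they require an authenticated session, and the issuer shown assumes the Mondoo (US) environment.

```shell
# Placeholder values — substitute your own names and environment issuer.
PROJECT_ID="my-project"
POOL_ID="mondoo-export-pool"
PROVIDER_ID="mondoo-provider"
ISSUER="https://api.mondoo.com"  # use the issuer URL for your environment

# Create the pool:
# gcloud iam workload-identity-pools create "$POOL_ID" \
#   --project="$PROJECT_ID" --location=global \
#   --display-name="Mondoo export pool"

# Add Mondoo as an OIDC provider with the subject mapping from step 6:
# gcloud iam workload-identity-pools providers create-oidc "$PROVIDER_ID" \
#   --project="$PROJECT_ID" --location=global \
#   --workload-identity-pool="$POOL_ID" \
#   --issuer-uri="$ISSUER" \
#   --attribute-mapping="google.subject=assertion.sub"

echo "${POOL_ID}/${PROVIDER_ID}"
```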
### Step 4: Note the WIF audience URL

After creating the provider, note the full audience URL. It follows this format:

```
https://iam.googleapis.com/projects/PROJECT_NUMBER/locations/global/workloadIdentityPools/POOL_ID/providers/PROVIDER_ID
```

You can find this on the provider details page in the GCP Console. You enter this value in Mondoo in the next step.
:::tip
`PROJECT_NUMBER` is a numeric value, not the project ID string. Find it on your GCP project's dashboard or by running:

```shell
gcloud projects describe PROJECT_ID --format='value(projectNumber)'
```

:::
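If you want to assemble the audience URL yourself rather than copying it from the console, this sketch shows the format. The values are placeholders; substitute your own.

```shell
# Placeholder values — replace with your project number, pool, and provider.
PROJECT_NUMBER="123456789012"
POOL_ID="mondoo-export-pool"
PROVIDER_ID="mondoo-provider"

AUDIENCE="https://iam.googleapis.com/projects/${PROJECT_NUMBER}/locations/global/workloadIdentityPools/${POOL_ID}/providers/${PROVIDER_ID}"
echo "$AUDIENCE"
```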
### Step 5: Create the BigQuery export integration in Mondoo
1. In the Mondoo Console, navigate to the space from which you want to export data.
2. In the side navigation bar, select **Integrations**. Under **Exports**, select **BigQuery**.
3. Enter a name for the integration.
4. In the **Enter the Dataset ID** box, enter your BigQuery dataset ID (in the format `PROJECT_ID.DATASET_ID`). To find this value, read Listing datasets in the Google documentation.
5. Under **Configure authentication**, select **Workload Identity Federation**.
6. In the **WIF Audience URL** box, enter the audience URL from Step 4.
7. In the **Service Account Email** box, enter the email of the service account you created in Step 1 (for example, `mondoo-bq-export@PROJECT_ID.iam.gserviceaccount.com`).
8. Select **CREATE INTEGRATION**.
#### Alternative: Create the integration using the GraphQL API

```graphql
mutation {
  createIntegration(
    input: {
      spaceMrn: "//captain.api.mondoo.app/spaces/YOUR_SPACE_ID"
      name: "BigQuery WIF Export"
      type: BIGQUERY
      configurationOptions: {
        bigqueryOptions: {
          datasetId: "PROJECT_ID.DATASET_ID"
          serviceAccount: ""
          wifAudience: "https://iam.googleapis.com/projects/PROJECT_NUMBER/locations/global/workloadIdentityPools/POOL_ID/providers/PROVIDER_ID"
          wifServiceAccountEmail: "mondoo-bq-export@PROJECT_ID.iam.gserviceaccount.com"
        }
      }
    }
  ) {
    integration {
      mrn
      configurationOptions {
        ... on BigqueryConfigurationOptions {
          datasetId
          wifSubject
          wifAudience
        }
      }
    }
  }
}
```

### Step 6: Copy the WIF subject value
After you create the integration, Mondoo computes a WIF subject value that uniquely identifies this integration. You need this value to authorize Mondoo in GCP.
1. On the integration details page in the Mondoo Console, find the **WifSubject** field.
2. Copy the subject value. It has the format `INTEGRATION_ID@integrations.SPACE_ID`.

If you created the integration using the API, the subject is in the `wifSubject` field of the response.
:::note
The WIF subject is a computed, read-only value. Mondoo generates it automatically when you create the integration—you cannot set or change it.
:::
### Step 7: Authorize the WIF subject to impersonate the service account
Back in GCP, you must grant the WIF subject the Workload Identity User role on the service account you created in Step 1. This allows Mondoo's OIDC-validated identity to impersonate the service account and write to BigQuery.
Run this gcloud command, substituting your own values:

```shell
gcloud iam service-accounts add-iam-policy-binding \
  mondoo-bq-export@PROJECT_ID.iam.gserviceaccount.com \
  --project=PROJECT_ID \
  --role="roles/iam.workloadIdentityUser" \
  --member="principal://iam.googleapis.com/projects/PROJECT_NUMBER/locations/global/workloadIdentityPools/POOL_ID/subject/SUBJECT_VALUE"
```

| Placeholder | Replace with |
|---|---|
| `PROJECT_ID` | Your GCP project ID (the string identifier, not the number) |
| `PROJECT_NUMBER` | Your GCP project number (numeric) |
| `POOL_ID` | The Workload Identity Pool ID from Step 2 |
| `SUBJECT_VALUE` | The WIF subject value you copied in Step 6 |
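The `--member` value is easy to get wrong. This sketch assembles it from placeholder values so you can see each part, and shows (commented out, since it requires gcloud access) one way to confirm the binding afterward.

```shell
# Placeholder values — substitute your own; SUBJECT follows the format
# Mondoo shows on the integration details page.
PROJECT_NUMBER="123456789012"
POOL_ID="mondoo-export-pool"
SUBJECT="INTEGRATION_ID@integrations.SPACE_ID"

MEMBER="principal://iam.googleapis.com/projects/${PROJECT_NUMBER}/locations/global/workloadIdentityPools/${POOL_ID}/subject/${SUBJECT}"
echo "$MEMBER"

# After running the add-iam-policy-binding command, confirm the binding:
# gcloud iam service-accounts get-iam-policy \
#   mondoo-bq-export@PROJECT_ID.iam.gserviceaccount.com --project=PROJECT_ID
```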
### Step 8: Verify the integration
The export runs on its configured schedule (approximately every 24 hours). To verify the setup works immediately:
1. Return to the integration details page in the Mondoo Console.
2. Select the **SCHEDULE NOW** button to trigger an immediate export.
3. Wait for the export to complete. If the status changes to **active**, the setup is working correctly.
If the export fails, double-check:

- The WIF audience URL matches the full provider resource name in GCP exactly (including project number, pool ID, and provider ID).
- The service account email is correct.
- The `gcloud iam service-accounts add-iam-policy-binding` command completed successfully with the correct subject value.
- The service account has the required BigQuery roles (BigQuery Data Editor and BigQuery Job User).
## Alternative: Set up with a service account key

### Step 1: Create a GCP service account and JSON key
To access BigQuery, your integration needs a GCP service account with a JSON key. To learn about service accounts, read Understanding service accounts in the Google documentation.
1. Create a new GCP service account for the Mondoo integration to use. For instructions, read Creating and managing service accounts in the Google documentation. Note the email address created for the new service account.
2. Grant the service account the following roles:

   - BigQuery Data Editor (`roles/bigquery.dataEditor`)
   - BigQuery Job User (`roles/bigquery.jobUser`)

   To learn how, read Grant access to a dataset in the Google documentation.

3. Create a JSON key for the service account. For instructions, read Create and manage service account keys in the Google documentation. Save the JSON file that downloads to your workstation when you create the key. You need it to configure the integration in the next step.
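These steps can also be done with gcloud. This is a sketch with placeholder values; the gcloud calls are commented out because they require an authenticated session with permissions on the project.

```shell
# Placeholder values — substitute your own project and service account name.
PROJECT_ID="my-project"
SA_NAME="mondoo-bq-export"
SA_EMAIL="${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"

# Create the service account and grant the BigQuery roles:
# gcloud iam service-accounts create "$SA_NAME" --project="$PROJECT_ID"
# gcloud projects add-iam-policy-binding "$PROJECT_ID" \
#   --member="serviceAccount:${SA_EMAIL}" --role="roles/bigquery.dataEditor"
# gcloud projects add-iam-policy-binding "$PROJECT_ID" \
#   --member="serviceAccount:${SA_EMAIL}" --role="roles/bigquery.jobUser"

# Create and download the JSON key (store it securely and rotate it regularly):
# gcloud iam service-accounts keys create mondoo-bq-key.json \
#   --iam-account="$SA_EMAIL"

echo "$SA_EMAIL"
```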
### Step 2: Add the integration in Mondoo
1. In the Mondoo Console, navigate to the space from which you want to export data. In the side navigation bar, select **Integrations**. Under **Exports**, select **BigQuery**.
2. Enter a name for the integration.
3. In the **Enter the Dataset ID** box, enter your BigQuery dataset ID (in the format `PROJECT_ID.DATASET_ID`). To find this value, read Listing datasets in the Google documentation.
4. Under **Configure authentication**, select **Service Account Key**.
5. Drag and drop the JSON key file you downloaded earlier, or select the cloud icon to browse for it.
6. Select **CREATE INTEGRATION**.

Mondoo begins exporting data from your space. When the initial export completes, the status becomes **active**. Exports then run automatically approximately every 24 hours.
## Manage your integration
To view your integration, select BigQuery under Integrations in the side navigation bar, then select the integration.
### Trigger a manual export
Exports run approximately every 24 hours. To export immediately, select the SCHEDULE NOW button on the integration detail page.
### Statuses
| Status | Meaning |
|---|---|
| active | The integration is healthy and exporting on schedule. |
| error | Mondoo encountered an error during the last export. |
### Remove the integration
Select the trash can icon and confirm the deletion. Mondoo stops future exports but does not delete data already exported to BigQuery.
## Next steps

- **Google Cloud Storage**: Set up continuous export of assets and vulnerabilities from Mondoo to a Google Cloud Storage bucket using a service account key or Workload Identity Federation (WIF).
- **PostgreSQL**: Set up continuous export of assets, vulnerabilities, and scan results from Mondoo to a PostgreSQL database.