BigQuery Connect

Get BigQuery data into your data warehouse in minutes

Collect BigQuery data into your data warehouse or ours. Matatika pipelines take care of the data collection and preparation for your analytics and BI tools.

Automate BigQuery from a single space with no code

BigQuery data warehouse extractor

Settings

Batch Config Encoding Compression

Compression format to use for batch files.

Batch Config Encoding Format

Format to use for batch files.

Batch Config Storage Prefix

Prefix to use when writing batch files.

Batch Config Storage Root

Root path to use when writing batch files.

Google Application Credentials

The JSON content of, or a file path to, your Google service account credentials.

Google Storage Bucket

An optional Google Storage bucket; when supplied, a file-based extract will be used.

Filter Schemas

If an array of schema names is provided, the tap processes only the specified BigQuery schemas (datasets) and ignores all others. If left blank, the tap automatically discovers ALL available schemas.

Filter Tables

If an array of table names is provided, the tap processes only the specified BigQuery tables and ignores all others. If left blank, the tap automatically discovers ALL available tables. Shell-style wildcard patterns are supported, as illustrated below.
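
As an illustration of how shell-style patterns select tables (this is not the tap's own code), the sketch below uses Python's fnmatch module, which implements the same wildcard syntax. The table names and filter values are hypothetical.

```python
# Illustration only: shell-style wildcard matching as used for table filters.
# The table names and filter values below are hypothetical examples.
from fnmatch import fnmatch

available_tables = ["orders", "orders_archive", "customers", "events_2024"]
filter_tables = ["orders*", "customers"]  # hypothetical filter values

selected = [
    table
    for table in available_tables
    if any(fnmatch(table, pattern) for pattern in filter_tables)
]
print(selected)  # ['orders', 'orders_archive', 'customers']
```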

Flattening Enabled

Set to 'True' to enable schema flattening, which automatically expands nested properties.

Flattening Max Depth

The maximum depth to which nested schemas are flattened (see the sketch below).
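
To make the flattening settings concrete, here is a minimal sketch of expanding nested properties up to a maximum depth. The double-underscore separator and the flatten_record helper are illustrative assumptions, not the tap's actual implementation.

```python
# Minimal sketch of record flattening up to a maximum depth.
# The "__" separator and this helper are illustrative assumptions only.
def flatten_record(record: dict, max_depth: int, _depth: int = 1, _prefix: str = "") -> dict:
    flat = {}
    for key, value in record.items():
        name = f"{_prefix}{key}"
        if isinstance(value, dict) and _depth <= max_depth:
            flat.update(flatten_record(value, max_depth, _depth + 1, f"{name}__"))
        else:
            flat[name] = value
    return flat

nested = {"id": 1, "address": {"city": "London", "geo": {"lat": 51.5, "lon": -0.1}}}
print(flatten_record(nested, max_depth=1))
# {'id': 1, 'address__city': 'London', 'address__geo': {'lat': 51.5, 'lon': -0.1}}
print(flatten_record(nested, max_depth=2))
# {'id': 1, 'address__city': 'London', 'address__geo__lat': 51.5, 'address__geo__lon': -0.1}
```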

Project ID

The Google Cloud Platform (GCP) project ID.

Stream Map Config

User-defined config values to be used within map expressions.

Stream Maps

Config object for the stream maps capability. For more information, check out Stream Maps.
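
Taken together, the settings above make up the extractor's configuration. The sketch below shows what such a configuration could look like as a Python dictionary; the snake_case key names and the nested batch_config layout follow common Singer/Meltano conventions and are assumptions here, as are all of the example values.

```python
# Hypothetical configuration for the BigQuery extractor, expressed as a Python dict.
# Key names follow common Singer/Meltano conventions; names and values are assumptions.
config = {
    "project_id": "my-gcp-project",                                      # GCP project ID
    "google_application_credentials": "/secrets/service-account.json",   # path or JSON content
    "google_storage_bucket": "my-extract-bucket",                        # optional: enables file-based extract
    "filter_schemas": ["analytics", "sales"],                            # only these datasets
    "filter_tables": ["events_*", "orders"],                             # shell patterns supported
    "flattening_enabled": True,
    "flattening_max_depth": 2,
    "batch_config": {
        "encoding": {"format": "jsonl", "compression": "gzip"},
        "storage": {"root": "gs://my-extract-bucket/batches", "prefix": "batch-"},
    },
    "stream_maps": {},        # see Stream Maps documentation
    "stream_map_config": {},  # user-defined values referenced in map expressions
}
```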


View source code

BigQuery data you can trust

Extract, Transform, and Load BigQuery data into your data warehouse or ours.

Interested in learning more?

Get in touch