Set up the Matatika platform to deliver and process your data in Snowflake in minutes.
A loader for Snowflake, a cloud-based data warehousing platform.
Your account identifier. See Account Identifiers.
The initial database for the Snowflake session.
The initial schema for the Snowflake session.
The default target database schema name to use for all streams.
The login name for your Snowflake user.
The password for your Snowflake user.
The initial warehouse for the session.
The initial role for the session.
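Taken together, these connection settings map onto a loader config. A minimal meltano.yml-style sketch, assuming the conventional target-snowflake setting names (account, database, schema, default_target_schema, user, warehouse, role) and placeholder values:

```yaml
# Minimal sketch; setting names follow common target-snowflake
# conventions and all values are placeholders, not real credentials.
plugins:
  loaders:
    - name: target-snowflake
      config:
        account: myorg-myaccount     # your account identifier
        database: ANALYTICS          # initial database for the session
        schema: PUBLIC               # initial schema for the session
        default_target_schema: RAW   # default schema name for all streams
        user: LOADER_USER            # login name for your Snowflake user
        warehouse: LOADING_WH        # initial warehouse for the session
        role: LOADER_ROLE            # initial role for the session
        # the password is best supplied via an environment variable
        # (e.g. TARGET_SNOWFLAKE_PASSWORD) rather than stored in config
```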
Whether to add metadata columns.
Maximum number of rows in each batch.
Whether to remove batch files after processing.
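The batching behavior above can be tuned together; a sketch, with setting names assumed from common target-snowflake conventions:

```yaml
config:
  add_record_metadata: true    # add metadata columns to each loaded table
  batch_size_rows: 10000       # maximum number of rows in each batch
  clean_up_batch_files: true   # remove batch files after processing
```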
One or more LCID locale strings used to produce localized output: https://faker.readthedocs.io/en/master/#localization
Value to seed the Faker generator for deterministic output: https://faker.readthedocs.io/en/master/#seeding-the-generator
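The two Faker settings above typically live under a nested config object; the faker_config key with locale and seed sub-keys is an assumption based on common Meltano SDK conventions:

```yaml
config:
  faker_config:
    locale: ["en_US", "en_GB"]   # LCID locale strings for localized output
    seed: 42                     # fixed seed for deterministic Faker output
```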
Set to 'True' to enable schema flattening and automatically expand nested properties.
The maximum depth to which nested schemas are flattened.
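As a sketch, enabling flattening with a depth limit might look like this (setting names assumed):

```yaml
config:
  flattening_enabled: true   # expand nested properties into flat columns
  flattening_max_depth: 2    # flatten at most two levels of nesting
```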
Whether to hard delete records, physically removing them instead of soft deleting them.
The method to use when loading data into the destination. append-only always writes all input records, whether or not a record already exists. upsert updates existing records and inserts new records. overwrite deletes all existing records and inserts all input records.
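For example, to rebuild the target tables on every run rather than upserting, a config along these lines (setting names assumed) would apply:

```yaml
config:
  load_method: overwrite   # one of: append-only, upsert, overwrite
  hard_delete: false       # keep deleted records, marking them instead
```

overwrite suits full refreshes where the destination should mirror the latest extract exactly, while upsert preserves existing rows not present in the input.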
Passphrase to decrypt the private key, if it is encrypted.
Path to the file containing the private key.
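As an alternative to password authentication, the two private-key settings above enable key-pair authentication. A sketch, with assumed setting names and a placeholder path:

```yaml
config:
  user: LOADER_USER
  private_key_path: /path/to/rsa_key.p8             # file containing the private key
  private_key_passphrase: ${PRIVATE_KEY_PASSPHRASE} # only needed if the key is encrypted
```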
User-defined config values to be used within map expressions.
Config object for the stream maps capability. For more information, check out Stream Maps.
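A brief sketch of how these two settings interact: values defined in stream_map_config can be referenced inside stream map expressions. The stream and property names below are purely illustrative:

```yaml
config:
  stream_map_config:
    salt: my-secret-salt                         # available to map expressions
  stream_maps:
    public-users:
      email_hash: "md5(config['salt'] + email)"  # derived, masked column
      email: __NULL__                            # drop the raw email property
```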
Whether to use SSO authentication using an external browser.
Whether to validate the schema of the incoming streams.
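These last two toggles are simple booleans; a sketch with assumed setting names:

```yaml
config:
  use_browser_authentication: true   # SSO via an external browser
  validate_records: true             # validate incoming stream schemas
```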
Collect and process data from hundreds of sources and tools with Snowflake.