Imply Polaris
This connector materializes Flow collections as Kafka-compatible messages that an Imply Polaris Kafka consumer can read. Imply Polaris is a fully managed, cloud-native Database-as-a-Service (DBaaS) built on Apache Druid, designed for real-time analytics on streaming and batch data.
Prerequisites
To use this connector, you'll need:
- At least one Flow collection
- An Imply Polaris account
Variants
This connector is a variant of the default Dekaf connector. For other integration options, see the main Dekaf page.
Setup
Provide an auth token when setting up the Dekaf connector. This can be a password of your choosing and will be used to authenticate consumers to your Kafka topics.
Once the connector is created, note the task name, such as YOUR-ORG/YOUR-PREFIX/YOUR-MATERIALIZATION. You will use this as the username.
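If you'd like to sanity-check these credentials with a generic Kafka client before configuring Polaris, the task name and auth token map directly onto standard SASL/PLAIN client settings. The snippet below is an illustrative sketch only, with placeholder values rather than anything generated by the connector.

```python
# Illustrative placeholders only: the task name is the SASL username and the
# auth token is the SASL password for any Kafka consumer reading from Dekaf.
dekaf_client_settings = {
    "bootstrap.servers": "dekaf.estuary-data.com:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "YOUR-ORG/YOUR-PREFIX/YOUR-MATERIALIZATION",  # task name
    "sasl.password": "<your-auth-token>",                          # auth token
}
```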
Connecting Estuary Flow to Imply Polaris
1. Log in to your Imply Polaris account and navigate to your project.
2. In the left sidebar, click on "Tables" and then "Create Table".
3. Choose "Kafka" as the input source for your new table.
4. In the Kafka configuration section, enter the following details (the sketch after these steps shows the same settings used from a standalone Kafka consumer):
   - Bootstrap Servers: dekaf.estuary-data.com:9092
   - Topic: The name of an Estuary Flow collection you added to your materialization (e.g., /my-collection)
   - Security Protocol: SASL_SSL
   - SASL Mechanism: PLAIN
   - SASL Username: Your materialization's task name
   - SASL Password: Your materialization's auth token
5. For the "Input Format", select "avro".
6. Configure the Schema Registry settings:
   - Schema Registry URL: https://dekaf.estuary-data.com
   - Schema Registry Username: Same as the SASL username
   - Schema Registry Password: Same as the SASL password
7. In the "Schema" section, Imply Polaris should automatically detect the schema from your Avro data. Review and adjust the column definitions as needed.
8. Review and finalize your table configuration, then click "Create Table".
9. Your Imply Polaris table should now start ingesting data from Estuary Flow.
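To verify the broker and schema registry settings end to end before (or alongside) Polaris, you can read a few documents with the confluent-kafka Python client. This is a minimal sketch under stated assumptions, not an official example: it assumes the confluent-kafka package (with its Avro extras, which require fastavro) is installed, and TASK_NAME, AUTH_TOKEN, and COLLECTION_TOPIC are placeholders for your own values.

```python
from confluent_kafka import DeserializingConsumer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer

TASK_NAME = "YOUR-ORG/YOUR-PREFIX/YOUR-MATERIALIZATION"  # SASL username
AUTH_TOKEN = "<your-auth-token>"                         # SASL password
COLLECTION_TOPIC = "/my-collection"                      # a bound collection

# The schema registry uses the same credentials as SASL, sent as basic auth.
schema_registry = SchemaRegistryClient({
    "url": "https://dekaf.estuary-data.com",
    "basic.auth.user.info": f"{TASK_NAME}:{AUTH_TOKEN}",
})
avro_deserializer = AvroDeserializer(schema_registry)

consumer = DeserializingConsumer({
    "bootstrap.servers": "dekaf.estuary-data.com:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": TASK_NAME,
    "sasl.password": AUTH_TOKEN,
    "group.id": "polaris-connectivity-check",  # any otherwise unused group id
    "auto.offset.reset": "earliest",
    "value.deserializer": avro_deserializer,   # keys are left as raw bytes
})
consumer.subscribe([COLLECTION_TOPIC])

try:
    for _ in range(10):                        # read a handful of documents
        msg = consumer.poll(10.0)
        if msg is None:
            continue
        if msg.error():
            raise RuntimeError(msg.error())
        print(msg.topic(), msg.value())        # value is a decoded document
finally:
    consumer.close()
```

If this prints decoded documents, Polaris should be able to ingest from the same topic with the settings entered above.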
Configuration
To use this connector, begin with data in one or more Flow collections. Use the properties below to configure a Dekaf materialization, which will direct one or more of your Flow collections to your desired topics.
Properties
Endpoint
Property | Title | Description | Type | Required/Default |
---|---|---|---|---|
/token | Auth Token | The password that Kafka consumers can use to authenticate to this task. | string | Required |
/strict_topic_names | Strict Topic Names | Whether or not to expose topic names in a strictly Kafka-compliant format. | boolean | false |
/deletions | Deletion Mode | Whether to handle deletions in kafka or cdc mode. | string | kafka |
Bindings
Property | Title | Description | Type | Required/Default |
---|---|---|---|---|
/topic_name | Topic Name | Kafka topic name that Dekaf will publish under. | string | Required |
Sample
materializations:
  ${PREFIX}/${mat_name}:
    endpoint:
      dekaf:
        config:
          token: <auth-token>
          strict_topic_names: false
          deletions: kafka
        variant: imply-polaris
    bindings:
      - resource:
          topic_name: ${COLLECTION_NAME}
        source: ${PREFIX}/${COLLECTION_NAME}