Imply Polaris
This guide uses a legacy method of connecting with Dekaf and is presented for historical purposes. For new integrations or to migrate your existing Dekaf setup to the new workflow, see the Dekaf materialization connector.
This guide demonstrates how to use Estuary Flow to stream data to Imply Polaris using the Kafka-compatible Dekaf API.
Imply Polaris is a fully managed, cloud-native Database-as-a-Service (DBaaS) built on Apache Druid, designed for real-time analytics on streaming and batch data.
Connecting Estuary Flow to Imply Polaris
1. Generate a refresh token for the Imply Polaris connection from the Estuary Admin Dashboard.
2. Log in to your Imply Polaris account and navigate to your project.
3. In the left sidebar, click "Tables", then "Create Table".
4. Choose "Kafka" as the input source for your new table.
5. In the Kafka configuration section, enter the following details (a Python connectivity check using these settings follows this list):
   - Bootstrap Servers: `dekaf.estuary-data.com:9092`
   - Topic: your Estuary Flow collection name (e.g., `/my-organization/my-collection`)
   - Security Protocol: `SASL_SSL`
   - SASL Mechanism: `PLAIN`
   - SASL Username: `{}`
   - SASL Password: the refresh token you generated in step 1
6. For the "Input Format", select "avro".
7. Configure the Schema Registry settings (a credentials check using these values also follows this list):
   - Schema Registry URL: `https://dekaf.estuary-data.com`
   - Schema Registry Username: `{}` (the same as the SASL username)
   - Schema Registry Password: the same refresh token as above
8. In the "Schema" section, Imply Polaris should automatically detect the schema from your Avro data. Review and adjust the column definitions as needed (the decoding sketch after this list offers a quick preview of the same fields).
9. Review and finalize your table configuration, then click "Create Table".
10. Your Imply Polaris table should now start ingesting data from Estuary Flow.
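Before creating the table, you can sanity-check the Kafka settings from step 5 outside of Polaris. The following is a minimal sketch using the confluent-kafka Python client; the topic, token, and consumer group values are illustrative placeholders, not part of the Polaris setup itself.

```python
# pip install confluent-kafka
from confluent_kafka import Consumer

# Placeholders: substitute your collection name and the token from step 1.
TOPIC = "/my-organization/my-collection"
TOKEN = "your-estuary-token"

consumer = Consumer({
    "bootstrap.servers": "dekaf.estuary-data.com:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "{}",  # the literal string "{}", as in step 5
    "sasl.password": TOKEN,
    "group.id": "dekaf-connectivity-check",  # arbitrary name for this test
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])

# Poll once; a returned record confirms the broker address and SASL
# credentials. (Auth failures surface as errors or in client logs.)
msg = consumer.poll(30.0)
if msg is None:
    print("Connected, but no records arrived within 30s.")
elif msg.error():
    print("Error:", msg.error())
else:
    print("Received a record at offset", msg.offset())
consumer.close()
```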
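The Schema Registry settings from step 7 can be verified with the same library. This is likewise a sketch with a placeholder token; listing the registered subjects is simply an easy way to confirm that the URL and credentials are accepted.

```python
# pip install confluent-kafka
from confluent_kafka.schema_registry import SchemaRegistryClient

TOKEN = "your-estuary-token"  # the same token used as the SASL password

registry = SchemaRegistryClient({
    "url": "https://dekaf.estuary-data.com",
    # Username is the literal "{}"; password is the token.
    "basic.auth.user.info": f"{{}}:{TOKEN}",
})

# Listing subjects confirms the registry URL and credentials work.
print(registry.get_subjects())
```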
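Putting the two together, the sketch below reads and decodes a single Avro record, which is also a quick way to preview the field names Polaris should detect in step 8. As above, the topic, token, and group ID are hypothetical placeholders.

```python
# pip install confluent-kafka fastavro  (AvroDeserializer depends on fastavro)
from confluent_kafka import Consumer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer
from confluent_kafka.serialization import SerializationContext, MessageField

TOPIC = "/my-organization/my-collection"
TOKEN = "your-estuary-token"

registry = SchemaRegistryClient({
    "url": "https://dekaf.estuary-data.com",
    "basic.auth.user.info": f"{{}}:{TOKEN}",
})
deserialize = AvroDeserializer(registry)

consumer = Consumer({
    "bootstrap.servers": "dekaf.estuary-data.com:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "{}",
    "sasl.password": TOKEN,
    "group.id": "dekaf-avro-preview",
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])

msg = consumer.poll(30.0)
if msg is not None and msg.error() is None:
    # Decode the Avro-encoded value using the schema fetched from the registry.
    record = deserialize(msg.value(), SerializationContext(TOPIC, MessageField.VALUE))
    print(record)  # a dict whose keys mirror the columns Polaris will detect
consumer.close()
```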