
Decodable's Imply Polaris Connector: The Druid Easy Button

John Allwright

Decodable now includes a native sink connector to send data to Imply Polaris.

Imply Polaris® is a real-time database for modern analytics applications, built from Apache Druid® and delivered as a fully managed database as a service (DBaaS). It provides a complete, integrated experience that simplifies everything from ingesting data to using visualizations to extracting valuable insights from your data.

Imply Polaris & Druid

Imply Polaris can be used as an alternative to a self-managed Druid deployment, and likewise Decodable's Imply Polaris connector is an alternative to Decodable's Druid sink connector. Unlike the Druid connector, no Kafka cluster is needed to work with Polaris.

Key benefits of Polaris include:

  • A fully managed cloud service. You do not have to configure and run your own Kafka data sources to ingest data to Polaris (as you would need to with Druid). Just point, click, and stream.
  • A single development experience, with push-based streaming built on Confluent Cloud.
  • Database optimization.
  • Scale in seconds.
  • Resiliency and security.

Decodable + Imply Polaris

Decodable provides low-latency transport and transformation for ingesting data in a way that's matched to Imply Polaris' real-time analytics. After all, there's no point running real-time queries on stale data! Imply Polaris and Apache Druid perform much more efficiently if the data they ingest is pre-processed, and Decodable is the ideal tool to perform this transformation, as described in this blog post.
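As a sketch of what that pre-processing can look like, a Decodable pipeline is just a SQL statement that reads from one stream and writes to another. The stream and field names below are hypothetical examples, not part of the connector itself:

```sql
-- Hypothetical Decodable pipeline: clean raw web events before
-- sinking the curated stream to Imply Polaris.
INSERT INTO web_events_curated
SELECT
  user_id,
  LOWER(page)           AS page,        -- normalize values for grouping
  CAST(ts AS TIMESTAMP) AS event_time   -- Polaris/Druid expects a timestamp column
FROM web_events_raw
WHERE user_id IS NOT NULL              -- drop malformed records before ingestion
```

Filtering and normalizing up front like this means Polaris spends its ingestion capacity only on rows you actually want to query.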

Getting Started With The Imply Polaris Connector

The Imply Polaris connector is a sink connector, meaning that data can only be written from Decodable to Polaris, typically after processing in a pipeline using SQL.

Connecting Decodable to Polaris consists of four steps:

  1. Create a Polaris table.
  2. Create a Decodable stream and associated schema; the stream will receive data from one or more Decodable pipelines.
  3. In Polaris, create a push_streaming connection and a streaming job whose schema matches the Decodable stream.
  4. Select the Polaris connection in the Decodable create connection dialog.
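Step 2 can also be done from the Decodable CLI. This is only a sketch: the stream name and field definitions are examples, and you should check `decodable stream create --help` for the exact flags in your CLI version:

```shell
# Sketch: create a Decodable stream with a schema matching the
# Polaris streaming job (names and types here are examples).
decodable stream create \
  --name web_events_curated \
  --field user_id=STRING \
  --field page=STRING \
  --field event_time=TIMESTAMP(3)
```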

Complete the Polaris connection configuration by specifying:

  • The name of the Polaris connection you created in step 3.
  • Your Polaris organization name.
  • Your Polaris API client ID.
  • The secret associated with your API client ID.

Hit "Next" and select the Decodable stream you created in step 2.

Finally, you'll be asked to confirm the schema is correct and name the Decodable connection. That's it - Happy Decoding!
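The same connection setup can be sketched with the Decodable CLI. The connector name and property keys below are assumptions for illustration; the UI dialog collects the same values, and the documentation has the authoritative names:

```shell
# Sketch: create the Polaris sink connection from the CLI.
# Connector name and property keys are assumptions; verify against the docs.
decodable connection create \
  --name polaris-sink \
  --connector imply-polaris \
  --type sink \
  --prop connection.name=my_push_connection \
  --prop organization.name=my-org \
  --prop client.id=$POLARIS_CLIENT_ID \
  --prop client.secret=$POLARIS_CLIENT_SECRET
```

Keeping the client secret in an environment variable (rather than on the command line) avoids leaking it into your shell history.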

For a more thorough walkthrough of using the Polaris connector, please check out the documentation or watch the video demo that follows.

You can get started with Decodable for free - our developer account includes enough for you to build a useful pipeline and - unlike a trial - it never expires.

Learn more:

Join the community Slack

Top 6 Patterns for Ingesting Data Into Druid

Most people fold their laundry before putting it into the drawers, for the simple reason that once the object is in the container, it is then constrained by the limits of the container. Similarly, users working to ingest data into Druid will find that it’s much preferable to pre-process the data. Here are the top 6 reasons to pre-process your streams.

Learn more



Start using Decodable today.