Apache Kafka


Event streaming is the practice of capturing data in real time from event sources, storing these event streams durably for later retrieval, and routing them to different destinations as needed. Kafka is an open-source distributed event streaming platform optimized for ingesting and processing streaming data in real time. The primary functions of Kafka are (see the producer sketch after the list):

  • Publish and subscribe to streams of records
  • Durably store streams of records in the order in which they were generated
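
To make the publish side concrete, here is a minimal sketch using the official Java client (org.apache.kafka:kafka-clients). The broker address, topic, key, and value are illustrative assumptions, not values from this page.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class MinimalProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address; point this at your own cluster.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Records with the same key land in the same partition,
            // which is what preserves per-key ordering.
            producer.send(new ProducerRecord<>("events", "user-42", "page_view"));
            producer.flush();
        }
    }
}
```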

Kafka is run as a cluster of one or more servers that can span multiple data centers or cloud regions. Some of these servers form the storage layer, called the brokers. Others run Kafka Connect to continuously import and export data as event streams, integrating Kafka with your existing systems, such as relational databases, as well as with other Kafka clusters.
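
As a sketch of how Kafka Connect is typically driven, the snippet below registers a source connector through Connect's REST API (POST /connectors). The Connect URL, connector name, database details, and the Confluent JDBC connector class are illustrative assumptions, not part of the text above; the JSON text block requires Java 15+.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterConnector {
    public static void main(String[] args) throws Exception {
        // Hypothetical config: a JDBC source connector streaming a Postgres
        // table into Kafka topics prefixed with "pg-".
        String config = """
            {
              "name": "pg-orders-source",
              "config": {
                "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
                "connection.url": "jdbc:postgresql://localhost:5432/shop",
                "mode": "incrementing",
                "incrementing.column.name": "id",
                "topic.prefix": "pg-"
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8083/connectors"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(config))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```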

Kafka clients let you write distributed applications and microservices that read, write, and process streams of events in parallel, at scale, and in a fault-tolerant manner, even in the face of network problems or machine failures.
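
On the read side, that parallelism and fault tolerance come from consumer groups: Kafka spreads a topic's partitions across all consumers sharing a group.id and rebalances them if an instance fails. A minimal sketch, with the broker address, group, and topic as placeholder assumptions:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class MinimalConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        // Every consumer with this group.id shares the topic's partitions;
        // running more instances scales reads out, up to the partition count.
        props.put("group.id", "events-readers");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s -> %s%n", record.key(), record.value());
                }
            }
        }
    }
}
```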


Decodable + Apache Kafka

Decodable regards Apache Kafka as both a source, which can send a stream of data, and a sink, which can receive one or more streams of data from a pipeline. You could use Decodable to take data from Kafka and send it to an analytical database or a machine learning model, or even send it back to another Kafka topic after transformation. With Kafka as a sink, Decodable can take a stream of data from a messaging system such as Kinesis, from a REST API, or even from a standard database like MySQL or Postgres running in CDC (change data capture) mode. Transforming data between Kafka topics can be the basis for building sophisticated event-driven microservice applications without writing a single line of code.
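
For comparison, the kind of topic-to-topic transformation that Decodable expresses declaratively looks roughly like this when hand-written against the Kafka Streams API; the topic names and the trivial uppercase transform are illustrative assumptions:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public class TopicTransform {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "topic-transform");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();
        // Read from one topic, apply a transform, write to another.
        builder.stream("orders-raw", Consumed.with(Serdes.String(), Serdes.String()))
               .mapValues(value -> value.toUpperCase())
               .to("orders-clean", Produced.with(Serdes.String(), Serdes.String()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Decodable replaces this deploy-and-operate client code with a declarative pipeline, which is what makes the no-code approach above possible.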

In the first video you'll see how to perform ETL: taking data from one Kafka topic, processing it in Decodable, and writing it to another Kafka topic.

In the second video you'll see how to use Kafka as a source, streaming data into Decodable, which transforms it and writes the results to Amazon S3.
