Resources

451 Research Brief

The complexity of establishing and maintaining stream processing architectures is widely acknowledged. As the costs of real-time data have become less prohibitive, skillsets are increasingly the bottleneck to leveraging the technology. Decodable is seeking to address this bottleneck directly by letting teams establish capabilities that can filter, route, enrich or transform data streams using SQL, and easily build streaming applications.

Read more
Play video

Achieve Results Faster with Apache Flink SQL

Flink SQL provides the ability to process real-time data using Structured Query Language, the de facto standard language used to access and process data, and one that is familiar to a very wide range of developers. It provides an optimized implementation that is difficult to achieve with Java, resulting in the clear advantage of being able to get to market faster with efficient real-time applications when using Flink SQL.

Read more
Play video

Video & Podcasts

Benefits Of Real Time Stream Processing

Join us as we explore the features and benefits of leveraging Decodable for your real-time stream processing needs. As a fully managed stream processing service, Decodable provides pre-built connectors to external systems and leverages SQL to provide a familiar development experience so you can be up and running in minutes, not months.

Video & Podcasts
Whitepaper

Introduction To Flink And Stream Processing

In this whitepaper, we'll explore what Flink and stream processing are, what differentiates stream processing from traditional batch processing, how it's used to drive modern applications and services, why it matters in the context of today's business needs, and touch on how it works.

Whitepaper
Video & Podcasts

Introduction to Apache Flink and Flink SQL

Join Gunnar Morling for a ten-minute introduction to Flink and Flink SQL, as you see him build a Flink pipeline that processes data from one Kafka topic to another. In this example, he'll be using Redpanda's Kafka API-compatible offering to stream data into and out of Flink.
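
To give a flavor of the kind of pipeline built in the video, here is a minimal Flink SQL sketch that reads from one Kafka topic and writes filtered records to another. The topic names, schema, and broker address are hypothetical placeholders, not taken from the video.

-- Hypothetical sketch: topic names, schema, and broker address are placeholders.
CREATE TABLE orders_in (
  order_id STRING,
  amount   DOUBLE,
  ts       TIMESTAMP(3)
) WITH (
  'connector' = 'kafka',
  'topic' = 'orders-in',
  'properties.bootstrap.servers' = 'localhost:9092',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'json'
);

CREATE TABLE orders_out (
  order_id STRING,
  amount   DOUBLE,
  ts       TIMESTAMP(3)
) WITH (
  'connector' = 'kafka',
  'topic' = 'orders-out',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'json'
);

-- Continuously copy only high-value orders from the source topic to the sink topic.
INSERT INTO orders_out
SELECT order_id, amount, ts
FROM orders_in
WHERE amount > 100;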

Video & Podcasts
eBook

Build The Future With Decodable And Stream Processing

Learn why every business should consider switching to stream processing, and how it's easier and cheaper than ever before with the Decodable platform.

eBook
Video & Podcasts

Decodable in 90 Seconds

Decodable in 90 Seconds

Video & Podcasts
Use Cases

Use Case Walkthroughs

In this eBook, we'll walk through five common use cases of real-time stream processing, including Customer 360, food delivery tracking, fraud detection, shipping logistics tracking, and website clickstream analytics. Starting with a sample data schema for each scenario, we'll take a look at how quickly and easily SQL statements can be used to create a series of one or more pipelines to clean, transform, enrich, and aggregate raw incoming data into its desired form.
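
To give a flavor of what these pipelines look like, here is a hedged sketch of a single SQL pipeline that aggregates raw clickstream events into per-page view counts over one-minute windows; the stream and field names are illustrative rather than taken from the eBook, and event_time is assumed to be a time attribute on the source stream.

-- Illustrative sketch only; stream and field names are made up.
INSERT INTO page_views_per_minute
SELECT
  page_url,
  TUMBLE_START(event_time, INTERVAL '1' MINUTE) AS window_start,
  COUNT(*) AS view_count
FROM raw_clickstream
GROUP BY
  page_url,
  TUMBLE(event_time, INTERVAL '1' MINUTE);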

Use Cases
News

TechCrunch: Decodable wants to take real-time stream processing mainstream

TechCrunch coverage of the announcement of Decodable's enterprise capabilities and enhancements.

News
Video & Podcasts

Webinar - The Top 5 Mistakes Deploying Flink

Learn from Flink co-creator and PMC member Robert Metzger about the five most common mistakes made when deploying Apache Flink, and how you can avoid them. Robert will be joined by Decodable CEO and streaming industry veteran Eric Sammer, who'll demo some of the most common stream processing patterns using SQL in a form that you can reproduce yourself in minutes.

Video & Podcasts
Video & Podcasts

Decodable at Solutions Review's Data Demo Day

Doug Atkinson and Decodable CEO Eric Sammer discuss making streaming data engineering easier with Decodable's fully-managed stream processing platform, which allows for the real-time ingestion, integration, and transformation of data to support the development of event-driven applications and services.

Video & Podcasts
Datasheet

Achieve Results Faster with Apache Flink SQL

Flink SQL provides the ability to process real-time data using Structured Query Language, the de facto standard language used to access and process data, and one that is familiar to a very wide range of developers. It provides an optimized implementation that is difficult to achieve with Java, resulting in the clear advantage of being able to get to market faster with efficient real-time applications when using Flink SQL.

Datasheet
Video & Podcasts

TFIR: Decodable's 2023 Predictions

Eric Sammer talks to TFIR about his predictions for data in 2023.

Video & Podcasts
Video & Podcasts

What's new in Apache Flink 1.16

Apache Flink PMC Chair Robert Metzger summarizes the top features shipped in the latest Apache Flink 1.16 release.
Official Flink release announcement: https://flink.apache.org/news/2022/10/28/1.16-announcement.html
Slides: https://speakerdeck.com/rmetzger/whats-new-in-flink-1-dot-16
Chapters: 0:00 Introduction | 0:17 SQL Gateway | 1:36 Hive Compatibility | 2:15 Changelog State Backend | 4:30 Overdraft Buffers and Unaligned Checkpoints | 6:55 RocksDB | 7:43 Lookup Joins & Async I/O | 9:23 Batch Improvements | 12:12 Wrap Up

Video & Podcasts
Video & Podcasts

Machine Learning with Apache Flink

Robert Metzger, Software Engineer at Decodable and PMC member of Apache Flink, asks the questions. We talk about the machine learning space in general, relevant machine learning projects for Apache Flink, and Apache Flink ML itself: what's the status of the project right now, and what are the plans for the future?

Video & Podcasts
Video & Podcasts

Realtime ETL is Easier Than You Think

In this demo-heavy webinar you'll learn why streaming ETL is essential for modern businesses, why current batch architectures with monolithic data warehouses are out of date, and how Decodable is approaching the challenges traditionally associated with streaming ETL.

Video & Podcasts
Video & Podcasts

Change Data Capture With Apache Flink

In this interview, we talk about Change Data Capture with Debezium and Flink with experts in the field as our guests: Gunnar Morling, one of the creators of the Debezium project, answers questions about CDC in general, as well as Debezium-specific questions. For deep insights into the Flink CDC Connectors project, we have Leonard Xu and Jark Wu, two long-term Flink contributors and leads on the Flink CDC project. Robert Metzger, the PMC Chair of the Apache Flink project, asks the questions.
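
For context on what a Flink CDC source looks like in practice, here is a minimal hedged sketch using the MySQL connector from the Flink CDC Connectors project; the hostname, credentials, database, and table are placeholder values, not details from the interview.

-- Hedged sketch: connection details, database, and table names are placeholders.
CREATE TABLE customers_cdc (
  id    INT,
  name  STRING,
  email STRING,
  PRIMARY KEY (id) NOT ENFORCED
) WITH (
  'connector' = 'mysql-cdc',
  'hostname' = 'mysql.example.com',
  'port' = '3306',
  'username' = 'flink',
  'password' = '********',
  'database-name' = 'shop',
  'table-name' = 'customers'
);

-- Downstream queries see inserts, updates, and deletes as a continuous change stream.
SELECT id, name, email FROM customers_cdc;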

Video & Podcasts
Video & Podcasts

Change Stream Processing With Apache Flink

In this demo-heavy webinar, Gunnar Morling and Sharon Xie answer common questions on change stream processing: What is Change Data Capture (CDC), and why should I care? What do I gain when I integrate CDC with Apache Flink? After this introduction, they switch to code to show how it works in a demo that includes CDC sources, stream processing, and delivery to an Elasticsearch database for searching.

Video & Podcasts
Video & Podcasts

Top 3 Challenges Running Multitenant Flink At Scale

In this talk, originally given at Flink Forward 2022, Decodable founding engineer Sharon Xie explores the key challenges and solutions building a managed Apache Flink solution into the Decodable stream processing platform.

Video & Podcasts
Video & Podcasts

Deploying Flink With The New Kubernetes Operator

Apache Flink PMC chair Robert Metzger, in conversation with Gyula Fóra and Mátyás Örhidi, the main contributors of the new Flink Kubernetes Operator launched earlier this year. The Flink Kubernetes Operator is an abstraction layer on top of Kubernetes that makes deployment and operation much easier than deploying Flink directly onto Kubernetes components. In this discussion we cover the motivation for creating a Kubernetes operator as part of the Flink project and dig into the details of using it.

Video & Podcasts
Video & Podcasts

Benefits of Real-Time Stream Processing

Join David Fabritius as he explores the features and benefits of leveraging Decodable for your real-time stream processing needs. Decodable is a stream processing platform providing the simplest method for moving data anywhere with real-time speed, transformed to match the needs of its destination. As a fully managed stream processing service, Decodable provides pre-built connectors to external systems and leverages SQL to provide a familiar development experience so you can be up and running in minutes, not months.

Video & Podcasts
Tutorial

Getting Started with Decodable

This quick introductory tutorial walks you through creating your first data flow in Decodable. Create a free account at https://app.decodable.co/. Run this demo for yourself by following the tutorial (includes all SQL in this video): https://docs.decodable.co/docs/web-quickstart-guide

Tutorial
Tutorial

Building A Stream Processing Pipeline With SQL

In this demo video, Charles Harding builds a stream processing pipeline with SQL using Decodable in under 8 minutes, showing the developer experience, SQL editor, and preview function for testing the SQL transformation.

Tutorial
Analyst Report

451 Research Brief

The complexity of establishing and maintaining stream processing architectures is widely acknowledged. As the costs of real-time data have become less prohibitive, skillsets are increasingly the bottleneck to leveraging the technology. Decodable is seeking to address this bottleneck directly by letting teams establish capabilities that can filter, route, enrich or transform data streams using SQL, and easily build streaming applications.

Analyst Report
Video & Podcasts

Building a practical real-time data platform for everyone

Decodable's CEO, Eric Sammer, explores the challenges faced building a real-time streaming data platform and how Decodable has solved them.

Video & Podcasts
News

The New Stack: Apache Flink for Unbounded Data Streams

Eric Sammer explains what Apache Flink is and why Booking.com, Pinterest, Stripe, Alibaba and Goldman Sachs are just a few of the companies that rely on Flink.

News
Video & Podcasts

PODCAST: Stream Processing, Observability, and the User Experience with Eric Sammer

Eric Sammer joins host Sam Ramji on the Open || Source || Data podcast.

Video & Podcasts
Upcoming Events

Data Council Austin '23

Get ready for another amazing year of learning, exchanging ideas and networking with the best technical minds building the future of data.

Upcoming Events
Upcoming Events

Real-Time Analytics Summit '23

The Real-Time Analytics Summit (#RTASummit) is an annual conference that brings professionals in the data space together to discuss harnessing actionable insights from real-time data. Join us to learn, teach, connect, and have an amazing time with the best community in the user-facing real-time analytics world.

Upcoming Events
Video & Podcasts

TFIR: Decodable Is Making It Easier For Developers To Use Real-Time Data

TFIR's Swapnil Bhartiya interviews Decodable CEO Eric Sammer about the company's mission, stream processing platform and the future of real-time data.

Video & Podcasts
Upcoming Events

Current '23

August 29-30, Austin TX. Current is intended to be the conference for the emerging data streaming ecosystem, a place for developers, architects, and technical executives to come together to engage with their peers. Kafka Summit itself will continue to thrive and grow at the heart of this new event.

Upcoming Events
Upcoming Events

Kafka Summit London 2023

Kafka Summit is the premier event for developers, architects, data engineers, devops professionals, and anyone else who wants to learn about streaming data. It brings the Apache Kafka community together to share best practices, learn how to build next-generation systems, and discuss the future of streaming technologies.

Upcoming Events
Upcoming Events

Flink Forward '23

Flink Forward is the conference dedicated to Apache Flink and the stream processing community. November 6-8 2023, Seattle USA

Upcoming Events
News

VentureBeat: Decodable and Datastax Partnership: What does it mean for enterprises?

VentureBeat's analysis of the recent Datastax + Decodable partnership to promote streaming among joint customers.

News
News

VentureBeat: Decodable Raises $20M to grow its real-time analytics platform

VentureBeat's coverage of Decodable's Series A funding announcement.

News
Video & Podcasts

PODCAST: Eric Sammer on Software Engineering Daily

Eric Sammer is founder and CEO of Decodable and joins the show to discuss the potential of stream processing, its role in modern data platforms, and how it’s being used today.

Video & Podcasts
Video & Podcasts

Security and how it plays in your data infrastructure

An overview of how security works with data. Join Hubert Dulay as he demonstrates approaches to protecting data, along with related security practices such as monitoring and observability.

Video & Podcasts
Video & Podcasts

Mirror Data from PostgreSQL to Snowflake

Need to get data from PostgreSQL to Snowflake? Hubert Dulay is back with another demo, this time showing how to take PostgreSQL data into Decodable and load it into Snowflake using Snowpipe.

Video & Podcasts
Video & Podcasts

Recovering a Dropped Table with Decodable's Record Replay

This video tells the story of an important table that is dropped and how it was easily recovered.

Video & Podcasts
Video & Podcasts

Configuring Snowpipe

In his second Snowflake ingestion video, Hubert Dulay shows how to configure Snowpipe to load data from Decodable to Snowflake.

Video & Podcasts
Video & Podcasts

Ingesting & Processing S3 changes via AWS Lambda

In this demo video, we'll be using changes in an S3 bucket to trigger AWS Lambda functions, sending records to Decodable for processing. Hubert Dulay is at the controls once more.

Video & Podcasts
Video & Podcasts

Realtime Join Between Confluent Cloud and PostgreSQL

Hubert Dulay is back, this time demonstrating how Decodable enables streaming joins between a PostgreSQL table streamed via CDC and a website clickstream delivered in a Kafka topic on Confluent Cloud.

Video & Podcasts
Video & Podcasts

Configuring Apache Kafka with mTLS - mutual TLS authentication

In this video Hubert Dulay shows how to configure mTLS (mutual TLS authentication) with Kafka, one of the most common questions the team at Decodable gets from our customers. He then uses this configuration with Decodable's stream processing platform to securely transform and transfer data with Kafka.

Video & Podcasts
Video & Podcasts

Ingesting COVID data into Imply Polaris

Saketh Kurnool demonstrates Decodable's new Imply Polaris connector with a demo drawing COVID data from a REST endpoint into a Decodable pipeline and outputting it to Imply Polaris.

Video & Podcasts
Tutorial

Cloning Resources - Connectors, Streams and Pipelines. 2 Minute Tip.

First in our series of 2-minute tips, Charles Harding demonstrates the time-saving technique of cloning existing resources as the basis for creating new resources.

Tutorial
Video & Podcasts

Connecting Decodable To Imply Polaris

Saketh Kurnool demonstrates how to set up the new Decodable Imply-Polaris sink connector to transform and send data to the Apache Druid based cloud service.

Video & Podcasts
Video & Podcasts

Real-time Change Data Capture (CDC) Processing Part Two

Building on the previous CDC video, Eric Sammer explains and demos real-time streaming change data capture (CDC) from PostgreSQL, processing in Decodable, and updating back into PostgreSQL.

Video & Podcasts
Video & Podcasts

Real-time Change Data Capture (CDC), Processing and Ingest using Decodable

Eric Sammer explains and demos real-time streaming change data capture (CDC), processing, and ingest, taking Debezium-format change records from PostgreSQL, processing them with Decodable pipelines written in SQL, and ingesting to various target systems including S3, Delta Lake (Databricks), PostgreSQL, and more.

Video & Podcasts
Video & Podcasts

MySQL CDC to Clickhouse using Decodable's Change Stream Capabilities

In this demo, John McKinnon shows how to use the new MySQL CDC (Change Data Capture) connector to create a stream of data from a static table, process the change stream using Decodable, and send the data to ClickHouse.

Video & Podcasts
Video & Podcasts

Processing Bitcoin data from Kafka to S3

In this video Hubert Dulay demonstrates how to load and process cryptocurrency data from Kafka to AWS S3 using Decodable's stream processing service and SQL transformations.

Video & Podcasts
Video & Podcasts

Synchronizing MySQL Data To Clickhouse

In this video you'll see how to sync MySQL CDC data to ClickHouse using Decodable pipelines and the ClickHouse connector to transport and transform the data.

Video & Podcasts
Video & Podcasts

Ingesting Wikipedia Data To Tinybird

Like most real-time analytical databases, Tinybird performs best when the data is prepared so that it can focus on low-latency queries. Decodable provides the capability to capture, enrich, and transform the real-time streaming data and send it to Tinybird. This demo builds a fully managed streaming solution from end to end from the source to the real-time web application.

Video & Podcasts
Video & Podcasts

Ingesting IoT Data To Rockset

Rockset is a real-time analytics database capable of low-latency, high-concurrency analytical queries. It's a fully managed database service that supports all the major cloud providers. It uses a storage engine called RocksDB, an open source key-value data store written entirely in C++ for maximum performance. RocksDB is used in many high-performance storage systems like MySQL, Apache Kafka, and CockroachDB, and is ideal for fast, low-latency storage such as flash drives and high-speed disk drives. In this demo, we will walk you through how to capture IoT data from an MQTT broker. The data will contain metrics from a cell phone. We'll then transform it and send it to Rockset and visualize it in a real-time dashboard.

Video & Podcasts
Tutorial

Delta Lake Decoded

In the last few years, Delta Lake has emerged as a popular new standard for cloud storage due to its ability to merge traditional data store features with elastic cloud storage. However, for many who've not worked with it in production, Delta Lake remains something of a mystery, so this training session is designed to increase understanding of what Delta Lake is, its underpinnings, and which use cases it is best suited for. After completing this training, you'll have a deeper understanding of Delta Lake and be ready to implement it in your data architecture.

Tutorial
Video & Podcasts

Ingesting to Apache Pinot

Real-time OLAP (RTOLAP) databases, sometimes called streaming databases, are a special type of database designed to perform OLAP workloads on large data sets. In this demo we will cleanse security logs using Decodable before sending them to Apache Pinot, an RTOLAP database.
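
To give a rough idea of what that cleansing step can look like, here is a hedged SQL sketch that drops malformed rows and normalizes a couple of fields before they reach Pinot; the stream and column names are invented for illustration and aren't taken from the demo.

-- Illustrative sketch only; stream and column names are invented.
INSERT INTO security_logs_clean
SELECT
  LOWER(TRIM(source_ip))           AS source_ip,
  UPPER(event_type)                AS event_type,
  CAST(event_time AS TIMESTAMP(3)) AS event_time
FROM security_logs_raw
WHERE source_ip IS NOT NULL
  AND event_type IS NOT NULL;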

Video & Podcasts
Video & Podcasts

Ingesting Covid Data Into Apache Druid

Apache Druid is a real-time analytics database designed for fast OLAP queries on large data sets. Druid powers use cases where real-time and streaming ingestion, fast query performance, and high uptime are important. In this video, we will ingest COVID-19 global statistics data into Apache Druid, cleansing the data so that Druid can easily work with it. Once Druid has the COVID data, we will create a real-time dashboard in Apache Superset, an open source business intelligence tool. Real-time OLAP databases such as Druid operate faster with pre-processed datasets, reducing their workload and enabling more focused query execution. Decodable is the ideal solution for performing this pre-processing.

Video & Podcasts
Video & Podcasts

Masking Sensitive Data

Masking is the process of obscuring information from applications or users that don't have permission to process or view it. Masking data is also a requirement enforced by regulations like the Health Insurance Portability and Accountability Act (HIPAA). Personally Identifiable Information (PII) and Protected Health Information (PHI) are examples of data that need to be masked. Real-time data masking is easy with Decodable's built-in functions.
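
As an indication of how masking can be expressed in SQL (the exact built-in functions available in Decodable may differ), here is a hedged sketch that redacts the local part of an email address and truncates a date of birth; the stream and column names are hypothetical.

-- Hypothetical sketch; Decodable's actual built-in masking functions may differ.
INSERT INTO patients_masked
SELECT
  patient_id,
  REGEXP_REPLACE(email, '^[^@]+', '*****')  AS email,      -- hide the local part of the address
  SUBSTRING(date_of_birth FROM 1 FOR 4)     AS birth_year  -- keep only the year
FROM patients_raw;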

Video & Podcasts
Video & Podcasts

Ingesting the Twitter Firehose into Databricks

In this video Charles Harding demonstrates the new Decodable Delta Lake connector, enabling filtering and reformatting of a stream of Twitter data (provided by DataPM) before analysis in Databricks.

Video & Podcasts
Video & Podcasts

Routing OSQuery Events via Apache Pulsar

OSQuery is an open source tool that lets you query operating system events using SQL. The events can be fed into a streaming platform, in this case Pulsar, for subsequent transformation and routing on the stream using Decodable.
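
For reference, osquery exposes operating system state and events as virtual tables that you query with ordinary SQL; a simple example against its standard processes table might look like the following (the exact tables available depend on your osquery configuration).

-- Example osquery query against the standard processes table.
SELECT pid, name, path
FROM processes
WHERE name = 'sshd';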

Video & Podcasts
Video & Podcasts

Converting streaming XML to JSON with Apache Kafka

In today's incredibly useful demo, Hubert Dulay shows how easy it is to use Decodable to convert XML data into JSON on Kafka topics.

Video & Podcasts
Video & Podcasts

This Microservice Should Have Been a SQL Statement

In event-driven architecture, functions can be used to replace backend microservices. In this blog, we will demonstrate how to use Decodable as a serverless function written in SQL performing stateful transformations, replacing unnecessary microservices in your application. "That microservice could have been a SQL statement" made real, with Decodable. This starts off as an insight on how to achieve more, faster, by replacing needless complexity with simplicity, but it really is a great educational piece on change streams, materialized views, CQRS, and stateless vs. stateful processing of data.
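
To make that idea concrete, here is a hedged sketch of the kind of stateful aggregation that can stand in for a small counting microservice: a continuous SQL query that maintains per-customer order totals from a change stream, effectively a materialized view. All stream and column names are illustrative.

-- Illustrative sketch only; stream and column names are made up.
INSERT INTO customer_order_totals
SELECT
  customer_id,
  COUNT(*)    AS order_count,
  SUM(amount) AS total_spend
FROM orders_stream
GROUP BY customer_id;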

Video & Podcasts
Video & Podcasts

Comparing Decodable with Kinesis Data Analytics

Flink is an amazing technology; the power to process real-time data on the stream is something that everyone running Kafka, Pulsar, Kinesis, and other popular messaging platforms will want to use eventually. But with power comes responsibility, and in the case of Flink, complexity. Decodable's mission is to eliminate this complexity to make stream processing available to everyone. In this video we'll show the relative experience of two Flink-based cloud services: Decodable and Kinesis Data Analytics.

Video & Podcasts
Video & Podcasts

Using Rest APIs to Build Pipelines & Stream Data with Postman

In this video, we build a streaming data pipeline with Decodable’s REST APIs using Postman to invoke the APIs from a GUI, and then use Postman to send streaming data into the new pipeline using Decodable's HTTP connector. We also show how to configure Postman with the necessary configuration and variables all within a collection - a set of saved requests that can be loaded and shared with others. Read the accompanying blog here: https://www.decodable.co/blog/using-rest-apis-to-build-pipelines-stream-data-with-postman

Video & Podcasts
Video & Podcasts

Opinionated Streaming Data Pipelines

Schema-on-write is a feature many engineers have used to their advantage when building data pipelines. In this demo, we will introduce this idea, and show how you can add "guard rails" to your pipelines to ensure they don't break.

Video & Podcasts
Video & Podcasts

Building a data mesh with AsyncAPI and Data Products

Hubert Dulay demonstrates how easy it is to create data products for your data mesh using Decodable with the AsyncAPI open source initiative.

Video & Podcasts
Video & Podcasts

Decodable Developer Experience Tour

Josh Mahonin takes us on a whistle-stop tour of the Decodable developer experience including schema version management and update, debugging, pipeline dependency management and data product navigation via schemas in a data mesh setup.

Video & Podcasts
Video & Podcasts

Stream Processing on Confluent Cloud using Airline data

Arlo Purcell shows how easy it is to connect to Confluent Cloud as both a sink and source, with auto-detection and import of a complex schema on both ends. In this demo he's transforming airline industry-standard SSIM XML data representing flight schedules.

Video & Podcasts
Video & Podcasts

Connecting Apache Pulsar to Amazon Kinesis

This video shows off the new Decodable Pulsar connector as both a source and sink, followed by the usual Q&A with the team.

Video & Podcasts
Video & Podcasts

Processing real-time crypto transactions fed by datapm

Learn how to process cryptocurrency exchange rates in real time using Decodable's transformations on Coinbase transactions fed from DataPM.

Video & Podcasts
Video & Podcasts

ML Feature extraction using SQL pipeline transformations and the Moonsense SDK

See how Moonsense's SDK transmits real-time data from a device's accelerometer and touch screen to Decodable for processing before sending to the Moonsense fraud detection model.

Video & Podcasts
Video & Podcasts

Real-time Data Engineering With Decodable

Tim James walks through the Decodable app and then shows the power of stream processing in SQL using AWS S3 and Athena as the destination sink.

Video & Podcasts
Video & Podcasts

Connecting Kafka to S3

Eric Sammer and Tim James demonstrate how Decodable connects Kafka to S3 by way of Decodable's own dogfooding (using it for internal metrics), in the context of how Decodable connects to a range of systems including Redis, AWS Kinesis, Pulsar, RedPanda, RedShift, Snowflake, Snowpipe, Apache Pinot/StarTree, and more.

Video & Podcasts
Video & Podcasts

Decodable product capabilities and the new user experience

Decodable CEO Eric Sammer demonstrates the capabilities of the real-time data engineering platform through the new user experience.

Video & Podcasts
Video & Podcasts

Decodable Presentation for Netflix

Eric Sammer, CEO of Decodable is joined by founding engineer Sharon Xie as they present the Decodable story & product demos to the Netflix real-time data team. Try Decodable today, start free at http://decodable.co

Video & Podcasts

Let’s get decoding.

Register for access and see how easy it is.

Start free