Resources
Explore the latest resources from Decodable, including videos, podcasts, news, and more. Enhance your knowledge of Apache Flink, stream processing, and other topics.
Apache Flink
Apache Kafka
Change Data Capture
Connectors
Security
Stream Processing
The Easiest Way to Run Apache Flink Jobs in Production
Get your data to the right place, in the right shape, with as little or as much processing as needed—fast. Decodable offers a unified platform for connectivity and stream processing, enabling rapid data movement and complex transformations, without the burden of infrastructure management.
Current 2024
Join Decodable and the best and brightest minds in data streaming, AI, stream processing, and Apache Kafka® app development at Current 2024. Boost your Kafka game with Apache Flink® integration, and gain insight into emerging generative AI use cases.
Kafka Summit London 2024
Join Decodable along with developers, architects, data engineers, DevOps professionals, and software thought leaders at the premier event for the Apache Kafka® community at Kafka Summit 2024 in London.
Unapologetically Technical: 1 Billion Row Challenge
In addition to going in-depth into the Billion Row Challenge, Jesse Anderson talks with Gunnar Morling about why it's important to stay at a position long enough to gain experience and see the success or failure of decisions. Gunnar shares his experiences working on Debezium at Red Hat and his ongoing work at Decodable.
TheNewStack: Apache Flink 2023 Retrospective and Glimpse into the Future
Flink is ushering in a long-imagined era when data can finally be harnessed for on-target insights and informed, instantaneous decision-making. As we welcome 2024, let’s take a brief look back at the milestones achieved by the Apache Flink community and ecosystem in the past year.
InfoQ: The One Billion Row Challenge
On the first day of 2024, Gunnar Morling, Senior Staff Software Engineer at Decodable, launched The One Billion Row Challenge (1BRC) for the Java community. This ongoing challenge will run until the end of January and aims to find Java code that processes one billion rows in the fastest time.
The Latest in Stream Architectures
Eric Sammer and Tim Berglund talk about the current landscape of stream processing and explore various architectures and their real-world applications. From practical insights to engaging anecdotes, this episode is a must-listen for anyone keen on understanding the dynamic world of real-time data processing.
VMblog: Real-Time Data Stream Processing Grows Up
In 2023, stream processing gained momentum as the choice for online feature extraction, data cleansing and normalization, enrichment, and anonymization of sensitive data. In 2024, this trend will continue to expand, integrating generative AI models to power real-time, online, user-facing applications.
PRWeb: Decodable Joins 'Connect with Confluent' Technology Partner Program
Decodable, makers of the enterprise-ready stream processing platform built on Apache Flink and Debezium, has joined the Connect with Confluent (CwC) Partner Program to further support Confluent Cloud customers with their data streaming initiatives.
Going Real-Time with IoT and Stream Processing: Strategies and Technologies
As the world becomes increasingly demanding in the ways it consumes any form of content, data is no stranger to this phenomenon. The emphasis on real-time, streaming data, as well as IoT, has dictated what many organizations seek in terms of operational intelligence.
RTInsights: Real-time Analytics News
Decodable announced that it has expanded the enterprise features of its platform with the addition of a Snowflake Streaming API connector and a dbt adapter.
VentureBeat: An open data lakehouse will maintain and grow the value of your data
Are recession fears eating at you? Worried about all your digital transformation investments evaporating like so much dew in the morning sun? Learn how an open data lakehouse will maintain and grow the value of your data.
InfoQ: What Are Cloud-Bound Applications?
This article examines the commoditization of the full software stack by binding the application to cloud services using open APIs and standards that preserve flexibility and portability.
The New Stack: Data Streaming, for When Micro-batching Just Isn’t Fast Enough
Micro-batching offers many, but not all, of the advantages of real-time data streaming. But to succeed, streaming data needs the same tooling that everyone expects from batch data.
InfoQ: The Wonders of Postgres Logical Decoding Messages
Explore how to take advantage of Postgres Logical Decoding for implementing data propagation via the outbox pattern, application logging, and enriching audit logs with metadata.
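The technique the article describes rests on a single built-in Postgres function; as a minimal sketch (the prefix and JSON payload below are illustrative, not taken from the article):

```sql
-- Emit a transactional logical decoding message into the WAL.
-- Arguments: transactional flag, prefix (used by consumers to filter),
-- and the message content. Prefix and payload here are made up.
SELECT pg_logical_emit_message(
  true,                                   -- part of the current transaction
  'outbox',                               -- message prefix
  '{"event": "order_created", "id": 42}'  -- payload
);
```

Consumers such as Debezium can then pick these messages up from the replication stream without any backing table.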
Solutions Review Names 7 Low-Code Data Engineering Vendors to Watch, 2023
Solutions Review’s Low-Code Data Engineering Vendors to Watch is an annual listing of solution providers we believe are worth monitoring. Companies are commonly included if they demonstrate a product roadmap aligning with our meta-analysis of the marketplace. Other criteria include recent and significant funding, talent acquisition, a disruptive or innovative new technology or product, or inclusion in a major analyst publication.
Datanami: Five Drivers Behind the Rapid Rise of Apache Flink
In 2022 alone, a total of at least $55 million has been invested by venture capitalists into startups building companies around Apache Flink, the open source project that’s used to process data streams at large scale and deliver real-time analytical insights. In 2023, Confluent announced acquiring a Flink startup for a rumored $100m. Investors have high confidence that Flink is the right technology for stream processing.
Open Source Streaming Data Analytics
Decodable CEO Eric Sammer explains how the rise of open source streaming data analytics platforms is driving modern approaches to DataOps that are fueling digital business transformations.
insideBIGDATA Latest News
In this regular column, we'll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating, with new products and services being announced every day.
Datanami: Decodable Ships Additional Features for Real-Time Data Streaming
Decodable has announced a slate of new features in its enterprise-ready platform built on Apache Flink, an industry-leading stream processing technology. The Decodable platform rounds out open source Apache Flink with capabilities that deliver security, efficiency, and performance, enabling enterprise users to connect to anything, develop with speed, and operate with confidence.
TFiR: Decodable Announces New Enterprise-Ready Features
Decodable has announced new features in its enterprise-ready platform built on Apache Flink, the stream processing technology. The Decodable platform rounds out open source Apache Flink with capabilities that deliver security, efficiency and performance, enabling enterprise users to connect to anything, develop with speed and operate with confidence.
Medium: Operational Use Case Patterns for Apache Kafka and Flink
Dunith Danushka walks through a few patterns for building operational use cases with Kafka and Flink. Operational use cases differ from the typical analytical use cases and can directly impact business operations.
Tech Startups to Watch in 2023
Startups continue to emerge this year, addressing challenges and leveraging opportunities in innovative and novel ways. The landscape is constantly changing, and 2023 is no different. Here are 15 startups DBTA thinks are worth watching in 2023.
DZone Trend Report Data Pipelines
DZone’s 2023 Trend Report Data Pipelines: Investigating the Modern Data Stack is here, and Decodable is a proud partner. Explore the current state of data pipelines, including data-driven design and architecture, data observability, data integration models and techniques, and more.
Send Your Decodable Metrics to Datadog
Join Saketh Kurnool to learn how to send data from Decodable to Datadog for downstream analytics. You can use this connector to send data from the Decodable _metrics stream into Datadog in order to monitor the state and health of your Decodable connections and pipelines.
Approaches to Real-Time Data Analysis
When it comes to collecting and accurately utilizing real-time data, companies need a solution that is easy to use, scalable, and most importantly, can reduce costs. Between December 2022 and January 2023, Gatepoint Research invited selected executives to participate in a survey on this topic. Download the report to see the results!
Navigating Event Streaming with Decodable CEO Eric Sammer
In this episode of the Real-Time Analytics podcast, host Tim Berglund welcomes Eric Sammer, Founder and CEO of Decodable. Eric, an industry leader in event streaming technology, discusses the company's focus on stream processing, real-time data processing, and integration with systems like Apache Pinot and StarTree. The conversation delves into the challenges and complexities of managing data, from data cleansing to structuring for different use cases. They explore the ideal balance between generalized and specialized systems, emphasizing the importance of flexibility. Ultimately, they highlight how stream processing serves as an effective solution to adjust and distribute data intelligently, providing an essential abstraction point.
TFiR: Real-Time Stream Processing With Decodable
In this episode of TFiR: T3M, Decodable CEO Eric Sammer shares his insights on the current data processing trends and how Decodable’s stream processing platform is helping companies enhance their overall data strategy.
Enable Change Data Capture With Postgres On Amazon RDS
Learn how to set up a Postgres database on Amazon RDS for change data capture, so that you can emit data change streams to tools like Debezium or managed stream processing platforms such as Decodable.
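The video covers the details; as a rough sketch (object names below are illustrative), the setup amounts to enabling logical replication via the RDS parameter group and then creating a publication and replication slot:

```sql
-- Prerequisite (set outside SQL): rds.logical_replication = 1 in the
-- instance's parameter group, followed by a reboot.
SHOW wal_level;  -- should now report 'logical'

-- Publication and slot for the CDC client to consume (names are made up;
-- tools like Debezium can also create the slot themselves).
CREATE PUBLICATION cdc_pub FOR ALL TABLES;
SELECT pg_create_logical_replication_slot('cdc_slot', 'pgoutput');
```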
Array Aggregation With Flink SQL
In this episode of Data Streaming Quick Tips, you'll learn how to aggregate the elements of an array with Flink SQL using both the built-in function JSON_ARRAYAGG() and a user-defined function (UDF). The results of a one-to-many join are then ingested as a nested document structure into a search index in Elasticsearch.
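To give a flavor of the built-in approach shown in the episode, here is a hedged Flink SQL sketch (table and column names are assumptions, not taken from the video):

```sql
-- Collapse a one-to-many join into one row per order,
-- with the item names aggregated into a JSON array.
SELECT
  o.order_id,
  JSON_ARRAYAGG(i.product_name) AS product_names
FROM orders o
JOIN order_items i ON i.order_id = o.order_id
GROUP BY o.order_id;
```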
Re-keying a Kafka Topic
In this episode of "Data Streaming Quick Tips", Gunnar Morling explains how (and why) to re-key a Kafka topic, and how the Decodable platform can help make that easier.
The Flink Upsert Kafka SQL Connector
In this inaugural episode of Data Streaming Quick Tips, Gunnar Morling takes a look at how to use Flink's Upsert Kafka SQL connector for propagating events from a changelog stream (created via Flink CDC and Debezium) to a topic in a Redpanda cluster.
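For orientation, a minimal sketch of the connector's shape (topic, schema, and broker address are assumptions for illustration):

```sql
-- An upsert-kafka sink: rows with the same key overwrite each other,
-- and deletes become tombstone records in the topic.
CREATE TABLE customers_sink (
  id   BIGINT,
  name STRING,
  PRIMARY KEY (id) NOT ENFORCED  -- required by the upsert-kafka connector
) WITH (
  'connector' = 'upsert-kafka',
  'topic' = 'customers',
  'properties.bootstrap.servers' = 'localhost:9092',  -- Redpanda speaks the Kafka API
  'key.format' = 'json',
  'value.format' = 'json'
);

-- Propagate the changelog stream into the topic
INSERT INTO customers_sink SELECT id, name FROM customers_changelog;
```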
Streaming Data into Snowflake with Decodable
Learn how to reduce the cost and complexity of streaming data into Snowflake with Decodable's new Snowpipe Streaming connector.
Decodable at Solutions Review's Data Demo Day
Doug Atkinson and Decodable CEO Eric Sammer discuss making streaming data engineering easier with Decodable's fully-managed stream processing platform, which allows for the real-time ingestion, integration, and transformation of data to support the development of event-driven applications and services.
Introduction to Apache Flink and Flink SQL
Join Gunnar Morling for a ten-minute introduction to Flink and Flink SQL, as you see him build a Flink pipeline to process data from one Kafka topic to another. In this example, he'll be using Redpanda's Kafka API-compatible offering to stream data into and from Flink.
The New Stack: Apache Flink for Unbounded Data Streams
Eric Sammer explains what Apache Flink is and why Booking.com, Pinterest, Stripe, Alibaba and Goldman Sachs are just a few of the companies that rely on Flink.
Eric Sammer on Software Engineering Daily
Eric Sammer is founder and CEO of Decodable and joins the show to discuss the potential of stream processing, its role in modern data platforms, and how it’s being used today.
VentureBeat: Decodable and Datastax Partnership: What does it mean for enterprises?
VentureBeat's analysis of the recent DataStax and Decodable partnership to promote streaming among joint customers.
Building a practical real-time data platform for everyone
Decodable's CEO, Eric Sammer, explores the challenges faced in building a real-time streaming data platform and how Decodable has solved them.
TFIR: Decodable Is Making It Easier For Developers To Use Real-Time Data
TFIR's Swapnil Bhartiya interviews Decodable CEO Eric Sammer about the company's mission, stream processing platform and the future of real-time data.
TechCrunch: Decodable wants to take real-time stream processing mainstream
TechCrunch coverage of the announcement of Decodable's enterprise capabilities and enhancements.
Benefits of Real-Time Stream Processing
Join David Fabritius as he explores the features and benefits of leveraging Decodable for your real-time stream processing needs. Decodable is a stream processing platform providing the simplest method for moving data anywhere with real-time speed, transformed to match the needs of its destination. As a fully managed stream processing service, Decodable provides pre-built connectors to external systems and leverages SQL to provide a familiar development experience so you can be up and running in minutes, not months.
Security and how it plays in your data infrastructure
An overview of how security works with data: join Hubert Dulay as he demonstrates approaches to protecting data, along with related security practices such as monitoring and observability.
Change Data Capture With Apache Flink
In this interview, we talk about Change Data Capture with Debezium and Flink with experts in this field as our guests: Gunnar Morling, one of the creators of the Debezium project, answers questions about CDC in general, as well as Debezium-specific questions. For deep insights into the Flink CDC Connectors project, we have Leonard Xu and Jark Wu, two long-term Flink contributors and leads on the Flink CDC project. Robert Metzger, the PMC Chair of the Apache Flink project, asks the questions.
Change Stream Processing With Apache Flink
In this demo-heavy webinar Gunnar Morling and Sharon Xie answer common questions on change stream processing:
- What is Change Data Capture (CDC) and why should I care?
- What do I gain when I integrate CDC with Apache Flink?
After this introduction, they switch to code to show how it works in a demo that includes CDC sources, stream processing, and delivery to Elasticsearch for searching.
Deploying Flink With The New Kubernetes Operator
Apache Flink PMC chair Robert Metzger, in conversation with Gyula Fóra and Mátyás Örhidi, the main contributors of the new Flink Kubernetes Operator launched earlier this year. The Flink Kubernetes Operator is an abstraction layer on top of Kubernetes that makes deployment and operation much easier than applying Flink directly to Kubernetes components. In this discussion we cover the motivation for creating a Kubernetes operator as part of the Flink project and dig into the details of using it.
Mirror Data from PostgreSQL to Snowflake
Need to get data from PostgreSQL to Snowflake? Hubert Dulay is back with another demo, this time showing how to take PostgreSQL data into Decodable and load it into Snowflake using Snowpipe.
Realtime ETL is Easier Than You Think
In this demo-heavy webinar you'll learn why streaming ETL is essential for modern businesses, why current batch architectures with monolithic data warehouses are out of date and how Decodable is approaching the challenges traditionally associated with Streaming ETL.
What's new in Apache Flink 1.16
Apache Flink PMC Chair Robert Metzger summarizes the top features shipped in the latest Apache Flink 1.16 release.
Official Flink release announcement: https://flink.apache.org/news/2022/10/28/1.16-announcement.html
Slides: https://speakerdeck.com/rmetzger/whats-new-in-flink-1-dot-16
Chapters:
0:00 Introduction
0:17 SQL Gateway
1:36 Hive Compatibility
2:15 Changelog State Backend
4:30 Overdraft Buffers and Unaligned Checkpoints
6:55 RocksDB
7:43 Lookup Joins & Async I/O
9:23 Batch Improvements
12:12 Wrap Up
Ingesting & Processing S3 changes via AWS Lambda
In this demo video, we'll be using changes in an S3 bucket to trigger AWS Lambda functions, sending records to Decodable for processing. Hubert Dulay is at the controls once more.
Realtime Join Between Confluent Cloud and PostgreSQL
Hubert Dulay is back, this time demonstrating how Decodable enables streaming joins between a PostgreSQL table streamed via CDC and a website clickstream delivered in a Kafka topic on Confluent Cloud.
Configuring Apache Kafka with mTLS - mutual TLS authentication
In this video Hubert Dulay shows how to configure mTLS (mutual TLS authentication) with Kafka, one of the most common questions the team at Decodable gets from our customers. He then uses this configuration with Decodable's stream processing platform to securely transform and transfer data with Kafka.
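As a rough companion to the video, the standard Kafka settings for mTLS look like the following. The paths and passwords are placeholders, and the exact setup in the demo (and in Decodable's connector configuration) may differ; the key idea is that the broker requires a client certificate, and the client presents one from its keystore.

```properties
# server.properties (broker side) -- require clients to present a certificate
security.protocol=SSL
ssl.client.auth=required
ssl.truststore.location=/path/to/broker.truststore.jks
ssl.truststore.password=changeit

# client.properties (client side) -- trust the broker, and authenticate with our own cert
security.protocol=SSL
ssl.truststore.location=/path/to/client.truststore.jks
ssl.truststore.password=changeit
ssl.keystore.location=/path/to/client.keystore.jks
ssl.keystore.password=changeit
ssl.key.password=changeit
```

The truststore establishes who each side believes; the client keystore is what turns one-way TLS into mutual TLS.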
Ingesting COVID data into Imply Polaris
Saketh Kurnool demonstrates Decodable's new Imply Polaris connector with a demo that pulls COVID data from a REST endpoint into a Decodable pipeline and outputs it to Imply Polaris.
Top 3 Challenges Running Multitenant Flink At Scale
In this talk, originally given at Flink Forward 2022, Decodable founding engineer Sharon Xie explores the key challenges and solutions in building a managed Apache Flink service into the Decodable stream processing platform.
Connecting Decodable To Imply Polaris
Saketh Kurnool demonstrates how to set up the new Decodable Imply Polaris sink connector to transform and send data to the Apache Druid-based cloud service.
Real-time Change Data Capture (CDC) Processing, Part 2
Building on the previous CDC video, Eric Sammer explains and demos real-time streaming change data capture (CDC) from PostgreSQL, processing in Decodable, and updating back into PostgreSQL.
Real-time Change Data Capture (CDC) Processing, Part 1
Eric Sammer explains and demos real-time streaming change data capture (CDC), processing, and ingest, taking Debezium-format change records from PostgreSQL, processing them with Decodable pipelines written in SQL, and ingesting into various target systems including S3, Delta Lake (Databricks), PostgreSQL, and more.
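For context on the Debezium format mentioned above: each change record carries `before` and `after` row images plus an `op` code (`c` insert, `u` update, `d` delete, `r` snapshot read). A minimal Python sketch with a simplified, made-up record (a real Debezium envelope also includes `source` and schema metadata):

```python
import json

# Simplified Debezium-style change event (illustrative, not a full envelope)
event = json.loads("""
{
  "before": {"id": 42, "status": "pending"},
  "after":  {"id": 42, "status": "shipped"},
  "op": "u"
}
""")

def describe(change):
    """Turn a Debezium op code into a human-readable summary."""
    ops = {"c": "insert", "u": "update", "d": "delete", "r": "snapshot read"}
    kind = ops.get(change["op"], "unknown")
    # Deletes carry only a "before" image; everything else has an "after"
    row = change["after"] if change["after"] is not None else change["before"]
    return f"{kind} on row id={row['id']}"

print(describe(event))  # update on row id=42
```

A pipeline consuming these records can route on `op` — for instance applying inserts and updates to a sink table and issuing deletes for `d` events.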
MySQL CDC to ClickHouse using Decodable's Change Stream Capabilities
In this demo John McKinnon shows how to use the new MySQL CDC (Change Data Capture) connector to create a stream of data from a static table, process the change stream with Decodable, and send the data to ClickHouse.
Processing Bitcoin data from Kafka to S3
In this video Hubert Dulay demonstrates how to load and process cryptocurrency data from Kafka to AWS S3 using Decodable's stream processing service and SQL transformations.
Synchronizing MySQL Data To Clickhouse
In this video you'll see how to sync MySQL CDC data to ClickHouse using Decodable pipelines and the ClickHouse connector to transport and transform the data.
Ingesting Wikipedia Data To Tinybird
Like most real-time analytical databases, Tinybird performs best when the data is prepared so that it can focus on low-latency queries. Decodable provides the capability to capture, enrich, and transform real-time streaming data and send it to Tinybird. This demo builds a fully managed streaming solution end to end, from the source to the real-time web application.
Ingesting IoT Data To Rockset
Rockset is a real-time analytics database capable of low-latency, high-concurrency analytical queries. It's a fully managed database service that supports all the major cloud providers. It uses a storage engine called RocksDB, an open-source key-value data store written entirely in C++ for maximum performance. RocksDB is used in many high-performance storage systems such as MySQL, Apache Kafka, and CockroachDB, and is ideal for fast, low-latency storage such as flash drives and high-speed disk drives. In this demo, we walk you through how to capture IoT data from an MQTT broker; the data contains metrics from a cell phone. We then transform it, send it to Rockset, and visualize it in a real-time dashboard.
Ingesting to Apache Pinot
Real-time OLAP (RTOLAP) databases, sometimes called streaming databases, are a special type of database designed to perform OLAP workloads on large data sets. In this demo we cleanse security logs before sending them to Apache Pinot, an RTOLAP database, using Decodable.
Ingesting Covid Data Into Apache Druid
Apache Druid is a real-time analytics database designed for fast OLAP queries on large data sets. Druid powers use cases where real-time and streaming ingestion, fast query performance, and high uptime are important. In this video, we ingest global COVID-19 statistics into Apache Druid, cleansing the data so that Druid can easily work with it. Once Druid has the COVID data, we create a real-time dashboard in Apache Superset, an open-source business intelligence tool. Real-time OLAP databases such as Druid operate faster with pre-processed data sets, reducing their workloads and enabling more focused query execution. Decodable is the ideal solution for performing this pre-processing.
Machine Learning with Apache Flink
Robert Metzger, software engineer at Decodable and PMC member of Apache Flink, asks the questions. We talk about the machine learning space in general, machine learning projects relevant to Apache Flink, and Flink ML itself: what's the status of the project right now, and what are the plans for the future?
Masking Sensitive Data
Masking is the process of obscuring information from applications or users that don't have permission to process or view it. Masking data is also a requirement enforced by regulations like the Health Insurance Portability and Accountability Act (HIPAA). Personally Identifiable Information (PII) and Protected Health Information (PHI) are examples of data that need to be masked. Real-time data masking is easy with Decodable's built-in functions.
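The description doesn't spell out Decodable's built-in masking functions, but the general idea can be sketched in plain Python: replace all but the last few characters of a sensitive field before the record moves downstream. The field names and masking rules below are illustrative, not Decodable's API.

```python
def mask(value: str, keep: int = 4, char: str = "*") -> str:
    """Mask all but the last `keep` characters of a sensitive string."""
    if len(value) <= keep:
        return char * len(value)
    return char * (len(value) - keep) + value[-keep:]

def mask_email(email: str) -> str:
    """Keep the domain, mask all but the last character of the local part."""
    local, _, domain = email.partition("@")
    return mask(local, keep=1) + "@" + domain

record = {"ssn": "123-45-6789", "email": "jane.doe@example.com"}
masked = {"ssn": mask(record["ssn"]), "email": mask_email(record["email"])}
print(masked)  # {'ssn': '*******6789', 'email': '*******e@example.com'}
```

In a streaming SQL pipeline the same transformation would typically be an expression in the SELECT list, so the raw values never reach the sink.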
The Top 5 Mistakes Deploying Flink
Learn about the five most common mistakes made when deploying Apache Flink, and how you can avoid them, from Flink co-creator and PMC member Robert Metzger. Robert is joined by Decodable CEO and streaming industry veteran Eric Sammer, who demos some of the most common stream processing patterns using SQL in a form you can reproduce yourself in minutes.
Routing OSQuery Events via Apache Pulsar
OSQuery is an open source tool that lets you query operating system events using SQL. The events can be fed into a streaming platform, in this case Pulsar, for subsequent transformation and routing on the stream using Decodable.
Converting streaming XML to JSON with Apache Kafka
In today's incredibly useful demo, Hubert Dulay shows how easy it is to use Decodable to convert XML data into JSON on Kafka topics.
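The video uses Decodable for the conversion, but the core transformation can be sketched with Python's standard library: parse the XML, walk the tree into a dict, and serialize it as JSON. The payload below is a hypothetical flat record; real feeds need handling for attributes, namespaces, and repeated elements.

```python
import json
import xml.etree.ElementTree as ET

xml_payload = "<order><id>1001</id><item>widget</item><qty>3</qty></order>"

def xml_to_dict(element):
    """Recursively convert an XML element into a plain dict (text leaves only)."""
    children = list(element)
    if not children:
        return element.text
    return {child.tag: xml_to_dict(child) for child in children}

root = ET.fromstring(xml_payload)
record = {root.tag: xml_to_dict(root)}
print(json.dumps(record))
# {"order": {"id": "1001", "item": "widget", "qty": "3"}}
```

On a Kafka topic, this function would sit between a consumer reading XML strings and a producer writing the JSON equivalents.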
This Microservice Should Have Been a SQL Statement
In an event-driven architecture, functions can be used to replace backend microservices. In this blog, we demonstrate how to use Decodable as a serverless function written in SQL, performing stateful transformations and replacing unnecessary microservices in your application. "That microservice could have been a SQL statement", made real with Decodable. This starts off as an insight into how to achieve more, faster, by replacing needless complexity with simplicity, but it's also a great educational piece on change streams, materialized views, CQRS, and stateless vs. stateful processing of data.
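To make the "microservice as a SQL statement" idea concrete, here is a toy Python sketch of the underlying pattern: folding a change stream into a materialized view — in this case a running count per key, which is exactly the state a streaming SQL engine maintains for a `GROUP BY` aggregate. The event stream and key names are made up for illustration.

```python
from collections import defaultdict

# A toy change stream: (op, user) events, e.g. orders placed or cancelled
changes = [
    ("insert", "alice"),
    ("insert", "bob"),
    ("insert", "alice"),
    ("delete", "bob"),
]

# Materialized view: live order count per user, updated as each change arrives.
# A streaming engine keeps this state for you when you write
# SELECT user, COUNT(*) FROM orders GROUP BY user.
view = defaultdict(int)
for op, user in changes:
    view[user] += 1 if op == "insert" else -1

print(dict(view))  # {'alice': 2, 'bob': 0}
```

The microservice this replaces would be the hand-written consumer loop plus its state store; the SQL statement declares the same fold and lets the engine manage the state.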
Comparing Decodable with Kinesis Data Analytics
Flink is an amazing technology; the power to process real-time data on the stream is something that everyone running Kafka, Pulsar, Kinesis, and other popular messaging platforms will want to use eventually. But with power comes responsibility, and in the case of Flink, complexity. Decodable's mission is to eliminate this complexity and make stream processing available to everyone. In this video we show the relative experience of two Flink-based cloud services: Decodable and Kinesis Data Analytics.
Opinionated Streaming Data Pipelines
Schema-on-write is a feature many engineers have used to their advantage when building data pipelines. In this demo, we will introduce this idea, and show how you can add "guard rails" to your pipelines to ensure they don't break.
Decodable Developer Experience Tour
Josh Mahonin takes us on a whistle-stop tour of the Decodable developer experience, including schema version management and updates, debugging, pipeline dependency management, and data product navigation via schemas in a data mesh setup.
Stream Processing on Confluent Cloud using Airline data
Arlo Purcell shows how easy it is to connect to Confluent Cloud as both a source and a sink, with auto-detection and import of a complex schema on both ends. In this demo he transforms airline industry-standard SSIM XML data representing flight schedules.
Connecting Apache Pulsar to Amazon Kinesis
This video shows off the new Decodable Pulsar connector as both a source and sink, followed by the usual Q&A with the team.
Processing real-time crypto transactions fed by datapm
Learn how to process cryptocurrency exchange rates in real time using Decodable's transformations on Coinbase transactions fed from DataPM.
ML Feature extraction using SQL pipeline transformations and the Moonsense SDK
See how Moonsense's SDK transmits real-time data from a device's accelerometer and touch screen to Decodable for processing before sending to the Moonsense fraud detection model.
Real-time Data Engineering With Decodable
Tim James walks through the Decodable app and then shows the power of stream processing in SQL using AWS S3 and Athena as the destination sink.
Connecting Kafka to S3
Eric Sammer and Tim James demonstrate how Decodable connects Kafka to S3 by way of Decodable's own dogfooding (using it for internal metrics), in the context of how Decodable connects to a range of systems including Redis, AWS Kinesis, Pulsar, Redpanda, Redshift, Snowflake, Snowpipe, Apache Pinot/StarTree, and more.
Decodable product capabilities and the new user experience
Decodable CEO Eric Sammer demonstrates the capabilities of the real-time data engineering platform through the new user experience.
451 Research Brief
The complexity of establishing and maintaining stream processing architectures is widely acknowledged. As the costs of real-time data have become less prohibitive, skillsets are increasingly the bottleneck to leveraging the technology. Decodable is seeking to address this bottleneck directly by letting teams establish capabilities that can filter, route, enrich or transform data streams using SQL, and easily build streaming applications.