Decodable Delivers ‘Change Data Capture’ Support to Unlock Data from Silos in Real Time, Announces Numerous New Features in Real-Time Data Platform at Current 2022
Power and Simplicity: Decodable offers a one-interface real-time data solution capable of managing dozens of data sources and terabytes of data.
Decodable, the real-time data engineering company, today announced three new capabilities in the Decodable platform at Current 2022, the next generation of the Kafka Summit, taking place today and tomorrow at the Austin Convention Center. Decodable is a powerful, easy-to-use stream processing platform, ideal for unlocking the full potential of Kafka by moving data anywhere in real time, transformed to match the needs of its destination.
“Over the summer we built three key new features,” said Eric Sammer, Decodable founder and CEO. “We’ve completed these important engineering milestones just in time for Current 2022. Individually, these capabilities are useful, but collectively they unlock new streaming scenarios. Our early-access customers have already shown the value of integrating a wide range of data sources with real-world applications and data, and the results speak for themselves. Our platform delivered streaming joins combining more than 10 tables from real-time and static data sources, processing terabytes of data at very low latency. At Current 2022, we’ll show how users can put these new features to work to transform their own data processes with real-time data.”
1. CDC Connectors Turn Traditional Databases into Streams of Events
The Decodable platform now ships with turnkey CDC (change data capture) connectors that convert relational database tables into real-time streams of records, describing every change to the table (insert, update, delete, etc.) as an event. The connector first performs a bulk snapshot, emitting every existing table row as an insert event; subsequent changes to the table then produce additional records describing each change as an event.
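The lifecycle described above can be sketched in a few lines. This is a minimal illustration, assuming a Debezium-style record shape (`op`, `before`, `after`); it is not Decodable's actual wire format, and `apply_change` is a hypothetical helper for this example only.

```python
# Minimal sketch of how CDC change events describe a table over time.
# Assumes a Debezium-style record shape ("op", "before", "after");
# the connector's actual format may differ.

def apply_change(table: dict, event: dict) -> None:
    """Fold a single change event into an in-memory table keyed by id."""
    op = event["op"]
    if op in ("insert", "update"):
        row = event["after"]
        table[row["id"]] = row
    elif op == "delete":
        table.pop(event["before"]["id"], None)

events = [
    # Initial snapshot: every existing row arrives as an insert event.
    {"op": "insert", "before": None, "after": {"id": 1, "status": "new"}},
    {"op": "insert", "before": None, "after": {"id": 2, "status": "new"}},
    # Subsequent changes arrive as further events.
    {"op": "update", "before": {"id": 1, "status": "new"},
     "after": {"id": 1, "status": "shipped"}},
    {"op": "delete", "before": {"id": 2, "status": "new"}, "after": None},
]

table: dict = {}
for event in events:
    apply_change(table, event)

print(table)  # {1: {'id': 1, 'status': 'shipped'}}
```

Replaying the full event stream from the snapshot onward reconstructs the table's current state, which is what makes a change stream a faithful stand-in for the table itself.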
CDC connectors for MySQL and PostgreSQL are already included in the Decodable platform and ready for cloud deployments in popular managed services such as Amazon RDS and Google Cloud SQL. Connectors for Oracle, Microsoft SQL Server, Snowflake and more are coming soon. Read more about CDC and Decodable.
2. ‘Multi-way Stream Joins’ Combine Multiple Diverse Data Sources
The Decodable platform now includes the capability of multi-way streaming table joins for large data sets. This feature is based on Apache Flink and includes custom optimizations by Decodable. Whereas other technologies may provide 2-way streaming table joins, Decodable has successfully executed 14-way streaming joins, as proven by customers in production who are processing terabytes of data without missing a beat. Read more about streaming joins.
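The SQL shape of a multi-way join is standard. The sketch below uses Python's built-in sqlite3 as a batch stand-in: the table names and data are illustrative, and Decodable/Flink evaluates the same kind of statement continuously over unbounded streams rather than once over static tables.

```python
import sqlite3

# Batch stand-in for a multi-way streaming join: the SQL has the same shape,
# but a streaming engine evaluates it continuously as new records arrive.
# Table and column names here are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders    (order_id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (customer_id INTEGER, name TEXT);
    CREATE TABLE payments  (order_id INTEGER, status TEXT);
    INSERT INTO orders    VALUES (1, 10, 99.5), (2, 11, 20.0);
    INSERT INTO customers VALUES (10, 'Ada'), (11, 'Grace');
    INSERT INTO payments  VALUES (1, 'paid'), (2, 'pending');
""")

# A 3-way join; a 14-way join extends the same pattern with more JOIN clauses.
rows = conn.execute("""
    SELECT c.name, o.amount, p.status
    FROM orders o
    JOIN customers c ON c.customer_id = o.customer_id
    JOIN payments  p ON p.order_id    = o.order_id
    ORDER BY o.order_id
""").fetchall()

print(rows)  # [('Ada', 99.5, 'paid'), ('Grace', 20.0, 'pending')]
```

In the streaming case each joined table may itself be a change stream, which is what makes wide joins hard to execute efficiently and what the Flink-based optimizations address.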
3. Change Stream Processing Hides the Complexity of Change Stream Records
The ability to join multiple change streams is only useful if you can perform processing on the resulting data stream. Processing individual change records emanating from multiple streams is difficult to get right, and that difficulty compounds with each additional streaming data source in the join. Decodable eliminates this challenge by abstracting change records back into their original tables, so you write industry-standard SQL as if you were processing the original table and never see the underlying change records. This massively reduces time to value and eliminates the human error introduced by processing change records directly.
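Conceptually, the abstraction materializes each change stream back into its table so queries stay ordinary SQL. The apply-and-query mechanics below are a hypothetical sketch of that idea, not Decodable's implementation, and the table and change records are invented for illustration.

```python
import sqlite3

# Hypothetical sketch of the abstraction: fold change records back into a
# table, then query with ordinary SQL -- the user never sees the records.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, city TEXT)")

changes = [  # illustrative change records from a CDC stream
    ("insert", 1, "Austin"),
    ("insert", 2, "Berlin"),
    ("update", 1, "Dallas"),
    ("delete", 2, None),
]
for op, row_id, city in changes:
    if op == "insert":
        conn.execute("INSERT INTO users VALUES (?, ?)", (row_id, city))
    elif op == "update":
        conn.execute("UPDATE users SET city = ? WHERE id = ?", (city, row_id))
    elif op == "delete":
        conn.execute("DELETE FROM users WHERE id = ?", (row_id,))

# The query reads like plain SQL over the original table.
rows = conn.execute("SELECT id, city FROM users ORDER BY id").fetchall()
print(rows)  # [(1, 'Dallas')]
```

The point of the sketch is the last line: the query never mentions inserts, updates, or deletes, which is exactly the complexity the platform hides.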
Additional Enhancements to the Decodable Platform on Display at Current 2022
Decodable has also added features to strengthen the platform’s ease of use and scalability:
- Previewing Data in Change Streams: Decodable provides a “Preview” feature to speed development of SQL queries by showing the developer the result of a query on incoming data. With this release, Preview is extended to change data streams, eliminating much of the complexity of developing and debugging with CDC events.
- Large-Scale Efficiency: The engineering team continues to optimize how Decodable provisions and manages the underlying Flink infrastructure to further scale performance, invisibly to developers so they can focus on building and running pipelines. For example, Decodable now uses virtual machines with locally attached SSD storage for increased speed.
Check Out Decodable and All the New Platform Capabilities at Current 2022
Visit Decodable, a Gold Sponsor for Current 2022, at its virtual booth and learn more about Decodable’s new capabilities in these presentations (full agenda):
- Breakout: “Streaming Is Still Not the Default: Challenges, Objections and the Future of Streaming” led by Eric Sammer, Tuesday, 3:15 p.m., Ballroom E (Also available as a Virtual Keynote)
- “Powerful Stream Processing, In Minutes Not Months,” Decodable Team, Tuesday, 4:45 p.m., Expo Theatre
- Apache Flink Meetup: “Introduction to Flink” presented by Robert Metzger, Flink PMC chair and Flink co-creator, staff engineer at Decodable, Wednesday, 1 p.m., Meetup Hub B
- Panel: “If Streaming Is the Answer, Why Are We Still Doing Batch?”, Eric Sammer, Wednesday, 4 p.m., Ballroom E (Also available as a Virtual Keynote)
Decodable’s mission is to make streaming data engineering easy. Decodable delivers the first real-time data engineering service that anyone can run. As a serverless platform for real-time data ingestion, integration, analysis and event-driven service development, Decodable requires no large data team, no clusters to set up and no complex code to write. The company is backed by Bain Capital Ventures and Venrock. To learn more, please visit decodable.co.