This year’s Current conference highlighted the need to bring real-time data and AI together. The keynote and many sessions dug into this topic for the streaming data community.
For years, the annual Current conference, hosted by Confluent, has been the go-to event for anyone interested in real-time data streaming and data in motion, with a core focus on Apache Kafka and related technologies. This year's conference, like last year's, expanded that focus to address the growing need to integrate real-time data with AI.
In fact, this year's conference theme was "Where Real-Time Data and AI Come Together." To that point, Jay Kreps, Confluent's CEO, used his opening keynote address to discuss this trend and how both the conference and the industry have changed over the past few years.
One point he raised was that the definition of being "data-driven" has shifted from merely deriving insights to taking action based on data. Acting on data has always been a foundational element of real-time systems. For example, credit card fraud protection has shifted over the years from identifying fraud to preventing it in real time. Similarly, newer applications, such as autonomous vehicles, must respond in sub-second time frames to events as they develop (e.g., a pedestrian stepping in front of the car or a change in road conditions).
Increasingly, that automatic reaction to events is driven by AI, which again demands tight integration between real-time data and AI. Kreps noted that building AI systems that can automate business processes is challenging because they require a different paradigm from traditional software development. The quality of the data available to an AI system is crucial: it guides the model and determines the quality of the output. What's needed are effective data pipelines that prepare data from various systems in a form AI agents can use. Traditional batch processing approaches, he noted, are not suitable for real-time decision-making.
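To make the pipeline idea concrete, here is a minimal sketch of such a streaming pipeline built with Kafka Streams, one common tool for this kind of work (used here purely as an illustration, not as a tool named in the keynote). The topic names raw-events and ai-context, the broker address, and the normalization step are all hypothetical stand-ins:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class ContextPipeline {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "ai-context-pipeline");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Hypothetical topics: raw events in, cleaned AI-ready context out.
        KStream<String, String> raw = builder.stream("raw-events");
        raw.filter((key, value) -> value != null && !value.isBlank())
           .mapValues(value -> value.trim().toLowerCase()) // stand-in for real normalization
           .to("ai-context");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

The point of the pattern is that context is cleaned and delivered continuously as events arrive, rather than in periodic batch jobs.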
See also: Data Streaming’s Importance in AI Applications
Conference Focus Reflects the Growing Need for Real-Time Data
Conference sessions explored some of the key issues in data streaming today. Apache Kafka remained a top topic of conversation.
Several sessions looked into Kafka 4.0, which was released earlier this year and whose key benefits include simplicity, scalability, and performance. New features and capabilities in the 4.0 release open Kafka to a broader range of use cases. For example, Kafka now accommodates more messaging-style workloads (e.g., work queues and task dispatch) in addition to its traditional event-log and stream-processing roles, as the sketch below illustrates.
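As a rough illustration of the work-queue pattern, here is a minimal sketch using Kafka's standard Java consumer API, where every worker process joins the same consumer group and the brokers spread the topic's partitions (and thus the tasks) across them. The tasks topic, group name, and broker address are hypothetical:

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class TaskWorker {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "task-workers"); // all workers share one group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false"); // commit only after work is done

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("tasks")); // hypothetical work-queue topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("worker handling task %s%n", record.value());
                }
                consumer.commitSync(); // acknowledge the processed batch
            }
        }
    }
}
```

Kafka 4.0 also shipped an early-access version of share groups (KIP-932, "Queues for Kafka"), which relaxes the partition-based parallelism shown above and adds per-message acknowledgement for exactly these messaging-style workloads.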
Other sessions looked at how AI and streaming data work together. As one speaker put it, AI is the demand driver and streaming is the substrate. The observation echoes a now-familiar theme: reliable real-time data is foundational for next-generation AI.
See also: Confluent Launches Data Streaming for AI and Adds Apache Flink
New Solutions to Power AI and Real-Time Data
Confluent used the conference to announce new and enhanced solutions for the AI and real-time data space.
It announced the launch of Confluent Intelligence for building and powering context-rich, real-time AI. Built on Confluent Cloud, the fully managed stack continuously streams and processes historical and real-time data, delivering that context directly into AI applications. With Confluent Intelligence, businesses can build AI systems grounded in dynamic, trustworthy data.
It also announced the general availability of Delta Lake and Databricks Unity Catalog integrations in Confluent Tableflow, along with early access on Microsoft OneLake. Those features make Tableflow a fully managed, end-to-end solution that connects operational, analytical, and AI systems across hybrid and multicloud environments. Additionally, Confluent now integrates Apache Kafka topics directly into Delta Lake or Apache Iceberg tables, with automated quality controls, catalog synchronization, and enterprise-grade security.
The company also introduced its Real-Time Context Engine, a fully managed service built on the Model Context Protocol (MCP). The service delivers real-time, structured data and accurate, relevant context to any AI agent, copilot, or large language model (LLM)-powered application. With the Real-Time Context Engine, developers can let Confluent's data streaming platform manage their data infrastructure and quickly unlock trustworthy context for all their AI agents and applications, anywhere.
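For a sense of what MCP looks like on the wire, here is a hypothetical sketch of a client invoking an MCP tool over HTTP. MCP is based on JSON-RPC 2.0; the endpoint URL, the get_context tool name, and its arguments are invented for illustration, and the MCP initialization handshake a real client performs first is omitted for brevity:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class McpContextCall {
    public static void main(String[] args) throws Exception {
        // Hypothetical MCP endpoint; a real client would first run the
        // MCP initialize handshake, omitted here for brevity.
        String endpoint = "https://example.com/mcp";

        // JSON-RPC 2.0 request using MCP's tools/call method;
        // the tool name and arguments are invented for this sketch.
        String body = """
            {"jsonrpc":"2.0","id":1,"method":"tools/call",
             "params":{"name":"get_context",
                       "arguments":{"customerId":"42"}}}""";

        HttpRequest request = HttpRequest.newBuilder(URI.create(endpoint))
                .header("Content-Type", "application/json")
                .header("Accept", "application/json, text/event-stream")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // JSON-RPC result carrying context for the agent
    }
}
```

In practice, an agent framework would handle this exchange through an MCP client library rather than raw HTTP calls.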
Tying everything together, Kreps reminded attendees that Confluent was started to help information move freely across a business so that companies can act in real time. Such capabilities are needed more than ever in the age of AI. "Off-the-shelf models are powerful, but without the continuous flow of data, they can't deliver decisions that are timely and uniquely valuable to a business. That's where data streaming becomes essential," said Kreps.