Confluent Launches Data Streaming for AI and Adds Apache Flink

With the introduction of Flink, Confluent customers will be able to do more with the data processed through the streaming platform.

Confluent made several major announcements at its Current event on Wednesday, including a fully managed Apache Flink service on Confluent Cloud and a Data Streaming for AI initiative.

Confluent is one of the premier event streaming platforms built on Apache Kafka, and the addition of Flink reflects growing demand from businesses for real-time data processing. Flink is considered the more advanced of the two frameworks, with more flexibility and a richer feature set. Many businesses use both, as they are complementary: Kafka transports and stores event streams, while Flink runs continuous computations over them.

“Companies globally are leveraging data streaming with Apache Kafka to power real-time experiences,” said Shaun Clowes, Chief Product Officer at Confluent. “While data streaming connects data across an organization, stream processing tools like Apache Flink make it possible to act on that data in real time and accelerate the development of new use cases. Our managed Kafka and Flink offerings are like the peanut butter and jelly of data streaming. They work better together on a unified platform to connect and enrich data so teams can uncover more insights and deliver impactful experiences that move the needle.”

The addition of Flink means Confluent customers will be able to do more with the data flowing through the streaming platform. As the company’s press release notes, Flink is the framework underlying many familiar processes, such as matching drivers and riders on ride-sharing apps, alerting banks to fraudulent activity, and personalizing recommendations on social media platforms.
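To make the Kafka-plus-Flink pairing concrete, here is a minimal sketch in self-managed PyFlink (not Confluent’s new managed service): Kafka carries the event stream, and a continuously running Flink SQL query processes it, in this case flagging large transactions in the spirit of the fraud-alerting use case above. The topic name, broker address, schema, and threshold are illustrative assumptions.

```python
from pyflink.table import EnvironmentSettings, TableEnvironment

# Streaming TableEnvironment; the Flink Kafka SQL connector JAR must be on the classpath.
t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Expose a Kafka topic of card transactions as a Flink table (hypothetical schema).
t_env.execute_sql("""
    CREATE TABLE transactions (
        account_id STRING,
        amount     DOUBLE,
        ts         TIMESTAMP(3)
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'transactions',
        'properties.bootstrap.servers' = 'localhost:9092',
        'scan.startup.mode' = 'latest-offset',
        'format' = 'json'
    )
""")

# Continuously flag unusually large transactions as they arrive on the topic.
t_env.execute_sql("""
    SELECT account_id, amount, ts
    FROM transactions
    WHERE amount > 10000
""").print()
```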

Confluent Cloud for Apache Flink is available in open preview for AWS customers. Confluent said general availability is coming soon but did not give a date.

See also: 12 Streaming Analytics Solutions to Consider in 2024

Data Streaming for AI 

Confluent also announced its Data Streaming for AI initiative at Current, which aims to accelerate the development of real-time AI applications through ecosystem partnerships and the company’s data streaming expertise. The initiative comes as businesses across all industries look for ways to integrate AI into their portfolios, especially tools such as generative AI and large language models.

To make its platform more accessible for AI development, Confluent has expanded its partnerships with AI and vector database companies, including MongoDB, Pinecone, Rockset, Weaviate, and Zilliz. It is also building on its partnership agreements with Google Cloud and Microsoft Azure specifically around the AI market. One application already under development is a Copilot Solution Template, which gives an AI assistant the capability to perform business transactions and send real-time updates.
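As a rough illustration of the kind of pipeline these partnerships target (a generic sketch, not a Confluent reference architecture), the snippet below consumes events from a Kafka topic, generates embeddings with the OpenAI API, and upserts them into a Pinecone index so an AI application can retrieve fresh, real-time context. The topic, index, model, and field names are assumptions made for illustration.

```python
import json

from kafka import KafkaConsumer   # pip install kafka-python
from openai import OpenAI         # pip install openai (reads OPENAI_API_KEY from the environment)
from pinecone import Pinecone     # pip install pinecone-client

openai_client = OpenAI()
index = Pinecone(api_key="YOUR_PINECONE_KEY").Index("support-tickets")  # hypothetical index

consumer = KafkaConsumer(
    "support-tickets",                       # hypothetical Kafka topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    ticket = message.value
    # Embed each event as it arrives so the vector store always has current context.
    embedding = openai_client.embeddings.create(
        model="text-embedding-ada-002",
        input=ticket["text"],
    ).data[0].embedding
    index.upsert(vectors=[(str(ticket["id"]), embedding, {"text": ticket["text"]})])
```

In production, this movement of data is more commonly handled by a managed connector than a hand-rolled consumer loop, but the shape of the pipeline (stream in, enrich, index for retrieval) is the same.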

“Data streaming is foundational technology for the future of AI,” said Jay Kreps, CEO and Cofounder, Confluent. “Continuously enriched, trustworthy data streams are key to building next-gen AI applications that are accurate and have the rich, real-time context modern use cases demand. We want to make it easier for every company to build powerful AI applications and are leveraging our expansive ecosystem of partners and data streaming expertise to help achieve that.”

Many machine learning and large language models have been trained on batch data, but there is a growing need for real-time processing. Predictive fraud detection and personalized recommendations are two use cases that require real-time data to work correctly, and Confluent says it is committed to adding more capabilities to its platform to address this growing need.

See also: Kafka Training is Essential in Today’s Real-time World

Data Portal in Stream Governance Suite 

One of the challenges developers face in the discovery stage of Kafka-based data processing projects is finding and accessing the relevant data streams. Confluent Data Portal, an expansion of the company’s Stream Governance suite, aims to guide developers through this process with a self-service interface that can surface all of the data streams running through an organization, filtered by topic, tag, and metadata.

From the portal, developers will be able to request access to data streams directly in the Confluent Cloud user interface, as well as build and manage data products.

Confluent has not set a date for Data Portal’s availability. When it launches, it will be included in the Stream Governance suite for existing customers.

Enterprise Clusters for Private Networking 

Confluent introduced Enterprise clusters for cloud customers who require private networking. The clusters give teams a way to establish communication between their private networks and Confluent Cloud without exposing data to the public internet. They are built on the Kora Engine, the cloud-native Apache Kafka engine Confluent uses to autonomously manage thousands of its Confluent Cloud clusters.

Enterprise clusters are currently available through AWS PrivateLink, and Confluent intends to support more networking constructs in the near future.

AI Assistant 

It wouldn’t be a product event without some mention of generative AI, and Confluent hasn’t disappointed. It announced the launch of the Confluent AI Assistant, which will use natural language processing to answer customer queries, ranging from billing and payment questions to requests for code.

The AI Assistant has been trained on publicly available data, Confluent documentation, and contextual customer data. Confluent did not say whether it developed the assistant in-house or built it on one of the many large language models available from cloud service providers.

Confluent said the assistant will be available in 2024 but did not give a more specific date.

AI for Flink SQL 

Confluent had a bit more to add to its Apache Flink announcement with the introduction of AI capabilities for Flink SQL, which are expected to roll out in stages over the next few months. At the Current event, Confluent showed how developers can call the OpenAI API directly from Flink SQL. Integrating AI into Flink SQL opens up many use cases, including sentiment analysis and summarization.
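Confluent did not share the exact syntax it plans to ship, but the general pattern can be sketched today with self-managed PyFlink and a Python user-defined function that calls the OpenAI API. Treat the table, function, and model names below as assumptions, not Confluent’s forthcoming Flink SQL interface.

```python
from openai import OpenAI   # pip install openai (reads OPENAI_API_KEY from the environment)
from pyflink.table import DataTypes, EnvironmentSettings, TableEnvironment
from pyflink.table.udf import udf


@udf(result_type=DataTypes.STRING())
def sentiment(text: str) -> str:
    # Classify the sentiment of one piece of text via an OpenAI chat completion.
    # Creating the client per call keeps the sketch simple; a real job would reuse it.
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{
            "role": "user",
            "content": f"Classify the sentiment of this review as positive, negative, or neutral: {text}",
        }],
    )
    return response.choices[0].message.content


t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())
t_env.create_temporary_function("SENTIMENT", sentiment)

# 'reviews' is a stand-in source; in practice it would be backed by a Kafka topic.
t_env.execute_sql("""
    CREATE TEMPORARY TABLE reviews (review_text STRING)
    WITH ('connector' = 'datagen', 'fields.review_text.length' = '20')
""")
t_env.execute_sql(
    "SELECT review_text, SENTIMENT(review_text) AS label FROM reviews"
).print()
```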

Confluent Terraform Update

To finish off the updates, Confluent said it is improving its Terraform provider by integrating HashiCorp Sentinel, a policy-as-code framework for enterprise products. The framework lets organizations enforce policy decisions at multiple enforcement levels and define condition-based policies. The integration gives customers control over every stage of the infrastructure lifecycle.
