
In this week’s real-time analytics news: The Databricks Data + AI Summit yielded numerous announcements from Databricks and its partners.
Keeping pace with news and developments in the real-time analytics and AI market can be a daunting task. Fortunately, we have you covered with a summary of the items our staff comes across each week. And if you prefer it in your inbox, sign up here!
Databricks used this week’s Databricks Data + AI Summit to make multiple announcements relevant to those implementing real-time and AI applications and systems. The company announced:
Agent Bricks, a new, automated way to create high-performing AI agents tailored to an organization’s needs. Agent Bricks is optimized for common industry use cases, including structured information extraction, reliable knowledge assistance, custom text transformation, and orchestrated multi-agent systems. Agent Bricks is available now in Beta.
A new strategic product partnership with Google Cloud to make the latest Gemini models available as native products within the Databricks Data Intelligence Platform. This partnership empowers organizations to build, deploy, and scale AI agents using Google Gemini’s advanced capabilities directly on their enterprise data within their Databricks environment.
An early, multi-year extension of its strategic partnership with Microsoft. Through this extended partnership, customers can expect continuous advancements in Azure Databricks to meet the evolving demands of AI. Additionally, Databricks and Microsoft will continue to deliver deep integrations between Azure Databricks and the broader Microsoft ecosystem.
New capabilities in Unity Catalog that add full support for Apache Iceberg tables, including native support for the Apache Iceberg REST Catalog APIs. Databricks also introduced two new enhancements that extend Unity Catalog to business users. Business metrics and KPIs can now be defined as first-class data assets with Unity Catalog Metrics. In addition, data + AI discovery is enhanced for business users with a new, curated internal marketplace that surfaces the highest-value data, AI, and AI/BI assets organized by business domain.
The open-sourcing of the company’s core declarative ETL framework as Apache Spark Declarative Pipelines. Spark Declarative Pipelines provides an easier way to define and execute data pipelines for both batch and streaming ETL workloads across any Apache Spark-supported data source, including cloud storage, message buses, change data feeds, and external systems.
The launch of Lakebase, a fully managed Postgres database built for AI. With Lakebase, Databricks adds an operational database layer to the company’s Data Intelligence Platform. Now, developers and enterprises can build data applications and AI agents faster and more easily on a single multi-cloud platform. Lakebase is now available in Public Preview.
A $100 million investment in global data and AI education aimed at closing the industry-wide talent gap and preparing the next generation of data and AI engineers, data analysts, and data scientists. The initiative includes the launch of Databricks Free Edition, a new offering to provide everyone with free access to the full capabilities of the Databricks Data Intelligence Platform, and training to accelerate their knowledge of data and AI technologies.
The launch of Databricks One, a new experience that gives all business users simple and secure access to the data and AI capabilities offered by the Data Intelligence Platform. At the heart of Databricks One is a completely redesigned interface tailored for business users. The interface provides a simple, intuitive way for non-technical users to access what they need.
The upcoming Preview of Lakeflow Designer, which helps business analysts build no-code ETL pipelines with natural language and a drag-and-drop UI. Backed by Lakeflow, Unity Catalog, and Databricks Assistant, Lakeflow Designer lets non-technical users solve business problems without burdening data engineers with maintenance issues and governance headaches.
Other related announcements at the conference from partners include:
Amperity launched Chuck Data, an AI Agent built specifically for customer data engineering. Chuck connects directly to a user’s Databricks environment, leveraging native compute and large language model (LLM) endpoints to execute high-impact workflows like identity resolution, compliance tagging, and data profiling.
Anaconda announced a go-to-market partnership and native integration with Databricks. By combining the secure, curated open-source Python packages of Anaconda’s AI Platform with the Databricks Data Intelligence Platform, enterprises will be able to pursue enterprise-ready AI development with enhanced governance.
CData Software announced the launch of the CData Databricks Integration Accelerator, a new solution purpose-built to simplify and expedite enterprise data integration for Databricks environments. By eliminating traditional bottlenecks in data pipeline development, the Accelerator can reduce integration timelines.
Indicium announced the launch of AI Data Squads as a Service, a new delivery model that helps companies execute complex migrations, modernize platforms, improve data quality, and more. Powered by Agentic AI, the service can help streamline Databricks migrations.
Informatica announced an expansion of its partnership with Databricks. Informatica is a launch partner for two Databricks innovations: Managed Iceberg Tables and Databricks Lakebase, a modern database built for AI. Informatica also unveiled GenAI-focused enhancements to its Intelligent Data Management Cloud (IDMC) platform to accelerate data and AI at scale with Databricks.
MathCo announced the launch of a dedicated Databricks Center of Excellence (CoE) to deepen its strategic collaboration with Databricks. The CoE is designed to accelerate Agentic AI readiness and help enterprises build and scale autonomous agents using the Databricks Data Intelligence Platform.
Monte Carlo announced extended support for the Databricks Data Intelligence Platform through new integrations with Databricks AI/BI and Unity Catalog Metrics. These enhancements enable teams to maintain observability across every layer of the data + AI stack, from transformation pipelines to semantic metrics powering critical pipelines, dashboards, and products.
OneTrust announced the expansion of its partnership with Databricks with the launch of its new Data Policy Enforcement product integration. The integration enables data governance teams to automate data policy enforcement with Unity Catalog and keep up with the speed at which AI-driven systems process and utilize data.
PuppyGraph announced native integration with Managed Iceberg Tables on the Databricks Data Intelligence Platform. The integration allows organizations to run complex graph queries directly on Iceberg Tables governed by Unity Catalog, with no data movement and no ETL pipelines.
Qlik announced a series of new capabilities for customers of Databricks. The enhancements include streaming real-time data into Unity Catalog’s UniForm tables via change data capture (CDC), automated Apache Iceberg optimization through Qlik Open Lakehouse, and the creation of high-quality data products.
Sedai announced the general availability of its Databricks Optimization solution. The new offering enables cloud and data teams to reduce wasted costs, improve performance, and efficiently scale the usage of Databricks. Sedai autonomously applies its optimizations with a single click, such as right-sizing clusters or tuning auto-termination settings.
Spice AI announced an expanded partnership with Databricks and a new integration with the Databricks Data Intelligence Platform. The integration increases developer productivity, accelerates the building of AI applications and agents, and helps organizations drive more value from their data.
SqlDBM announced the release of its Model Context Protocol (MCP) Server. This component makes it easier for data teams to expose their structured data models and metadata in formats that large language models (LLMs) and generative AI tools can access. The announcement coincides with SqlDBM achieving Validated Partner status with Databricks.
Striim expanded its SQL2Fabric-X support for Azure Databricks. This enables organizations to stream operational SQL Server data into Azure Databricks’ Lakehouse architecture in real time, with full support for change data capture (CDC), inline transformations, and schema evolution.
In other Striim news, the company announced that it is expanding its Postgres offerings with high-throughput ingestion from Neon into Databricks for real-time analytics, as well as high-speed data delivery from legacy systems into Neon for platform and data modernization.
Trust3 AI announced the launch of Trust3 AI for Databricks, a trust layer solution designed to empower enterprises using Data Intelligence Platforms. Trust3 AI for Databricks enhances decision-making and system reliability while embedding governance, helping organizations confidently deploy autonomous agents and generative AI (GenAI) applications.
Real-time analytics news in brief
Amplitude debuted Amplitude AI Agents. These new agents turn Amplitude into a team of specialized experts that help accomplish specific goals, such as better checkout conversion or faster feature adoption. The AI Agents work in parallel to monitor data, spot patterns and changes, watch user sessions, form hypotheses, run experiments, ship changes, and monitor impact.
New Relic announced support for the Model Context Protocol (MCP) within its AI Monitoring solution. With this support, developers building agents that use MCP and the teams providing MCP services can access deep, actionable insights that allow them to pinpoint and resolve any issues with AI applications, reducing the need for manual effort and custom instrumentation and lowering operational costs.
Pentaho announced enhancements to the Pentaho Data Catalog, designed to help organizations achieve the data fitness needed for the AI age by increasing quality, observability, and trust in data. The new enhancements make it easier for customers to understand and organize their data, observe how that data is being used, and use automation to improve its quality, governance, and trust.
Precisely announced a series of advancements to the Precisely Data Integrity Suite, introducing new capabilities that transform data quality, observability, governance, and location intelligence. The latest enhancements enable enterprises to build the foundational data environment needed to scale AI and analytics, achieve visibility across hybrid data ecosystems, and embed trusted data into workflows for confident decision-making.
Vanta announced the launch of the Vanta AI Agent. The solution autonomously handles end-to-end workflows across a company’s entire GRC program, including identifying issues and inconsistencies individuals might miss and proactively taking action on their behalf. Launching in private beta now and generally available in July, the Vanta AI Agent automates some of the most time-consuming and error-prone GRC workflows.
If your company has real-time analytics news, send your announcements to [email protected].
In case you missed it, here are our most recent previous weekly real-time analytics news roundups:
- Real-time Analytics News for the Week Ending June 7
- Real-time Analytics News for the Week Ending May 31
- Real-time Analytics News for the Week Ending May 24
- Real-time Analytics News for the Week Ending May 17
- Real-time Analytics News for the Week Ending May 10
- Real-time Analytics News for the Week Ending May 3
- Real-time Analytics News for the Week Ending April 26
- Real-time Analytics News for the Week Ending April 19
- Real-time Analytics News for the Week Ending April 12
- Real-time Analytics News for the Week Ending April 5