Agent Bricks (beta): Modularizing the agent deployment lifecycle
Databricks has launched Agent Bricks, now in beta, built on top of its existing Mosaic AI Agent Framework. Put simply, a user can describe the task they want an agent to achieve, connect their data to the system, and Agent Bricks will handle everything required to get it production-ready. These modular “bricks” are designed to help teams tackle the messy reality of deploying AI agents into real-world environments.
Why it matters
Building an LLM-based agent is one thing – getting it into production, where it can behave reliably, scale efficiently, and stay within governance parameters, is another. Agent Bricks addresses these challenges head-on by introducing a reusable framework that includes:
- Built-in performance evaluation – Track correctness, latency, and task outcomes with standardized metrics.
- Cost optimization tooling – Choose efficient execution paths and reduce unnecessary inference overhead.
- Governance & safety controls – Implement policy-based decision logic and observe agent behaviors in a traceable, auditable way.
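To make the evaluation idea above concrete, here is a minimal sketch in plain Python of the kind of metrics an agent-evaluation harness tracks – correctness, latency, and task outcomes. The names (`EvalResult`, `evaluate`, `echo_agent`) are illustrative assumptions, not the Agent Bricks API:

```python
import time
from dataclasses import dataclass, field

@dataclass
class EvalResult:
    """Aggregates correctness and latency across test cases."""
    correct: int = 0
    total: int = 0
    latencies_ms: list = field(default_factory=list)

    @property
    def accuracy(self) -> float:
        return self.correct / self.total if self.total else 0.0

    @property
    def avg_latency_ms(self) -> float:
        return sum(self.latencies_ms) / len(self.latencies_ms) if self.latencies_ms else 0.0

def evaluate(agent_fn, test_cases) -> EvalResult:
    """Run the agent over labeled (prompt, expected) pairs, recording outcome and latency."""
    result = EvalResult()
    for prompt, expected in test_cases:
        start = time.perf_counter()
        output = agent_fn(prompt)
        result.latencies_ms.append((time.perf_counter() - start) * 1000)
        result.total += 1
        if output == expected:
            result.correct += 1
    return result

# Trivial stand-in "agent" so the harness can be exercised end to end:
echo_agent = lambda prompt: prompt.upper()
report = evaluate(echo_agent, [("ok", "OK"), ("no", "NO!")])
print(f"accuracy={report.accuracy:.2f}, avg_latency_ms={report.avg_latency_ms:.3f}")
```

In a real deployment the exact-match check would be replaced by task-specific scoring (LLM-as-judge, retrieval relevance, and so on), but the shape of the loop – run, time, score, aggregate – is the same.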
At Qubika, we see immediate value in this framework for enterprise clients pursuing task-specific agents – such as for customer service or compliance workflows. We expect Agent Bricks to significantly accelerate time-to-production while reducing the overhead of building agents.
LakeFlow Designer: Bringing no-code ETL to the lakehouse
Another major step toward democratizing the Databricks platform: the preview release of LakeFlow Designer. This new capability introduces a no-code interface for building ETL pipelines, enabling less technical users – such as analysts, operations teams, and business partners – to take part in shaping the data lifecycle.
Why it matters
Databricks has historically catered to data engineers, machine learning practitioners, and specialized roles. LakeFlow Designer lowers the barrier to entry, allowing cross-functional teams to:
- Visually build and schedule data flows
- Combine structured and unstructured sources
- Move faster without waiting on dev teams
This evolution is especially important for enterprise clients aiming to decentralize data access and build more agile, domain-driven data products. It complements the broader trend of empowering business users with tools to contribute to analytics and AI development – without introducing risk or data quality issues.
MLflow 3.0: Lifecycle management built for generative AI
The third major announcement is the release of MLflow 3.0, which marks a pivotal upgrade for teams operationalizing large language models and generative AI systems.
Key capabilities include:
- Prompt versioning – Store, track, and compare prompt engineering iterations in a version-controlled manner.
- Hierarchical observability – Visualize and monitor complex agent flows, including tool use, retrieval steps, and output chains.
- Expanded integration with the Databricks platform – Align model tracking, evaluation, and deployment with Unity Catalog and Workflows for tighter control and scalability.
Why it matters
At Qubika, MLflow has been a cornerstone of many of the production-grade AI systems that we have built. With this release, we expect to gain deeper visibility into prompt behavior, improved tools for lifecycle accountability, and better infrastructure to manage agents over time.
As generative AI systems grow more complex and dynamic, this level of transparency and structure will be a long-term benefit.
Lakehouse Lakebase: fully managed Postgres for intelligent applications
Databricks also introduced Lakebase, a new fully managed Postgres-compatible engine designed to support the needs of intelligent applications running directly on the lakehouse. Lakebase builds on Databricks’ recent acquisition of Neon, which closed in May.
Why it matters
Databricks is essentially arguing that the requirements of AI systems will give rise to an entirely new architecture. As CEO and Co-founder Ali Ghodsi put it in his keynote: “We think that there is going to be almost a new architecture going forward, almost like a new category. We call this the Lakebase”. Its core advantage is the separation of storage and compute.
Lakebase is a signal that Databricks is moving beyond just analytics into operational and transactional territory – important for building next-gen apps that blend data, AI, and user interactivity.
Databricks Free Edition: Learn and build on the production platform
Finally, Databricks launched a Free Edition, making it easier for developers, students, and data professionals to experiment and learn – on the same infrastructure trusted by enterprise customers.
Why it matters
Removing friction from onboarding and experimentation is essential to expanding the AI ecosystem.
Looking ahead: Enterprise-grade AI is the norm
Across all of these announcements, we can see how Databricks is providing the building blocks to make agent-based systems usable, observable, and governable at scale.
This direction aligns closely with where we’re headed at Qubika. Our focus has always been on helping clients bridge the gap between proof of concept and production – and these new tools will help accelerate that journey, reduce risk, and improve ROI across the board.
We’ll continue tracking announcements throughout the week and sharing insights on what this means for the future of enterprise AI.
Join me for my talk on building AI agents in financial services today
Later today, I’ll be speaking at 4:30pm in Theater 4 at the summit about our approach to building AI agents using the Databricks platform – and the success we’re having with clients. If you’re at the event, make sure to join me. You can see more about our activities at the Databricks Data + AI Summit here (you can also sign up here for our happy hour, “Data on the Rocks”, on Wednesday evening!)