
What it takes to build an AI-ready data foundation: Insights from Think 2026

Database Expert
May 8, 2026
5 min read
#Artificial Intelligence #IT automation #Compute and servers #Analytics

Intelligence for AI is evolving. Where there was once a simple emphasis on data access, the standards for AI-ready data today are higher. The enterprises advancing beyond AI experimentation into AI transformation are streaming data in real time, enriching it with context and ensuring that it’s trusted.

“Intelligence must evolve from just being available to being something that’s actionable,” Rob Thomas, IBM SVP of Software and Chief Commercial Officer, explained during his keynote address at Think 2026. “Without that, your AI will just amplify the fragmentation that exists in many organizations.”

During the conference, Thomas, fellow IBM leaders and IBM partners explained how IBM’s watsonx.data and Confluent, the leading data streaming platform recently acquired by IBM, are helping enterprises overcome fragmentation and other data challenges, driving AI performance at scale.

“Confluent is providing the real-time event backbone, capturing and distributing data as it happens,” Scott Brokaw, IBM Vice President of Product, watsonx.data integration, said during a Think session on building smarter architecture for AI. “IBM is bringing that data together, governing it, enriching it, and making it ready for AI to act on.”
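The "real-time event backbone" pattern Brokaw describes can be pictured as a publish/subscribe layer: source systems emit events as they happen, and every downstream consumer sees each event immediately. The minimal in-memory sketch below illustrates only the pattern; it is not Confluent's API, and the topic and consumer names are hypothetical.

```python
from collections import defaultdict

class EventBackbone:
    """Minimal in-memory stand-in for a streaming backbone (illustrative
    only, not Confluent's API). Producers publish events to named topics;
    each subscribed consumer receives every event as it arrives."""

    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of handlers

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Distribute the event to all consumers of this topic immediately.
        for handler in self.subscribers[topic]:
            handler(event)

# Hypothetical usage: two independent consumers of the same change stream,
# one for auditing and one feeding features to an AI system.
backbone = EventBackbone()
audit_log, ai_features = [], []
backbone.subscribe("orders", audit_log.append)
backbone.subscribe("orders", lambda e: ai_features.append(e["amount"]))

backbone.publish("orders", {"order_id": 1, "amount": 42.0})
backbone.publish("orders", {"order_id": 2, "amount": 17.5})
```

The point of the pattern is that the same stream feeds many consumers without the source system knowing or caring who they are, which is what lets governance and enrichment happen once, close to the source.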

Jay Kreps, the co-founder and CEO of Confluent, joined Thomas’ keynote to explain how companies are breaking data silos and deriving value from enterprise data. Data has “gone from something that is locked up in individual systems to something that’s really flowing across organizations and powering a lot of the actors,” Kreps said.

He pointed to success stories like Marriott, which used Confluent’s technology to unite disparate data sources to elevate customer experiences. The hotel chain brought “together all the data they had about customers—all the personalization, loyalty programs, marketing—to really unite everything that they knew about the people that were staying with them, and try and optimize that experience,” Kreps said. The initiative, he said, delivered more than $250 million in revenue.

For Suresh Visvanathan, Group Chief Operating Officer of the U.K.-based Nationwide Building Society, a retail financial services provider, leveraging timely data across the enterprise is about “both playing offense and defense.” The offense entailed offering customers personalization, and doing it fast. “If you can’t do it for a customer when they need it, it’s a little too late,” Visvanathan, a guest at Thomas’ session, said.

On the defense side, real-time data has helped the company address minor issues before they become major problems. “Observability is a big thing in our industry, like it is in many others,” he said. “And so your ability to find out if an ATM goes down, if the mobile app is not working, where in the food chain is it actually breaking down, and how do you react to it becomes mission critical.”

It’s not just large companies like Marriott and Nationwide Building Society that are moving beyond AI experimentation to see transformative outcomes. AI-forward businesses of any size, regardless of where they are in their AI journeys, can take steps to harness the value of their enterprise data through real-time context and governance.

Businesses can begin by starting small: identifying a use case with the potential to make a major impact. They can build a foundation for AI readiness, one that they can later scale by domain, through what Sean Falconer, Head of AI, Product at Confluent, calls a “fundamentally different approach to data.”

Falconer, who joined Brokaw’s session, said that instead of processing and governing data downstream, successful enterprises have “shifted that problem left closer to the source of the data, where it’s created and where it’s already streaming.”

“The result is they’re able to create real-time, reusable and universal data products,” Falconer said. “These fully governed data assets are instantly accessible the moment that they need them.”

The right approach to data can also power one of today’s most capable AI technologies: AI agents. For example, Confluent’s Jay Kreps explained that agents could be designed to handle customers’ complaints about food delivery.

“You can actually hook into the source systems to have this information, the applications and database, capture these real-time changes, be able to take the stream of complaints coming in,” he said, “and combine it in real time with all the context about the delivery,” including details about the customer and the driver.

The result? An agent has the information necessary to determine key next steps, such as issuing refunds or sending a fraud alert. “At the end, that agent has the full context of what’s happening across the business—just the things we want, not the things we don’t,” Kreps said.
“[It’s] well-governed, well-structured data to act on.”

What’s more, the same data powering the agent’s real-time decisions can also be mined for other insights that deliver additional value through watsonx.data.

“I can take the data that’s flowing in real time—that I’m acting on—and I can expose it to all the intelligence capabilities and the types of questions that are going to naturally arise,” Kreps said. “I can be able to analyze this and try and figure out what’s actually happening out there in production. What is it that our agent is doing? What are the types of customers that are coming through? … What’s the overall trend?”

“We’ve done great integration with watsonx.data, and this is incredibly powerful,” Kreps said.
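Kreps’s food-delivery example, joining an incoming complaint event with the context about the customer and the driver so an agent can decide on a refund or a fraud alert, amounts to a stream-table join. The sketch below illustrates that idea in plain Python; the field names, thresholds and decision rules are all hypothetical, invented for illustration, and do not describe any real Confluent or IBM interface.

```python
# Hypothetical contextual "tables", kept up to date in a real system by
# change streams from the source applications and databases.
customers = {"c1": {"refunds_last_30d": 4},
             "c2": {"refunds_last_30d": 0}}
deliveries = {"d9": {"customer_id": "c1", "driver_id": "drv7", "late_minutes": 0},
              "d3": {"customer_id": "c2", "driver_id": "drv2", "late_minutes": 35}}

def handle_complaint(event):
    """Join one complaint event with delivery and customer context, then
    pick a next step. The policy here is purely illustrative."""
    delivery = deliveries[event["delivery_id"]]
    customer = customers[delivery["customer_id"]]
    context = {**event, **delivery, **customer}  # the agent's full context
    # Illustrative rules: frequent refunders are flagged for review,
    # genuinely late deliveries get an automatic refund.
    if context["refunds_last_30d"] > 3:
        return ("fraud_alert", context)
    if context["late_minutes"] > 20:
        return ("issue_refund", context)
    return ("escalate_to_human", context)

action, ctx = handle_complaint({"delivery_id": "d3", "reason": "late"})
```

The join is the key step: the complaint alone says little, but combined in real time with the delivery record and the customer's history, the agent has exactly the well-governed context it needs, and nothing more.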
