Q&A: The gap between AI ambitions and data readiness
Ataccama CEO Mike McKee discusses why most organizations aren't ready for AI, why ROI failures are largely a people problem and what the data management industry keeps overlooking.
The data management industry is deep into a cycle of AI-driven product launches and bold claims about agentic automation, governance at scale and intelligent data pipelines. But industry benchmarks suggest most organizations are far from ready, with AI investments struggling to deliver ROI and confidence in data governance remaining low.
Mike McKee became CEO of Ataccama in 2023 after 30 years of building software companies across cybersecurity and data management. At Ataccama, he oversees a unified data trust platform spanning data quality, governance, catalog, observability and master data management.
The following interview, conducted shortly after the Gartner Data & Analytics Summit in March 2026, covers the gap between industry optimism and organizational reality.
Editor's note: This interview has been edited for length and clarity.
At Gartner, I spoke with people from many industries, but not from the vendor side. Are the pain points customers are raising top of mind for you, or are these just growing pains because we're still in the early days?
Mike McKee: I do believe it's early days on AI and agentic right now. Every single booth out there, ours included, said AI readiness, agentic, durability and governance.
The same thing happened in November 2022 when ChatGPT came out, and it was all the business folks and CEOs asking, 'Are we using AI?' And the data team's saying, 'Well, hold on. Look, our data's all over the place. We've got to get our data in control.'
I just think it's increased the intensity. The foundation has to be solid, and the foundation is data. I come back to people, process and technology, and you might be saying, 'This is a technology vendor,' but it's still figuring out the people and the process side behind that foundation. That's, in some regards, the hardest piece. I'll talk about AI and agents, but companies are far behind that.
I spoke to the CEO of Zoetis yesterday, and my jaw was on the floor with the amount of automation that they have in agents. They started a pilot with 1,000 people last May; today, 12,000 people across Zoetis are using agents. Instead of GenAI, they call it ZenAI, which is pretty funny.
I also think some people throw agents and AI above a workflow that already existed that's actually not particularly agentic. But there are certainly organizations that are there. I completely believe that all organizations have to figure out how they're going to get there because it's going to happen, and it's going to be incredibly beneficial for organizations at so many different levels.
There's a lot of foundation that has to happen first.
Some stats from Gartner: Only one in five AI investments is delivering a positive ROI, just 14% of organizations feel confident that their data is properly governed, and most still can't connect data initiatives to business outcomes. Yet everyone at the summit was very optimistic and scaling up regardless. How do you reconcile these two perspectives?
McKee: On the first point, I actually don't think there's been that many investments yet. There's still a challenge, and I go to our own organization. Whose budget does AI come from? It's just a practical challenge. If [our head of communications] wants budget for articles and press releases, that comes to me and our head of finance. She says, 'Last year I had $1 million, and this year I want $2 million.' If sales wants more sales reps, they ask for capital expenditure for that. So [AI spending] is an IT budget.
We have an IT budget for Claude and OpenAI, but it's along with all the other IT budgets for Slack and AWS. Do we measure whether we have a positive ROI on our Slack investment? We don't actually. I think a lot of organizations are still trying to figure out where the AI budgets should sit and who to hold accountable for positive ROI against that budget. So that's the early days in my view.
When it comes to connecting AI performance to business outcomes, I would say 99% of organizations can't connect data initiatives to business initiatives, let alone AI initiatives. And that's changing for sure. Fortunately, it's a known problem. We push customers on this. You got better data quality, so what? You're mastering data, so what? You got governance, so what? People haven't been pushed on those questions historically, and now they're being pushed.
So that is moving in the right direction and is essential. Now the initiative's coming from the business side, and that's where you get a more discrete initiative that you show value against. And we're pushing for those discrete initiatives. Data leaders are realizing that they need to show momentum; they need to show that connection if they want to properly roll something out.
I think that's a known problem with an easier fix, which is basically just involving business. And I'll tie it into one of the biggest trends that I saw at [Gartner]. Data stewards are in the business. Data stewards used to be data engineers who put on a data steward hat for a little while. They were really data people. Now they're business people who have to increase their data literacy and have to use data management products that they can contribute to and really convey what they need from the business side. Because almost every organization says, 'data sucks.' Okay. What does that mean? What sucks? What report sucks? When are you making bad decisions? When are you not meeting regulatory requirements?
And that requires someone on the business side moving past 'data sucks' to 'We don't have the right unit measures.' Back to Zoetis as an example: Our different drugs are used on animals in different countries around the world. Therefore, we don't know, from a supply chain perspective, a pricing perspective or a customer perspective, how to measure the different doses of drugs that we hand out for pets. We have to have data quality standards and make sure data comes in at a standard rate so that we are selling the same amount of pharmaceutical drugs to people in different situations. It's not until you get someone on the business side who can actually articulate that pain that you can say, 'Now I understand what the data initiative was for, and now I understand how to make this better.'
When you took the job in 2023, you identified a tremendous need to consolidate data, make it securely available and leverage AI for better decisions. Three years later, the numbers suggest that most organizations aren't there. Is this moving slower than you expected?
McKee: Short answer, yes.
I was on a call yesterday with an SVP at Warburg Pincus. He arrived there 13 years ago, after working in different industries, brought in to drive data governance, data quality and data processes.
Thirteen years later, he's getting there. For the first five years, he had to digitize Warburg Pincus and move everything from piles of paper in people's offices into a data warehouse. So for five years, it was literally digitizing information. Most companies are already past that point now, but the data just sits in too many different places.
I think the analog to that is people moving to cloud data warehouses now, like Snowflake and Databricks, because they realize they can't do much with data spread across a whole bunch of different sources. That process takes time, whether it was digitizing information 10 years ago or, over the last five years, moving it into a cloud data warehouse.
The next thing he did at Warburg Pincus was establish reporting. Because the report used to be: go to one smart person and say, 'Give me a report that looks exactly like this, in a PDF format.' And [Dave McNally, SVP of Warburg Pincus] said, 'You might want a real-time report.' There's this thing called Power BI where you can actually click on it, and you don't have to go find another person to have them create a PDF for you and send it in exactly your format. This information can be in real time. So that was the next five years: setting up reporting and setting up Power BI. And now some people are starting to question the quality of those reports. And now we've got to fix quality.
To bring that back to the earlier example, I think it takes time to get the data into cloud data warehouses. But I don't think we're going to see that five-year reporting time frame, because if you're using natural language and saying, 'I want the customer information,' it will be quicker to get reports from the data that exists.
What follows from that is the work on the quality and governance sides. I think the fact that data is moving from cloud data warehouses into AI models will speed everything up. It has taken a long time, and it will take a long time, because data is messy. It's in a lot of different places.
Imagine trying to sort out your garage while someone's dumping another truckload of stuff in there every day. You're like, 'Okay, I think I've got everything on shelves now.' Next thing you know, there's more old sports equipment, bikes and lawnmowers being thrown in there. What you're trying to organize is a moving target, and the pace is increasing in terms of the amount of data being dumped in. Getting it under control, secured and governed is a hard job, and it takes time.
A phrase that keeps coming up is 'AI magic.' And depending on where you stand in an organization, there are different implications for what that could mean to you. The pattern I keep hearing is that executives see an ROI projection on paper, and the end user who gets handed the tool gets very little workflow integration. The result is resentment from non-technical users, despite how useful these tools are for their work. How much of this ROI failure problem do you think is an implementation and change management problem masquerading as a tech problem?
McKee: I'm debating between saying over 50% or over 80%. At a minimum, it's over 50%. Quite likely, it's over 80%. It's just technology. People get set in their ways. Like the earlier example of going to someone else for the report: Why would I go to Power BI? And why isn't that data any good?
I think people get caught in their ways. Not everybody. When I say people, process and technology, technology comes first. I've never had anybody disagree with that. And people naturally gravitate toward what's easier. You go plug something in, right? The technology doesn't talk back at you. Technology doesn't work around you. People work around you, and people talk back to you.
We have a phenomenal platform. And all too often we're like, 'Hey, here's our phenomenal platform.' And people say, 'Well, tell me how to roll out and implement that,' and we say, 'We can do whatever you want.' I'm like, don't say that. Yes, that's true, but go with the recommendation and tell a story. What are the baby steps toward getting value from our data trust platform or our data management platform?
And do I set up quality first? Do I set up observability first? Do I connect with my governance tool first? Do I do MDM first, or data quality first? There is no single answer. But what we're trying to do is orient more and more things around business use cases by vertical, really thinking about it in the context of what business processes are already happening within one of these organizations.
Some progress is already happening where there's pain, and saying, 'Let's go target that process and make that process better,' as opposed to, 'Here's a tool set that you can do whatever you want with, wherever you want in your organization.'
Too often, people buy the tool set, and they plug the tool set into their cloud data warehouse, their reporting systems and the other data management products that might be there. That can take a long time. You get sources lined up, get ETL going, spend a year setting it up, and you forget why you're setting it up at all, as opposed to starting from a discrete use case or a discrete business case.
I recently spoke with someone who described being locked into a vendor ecosystem despite being unhappy with the product. Too entrenched to switch, too costly to migrate. Her point was that last year's offerings were the best available at the time, this year's are better, and next year's will be better still. Does this arms race of product offerings really serve the people trying to make long-term infrastructure decisions?
McKee: My first reaction is, that's software, right? I was in cybersecurity for 10 years, but the same thing exists here. When I was in software before – this is going back 20 years in the community design software area – I felt like the vendor job was harder than the purchaser job because purchasers knew exactly what they wanted and vendors were trying to do that. It was sort of a rock fight to see who could build a better tool at a better price for that particular use.
Now, tools are getting broader; there are more of them, and there's more competition. The job of the CDO or CDAO is hard because all vendors say they do everything.
There are demonstrable improvements coming out more and more quickly. With AI involved in the development process and the growth of cloud, you're going to see even faster, less expensive development. I absolutely think it's going to happen.
Two things I would say. One, I saw this at Gartner, and it's the first time I've really heard it so loudly: People want a unified platform. You could chase best-of-breed across each of those different spokes, but you'll get dizzy trying to pick the best of all of them, and you'll get exhausted trying to hook them all together. Better to go with a vendor that has a single unified platform that maybe isn't the best at every spoke, but it's connected, and you're working with one vendor.
Now, the argument against that is vendor lock-in. I think that's 'pick your poison' a little bit. That's a risk you have to take to a certain extent. And my second major point on a unified platform is you've got to choose the partner that you trust, and that's going to be super responsive to what you need and has the resources and capabilities to keep up with the others.
A lot of conversations in data management circles keep coming back to governance and AI. And while they're both very important, they've become the default answer to most hard questions about enterprise data. Is there something the industry needs to grapple with that keeps getting drowned out by those two conversations?
McKee: Usability, to a certain extent. One of the things we've tried to do, and this is where AI, or what we call automated intelligence, is really helping: The amount of data is doubling every three years, and the number of data management professionals is staying the same. That math doesn't work. We put in a ton of effort, and continue to put in a ton of effort, to make the product more usable.
That's hard, right? Because there's so much complexity, and you've got data engineers who want all the knobs to turn and tweak, but we've got to find a way to make it more usable and more efficient, so you're leveraging AI more. You've got natural language rule writing. If I write a rule that all people in this field must be at least 21 years of age, violating records get flagged. We've got natural-language rule descriptions that come with that, where the description is generated automatically. Sure, the person on the data team might have to tweak it a little bit, but it's 80% done.
We have rule suggestions against different critical data elements. If I put bank transactions in, the different rules that are often applied to bank transactions come up. We're trying to make the tool as usable as possible and as integrated as possible across the platform, so you're not learning something new between governance, catalog, rule creation, observability, mastering the data, whatever it might be.
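The kind of rule McKee describes, expressed in natural language and then applied to records, boils down to a simple validation check. The sketch below is purely illustrative, not Ataccama's implementation; the function, field names and sample data are all hypothetical.

```python
# Illustrative sketch of a data quality rule like the age check McKee
# describes. NOT Ataccama's implementation; all names are hypothetical.

def check_min_age(records, field="age", minimum=21):
    """Return indices of records where `field` is missing, non-numeric,
    or below `minimum` -- i.e., the records a rule engine would flag."""
    flagged = []
    for i, record in enumerate(records):
        value = record.get(field)
        if not isinstance(value, (int, float)) or value < minimum:
            flagged.append(i)
    return flagged

customers = [
    {"name": "Ana", "age": 34},   # passes
    {"name": "Ben", "age": 19},   # below minimum -> flagged
    {"name": "Cleo"},             # missing age -> flagged
]
print(check_min_age(customers))  # -> [1, 2]
```

A real platform would generate a rule like this from the natural-language statement, attach an auto-written description and report the flagged records in a quality dashboard; the logic at the core is this kind of per-record predicate.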
Being able to stand up against those business use cases quickly, that goes back to usability and having a multitenant SaaS platform that doesn't require lots of on-premises hardware or single-tenant environments.
And usability absolutely applies to governance, because governance can be hard. When can [users] see something? Is there PII? Is there security data that's flowing through? That has to be digestible because that's part of the challenge. And MDM has always been an example of that: It's hard to get it all right.
You've got to be able to set it up in a reasonable amount of time, maintain it in a reasonable way, and have it be usable by lots of different people in the organization. That certainly applies to governance, and it goes back to where we started the conversation on AI: You have to make sure you're ready for AI, and data is the foundation you're working with.
Scott Thompson is a site editor for TechTarget's Data Technologies group.