FinOps is no longer just about optimizing cloud spending. Hear from experts on what makes FinOps a desirable option to help businesses manage their AI costs.
While the growing use of AI is driving up costs across enterprises, many businesses aren't seeing returns on investment. PwC surveyed more than 4,000 chief executives for its 29th Global CEO Survey, released in January 2026, and 56% said they've realized neither revenue nor cost benefits from AI.
Given those fiscal realities, enterprise leaders are demanding more detailed accounting and a better understanding of their AI spending. That demand in turn spurred the creation and growing use of FinOps for AI.
"AI costs are showing up everywhere," said Rob Martin, a fellow at the FinOps Foundation. "AI costs are showing up in cloud [bills], the SaaS products you're using, in licensing agreements and in credit card reimbursement statements submitted by employees. Now FinOps teams are being asked to look at the remit to understand what [the enterprise is] spending on AI overall and what value it's getting."
FinOps expands to AI
If these dynamics sound familiar, that's because they are. Today's concerns about AI costs mirror concerns business leaders had about cloud computing about 15 years ago. Back then, organizations moved en masse from on-premises compute and storage to cloud-based options. But unlike previous technical investments, cloud deployments didn't come with predictable bills -- rather, they involved ongoing, variable costs.
The need to better manage unpredictable cloud costs gave rise to FinOps, a discipline that brings financial accountability to cloud spending. FinOps creates shared ownership of cloud costs among engineering, finance and business teams who work together to manage costs and ensure companies maximize the business value of their cloud spending. The FinOps Foundation, formed in 2019, created a framework and established best practices, training programs and professional certifications to support the discipline.
Since the discipline's creation, FinOps has helped organizations realize business benefits ranging from cost savings to service improvements.
FinOps has since expanded in scope, going beyond management of public cloud spending to SaaS, data platforms and even on-premises computing. More recently, it has taken on AI bills, giving rise to a new FinOps for AI discipline.
"The same principles we are using for cloud FinOps around observability, reporting and accountability are still relevant, but now you need to expand to include an understanding of tokenomics, silicon usage for training and inference, and new governance models to drive consistency and compliance," said Shawn Lund, managing director with Deloitte Consulting.
That makes sense, as the same dynamics that produced variable cloud bills are present with generative AI and agentic AI. Most organizations have been using analytics, machine learning and AI for much of the past decade, if not longer. But those workloads are largely predictable. That's not the case with generative AI and agentic AI, whose bills depend on myriad factors -- tokens processed, API calls, computing hours and training needs, for example -- that can change from one AI use to the next.
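To make the variability concrete, here is a minimal sketch of token-based billing. The model names and per-token prices are invented for illustration; real provider pricing differs, but the structure -- separate input and output token rates that vary widely by model -- is typical.

```python
# Hypothetical per-1K-token prices; real provider pricing varies by model.
PRICING = {
    "small-model": {"input": 0.0005, "output": 0.0015},
    "large-model": {"input": 0.0100, "output": 0.0300},
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate the dollar cost of a single generative AI request."""
    price = PRICING[model]
    return (input_tokens / 1000) * price["input"] \
         + (output_tokens / 1000) * price["output"]

# The same prompt sent to different models yields very different bills.
cheap = request_cost("small-model", input_tokens=1200, output_tokens=400)
pricey = request_cost("large-model", input_tokens=1200, output_tokens=400)
```

In this sketch, the identical request costs 20 times more on the larger model, which is why per-request forecasting is so much harder than budgeting a fixed software license.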
"GenAI and agentic AI are different," explained Himanshu Jain, a partner in the digital and analytics practice of Kearney, a global strategy and management consulting firm. "We can keep making queries, so our demand forecast is very tricky. And the pricing models are evolving. So, the demand is uncertain, and the price is uncertain, and CIOs are now seeing runaway bills."
FinOps evolves to meet unique AI cost profile
FinOps didn't just expand into AI; it evolved to meet the needs of the moment, adapting to the components that make AI cost calculations distinct from -- and more complex than -- those of cloud computing.
Organizations need a way to bring financial accountability closer to technical and product decisions before those decisions scale into persistent spend.
Chirag Mehta, vice president and principal analyst, Constellation Research
"AI changes how organizations think about cost management," said Chirag Mehta, vice president and principal analyst at Constellation Research. "With conventional cloud, companies could often identify waste after the fact and clean it up. With AI, spend is shaped earlier and more dynamically by model choice, prompt design, retrieval architecture, experimentation volume, GPU allocation, inference patterns and governance requirements."
These considerations have led to growing interest in FinOps for AI, he explained. "Organizations need a way to bring financial accountability closer to technical and product decisions before those decisions scale into persistent spend."
Placing financial accountability earlier in the process involves new components to manage. While traditional FinOps often focuses on managing software licensing expenses or straightforward public cloud consumption charges, AI cost management introduces new considerations, such as tokenization and prompt engineering, Lund said.
"Early AI FinOps efforts centered on establishing monitoring to make token usage visible," Lund said. "A monthly view was a useful starting point. But the focus is now shifting toward more granular, near-real-time visibility across a rich and expanding ecosystem of large language models. As maturity increases, organizations can introduce a governance layer, an intentional entry point for accessing LLMs … alongside metrics that connect usage and spend to measurable business benefit."
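The monitoring Lund describes starts with attributing token usage to owners. The sketch below assumes a simple request-log format (the field names and teams are illustrative, not any particular gateway's schema) and aggregates tokens per team so spend can be tied back to accountable groups.

```python
# Minimal sketch of token-usage observability: roll up per-team token
# counts from request logs. The log schema here is an assumption for
# illustration, not a real gateway's format.
from collections import defaultdict

logs = [
    {"team": "marketing", "model": "small-model",
     "input_tokens": 900, "output_tokens": 300},
    {"team": "marketing", "model": "large-model",
     "input_tokens": 1500, "output_tokens": 700},
    {"team": "support", "model": "small-model",
     "input_tokens": 400, "output_tokens": 150},
]

# Total tokens consumed, keyed by team, so spend has an owner.
usage = defaultdict(int)
for entry in logs:
    usage[entry["team"]] += entry["input_tokens"] + entry["output_tokens"]
```

A monthly roll-up like this is the "useful starting point" Lund mentions; moving to near-real-time visibility means running the same aggregation continuously against a stream of requests rather than a batch of logs.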
CIOs, COOs and CFOs are all pushing to adopt FinOps for AI to manage costs, Lund said. "Once you have observability, you can apply different techniques to actively reduce your token usage and costs." Techniques -- such as AI gateways using governance frameworks to monitor token use, or prompt efficiency through prioritizing concise prompts or even prompt compression -- can help manage costs, he added.
Lund expects FinOps for AI will enable more "levers to pull" as it matures -- noting, for example, that organizations might in the future use "a routing layer that automatically sends prompts to the lowest cost model capable of handling the prompt, and caching that stores responses to common requests."
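The two levers Lund anticipates -- cost-based routing and response caching -- can be sketched together. Everything below is a hypothetical illustration: the model names, costs and the "difficulty" score used to decide capability are assumptions, since real routers classify prompts in more sophisticated ways.

```python
# Hypothetical sketch of two cost levers: a routing layer that sends each
# prompt to the cheapest model capable of handling it, and a cache that
# stores responses to common requests. Models, costs and the difficulty
# score are invented for illustration.
CACHE: dict[str, str] = {}

# Models ordered cheapest-first, each with the hardest prompt it can handle.
MODELS = [
    {"name": "small-model", "max_difficulty": 3, "cost_per_call": 0.001},
    {"name": "large-model", "max_difficulty": 10, "cost_per_call": 0.020},
]

def route(prompt: str, difficulty: int) -> tuple[str, float]:
    """Return (response, cost), checking the cache before calling a model."""
    if prompt in CACHE:
        return CACHE[prompt], 0.0  # cache hit: no model call, no cost
    for model in MODELS:  # iterate cheapest-first
        if difficulty <= model["max_difficulty"]:
            response = f"[{model['name']} answer]"
            CACHE[prompt] = response
            return response, model["cost_per_call"]
    raise ValueError("no model can handle this prompt")
```

Under these assumptions, an easy prompt goes to the cheap model, a hard one falls through to the expensive model, and a repeated prompt is served from cache at zero marginal cost -- the token-saving behavior the routing-and-caching levers are meant to deliver.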
Businesses bet on FinOps for AI
According to the State of FinOps 2026 Report from the FinOps Foundation, 98% of survey respondents said they use FinOps to manage their AI spend -- up from just 31% in 2024. Moreover, 58% listed AI cost management as a skill they plan to add to their FinOps practices over the next 12 months, making it the most desired skill set. And 33% cited FinOps for AI as a current or future priority, landing it ahead of all other priorities in this space.
[FinOps for AI] enables the most cost-efficient use of resources to further enable everything from ideation to value-based outcomes.
Shawn Lund, managing director, Deloitte Consulting
Executives are right to prioritize FinOps for AI, practitioners and advocates of the discipline said. They stressed that FinOps for AI isn't necessarily about cutting costs; it's about ensuring AI spending is producing business value.
"[FinOps for AI] enables the most cost-efficient use of resources to further enable everything from ideation to value-based outcomes," Lund said.
To do that, the FinOps team needs to do more than collect data, Jain said. They need to operationalize the information. "A dashboard doesn't give you optimization -- engineering does," he said. "In FinOps, it's important to understand where the spend is happening, and the areas for optimization, but engineering is where you bring that home."
When that happens, FinOps for AI becomes a value proposition for organizations. It has benefits far beyond cost forecasting, Mehta said. It can assist teams in decisions related to model use, experimentation, architectural changes and use case evaluation.
"FinOps for AI is really about operational discipline," Mehta said. "It gives organizations a way to move from broad AI ambition to a more grounded understanding of which workloads deserve to scale, who owns the spend and what value the business is getting in return."
Mary K. Pratt is an award-winning freelance journalist with a focus on covering enterprise IT and cybersecurity management.