Enterprises will waste $44.5 billion on cloud infrastructure in 2025—21% of total cloud spend—according to Harness’s “FinOps in Focus 2025” report released in February. The culprit isn’t just expensive cloud pricing. The real problem is a massive disconnect: developers build cloud-first applications without understanding costs, while FinOps teams scramble to explain why budgets are blown. Meanwhile, 86% of CIOs now plan to move workloads back to on-premises or private cloud. The cloud-first era is over.
Developers Don’t Know What Things Cost
The Harness study surveyed 700 developers and engineering leaders and found a shocking education gap: 55% admit they make cloud purchasing commitments based on guesswork. Even worse, 71% don't use basic cost-optimization practices like spot orchestration, which can cut compute costs by up to 90%.
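To make the term concrete: spot orchestration, at its simplest, means putting interruption-tolerant work on spare capacity sold at a deep discount. A minimal boto3 sketch, assuming a fault-tolerant batch job; the AMI, subnet, and tag values are placeholders for this example, not recommendations:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Request a Spot instance for an interruption-tolerant batch job.
# Spot capacity is typically 60-90% cheaper than On-Demand, but the
# provider can reclaim it with short notice, so the workload must be
# able to checkpoint and resume.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",      # placeholder AMI
    InstanceType="m5.large",
    MinCount=1,
    MaxCount=1,
    SubnetId="subnet-0123456789abcdef0",  # placeholder subnet
    InstanceMarketOptions={
        "MarketType": "spot",
        "SpotOptions": {
            "SpotInstanceType": "one-time",               # don't re-request after interruption
            "InstanceInterruptionBehavior": "terminate",  # required for one-time requests
        },
    },
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [
            {"Key": "team", "Value": "data-eng"},          # placeholder cost-attribution tags
            {"Key": "workload", "Value": "nightly-batch"},
        ],
    }],
)

print(response["Instances"][0]["InstanceId"])
```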
The visibility numbers are just as bad. Fewer than half have access to real-time data on idle resources (43%), unused resources (39%), or over-provisioned workloads (33%). Without that visibility, it takes enterprises an average of 31 days just to identify and eliminate cloud waste, far too slow to prevent significant damage.
This isn’t a finance problem—it’s a developer competency problem. Developers make architectural decisions that multiply costs by 2-5x without knowing it. A single line of code choosing an expensive managed service over a cheaper alternative can waste thousands per month. The industry has failed to teach developers cloud economics as a core skill, and the $44.5 billion waste proves it.
CIOs Are Bleeding Money
An Azul survey of 300 CIOs found that 83% spend more on cloud infrastructure than they anticipated. The average overage? 30% above budget. Nearly half reported cost overruns of 26% or more. Cloud spending isn’t just high—it’s unpredictable and out of control.
Gartner's late-2024 survey confirmed the pain: CIOs expect an 8.9% cost increase for IT products and services in 2025, with the biggest hikes coming from cloud computing. This is becoming a board-level concern: 43% of surveyed CIOs report that their CEO and board have serious concerns about cloud spend.
CIOs can’t forecast accurately because developers don’t provide usage estimates, and the pay-as-you-go model means bills vary wildly month-to-month. The promise was that cloud would reduce IT costs. For many organizations, it’s doing the opposite.
Companies Are Leaving—And Saving Millions
A Barclays CIO Survey from late 2024 found that 86% of CIOs plan to move some public cloud workloads back to private cloud or on-premises—the highest rate on record. This isn’t theoretical. Companies are publicly documenting massive savings from cloud exits.
37signals (Basecamp and HEY) completed their AWS exit and are saving $2 million annually—$10 million over five years. Founder David Heinemeier Hansson was blunt: “Cloud made sense when we were small. Now we’re predictable at scale, and owning is 10x cheaper.”
Additionally, Dropbox moved 90% of its data to owned data centers back in 2016 and saved $75 million over two years. SEO firm Ahrefs ran the numbers and estimated they saved $400 million over two years by using on-premises servers instead of AWS—an 11x cost difference.
The pattern is clear. For predictable workloads at scale, cloud economics don’t work. Cloud is not “always cheaper”—it’s only cheaper for unpredictable, variable workloads. Companies that believed the hype are now correcting course.
Cloud Pricing Punishes Modern Architecture
Cloud pricing models were designed for predictable VM workloads and enterprise data center migrations—not for modern microservices architectures. The per-service pricing model punishes how developers actually build software today.
Each microservice is hosted independently, so each one carries its own per-service overhead, and the same code can cost 2-5x more as microservices than as a monolith. Hidden fees compound the problem: data egress ($0.09-$0.12 per GB out), NAT Gateways ($40-100 per month per availability zone), Load Balancers ($16-22 per month even with zero traffic), and CloudWatch Logs ($0.50 per GB ingested).
Do the math for a typical microservices architecture with 20 services. That’s $400 per month in load balancer costs alone before handling a single request. Add NAT Gateways across three availability zones and you’re at $520 monthly baseline infrastructure cost. Scale that up and the numbers become absurd.
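Here's that arithmetic as a quick sketch, using illustrative per-unit prices from the list above (actual prices vary by region, provider, and load balancer type):

```python
# Rough monthly baseline for a 20-service microservices deployment,
# before a single request is served. Prices are illustrative mid-range
# figures, not a quote from any provider.
services = 20
availability_zones = 3

load_balancer_per_month = 20.0   # ~$16-22 per load balancer, even with zero traffic
nat_gateway_per_az = 40.0        # low end of the $40-100/month range

load_balancers = services * load_balancer_per_month      # $400
nat_gateways = availability_zones * nat_gateway_per_az   # $120

baseline = load_balancers + nat_gateways
print(f"Idle baseline: ${baseline:,.0f}/month")           # ~$520/month
```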
A developer on DEV.to put it bluntly: “Cloud hosting is still rigid, built for a world that doesn’t exist anymore. The platform charges you as if each tool is an entire company.” Cloud providers have no incentive to fix this—it makes them money.
Cloud Costs Are Now a Core Engineering Skill
Developers don’t need to abandon cloud entirely, but they must learn cloud economics as a core engineering skill—on par with security, performance, and reliability. Organizations need to integrate FinOps into development workflows, not treat it as a finance-only problem.
Start with immediate actions:

- Tag all resources on day one to enable cost attribution.
- Use spot instances for non-critical workloads to save up to 90%.
- Rightsize instances based on actual usage, not guesses.
- Shut down dev and staging environments outside work hours (a sketch follows this list).
- Choose cheaper regions for development; US-East is often 20-30% cheaper than others.
- Most importantly, learn what services cost before building.
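Here's what the off-hours shutdown can look like in practice: a minimal boto3 sketch, assuming non-production instances carry an `env` tag of `dev` or `staging` (the tag name is a convention of this example) and that the script runs on an evening schedule, such as a cron job or EventBridge rule:

```python
import boto3

ec2 = boto3.client("ec2")

# Find running instances tagged as dev or staging. The "env" tag is an
# assumption of this sketch; use whatever tagging scheme your
# cost-attribution policy defines.
paginator = ec2.get_paginator("describe_instances")
pages = paginator.paginate(Filters=[
    {"Name": "tag:env", "Values": ["dev", "staging"]},
    {"Name": "instance-state-name", "Values": ["running"]},
])

instance_ids = [
    instance["InstanceId"]
    for page in pages
    for reservation in page["Reservations"]
    for instance in reservation["Instances"]
]

if instance_ids:
    # Stopped instances stop billing for compute (attached EBS volumes still bill).
    ec2.stop_instances(InstanceIds=instance_ids)
    print(f"Stopped {len(instance_ids)} non-production instances: {instance_ids}")
else:
    print("No running dev/staging instances found.")
```

Run the mirror-image script with `start_instances` at the start of the workday to bring the environments back.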
Organizations need systemic changes:

- Implement "FinOps as Code" by integrating cost policies into CI/CD pipelines (a minimal gate is sketched below).
- Give developers real-time dashboards showing their team's spend.
- Use showback or chargeback models to make teams accountable for their costs.
- Train developers on cloud economics during onboarding.
- Adopt hybrid strategies: keep predictable workloads on-premises and use cloud for traffic spikes.
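And here's what a minimal "FinOps as Code" gate can look like: a script that runs in CI after a cost-estimation step has written a JSON estimate, and fails the build when the projected monthly cost exceeds a budget. The file name, field name, and threshold are assumptions of this sketch; adapt them to whatever your tooling (Infracost, for example) actually emits:

```python
#!/usr/bin/env python3
"""Fail the CI job if the estimated monthly cost of a change exceeds a budget.

Assumes an earlier pipeline step wrote cost_estimate.json with an
"estimated_monthly_cost" field; adjust the parsing to match the output
of your actual cost-estimation tool.
"""
import json
import sys

BUDGET_USD = 500.0                 # monthly increase a change may add without review
ESTIMATE_FILE = "cost_estimate.json"

with open(ESTIMATE_FILE) as f:
    estimate = json.load(f)

monthly_cost = float(estimate["estimated_monthly_cost"])
print(f"Estimated monthly cost of this change: ${monthly_cost:,.2f}")

if monthly_cost > BUDGET_USD:
    print(f"Over the ${BUDGET_USD:,.2f} budget; require a FinOps sign-off before merging.")
    sys.exit(1)                    # non-zero exit fails the pipeline step

print("Within budget.")
```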
Developers who don’t understand cloud costs are bad engineers—full stop. Just like writing insecure code or building slow applications is unacceptable, architecting expensive systems without justification is professional incompetence. Cloud cost optimization isn’t optional anymore.
Key Takeaways
- The $44.5 billion cloud waste crisis is caused by a systemic failure to educate developers on cloud economics, with 55% making purchasing decisions based on guesswork
- 83% of CIOs are overspending on cloud by 30% on average, making cloud costs a board-level concern rather than just an IT issue
- Cloud repatriation is accelerating, with 86% of CIOs planning to move workloads back to on-premises or private cloud—companies like 37signals, Dropbox, and Ahrefs have documented savings of $10M, $75M, and $400M respectively
- Cloud pricing models were designed for VMs, not microservices—per-service fees punish modern architectures and can cost 2-5x more than monolithic alternatives for the same workload
- Cloud cost optimization is now a core developer competency alongside security and performance—developers must learn what services cost before building, use cost-aware architectural patterns, and take ownership of their infrastructure spending
The cloud-first dogma is collapsing under the weight of real-world economics. Hybrid strategies are the new normal. Developers who master cloud economics will build better systems. Those who don’t will become expensive liabilities.