AI coding tools promised to make developers faster. They delivered. But that very speed is making most organizations slower. A new report from Harness finds that 69% of teams using AI coding tools multiple times daily experience frequent deployment problems, longer recovery times, and increased burnout. The reason: while AI accelerates code generation, the downstream infrastructure for testing, deploying, and operating that code hasn’t kept up. Welcome to the AI velocity paradox.
The Numbers Don’t Lie
Individual developer productivity is soaring. The 2025 DORA report shows 90% of developers now use AI coding tools, with AI-assisted engineers creating 98% more pull requests and completing 21% more tasks than their non-AI counterparts. Teams using these tools multiple times daily also ship faster, with 45% deploying daily or more frequently.
But here’s the problem: 69% of these high-frequency users report deployment issues at least half the time. Their Mean Time to Recovery averages 7.6 hours, longer than that of teams using AI less frequently. Twenty-two percent of their deployments result in rollbacks, hotfixes, or incidents. The faster the code flows in, the more the delivery system breaks.
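The two metrics cited here, change failure rate and Mean Time to Recovery, are straightforward to compute from deployment records. A minimal sketch, using entirely hypothetical data and field names (not the Harness or DORA datasets):

```python
from datetime import datetime, timedelta

# Hypothetical deployment log: each entry records whether the deploy succeeded,
# and for failures, when the incident was detected and when it was resolved.
deployments = [
    {"ok": True},
    {"ok": False,
     "detected": datetime(2026, 2, 3, 9, 0),
     "resolved": datetime(2026, 2, 3, 16, 30)},   # 7h30m to recover
    {"ok": True},
    {"ok": False,
     "detected": datetime(2026, 2, 10, 14, 0),
     "resolved": datetime(2026, 2, 10, 21, 42)},  # 7h42m to recover
    {"ok": True},
]

failures = [d for d in deployments if not d["ok"]]

# Change failure rate: share of deployments needing rollback, hotfix, or incident response.
change_failure_rate = len(failures) / len(deployments)

# Mean Time to Recovery: average detection-to-resolution time across failed deployments.
mttr = sum((f["resolved"] - f["detected"] for f in failures), timedelta()) / len(failures)

print(f"Change failure rate: {change_failure_rate:.0%}")        # 40%
print(f"MTTR: {mttr.total_seconds() / 3600:.1f} hours")         # 7.6 hours
```

The point of tracking both together is exactly the paradox above: deployment frequency can climb while these two stability metrics quietly deteriorate.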
The Infrastructure Nobody Upgraded
The bottleneck isn’t coding anymore. It’s everything else.
Harness surveyed 700 engineering leaders across the U.S., U.K., France, Germany, and India in February 2026. The findings reveal a stark infrastructure gap: 73% of organizations have “hardly any” standardized deployment templates or “golden paths.” Seventy-eight percent complain of fragmented toolchains. Seventy percent report pipelines plagued by flaky tests and deployment failures.
The result? Developers spend 36% of their time on manual repetitive work—copy-paste configuration, chasing approvals, rerunning failed jobs. Seventy-seven percent of teams must wait on others for routine delivery work before they can ship code. Only 21% can provision a working build-and-deploy pipeline in under two hours.
AI made the front of the assembly line blazing fast. But the rest of the factory is still running on duct tape and manual labor.
AI Doesn’t Fix Teams—It Amplifies Them
Here’s the insight most organizations missed: AI doesn’t automatically improve performance. It magnifies what’s already there.
The DORA research makes this clear: “AI’s primary role is as an amplifier, magnifying an organization’s existing strengths and weaknesses.” Teams with mature DevOps practices—robust automated testing, strong version control, fast feedback loops—use AI to get even better. Their throughput AND stability improve.
Teams without those foundations? AI accelerates their problems. Fifty-one percent of heavy AI users report more code quality issues since adoption. Fifty-three percent cite increased security vulnerabilities. Forty-nine percent experience more performance problems.
And yet: 95% of these same heavy users rate their DevOps capability as “good” or “very good.” They’re deploying frequently, shipping fast, feeling productive—while their systems quietly burn down around them. That’s the confidence paradox: high velocity masking low maturity.
The Human Cost
This infrastructure mismatch isn’t just a technical problem. It’s a people problem.
Seventy-two percent of engineering teams say their current working practices aren’t sustainable long-term. That number jumps to 81% among heavy AI tool users. Seventy-one percent work evenings or weekends at least weekly—31% multiple times per week. Seventy-nine percent cite slow CI/CD pipelines as a significant contributor to burnout.
Manual downstream work—the QA checks, the code reviews, the deployment babysitting—became more problematic for 47% of teams after adopting AI tools. The code comes faster, but someone still has to test it, secure it, deploy it, monitor it, fix it when it breaks. And that someone is working off-hours trying to keep up.
Fix the Foundation First
The industry got this backwards. Organizations rushed to adopt AI coding tools without preparing the infrastructure to handle the output.
Harness recommends “standardized, templatized, and governed pipelines” with automated rollbacks, feature flags, and centralized secrets management. The DORA model emphasizes foundational practices: mature version control, code review standards, platform engineering investment, small-batch development.
Golden paths—pre-built, opinionated templates encoding best practices and security requirements—are becoming critical. Over 80% of software engineering organizations now have dedicated platform teams, according to Gartner. Seventy-five percent offer developer self-service portals.
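What a golden path looks like in practice: a team supplies a few service-specific inputs, and the template bakes in everything else. A minimal sketch, with all field and template names invented for illustration rather than taken from any real CI system:

```python
# A golden path reduces pipeline setup to a handful of service-specific inputs;
# required stages, security gates, and rollback behavior are encoded once,
# centrally, rather than copy-pasted by every team.

def golden_path_pipeline(service: str, language: str, owners: list[str]) -> dict:
    """Render a standardized pipeline definition for one service."""
    return {
        "service": service,
        "owners": owners,
        "stages": [
            {"name": "build", "uses": f"templates/build-{language}"},
            {"name": "test", "uses": "templates/unit-and-integration-tests",
             "retry_flaky": 2},                    # bounded retries, not silent ignores
            {"name": "security", "uses": "templates/sast-and-dependency-scan",
             "blocking": True},                    # the security gate cannot be skipped
            {"name": "deploy", "uses": "templates/progressive-delivery",
             "rollback": "automatic",
             "secrets": "central-vault"},          # no per-team secret handling
        ],
    }

pipeline = golden_path_pipeline("checkout", "go", ["team-payments"])
print([s["name"] for s in pipeline["stages"]])  # ['build', 'test', 'security', 'deploy']
```

This is also what makes sub-two-hour pipeline provisioning feasible: new services inherit a working, governed pipeline instead of assembling one from scratch.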
The message is clear: modernize your delivery systems before scaling AI adoption. Automate testing, security, and compliance. Standardize your pipelines. Invest in platform engineering.
Because right now, you’re running faster cars on roads that weren’t built for the speed. And the crashes are piling up.

