Industry Analysis | AI & Development

Anduril’s $30.5B Weapons Fail: WSJ Exposes Testing Crisis

Anduril Industries, Palmer Luckey’s $30 billion defense tech unicorn, is learning that “move fast and break things” doesn’t work when the things you’re breaking might kill people. A Wall Street Journal investigation published November 27 exposed systematic failures across the company’s autonomous weapons portfolio. From Navy drone boats shutting down mid-exercise to Ukrainian forces abandoning Anduril’s combat drones after repeated failures, the gap between Silicon Valley hype and battlefield performance has never been more apparent.

When Autonomous Weapons Fail, People Die

The failures aren’t isolated incidents – they’re a pattern. In May 2025, 30 drone boats operated by Anduril’s Lattice software shut down during a Navy exercise off California. Sailors didn’t mince words in their report: “continuous operational security [and] safety violations” posing “extreme risk” to military personnel, with warnings of potential “loss of life.” Rear Admiral Kevin Smith, who oversaw the Navy’s unmanned vessel programs, was fired shortly afterward, with the Navy citing “loss of confidence.”

Ukrainian forces tell a similar story. The country’s Security Service (SBU) stopped using Anduril’s Altius drones in 2024 after they proved vulnerable to Russian electronic warfare systems, crashed repeatedly, and failed to hit targets. They haven’t been fielded since. Meanwhile, a mechanical issue damaged the engine of Anduril’s Fury unmanned fighter jet during summer testing, and an August test of the company’s Anvil counter-drone system sparked a 22-acre wildfire in Oregon that required three fire trucks to extinguish.

A $30.5B Valuation That Doubled During the Failures

Here’s where it gets interesting: Anduril’s valuation doubled to $30.5 billion in 2025 even as these failures were occurring. The company’s Series G round, led by Peter Thiel’s Founders Fund, brought in $2.5 billion – with Founders Fund alone contributing $1 billion. That’s a 30x revenue multiple based on the company’s $1 billion in 2024 revenue, and it raises uncomfortable questions about whether investors are betting on promise or proven performance.

Palmer Luckey founded Anduril in 2017 with a mission to bring Silicon Valley speed to what he called a “stagnant” defense industry. The pitch was compelling: use commercial AI and rapid iteration to build cheaper, faster weapons systems. Anduril’s Fury fighter jet costs $25-30 million per unit compared to the F-35’s $400 million price tag. But cheaper only matters if it works.

Are These Really “Typical” Development Problems?

Anduril insists these challenges are “typical of weapons development” and that the incidents don’t indicate any “underlying flaws” in its technology. But that defense doesn’t hold up when you look at the stakes. The Department of Defense’s Directive 3000.09 on autonomous weapons requires systems to be “sufficiently robust to minimize the probability and consequences of failures.” Navy sailors explicitly warning of safety violations isn’t minimizing failure probability. Ukrainian forces entirely abandoning a weapons system isn’t typical field testing.

Autonomous weapons aren’t like consumer software where you can push updates after launch and apologize for bugs. These systems delegate life-or-death decisions to AI. When they fail in combat, people die. When Russian electronic warfare disrupts their communications, missions fail and soldiers are exposed. The bar for “robust enough” should be higher when the consequences of failure are measured in lives, not user complaints.

The Black Box Problem Goes to War

The deeper issue is one developers know well: the black box problem. As researchers studying autonomous weapons have documented, programmers can’t fully predict what an AI system will learn during training. Neural networks make decisions through opaque processes too complex for humans to intuitively understand. You can validate overall system performance, but the reasoning behind any individual prediction remains unavailable for inspection.
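The aggregate-versus-individual distinction can be sketched in a few lines of Python. The classifier below is a hypothetical stand-in for any opaque model, not Anduril’s Lattice software or any real targeting system:

```python
# Hypothetical stand-in for an opaque classifier: we can call it, but its
# internal decision process is not inspectable (like a trained neural
# network whose weights don't map to human-readable rules).
def black_box_classify(features):
    score = sum(w * x for w, x in zip([0.9, -1.3, 0.4], features))
    return "target" if score > 0 else "non-target"

# Labeled evaluation set: (features, ground-truth label).
test_set = [
    ([1.0, 0.2, 0.5], "target"),
    ([0.1, 0.9, 0.3], "non-target"),
    ([0.8, 0.1, 0.9], "target"),
    ([0.2, 0.8, 0.1], "non-target"),
]

# Aggregate validation works: overall accuracy is measurable.
correct = sum(black_box_classify(f) == label for f, label in test_set)
accuracy = correct / len(test_set)
print(f"accuracy: {accuracy:.0%}")  # prints "accuracy: 100%"

# But for any single decision, all we observe is the output label.
# Asking *why* this input was classed as "target" has no direct answer.
print(black_box_classify([1.0, 0.2, 0.5]))  # prints "target"
```

Perfect accuracy on a test set tells you nothing about how the system will behave on the one input it gets wrong in the field, which is exactly the accountability gap described below.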

That creates an accountability gap. When an autonomous weapon misidentifies a target, who’s responsible? The programmer who wrote the algorithm? The commander who deployed it? The manufacturer who built it? Everyone disclaims responsibility while pointing at the technical complexity, creating scenarios where violations become procedurally unavoidable yet legally unpunishable.

What Comes Next

The WSJ investigation could force a reckoning for the defense tech boom. Approximately 30 countries and 165 non-governmental organizations have called for a preemptive ban on autonomous weapons, and a December 2023 UN General Assembly resolution requested views on addressing the ethical challenges they pose. Anduril’s failures provide concrete evidence for critics who argue we’re moving too fast.

For Anduril, the path forward requires proving that Silicon Valley speed can coexist with weapons systems rigor. The company maintains an “almost continuous” presence in Ukraine to update software and claims its systems have hit “significant numbers” of Russian targets. But claims need to become track records, especially when your $30.5 billion valuation depends on convincing the Pentagon and allied militaries that your autonomous weapons won’t fail when lives are on the line.

The defense industry may be stagnant, as Luckey claimed, but it’s stagnant for a reason: weapons development is hard, testing is harder, and autonomous systems that make lethal decisions need to clear a higher bar than any consumer product. “Move fast and break things” works beautifully for social media apps. For weapons systems, you need to move deliberately and get it right the first time. The battlefield doesn’t offer second chances.
