An NHS official in the UK called Palantir Technologies “ethically bankrupt” when explaining why staff refuse to use the company’s £330 million data platform. Another healthcare worker admitted, “It makes me feel sick every time I log in.” These aren’t fringe activists. They’re professionals deliberately slowing their work or using workarounds to avoid a surveillance tool they believe crosses ethical lines. The resistance, reported by The Register on April 3, raises a question every developer eventually faces: What do you do when the code you write—or are asked to write—does harm?
This isn’t just about Palantir. It’s about the fundamental responsibility developers have for the tools they build. When you know your code will be used for ICE dragnet raids or mass surveillance, claiming “I’m just an engineer” doesn’t cut it.
The Human Rights Policy Doesn’t Match Reality
Palantir publishes a Human Rights Policy claiming to incorporate “privacy, ethics, and human rights principles” into its products through its Privacy and Civil Liberties team. The company maintains an advisory council, audit logs, and documented processes. On paper, it looks like a model of ethical tech development.
However, the Electronic Frontier Foundation documented a different story in April. ICE uses Palantir’s ELITE tool for dragnet raids and mass detentions—accessing unrelated databases like Medicaid and detaining people with no criminal record or removal order. Nearly one in five people ICE arrested were Latine individuals with neither a criminal history nor a final removal order. A leaked user guide even instructs operators to “disable filters” to show “all targets.”
The EFF’s assessment is blunt: “Measured against Palantir’s own human rights commitments, its decision to keep powering ICE with tools used in dragnet raids and discriminatory detentions is indefensible.” Having an ethics policy doesn’t mean you’re acting ethically. Process is not the same as outcomes. Palantir proves that companies can claim to care about human rights while enabling surveillance and discriminatory enforcement.
Resistance Happens Too Late
In May 2025, thirteen former Palantir employees—including software engineers, managers, and a member of the company’s own Privacy and Civil Liberties team—signed a public letter condemning Palantir’s $30 million ImmigrationOS contract with ICE. Notice the word “former.” They spoke out after they left.
Current Palantir engineers have been documented calling ICE “the bad guys” on internal Slack and demanding mechanisms for an ethical veto. But they’re still building. NHS staff are resisting the Federated Data Platform now, but the £330 million contract was signed in 2023. By the time resistance materializes, the tool is already deployed.
Resistance after leaving is safer—no job risk—but has less leverage. You already built the tool. Speaking up from inside is riskier but more effective. The question for developers: Will you wait until it’s safe to object, or will you refuse to build in the first place?
Professional Ethics Say You’re Responsible
The ACM/IEEE Software Engineering Code of Ethics—the professional standard jointly approved by the Association for Computing Machinery and IEEE Computer Society—explicitly states that software engineers “shall act consistently with the public interest.” The code requires engineers to “approve software only if they have a well-founded belief that it is safe, meets specifications, passes appropriate tests, and does not diminish quality of life, diminish privacy or harm the environment.”
This isn’t philosophy. It’s the documented ethical obligation of the profession. Building surveillance tools for ICE dragnet raids violates this code. The code also requires disclosure: “Disclose to appropriate persons or authorities any actual or potential danger to the user, the public, or the environment.”
You can’t claim ignorance when you know the use case. Your profession has established ethics codes that explicitly reject the “neutral tools” defense. When you know your code will be used to “diminish privacy” or cause “danger to the public,” approving or building that software violates professional ethics. It’s not a gray area.
The Test: Would You Defend This at Dinner?
Here’s a simple test for whether you’re crossing an ethical line: Would you be proud to explain this work at a dinner party? To your family? To your kids? One NHS staffer said Palantir “makes me feel sick every time I log in.” That’s your conscience talking.
Palantir CEO Alex Karp published a manifesto in April 2026 arguing that American tech companies should build AI weapons and reject “regressive” inclusivity protests. He’s telling you to ignore that sick feeling. Companies will always have rationalizations—“national security,” “legal compliance,” “someone else will build it if we don’t.” But professional ethics and your own conscience matter more than employer talking points.
Your gut knows when something’s wrong. If you can’t defend your work to people you respect, that’s a signal worth heeding.
You Have More Power Than You Think
Tech worker activism has grown exponentially since 2017, when just four collective actions were reported (all protesting Palantir and the Muslim registry). By 2018-2019, collective action “exploded.” Today, we see cross-company advocacy organizations, unionization campaigns like the Alphabet Workers Union, and sustained pushback against gag-order NDAs.
Workers at Google, Microsoft, Amazon, and Salesforce have protested government contracts, ICE ties, and military work. Individual developers often feel powerless—”I’m just one person.” But collective action works. Companies need engineers more than engineers need any particular company.
Refusing to build unethical features, organizing with coworkers, or simply walking away are all forms of power. You’re not powerless.
The Uncomfortable Truth
You can’t separate engineering from ethics. When you know your code will be used for harm—dragnet raids, discriminatory enforcement, mass surveillance—building it anyway is a choice. The ACM/IEEE Code of Ethics establishes professional responsibility for societal impact. Palantir employees calling ICE “the bad guys” on Slack but continuing to build the tools understand this. They’re making a choice to prioritize their paycheck over their principles.
The pattern is clear: resistance happens after it’s safe, after you’ve left, after the tool is deployed. But effective resistance requires refusing to build in the first place. If it makes you feel sick, listen to that. Your profession has ethics standards. Follow them or find another line of work.