A UK government watchdog warned this week that developers building end-to-end encrypted messaging apps like Signal or WhatsApp could be classified as engaging in “hostile activity” under the National Security Act. Jonathan Hall KC, the Independent Reviewer of State Threats Legislation, published his first annual report stating that developers whose encryption technology “makes it harder for UK intelligence agencies to monitor communications” could fall within the legal definition of hostile actors—even without foreign state involvement.
This isn’t theoretical. Apple already disabled Advanced Data Protection for UK users in February 2025 after the government issued a secret Technical Capability Notice demanding backdoor access. Signal and WhatsApp have threatened to exit the UK rather than compromise encryption. Developers now face a chilling prospect: building privacy tools could make you a legal target.
The Breadth Problem: Where Does This End?
The definition of “hostile activity” under the National Security Act is so broad it threatens to criminalize standard security practices. Hall’s report warns that any technology that “indirectly benefits hostile states by limiting surveillance capabilities” could qualify—even if no foreign government is involved. By this standard, HTTPS encryption, banking security protocols, and every password manager become legally suspect.
One Hacker News developer noted that by this logic, curtains and walls should be considered hostile because they obstruct surveillance. Another argued: “If you want to stop criminals, focus on their illegal activities, not the streets they walk on.” The comparison is apt: this is the equivalent of banning roads because criminals use them.
Important context: Hall isn’t endorsing this interpretation. He’s warning that Parliament wrote legislation so broad it could inadvertently criminalize legitimate security research and privacy tool development.
The Apple Precedent: It’s Already Happening
In February 2025, Apple disabled Advanced Data Protection for UK users after the Home Office served a secret Technical Capability Notice. The notice demanded Apple retain the ability to access iCloud account contents for law enforcement—effectively requiring a backdoor.
Apple chose to disable the feature entirely rather than weaken it globally. UK users lost the ability to fully encrypt their iCloud backups. In March 2025, Apple took the UK government to court over the order. Even a company with Apple’s resources couldn’t simply refuse a UK surveillance demand; its only options were compliance, withdrawal, or litigation.
The Technical Reality: Backdoors Are Cryptographic Fantasy
Security experts broadly agree that selective backdoors which only let the “good guys” in are technically impossible. As Stanford’s Cyberlaw blog notes, “it is nearly impossible to create a backdoor to a communications product that is only accessible for certain purposes.” If a vulnerability exists, it will eventually be exploited by criminals, foreign intelligence services, and insider threats.
The concept of “NOBUS” (Nobody But Us) backdoors has been thoroughly debunked. There is no such thing as a “controlled security flaw.” Once you weaken encryption for government access, you’ve weakened it for everyone. Even the US Congress has warned that “any measure that weakens encryption works against the national interest.”
The UK government is demanding the cryptographic equivalent of a square circle. You either have strong encryption or you have backdoors. There is no middle ground.
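The “weaken it for one, weaken it for everyone” point can be made concrete with a toy sketch. This example (Python; a XOR stream stands in for a real cipher, and all names are illustrative, not any real app’s design) models a key-escrow “lawful access” scheme: every message is encrypted once to the conversation key and once to a master escrow key. Whoever obtains that single escrow key—criminal, foreign agency, or insider—can read every conversation, not just the one under a warrant:

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy XOR cipher for illustration only -- NOT secure.
    # It stands in for a real symmetric cipher like AES-GCM.
    return bytes(d ^ k for d, k in zip(data, key))

# End-to-end model: each conversation has its own key,
# known only to the two endpoints.
alice_bob_key = secrets.token_bytes(32)

# "Lawful access" model: every message is ALSO encrypted
# to one government-held escrow key.
escrow_key = secrets.token_bytes(32)

msg = b"meet at noon"
ct_for_bob = xor_cipher(alice_bob_key, msg)
ct_for_escrow = xor_cipher(escrow_key, msg)

# Bob decrypts with the conversation key, as intended.
assert xor_cipher(alice_bob_key, ct_for_bob) == msg

# Anyone holding the escrow key decrypts the escrow copy of
# EVERY message in the system -- one leak compromises all users.
assert xor_cipher(escrow_key, ct_for_escrow) == msg
```

The design flaw is structural, not a matter of implementation quality: the escrow key is a single secret whose compromise breaks all traffic at once, which is exactly why “NOBUS” access cannot be guaranteed.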
The Irony: Security for Me, Surveillance for Thee
MI6—the UK’s foreign intelligence service—operates Tor nodes to enable secure, anonymous whistleblowing. The same government that calls building Signal-like apps “hostile activity” relies on identical encryption technology to protect its own operations.
The double standard is impossible to miss. Encryption is vital for national security when MI6 uses it, but suspicious when developers build it for citizens. As Jemimah Steinfeld of Index on Censorship told TechRadar: “The government signposts end-to-end encryption as a threat, but what they fail to consider is that breaking it would be a threat to our national security too.”
Five Eyes Coordination: This Is Bigger Than the UK
The UK isn’t acting alone. The Five Eyes intelligence alliance—the US, UK, Canada, Australia, and New Zealand—joined in a 2020 statement by India and Japan, has demanded that tech companies “create customized solutions” for lawful access. The threat is explicit: provide voluntary backdoors or face legislation compelling compliance.
Australia’s Assistance and Access Act already enables intelligence sharing that lets Five Eyes members access encrypted data they couldn’t legally obtain under their own domestic laws. The UK is pioneering the legal framework; if it succeeds here, expect similar legislation in Washington, Canberra, and Ottawa.
What Happens Next
WhatsApp and Signal have stated publicly they will leave the UK before implementing backdoors. Apple chose to disable features rather than weaken security globally. Developers now face legal risk for building privacy-preserving technology in the UK—precisely what Hall’s report warned about.
The practical outcome is predictable: brain drain. Security researchers and privacy-focused developers will relocate to jurisdictions that don’t classify their work as “hostile activity.” The UK tech sector will lose talent and companies. Meanwhile, actual hostile actors will continue using encryption—they’re not bound by UK law.
Calling the development of encryption tools “hostile activity” is authoritarian overreach that threatens the foundational security infrastructure of the internet. The UK government is demanding cryptographic impossibilities while using the very tools they want to deny citizens.
If building Signal is hostile activity, we’re all hostile actors.