Security researchers hacked Flock Safety’s police surveillance cameras in under 30 seconds this November using nothing but “a stick” to press a button sequence, exposing catastrophic vulnerabilities in systems used by over 5,000 law enforcement agencies nationwide. The discovery revealed seven critical security failures, including hardcoded Wi-Fi credentials, unencrypted storage of billions of license plate images, an outdated Android OS discontinued in 2021, and no mandatory multi-factor authentication. Senators Ron Wyden and Raja Krishnamoorthi demanded an FTC investigation after evidence emerged of stolen Flock credentials being sold on Russian dark web forums.
This isn’t a theoretical vulnerability—it’s a systematic security failure in public surveillance infrastructure, one that violates 8 out of 10 OWASP IoT security standards and leaves billions of unencrypted images exposed. The Congressional investigation signals regulatory pressure on IoT vendors, and defense attorneys can now challenge Flock evidence integrity in criminal cases.
The 30-Second Exploit: Physical Access Is Not a Security Barrier
Researchers Ben Jordan and Jon Gaines demonstrated complete camera compromise by pressing a button sequence on the camera’s publicly accessible back panel, creating an unauthenticated Wi-Fi hotspot that grants root access via Android Debug Bridge. Jordan’s deadpan assessment: “The longest part, actually, is waiting for the hotspot to turn on.”
The attack is trivial. Press buttons → Wi-Fi hotspot appears → connect from any device → enable adb debugging → gain root access. Total time: under 30 seconds. The cameras sit on public poles accessible to anyone with a ladder, yet Flock defended this vulnerability by claiming exploits “require physical access and intimate knowledge of internal device hardware.”
That defense is nonsense. Physical access to public infrastructure is not a security barrier—it’s an excuse. When your cameras are mounted on street poles that anyone can reach, dismissing vulnerabilities because they “require physical access” is gaslighting. Jordan published a 40-minute video demonstration in November 2025, and the exploit is now trending on Hacker News with 178 upvotes. The myth that physical access equals difficulty has been shattered.
Seven Systematic Flock Camera Security Failures
Flock’s vulnerabilities aren’t isolated bugs—they’re systematic design failures spanning hardware, software, and policy. The cameras violate 8 out of 10 OWASP IoT security standards:
1. Android Things 8 Operating System: Google discontinued this OS in 2021. It has hundreds of known CVEs with zero security patches available. Flock is still deploying these cameras in 2025 with a dead operating system that will remain vulnerable forever.
2. Hardcoded Wi-Fi Credentials: The cameras have specific Wi-Fi SSID names hardcoded into firmware. When LTE signal is weak, cameras automatically connect to these networks without authentication. Attackers set up rogue access points with matching SSIDs, and cameras connect blindly.
3. Cleartext Credential Transmission: Cameras send credentials in plaintext. Man-in-the-middle attacks are “extremely easy,” according to researchers. Any attacker on the network path captures credentials.
4. Exposed USB Ports: The cameras have completely accessible USB ports with no physical protection. Plug in a “rubber ducky” keyboard emulator, execute arbitrary scripts, install malware. No authentication required.
5. Unencrypted Storage: Billions of images stored in plaintext on local devices. This includes factory test images never deleted, images without detected license plates, and everything the camera sees—not just vehicles.
6. Optional Multi-Factor Authentication: Flock offered 2FA to police departments for years but didn’t require it. Only in November 2024, after pressure, did they enable it by default for new customers. As of late 2025, 3% of existing customers—dozens of agencies—still don’t use MFA.
7. IMSI Catcher Vulnerability: Fake cell towers can intercept cellular connections because cameras don’t validate tower authenticity. This enables cellular man-in-the-middle attacks.
Each failure compounds the others. An outdated OS can’t be patched. Hardcoded credentials can’t be rotated. Unencrypted storage means any breach exposes everything. This is a comprehensive case study in how NOT to build IoT security.
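Failure #2 in particular has a textbook fix: a camera should authenticate a network before trusting it, rather than joining any access point that broadcasts a matching SSID. A minimal sketch of a shared-key challenge-response, using only Python’s standard library (the `PSK` provisioning and protocol shape here are illustrative assumptions, not Flock’s actual design):

```python
import hashlib
import hmac
import secrets

# Hypothetical per-device pre-shared key, provisioned at manufacture.
PSK = secrets.token_bytes(32)

def make_challenge() -> bytes:
    """Camera generates a fresh random nonce for each connection attempt."""
    return secrets.token_bytes(16)

def ap_response(psk: bytes, challenge: bytes) -> bytes:
    """A legitimate access point proves knowledge of the shared key."""
    return hmac.new(psk, challenge, hashlib.sha256).digest()

def camera_verifies(psk: bytes, challenge: bytes, response: bytes) -> bool:
    """Camera joins the network only if the response checks out."""
    expected = hmac.new(psk, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = make_challenge()
# Real AP (knows the key) passes; a rogue AP with a random key fails.
assert camera_verifies(PSK, challenge, ap_response(PSK, challenge))
assert not camera_verifies(PSK, challenge,
                           ap_response(secrets.token_bytes(32), challenge))
```

A rogue access point with the right SSID but the wrong key never gets past the handshake, which is exactly the check the hardcoded-SSID design omits.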
Congressional Investigation and Real-World Consequences
Senators Wyden and Krishnamoorthi demanded an FTC investigation after Hudson Rock cybersecurity firm documented Flock credentials being traded on Russian cybercrime forums. The scale is staggering: 5,000+ law enforcement agencies, 20 billion vehicle scans monthly, billions of images at risk.
The consequences extend beyond hacking. Immigration and Customs Enforcement conducted over 4,000 warrantless searches through backdoor access to local police Flock systems—no formal ICE-Flock contracts, just exploitation of police partnerships to surveil immigration targets. A Washington state court ruled Flock data constitutes public records, eliminating any privacy protection.
Evidence tampering becomes possible when attackers with root access can modify stored footage. Defense attorneys can now challenge any Flock-based prosecution by arguing the system’s vulnerabilities make evidence integrity unprovable. Municipalities are responding: San Diego’s City Council directed police to seek Flock alternatives on December 12, 2025.
Flock’s Inadequate Response: Security Theater Exposed
Flock’s official response dismissed vulnerabilities by claiming they “require physical access and intimate knowledge of internal device hardware.” This defense ignores that researchers reverse-engineered the cameras, published detailed exploits publicly, and demonstrated 30-second compromises that anyone with a ladder can replicate.
The 2FA timeline reveals negligence, not oversight. Flock offered multi-factor authentication optionally for approximately four years (2021-2024). Police departments handling billions of sensitive images could simply… not enable it. Only after Congressional exposure in November 2024 did Flock enable 2FA by default for new customers. Existing customers still aren’t forced to use it—the 3% of holdouts represent dozens of agencies with credentials potentially circulating on dark web markets.
This is security theater. Marketing “secure surveillance” while deploying discontinued operating systems and making critical security features optional isn’t an architectural choice—it’s negligence. The gap between vendor claims and reality couldn’t be wider.
Developer Lessons: The IoT Security Checklist
Flock’s failures provide a comprehensive checklist of IoT security mistakes developers must avoid. Here’s what they did wrong and what they should have done:
Operating System: ❌ Android Things 8 (discontinued 2021, hundreds of CVEs) → ✅ Current LTS OS with 5+ years active security support (Android 13+, hardened Linux)
Credential Management: ❌ Hardcoded Wi-Fi SSIDs, cleartext transmission → ✅ Unique per-device credentials, TLS 1.3+ encryption, certificate pinning
Data Protection: ❌ Unencrypted local storage, images never deleted → ✅ Full-disk encryption with device-unique keys, enforced retention policies
Access Controls: ❌ Optional 2FA for 4 years → ✅ Mandatory FIDO2/WebAuthn with anomaly detection, geofencing alerts
Physical Security: ❌ Exposed USB ports, accessible buttons, “physical access required” defense → ✅ Tamper-evident seals, no debug interfaces in production, assume physical compromise
Update Lifecycle: ❌ No visible update mechanism, static vulnerable OS → ✅ Automatic security updates, signed firmware, rollback protection
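The “device-unique keys” item in the data-protection row can be made concrete with HKDF (RFC 5869): derive each camera’s storage key from a master secret plus its serial number, so compromising one device never exposes another’s data. A stdlib-only sketch, where the master secret, salt label, and serial format are illustrative assumptions:

```python
import hashlib
import hmac

def hkdf(master: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    """RFC 5869 HKDF-SHA256: extract a PRK, then expand it to `length` bytes."""
    prk = hmac.new(salt, master, hashlib.sha256).digest()  # extract step
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                               # expand step
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

# Illustrative master secret; in practice this lives in an HSM, never in firmware.
MASTER = b"factory-master-secret"

def storage_key(serial: str) -> bytes:
    """Per-device storage encryption key bound to the camera's serial number."""
    return hkdf(MASTER, salt=b"alpr-storage-v1", info=serial.encode())

# Keys are unique per device and reproducible for the same device.
assert storage_key("CAM-0001") != storage_key("CAM-0002")
assert storage_key("CAM-0001") == storage_key("CAM-0001")
```

Because the derivation is deterministic, the device never needs to store the key at rest—it can re-derive it at boot from hardware-held secrets, which is what makes “unencrypted local storage” an avoidable failure rather than an inherent one.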
The critical lesson: For publicly deployed IoT devices, physical access assumptions are invalid. Cameras on public poles have zero physical security. Security must assume physical compromise and design defense-in-depth protections accordingly. Making security features optional ensures they won’t be used—2FA adoption hit 97% only after Congressional pressure forced Flock’s hand.
The OWASP IoT Top 10 framework exists precisely to prevent these failures. Flock violated 8 out of 10 standards. Every IoT developer should audit their systems against this checklist—hardcoded credentials, outdated components, insecure data storage, lack of update mechanisms. These aren’t edge cases. They’re fundamental security principles.
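For the hardcoded-credentials item, even a toy static scan over firmware configuration files catches the most obvious offenses. The patterns and sample config below are illustrative, not exhaustive—real audits should use dedicated secret scanners:

```python
import re

# Illustrative regexes for common hardcoded-credential smells in config files.
PATTERNS = {
    "hardcoded Wi-Fi SSID": re.compile(r'(?i)\b(wifi_?ssid|ssid)\s*[=:]\s*["\'][^"\']+["\']'),
    "hardcoded password":   re.compile(r'(?i)(passw(or)?d|secret|api_?key)\s*[=:]\s*["\'][^"\']+["\']'),
    "plaintext URL":        re.compile(r'(?i)\bhttp://'),
}

def audit(config_text: str) -> list[str]:
    """Return one finding per line that matches a known-bad pattern."""
    findings = []
    for lineno, line in enumerate(config_text.splitlines(), start=1):
        for label, pattern in PATTERNS.items():
            if pattern.search(line):
                findings.append(f"line {lineno}: {label}")
    return findings

sample = '''
wifi_ssid = "maintenance-net"
wifi_password = "hunter2"
upload_url = http://example.com/ingest
'''
for finding in audit(sample):
    print(finding)
```

Running this on the hypothetical sample flags all three lines. The point is not the tooling—it’s that failures like hardcoded SSIDs are detectable with minutes of effort, which makes shipping them a process failure, not a technical inevitability.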
What Happens Next
The FTC investigation will set precedent for IoT vendor accountability. If regulators impose consent decrees requiring security improvements, the entire ALPR market faces costly retrofits or replacements. Flock’s competitors—Motorola Vigilant, Genetec, Axon—will face increased scrutiny as well, since none has published a public security audit.
Legal challenges are inevitable. Defense attorneys will argue Flock evidence is inadmissible due to tampering risks. Civil liberties lawsuits may follow the Washington public records ruling. Class actions become possible if stolen credentials are exploited for stalking or harassment.
Flock’s options are limited: offer hardware replacement programs, face acquisition by a competitor, or risk collapse if vulnerabilities are exploited at scale within the next 12-24 months. The insurance industry may demand security certifications for IoT deployments, forcing market-wide changes.
The broader implication is clear: the IoT security correction has arrived. Vendors can’t deploy fundamentally insecure systems, dismiss vulnerabilities with “physical access required” excuses, and expect consequences to stay theoretical. Congressional oversight, municipal pushback, and public exposure are forcing accountability. Security-by-obscurity has failed again.
Key Takeaways
- Flock cameras compromised in 30 seconds via button sequence on publicly accessible devices—physical access is not a security barrier for public infrastructure
- Seven systematic failures violate 8/10 OWASP IoT standards: discontinued OS, hardcoded credentials, no encryption, optional 2FA
- Congressional investigation by Senators Wyden/Krishnamoorthi after stolen credentials found on Russian dark web forums
- Real-world impact: 4,000+ ICE warrantless searches, evidence tampering risks, San Diego seeking alternatives
- Developer lesson: Never use discontinued OSes, never make security features optional, assume physical compromise for public IoT devices