Security researcher Ben Jordan discovered this week that Flock Safety, whose 90,000 surveillance cameras log 20 billion vehicle scans monthly for more than 5,000 U.S. police departments, hardcoded the same credential 53 times across its infrastructure. CVE-2025-59407 (CVSS 9.8, Critical) documents the most egregious example: the password “flockhibiki17” baked into the Android apps running on its license plate readers. The Wi-Fi access point password? Literally “security.” A button sequence on the back of the device grants full root access. These aren’t edge cases; they’re architectural failures at a company pulling in $300M in annual revenue.
Meanwhile, cameras running Android 8 haven’t received OS security updates since 2021, and the token-minting vulnerability reported 55 days ago remains unpatched. This is critical infrastructure security theater.
Not One Mistake—Fifty-Three Instances of the Same Failure
Jordan’s research uncovered a default ArcGIS API key embedded in 53 separate JavaScript bundles. This auto-generated Esri credential carried nearly 1 million API credits and administrative privileges with zero restrictions: no referrer limits, no IP allowlists, no origin checks. Development credentials were running in production environments, exposing FlockOS’s entire mapping infrastructure.
What was accessible through this key? Live patrol vehicle GPS tracking. Body-worn camera locations. 911 call transcripts. License plate detection data. Officer mobile app locations. Drone telemetry. Names, addresses, phone numbers. Fifty private ArcGIS data layers consolidating surveillance data from 12,000 deployments: 5,000 police departments, 6,000 community HOA systems, 1,000 private businesses.
This wasn’t a developer forgetting to remove a test credential. Fifty-three instances signal the absence of code review, secret scanning tools, security audits, and basic secrets management. At this scale, it’s systemic negligence.
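Even rudimentary secret scanning catches this class of failure before release. Below is a minimal sketch of such a scan over built JavaScript bundles; the regex patterns and the `scan_bundles` helper are illustrative (real tools like TruffleHog or GitGuardian ship hundreds of detectors plus entropy analysis):

```python
import re
from pathlib import Path

# Illustrative patterns only: production scanners use far richer
# detectors and entropy checks. "AAPK" is the documented prefix
# of Esri ArcGIS API keys.
SECRET_PATTERNS = {
    "arcgis_api_key": re.compile(r"AAPK[A-Za-z0-9_\-]{40,}"),
    "hardcoded_password": re.compile(
        r"""password\s*[:=]\s*["'][^"']{4,}["']""", re.IGNORECASE
    ),
}

def scan_bundles(root: str) -> list[tuple[str, int, str]]:
    """Walk a build-output directory and report (file, line, rule) hits."""
    hits = []
    for path in Path(root).rglob("*.js"):
        for line_no, line in enumerate(
            path.read_text(errors="ignore").splitlines(), start=1
        ):
            for rule, pattern in SECRET_PATTERNS.items():
                if pattern.search(line):
                    hits.append((str(path), line_no, rule))
    return hits
```

Wired into CI or a pre-commit hook, a check like this fails the build the first time a credential lands in a bundle, rather than the fifty-third.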
Physical Access Equals Full Control
Flock cameras installed on public streets are physically accessible to anyone with a ladder. The company’s physical security model? A button sequence on the device back creates a Wi-Fi access point (SSID “flock-54E823”, password “security”). Connect to it, enable Android Debug Bridge, gain full root access. 9News quoted researchers: “30 seconds with a stick.”
Exposed USB ports accept malicious devices that execute scripts with full permissions. Sixty Condor PTZ cameras were reachable from the open internet with no authentication: live feeds, 30-day video archives, settings, and logs, all exposed. Jordan described watching “everything from playgrounds to parking lots with people, Christmas shopping and unloading their stuff into cars” without entering a single credential.
When surveillance tools lack basic physical security, they become liabilities. Police operational security—patrol routes, officer locations, tactical deployments—becomes public information for anyone willing to press a few buttons.
Four Years of Unpatched Vulnerabilities
Every Flock device runs Android 8, which reached end-of-life in 2021. Four years of accumulating CVEs sit unpatched on cameras deployed across 49 states. The company offers no upgrade path. Even if Flock patches today’s vulnerabilities, the underlying OS remains Swiss cheese.
Multi-factor authentication? Only 97% of customers use it, and it became the default for new users only in November 2024. Flock police login credentials trade on dark web marketplaces. The combination of stolen credentials and years of OS vulnerabilities creates compounding security debt on critical infrastructure.
This is the IoT lifecycle problem writ large: vendors ship hardware, abandon software support, leave customers with insecure devices indefinitely. Without regulatory requirements for security updates matching deployment lifespans, abandonware becomes the standard for “critical infrastructure.”
Real Abuses Enabled by Security Failures
These vulnerabilities enabled documented surveillance abuses. Between December 2024 and October 2025, over 50 federal, state, and local agencies ran hundreds of searches tracking First Amendment protesters. A Texas police officer searched Flock’s nationwide network for a woman who’d had a self-administered abortion—illegal in that state. Federal immigration enforcement agents accessed databases from 18 Washington state police agencies, often without the agencies’ knowledge. Three officers used the system to stalk ex-partners across jurisdictions.
San Jose police ran 3,965,519 searches through Flock’s license plate database in one year. The ACLU and Electronic Frontier Foundation filed a lawsuit in November 2025 challenging these warrantless searches. Representatives Raja Krishnamoorthi and Robert Garcia launched a congressional investigation into “invasive surveillance practices threatening the privacy, safety, and civil liberties of women, immigrants, and other vulnerable Americans.” The FTC sought to probe Flock’s cybersecurity protections.
Community pushback is mounting. Redmond, Lynnwood, and Olympia, Washington, turned off their cameras in late 2025. Mountlake Terrace cancelled its contract. When security failures enable stalking, protest tracking, and reproductive healthcare surveillance, the irony becomes inescapable: tools meant to enhance security become weapons targeting the innocent.
How Developers Prevent Hardcoded Credentials
CWE-798 (Use of Hard-coded Credentials) is entirely preventable. What went wrong at Flock reads like a checklist of what not to do: no secrets management (credentials hardcoded in 53 files), no secret scanning (GitGuardian or TruffleHog would have caught the exposed API key), no code review process that flags credentials, development keys running in production, no regular security audits, and a slow patch response (55 days and counting for a critical bug).
Prevention strategies exist: HashiCorp Vault or AWS Secrets Manager for credential storage, pre-commit hooks blocking credential commits, environment variable injection instead of hardcoding, explicit CWE-798 checks in code review checklists, security training teaching the “why” behind secrets management, third-party penetration testing, and bug bounty programs incentivizing responsible disclosure.
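The environment-variable injection point can be as simple as a loader that fails loudly at startup. A minimal sketch, assuming a hypothetical `ARCGIS_API_KEY` variable supplied by the deployment environment or a secrets manager (the variable name and `require_secret` helper are illustrative, not Flock's code):

```python
import os

class MissingSecretError(RuntimeError):
    """Raised when a required secret is absent from the environment."""

def require_secret(name: str) -> str:
    """Read a secret from the environment, failing fast at startup
    instead of silently falling back to a hardcoded default."""
    value = os.environ.get(name)
    if not value:
        raise MissingSecretError(
            f"{name} is not set; inject it via your secrets manager "
            "(e.g. Vault or AWS Secrets Manager), never a source-code default."
        )
    return value

# Hypothetical usage: the key exists only at runtime, so it can never
# end up baked into a shipped JavaScript bundle or APK.
# arcgis_key = require_secret("ARCGIS_API_KEY")
```

The design point is the fail-fast behavior: a missing secret breaks deployment immediately and visibly, which removes the temptation to "temporarily" hardcode a working value.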
Industry best practices that Flock violated include changing default passwords (they used “security”), mandatory MFA (only 97% adoption), never exposing cameras to the internet (60 were publicly accessible), regular firmware updates (Android 8 reached EOL in 2021), physical security protections (button sequences grant access), and API restrictions (no referrer or IP limits on their key).
Flock Safety is now a case study in security debt at scale. Your mistakes might not compromise 5,000 police departments, but the lesson applies at every scale: secrets management, code review, security audits, and rapid patching aren’t optional. They’re table stakes for building systems people trust.
Key Takeaways
- Systemic failure, not one-off: 53 instances of hardcoded credentials indicates absence of security processes, not developer mistakes
- Physical security matters: Publicly accessible cameras with trivial button sequences and exposed USB ports become attack vectors
- OS lifecycle = deployment lifecycle: Deploying Android 8 in 2026 on devices expected to last 5+ years guarantees years of unpatched CVEs
- Security debt enables abuse: Documented cases of stalking, protest tracking, and reproductive surveillance show theoretical vulnerabilities become real harm
- Prevention is straightforward: Secrets managers, secret scanning, pre-commit hooks, MFA, regular audits, and rapid patching prevent CWE-798 at any scale