A story about browser fingerprinting has Hacker News developers alarmed: 662 upvotes and 386 comments in a day. Researchers at Texas A&M and Johns Hopkins presented evidence at the WWW 2025 conference that websites track you via browser fingerprints you cannot clear. Unlike cookies, which you can delete with a button, fingerprints are persistent identifiers derived from your hardware and software. The research, built on the team’s FPTrace framework, is the first concrete evidence that fingerprinting drives ad targeting and cross-site tracking.
You Can’t Clear Your Browser Fingerprint
Browser fingerprinting derives a unique identifier from your device’s characteristics: screen resolution, timezone, installed fonts, GPU capabilities. The identifier persists across browser sessions, and no “Clear Browsing Data” button removes it. You’re tracked until your hardware or software configuration changes.
Any given fingerprint is shared by only about 1 in 286,777 browsers. The technique combines Canvas fingerprinting (2D rendering variations), WebGL fingerprinting (3D GPU characteristics), and font detection. Google’s own 2019 statement captured the problem: “users cannot clear their fingerprint and therefore cannot control how their information is collected.” Cookie consent banners are theater while fingerprinting runs unchecked in the background.
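The composition step is simple to sketch. Here is a minimal, self-contained illustration of how a tracker folds stable device attributes into one identifier; the function names and the FNV-1a hash are illustrative choices, not taken from any specific tracking library, and in a real browser the attributes would be read from `screen`, `Intl.DateTimeFormat`, font probing, and WebGL:

```javascript
// Simple 32-bit FNV-1a hash — stands in for whatever digest a real script uses.
function fnv1a(str) {
  let h = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    h = Math.imul(h ^ str.charCodeAt(i), 0x01000193) >>> 0;
  }
  return h >>> 0;
}

// Hypothetical helper: combine stable attributes into one short identifier.
// In a browser these values would come from screen.width/height,
// Intl.DateTimeFormat().resolvedOptions().timeZone, font detection, and
// the WebGL renderer string.
function fingerprint(attrs) {
  const parts = [
    attrs.screen,            // e.g. "2560x1440"
    attrs.timezone,          // e.g. "America/Chicago"
    attrs.fonts.join(','),   // detected installed fonts
    attrs.gpu,               // WebGL renderer string
  ];
  return fnv1a(parts.join('|')).toString(16);
}

const id = fingerprint({
  screen: '2560x1440',
  timezone: 'America/Chicago',
  fonts: ['Arial', 'Fira Code', 'Noto Sans'],
  gpu: 'ANGLE (NVIDIA GeForce RTX 3060)',
});
console.log(id); // same device configuration → same identifier, every session
```

Nothing here is stored on disk, which is exactly the point: there is nothing to clear, because the identifier is recomputed from the device itself on every visit.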
First Proof That Fingerprinting Drives Tracking
The FPTrace framework settles the debate with an elegant methodology: if fingerprinting influences tracking, then altering a browser’s fingerprint should affect advertiser bidding. It does. When researchers changed fingerprints, bid values dropped and HTTP syncing events decreased, confirming that fingerprints drive targeting.
Tracking persisted even when users cleared cookies, and even users who explicitly opt out under GDPR or CCPA are tracked silently through fingerprinting. Privacy laws assume user control exists; fingerprinting breaks that assumption entirely.
Google’s February 2025 Betrayal
Google reversed its anti-fingerprinting policy on February 16, 2025. The company now allows advertisers using its products to employ fingerprinting techniques. This directly contradicts Google’s 2019 position calling fingerprinting “incompatible with user choice” and “wrong.”
The UK Information Commissioner’s Office didn’t mince words in its official response, warning that “organisations do not have free rein to use fingerprinting” and must comply with data protection law, including transparency and consent requirements. Google justified the reversal by citing Connected TVs and app environments where cookies don’t apply. Translation: profit over principle. Chrome gives Google roughly 65% of browser market share, so when Google enables fingerprinting, it becomes the industry standard.
How Fingerprinting Actually Works
Three techniques dominate. Canvas fingerprinting renders invisible 2D graphics, then reads the pixel differences caused by font and hardware variations: the browser’s toDataURL() function extracts the rendered pixels, which get hashed into an identifier. WebGL fingerprinting queries your GPU’s 3D capabilities by calling gl.getParameter() for hardware constants, and can even reveal the iPhone model on mobile devices. Font fingerprinting measures rendered text dimensions to detect which fonts are installed.
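The two browser-side probes can be sketched as below. This is a hedged illustration, not code from any real tracker: the function names are hypothetical, the canvas and WebGL calls are standard browser APIs that only run in a page (they are defined but not invoked here), and `combine()` is a stand-in digest that runs anywhere:

```javascript
// Browser-only sketch: render text the user never sees, then serialize the
// pixels. Font metrics and antialiasing differ subtly across GPUs, drivers,
// and installed fonts, so the data URL differs per device.
function canvasFingerprint() {
  const canvas = document.createElement('canvas');
  const ctx = canvas.getContext('2d');
  ctx.textBaseline = 'top';
  ctx.font = '14px Arial';
  ctx.fillText('fingerprint probe \u{1F600}', 2, 2);
  return canvas.toDataURL(); // trackers hash this string
}

// Browser-only sketch: query hardware constants that vary by GPU and driver.
function webglFingerprint() {
  const gl = document.createElement('canvas').getContext('webgl');
  if (!gl) return 'no-webgl';
  const ext = gl.getExtension('WEBGL_debug_renderer_info');
  return [
    ext ? gl.getParameter(ext.UNMASKED_RENDERER_WEBGL) : 'masked',
    gl.getParameter(gl.MAX_TEXTURE_SIZE),
    gl.getParameter(gl.MAX_VERTEX_ATTRIBS),
  ].join('|');
}

// Pure helper (runs anywhere): fold the probe strings into one short ID
// using an FNV-1a-style hash.
function combine(parts) {
  let h = 0x811c9dc5;
  for (const ch of parts.join('|')) {
    h = Math.imul(h ^ ch.codePointAt(0), 0x01000193) >>> 0;
  }
  return h.toString(16);
}
```

A tracker would call `combine([canvasFingerprint(), webglFingerprint()])` on page load and send the result home; no storage API is touched, so nothing shows up in the cookie jar.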
Most websites use Canvas and WebGL together for maximum accuracy. Hardware differences guarantee uniqueness, and there’s no browser API to “clear” your GPU characteristics or font library: the fingerprint is inherent to your device.
What Developers Can Actually Do
Three defenses exist, each with trade-offs. Tor Browser offers the strongest protection by making all users look identical: fingerprint homogeneity, with Canvas and WebGL blocked automatically. Adoption is minimal (0.017% of identification events) because of usability friction. Firefox’s resistFingerprinting feature (enabled via about:config) reaches 0.48% adoption. Brave takes a different approach, randomizing Web Audio APIs, WebGL parameters, and navigator.hardwareConcurrency instead of enforcing uniformity. The paradox: randomization may actually increase uniqueness.
For developers wanting to harden Firefox: set privacy.resistFingerprinting to true, enable privacy.trackingprotection.enabled, and disable WebGL with webgl.disabled. Avoid Chromium-based browsers; Chromium exposes a larger fingerprinting surface than Firefox’s Gecko engine. The real solution, though, requires regulation: fingerprinting should face the same consent requirements as cookies. Until then, privacy laws are failing. Test your own fingerprint’s uniqueness at EFF’s Cover Your Tracks.
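The three preferences named above can be applied in one place via a user.js file in the Firefox profile directory, which Firefox reads at startup. A minimal sketch (the first and third pref names are stated in the text; the exact pref key for tracking protection, privacy.trackingprotection.enabled, is an assumption worth verifying against your Firefox version in about:config):

```javascript
// user.js — drop into your Firefox profile directory; applied at startup.
user_pref("privacy.resistFingerprinting", true);      // homogenize reported values
user_pref("privacy.trackingprotection.enabled", true); // block known trackers (assumed key)
user_pref("webgl.disabled", true);                     // remove the WebGL surface entirely
```

Note the trade-off: with webgl.disabled, sites that legitimately need 3D rendering (maps, games, design tools) will break, and resistFingerprinting can make some pages report the wrong timezone by design.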