
Google’s Universal Commerce Protocol barely had time to launch before a consumer watchdog’s warning about “surveillance pricing” went viral with 400,000 views. Lindsay Owens, executive director of the Groundwork Collaborative, issued the alert on January 13—just two days after Google unveiled UCP at the National Retail Federation conference. Her concern: the protocol could enable merchants to analyze your AI shopping conversations and charge you more than the person next to you.
Google denied the claims the same day, describing the feature in question as “standard upselling.” But the controversy exposes the central tension in AI shopping: convenience versus privacy. And for developers building e-commerce experiences, that tension isn’t theoretical—it’s a design problem you’ll need to solve.
What Is Google’s Universal Commerce Protocol?
UCP is an open standard that lets AI shopping agents complete purchases end-to-end without jumping between apps or websites. Announced January 11, it has backing from 20+ major retailers including Walmart, Target, Shopify, Home Depot, Best Buy, Macy’s, Etsy, and Wayfair. The goal: solve the fragmentation problem where every retailer required custom integrations.
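What does an agent-completed purchase actually look like on the wire? The spec details aren’t covered here, so the TypeScript below is only a sketch of the general shape such a protocol implies—every type and field name is an assumption, not UCP’s real schema: the agent submits a tokenized cart, the merchant endpoint confirms price and availability, and the order completes without the user ever leaving the conversation.

```typescript
// Hypothetical shapes only -- not the actual UCP schema.
interface AgentCheckoutRequest {
  merchantId: string;          // merchant registered with the protocol
  items: { sku: string; quantity: number }[];
  paymentToken: string;        // tokenized payment credential, never raw card data
  shippingAddressId: string;   // reference to an address the user has already approved
}

interface AgentCheckoutResponse {
  orderId: string;
  totalCents: number;          // merchant-confirmed price, which must match the listed price
  status: "confirmed" | "requires_user_approval" | "rejected";
}

// A minimal client an AI shopping agent might use to complete a purchase end-to-end.
async function completePurchase(
  endpoint: string,
  request: AgentCheckoutRequest
): Promise<AgentCheckoutResponse> {
  const res = await fetch(`${endpoint}/checkout`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(request),
  });
  if (!res.ok) {
    throw new Error(`Checkout failed: ${res.status}`);
  }
  return (await res.json()) as AgentCheckoutResponse;
}
```

Whatever the real schema turns out to be, the plumbing is the easy part; the contentious part is what contextual data travels alongside the cart.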
The timing isn’t accidental. AI agents drove 20% of retail sales during the 2025 holiday season, according to Salesforce. Google, Shopify, and Microsoft are racing to become the default interface for AI shopping, and UCP is Google’s bid to set the standard. OpenAI launched a competing protocol in September 2025 with Stripe, setting up a battle that will force developers to choose sides—or support both.
The Surveillance Pricing Warning
Owens’ warning centered on a feature in UCP’s roadmap called “personalized upselling.” The fear: AI agents have access to your shopping conversations, browsing history, and personal data. Combine that with machine learning, and you get surveillance pricing—individualized prices based on what an algorithm thinks you’ll pay.
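The mechanism isn’t exotic. As a deliberately simplified, hypothetical illustration (this is not UCP code or any retailer’s actual logic, and every signal and weight below is invented), a handful of TypeScript lines is all it takes to turn conversation-derived signals into a per-user price:

```typescript
// Hypothetical illustration of surveillance pricing -- not real UCP or retailer code.
interface ShopperSignals {
  mentionedBudget?: number;      // "I can spend up to $300" extracted from the chat
  urgencyScore: number;          // 0..1, e.g. "I need this before Friday"
  pastPremiumPurchases: number;  // count of high-margin items in purchase history
}

// Adjusts a base price upward based on inferred willingness to pay.
function personalizedPrice(baseCents: number, s: ShopperSignals): number {
  let multiplier = 1.0;
  if (s.mentionedBudget && s.mentionedBudget * 100 > baseCents) multiplier += 0.05;
  multiplier += 0.08 * s.urgencyScore;
  multiplier += Math.min(0.07, 0.01 * s.pastPremiumPurchases);
  return Math.round(baseCents * multiplier);
}

// Two shoppers, same product, different prices.
console.log(personalizedPrice(19900, { urgencyScore: 0.1, pastPremiumPurchases: 0 })); // 20059
console.log(personalizedPrice(19900, { mentionedBudget: 300, urgencyScore: 0.9, pastPremiumPurchases: 5 })); // 23323
```

The point isn’t that Google or its retail partners ship anything like this; it’s that the capability is trivial to build once chat context reaches the pricing layer.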
“New technologies like better data collection and smarter algorithms are turbocharging [surveillance pricing] and costing Americans a small fortune,” Owens stated. Her organization has spent years exposing corporate profiteering, from algorithmic rent pricing to grocery markups.
Google’s response was swift but narrow. The company said it “strictly prohibits merchants from showing prices on Google that are higher than what is reflected on their site” and called upselling “a standard way for retailers to show additional premium product options.” That’s technically true, but it sidesteps the core issue: the technical capability exists even if policy currently forbids it.
This isn’t theoretical fearmongering. Consumer Reports exposed the scale of AI-driven surveillance pricing in late 2025, prompting Instacart to halt its experiments. New York passed the Algorithmic Pricing Disclosure Act in May 2025 (effective November 2025), requiring retailers to disclose when algorithms set prices using personal data. The FTC is investigating the practice industry-wide.
Why Only 17% of Consumers Trust AI Shopping
UCP is launching into a trust crisis. Only 17% of shoppers feel comfortable letting AI complete a purchase, according to a ChannelEngine study of 4,500 consumers. The top concern: payment security, cited by about one-third of respondents.
Here’s the paradox: 58% of shoppers already use AI tools to research products. The gap between research and purchase is a UX problem, not a technology problem. Consumers want AI help, but they don’t trust AI enough to hand over their credit card.
That 17% figure is the barrier keeping agentic commerce out of the mainstream. Every percentage point matters: AI shopping could hit 30-40% of sales by the 2026 holiday season if trust improves, or stall at 20% if it doesn’t.
What This Means for Developers
E-commerce developers are caught in the middle of this debate. Implement UCP (or OpenAI’s competing protocol), and you enable seamless AI shopping. Ignore it, and your merchants risk losing visibility in AI-powered interfaces where 20% of sales already happen.
The technical challenge isn’t protocol implementation—Google provides SDKs and documentation. The challenge is UX: how do you build trust when only 17% of users are comfortable? Privacy-first design becomes a competitive advantage, not a compliance checkbox.
In practice, that means being transparent about pricing logic, disclosing clearly when AI influences a price, and giving users control over what data feeds recommendations. Skills like privacy engineering and conversational commerce UX move from nice-to-have to required.
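A minimal sketch of what that could look like, assuming nothing about UCP’s actual API (all type and field names here are illustrative): a disclosure object the checkout UI surfaces before an AI-assisted purchase, checked against the user’s own data-use preferences.

```typescript
// Hypothetical privacy-first checkout disclosure -- field names are illustrative, not from any spec.
type PricingSignal = "browsing_history" | "chat_context" | "location" | "purchase_history";

interface PriceDisclosure {
  listedPriceCents: number;        // price shown on the merchant's own site
  offeredPriceCents: number;       // price offered in this AI session
  algorithmicPricing: boolean;     // true if an algorithm set or adjusted the price
  signalsUsed: PricingSignal[];    // which personal data fed the price or recommendation
}

interface UserPrivacyControls {
  allowedSignals: Set<PricingSignal>;  // user-managed allow-list
}

// Blocks checkout unless the offer is transparent and respects the user's data choices.
function canProceed(d: PriceDisclosure, controls: UserPrivacyControls): boolean {
  const priceMatchesListing = d.offeredPriceCents <= d.listedPriceCents;
  const onlyApprovedSignals = d.signalsUsed.every((s) => controls.allowedSignals.has(s));
  return priceMatchesListing && onlyApprovedSignals;
}

// Example: a user who has opted out of chat-context pricing.
const controls: UserPrivacyControls = {
  allowedSignals: new Set<PricingSignal>(["purchase_history"]),
};
const disclosure: PriceDisclosure = {
  listedPriceCents: 19900,
  offeredPriceCents: 19900,
  algorithmicPricing: true,
  signalsUsed: ["chat_context"],
};
console.log(canProceed(disclosure, controls)); // false -> prompt the user before continuing
```

A flag like `algorithmicPricing` also maps naturally onto the kind of disclosure New York’s law now requires.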
You’ll also need to navigate the protocol war. Google’s UCP has broader retail backing, but OpenAI’s protocol launched first and has ChatGPT’s massive user base. Fragmentation is expensive—supporting both protocols doubles integration work.
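One way to contain that cost is to keep storefront code behind a single interface and confine protocol differences to thin adapters. The sketch below stubs both adapters, since neither SDK’s real API is assumed here:

```typescript
// A thin abstraction so storefront code doesn't care which commerce protocol wins.
// Adapter internals are stubbed -- the real calls depend on each protocol's SDK.
interface CheckoutResult {
  orderId: string;
  totalCents: number;
}

interface CommerceProtocolAdapter {
  name: string;
  checkout(cartId: string, paymentToken: string): Promise<CheckoutResult>;
}

class UcpAdapter implements CommerceProtocolAdapter {
  name = "google-ucp";
  async checkout(cartId: string, paymentToken: string): Promise<CheckoutResult> {
    // TODO: call Google's UCP SDK here once integrated.
    return { orderId: `ucp-${cartId}`, totalCents: 0 };
  }
}

class OpenAiCommerceAdapter implements CommerceProtocolAdapter {
  name = "openai-commerce";
  async checkout(cartId: string, paymentToken: string): Promise<CheckoutResult> {
    // TODO: call the OpenAI/Stripe protocol client here once integrated.
    return { orderId: `oac-${cartId}`, totalCents: 0 };
  }
}

// Storefront code depends only on the interface; swapping or adding protocols is one line.
async function placeOrder(adapter: CommerceProtocolAdapter, cartId: string, token: string) {
  const result = await adapter.checkout(cartId, token);
  console.log(`Order ${result.orderId} placed via ${adapter.name}`);
}
```

The adapter is the only piece you write twice; cart logic, disclosure UI, and privacy controls stay protocol-agnostic.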
What’s Next for AI Shopping
Regulatory pressure is building. The FTC is investigating surveillance pricing practices, and federal legislation similar to New York’s disclosure law seems likely. Developers should expect compliance requirements to tighten, not loosen.
On the industry side, Modern Retail predicts “2026 will be the year retailers, tech giants and startups jockey to determine whose AI agent becomes the default interface for shopping.” That’s not hype—traffic from AI-powered answer engines surged 1,200% while traditional search declined 10% year-over-year.
The surveillance pricing debate will define whether consumers embrace this future or reject it. If Google and retailers prioritize transparency and user control, trust could climb from 17% to mainstream adoption. If surveillance pricing scandals dominate headlines, that 17% could shrink.
For developers, the choice is clear: build trust through privacy-first design, or watch users abandon AI shopping before it reaches its potential. The protocol exists. The market is shifting. What’s missing is trust—and that’s a problem only thoughtful design can solve.