A senior Google scientist has issued a sharp warning to European Union antitrust regulators, claiming that the bloc’s proposal to force the company to share sensitive search engine data with competitors like OpenAI could expose European users’ private information to serious re-identification risks.
Sergei Vassilvitskii, a distinguished scientist at Google since 2012 and a recognized leader in data science and algorithms, is scheduled to meet with European Commission officials on Wednesday to express deep concerns about the plan and propose stronger safeguards.
In exclusive written comments to Reuters, Vassilvitskii said Google’s internal AI red team, a group of ethical hackers tasked with simulating real-world attacks, was able to re-identify individual users from supposedly anonymized data in less than two hours.
“We are concerned because the EC’s approach to anonymization fails to protect Europeans’ privacy: our red team managed to re-identify users in less than two hours,” he said. “We are eager to share our technical expertise and work with the EC to establish the right guardrails and protect Europeans from privacy harm.”
The rebuke represents Google’s strongest public pushback yet against the European Commission’s efforts to open up its dominant search business under the Digital Markets Act (DMA), the EU’s landmark legislation designed to curb the power of Big Tech gatekeepers.
EU’s Push for Search Data Sharing
Last month, the Commission outlined proposals that would require Google to share critical search data, including ranking signals, user queries, clicks, and views, with rivals on “fair, reasonable, and non-discriminatory” terms. The goal is to foster greater competition in search and help challengers, particularly AI-powered entrants like OpenAI, build better alternatives.
The Commission is expected to finalize the measures by July 27 after gathering feedback. Non-compliance could result in massive fines of up to 10% of Google’s global annual revenue — potentially tens of billions of dollars.
Google has repeatedly warned that the proposal amounts to regulatory overreach that could undermine user privacy and security while handing sensitive proprietary information to competitors. Vassilvitskii’s intervention adds significant technical weight to those arguments, highlighting the practical difficulties of truly anonymizing complex behavioral data in the age of powerful AI systems.
Modern AI models are increasingly adept at de-anonymizing datasets by cross-referencing patterns, even when direct identifiers are removed. Vassilvitskii’s comments suggest the Commission’s current anonymization framework may not be robust enough to withstand determined efforts by sophisticated actors.
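The cross-referencing attack described above can be illustrated with a minimal sketch. This is not Google's red-team method, and all names and records below are invented: it shows only the general principle of a linkage attack, where an attacker matches pseudonymous "anonymized" sessions against auxiliary data (such as public posts) by overlapping quasi-identifiers like distinctive query topics.

```python
# Illustrative linkage attack (invented data; not Google's method).
# Direct identifiers are removed from the log, but each pseudonymous
# session still carries its query history -- a quasi-identifier.
anonymized_log = {
    "session_a1": {"rare knitting patterns", "bus times zone 4", "gluten-free bakery"},
    "session_b2": {"quantum error correction", "visa rules", "bus times zone 4"},
}

# Auxiliary data the attacker already holds, e.g. public posts that
# link a known person to topics they discuss.
auxiliary = {
    "Alice": {"rare knitting patterns", "gluten-free bakery"},
    "Bob": {"quantum error correction", "chess openings"},
}

def reidentify(log, aux):
    """Match each pseudonymous session to the known person whose
    public interests overlap its queries the most."""
    matches = {}
    for session, queries in log.items():
        best = max(aux, key=lambda person: len(queries & aux[person]))
        if queries & aux[best]:  # require at least one overlapping query
            matches[session] = best
    return matches

print(reidentify(anonymized_log, auxiliary))
# -> {'session_a1': 'Alice', 'session_b2': 'Bob'}
```

Even this toy version de-anonymizes both sessions despite the absence of names or account IDs, which is the core of Vassilvitskii's argument: stripping direct identifiers does not make rich behavioral data anonymous.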
The dispute sits at the intersection of competition policy, technological innovation, and data privacy — three pillars of the EU’s approach to regulating Big Tech. Brussels has grown increasingly assertive in recent years, viewing dominant platforms like Google as gatekeepers that stifle competition and harm consumers.
However, the aggressive stance has drawn criticism from the United States, which has accused the EU of targeting American companies while protecting its own interests. The tension underlines a wider transatlantic divide over how to govern the digital economy.
Search remains an enormously lucrative business that funds much of Google’s broader innovation, including heavy investments in artificial intelligence. Forcing the company to share core search signals could erode its competitive moat and accelerate the rise of AI-first search challengers.
Google’s Counter-Position
Vassilvitskii emphasized that Google is not opposed to competition but insists any data-sharing mandate must include ironclad protections. He plans to offer the Commission access to Google’s technical expertise to develop better anonymization techniques and guardrails.
The scientist’s intervention is notable because he is not a typical corporate spokesperson but a respected technical expert with deep domain knowledge. His willingness to engage directly with regulators signals how seriously Google views the threat posed by the current proposals.
The outcome of this regulatory battle could have far-reaching consequences. If the EU proceeds with broad data-sharing requirements without stronger privacy protections, it could set a precedent that influences how other jurisdictions approach Big Tech regulation. Conversely, if Google succeeds in pushing for more robust safeguards, it may temper the scope of the Commission’s ambitions.
As the July 27 deadline approaches, the meeting between Vassilvitskii and EU officials could prove pivotal. Google’s message is that competition should not come at the expense of user privacy and security.