EU Clears Path for National Social Media Bans for Minors in New Digital Guidelines

The European Commission on Monday released new guidelines under its powerful Digital Services Act (DSA), formally allowing member states to impose national restrictions—or outright bans—on minors’ access to social media.

Though nonbinding, the move represents a significant policy shift that is expected to drastically reshape how digital platforms operate across the European Union, especially as several countries prepare to implement age-specific laws.

The guidelines come amid mounting public and political pressure on the EU to act decisively in protecting children online. National governments in France, Denmark, Spain, Greece, and the Netherlands have long criticized the Commission for dragging its feet on the matter, arguing that children are being exposed to addictive features, harmful content, and privacy risks without meaningful regulatory barriers.

The Commission’s decision effectively gives member states the green light to take the lead. Countries can now enforce stricter rules, such as banning users under a certain age or requiring parental consent, within the framework of the DSA, which already obliges major platforms to assess and mitigate systemic risks, particularly for vulnerable users like children.

Denmark’s Digital Minister Caroline Stage Olsen, who presented the guidelines in Brussels alongside EU tech chief Henna Virkkunen, said: “Age verification is not a nice to have. It’s absolutely essential.”

Major Impact Expected on Platform User Base

With countries such as France and the Netherlands pushing for full bans on social media use for children under 15, and others like Greece and Denmark favoring mandatory parental consent for underage users, platforms like TikTok, Instagram, Snapchat, and YouTube are now facing the prospect of losing millions of users across the EU.

Industry analysts say the guidelines could lead to a significant contraction of the under-18 user base in Europe over the next two years, depending on how national governments implement the policy.

According to internal estimates from some platforms, minors make up as much as 25–30% of active daily users in some EU countries. Losing that segment would ripple across everything from content moderation algorithms to ad targeting and monetization strategies.

Age Verification App Moves Toward Deployment

To aid in enforcing these changes, the Commission also released technical specifications for a new age verification app that will allow users to confirm their age using government-issued ID or facial recognition technology. The app is slated for testing in five EU countries—France, Greece, Spain, Italy, and Denmark—all of which are actively developing or considering their own national restrictions.

The EU said that while the app is voluntary, it is designed to serve as a common infrastructure that can be adapted by countries setting different minimum age thresholds, whether 13, 15, or 18. National authorities will be able to integrate the app into their regulatory frameworks, while companies will be expected to make use of it when enforcing local rules.
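For illustration only, the sketch below shows one way a platform might build on such shared infrastructure: instead of handling ID documents itself, the platform asks the verification app for a yes/no attestation against the minimum age configured for the user’s country. All names (AgeAttestation, may_register) and the per-country threshold values are hypothetical and are not taken from the Commission’s technical specifications.

```python
# Hypothetical sketch of how a platform could enforce country-specific minimum
# ages on top of a shared age verification service. Names and threshold values
# are placeholders for illustration, not the EU's actual specification.

from dataclasses import dataclass

# Example national thresholds (13, 15, or 18), set by each member state.
MINIMUM_AGE_BY_COUNTRY = {
    "FR": 15,  # placeholder, reflecting France's push for a ban under 15
    "DK": 15,  # placeholder
}
DEFAULT_MINIMUM_AGE = 13  # placeholder fallback

@dataclass
class AgeAttestation:
    """Yes/no answer from the verification app: is the user at least `threshold` years old?"""
    country: str
    threshold: int
    is_over_threshold: bool

def may_register(attestation: AgeAttestation) -> bool:
    """Allow sign-up only if the attestation satisfies the local minimum age."""
    required = MINIMUM_AGE_BY_COUNTRY.get(attestation.country, DEFAULT_MINIMUM_AGE)
    # The platform never sees a birth date or ID document, only the yes/no
    # result for the threshold it asked the verification app about.
    return attestation.threshold >= required and attestation.is_over_threshold

# A user verified as over the French threshold may register; an unverified one may not.
print(may_register(AgeAttestation(country="FR", threshold=15, is_over_threshold=True)))   # True
print(may_register(AgeAttestation(country="FR", threshold=15, is_over_threshold=False)))  # False
```

The design choice sketched here is the one the guidelines imply: a single verification mechanism that each country parameterizes with its own threshold, so platforms consume a common attestation rather than building separate ID checks per member state.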

The initiative comes as platforms face increased scrutiny for relying on self-declared age information, which has proven ineffective in preventing minors from accessing age-inappropriate content.

“It’s hard to imagine a world where kids can enter a store to buy alcohol, go to a nightclub by simply stating that they are old enough, no bouncers, no ID checks, just a simple yes, I am over the age of 18,” said Stage Olsen, adding that this is what “has been the case online for many years.”

What Platforms Are Now Expected to Do

Beyond enforcing access restrictions, the Commission’s guidelines also outline best practices that platforms should follow to protect minors still allowed on their services. Key recommendations include:

  • Turning off addictive features such as “streaks,” “likes,” and read receipts that pressure children into prolonged use.
  • Disabling camera and microphone access by default for underage users.
  • Making accounts private by default and restricting who can view or interact with them.
  • Eliminating behavioral tracking to prevent platforms from using children’s browsing habits to personalize content or ads.
  • Deploying a risk-based approach, requiring platforms to evaluate their systems for possible harms to children and take tailored steps to mitigate them.

Though these guidelines are technically voluntary, enforcement of the DSA means that platforms that fail to demonstrate efforts to comply could face hefty fines of up to 6% of their global turnover and risk being suspended in the EU altogether.

Tech Industry Pushes Back Against Fragmentation

In response, major tech firms have launched a lobbying campaign, arguing that the guidelines could lead to a fragmented regulatory landscape across the EU. Companies warn that if each member state adopts its own set of age rules and enforcement tools, it will become increasingly difficult—and costly—for platforms to comply.

Meta, which owns Instagram and Facebook, said in a statement that while it supports “a harmonized and transparent approach to age verification,” national-level fragmentation could “undermine the effectiveness of common digital frameworks” and “risk confusing users.”

Other companies, including TikTok and Snap, are also said to be concerned about the requirement to redesign user interfaces and backend systems on a country-by-country basis, especially given that minors often lie about their age at sign-up.

A New Era of Regulated Access

The Commission’s move marks one of the most ambitious attempts yet by a global regulator to limit and reshape how young people access digital platforms. With the Digital Services Act now in full force and member states emboldened to set national thresholds, the days of unrestricted access to social media for minors in Europe may be drawing to a close.

France is expected to begin debating a bill that would bar users under 15 from joining social platforms without verified parental consent. Spain and Italy are expected to follow closely, while Denmark is currently revising its digital policy framework to mandate stricter protections for children.
