Estonia is the rare EU country opposing bans on children’s social media use


In short: Estonia and Belgium are the only two EU member states to have declined the Jutland Declaration, an October 2025 pan-European commitment to restrict children’s access to social media. Estonia’s ministers argue that age-based bans are unenforceable, that children will find ways around them, and that the correct approach is to enforce the GDPR against the platforms themselves and invest in digital literacy rather than restricting young people’s participation in the information society.

The declaration most EU countries signed

On 10 October 2025, digital ministers from 25 of the European Union’s 27 member states signed the Jutland Declaration at an informal gathering in Horsens, Denmark. Norway and Iceland also signed. The declaration is a non-binding political commitment to introduce privacy-preserving age verification on social media platforms, protect minors from addictive design features and dark patterns, and work toward what the document describes as a “digital legal age” for access to online services.

Estonia and Belgium were the two EU members that declined. Belgium’s refusal came from a veto by Flemish Media Minister Cieltje Van Achter, who described the declaration’s age verification requirements as disproportionate and objected to requiring children to use national identity systems such as Itsme to access services like YouTube or Instagram. Estonia’s refusal was substantively different: principled rather than procedural, and rooted in a broader argument about where Europe’s regulatory effort should be directed.

The political momentum the declaration reflects is considerable. Europe’s social media age shift accelerated through 2025 and into 2026, with Australia implementing the world’s first ban on under-16s from December 2025, France passing legislation in January 2026 to prohibit under-15s, Spain enacting restrictions for under-16s in February 2026, and Austria moving to restrict children under 14. Greece announced it would ban under-15s from social media from 2027, part of a six-country EU grouping that also includes Denmark, France, Austria, Portugal, and Spain. On 20 November 2025, the European Parliament backed a non-binding resolution calling for an EU-wide digital minimum age of 16 by 483 votes to 92, with 86 abstentions, and called on the European Commission to incorporate the measure into the forthcoming Digital Fairness Act.

Why Estonia said no

Estonia’s dissent is articulated by two ministers who have approached the question from different but complementary angles. Kristina Kallas, Minister of Education and Research, has been the more outspoken critic of the ban consensus. At a Politico forum in Barcelona, Kallas argued that age restrictions place responsibility on the wrong party. “The way to approach this, to me, is not to make kids responsible for that harm and start self-regulating,” she said. The corollary, in her view, is that responsibility should fall on the platforms instead. “Europe pretends to be weak when it comes to big American and international corporations,” she told the forum, challenging the EU to “actually take this power and start regulating the big American corporations.” She was also direct about the practical limits of ban-based approaches: “kids will find very quickly the ways to go around and to still use social media.” That argument connects to Europe’s broader effort to assert its regulatory power over American technology companies, a project that has gathered considerable momentum since 2025 but has not yet been applied with comparable force to social media content governance.

Liisa-Ly Pakosta, Minister of Justice and Digital Affairs, has framed the positive case for Estonia’s preferred approach. “Estonia believes in an information society and including young people in the information society,” she has said, emphasising digital participation rather than exclusion. Pakosta has pointed to the General Data Protection Regulation as the enforcement mechanism already available: the GDPR prohibits platforms from processing children’s personal data without appropriate consent and carries fines of up to 4% of global annual turnover for violations. Estonia’s argument, in essence, is that Europe has not exhausted its existing tools before reaching for a new and unproven one.

The enforcement problem Estonia is pointing to

Estonia’s critique of the ban model has a concrete reference point. Australia became the first country in the world to enforce a social media ban for minors on 10 December 2025, prohibiting anyone under 16 from holding accounts on platforms including Instagram, TikTok, YouTube, Snapchat, X, and Facebook. Platforms face fines of up to approximately A$50 million for failing to take reasonable steps to prevent underage access. In the months after the ban came into force, the eSafety Commissioner found Meta, TikTok, and YouTube were not complying with the ban, with the regulator proceeding to court action against the platforms. The compliance picture was bleak: seven in ten children who had held social media accounts before the ban still had active accounts after it took effect. Workarounds including VPNs, false birth dates, and the transfer of accounts to adult relatives proved straightforward and were widely adopted.

Whether the Australian experience represents the definitive verdict on the ban model, or merely an early implementation struggle that stricter enforcement will eventually resolve, remains contested. What is not contested is that the world’s first and most closely watched age ban produced a high rate of non-compliance within months of introduction, and that this outcome was predicted in advance by critics who argued the compliance burden would be met by creative circumvention rather than by genuine restriction.

What comes next in Brussels

The practical arena for the contest between Estonia’s platform-enforcement approach and the ban-majority’s position is the Digital Fairness Act, the European Commission’s forthcoming legislation targeting addictive design, dark patterns, and manipulative commercial practices in digital services. The European Parliament’s November 2025 vote made explicit that it wants a 16-plus digital minimum age incorporated into the DFA text, along with bans on engagement-based recommender algorithms for users who are minors, restrictions on loot boxes, and a default-off requirement for infinite scroll, autoplay, and pull-to-refresh mechanisms on services used by young people. The Commission is expected to table the DFA proposal in the fourth quarter of 2026.

That timeline gives Estonia a legislative window in which to argue for a platform-accountability framework to sit alongside, or in place of, an age-based access restriction. The two approaches are not necessarily mutually exclusive, but they reflect genuinely different theories of where regulatory leverage is most effectively applied: against the commercial platforms that build and profit from the systems in question, or against the young people who have grown up treating social media as ordinary infrastructure. With AI established in 2025 as the defining technology of the decade, and AI-powered recommendation systems becoming the primary mechanism by which young people encounter content online, the question of who bears legal and regulatory responsibility for what those systems serve to a 14-year-old is one Europe will have to answer in law, not just in declarations.
