
In brief: Estonia and Belgium are the only two EU member states that have rejected the Jutland Declaration, a pan-European commitment from October 2025 to restrict children’s access to social media. Estonian ministers argue that age-based bans are unenforceable, that children will find ways to circumvent them, and that the right approach is to enforce the GDPR against the platforms themselves and invest in digital literacy rather than restricting young people’s participation in the information society.
The declaration signed by the majority of EU countries
On October 10, 2025, digital ministers from 25 of the 27 European Union member states signed the Jutland Declaration at an informal meeting in Horsens, Denmark; Norway and Iceland also signed. The declaration is a non-binding political commitment to introduce privacy-preserving age verification on social media platforms, protect minors from addictive design features and dark patterns, and work towards what the document describes as a “digital legal age” for accessing online services. Estonia and Belgium were the two EU members that declined. Belgium’s refusal arose from a veto by Flemish Media Minister Cieltje Van Achter, who described the declaration’s age verification requirements as disproportionate and opposed requiring children to use national identity systems such as Itsme to access services such as YouTube or Instagram. Estonia’s refusal was different in kind: grounded in principle rather than procedure, and rooted in a broader argument about where Europe’s regulatory effort should be directed.

The political momentum reflected in the declaration is considerable, and the pace of restriction across Europe accelerated between 2025 and 2026. Australia implemented the world’s first ban on under-16s starting in December 2025, France passed legislation in January 2026 to ban under-15s, Spain enacted restrictions for under-16s in February 2026, and Austria took steps to restrict under-14s. Greece announced that it would ban social media access for minors under 15 from 2027, as part of a group of six EU countries that also includes Denmark, France, Austria, Portugal and Spain. On November 20, 2025, the European Parliament backed a non-binding resolution calling for an EU-wide minimum digital age of 16 by 483 votes to 92, with 86 abstentions, and urged the European Commission to incorporate the measure into the forthcoming Digital Fairness Act.
Why Estonia said no
Estonia’s dissent has been articulated by two ministers who approach the issue from different but complementary angles. Kristina Kallas, Minister of Education and Research, has been the most outspoken critic of the ban consensus. At a Politico forum in Barcelona, Kallas argued that age restrictions put the onus on the wrong party. “For me, the way to address this is to not hold children responsible for that harm and start self-regulating,” she said. Her corresponding argument is that the responsibility should fall on the platforms. “Europe pretends to be weak when it comes to large American and international corporations,” she said at the forum, challenging the EU to “really seize this power and start regulating large American corporations.” She was also direct about the practical limits of ban-based approaches: “children will very quickly find a way to get around and continue using social media.” This argument connects to Europe’s broader effort to assert regulatory power over American tech companies, a project that has gained considerable momentum since 2025 but has not yet been applied with comparable force to social media content governance.

Liisa-Ly Pakosta, Minister of Justice and Digital Affairs, has made the positive case for Estonia’s preferred approach. “Estonia believes in an information society and the inclusion of young people in the information society,” she said, emphasizing digital participation rather than exclusion. Pakosta has pointed to the General Data Protection Regulation as an enforcement mechanism that already exists: the GDPR prohibits platforms from processing children’s personal data without proper consent and carries fines of up to 4% of global annual turnover for violations. Estonia’s argument, in essence, is that Europe has not exhausted its existing tools before reaching for new, untested ones.
The enforcement problem Estonia predicted
Estonia’s criticism of the prohibition model has a concrete reference point. Australia became the first country in the world to impose a social media ban for minors on December 10, 2025, prohibiting anyone under 16 from holding accounts on platforms such as Instagram, TikTok, YouTube, Snapchat, X and Facebook. Platforms face fines of up to approximately A$50 million for failing to take reasonable steps to prevent access by minors. In the months after the ban came into effect, the eSafety Commissioner found that Meta, TikTok and YouTube were not complying with the ban, and the regulator took legal action against the platforms. The compliance picture was bleak: seven in ten children who had social media accounts before the ban still had active accounts after it took effect. Workarounds such as VPNs, fake birth dates, and transferring accounts to adult relatives proved easy and were widely adopted. It remains debated whether the Australian experience represents a final verdict on the ban model or simply an early implementation struggle that stricter enforcement will eventually resolve. What is not disputed is that the world’s first and most closely watched age ban produced a high non-compliance rate within months of its introduction, and that this result was predicted in advance by critics who argued that the compliance burden would be met with creative circumvention rather than genuine restriction.
What comes next in Brussels
The practical arena where Estonia’s platform-enforcement approach will compete with the prohibitionist majority’s position is the Digital Fairness Act, the European Commission’s upcoming legislation targeting addictive design, dark patterns and manipulative business practices in digital services. The European Parliament’s November 2025 vote made explicit that it wants a minimum digital age of 16 written into the DFA text, along with bans on engagement-based recommendation algorithms for underage users, restrictions on loot boxes, and a requirement that infinite scrolling, autoplay, and refresh mechanisms be disabled by default in services used by young people. The Commission is expected to present the DFA proposal in the fourth quarter of 2026. That timeline gives Estonia a legislative window in which to advocate for a platform accountability framework alongside, or instead of, age-based access restrictions. The two approaches are not necessarily mutually exclusive, but they reflect genuinely different theories about where regulatory leverage is most effectively applied: against the commercial platforms that build and profit from the systems in question, or against the young people who have grown up treating social media as ordinary infrastructure. 2025 established AI as the defining technology of the decade, and as AI-powered recommendation systems become the primary mechanism by which young people find content online, the question of who bears legal and regulatory responsibility for what those systems serve up to a 14-year-old is one Europe will have to answer through law, not just declarations.
