Denmark’s government announced plans Friday to ban social media access for anyone under the age of 15, marking one of the toughest measures yet by a European country to shield children from harmful online content and corporate influence.
Under the proposal, parents could be granted permission — after a formal assessment — to allow children as young as 13 to use social media. The government has yet to detail how the restriction would be enforced, though officials acknowledge that existing age limits on platforms like Instagram, TikTok, and Snapchat have proven easy to bypass.
Digital Affairs Minister Caroline Stage said the move aims to curb the growing risks children face in a highly digitalized world. “Ninety-four percent of Danish children under 13 have profiles on at least one social platform, and more than half of those under 10 do,” she told The Associated Press.
“The amount of violence and self-harm children are exposed to online is an unacceptable risk,” Stage said. While calling Big Tech firms “some of the greatest companies in the world,” she criticized them for failing to protect young users: “They have enormous resources but are simply not willing to invest in children’s safety.”
Careful Legislation and Tough Enforcement
The law is not expected to take effect immediately, as lawmakers across party lines work out enforcement mechanisms. “We’ll move quickly, but we must do it right,” Stage said. “There can be no loopholes for the tech giants to exploit.”
Denmark’s plan follows Australia’s 2024 legislation, which bars children under 16 from holding social media accounts and allows fines of up to AUD 50 million ($33 million) for platforms that fail to keep them off.
Stage said Denmark will rely on its national electronic ID system, which nearly all citizens over 13 already use, and a forthcoming age-verification app. While tech companies cannot be forced to adopt the Danish app, they will be legally required to verify users’ ages. Platforms that fail to comply could face EU penalties of up to 6% of their global revenue.
Protecting Children from Digital Harm
The Danish government emphasized that the initiative is not meant to disconnect children from digital life, but to protect them from toxic content and online pressure.
“Children and young people lose sleep, concentration, and peace of mind due to constant digital engagement,” the ministry said in a statement. “This is not a problem parents or teachers can solve alone.”
Other countries have taken similar steps. China limits minors’ gaming and smartphone time, and in France, prosecutors recently opened an investigation into TikTok for allegedly promoting suicide-related content through its algorithms.
Most major platforms already bar users under 13 under their own terms of service, and the EU’s Digital Services Act, in force since 2023, obliges them to protect minors, but enforcement remains inconsistent. Platforms such as TikTok and Meta (Instagram, Facebook) use AI-based facial analysis to estimate users’ ages, though the methods have been criticized as unreliable.
In an emailed response, TikTok said it supports Denmark’s goals: “We have developed more than 50 safety features for teen accounts and tools like Family Pairing to help guardians manage content and screen time. We look forward to constructive collaboration on industry-wide solutions.”
Meta did not respond to a request for comment.
Minister Stage said Denmark has given tech companies ample time to act on child safety. “They’ve had many chances to fix this themselves,” she said. “Since they haven’t, we will now take control — and ensure our children’s digital futures are safe.”