The European Union is preparing new regulatory action against major social media platforms, including TikTok and Instagram, over concerns that “addictive design” features are harming children and exposing them to unsafe online content.
European Commission President Ursula von der Leyen said the bloc will move to introduce new rules later this year targeting design features such as infinite scrolling, autoplay videos and algorithm-driven content feeds.
Speaking at the European Summit on Artificial Intelligence and Children, von der Leyen said platforms such as TikTok and Instagram are under scrutiny for failing to adequately protect minors online.
“We are taking action against TikTok and its addictive design – endless scrolling, autoplay, and push notifications,” she said. “The same applies to Meta, because we believe Instagram and Facebook are failing to enforce their own minimum age of 13.”
The European Commission’s concerns extend beyond design features to the broader issue of algorithmic recommendations that may expose children to harmful content, including material linked to eating disorders and self-harm.
Officials say investigations are also focused on so-called “rabbit hole” effects, in which recommendation algorithms repeatedly steer users toward increasingly extreme or harmful content.
As part of its response, the EU has developed an age verification system designed to help enforce minimum age requirements on social media platforms. The system is expected to be integrated into national digital identity frameworks across member states.
The Commission said the tool meets “high privacy standards” and is intended to make age enforcement more effective, addressing long-standing concerns that underage users can easily bypass existing safeguards.
A formal legislative proposal could be introduced as early as the summer, following recommendations from a special expert panel on child safety online.
The move comes amid broader global efforts to regulate social media use among minors, with governments increasingly concerned about mental health impacts and online safety risks.
The EU has already intensified scrutiny of major U.S. technology firms under its Digital Services Act, which imposes stricter obligations on platforms regarding content moderation, transparency and user protection.
Earlier this year, regulators found preliminary evidence that Meta Platforms Inc. failed to effectively prevent under-13 users from accessing its services, raising questions about enforcement of age restrictions.
The bloc’s actions have sparked tensions with the United States, where officials have criticised European regulatory fines and investigations targeting American tech companies.
Despite pushback, EU officials say child protection remains a priority, and several member states are also considering stricter national laws, including potential bans on social media access for users under 16.
Australia has already introduced similar restrictions, while countries such as France, Spain and the United Kingdom are exploring comparable measures.
Analysts say the EU’s latest move could mark one of the most significant global regulatory shifts yet in how social media platforms are designed and operated, particularly for younger users.