EU introduces new rules to crack down on TikTok and Instagram's 'addictive design' aimed at children

The European Union has announced new rules targeting "addictive design" features aimed at children on social media platforms. According to a CNBC report carried on Hacker News, the regulatory package will directly affect TikTok, Instagram, Snapchat and other large social media platforms. The package is expected to be presented to the European Parliament next month.
The regulatory package is being presented as supplementary guidelines developed under the Digital Services Act (DSA) framework. Framed as child-protection standards, the guidelines will require platforms to switch off certain features by default for users under 18. The features in question include infinite scrolling, autoplay video, notification bursts and excessive personalisation of content recommendations.
EU Commissioner Henna Virkkunen told CNBC: "Ensuring children's safety online cannot be treated solely as a parental responsibility. Platforms need to share that responsibility through their own design choices." Virkkunen said the existing DSA was not sufficient and that more specific regulation was needed.
TikTok, Instagram's owner Meta and other large platforms have not yet commented officially on the new package. However, the companies have in recent months tended to argue that they have strengthened child-protection features on their own initiative and that the existing tools are sufficient. TikTok announced last year that it had introduced a default 60-minute daily screen-time limit for users under 16.
Among the details of the regulation is an obligation on platforms to explain transparently to child-protection experts and regulators how their algorithmic design works. This approach is in line with the broader political line the EU has taken in recent years on AI transparency. Firms will be required to produce systematic assessment reports on how algorithms may affect the emotional state of children.
Child protection organisations have welcomed the regulatory package. Maria Steiner, a spokesperson for UNICEF's European office, told CNBC: "This package represents a structural change for children's online safety." Steiner pointed to current research showing the negative effects of social media platforms on the mental health of children and young users.
On enforcement, the European Commission will require annual audit reports. These reports will show platforms' compliance with the regulation and will be verified by external audit firms. Penalties for non-compliance can rise to 6 percent of global turnover, an approach similar to the fine structure the EU adopted in the GDPR.
Some aspects of the regulatory package have prompted debate among industry representatives. Daniel Friedlaender, the European director of the Computer & Communications Industry Association (CCIA), told CNBC: "While the goals of the regulation are legitimate, the implementation details could weaken some important areas of innovation." Friedlaender said the algorithmic transparency requirements in particular could slow platforms' product development processes.
The approval process in the European Parliament is expected to conclude during the summer months. According to CNBC, party positions suggest the regulation is likely to pass with broad support: the Socialists & Democrats, the Greens and Renew Europe groups have spoken in favour of the package, and the EPP group has also been broadly supportive.
After the regulatory package enters into force, EU member states will have six months to bring their national rules into line. Platforms will then have a 12-month transition period to implement the new requirements. CNBC reports that during this period, significant investment will be required across the sector, particularly to adapt technical infrastructures. The EU's move is being watched closely for its potential to encourage similar steps in the global regulatory landscape.