The demand for a social media ban for children and young people under the age of 16 sounds like common sense. Who wouldn’t agree when it comes to protection against cyberbullying, sexualized content or excessive use? However, social media is increasingly becoming a central place in social, civic and political life. A blanket ban for everyone under 16 would be a convenient solution, but it could prove ineffective and difficult to implement, and it is also incompatible with young people’s participation rights.
The risks of using social media are well documented. Research describes how “unregulated and harmful” social media can be part of a package of strains on young people’s mental health. An experiment shows, for example, that reduced use among young adults can significantly lower loneliness and depressive symptoms.
The responsibility lies with the platforms
However, anyone who concludes from this that access to social media must be blocked across the board until the age of 16 is making it too easy for themselves. Digital participation, with its opportunities for exchange and networking, is not the problem in itself. The problem is rather the business models of the successful platforms, which convert attention into profit and in doing so systematically amplify overstimulation, outrage, endless scrolling and dynamics of social comparison. It is the platforms that put too little effort into removing harmful content and whose algorithms often even encourage the spread of such posts.
This is also the conclusion reached by a commission of scientific experts appointed by the Leopoldina: social media is generally considered unsuitable for children under 13. For 13 to 17 year olds, the commission proposes age-appropriate restrictions on platform functions. On the one hand, this protects them from risks; on the other, it gives them the opportunity to develop a reflective and self-determined way of dealing with these services. The European Parliament, too, calls not only for age limits but also for interventions in “addictive practices” and engagement algorithms.
When politicians now talk primarily about bans for children and young people, it looks like symbolic politics: loud, but missing the point. It would be better to hold the providers accountable, with clear liability, security by design and transparency about why certain posts are recommended and others are not. There must also be real audit rights for supervisory authorities and sanctions in the event of systematic failure. Instruments already exist in Europe, such as the DSA and DMA (in force since 2022). The DSA obliges platforms to provide greater security and transparency, and the DMA limits the market power of large gatekeepers in favor of fair competition.
In concrete terms, this would mean that platforms would not only have to manage risks but also noticeably reduce them; manipulative growth tricks aimed at minors would be off limits. The default would be a chronological feed, with algorithmic recommendations available only on request. Age would be verified in a data-efficient and reliable manner, and independent researchers and supervisory authorities would have access to data, so that regulation is based on evidence of effect instead of gut feeling.