The Schöpflin Foundation is committed to raising critical awareness and to building a vibrant democracy and diverse society.  Through our work we want to set the course for a better world for young people and future generations.

»Safety by Design – Pathways to Safer Social Media Platforms« by HateAid

Social Media, but safer

Platforms such as Instagram, Facebook, and TikTok have become everyday staples for many people: they use them to communicate, for entertainment, to consume news, and to form (political) opinions. This makes it all the more important that social media platforms do not operate solely according to commercial logic. Currently, attention-grabbing content spreads particularly quickly on social media. Posts that evoke strong emotions, simplify, or polarize are more likely to be amplified – including digital violence and disinformation.

These negative effects are not accidental; more importantly, they can be corrected. In its new publication »Safety by Design – Pathways to Safer Social Media Platforms«, HateAid outlines where policymakers and platforms need to take action. HateAid, a digital human rights organization we have supported for years, argues that if social media platforms were designed from the outset according to the principle of »safety by design«, there would be less digital violence, less disinformation, and fewer addictive features that encourage endless scrolling and video consumption. After all, social media platforms are human-made products – much like colored pencils, beta blockers, or e-bikes. Their technical architecture, design, and algorithms are the result of deliberate decisions.

The newly presented study is based on expert reports by digital law specialist Michael Denga of the Business & Law School Berlin and researcher Caroline Sinders. It outlines 200 concrete technical and regulatory measures to make social media safer, many of which should already be implemented at the design stage of platforms.

While platform operators generally claim they are already reducing risks and often point to the European Digital Services Act, in practice they tend to remove or restrict problematic content only after it has already spread. According to the study, it would be far more effective to prevent such mechanisms at the design stage of social media services. For example, filters for harmful or disturbing content could be integrated, or minimum age requirements for platform use could be introduced.

The study puts forward two key recommendations: safety must become a central component of platform architecture, and the Digital Services Act must be implemented and enforced more consistently. In addition, HateAid calls for platform executives to be held personally liable. At present, for example, Meta CEO Mark Zuckerberg is facing a lawsuit brought by a user who alleges that the company’s platforms were deliberately designed to foster addiction. A ruling is still pending.

Read the study »Safety by Design« by HateAid here.


Schöpflin Foundation Newsletter

To stay up to date with all the activities of the Schöpflin Foundation, please subscribe to our newsletter (German language). It will provide you with regular news about all the Foundation’s events, grantees and latest projects.


Program areas

As an active funding foundation, we support organizations and initiatives throughout Europe that offer innovative solutions and approaches to the major social challenges of our time. Our goal is to ensure that these organizations and initiatives are sustainably and permanently anchored in their respective fields of activity.


Our local venues in Lörrach

Schöpflin Schule

An innovative school where learning is experienced as a community.

Werkraum Schöpflin

A cultural centre of the Schöpflin Foundation. A house of restlessness.

Villa Schöpflin

Taking action before addiction develops.

We are a member of the Transparent Civil Society Initiative.

It takes courage
