Trust & Safety at a Crossroads
I’m sensing a general hesitancy to describe the current US political situation in specific terms. We call it a “moment” or “a period of change” so as not to deny the legitimacy of the election outcome, but these phrases underplay reality. Winning an election should not confer immunity from critique. Here are statements I believe to be true: the US has shifted toward a more authoritarian type of regime. Building blocks of democracy like the separation of powers and the rule of law have eroded. The White House is re-aligning itself toward autocrats. Diversity and inclusion, safety, and sustainability have been recast as wokeness, censorship, and foolishness.
I could be talking in general terms about the experience of living through these upheavals as a person living in the US, but I’ll adopt the narrower perspective of someone working in tech. Over eight years ago, I entered the field we call ‘Trust and Safety’ out of genuine concern about the potentially negative role of social media platforms in democracy, and the promise these same platforms simultaneously hold for a more engaged citizenry. I was driven by values that seemed, at the time, relatively uncontroversial: a belief that social media feeds should prioritize verified, authoritative sources; that there should be rules to prevent hate and violence from going viral; and that these same rules can foster high-quality, diverse speech.
Fast forward to 2025, and it can feel like the ideals that originally drew me to this work over any other are being dismantled in real time. Efforts to prevent AI’s most existential risks are dismissed as anti-growth or self-serving. Content moderation, essential to making online spaces livable, is framed as censorship. In short, tech policy has been consumed by identity politics. Does that mean the battle has been lost? Where do we go from here?
As ‘Trust & Safety’ professionals in tech, or ‘AI Safety’ professionals at AI labs, we have to ask ourselves whether our positioning, our posture, and our strategy need to change if we want to continue advocating for the values that brought us into the field.
For my part, I have not yet settled on a new strategy, but I have found it useful to look to my past for inspiration and motivation. I’m originally from France. Democracy in Europe is still strong. Values of belonging and truth remain, with some exceptions, uncontested. Europe still believes that a society is greater than the sum of its individual parts, and that basic rules governing behavior can make us both safer and more free. Europe has not yet given up on Rousseau, Montesquieu, or Lafayette.
Yet 2025 is a turning point for whether these values live on. In tech, we will find out this year whether European values will be applied to the global information technologies that serve as our public squares. This is because the EU and the UK are rolling out expansive regulations (the Digital Services Act and the Online Safety Act) that aim to hold online platforms to a higher standard of transparency and safety. And recent threats from the White House to retaliate with tariffs if Europe moves ahead with fines have set the stage for a multi-faceted showdown.
At a time of malaise and doubt, when I feel most defeated and confused, thinking about these European laws and the democratic legal systems that birthed them unexpectedly gives me hope. The fact that these regulations exist, regardless of their specific details or whether everything about them is “good”, is a testament to Europe’s commitment to representative democracy and the rule of law.
Such principles are so fundamental to the functioning of our societies that preserving them can be, in and of itself, a reason to stay in the battle. A trust & safety professional’s manifesto in 2025 has to include going back to basics and defending the rule of law.