When social media first became mainstream, many dismissed it as a playground for personal photos and status updates. Today, it’s a communication hub where politicians campaign, businesses market and journalists break news. Without professional moderation, it’s too easy for toxicity to flourish, for people intent on harm to take advantage and for foreign bots to hijack the national conversation. Even deleted content lingers, retweeted and screenshotted, fueling bigotry that can embolden others. Community Notes might eventually offer context, but context isn’t always enough to quell the harm done.

As users, we, too, must be vigilant. We should report content that crosses the line, scrutinize sources before sharing dubious claims and support policies that uphold the free exchange of ideas without enabling abuse. But just as we expect a city to have traffic lights, fire departments and emergency services, we should expect and demand that online environments be similarly protected.

Companies must invest in professionals who understand cultural context, language nuances and how threats evolve online. They should harness emerging A.I. systems that can examine text, images and other forms of communication, as well as the context in which they are shared, to identify dangerous content and behavior more accurately and consistently. They should commit to getting this right rather than scaling down moderation to cut costs or to acquiesce to a particular political movement. And regulators or independent oversight bodies need the power and expertise to ensure these platforms live up to their responsibilities.

This isn’t about nostalgic longing for the old days of moderation; it’s about learning from failures and building a system that’s transparent, adaptive and fair. Whether we like it or not, social media is the public square of the 21st century. If we allow it to devolve into a battlefield of unchecked vitriol and deception, first the most vulnerable among us will pay the price, and then we all will.

Free speech is essential for a healthy democracy. But social media platforms don’t merely host speech; they also make decisions about what speech to broadcast and how widely. Content moderation, as flawed as it has been, offers a framework for preventing the loudest or most hateful voices from drowning out everyone else.