Moderation, Free Speech, And Your Money

February 2024

One of the biggest conversations in the tech sector today is the debate on content moderation versus free speech. In this post, we explore all the reasons why content moderation is important for user-generated content — and why moderating your user-generated content is important for your bottom line.

Over the past few years, a trend has emerged among several prominent social media platforms: advertising themselves as havens for “free speech.” From the establishment of new platforms like Truth Social built around the idea that users can speak freely, to the promises Elon Musk made when he took over the reins at Twitter/X, the message to users has been “say what you want.”

On the surface, this seems like a no-brainer — who doesn’t like freedom of speech? A more lax content policy should, in theory, save a company money: fewer moderators, fewer review councils, and so on. But like many things, the truth of the matter is more complicated.

It’s important to keep in mind that the First Amendment protects people against government suppression of speech. If you own a platform where users generate content, you are free to moderate or ban users, just as you would be free to ask a customer screaming abuse in your brick-and-mortar lobby to leave. Moreover, as the platform owner, there are times when you may be legally obligated to intervene in what your users are doing (more on that later).

In this blog post, we’re covering six excellent reasons why moderating your user-generated content is important for your platform.

1) Enhancing Your User Experience: while a little heated back-and-forth in a comments section can be good for engagement, conversations that take things too far can drive people off your platform — potentially forever. User-generated content is a bit like a swimming pool: everybody has more fun when there’s a lifeguard on deck making sure nobody gets hurt. Removing content that detracts from the user experience helps ensure that the high-quality, valuable contributions on your platform get the focus and attention they deserve.

"Thank you and your engineers for blocking those extremely annoying, pointless trolls… Insticator's credibility has gone through the roof!" – Community Member

An enhanced user experience means more time on-site, a lower bounce rate, repeat visits, and potentially large improvements in your Google Search ranking — all of which can pay huge dividends for your website or business.

2) Upholding The Law: in a famous example, while Americans have the right to free speech, shouting “fire!” in a crowded theater to cause a stampede endangers lives, and a person could be prosecuted for it. Threats of violence in the US can lead to criminal charges such as assault, regardless of whether the person carried out those threats or even intended to. Libel, copyright infringement, and disinformation can also carry legal consequences depending on the particulars. And that’s just in the United States — if you’re an international company, keep in mind that even among developed nations that value freedom of speech, the laws governing expression carry a lot of nuance. There may be times when you have to enforce a different standard to follow the laws of countries like Germany, India, or South Korea, even when the speech in question is legal under US law.

3) Building Credibility: when you’re trying to sell a product, service, or brand, you have to convince consumers that their money is safe with you. Key in this process is producing “trust signals,” little indicators that you’re running a professional operation. Removing abusive, unsightly, or illegal content is as crucial to an online enterprise as keeping the floors free of garbage is in a brick-and-mortar environment.

4) Protecting Your Brand: recently, following reports that advertiser content was appearing right next to antisemitic user-generated posts, a fresh wave of advertisers pulled out of Twitter/X, following in the wake of other brands who abandoned the platform after similar scandals. It’s estimated that the platform could have lost up to $75 million in advertising revenue by the end of 2023, which could have been avoided with a more robust moderation policy. When people purchase advertising space, they want to be sure it will appear alongside content that helps their brand.

5) It’s Better Business: recently, Shark Tank investor Kevin “Mr. Wonderful” O’Leary explained to Inc. why he’d stopped advertising on Twitter/X: “It’s not working.” While the changes made to Twitter since Elon Musk took over span everything from engineering to design, some of the largest have been to how content is moderated — or how it isn’t. Some brands are leaving the platform to protect their brand integrity, but in more concrete terms, the result of all these changes is a platform less conducive to ecommerce. When Twitter had more robust content moderation, it had happier advertisers and more advertising revenue. That’s the dollars and cents of it.

6) It Prevents Harm: on unmoderated platforms, bad actors have many tools at their disposal to harm other users: hate speech, harassment, inappropriate content, threats, and more. And the users being subjected to that content are your users. Even if the monetary and legal advantages weren’t present, protecting your users (and potentially, customers) from harm is a worthwhile goal in and of itself.

Final Thoughts

"Your commenting application is ideal — congratulations on designing and running a commenting platform that truly encourages discussion. Well done!" – Community Member

Across the board, Insticator’s publishers have seen the benefits of good content moderation. It makes your users happier, it makes advertisers happier, it improves your engagement metrics, and it keeps regulatory agencies off your doorstep.

Insticator’s Commenting 2.0 platform was designed with these facts in mind. A combination of AI and human moderation watches user-generated content on Insticator’s platforms 24/7, enforcing guidelines that publishers can set to be as firm or as lax as they prefer.

If you’re a publisher looking to improve your ad monetization, time on-site, bounce rates, brand identity, or more, you should consider your approach to content moderation as a key part of your platform.

And if you’d like to hear more about the moderation solutions that Insticator’s Commenting 2.0 platform can provide, you can reach out to our team today.

Written by Sean Kelly, Senior Content Writer