Why is Content Moderation Important?

It’s hard to believe the World Wide Web is only about 30 years old and didn’t enter mainstream consciousness until the early 1990s.

Early Internet communities were tight-knit, formed in the science departments of universities and the R&D labs of major corporations. These small groups of Internet pioneers laid out many of the fundamental building blocks of digital communication, like acronyms (LOL!) and emoticons :-), and tried to establish good etiquette (or “netiquette”) to encourage quality interactions.

But inevitably, human nature being what it is, cracks in the utopian ideal started to form. Bulletin board spammers, trolls, and other bad actors quickly found their way into online communication platforms, polluting the open waters in which others were happily swimming.

Today, when it comes to online comments, it often seems like the inmates have taken over the asylum.

Scroll through any Facebook thread or the comments section of a major news site and you’re bound to find divisive, toxic, and spammy messages muddying the dialogue. Once banished to the shadows, hateful rhetoric has reasserted itself online en masse, amplifying fringe voices to wide audiences. The result is a network of platforms that, usually unintentionally, make it easy to spread misinformation, sow division, and cause serious harm in the real, offline world.

Faced with these challenges, platforms must moderate content in its many forms to maintain healthy online communities. Without effective tools to restrict the spread of abusive content, unthinkable horrors can find their way into the most innocuous of public feeds, as demonstrated by the spread of live-streamed footage from the mass shooting in New Zealand.

Of course, abusive content doesn’t always come in the form of an explicit video. Toxic comments (harassment, hate speech, etc.) and spam can dissuade people from participating in conversations, sidelining reasonable voices who don’t want to deal with trolls or low-quality discussion. The consequences can be even more serious: one recent study reported that young people who experience cyberbullying are twice as likely to self-harm or attempt suicide.

Proper content moderation recognizes and removes inappropriate imagery, harmful websites, and abusive comments to create a safer, more engaging online environment. As the online world continues to grow and connect more people, maintaining high-quality discourse is key to ensuring a substantive experience for everyone.