Censorship is a hot topic among online communities. It is your responsibility to step in against behaviors like hate speech and discrimination. At the same time, passionate opinions and honest debate must be allowed to flourish—online communities are not autocracies.
So where do you draw the line? How can you monitor content from toxic individuals without losing all your time to it? And as your community grows, will your censorship protocols need to change?
Make sure everyone knows the rules
One of the very first tasks you set new members should be reading and digesting the community guidelines. These can be as brief or as extensive as you like, but it’s essential that members understand and follow them.
This is especially important when your community is just getting off the ground since you’re shaping the whole ethos and atmosphere of your community.
The first steps are the hardest
Once your community is thriving, with its own identity, recognized leaders and a trailblazing purpose, it will be easier to spot and handle the aggressive outliers. But when getting a community off the ground, you may need to wield your censorship powers more often. You may be beset by trolls, fake accounts, angry customers—who knows.
Unfortunately, at this stage it’s unlikely that other members will take up the fight for you. Be reasonable and communicate very clearly with all members when censorship is being carried out—they in turn will learn what is and isn’t permitted.
Censorship versus filtering
Censorship in the literal sense isn’t usually what we’re going for. That would mean removing isolated offensive text from a post or comment (such as profanity or a racial slur) without any consideration for context, which can also distort the intended meaning of the post.
Instead, it’s usually more effective to identify comments with malicious intent and filter (i.e. block) them entirely. You can then follow up with the offender (see below) to determine whether the offense was intentional. A minimal sketch of the difference follows.
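To make the distinction concrete, here is a minimal sketch in Python. The keyword list, function names, and exact-match logic are illustrative assumptions only; real moderation tools rely on context-aware classifiers rather than plain word lists.

```python
# Minimal sketch of the difference between literal censorship (redacting
# words in place) and filtering (blocking the whole comment).
# BANNED and both function names are illustrative assumptions; real
# moderation systems use context-aware classifiers, not word lists.
import re
from typing import Optional

BANNED = {"badword", "slur"}  # hypothetical placeholder terms

def censor(comment: str) -> str:
    """Literal censorship: redact each banned word but keep the rest.
    Context is ignored, so the surviving text may be distorted."""
    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, BANNED)) + r")\b", re.IGNORECASE
    )
    return pattern.sub(lambda m: "*" * len(m.group()), comment)

def filter_comment(comment: str) -> Optional[str]:
    """Filtering: hold back the entire comment if any banned term
    appears, leaving it for a moderator to follow up on."""
    lowered = comment.lower()
    if any(term in lowered for term in BANNED):
        return None  # blocked outright; nothing partial is published
    return comment  # clean comments pass through untouched
```

One advantage of the filtering approach is that the offending post stays intact for review, which makes the follow-up conversation with its author much easier.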
Balancing appropriate censorship with free speech
Usually the case for censorship is quite clear-cut. Overtly hate-fueled content can be flagged and removed easily; it’s the shades of grey that cause the most problems.
Fortunately, community managers have other tools at their disposal in addition to straight-up bans and deletions.
- One-on-one moderation: If a comment or post isn’t totally explicit but could be perceived as outside the guidelines, you might speak with the offender directly and explain the situation. If it’s an honest mistake or misjudgment, the situation can be defused quietly and calmly.
- Restricted access and safeguards: It’s not uncommon for certain privileges within the community to be granted only after a certain amount of time and trust have accumulated. This could relate to posting, accessing areas of the community, or other influential perks.
It’s important to remember that your members do have the right to free speech. However, you also have the right to establish specific codes of conduct on what will and will not be tolerated; anyone who won’t abide by those rules has no place in the community. End of story!
Always put the community’s greater good first
Community leaders aren’t robots: you have subjective opinions and beliefs, and if you think someone is contravening the rules of your community, it’s your right to censor them. Don’t shy away from announcing when something or someone has been removed. A strong community will get behind you and respect your willingness to improve things for everyone.
You may need to walk the line cautiously at times, but trust your instincts and do what you think is right to guide your community in its intended direction. Speaking of guidelines, have you seen our own Community Guidelines?