In most cases, deplatforming users you no longer want around is perfectly okay. Just make sure you don’t cross the line into doing something illegal.
Let’s face it…
There are some users who become a pain in the ass to deal with.
That’s true whether you own a social media company, a paid membership site where customers can post content in a members’ area, or even a business blog that simply allows comments from visitors who don’t pay for the bandwidth they suck up.
So, how do you deplatform legally?
Set specific criteria in your website legal documents and uniformly enforce them.
For example, let’s say you don’t want someone pimping their network marketing opportunity on your site. You can ban that conduct in the agreement for using your site.
This agreement can be a written contract between you and those who pay for access to your site. If no payment is required, then generally the agreement is posted as a “Terms of Use” or “Terms of Service” that applies to all site visitors.
And it’s important that enforcement be uniform. Don’t make exceptions to the rules you’ve set for friends or anyone else. If it’s “three strikes and you’re out,” no one should get a fourth chance to violate your rules.
Of course, you should avoid illegal activity when setting up your deplatforming rules. For instance, a rule that bans someone based on their race or gender would be problematic. Even if such conduct is legal in some countries, in many places it’s illegal to discriminate based on those characteristics.
What about free speech rights?
The First Amendment and free speech laws are designed to limit the government’s suppression of speech, not owners of private businesses.
Although there’s talk of legislation that would treat large social media platforms as public utilities in order to regulate the rights they give and take away from users, chances are any such new laws or regulations will be focused on Twitter, Facebook, and the like.
Section 230 of the Communications Decency Act
In the U.S., there’s a law that can provide you with a liability shield as you regulate content and deplatform those you don’t want around. Section 230 provides that “[no] provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Why is that important?
If you’re offering a platform for others to post content, you may be shielded from legal claims that arise from the content they post. This includes defamation claims.
The more you regulate content (picking and choosing which content is allowed), the greater the danger that this shield won’t apply, because you could be treated as the publisher (like a newspaper owner).
Because some big companies abuse their power by arbitrarily picking and choosing who to deplatform, there’s a movement at both the federal and state levels to get rid of this liability shield.
Setting written rules and applying them across the board is one way to minimize that risk even if Section 230 is repealed.
Of course, you should also avoid coordinating a deplatforming with others. Let’s say you own a paid membership site and decide to ban a member for violating your rules. Don’t encourage other business owners to also deplatform the member to punish the person.
Whether it’s an online membership agreement, a website’s Terms of Use, or another website legal document, an experienced Internet lawyer can help you set up the rules you apply to deplatform problem users legally.