
How To Create And Effectively Manage Online Safe Spaces

What a strange creature the internet is!

On one hand, it opens up gateways to new worlds and new experiences, allowing us to meet people from across the globe, forge new connections and new relationships. It provides us with our own personal space, where, through pictures, videos, or words, we can express our feelings – our joys and frustrations about the world.

The internet makes sure we are never alone. We can share details about our lives – from the most mundane to the most profound – and find validation among friends.

But beneath all this festers a darkness that feeds on toxicity and hate. Throbbing and pulsating with revulsion towards views that threaten its much-coveted hierarchy by centring society’s marginalised, this malignant outgrowth lashes out with rage and anger, wrecking the very foundations of civil discourse. When the dust settles, all opposing views have been silenced.

Denizens of the internet are all too familiar with this outgrowth: the comments section.

If you’ve ever browsed through the comments section of an article talking about feminism, you know what I’m talking about. For instance, check out the kind of messages Marina Watanabe, who runs the feminist vlog marinashutup on YouTube, received.

To be honest, this phenomenon is hardly restricted to feminism. If you’re advocating for queer rights or the rights of minorities, you’re just as likely to be trolled. To break it down, if you’re not in a dominant position of power, and if you’re challenging dominant views, you are going to be trolled in the comments section. The comments in such cases run the gamut from casual whataboutery to explicit rape and death threats.

If we’re trying to use the internet as a platform to centre marginalised voices, then it’s very important that we create the kind of space where such voices can flourish. Trolling – whether in its relatively harmless or its outright dangerous variants – is meant to stop any kind of discourse that can upset existing power relations in society. It is meant to prevent marginalised voices from speaking up and articulating their oppression. As such, we need to create safe spaces which can amplify a plurality of marginalised voices and, at the same time, keep trolls out. Here are a few tips on how to do that.

Lay Down Some Basic Ground Rules

Whether you’re running a group or a page, you need to lay down some ground rules first, to make it clear that you will not tolerate certain forms of behaviour. I can already see the free speech warriors crying “Censorship!” – but no, this isn’t censorship. The right to hold an opinion does not guarantee you the right to an audience. So, decide on the kind of audience you do want to have, along with the kind of audience you want to avoid.

Image Credit: reddit.com/r/socialism/

Many of Reddit’s subs have posting guidelines like the one you can see in the image. Facebook groups and pages can have the same in their description.

By setting up these rules, you’re letting trolls know beforehand about the kind of discourse that will not be tolerated. That way, they have fewer grounds for screaming ‘censorship’ in case you have to block or remove them.

Be Tolerant, But Do Not Hesitate To Drop The Banhammer

Okay, let’s get one thing straight. If you’re running a group for marginalised people, your first and foremost concern should be their well-being, and not abstract principles.

Understand that, as it is, marginalised people do not have enough of a voice within society. Creating a safe space is not an attempt to create an echo chamber, as a plurality of views can exist even within marginalised groups. Rather, it is an attempt to amplify the voices of those who are otherwise denied platforms to speak. And they deserve to be able to do so in a safe, non-hostile environment.

Blocking a user or removing a comment in this context is therefore not a case of being unable to tolerate opposing views, but one of prioritising the safety and well-being of one’s primary audience/user base.

Respect Trigger And Content Warnings

Trigger and content warnings are meant to forewarn one’s audience about content that might worsen their mental health issues. Such warnings can be lifesavers for survivors of traumatic experiences such as sexual assault or hate crimes, as well as for those living with mental health issues like anxiety, depression, bipolar disorder, etc.

Trolls will obviously mock trigger and content warnings – but again, as mentioned above, your aim should not be to please the trolls.

If you’re running a Facebook group, another thing you can do in this regard is have posts go through moderator approval. That way, a moderator can ask a user to make slight changes if the content demands it.

Obviously, there are many more nuances to such issues. One must always keep in mind that multiple axes of oppression exist, and one can enjoy a certain kind of privilege for one aspect of their identity while facing oppression for another. But these three simple steps are a basic guide to creating a safe environment for civil discourse among those whose voices need to be heard.
