As the discussion around mental health becomes less stigmatized, open and healthy communication about mental health struggles appears frequently on social platforms.
One of the most popular, yet controversial, aids for trauma survivors is the “content/trigger warning” on social media: users introducing their possibly-triggering posts with some sort of warning message.
Social media platforms like Twitter offer ways to filter or mute certain words or tags, whereas others, such as TikTok and Instagram, don’t. On those platforms, it falls to users to provide content warnings for their followers.
It’s important to talk about sensitive issues and spread awareness of current events, such as the destruction of the climate or police brutality, but it’s crucial to bear in mind how the discussion might impact viewers.
For instance, many trauma survivors have begun opening up about their experiences, such as physical or sexual abuse, often in great detail on social media. Raising awareness and combatting the stigma can be very empowering and provide allies for fellow survivors.
However, these accounts can also trigger and re-traumatize survivors who hear or see descriptions of experiences similar to their own. This is where user-provided content warnings should come in.
Sophomore Meredith Byrd, who’s currently in recovery from an eating disorder, feels that people should deeply consider what they post when addressing triggering subjects.
“It’s helpful to know when to avoid posts that could possibly bring those [eating disorder] thoughts into my head,” Byrd said.
Some argue that it’s not their job to “coddle” people and that there are no “trigger warnings” in real life. While it’s true that there are no “trigger warnings” in the real world, refusing to take others’ trauma into consideration shows a clear disdain for survivors: it declares that giving others the chance to opt out of posts that remind them of their traumatic experiences isn’t something that matters to you.
Others may cite exposure therapy, in which people confront their trauma in a controlled clinical setting, as a reason not to bother with content warnings. While controlled exposure therapy can be an effective way to recover from trauma, no one but the survivor has the right to decide to engage in that therapy.
For those wondering about how to properly provide content warnings, the first step is to determine if what you’re posting about could be traumatic to experience — see posts that have already been reported and flagged on Instagram for reference. If yes, the next step is to determine how to provide a proper warning.
Most trigger warnings are provided in a caption or text box. They can also be put in the first slide of an Instagram post or written over a black screen before any images are shown in a video to allow time to pause and swipe along.
It’s also crucial to be specific in the sensitivity warning about what the video may address so that the user can make an informed decision on whether to watch or scroll on. For example, “This post has discussions of X,” or “Graphic depiction of Y,” instead of simply, “This post may be triggering.”
People recovering from trauma would benefit from being in control of what they view on social media and having the ability to filter out certain sensitive content.
There’s been immense progress in destigmatizing the discussion around mental health, a testament to the unprecedented levels of communication social media enables. However, we continue to struggle with empathy and with posting with the perspective of survivors in mind.