
Mental Health, Body Image, & Social Media: Censorship isn't the Solution

by Annie Brown, October 19th, 2021

Too Long; Didn't Read

Facebook, Instagram, and TikTok's AI has proven faulty to the point of harming teens' mental health. While Facebook says its algorithms suggest helplines whenever they detect a user with an eating disorder, the company also admits that those same algorithms simultaneously send the user to other accounts that share similarly unhealthy, possibly deadly eating habits. The impact of social media and AI on mental health, on the way we perceive and experience our world, cannot be overstated. What Facebook and Instagram have called “the difficult balance between allowing people to share their mental health experiences while protecting them from harmful content” has become a dangerous problem.

What Facebook and Instagram have called “the difficult balance between allowing people to share their mental health experiences while protecting them from harmful content” has become a dangerous problem for users of social media platforms who suffer from addiction, eating disorders, and body image anxiety. To date, a plethora of data, alongside actual user experience, demonstrates serious harm and little good resulting from the algorithms used by Facebook, Instagram, and TikTok.

Young women and teens who suffer from body image issues and eating disorders are especially vulnerable to the digitally enhanced, idealized images of women and standards of beauty promoted by influencers on social media.

Worse, these unrealistic images and ideas are reinforced by the algorithms that literally feed the problem. While Facebook/Instagram says its algorithms suggest helplines whenever they detect a user with an eating disorder, the company also admits that those same algorithms simultaneously send the user to other accounts that share similarly unhealthy, possibly deadly eating habits.
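To make the dynamic concrete, here is a minimal sketch, in Python, of how an interest-similarity recommender can undercut a separate safety check. Everything in it is hypothetical - the account names, topics, and similarity scoring are invented for illustration, not drawn from any platform's actual (non-public) system:

```python
# Hypothetical sketch: how interest-based ranking can work against a
# safety intervention. All names, topics, and data are invented.

from dataclasses import dataclass


@dataclass
class Account:
    name: str
    topics: set


def similarity(a: set, b: set) -> float:
    """Jaccard similarity between two topic sets."""
    return len(a & b) / len(a | b) if (a | b) else 0.0


def recommend(user_topics: set, catalog: list, k: int = 2) -> list:
    """Rank accounts purely by topical similarity to the user's history."""
    return sorted(catalog,
                  key=lambda acc: similarity(user_topics, acc.topics),
                  reverse=True)[:k]


FLAGGED_TOPICS = {"extreme-dieting", "thinspiration"}

user_history = {"extreme-dieting", "thinspiration", "fitness"}

catalog = [
    Account("pro_restriction_tips", {"extreme-dieting", "thinspiration"}),
    Account("crash_diet_daily", {"extreme-dieting", "weight-loss"}),
    Account("balanced_meals", {"nutrition", "recipes"}),
]

# Safety layer: detect risk and surface a helpline banner...
if user_history & FLAGGED_TOPICS:
    print("Showing eating-disorder helpline banner.")

# ...while the untouched recommender still surfaces the most similar -
# and therefore most harmful - accounts, because the two systems
# optimize for different things.
for account in recommend(user_history, catalog):
    print("Recommended:", account.name)
```

The two code paths never interact: the safety check fires, yet the same risk signal that triggered it still drives the recommendations.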

Social media and the internet, in general, have created unprecedented access to ideas and information while simultaneously distorting and complicating human experience. The impact of social media and AI on mental health, on the way we perceive and experience our world, cannot be overstated.

When it comes to designing algorithms that make decisions regarding content teens engage with online, the AI has proven faulty to the point of harming teens' mental health, as Facebook's own research has shown.

The adolescent brain is still developing, especially the pre-frontal cortex, where rational thinking and decision-making take place in adults. In contrast, adolescent decisions and actions often rely on the more instinctual, emotional, impulsive, and easily influenced part of the brain (the amygdala). Hence a teenager tends to misread social cues and make poor decisions: decisions that are more reactive, impulsive, and influencer-driven.

For instance, a teenage girl may misinterpret extreme ideals of beauty and body type, mistaking these images for the norm. Comparing oneself to artificially generated norms is a recipe for disappointment, low self-esteem, and a myriad of mental health issues, including eating disorders, depression, social anxiety, and OCD.

Question: Do Facebook, Instagram, and TikTok take the nuances of teenage brain function into account when developing their algorithms?

Teenage brain differences don't mean that teens can't make responsible decisions. But they do mean that young people still need guidance and oversight from responsible “adults.” One problem with platforms like Facebook is that any potential for oversight and guidance protecting teen mental health is nullified by their profit-focused business model, which depends on keeping users addicted to the app.

Part of this addiction involves the user's quixotic pursuit of a narrowly defined perfection that doesn't exist in life outside the algorithm. Further, Section 230 of the Communications Decency Act ensures that these platforms are absolved of any responsibility for harm done to young users in the process.

During the Covid-19 pandemic, social media users have struggled to distinguish fact-based health information and good advice from everything else. People turned to the internet for guidance and were met instead with confusion. In July of this year, new legislation was introduced in Congress to modify Section 230 of the Communications Decency Act in order to remove liability protections from tech companies whose platforms algorithmically spread health misinformation during a health crisis.

And this past week, the powerful impact of social media on users’ health once more came under scrutiny. Former Facebook employee Frances Haugen, testifying on Capitol Hill, exposed Facebook’s negative influence on the mental health of teenagers. Haugen leaked internal Facebook research documents showing 82% of its teen users experienced emotional issues in the last month, including poor body image, anxiety, and depression.

According to Haugen, Facebook has done nothing to address this problem. Some companies, such as ReinventU, have taken matters into their own hands: ReinventU now has policies in place requiring extra caution around diet-culture content, in order to prevent poor body image. However, the problem is systemic and cannot be solved by the individual accountability of a few.

Censorship

An example of how this problem is systemic comes from the racist algorithms and censorship issues across social media.

Black activists and social media educators like Brother Ben and Shonda Martin were forced to remove terms like "Black" and "Black Lives Matter" from their TikTok bios.

Brother Ben and Martin both found that they only appeared on the coveted "For You" page when discussions of racial inequality were set aside.

"Even photographers who focus on artistic nudity and body positivity are being censored," says professional photographer Gregory Piper. Piper is only able to feature is nature photography online, as any depictions of women's bodies are too likely to be taken down by big tech platforms.

Artist Criss Bellini has similarly struggled with censorship. Bellini explains,

I have to be careful about what I post online. Even if I am promoting body positivity - any expression of nudity or sexuality will likely be taken down.

In response to Haugen's testimony, a new bill related to Section 230 is being introduced in Congress by House Democrats - the Justice Against Malicious Algorithms Act (JAMA) - that would make internet platforms liable when they "knowingly or recklessly" use algorithms to recommend content that leads to physical or "severe emotional" harm. This is a further attempt to modify Section 230 in order to hold tech companies more responsible for the content published on their platforms.

While sources say it is unlikely legislation to reform or repeal Section 230 will become law any time soon, it has become clear that something needs to be done to protect the health and welfare of social media users, in particular the young and vulnerable. 

However, repealing or modifying Section 230 may in the end force platforms like Facebook and Instagram to censor and control content even more, further restricting options for users and role models for vulnerable teens. Specifically, such censorship could have a devastating impact on young women's mental health.

Algorithms deemed “safe from liability,” driving “what is allowed,” might further limit political speech, LGBTQ content, and depictions of diverse body types, and might further restrict sexual, personal, racial, and cultural expression.

Algorithmic researcher Serdar Karaca found exactly this: according to Karaca, the likelihood of negative mental health outcomes among teens actually increases with censorship.

A better solution is to address the underlying systemic issue of algorithmic bias that governs how people use social media. As it stands, Facebook and Instagram's business model relies on negative self-image to boost user participation (posts and clicks).
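As a rough illustration of what changing that incentive could mean, here is a hedged sketch contrasting an engagement-only ranking objective with one that penalizes predicted harm. The field names, scores, and weight are hypothetical stand-ins, not any platform's real metrics:

```python
# Hypothetical sketch: ranking by engagement alone vs. ranking with a
# wellbeing penalty. All scores and weights are invented for illustration.

def engagement_only_score(post: dict) -> float:
    # Optimizes purely for predicted clicks/time-on-app, regardless of
    # the content's effect on the viewer.
    return post["predicted_clicks"]

def wellbeing_adjusted_score(post: dict, harm_weight: float = 2.0) -> float:
    # Same engagement signal, minus a penalty for content a classifier
    # flags as likely to damage self-image.
    return post["predicted_clicks"] - harm_weight * post["predicted_harm"]

posts = [
    {"id": "extreme_diet_reel", "predicted_clicks": 9.0, "predicted_harm": 4.0},
    {"id": "body_positive_post", "predicted_clicks": 6.0, "predicted_harm": 0.0},
]

# Engagement-only ranking promotes the harmful post (9.0 beats 6.0)...
print(max(posts, key=engagement_only_score)["id"])     # extreme_diet_reel

# ...while the wellbeing-adjusted ranking flips the order (1.0 loses to 6.0).
print(max(posts, key=wellbeing_adjusted_score)["id"])  # body_positive_post
```

The point is not the particular numbers but the objective: what a platform chooses to maximize determines what vulnerable users see.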

Teens already struggle with self-image and peer acceptance. Young women and teens would be better served if they were given access to images of diverse body types, more body-positive content, and more life choices in general.