Tech Giant Meta Implements Stricter Measures on Suicide and Self-Harm Content for Users Under 18

By Canaan Arinda

In a recent move, Meta, the owner of Instagram and Facebook, announced that it will “hide content related to suicide, self-harm, and eating disorders” from users under 18, stating that “we want teens to have safe, age-appropriate experiences on our apps.”

The policy applies even when such content is shared by accounts teens follow directly. Instead of showing these posts, the platforms will direct users who post about such struggles to resources from mental health charities.

According to the company’s blog post, “teens will already be placed in the most restrictive content control setting,” making it harder for them to encounter sensitive content.

While Meta’s efforts have been praised, Andy Burrows, an adviser to the Molly Rose Foundation in London, argues that these measures “don’t go far enough,” citing the prevalence of harmful content not covered by the announcement.

“The vast majority of harmful content currently available on Instagram isn’t covered by this announcement,” Burrows warns. He believes that Meta’s policy changes are a small step when a giant leap is urgently required, and he emphasizes the ongoing risk of teenagers encountering dangerous material on the platform.

Research by the foundation found that nearly half of the most-engaged posts under well-known suicide and self-harm hashtags on Instagram glorify suicide and self-harm.

Meta emphasizes its commitment to providing “safe, age-appropriate experiences,” introducing over 30 tools and resources to support teens and parents. The new measures include automatically removing age-inappropriate content and prompting 13-17-year-olds to regularly check and update their privacy settings.

As these changes unfold, experts encourage parents to engage in conversations with their teens about navigating challenging topics on social media.
