Meta bolsters safety measures for teens to block eating disorder, self-harm and suicide posts

In an effort to create a safer online environment for teenagers, Meta, the parent company of Instagram and Facebook, has announced new measures to hide, and refrain from recommending, ‘inappropriate content’ such as posts about suicide, self-harm and eating disorders to teenage users.

The social media giant, headquartered in Menlo Park, California, outlined its commitment in a blog post, emphasizing that it will avoid recommending "age-inappropriate" material to teens.

Mark Zuckerberg, Meta CEO

In addition to content restrictions, teenagers' accounts will be automatically placed on the most restrictive settings on both platforms. This means young users, who provided accurate age information during registration, will face limitations in searching for potentially harmful terms. The move aims to curtail exposure to content that could adversely impact their well-being.

"We want teens to have safe, age-appropriate experiences on our apps," Meta said, reiterating its commitment to fostering such experiences across its platforms.

The company acknowledged the complexity of certain issues. "Take the example of someone posting about their ongoing struggle with thoughts of self-harm. This is an important story, and can help destigmatize these issues, but it’s a complex topic and isn’t necessarily suitable for all young people," Meta said.

Is it Meta’s ‘desperate attempt to avoid regulation’?

This announcement comes at a critical time for Meta, which is currently involved in legal battles with numerous U.S. states. The lawsuits allege that the company knowingly designed features on Instagram and Facebook that contribute to the mental health crisis among young people. Critics argue that Meta's recent actions fall short of addressing the core concerns and are seen by some as an attempt to deflect regulatory scrutiny.

Josh Golin, Executive Director of the children's online advocacy group Fairplay, criticised Meta's announcement, calling it a "desperate attempt to avoid regulation" and questioning the timing of these changes in 2024. He highlighted the need for stronger measures to protect children from online harms, particularly those related to suicide and eating disorders.
