Meta adds new teen safety features following renewed criticism

By News Room

Meta announced Tuesday that it is expanding its youth safety efforts by rolling out new settings for teen Facebook and Instagram users, including content restrictions and hiding search results for terms related to self-harm and suicide.

The parent company of Facebook and Instagram said the new policies will add to its existing slate of more than 30 well-being and parental oversight tools aimed at protecting young users.

Tuesday’s announcement comes as Meta has faced renewed scrutiny in recent months over its potential impact on teen users.

In November, former Facebook employee-turned-whistleblower Arturo Bejar told a Senate subcommittee in a hearing that Meta’s top executives, including CEO Mark Zuckerberg, ignored warnings for years about harms to teens on its platforms such as Instagram. Bejar raised particular concerns about the sexual harassment of teens by strangers on Instagram.

The same month, unsealed court filings in one lawsuit against the company revealed internal company documents that suggest Zuckerberg repeatedly thwarted teen well-being initiatives.

Court documents unsealed in a separate lawsuit weeks later alleged that Meta has knowingly refused to shut down most accounts belonging to children under the age of 13, while collecting their personal information without their parents’ consent.

New Mexico’s Attorney General filed another lawsuit against Meta in December, accusing the company of creating a “breeding ground” for child predators.

The new pressure comes about two years after another Facebook whistleblower, Frances Haugen, released a trove of internal documents that raised questions about the company’s handling of youth safety. Those documents, known as the “Facebook Papers,” sparked outcry from lawmakers and the public and prompted efforts by Meta and other social platforms to improve their protections for teen users.

In a blog post announcing the new policies Tuesday, Meta said it wants “teens to have safe, age-appropriate experiences on our apps.”

Meta said it will start hiding “age-inappropriate content” such as posts discussing self-harm and eating disorders, nudity or restricted goods from teens’ feeds and stories, even if it is shared by someone they follow.

It added that it will place all teen Facebook and Instagram users into its most restrictive content recommendation settings by default, which make it more difficult to come across potentially sensitive content in Search and Explore. That policy previously applied only to teens who newly signed up for the apps.

The changes are set to roll out to users under the age of 18 in the coming months.

The company is also expanding the range of search terms related to self-harm, suicide and eating disorders for which it hides results and instead directs users to support resources. That list, which will be updated in the coming weeks, will now include terms such as “self-harm thoughts” and “bulimic,” Meta said.

Meta said it plans to continue sharing resources from organizations such as the National Alliance on Mental Illness when someone posts content “related to their struggles with self-harm or eating disorders.”

Meta also said it will prompt teen users to review their safety and privacy settings.

It will offer them an easy, one-tap way to “turn on recommended settings,” which will automatically restrict who can repost their content, “remix” and share their reels, and tag, mention or message them.

The settings will also “help hide offensive comments,” Meta said.

Updating teens’ privacy settings could help to address concerns, including those from whistleblower Bejar and New Mexico Attorney General Raúl Torrez, that adult strangers can easily message and proposition young users on Facebook and Instagram.

Tuesday’s changes add to Meta’s existing teen safety and parental control tools, which also include the ability for parents to see how much time their kids spend on the company’s apps, reminders to take a break during long scrolling sessions and notifications for parents if their teen reports another user.

