
Meta introduces adolescent safety enhancements and deletes 635,000 accounts involved in sexual exploitation of minors

Meta, Instagram's parent company, is introducing protective measures for teen users on its platforms, such as displaying details about accounts that message them and a one-click way to block and report problematic accounts.


In a bid to protect young users from online predators and inappropriate content, Meta, the parent company of Instagram, has announced a series of safety measures for teen accounts on the platform.

## Safety Features for Teen Accounts

The new features include expanded protections in Direct Messages (DMs), enhanced nudity filters, private accounts by default, and age verification checks.

Teen Instagram users now see clearer safety prompts when opening new DMs, including options to block, report, or view when the contact’s account was created. This provides more context before engaging in conversations. A new combined ‘Block and Report’ feature simplifies the process of removing and flagging inappropriate users.

Meta has also enhanced nudity filters to further protect users from inappropriate content. Teen accounts are set to private by default, restricting who can send them private messages. They can only receive messages from people they follow or are already connected to.

Meta is testing the use of artificial intelligence (AI) to determine if kids are lying about their ages on Instagram. If a user is misrepresenting their age, the account will automatically become a teen account with more restrictions.

## Measures Against Inappropriate Content

Meta has also taken steps to remove accounts that leave sexualized comments on, or request sexual images from, adult-run accounts featuring children under 13. In total, 635,000 such accounts have been removed: 135,000 that left sexualized comments and another 500,000 linked to inappropriate interactions.

Adult-run accounts featuring children will now default to the strictest message settings, have offensive comments automatically filtered via Hidden Words, and will be shielded from contact with adults flagged as potentially suspicious.

## Ongoing Efforts

These measures are part of Meta's ongoing efforts to enhance its safety tools in response to regulatory pressure and concerns about youth mental health. The aim is to better safeguard young users from predatory behavior and inappropriate content on its platforms.

Meta continues to face lawsuits from dozens of US states accusing it of harming young people and contributing to the youth mental health crisis. The new safety features represent a step toward addressing these concerns, particularly for teenage users.

The new safety features are designed to make it easier for teens to report inappropriate behavior on Instagram, and are part of a broader effort by social media companies to address child safety on their platforms.

Sources: TechCrunch, The Verge

  1. Meta has introduced safety features for teen users on Instagram, including enhanced nudity filters, private accounts by default, and age verification checks.
  2. In addition, Meta is testing artificial intelligence to verify users' ages and restrict inappropriate content.
  3. Beyond protecting young users from online predators, Meta has removed accounts that left sexualized comments on, or requested sexual images from, accounts featuring underage users.
  4. As part of a broader industry push on child safety, the new features are designed to make it easier for teens to report inappropriate behavior, addressing concerns about youth mental health and well-being.
