Meta, the parent company of Instagram, recently announced that it is testing a feature that blurs images containing nudity in direct messages, both to protect teenagers and to make it harder for scammers to reach them. The move comes as Meta faces growing pressure in the United States and Europe over concerns that its apps are addictive and contribute to mental health issues among young people. The new protection feature for Instagram's direct messages will use on-device machine learning to determine whether an image contains nudity. It will be turned on by default for users under 18, and adults will be encouraged to activate it as well.

Unlike messages on Meta's Messenger and WhatsApp apps, direct messages on Instagram are not currently end-to-end encrypted, though Meta has said it plans to roll out encryption for the service. The company is also developing technology to help identify accounts that may be engaged in sextortion scams, and it is testing new pop-up warnings for users who may have interacted with such accounts. These efforts are part of Meta's broader push to protect teenagers on its platforms by making it harder for them to encounter sensitive content related to suicide, self-harm, and eating disorders.

In January, Meta announced that it would hide more content from teens on Facebook and Instagram to further shield them from potentially harmful material. That announcement followed an October lawsuit by the attorneys general of 33 US states, including California and New York, alleging that Meta had misled the public about the dangers of its platforms. The European Commission has also sought information from Meta on how the company protects children from illegal and harmful content in Europe. The new initiatives to blur nudity in messages and shield teens from harmful content are part of Meta's ongoing effort to address these concerns and make its platforms safer for young users.

The protection feature for Instagram's direct messages analyzes images on the device itself, which means nudity protection can also work in end-to-end encrypted chats, where Meta cannot see the images unless they are reported. This approach aims to balance privacy and safety while shielding teenagers from inappropriate content. By encouraging adult users to turn the feature on and by developing technology to combat potential scams, Meta is taking steps to improve the safety of its platform and answer criticism of its apps' impact on young people's mental health.
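Meta has not published implementation details, but the privacy argument behind on-device analysis is straightforward to illustrate: if classification and blurring both run locally, the plaintext image never has to leave an encrypted conversation. The Python sketch below is purely illustrative and not Meta's code; `nudity_score`, `NUDITY_THRESHOLD`, and `prepare_incoming_image` are hypothetical names, and the classifier is a stub standing in for a real local vision model.

```python
from PIL import Image, ImageFilter

# Hypothetical confidence threshold above which an incoming image is blurred.
NUDITY_THRESHOLD = 0.8


def nudity_score(image: Image.Image) -> float:
    """Stand-in for an on-device nudity classifier.

    Meta has not published its model; a real implementation would run a
    small vision model locally via a mobile ML runtime and return a
    confidence score. This stub always returns 0.0.
    """
    return 0.0


def prepare_incoming_image(path: str) -> Image.Image:
    """Blur an incoming image locally if the classifier flags it.

    Because scoring and blurring both happen on the recipient's device,
    the unblurred image never needs to be sent to a server, so the
    scheme remains compatible with end-to-end encrypted chats.
    """
    image = Image.open(path)
    if nudity_score(image) >= NUDITY_THRESHOLD:
        return image.filter(ImageFilter.GaussianBlur(radius=25))
    return image
```

The key design point the sketch captures is that the decision happens before display on the recipient's device, not in transit, so no third party, including the platform, needs access to message contents.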

As scrutiny continues over the addictive nature of its apps and their potential impact on mental health, Meta is prioritizing the safety and well-being of teenagers on Instagram. By blurring nude images in messages, flagging potential scams, and hiding sensitive content from younger users, the company is responding to concerns raised by regulators, attorneys general, and the public. Through a combination of technical measures and policy changes, Meta is working to create a safer online environment for teenagers while preserving the functionality and privacy of its social media platforms.
