Instagram is introducing separate teen accounts for users under 18 in an effort to make the platform safer for children. Beginning in the U.S., U.K., Canada, and Australia, anyone under 18 who signs up for Instagram will be placed into a teen account, and existing accounts will be migrated over the next 60 days. Teens in the European Union will see their accounts adjusted later this year. Meta, the company behind Instagram, acknowledges that teenagers may lie about their age and will require age verification in more instances. The company is also developing technology to proactively detect teen accounts posing as adults and place them into restricted teen accounts.

Teens with the new accounts will have private settings by default, with direct messages restricted to people they follow or are already connected to. Meta will limit “sensitive content” on teen accounts, such as videos of people fighting or content promoting cosmetic procedures. Teens will also receive notifications if they spend more than 60 minutes on Instagram, and a “sleep mode” enabled from 10 p.m. to 7 a.m. will mute notifications and send auto-replies to direct messages. While these settings will be turned on for all teens, 16- and 17-year-olds can turn them off on their own; those under 16 will need parental permission to do so.

The announcement of teen accounts comes as Meta faces lawsuits from numerous U.S. states accusing the company of harming young people and contributing to the youth mental health crisis. While Meta has made efforts to address teen safety and mental health on its platforms, these changes have often been criticized for not going far enough. For instance, although teens will receive a notification after 60 minutes of app usage, they can dismiss it unless a parent activates “parental supervision” mode, which can limit daily time on Instagram to a set amount, such as 15 minutes. With these latest changes, Meta is giving parents more options to monitor their kids’ accounts, and users under 16 will require parental permission to switch settings to less restrictive ones.

Nick Clegg, Meta’s president of global affairs, said that parents have not been using the parental controls the company previously introduced. The introduction of teen accounts is meant to create a stronger incentive for parents and teens to set up parental supervision. Through the family center feature, parents can see who is messaging their teen and use that visibility to have important conversations about online safety and potential bullying or harassment. U.S. Surgeon General Vivek Murthy has criticized tech companies for placing too much responsibility on parents to keep children safe on social media, given how rapidly the technology evolves and how it shapes children’s perceptions of themselves, their friendships, and the world.

Overall, Instagram’s introduction of teen accounts aims to address concerns about the impact of social media on young people’s mental health and safety. By implementing default privacy settings, limiting sensitive content, and introducing time management and parental supervision features, Meta hopes to create a safer environment for teens on the platform. Despite facing lawsuits and criticism in the past, Meta continues to make efforts to improve safety measures for young users and reduce the negative impact of social media on their lives.
