Meta Platforms has announced enhanced privacy and parental controls for Instagram accounts of users under 18, addressing concerns about the negative effects of social media on teenagers.
As part of the update, all Instagram accounts held by users under 18 will automatically be converted to “Teen Accounts,” which are private by default. These accounts will only accept messages and tags from people the user already follows or is connected with, and sensitive-content settings will default to the most restrictive level.
For users under 16, changes to these default settings will require parental approval. Parents will also have access to tools to monitor their children’s interactions and limit their app usage.
This update comes amid rising concerns about the impact of social media on mental health, with several studies linking its use to increased levels of depression, anxiety, and learning disabilities among young people. Meta, along with TikTok and YouTube, already faces numerous lawsuits over claims that social media is addictive, particularly for children.
In July, the U.S. Senate advanced two bills—the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act—that would require social media companies to take responsibility for the impact their platforms have on young users.
Instagram’s new measures also include prompts urging users to close the app after 60 minutes of daily use and a default sleep mode that silences notifications overnight. The rollout of Teen Accounts will begin within 60 days in the U.S., UK, Canada, and Australia, with the European Union following later in the year. Global implementation is planned for January.