After launching dedicated teen accounts earlier this year, Meta announced today (Wednesday) a major update to Instagram's content policy that will take effect for hundreds of millions of teenagers worldwide, including in Israel. Under the new policy, teen accounts will default to the standard of the PG-13 movie rating, intended for viewers aged 13 and up. In practice, content shown to young users will be comparable to what is considered appropriate for cinema audiences of those ages, with strict filtering of sexually explicit, graphic, or dangerous content. Teens will not be able to change this setting without parental approval, and the system will also use age-detection technology to prevent users from bypassing the restrictions, even if they claim to be adults.
In addition to the default setting, Meta is introducing a new mode called "Limited Content," intended for parents who want even stricter control over what their children are exposed to. This mode restricts viewing further, including blocking comments, access to certain content, and interactions with Instagram's artificial intelligence. According to Meta, 96% of parents in the United States said they viewed the new option favorably, even if they did not intend to activate it themselves.
The update is designed to enhance Instagram’s existing protection framework and align it with standards parents are familiar with elsewhere. Meta emphasizes that its goal is to give parents peace of mind that the content their teens are exposed to is age-appropriate while maintaining reasonable freedom of action within the platform. Alongside this, the company has developed advanced monitoring mechanisms that identify inappropriate content and accounts and automatically block them from appearing in feeds, recommendations, and searches. Even accounts that consistently post adult-only content will not be able to interact with teen accounts under any circumstances.
To ensure the update meets user expectations, Meta involved thousands of parents worldwide in testing content and assessing its suitability for teens. The company reports receiving more than three million ratings from parents, which were used to refine the new policy. In the future, Meta will also allow parents to report posts they believe are inappropriate for teens, committing to review such reports as a high priority.
The new policy will roll out gradually starting today in the United States, the United Kingdom, Canada, and Australia, and will expand to additional countries by the end of the year. Later, Meta plans to extend the protections to Facebook as well, providing teens with a unified layer of protection across the company’s platforms.