Instagram to Implement PG-13 Style Controls to Enhance Teen Safety, Says Meta
Highlights
Meta, the parent company of Instagram, has announced plans to introduce a content rating system inspired by the US PG-13 movie classification to better protect teenage users on the platform.
This initiative aims to provide parents with stronger controls over their children's Instagram experience by automatically placing users under 18 into a 13+ content setting.
Teenagers will only be able to opt out of this setting with parental permission.
Currently, Instagram already restricts sexually suggestive content, graphic images, and adult themes like tobacco and alcohol for teen accounts.
However, the new system will extend these restrictions to include posts featuring strong language, risky stunts, and content encouraging harmful behaviors such as marijuana use.
Additionally, Instagram will block searches for terms like "alcohol" and "gore," even if misspelled, to further limit exposure to inappropriate material.
Meta explained that while social media differs from movies, the goal is to align Instagram's teen experience with the familiar PG-13 standard, which allows some mature content but with restrictions.
For context, the UK equivalent rating is 12A, which permits fleeting nudity and moderate violence, and Instagram's new policy will reflect similar allowances rather than outright bans.
This move follows criticism arising from independent research led by former Meta engineer Arturo Béjar, which found that 64% of Instagram's new safety tools were ineffective and concluded that children remain unsafe on the platform.
Meta has disputed these findings, emphasizing the availability of robust parental controls.
The UK regulator Ofcom has also urged social media companies to prioritize safety or face enforcement action.
The updated Instagram safety features will initially roll out in the US, UK, Australia, and Canada, with plans to expand to Europe and globally early next year.
Despite these efforts, child safety advocates remain skeptical.
Rowan Ferguson of the Molly Rose Foundation called for transparency and independent evaluation to ensure the new measures genuinely protect teens from harmful content.
This development highlights ongoing challenges in balancing content accessibility and safety for younger users on social media platforms.