Meta, the parent company of Instagram, is introducing a new account setting specifically designed for users under the age of 18 in a bid to improve safety on its platform.
Beginning Tuesday, teenagers in the US, UK, Canada, and Australia will automatically be placed in a restricted “teen account” with enhanced parental supervision options when signing up for Instagram.
Existing accounts held by users under 18 will transition to this new setting over the next 60 days.
Meta plans to roll out similar changes to teen accounts in the European Union later this year.
This move comes amid increasing public backlash over social media’s influence on young people’s mental health, with lawmakers, parents, and advocacy groups criticizing tech companies for failing to protect children from harmful content and online predators.
In January, the Mark Zuckerberg-led social media giant announced new content guidelines intended to give teenagers a safer, more age-appropriate experience on the platform, in line with expert recommendations.
However, in June, a Wall Street Journal investigation revealed that the platform was continuing to recommend adult content to underage users.
New features: Parental supervision, content restrictions
One of the most significant updates to the new teen accounts is the enhanced parental supervision options.
Parents will now have the ability to oversee their children’s Instagram usage by setting time limits, blocking app access during nighttime hours, and monitoring who their teens are messaging.
Teens under the age of 16 will need parental permission to change their account settings, while 16- and 17-year-olds will be allowed to modify certain restrictions independently.
“The three concerns we’re hearing from parents are that their teens are seeing content that they don’t want to see, that they’re getting contacted by people they don’t want to be contacted by, or that they’re spending too much time on the app,” explained Naomi Gleit, Meta’s head of product.
“Teen accounts are really focused on addressing those three concerns.”
In addition to the monitoring tools, these accounts will limit “sensitive content,” such as videos of violent behavior or cosmetic procedures.
Meta will also remind teens once they have spent more than 60 minutes on Instagram and introduce a “sleep mode” that disables notifications and sends auto-replies to messages between 10 p.m. and 7 a.m. These features are designed to help teens manage their time on the app and avoid excessive use at night.
While these restrictions are enabled by default for all teens, those aged 16 and 17 will have the option to turn them off; younger users will need a parent’s consent to adjust the settings.
Growing legal challenges and mental health concerns
The introduction of teen accounts coincides with ongoing legal battles Meta is facing, as dozens of US states have sued the company, accusing it of deliberately designing addictive features on Instagram and Facebook that harm young users.
The lawsuits claim that Meta’s platforms contribute to the worsening youth mental health crisis, with teens exposed to unhealthy amounts of screen time, harmful content, and online bullying.
US Surgeon General Vivek Murthy voiced concerns last year about the pressures being placed on parents to manage their children’s online experiences without adequate support from tech companies.
In a statement in May 2023, he said:
“We’re asking parents to manage a technology that’s rapidly evolving, that fundamentally changes how their kids think about themselves, how they build friendships, and how they experience the world.”
Meta’s latest effort to improve online safety for teens follows a series of prior attempts, many of which were criticized for not going far enough.
For instance, teens will still be able to bypass the 60-minute time notification if they wish to keep scrolling, unless parents enable stricter parental controls through the “parental supervision” mode.
Nick Clegg, Meta’s president of global affairs, acknowledged last week that parental control features have been underutilized, saying, “One of the things we do find … is that even when we build these controls, parents don’t use them.”
Teen accounts part of a global strategy
Unlike some of Meta’s other recent actions, such as the ability for EU users to opt out of having their data used to train AI models (a feature not yet available in other regions), the teen accounts are part of a global strategy.
In addition to the US, UK, Canada, and Australia, Meta plans to introduce these changes across the European Union by the end of the year.
Antigone Davis, Meta’s director of global safety, emphasized that this new feature was driven by parental concerns rather than government mandates. “Parents everywhere are thinking about these issues,” Davis told Guardian Australia.
“The technology at this point is pretty much ubiquitous, and parents are thinking about it. From the perspective of youth safety, it really does make the most sense to be thinking about these kinds of things globally.”
Countries explore social media regulation for teens
The timing of Meta’s announcement aligns with broader governmental efforts to regulate children’s access to social media platforms.
Just a week prior, the Australian government proposed legislation that would set the minimum age for social media access somewhere between 14 and 16.
If enacted, this law would place Australia among the first countries to enforce such a ban, with other nations like the UK monitoring its progress closely.
As countries like Australia and the UK explore further restrictions on social media for teens, Meta’s new teen accounts reflect a growing global awareness of the need for greater online protections for young users.
With its new features, Meta hopes to strike a balance between empowering parents and keeping Instagram a safe space for teens.