Mark Zuckerberg's Instagram announced a major overhaul of its child-safety features on Tuesday, a move that was quickly condemned by online watchdog groups as an attempt to head off a looming Congressional crackdown on social media giants.
Instagram announced that it will automatically categorize users under the age of 18 as “teen accounts” and block people who don't follow them from seeing or interacting with their content.
It will also mute Instagram app notifications for teen users between 10pm and 7am and send them “time limit reminders” encouraging them to close the app after 60 minutes per day.
Parents will be able to see which accounts their children have recently messaged, set daily time limits, and even block their children from using the app during certain times.
Additionally, users under the age of 16 will need parental permission to change safety settings on their account.
The reforms come as the bipartisan Kids Online Safety Act (KOSA) – landmark legislation that would impose a legal “duty of care” on Instagram's parent company Meta, TikTok and other social media companies to protect children from online harm – gains momentum in Congress.
In July, the Senate overwhelmingly passed KOSA and a companion bill, COPPA 2.0, by a vote of 91-3. COPPA 2.0 would ban targeted advertising to minors, bar data collection without their consent, and give parents and children the option to remove their information from social media platforms.
The House Energy and Commerce Committee is scheduled to mark up the bill on Wednesday, a key step that could pave the way for a full floor vote in the near future.
Fairplay, one of the groups leading the push for KOSA's passage, denounced Meta's announcement as an attempt to stave off a meaningful legislative crackdown.
“Defaulting to private accounts for minors and turning off notifications at night are safeguards that Meta should have put in place years ago,” said Josh Golin, executive director of Fairplay. “We hope lawmakers won't be fooled by this attempt to block legislation.”
“The Kids Online Safety Act and COPPA 2.0 will require companies like Meta to ensure their platforms are safe and privacy-protecting for young people all the time, not just when it's politically convenient,” Golin added.
Alix Fraser, director of the Council for Responsible Social Media, echoed the criticism.
“The simple fact is that this announcement comes amid growing congressional pressure and continuing support for bipartisan kids' online safety legislation,” Fraser said. “This is not the first time that Meta has made promises to head off congressional action and then either not followed through or quietly backed away.”
Policymakers have specifically criticized Meta for failing to protect children from “sextortion” scams and other online sexual abuse.
Critics also accuse apps like Instagram of contributing to a mental health crisis among young people, with negative effects ranging from anxiety and depression to eating disorders and even self-harm.
Last fall, a coalition of state attorneys general sued Meta, alleging that the company uses addictive features to hook kids and profits at the expense of their mental health.
In January, Zuckerberg offered a rare apology to the families of victims of online abuse during a tense hearing before Congress.
After easily passing the Senate, KOSA's ultimate fate in the House is unclear, with some critics in both parties expressing concern about its impact on free speech online.
In July, US Surgeon General Vivek Murthy called for social media apps to implement tobacco-like “warning labels” to raise awareness of potential mental health risks, including depression and anxiety.