Meta is rolling out more parental controls for Instagram and VR: NPR
Facebook parent company Meta is rolling out additional parental oversight measures for Instagram and its virtual reality headset, expanding a suite of tools launched in the US in recent months.
The changes follow a year of intense public scrutiny of the company, with much of the criticism focused on child safety and Instagram’s harmful effects on young users, especially teenage girls.
Last fall, a Wall Street Journal investigation reported that the company’s own research had repeatedly confirmed the photo-sharing app’s adverse effects on the mental health of teenage girls, even as Meta proceeded with a controversial plan to develop a version of the platform for kids under 13. (That project has since been put on hold.)
The months that followed brought additional revelations from whistleblower Frances Haugen, a congressional investigation into child safety, and an investigation by state attorneys general into how Instagram recruits and affects children.
The company announced in December that it would release new safety tools for teens and their parents, which it began rolling out in March.
Instagram says users must be at least 13 years old to create an account – an easy rule to circumvent because the app has no age verification process.
Antigone Davis, Meta’s global head of safety, told NPR’s Morning Edition that the company is working on specific safeguards – such as developing artificial intelligence to better identify underage users – but that doing so remains a challenge.
“There really is no panacea to solve this problem,” she said. “This is a problem that the industry is facing, and we are trying to find several ways to solve this problem.”
In the meantime, Meta is taking steps to give parents and guardians more control over their children’s activities in VR and on Instagram – implementing some of the changes it first announced in March.
Meta announced on Tuesday that it is rolling out parental supervision tools to all of its Oculus Quest virtual reality headsets, and expanding some Instagram parental controls in the U.S. before bringing more of them to over half a dozen countries.
The new features will allow parents to approve or decline their teen’s requests to purchase certain Quest apps, block apps that may be inappropriate for younger users, and view their teen’s apps, headset screen time, and Oculus Friends list. Parents can also prevent their teen from accessing PC content on their Quest headset by blocking Link and AirLink.
Teens must initiate the process, and both parties must agree, before parents can link to their teen’s Quest account, Meta added.
On Instagram, parents and guardians can now invite their teens to turn on supervision tools (a process that previously worked only the other way around), can limit their teen’s Instagram use to specific times of day or days of the week, and can see more information when their teen reports a post or account.
Instagram will also launch new “nudges” for teen users in select countries, encouraging them to switch to a different topic if they repeatedly view the same type of content on their Explore page.
“We designed this new feature because research suggests that nudges can be effective in helping people – especially teens – be more aware of how they’re using social media in the moment,” Meta explained. The company cited internal research from a one-week test showing that one in five teens who saw the new nudges switched to a different topic.
The company says it will soon roll out reminders prompting teens to turn on its existing Take a Break feature when they have been browsing Reels for a while.
As part of this new suite of updates, Meta is also working to provide parents and guardians with more information and resources. It says it is adding new articles — including tips for talking to teens about various topics online — to its Family Center education hub, and launching a parent education hub for virtual reality.
“This is just a starting point, informed by careful collaboration with industry experts, and we will continue to develop and evolve our parental supervision tools over time,” the company adds.
The company’s announcement came after it recently faced eight lawsuits across the country, all accusing it of deliberately making Instagram and Facebook addictive to young people in order to boost Meta’s profits, as Bloomberg reported.
A Meta spokesperson declined to comment to Bloomberg on the litigation, but pointed to the time limits and other parental controls the company has developed for Instagram.
Editor’s note: Meta pays NPR to license NPR content.