Opinion | Facebook’s strategies to hook our kids are dangerous

Facebook was on Capitol Hill Thursday to receive its semi-regular rebuke from Congress on how bad its services are for America.

“Facebook is like Big Tobacco, offering a product that they know is harmful,” said Sen. Ed Markey, Democrat of Massachusetts, calling the company’s photo-sharing app “Insta-greed.”

“Our products add value and enrich the lives of teenagers; they allow them to connect with their friends and their families,” insisted Antigone Davis, Facebook’s global head of safety, without conviction.

After Facebook’s countless contrite appearances before Congress, that doesn’t even make for good theater anymore. It’s a shame, because the hearing focused on the technology’s most vulnerable users – children.

Ahead of the hearing, Facebook said it was suspending work on a controversial app designed to hook young people on Instagram.

If Facebook is to be believed, the planned Instagram Kids app would include safeguards to ensure that Instagram’s worst – body shaming, trolling, bullying, racism, targeted advertising – is kept out in favor of a suitably antiseptic version for children 12 and under.

But who can trust Facebook after years of collecting and hiding pernicious data about the inner workings of its much-vaunted news feed? Time and time again, leaks from the company have shown it ignoring signs that its apps sow hatred, encourage extremism, and widely disseminate dangerous misinformation.

Facebook “consistently puts profits ahead of children’s online safety,” said Sen. Richard Blumenthal, Democrat of Connecticut. “We now know that it prioritizes the growth of its products over the well-being of our children.”

Clearly, a pause isn’t enough – children’s social media apps just aren’t ready for prime time. They serve mainly to build a bridge to the main apps, where the cool, grown-up stuff happens, and to hook kids young. (My own kids avoid the YouTube Kids app like spinach.) And rather than fixing systemic problems with their main sites, the kids’ apps shift more responsibility onto parents, who don’t have an army of moderators on their side.

Before the Senate, Facebook’s Ms. Davis detailed a lengthy list of design features, policies and other safeguards needed to protect teens and young children from the dangers of its services. Maybe Facebook should read that list as a sign that its products are a bad idea for kids?

Instagram, in particular, is a hub for youth anxiety and mental health issues. The company’s own research indicates that the app makes body image issues worse for roughly a third of teenage girls who already struggle with them, according to a recent Wall Street Journal report. Equally disturbing, Facebook appears to have proceeded without fully and properly consulting child safety experts. Instagram chief Adam Mosseri said the pause “will give us time to work with parents, experts, policymakers and regulators.” Wasn’t that the original plan?

Not that Facebook is likely to heed expert advice anyway. It went ahead with its Messenger Kids app despite an outcry from health experts who said it could be harmful to children.

“The goal is simply to capture the most users and become the intermediaries in our social interactions,” said Priya Kumar, an assistant professor at Pennsylvania State University who studies how technology affects families. A kids’ app is also likely to fuel Facebook’s advertising machine, providing more information for targeting ads at parents on its main sites.

The Journal reported this week that Facebook views tweens and young children as a lucrative market that has yet to be fully tapped. Researchers at the company seemed puzzled that kids weren’t interacting with one another on screens during in-person play dates. “Is there a way to leverage play dates to drive word of mouth/growth among kids?” one document asked.

Children are an irresistible market for businesses. Amazon on Tuesday introduced a video-calling device for children that automatically enrolls families in a subscription content service (it has also reportedly considered a tracking device for children as young as 4).

Businesses know that kids’ versions of their apps will quickly lead kids to the mainstream apps, where they can be reached by targeted ads and fall prey to data collection just like everyone else. YouTube agreed to pay $170 million in 2019 to settle allegations that it served targeted ads to children under 13 and collected their personal information. This was four years after the rollout of YouTube Kids, which was supposed to block kids from accessing the main video streaming site. Not exactly a resounding success.

Facebook’s Messenger Kids application for online chat has allowed some children to join groups with strangers. According to Wired, the research the company cited to justify the app as safe for children mainly involved groups and individuals with financial ties to Facebook.

YouTube CEO Susan Wojcicki recently claimed the video streaming site was “useful” for adolescent mental health, as a way to de-stigmatize sensitive issues. But one lesson from the Wall Street Journal’s Facebook series is that tech companies’ public statements often don’t match their internal data.

Mr. Mosseri and others claim that their children’s products are a necessary fix for an intractable problem: children can lie about their age to use apps, or simply use their parents’ or friends’ accounts, which makes filtering objectionable content difficult. Yet Facebook, which seems to know my innermost thoughts, must surely have some idea of who is using its services at any given time.

The truth is, Facebook, YouTube, TikTok and other companies are chasing continued growth. Cultivating elementary schoolers as users helps ensure a steady pipeline of newcomers who will move quickly to the platforms’ most profitable properties.

That’s why the companies haven’t made any serious effort to clean up their core apps – there’s just too much money at stake. But when their hand is forced, they quickly find creative ways to comply with local regulations. A law that came into full force in Britain this month to better protect young people has prompted a wave of new privacy measures from the tech giants, including requiring Instagram users to state their date of birth before using the app.

Without a comprehensive privacy law, the United States has largely left businesses to regulate themselves. With the same effort and financial commitment they put into creating (and championing) children’s versions of their apps, social media companies could have designed better age-verification systems. Instagram Kids’ target audience is supposed to be 10-to-12-year-olds, but what’s really stopping a first grader from lying about their age to get into the youth version of the app?

It is unreasonable to expect children not to use the main versions of Facebook, YouTube and TikTok, and just as unreasonable to expect them not to be drawn to just about anything that makes them feel older. (Seventeen magazine was always really aimed at 14-year-olds.) Parents I know aren’t clamoring for yet another site that tells them Big Tech is good medicine. They would much rather have a safer experience on the sites their kids and their kids’ friends are already using.

Lawmakers, then, have an obligation to protect our children by demanding better age-verification software, pushing for other design changes – like ending autoplay features that can send young users down extremist rabbit holes – and requiring more transparency about what data is collected from minors and how it is used. They should consider fast-tracking proposals to update the long-standing children’s online privacy law, such as tighter controls on marketing to children.

Mr. Mosseri is right about one thing: Facebook and its competitors have created services that are irresistible to teens and tweens, and kids will find their way to them by hook or by crook. And his own company’s data shows there is plenty of harm in letting them onto its main apps.

The outrage on Capitol Hill is certainly there. For our children’s sake, let’s hope it isn’t just bluster.

