YouTube removed 120,000 videos sexually exploiting children in the first half of the year


YouTube is the world’s largest online video platform.

Angela Lang / CNET

YouTube told Congress on Tuesday that it had removed more than 120,000 videos in the first half of this year for sexually exploiting children or for sexually explicit content featuring minors. By comparison, YouTube removed a total of 15.9 million videos during that period for violating any of its community guidelines. YouTube also reported the child-exploitation videos to the National Center for Missing and Exploited Children.

The statistic was part of YouTube's written testimony for a Senate hearing Tuesday morning on protecting children online, delivered by Leslie Miller, YouTube's vice president of government affairs and public policy. The hearing, at which lawmakers questioned representatives from YouTube, Snapchat and TikTok before the Senate subcommittee on consumer protection, product safety and data security, follows one earlier this month at which Facebook whistleblower Frances Haugen accused Facebook of "moral bankruptcy" because of products that "harm children, fuel division and weaken our democracy."

The latest hearing comes as the Google-owned platform and other so-called Big Tech companies face unprecedented heat from lawmakers and regulators over the real-world effects of their products and policies. Some of the most intense scrutiny has focused on how technology hurts or endangers children. Until recently, YouTube had largely avoided the fiercest criticism from lawmakers over how social platforms can harm kids. YouTube, however, is hugely popular with young people. One study suggests that children's content may be the most-watched video category on YouTube overall.

YouTube also noted on Tuesday that it had deleted 7 million accounts believed to belong to young children and tweens in the first nine months of this year, including 3 million in the third quarter, as the company "ramped up our automated removal efforts." (For context, more than 2 billion accounts actively visit YouTube each month.)

YouTube's terms of service require account holders to be at least 13 years old, so children under 13 technically aren't allowed to have YouTube accounts. They can still access YouTube within its rules through so-called supervised experiences, which limit content and features of the platform that may be risky for young viewers, or through YouTube Kids, an app designed for young children.

But many online platforms with age limits, including YouTube, have come under fire for sluggish enforcement of their age restrictions.

Recently, YouTube has stepped up automated enforcement of age restrictions in other parts of its service as well. A year ago, the company said artificial intelligence would automatically apply age restrictions to videos; essentially, machine learning decides whether a video should be classified as suitable only for viewers over 18.

In its written testimony Tuesday, YouTube also noted that 85% of the videos it removed for breaking its child safety rules in the second quarter were taken down before reaching 10 views. That's a moderately more aggressive rate of removing videos before they spread widely than for YouTube's takedowns overall: of all the videos deleted for breaking any of its rules during the same period, around 75% were removed with 10 or fewer views.

YouTube has been criticized over a series of scandals involving children in the past. In 2019, Google agreed to pay a record $170 million fine to settle a federal investigation into children's data privacy on the giant video site. YouTube has also faced scandals involving videos of child abuse and exploitation, nightmarish content in its YouTube Kids app, and predatory comments that sexualized clips of young children.

