YouTube banned anti-vaxxers months after Facebook and Twitter did.


YouTube on Wednesday announced a major escalation in how it deals with content that poses a risk to public health: The platform now bans misinformation about any vaccine approved by local health authorities and the World Health Organization. YouTube’s medical misinformation policies previously prohibited the promotion of untested harmful remedies and false claims about COVID-19, but the Google-owned company says the pandemic prompted it to reconsider anti-vaccine content more broadly. “We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,” reads a company blog post explaining the decision. Among the inaccurate narratives the platform will target are supposed links between autism and vaccines, microchips purportedly hidden in vaccines, and blanket portrayals of vaccines as ineffective and dangerous. In addition to establishing these new policies, YouTube has also suspended the accounts of prominent anti-vaxxers such as Robert F. Kennedy Jr., Joseph Mercola, and Sherri Tenpenny.

The action is significant and has been applauded by public health experts. The timing, though, is something else. Facebook announced its own ban on vaccine misinformation in February, after years of advocates and researchers calling for such a policy. Twitter did so in March. And although YouTube unveiled its policy on Wednesday, the platform notes in its blog post that, “as with any significant update, it will take time for our systems to fully ramp up enforcement.” As by far the largest video platform, YouTube is a major part of the online anti-vaccine ecosystem, serving as a repository for deceptive videos that originate on the platform and then circulate widely on Facebook and Twitter. Critics pointed out during the pandemic that YouTube seemed to escape scrutiny in the public discourse on anti-vaccine misinformation, particularly when President Joe Biden suggested in July that Facebook was a major force propelling vaccine hesitancy in the United States. Anti-vaccine claims tend to be harder to track on YouTube, since they often come in the form of statements made within long videos; it’s easier to monitor snippets of text containing vaccine claims circulating on Facebook and Twitter. At the same time, some of the accounts YouTube banned on Wednesday had tens of thousands of subscribers and millions of views. They were clearly not unknown to the company.

This isn’t the first time YouTube has been late to the party on misinformation issues. In 2020, YouTube took a largely passive approach to disinformation surrounding the presidential election compared with Facebook and Twitter. The platform did not begin taking action against videos making misleading claims of widespread electoral fraud until a month after the election, once the “safe harbor” deadline for states to resolve disputes over their results had passed. During the election, YouTube opted to attach information panels to all videos relating to the results of the vote, whether or not they were truthful. Facebook and Twitter, on the other hand, labeled certain posts as disinformation and limited the spread of some misleading content. YouTube was also the last major platform to shut down former President Donald Trump’s account in the wake of the Jan. 6 riot at the Capitol that it helped incite, and arguably the least aggressive in handling the fallout. While other platforms have either permanently banned Trump’s account or specified the length of his suspension, YouTube has simply said that it will allow the former president to post again once the threat of violence subsides, which means YouTube might be his best chance of returning to mainstream social media in the near future.

In another example, YouTube went public with its policies cracking down on the QAnon conspiracy theory last fall, a week after Facebook and months after Twitter. Even then, YouTube’s policies weren’t as strict as those of other major platforms. While Facebook banned QAnon-related accounts and groups outright, YouTube did not completely purge the conspiracy theory from its platform and gave more leeway to what it called “borderline content.”

Notably, YouTube CEO Susan Wojcicki also has never had to appear before Congress to answer questions about social media misinformation alongside Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey. While her boss, Google CEO Sundar Pichai, has testified before lawmakers on the matter, tech experts and reporters are eager to hear Wojcicki herself explain YouTube’s approach to Congress. It would seem there is a lot to explain.

Future Tense is a partnership between Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.


