How the war in Ukraine rocked Facebook and Instagram

Meta, which owns Facebook and Instagram, took an unusual step last week: It suspended some of the quality checks that ensure that posts from users in Russia, Ukraine and other Eastern European countries abide by its rules.

As part of the change, Meta temporarily stopped auditing whether the employees who moderated Facebook and Instagram posts from those areas were accurately applying its content guidelines, six people with knowledge of the situation said. That was because the workers could not keep up with shifting rules about what kinds of posts were allowed concerning the war in Ukraine, they said.

Meta has made more than half a dozen content policy revisions since Russia invaded Ukraine last month. The company allowed posts about the conflict that it would normally delete — including some calling for the death of Russian President Vladimir V. Putin and violence against Russian soldiers — before changing its mind or developing new guidelines, the people said.

The result has been internal confusion, especially among the content moderators who patrol Facebook and Instagram for gory text and images, hate speech and incitement to violence. Meta sometimes changed its rules on a daily basis, causing whiplash, said the people, who were not authorized to speak publicly.

Confusion over content guidelines is just one of the ways Meta has been troubled by the war in Ukraine. The company has also faced pressure from Russian and Ukrainian authorities in the battle over information about the conflict. And internally, it has grappled with dissatisfaction over its decisions, including from Russian employees worried about their safety and Ukrainian workers who want the company to be tougher on Kremlin-affiliated organizations online, three people said.

Meta has weathered international conflicts before – including the genocide of a Muslim minority in Myanmar in the past decade and skirmishes between India and Pakistan – with varying degrees of success. Now, the biggest conflict on the European continent since World War II has become a litmus test of whether the company has learned to control its platforms during major global crises – and so far, it appears to remain a work in progress.

“All the ingredients of the Russian-Ukrainian conflict have been there for a long time: calls for violence, disinformation, state media propaganda,” said David Kaye, a law professor at the University of California, Irvine, and a former special rapporteur at the United Nations. “What I find baffling is that they didn’t have a game plan to deal with it.”

Dani Lever, a spokeswoman for Meta, declined to comment directly on how the company handled content decisions and employee concerns during the war.

After Russia invaded Ukraine, Meta said it set up a round-the-clock special operations team staffed by employees who are native Russian and Ukrainian speakers. It also updated its products to help civilians during the war, including features that direct Ukrainians to reliable, verified information about finding housing and refugee assistance.

Mark Zuckerberg, chief executive of Meta, and Sheryl Sandberg, chief operating officer, have been directly involved in the war response, two people with knowledge of the efforts said. But as Mr. Zuckerberg focuses on transforming Meta into a company that will dominate the digital worlds of the so-called metaverse, much of the responsibility around the conflict has fallen – at least publicly – to Nick Clegg, the company’s president of global affairs.

Last month, Mr. Clegg announced that Meta would restrict access within the European Union to the pages of Russia Today and Sputnik, the Russian state-controlled media outlets, at the request of Ukraine and other European governments. Russia retaliated by cutting off access to Facebook inside the country, saying the company had discriminated against Russian media, and then blocking Instagram.

This month, Ukrainian President Volodymyr Zelensky praised Meta for moving quickly to limit Russian war propaganda on its platforms. Meta also moved quickly to remove an edited “deepfake” video from its platforms that falsely depicted Mr. Zelensky bowing to Russian forces.

The company also made some high-profile mistakes. It allowed a group called the Ukrainian Legion to run ads on its platforms this month recruiting “foreigners” for Ukraine’s military, a violation of international law. It later removed the ads – which had run in the United States, Ireland, Germany and elsewhere – because the group may have misrepresented its ties to the Ukrainian government, according to Meta.

Internally, Meta had also begun to change its content policies to deal with the fast-moving nature of wartime posts. The company has long banned posts that may incite violence. But on February 26, two days after Russia invaded Ukraine, Meta informed its content moderators – who are typically contractors – that it would allow calls for the death of Mr. Putin and “calls for violence against Russians and Russian soldiers in the context of the invasion of Ukraine,” according to the policy changes, which were reviewed by The New York Times.

This month, Reuters reported on Meta’s changes with a headline that suggested posts calling for violence against all Russians would be tolerated. In response, Russian authorities labeled Meta’s activities “extremist.”

Shortly after, Meta backtracked and said it would not let its users call for the death of heads of state.

“Circumstances in Ukraine are changing rapidly,” Mr. Clegg wrote in an internal memo that was reviewed by The Times and first reported by Bloomberg. “We are trying to think through all the consequences, and we are constantly revising our guidance because the context is always evolving.”

Meta has changed other policies as well. This month, it made a temporary exception to its hate speech guidelines so users could post about the “removal of Russians” and “explicit exclusion against Russians” in 12 Eastern European countries, according to internal documents. But within a week, Meta revised the rule to note that it applied only to users in Ukraine.

The constant adjustments have confused the moderators who oversee users in Central and Eastern European countries, said the six people with knowledge of the situation.

The policy changes were taxing because moderators typically had less than 90 seconds to decide whether images of dead bodies, videos of severed limbs or outright calls for violence violated Meta’s rules, they said. In some cases, they added, moderators were shown posts about the war in Chechen, Kazakh or Kyrgyz, despite not knowing those languages.

Ms. Lever declined to say whether Meta had hired content moderators specializing in those languages.

Emerson T. Brooking, a senior fellow at the Atlantic Council’s Digital Forensic Research Lab, which studies the spread of misinformation online, said Meta faces a dilemma with war content.

“Usually the content moderation policy is meant to limit violent content,” he said. “But war is an exercise in violence. There is no way to sanitize war or pretend it is something different.”

Meta has also faced complaints from employees about its policy shifts. At a meeting this month for workers with ties to Ukraine, employees asked why the company had waited until the war to take action against Russia Today and Sputnik, said two people who attended. Russian state activity was at the center of Facebook’s failure to protect the 2016 U.S. presidential election, they said, and it made no sense that those outlets had continued to operate on Meta’s platforms.

Although Meta has no employees in Russia, the company held a separate meeting this month for workers with Russian connections. Those employees said they feared Moscow’s actions against the company would affect them, according to an internal document.

In discussions on Meta’s internal forums, which were seen by The Times, some Russian employees said they had deleted their workplace from their online profiles. Others wondered what would happen if they worked in the company’s offices in places with extradition treaties with Russia and “what kinds of risks will be associated with working at Meta, not only for us but for our families.”

Ms. Lever said: “Meta’s thoughts are with all of our employees who are affected by the war in Ukraine, and our teams are working to ensure they and their families receive the support they need.”

At a separate company meeting this month, some employees expressed dissatisfaction with changes to speech policies during the war, according to an internal poll. Some questioned whether the new rules were necessary, calling the changes a “slippery slope” that “served as proof that Westerners hate Russians”.

Others asked about the effect on Meta’s business. “Will the Russia ban affect our revenue for the quarter? Future quarters?” read one question. “What is our recovery strategy?”
