Meta delays encrypted messages on Facebook and Instagram until 2023

The owner of Facebook and Instagram is delaying plans to encrypt user messages until 2023, amid warnings from child safety campaigners that its proposals would shield abusers from detection.

Mark Zuckerberg’s social media empire has come under pressure to abandon its encryption plans, which UK Home Secretary Priti Patel called “simply unacceptable”.

The National Society for the Prevention of Cruelty to Children (NSPCC) has said private messaging is the “front line of online child sexual abuse” because it blocks law enforcement and tech platforms from seeing messages by ensuring that only the sender and recipient can view their content – a process known as end-to-end encryption.

The head of safety at Meta, the parent company of Facebook and Instagram, announced that the move to encryption would take place in 2023. The company had previously said the change would arrive as early as 2022.

“We are taking our time to get it right and we do not plan to complete the global rollout of end-to-end encryption by default on all of our messaging services until 2023,” Antigone Davis wrote in the Sunday Telegraph.

“As a company that connects billions of people around the world and that has built cutting-edge technology, we are committed to protecting people’s private communications and keeping people safe online.”

Meta already uses end-to-end encryption on its WhatsApp messaging service and had planned to extend it to its Messenger and Instagram apps in 2022. It has already encrypted voice and video calls on Messenger. Announcing the privacy push in 2019, Zuckerberg said: “People expect their private communications to be secure and only seen by the people they sent them to – not hackers, criminals, overreaching governments, or even the people who operate the services they use.”

Meta’s apps are used by 2.8 billion people every day. The tech industry made more than 21 million reports of child sexual abuse identified on its platforms around the world to the US National Center for Missing and Exploited Children in 2020. More than 20 million of those reports were from Facebook.

Davis said Meta would still be able to detect abuse under its encryption plans by using unencrypted data, account information and user reports. A similar approach already allows WhatsApp to make reports to child safety authorities. “Our recent review of some historical cases showed that we would still have been able to provide critical information to authorities, even if those services had been end-to-end encrypted,” she said.

Patel has been a vocal opponent of Meta’s plans. “We cannot allow a situation where the ability of law enforcement to fight heinous criminal acts and protect victims is severely hampered,” she said in April.

The issue is also a concern for Ofcom, the communications regulator responsible for enforcing the Online Safety Bill, which is expected to become law around 2023 and imposes a duty of care on tech companies to protect children from harmful content and prevent abuse on their platforms. Ofcom chief executive Melanie Dawes told The Times on Saturday that social media companies should bar adults from messaging children directly or face criminal penalties.

NSPCC child safety online policy officer Andy Burrows praised Meta’s decision. “Facebook is right not to proceed with end-to-end encryption until it has a proper plan to prevent child abuse from going undetected on its platforms,” he said.

“But they should only move forward with these measures when they can demonstrate that they have the technology in place that will ensure children are not at greater risk of abuse.”
