
Tech companies should “live up to their moral duty” and do more to protect children, the Home Secretary will say later today.

Priti Patel will speak at a virtual panel discussion on end-to-end message encryption hosted by the National Society for the Prevention of Cruelty to Children (NSPCC) later today.

She will say that companies should “take the safety of children as seriously as they sell online advertising, phones and games.”


According to the NSPCC, private messaging is the “front line of online child sexual abuse”, and there is now an “either/or” argument being made between adult privacy and children’s safety.

An NSPCC poll, conducted by YouGov, found that public support for end-to-end encryption would almost double if it could be shown that children’s safety would not be compromised.

Tech companies use a range of technologies to identify child abuse images and detect grooming and sexual abuse in private messages on their platforms.

But there are fears that using end-to-end encryption for Facebook Messenger and Instagram would make these measures redundant.

The NSPCC said about 70% of global child abuse reports could be lost as a result.

In her remarks at the event, the Home Secretary is expected to accuse Facebook of “blinding itself to the problem” that end-to-end encryption will cause in cases of child abuse online.

“Unfortunately, at a time when we need to do more, Facebook is pursuing end-to-end encryption plans that jeopardize the good work and progress made so far,” Patel is expected to say.

“The offending will continue, images of abused children will proliferate – but the company intends to blind itself to the problem with end-to-end encryption that prevents access to message content.

“It’s not acceptable. We cannot allow a situation where the ability of law enforcement to deal with heinous criminal acts and protect victims is severely hampered.

“Simply deleting accounts from a platform is far from sufficient.”

The Home Secretary is also expected to urge Facebook to deepen its engagement with ministers to ensure public safety is central to its system design.

A Facebook spokesperson said, “Child exploitation has no place on our platforms and Facebook will continue to lead the industry in developing new ways to prevent, detect and respond to abuse.

“End-to-end encryption is already the primary security technology used by many services to protect people from hackers and criminals.

“Its full roll-out to our messaging services is a long-term project and we are building strong safety measures into our plans.”

Sir Peter Wanless, Chief Executive Officer of the NSPCC, said: “Private messaging is the front line of child sexual abuse, but the current debate around end-to-end encryption risks leaving children unprotected where there is the most harm.

“The public wants an end to the rhetoric that heats up the issue but sheds little light on a solution, so it is in the interests of businesses to find a fix that allows them to continue using technology to disrupt abuse in an end-to-end encrypted world.”


According to the survey commissioned by the NSPCC, 33% of adults support the use of end-to-end encryption on social media and messaging services.

That number jumps to 62% if tech companies can make sure children’s safety is protected.

More than half of adults (55%) believe that the ability to detect child abuse images trumps the right to privacy, according to the survey.

Meanwhile, over 90% of those surveyed believe social networks and messaging services should have the technical capacity to detect child abuse images on their platforms.

Sir Peter added: “We need a coordinated response across society, but ultimately government must be the safeguard that protects child users if tech companies choose to endanger them with dangerous design choices.”
