Technology of the kind Apple has proposed to use to scan iPhones for images of child sexual abuse would open the door to mass surveillance and be vulnerable to exploitation, global security and cryptography experts have said.

Client-side scanning (CSS) gives access to data on users’ devices, including stored data, which “takes surveillance to a new level”, according to analysis by academics at the Harvard Kennedy School, the Massachusetts Institute of Technology (MIT) and the University of Cambridge, among others.

They write that the technology, which moves scanning software onto users’ devices, “tears at the heart of citizens’ privacy”, but is also fallible: it could be evaded by those it is meant to target, and it could be misused.

In Bugs in Our Pockets: The Risks of Client-Side Scanning, a 46-page analysis of CSS published on the open-access site arXiv on Friday, the authors state: “In reality, CSS is bulk interception, albeit automated and distributed … CSS makes law-abiding citizens more vulnerable, with their personal devices searchable on an industrial scale.”

“To put it bluntly, this is dangerous technology. Even if it were initially deployed to scan for child sexual abuse material, content that is clearly illegal, there would be tremendous pressure to expand its reach. We would then be hard pressed to find any way to resist its expansion or to control abuse of the system.”

Apple’s plans, unveiled this year, involve a technique called “perceptual hashing” to compare photos with known images of child abuse when users upload them to the cloud. If the company detects enough matches, it will manually review the images before reporting the user account to law enforcement.
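Apple has not published the internals of its fingerprinting system, known as NeuralHash, which is built on a neural network. The sketch below instead uses an “average hash”, one of the simplest forms of perceptual hashing, purely to illustrate the general idea of matching compact fingerprints rather than raw pixels; the function names, the match threshold and the database of known hashes are illustrative placeholders, not Apple’s implementation.

```python
# Illustrative only: a toy "average hash", a very simple perceptual hash.
# Apple's NeuralHash is a proprietary neural-network-based fingerprint; this
# sketch only shows the idea of comparing fingerprints instead of raw pixels.
from PIL import Image
import numpy as np

def average_hash(path: str, hash_size: int = 8) -> int:
    """Shrink the image, greyscale it, then set one bit per pixel:
    1 if the pixel is brighter than the mean, 0 otherwise."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = np.asarray(img, dtype=np.float64)
    bits = (pixels > pixels.mean()).flatten()
    return int("".join("1" if b else "0" for b in bits), 2)

def hamming_distance(h1: int, h2: int) -> int:
    """Number of bits on which two fingerprints differ."""
    return bin(h1 ^ h2).count("1")

def matches_known(upload_path: str, known_hashes: list[int], threshold: int = 5) -> bool:
    """Hypothetical check: flag an upload whose fingerprint is close to any
    entry in a database of fingerprints of known illegal images."""
    h = average_hash(upload_path)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)
```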

Apple put the implementation on hold last month after a backlash from privacy campaigners, but not before researchers managed to construct visually different images that produced the same fingerprint, and so appeared identical to Apple’s scanning system, creating false positives.

Others have succeeded in doing the opposite: changing the mathematical output of an image without changing its appearance at all, thus creating false negatives.
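Both failure modes are easy to reproduce against a fingerprint as coarse as the toy average hash sketched above: two visually unrelated images can share a hash (a false positive), and small, invisible changes can alter a hash (a false negative). The snippet below, a continuation of that sketch rather than a statement about NeuralHash, constructs two very different synthetic images that collide under the toy hash.

```python
# Continuation of the toy sketch above: with such a coarse fingerprint, two
# visually unrelated images can collide (a false positive). How hard the
# equivalent attacks are against Apple's NeuralHash is a separate question.
import numpy as np
from PIL import Image

def average_hash_from_image(img: Image.Image, hash_size: int = 8) -> int:
    small = np.asarray(img.convert("L").resize((hash_size, hash_size)), dtype=np.float64)
    bits = (small > small.mean()).flatten()
    return int("".join("1" if b else "0" for b in bits), 2)

rng = np.random.default_rng(0)

# Image A: plain dark left half, plain bright right half.
a = np.zeros((256, 256), dtype=np.uint8)
a[:, :128] = 50
a[:, 128:] = 200

# Image B: completely different pixel content, but the same coarse
# dark-left / bright-right structure, so the 64-bit fingerprints agree.
b = np.empty((256, 256), dtype=np.uint8)
b[:, :128] = rng.integers(0, 100, size=(256, 128))
b[:, 128:] = rng.integers(156, 256, size=(256, 128))

h_a = average_hash_from_image(Image.fromarray(a))
h_b = average_hash_from_image(Image.fromarray(b))
print(h_a == h_b)  # True: identical fingerprints despite very different pixels
```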

The report’s authors say people could also try to disable the scanners, or avoid using devices such as iPhones that carry CSS. They added: “The software vendor, the infrastructure operator and the targeting curator must all be trusted. If any of them, or their key employees, misbehave or are corrupted, hacked or coerced, the security of the system may fail.”

While CSS may be presented as being intended to target specific content, the report warns: “Come the next terrorist scare, a little nudge will be all that is needed to reduce or remove current protections.”

It points out that Apple appears to have already bowed to state pressure, for example by moving the iCloud data of its Chinese users to data centres under the control of a Chinese state-owned company, and by removing the tactical voting app of the imprisoned Russian opposition leader Alexei Navalny from its Russian app store.

Ross Anderson, one of the report’s co-authors and professor of security engineering at the University of Cambridge, said: “It’s a very small step from there [targeting child sexual abuse material] to various governments saying, ‘here is a list of other images that we would like to add to the list of naughty images for iPhones in our country’.”

Approached for comment, Apple referred the Guardian to a statement that said: “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”