More than a dozen prominent cybersecurity experts on Thursday criticized plans by Apple and the European Union to monitor people’s phones for illicit material, calling the efforts ineffective and dangerous strategies that would embolden government surveillance.
In a 46-page study, the researchers wrote that Apple’s proposal to detect images of child sexual abuse on iPhones, as well as an idea floated by members of the European Union to detect similar abuse and terrorist imagery on encrypted devices in Europe, used “dangerous technology.”
“It should be a national security priority to resist attempts to spy on and influence law-abiding citizens,” the researchers wrote.
The technology, known as client-side scanning, would allow Apple – or, in Europe, potentially law enforcement officials – to detect images of child sexual abuse on someone’s phone by scanning images uploaded to Apple’s iCloud storage service.
When Apple announced the planned tool in August, it said a so-called fingerprint of the image would be compared to a database of known child sexual abuse material to look for potential matches.
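Apple’s actual system relies on its proprietary NeuralHash perceptual-hashing algorithm, whose internals are not public. As a rough illustration of the general fingerprint-and-match idea only, here is a minimal sketch using a simple “average hash” — every function name and threshold below is hypothetical, not Apple’s:

```python
def average_hash(pixels):
    """Compute a toy perceptual fingerprint (NOT Apple's NeuralHash).

    pixels: a flat list of grayscale values (0-255) from a downscaled
    image. Each pixel contributes one bit: 1 if at or above the mean
    brightness, 0 otherwise.
    """
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming_distance(a, b):
    # Number of bit positions where the two fingerprints differ.
    return bin(a ^ b).count("1")

def matches(hash_a, hash_b, threshold=5):
    # Hypothetical matching rule: two images are treated as the same
    # known image if their fingerprints differ in at most `threshold`
    # bits, so small re-encodings still match.
    return hamming_distance(hash_a, hash_b) <= threshold
```

In a scheme like the one described, the device would compute such a fingerprint for each uploaded image and compare it against fingerprints of known child sexual abuse material, flagging potential matches rather than exact byte-level duplicates.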
But the plan sparked an uproar among privacy advocates and raised concerns that the technology could erode digital privacy and possibly be used by authoritarian governments to hunt down political dissidents and other enemies.
Apple has said it would reject any such request from foreign governments, but the outcry led it to put the scanning tool on hold in September. The company declined to comment on the report released Thursday.
The cybersecurity researchers said they had begun their study before Apple’s announcement. Documents released by the European Union and a meeting with E.U. officials last year led them to believe that the bloc’s governing body wanted a similar program that would scan not only for images of child sexual abuse but also for signs of organized crime and indications of terrorist ties.
A proposal to allow such photo scanning in the European Union could come as early as this year, the researchers said.
They said they were releasing their findings now to inform the European Union of the dangers of its plan, and because “the expansion of state surveillance powers really does cross a red line,” said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.
Surveillance issues aside, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple’s announcement, they said, people had pointed out ways to avoid detection by editing the images slightly.
“It allows a personal private device to be scanned without any probable cause for anything illegitimate being done,” added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. “It’s extraordinarily dangerous. It’s dangerous for business, national security, public safety and privacy.”