
EU watchdog questions secrecy around lawmakers' encryption-breaking CSAM scanning proposal

The European Commission has again been asked to more fully disclose its dealings with private tech companies and other stakeholders in relation to a controversial piece of tech policy that could see a law mandating the scanning of European Union citizens' private messages to detect child sexual abuse material (CSAM).

The issue matters because of concerns that lobbying by the technology industry may have influenced the Commission's drafting of the controversial CSAM scanning proposal. Some of the withheld information relates to correspondence between the EU and private companies that could be potential suppliers of CSAM scanning technology – meaning they stand to benefit commercially from any pan-EU law mandating the scanning of messages.

The preliminary finding of maladministration by the EU Ombudsman, Emily O’Reilly, was made on Friday and published on her website yesterday. Last January, the Ombudsman reached a similar conclusion and invited the Commission to respond to her concerns. Her latest conclusions take into account the responses of the European executive and invite the Commission to reply to her recommendations with a “detailed opinion” by July 26 – so the saga is not yet over.

Draft legislation on CSAM scanning, meanwhile, remains on the table of EU co-legislators – despite a warning from the Council’s own legal service that the proposed approach is illegal. The European Data Protection Supervisor and civil society groups have also warned that the proposal represents a turning point for democratic rights in the EU. Last October, lawmakers in the European Parliament, also opposed to the Commission’s direction, proposed a substantially revised draft aimed at limiting the scope of the scanning. But the ball is in the Council’s court, because the governments of the Member States have not yet settled on their own negotiating position on the issue.

Despite growing concern and opposition from a number of EU institutions, the Commission has continued to back the controversial CSAM detection orders – ignoring critics’ warnings that the law could force platforms to deploy client-side scanning, with disastrous consequences for European internet users’ privacy and security.

The continued lack of transparency over the EU executive’s decision-making in crafting the controversial legislation does little to help, fueling fears that self-interested commercial players may have shaped the initial proposal.

Since December, the EU Ombudsman has been examining a complaint filed by a journalist who sought access to documents relating to the CSAM Regulation and the EU’s “associated decision-making process”.

After reviewing the information withheld by the Commission, as well as its defense of non-disclosure, the Ombudsman remains largely unimpressed by the level of transparency displayed.

The Commission released some information following the journalist’s public access request, but withheld 28 documents entirely and partially redacted five others, citing a series of exemptions to refuse disclosure, including the public interest in public security; the need to protect personal data; the need to protect commercial interests; the need to protect legal advice; and the need to protect its decision-making process.

According to information published by the Ombudsman, five of the documents linked to the complaint concern “exchanges with technology industry interest representatives”. It does not specify which companies corresponded with the Commission, but US company Thorn, a maker of AI-based child safety technology, was linked to lobbying on the issue in a BalkanInsight investigative report last September.

Other documents in the bundle withheld or redacted by the Commission include drafts of the impact assessment it prepared for the legislation, and comments from its legal service.

Regarding the EU’s correspondence with technology companies, the Ombudsman questions many of the justifications the Commission put forward for withholding the data – finding, for example, in the case of one of these documents, that while the decision to redact details of the information exchanged between law enforcement and a number of unnamed companies may be justified on public security grounds, there is no clear reason not to disclose the names of the companies themselves.

“It is not clear how the disclosure of the names of the companies involved could possibly harm public safety, if the information exchanged between the companies and law enforcement has been redacted,” the Ombudsman wrote.

In another case, the Ombudsman takes issue with the Commission’s apparently selective releases of information regarding comments from tech industry representatives, writing: “Based on the very general reasons for non-disclosure provided by the Commission in its confirmatory decision, it is not clear why it considered the withheld ‘preliminary options’ to be more sensitive than those it decided to disclose to the complainant.”

The Ombudsman’s conclusion at this stage of the investigation reiterates her earlier finding of maladministration against the Commission for refusing to grant “broad public access” to the 33 documents. In her recommendation, O’Reilly also writes: “The European Commission should reconsider its position on the access request with a view to providing significantly increased access, taking into account the Ombudsman’s considerations shared in this recommendation.”

The Commission was contacted about the Ombudsman’s latest findings on the complaint, but at press time had not provided a response.

Source: TechCrunch
