Mohamed Elmaazi · 17 May 2023

Leaked EU report reveals proposed EU and UK censorship laws risk obliterating rights to privacy and expression

Analysis · 14 min read

As the UK Online Safety Bill continues to be scrutinised at Committee Stage in the House of Lords, a recently leaked legal opinion from the Council Legal Service (CLS) of the Council of the European Union has cast serious doubt on the legality and practicality of a similar proposed EU Regulation.

In May last year, the European Commission proposed a Regulation which would obligate providers of information society services:

i) to assess the risk that their services are used for online child sexual abuse;

ii) to detect and report online child sexual abuse; and

iii) to remove or disable access to child sexual abuse material on their services.

Yet internal legal analysis by the Council’s own legal experts has warned that this Regulation could lead to the “permanent surveillance of all private communications”, the weakening or destruction of encryption, and a significant undermining of the rights to a private and family life, personal data and freedom of expression, amongst others.

The CLS legal opinion, dated 26 April 2023 and leaked last week, “provides a legal analysis of the conformity of the detection order applied to interpersonal communication services with Article 7 [respect for private and family life] and 8 [respect for personal data] of the Charter of Fundamental Rights (the Charter) as interpreted by the relevant case law of the [Court of Justice of the EU].”

Undermining Respect for Private and Family Life, Personal Communications and Freedom of Expression

Service providers must respond to “detection orders” under this Regulation by scanning for child sexual abuse material (CSAM). The opinion notes that this implies that “content of all communications must be accessed and scanned, and be performed by means of available automated tools”, “the exact nature of which is not specified in the proposal, as the proposal’s ambition is to remain technologically neutral.”

According to the CLS, this Regulation “would require the general and indiscriminate screening of the data processed by a specific service provider, and apply without distinction to all the persons using that specific service, without those persons being, even indirectly, in a situation liable to give rise to criminal prosecution.”

“The screening of interpersonal communications as a result of the issuance of a detection order undeniably affects the fundamental right to respect for private life, guaranteed in Article 7 of the Charter, because it provides access to and affects the confidentiality of interpersonal communications (text messages, e-mails, audio conversations, pictures or any other kind of exchanged personal information)”, the CLS analysis states.

“It is also likely to have a deterrent effect on the exercise of freedom of expression, which is enshrined in Article 11 of the Charter.” Notably, “It does not matter in this respect whether the information in question relating to private life is sensitive or whether the persons concerned have been inconvenienced in any way on account of that interference.”

The analysis goes on to state that “such screening constitutes the processing of personal data within the meaning of Article 8 of the Charter and affects the right to protection of personal data provided by that Provision.”

“Permanent Surveillance of All Interpersonal Communications”

The obligations imposed by detection orders “would imply that content of all interpersonal communications concerning that service (or the affected part or component where applicable) must be accessed and scanned by means of automated tools.” This means that “processing of data would not be limited” to private communications of people “in respect of whom there are reasonable grounds to believe” that they are “in any way involved in committing, or have committed a child sexual abuse offence, or presenting a connection, at least indirectly, with sexual abuse offences.”

The CLS deems that there is a “clear risk that, in order to be effective, detection orders would have to be extended to other providers and lead de facto to a permanent surveillance of all interpersonal communications”.

Destroying Encryption of All Private Communications and Beyond

The CLS notes that, for “the screening of content of communications” to be effective, it “would require de facto prohibiting, weakening or otherwise circumventing cybersecurity measures (in particular end-to-end encryption)”.

This would create an even stronger interference with other fundamental rights, freedoms and objectives, including “safeguarding data security”.

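The structural conflict the CLS identifies can be made concrete. Below is a minimal, hypothetical sketch in Python (not any provider’s actual design), with a hash-based blocklist standing in for the proposal’s unspecified “automated tools”. Because an end-to-end encrypted message is opaque to every intermediary, the only place a mandated scan can run is on the user’s own device, before encryption, which is precisely the “client-side scanning” that critics argue circumvents the end-to-end guarantee.

```python
# A minimal, hypothetical sketch (not any provider's actual design) of why
# detection orders collide with end-to-end encryption. Hash matching stands
# in for the proposal's unspecified "automated tools".
import hashlib
from cryptography.fernet import Fernet  # pip install cryptography

# Toy blocklist of hashes of known prohibited material (illustrative only).
BLOCKLIST = {hashlib.sha256(b"known prohibited content").hexdigest()}

def client_side_scan(plaintext: bytes) -> bool:
    """Scanning is only possible while the content is still plaintext."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

def send(plaintext: bytes, channel: Fernet) -> bytes:
    # The scan must run HERE, on the sender's device, before encryption.
    if client_side_scan(plaintext):
        print("match found: would be reported before the message is sent")
    # From this point on, no intermediary can inspect the content.
    return channel.encrypt(plaintext)

key = Fernet.generate_key()  # in genuine E2EE, only the endpoints hold keys
ciphertext = send(b"hello", Fernet(key))

# A server sees only ciphertext; hashing it reveals nothing about the
# plaintext, so a detection order cannot be satisfied server-side without
# weakening or bypassing the encryption itself.
assert hashlib.sha256(ciphertext).hexdigest() not in BLOCKLIST
```

Whatever detection technology replaces the hash check (perceptual hashing, machine-learning classifiers), the placement problem is the same: the scan needs plaintext, and genuine end-to-end encryption is designed to deny plaintext to everyone except the endpoints.
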
One of the requirements of the proposed Regulation is for screening to detect the “solicitation of children”, including by scanning audio and written communications. But in order to do this, the CLS states, age assessment/verification for all users of internet and private messenger services would be needed.

“In fact, without establishing the precise age of all users, it would not be possible to know that the alleged solicitation is directed towards a child.”

This would result in further major interferences with the rights and freedoms of tens, if not ultimately hundreds, of millions of people, because implementing this age verification process would “have to be done either by (i) mass profiling of the users or by (ii) biometric analysis of the user’s face and/or voice or by (iii) digital identification/certification system.”

In conclusion, the CLS finds that the sweeping consequences of implementing this Regulation carry “the serious risk” of “compromising the essence of the fundamental right to respect for private life.”

Lack of Legal Clarity Regarding the Automated Technology

Another CLS concern is that “the proposed Regulation does not specify in sufficient detail the nature and the features of the technologies to be made available” in the context of automated technologies which must be installed for the purpose of scanning and detecting CSAM. Two notable questions raised concern the definition of a “sufficiently reliable technology”, and what the acceptable “rate of errors” would be when attempting to balance “effectiveness” and using the “least intrusive measures” to detect CSAM online and in private messages.

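The “rate of errors” question matters because of base rates: when billions of overwhelmingly innocent messages are scanned, even a very accurate tool produces a flood of false flags. The following back-of-the-envelope Python illustration uses entirely hypothetical figures, since the proposal itself sets no accuracy threshold:

```python
# Base-rate illustration. All numbers are assumptions for the sake of
# argument, not figures from the proposal or the CLS opinion.
messages_per_day = 10_000_000_000  # assumed volume on one large messaging service
false_positive_rate = 0.001        # assume 0.1% of innocent messages are misflagged
prevalence = 1 / 1_000_000         # assume 1 in a million messages is actually CSAM
detection_rate = 0.9               # assume the tool catches 90% of real CSAM

true_hits = messages_per_day * prevalence * detection_rate
false_alarms = messages_per_day * (1 - prevalence) * false_positive_rate

print(f"true detections/day: {true_hits:,.0f}")     # 9,000
print(f"false alarms/day:    {false_alarms:,.0f}")  # ~10,000,000
print(f"flagged messages that are innocent: "
      f"{false_alarms / (false_alarms + true_hits):.2%}")  # ~99.91%
```

On these assumptions, more than 99.9% of flagged communications would be innocent, which illustrates why the CLS treats an undefined “rate of errors” as a legal problem rather than a mere engineering detail.
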
UK Online Safety Bill Even Larger in Scope

Like the proposed EU Regulation, the UK Online Safety Bill is ostensibly designed to stop the flow of, and remove, unlawful and “harmful” material across the Internet, covering internet service providers, internet search engines, websites, social media and encrypted private messenger applications.

It will “protect children and adults online” and make “social media companies more responsible for their users’ safety on their platforms,” the government says.

The proposed EU Regulation is far narrower in scope than the UK’s Online Safety Bill in terms of the types of prohibited content covered by law; the latter currently stands at over 260 pages before the House of Lords.

Furthermore, the UK is no longer an EU member, and the current government is in the process of repealing domestic laws which derive from its decades-long EU membership and diverging from the EU’s Regulations, Directives and case law. UK courts will no longer be bound by CJEU case law but “may have regard” to its decisions where it is relevant to do so.

But the CLS analysis of the proposed EU Regulation applies directly to the UK’s own detection and censorship proposals and their impact on fundamental rights and freedoms, proposals which are even more far-reaching because they are not limited to CSAM.

The UK government states that the Online Safety Bill will protect adults by requiring that all online platforms “remove all illegal content”, “remove content that is banned by their own terms and conditions” and “empower adult internet users with tools so that they can tailor the type of content they see”. Children, meanwhile, will be “automatically prevented from seeing this content without having to change any settings”.

In addition, social media companies will have to:

  • remove illegal content quickly or prevent it from appearing in the first place. This includes removing content promoting self harm
  • prevent children from accessing harmful and age-inappropriate content
  • enforce age limits and age-checking measures
  • ensure the risks and dangers posed to children on the largest social media platforms are more transparent, including by publishing risk assessments
  • provide parents and children with clear and accessible ways to report problems online when they do arise

The content, which must be removed or even pre-emptively prevented from being uploaded, includes, but is not limited to:

  • child sexual abuse
  • controlling or coercive behaviour
  • extreme sexual violence
  • fraud
  • hate crime
  • inciting violence
  • illegal immigration and people smuggling
  • promoting or facilitating suicide
  • promoting self harm
  • revenge porn
  • selling illegal drugs or weapons
  • sexual exploitation
  • terrorism

Most recently, a Tory peer has proposed adding sexist abuse to the list of material which must be removed.

“For young people who have become addicted to social media and its darker sides, this does not need to be your life. To people in power, and to people who can make change, please, criminalise harmful content,” Kate Winslet said during her Bafta acceptance speech on 14 May.

The UK government-backed charity Hope Not Hate supports the bill, as does the National Society for the Prevention of Cruelty to Children.

On the other hand, legal professionals, digital and human rights experts and technology specialists have questioned the practicality and potentially privacy-destroying consequences of the Online Safety Bill. The wide-ranging list of prohibited materials, along with the subjective nature of their interpretation, will inevitably lead companies to over-censor in order to avoid being targeted for non-compliance.

It also seems highly possible that material leaked to journalists could be blocked or removed, either because of demands by the government of the day or as a result of decisions made by risk-averse internet companies and websites.

The Online Safety Bill, like the proposed EU Regulation, would also require all facilitators of digital communications to scan private messages for prohibited content, including messages on encrypted services such as WhatsApp, Telegram or Signal.

As the damning leaked CLS opinion states, such surveillance and censorship are impossible without undermining or destroying encryption.

“As written, the Bill contains provisions that are positioned to undermine encryption, and could create an unprecedented regime of mass surveillance that would all but eliminate the ability of people in the UK to communicate with each other outside of government interference,” Signal’s president Meredith Whittaker said in a statement in March.

“Let me be blunt: encryption is either broken for everyone, or it works for everyone. There is no way to create a safe backdoor,” she added.


Mohamed Elmaazi

Mohamed Elmaazi is the new Editor-in-Chief of Truth Defence. He is a UK-based researcher and journalist who writes on a variety of subjects including geopolitics and legal encroachments upon civil liberties and human rights. Mohamed has covered all of Julian Assange's extradition hearings. His work has appeared in numerous outlets including The Dissenter, Consortium News, Jacobin, The Canary, The Grayzone, The Real News Network, the BBC and The Guardian. He also publishes articles via his website TheInterregnum.net. You can follow him on Twitter @MElmaazi.
