Opinion

EU must take decisive action to tackle online child sexual exploitation and abuse

New privacy laws introduced by the European Union (EU) to protect private online communications from being monitored by internet firms have caused widespread ambiguity over the legality of companies applying online detection mechanisms to identify and remove child sexual abuse material. As a consequence, it will become easier for perpetrators to groom and sexually exploit children without being discovered, and harder for law enforcement to investigate crimes and protect victims and the vulnerable.

Changes that took effect in December 2020 – when the European Electronic Communications Code brought messaging services within the scope of the EU’s ePrivacy Directive – have had the unintended effect of making voluntary detection “illegal” by banning big tech firms such as Facebook and Microsoft from using the automatic detection tools commonly utilised to identify images of child abuse or to detect online grooming.

Tools such as Microsoft’s PhotoDNA and Google’s CSAI Match rely on image and video hashing, machine-learning classifiers, and anti-grooming applications, and the vast majority of the world’s known child sexual abuse material is identified and reported in this way.
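To illustrate the general principle behind hash-based matching – PhotoDNA’s actual algorithm is proprietary and far more robust – here is a minimal Python sketch using a simple 64-bit perceptual “average hash”. The hash value in KNOWN_HASHES is a hypothetical placeholder, not real data:

```python
# Illustrative sketch only: a simple perceptual "average hash", not
# PhotoDNA. Visually similar images produce hashes differing in few bits.
from PIL import Image

def average_hash(path: str) -> int:
    """Shrink to 8x8 greyscale; set one bit per pixel brighter than the mean."""
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count the bits on which two 64-bit hashes differ."""
    return bin(a ^ b).count("1")

# Hypothetical stand-in for a vetted database of hashes of previously
# identified abuse images (real databases are maintained by bodies like NCMEC).
KNOWN_HASHES = {0x8F3A5C7E01B2D4F6}

def is_match(path: str, threshold: int = 5) -> bool:
    """Flag an image whose hash is within a few bits of a known hash."""
    h = average_hash(path)
    return any(hamming(h, k) <= threshold for k in KNOWN_HASHES)
```

Robust perceptual hashes are designed so that resizing, recompression, or small edits still produce a near-identical hash, which is what allows platforms to recognise previously identified material automatically and at scale.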

Risk

Millions of children globally are now at increased risk at a time when the COVID-19 pandemic is fuelling an alarming increase in online child sexual exploitation, facilitated by children and adults spending an unprecedented amount of time logged on. Compounding the problem is an exponential growth in the volume of digital content being produced, making it ever harder to locate illegal activity.

Online child sexual abuse is a borderless crime, with sex offenders in Europe able to use social media platforms to contact children around the world. Troublingly, Europe has become the epicentre for harmful content, with the Internet Watch Foundation’s 2019 Annual Report finding 89% of known URLs containing child sexual abuse hosted in European countries, up from 79% in 2018.

Europol’s database contains over 51 million unique images and videos of child sexual abuse. Adding to this repository relies on tech companies using technologies that identify harmful content, as well as on screening for and blocking offenders who upload and share abuse material.

Over the past year, voices across Europe have raised concerns about this spiralling problem. In June, the EU’s Commissioner for Home Affairs, Ylva Johansson, spoke of how Europe has become “the global blackspot in hosting child sexual abuse images,” and the scale of the problem was highlighted in the European Commission’s child sexual abuse strategy, which noted that demand for child sexual abuse material has increased by as much as 25% in some member states.

Digital rights

To protect children and bring perpetrators to account, the EU has proposed measures to ensure stronger cross-border cooperation between law enforcement agencies in different countries. The European Parliament and the Council of the European Union have called for more concrete action, and in July 2020 the European Commission announced the EU strategy for a more effective fight against child sexual abuse. This laid out minimum rules for governments on defining criminal offences and sanctions relating to child sexual abuse, both online and in person, and introduced provisions to strengthen the prevention of crimes and the protection of victims.

Privacy rights have also been at the forefront of concerns amongst European leaders, and to protect the digital rights of online users the EU has been developing regulations to clarify the role and liability of technology companies and platforms.

Privacy activists and some European lawmakers have argued that automatic scanning of digital content, including in chat and messaging apps, is a major infringement of people’s fundamental rights to privacy and data protection.

A draft EU Digital Services Act, which will be legally binding on member states, was released in December 2020, providing rules on dealing with disinformation, transparent advertising, and illegal content. In the same month, the European Electronic Communications Code came into effect. The Code aims to harmonise EU legal frameworks for electronic communications and to regulate the telecommunications sector throughout the EU. Its objectives include providing an improved level of consumer protection and maintaining the security of networks and services.

The Code is complemented by other directives and regulations, including the ePrivacy Directive, which applies to email, internet phone calls, instant messaging apps, and personal messaging provided through social media, as well as to traditional telecom providers.

Abuse

Both the Code and the ePrivacy Directive expose the difficulty of juggling competing needs. Fundamental questions arise about how to balance safeguarding an open internet and the protection of digital rights – which incorporate basic human rights to privacy and freedom of expression – against the need for limitations that protect online users, including children, from abuse.

The reaction from tech companies to the new regulations has been mixed. Facebook handles a staggering volume of private messages via its own Messenger service as well as through WhatsApp and Instagram, which it owns. Between July and September 2019 alone, Facebook removed 11.6 million images of child sexual abuse, including 754,000 from Instagram. As soon as the new rules took effect, the company switched off some of its child abuse detection tools in Europe, stating it had no choice because automatic scanning of private messages is now banned.

Other tech firms such as Microsoft, Google, LinkedIn, and Roblox have not made such changes, arguing that the most responsible approach is to keep the technology running while EU policymakers work to address the situation and develop a harmonised regulatory approach. However, despite their continued action, the impact of the Code could be devastating, setting back the EU’s own efforts to address online sexual abuse and exploitation.

The effects will be felt far beyond Europe, as the tools are used to identify child sexual abuse material located in countries across the world. Tech companies also report to the US-based National Center for Missing and Exploited Children (NCMEC), whose CyberTipline received a colossal 69 million files in 2019.

Illegal content

Whilst the reliance on voluntary action by tech companies is making a substantial impact in detecting and removing illegal and harmful content, tech companies cannot be left to make their own rules. Clear laws and policies are needed to place obligations on companies to moderate, detect, and remove illegal content. Laws that are both responsive and forward-looking can help ensure that online service providers can be trusted with the data they collect, process, and store.

The global ramifications of EU law demonstrate the need for a multi-jurisdictional approach. Governments and law enforcement agencies have to cooperate to bring offenders to account, and there is a strong case for developing international laws and standards that provide legal uniformity and clarity, along with agreed penalties for the online exploitation and abuse of vulnerable people.

The pressing urgency for the EU to take decisive action cannot be over-emphasised. There is some hope that the situation will be resolved as the EU has resumed discussions about solutions to enable tech companies to continue protecting children online.

Negotiations on the final text of the temporary derogation from the ePrivacy Directive – which would provide a legal basis for the use of online tools to detect child sexual exploitation and abuse – are due to finish on 26 January. Even as regional and national efforts progress, the ultimate goal for all governments should be the formulation and agreement of international standards.

By Tsitsi Matekaire, Global Lead on End Sex Trafficking at Equality Now

