The EU Commission is proposing to scrutinize the internet more closely in the hunt for sex offenders. Critics fear the rules will enable widespread surveillance. Meanwhile, figures on abuse imagery show a frightening trend.

In the fight against the sexual abuse of children, the internet could be screened much more intensively in the future. According to a draft law presented on Wednesday, providers such as Google or Facebook could be obliged to use software to search their services for such depictions. In addition, an EU center is to be set up that would, among other things, provide the appropriate technology. “We will find you,” EU Home Affairs Commissioner Ylva Johansson said, addressing the criminals.

The net is currently being flooded with such material, and the problem is growing. According to the EU Commission, 85 million images and videos showing sexual abuse of children were reported worldwide in 2021, and the number of unreported cases is believed to be significantly higher. The Internet Watch Foundation recorded a 64 percent increase in reports of confirmed child sexual abuse in 2021 compared to the previous year.

According to Johansson, the perpetrators are often people the child trusts. “And these crimes very often remain in the dark until the perpetrator publishes them online.” It is often the photos and videos that make criminal prosecution possible in the first place. The fact that images of serious sexual abuse of children are increasingly finding their way onto the internet is also due to the culture of exchange among criminals: to obtain child pornography from other perpetrators, it can be a prerequisite to stream the rape of a child live.

Such extreme examples are just the tip of the iceberg. Johansson pointed to a Swedish study in which 80 percent of the girls surveyed between the ages of ten and 13 said they had already received unsolicited nude pictures from unknown adults. “I think I have a large majority of citizens on my side,” said the Swede, referring to her draft law.

Specifically, the draft states that companies must assess how great the risk is that depictions of abuse will be spread via their services, or that so-called grooming – adults contacting minors with the intention of abusing them – will take place there. If a significant risk is found, national authorities or courts can order that content be automatically checked by software and criminal material detected.

According to the draft law, the technology used for this should not be able to extract any information other than that which indicates the dissemination of abusive material; the same applies to grooming. The software should also be designed to intrude as little as possible into the privacy of users.

The draft law does not specify which technology is to be used. It is therefore unclear how the screening of network content would be technically implemented and whether, for example, the encryption of messages could be circumvented. However, providers must specifically ensure that children cannot download apps that pose an increased risk of grooming, and that depictions of abuse are deleted or blocked. They must also be able to determine whether an account belongs to a minor or an adult.

The EU Parliament and the EU member states must now discuss the proposal and agree on a final version, so changes are still possible.

The first reactions were mixed. Federal Interior Minister Nancy Faeser (SPD) welcomed the proposal. “With a clear legal basis, binding reporting channels and a new EU center, we can significantly strengthen prevention and criminal prosecution across the EU,” she said. “The fact that we will oblige companies to recognize and report the sexual abuse of children in the future is an important and long overdue step in the fight against child abuse,” said the domestic policy spokeswoman for the CDU/CSU group in the European Parliament, Lena Düpont.

The FDP MEP Moritz Körner, on the other hand, spoke of a “Stasi 2.0”, fearing invasions of citizens’ privacy. Konstantin von Notz of the Greens criticized that private companies could be obliged to systematically scan private text, image and video content. “There are massive doubts that this is compatible with applicable European and German fundamental rights and ECJ case law.”

The SPD MEP Tiemo Wölken described the software intended to detect such content as a “horror filter”. The regulation, he said, merely pretends that privacy and data protection are guaranteed. “The text is also impenetrable and confusing,” Wölken wrote on Twitter.
