Friday, October 4, 2024

Using safety tech to help stop people from watching child sexual abuse images and videos

By Larissa van Puyvelde, Minne De Boeck and Catherine McShane

Editor's note: the PROTECH tool and the consortium that produced it were funded by the EU's Horizon 2020 programme.

The growth of technology, ease of access to the internet and the ability to be anonymous online have fuelled the supply of, and demand for, child sexual abuse images and videos online. Millions of child sexual abuse images and videos are distributed on the internet every year, and these are only the ones we know about because they are reported by organisations like the Internet Watch Foundation in the UK or the National Center for Missing & Exploited Children in the US.

The impact on survivors of child sexual abuse is horrific, as imagery of their abuse can continue to circulate online long after they may have been rescued and safeguarded. Survivors suffer repeated victimisation whenever the record of their ordeal is shared and viewed. Traditional law enforcement approaches struggle to cope with the volume and scale of the global problem.

New ways to tackle this complex issue urgently need to be explored. As part of its strategy for a more effective fight against child sexual abuse, the EU has been funding several studies aimed at early intervention.

One such project, Protech, involves the development, implementation and evaluation of a safety tech tool to stop the viewing and distribution of child sexual abuse material (CSAM). The intention is for it to be used as part of a prevention programme by people who are at risk of viewing CSAM. The project is at its midway point, and what follows are early insights into its progress so far.

To develop the tool, the Protech project has brought together experts from the European Union (Belgium, the Netherlands, Germany and Ireland[i]) and the United Kingdom, from diverse backgrounds including criminology, public health, clinical and forensic psychology, technology, child protection and internet safety.

The aim is to create on-device technology in the form of an app that identifies sexual abuse images of children, prevents them from reaching the screen of the user's device, and displays blocking messages warning of harmful content.

The design of the safety tech started with the 'for and by' principle: the intended users would help inform how the technology should be shaped, to ensure that the app could effectively support them in stopping their risky behaviour and significantly reduce the viewing of, and demand for, sexual imagery of children on internet-connected devices.

To do this, the project team invited two cohorts of participants to give input on how the technology should work and what was needed to ensure that it would be used. The first group included 30 individuals at self-reported risk of CSAM offending – those likely to want to view images and videos of child sexual abuse. The second group consisted of service providers, mainly therapists of CSAM users, who participated in focus groups.

All participants were asked to share their opinions and concerns regarding four domains for the app’s development: privacy issues (1), blocking functionality (2), potential interactivity (3) and possible deployment methods (4).

The results show that privacy (domain 1) was a challenging point, as most respondents indicated that they were concerned about the storage of personal data, data security and the potential legal consequences. Privacy therefore had to be treated as a core consideration around which the app was developed.

For domains 2 and 3, Protech researchers could not find consensus among the responses. Overall, the results showed that every respondent would like to personalise features of the app. This highlighted the need to make the app customisable, for example in terms of blocking messages or an optional pornography filter, because respondents were divided on whether access to adult pornography would help them control their impulses or make them worse. Some felt that adding certain features, such as a diary, would be helpful. Participants also advocated for access to FAQs relating to use of the app, and for an option to provide feedback to the developers.
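To give a sense of what this customisation might look like in practice, the sketch below models the requested options as a simple per-user settings object. It is purely illustrative: the field names and defaults are assumptions, not taken from the Protech project or the Salus app.

```python
from dataclasses import dataclass


@dataclass
class AppSettings:
    """Hypothetical per-user settings reflecting the customisation requests."""
    blocking_message: str = "This content has been blocked."  # personalised warning text
    filter_adult_pornography: bool = False  # optional, user-controlled filter
    diary_enabled: bool = False             # optional self-reflection diary
    show_faq_link: bool = True              # access to FAQs about using the app
    allow_feedback: bool = True             # option to send feedback to developers


# Example: a user who wants a personalised blocking message and the diary feature.
settings = AppSettings(
    blocking_message="Stop. Think about why you installed this app.",
    diary_enabled=True,
)
print(settings)
```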

Among the participants, there was no unanimity on how best to download the app (domain 4). Some thought it would be easier to download it through an official channel; others felt that anonymity was important. Through discussion with the app's developers, it was decided to offer the download link through an external portal, judged the most secure method for deploying the app.

These findings shaped the first version of the app, called Salus, which is now ready to be tested in a pilot stage. During this stage, the project will assess whether the app, or similar technology, and its implementation have the potential to be part of an effective prevention programme. The app uses machine learning alongside conventional techniques, such as keyword or URL matching for detecting known imagery of child sexual abuse, to block CSAM from appearing on the screen of the user's device.
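At a conceptual level, a filter of this kind can combine simple blocklist matching with a score from an on-device classifier. The sketch below is not the Salus implementation: the blocklists, function name and threshold are hypothetical placeholders used only to illustrate how conventional matching and a machine-learning score might be combined.

```python
from urllib.parse import urlparse

# Illustrative placeholders only; a real deployment would rely on vetted,
# regularly updated lists maintained by child-protection organisations.
BLOCKED_DOMAINS = {"example-blocked-site.invalid"}
BLOCKED_KEYWORDS = {"example-keyword"}

CLASSIFIER_THRESHOLD = 0.9  # assumed confidence cut-off for the on-device model


def should_block(url: str, page_text: str, classifier_score: float) -> bool:
    """Decide whether to block content before it reaches the screen.

    Combines conventional techniques (domain and keyword matching against
    known material) with a score from an on-device machine-learning model.
    """
    domain = urlparse(url).netloc.lower()
    if domain in BLOCKED_DOMAINS:
        return True
    text = page_text.lower()
    if any(keyword in text for keyword in BLOCKED_KEYWORDS):
        return True
    # Fall back to the classifier score for previously unseen material.
    return classifier_score >= CLASSIFIER_THRESHOLD


if __name__ == "__main__":
    if should_block("https://example-blocked-site.invalid/page", "", 0.0):
        print("Content blocked: a warning message would be shown instead.")
```

In practice the blocking decision would be made on the device itself, consistent with the privacy concerns raised in domain 1, so that no browsing data needs to leave the user's device.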

The Protech project relies on voluntary participation, and participants will be able to make informed decisions throughout the pilot, with transparency about how the app functions, how onboarding works and how to withdraw. There are, and always will be, ways for users to circumvent technology such as this, but the defining characteristic of the Protech target group is that these are individuals who want to take part, who want to get help and who have volunteered to do so.

The three-month pilot will be rolled out to 30-50 participants in a treatment programme and will consist of an intervention group, which will test the app in practice. All participants will be sent a survey at two different time points, with questions about their wellbeing, their (online viewing) behaviour and their opinions on the safety tech.

The final step of the project will draw on the survey findings to evaluate the effectiveness and outcomes of the pilot. Project partners will also assess what has been achieved and learned, and address any questions about the technology, its implementation and the pilot that emerge from the project.

If effective, this novel use of technology to help at-risk individuals control their viewing of CSAM could serve as a blueprint for using safety tech for perpetrator prevention across the EU, but it is by no means a silver bullet. Ideally, any such technology would be implemented as part of a wider, holistic package of measures intended to help prevent child sexual abuse and exploitation, further protecting victims and alleviating the workload of law enforcement.



[i] The Protech project members are Charité – Universitätsmedizin Berlin; Tilburg University; The Lucy Faithfull Foundation; Offlimits/Stop it Now Netherlands; the University Forensic Center/ Stop it Now! Flanders at University Hospital Antwerp/University of Antwerp; SafeToNet; the International Policing and Public Protection Research Institute (IPPPRI) at Anglia Ruskin University and the Internet Watch Foundation.

 
