Wednesday, October 9, 2024

Million Mile Reflections

By David S. Prescott, LICSW, and Kieran McCartan, PhD

ATSA’s Sexual Abuse blog reached a milestone in the past few weeks: It has now been read well over a million times. This is the blog’s 546th post, and it was recently rated as the world’s 5th most-read blog in the area of sexual abuse. We regard this last statistic as particularly welcome; ATSA’s blog focuses on different areas from those of organizations such as RAINN, NCMEC, and the Centre for Expertise on Child Sexual Abuse, which often work more on raising awareness of common issues and challenges among those who have been victimized. They are less likely to focus on areas like risk assessment, treatment innovations, or the use of the polygraph.

The blog itself started as a venue for discussing studies published in the journal Sexual Abuse. Robin Wilson was the original “blogger.” Kieran remembers getting the email asking him to take over from Robin in the summer of 2014; he was on a train from Leeds to Bristol after attending a Circles of Support and Accountability conference. David was already contributing at that point and began participating more regularly when Kieran came on as chief blogger. The blog became more regular, moving to three or four posts a month, and its content expanded in other directions: taking note of innovations in assessment and treatment, sometimes mentioning trainings or reviewing conferences, and occasionally providing reviews of multiple studies. Since 2014, Jon Brandt, Alissa Ackerman, and Kasia Uzieblo have all served as Assistant Bloggers. Others have often stepped in to provide guest posts, including Joan Tabachnick, Cordelia Anderson, members of ATSA’s Prevention Committee, Don Grubin, Norbert Ralph, and many others. We have been grateful to them all.

The aim of the blog has always been two-fold: communication and upskilling. It’s important to emphasise the importance of talking about sexual abuse and making it a true lived reality, so that people and communities can understand and own it. Because both Kieran and David write academically and professionally as part of their day jobs, they understand the importance of research and the evidence base, but academic writing is not always the best way to communicate issues to different populations. We must talk to people where they are and in a way that appeals to them. Writing the blog has helped communicate our understandings of sexual abuse and, we hope, upskilled a range of populations, encouraging further debate and insight. The most important thing that the blog can do is make people stop and think; hopefully we have done this over the years.

If there has been anything we’ve learned by watching which posts garner the most attention and feedback, it has been the importance of how we all frame our messages. Guitarist and composer Frank Zappa once observed that “the most important thing in art is the frame.” Otherwise, he noted, it’s all just a bunch of stuff on the wall. So it is with the work of responding to and preventing sexual abuse. Several ATSA conference plenary speakers in the early and mid-2000s, along with our most recent ATSA Lifetime Achievement Award winner Joan Tabachnick and Gail Burns-Smith Award winner Alissa Ackerman (both past and present bloggers), have challenged us to consider how we construct our messages and arrange them in a way that meshes with the public’s highest aspirations. For example, we don’t simply work in prisons or civil commitment centers; our work really is about preventing sexual abuse from occurring and recurring. From there, we’ve learned the importance of writing tight prose for a general audience. If anything, the proof of this approach lies in the fact that ATSA’s blog posts are rarely misunderstood.

Friday, October 4, 2024

Using safety tech to help stop people from watching child sexual abuse images and videos

By Larissa van Puyvelde, Minne De Boeck and Catherine McShane

Editor’s note: The PROTECH tool and the consortium that produced it were funded by the EU Horizon 2020 programme.

The growth of technology, ease of access to the internet, and the ability to be anonymous online have fuelled the supply of and demand for child sexual abuse images and videos online. Millions of child sexual abuse images and videos are distributed on the internet every year, and these are only the ones we know about because they are reported by organisations like the Internet Watch Foundation in the UK or the National Center for Missing & Exploited Children in the US.

The impact on survivors of child sexual abuse is horrific, as imagery of their abuse can continue to circulate online long after they may have been rescued and safeguarded. Survivors suffer repeated victimisation whenever the record of their ordeal is shared and viewed. Traditional law enforcement approaches struggle to cope with the volume and scale of the global problem.

New ways to tackle the complex issue urgently need to be explored and, as part of the EU’s strategy for a more effective fight against child sexual abuse, the EU has been funding several studies aimed at early intervention.

One such project, Protech, involves the development, implementation and evaluation of a safety tech tool to stop the viewing and distribution of child sexual abuse material (CSAM). The intention is for it to be used as part of a prevention programme by people who are at risk of viewing CSAM. The project is at its mid-point, and what follows are early insights into its progress so far.

To develop the tool, the Protech project has brought together experts from the European Union (Belgium, the Netherlands, Germany and Ireland[i]) and the United Kingdom, from diverse backgrounds including criminology, public health, clinical and forensic psychology, technology, child protection and internet safety.

The aim is to create on-device technology in the form of an app that identifies and prevents sexual abuse images of children from reaching the screen of the user’s device and that displays blocking messages warning of harmful content.

The design of the safety tech started with the 'for and by' principle, meaning that the intended users would help inform how the technology should be shaped, to ensure that the app could effectively support them in stopping their risky behaviour and significantly reduce the viewing of, and demand for, sexual imagery of children on internet-connected devices.

To do this, the project team invited two cohorts of participants to give input on how the technology should work and what was needed to ensure that it would be used. The first group included 30 individuals at self-reported risk of CSAM offending – those likely to want to view images and videos of child sexual abuse. The second group, which participated in focus groups, consisted of service providers, mainly therapists who work with CSAM users.

All participants were asked to share their opinions and concerns regarding four domains for the app’s development: (1) privacy issues, (2) blocking functionality, (3) potential interactivity and (4) possible deployment methods.

The results show that privacy (domain 1) was a challenging point, as most respondents indicated that they were concerned about the storage of personal data, data security and the potential legal consequences. Sensitivity around privacy therefore needed to be treated as a core point around which the app was developed.

For domains 2 and 3, Protech researchers could not find consensus across the responses. Overall, the results showed that every respondent would like to personalize features of the app. This highlighted the need to make the app customizable, for example in terms of the blocking messages or an optional pornography filter, because some respondents felt that access to adult pornography would either help them control their impulses or perhaps make them worse. Some felt that adding certain features, such as a diary, would be helpful. Participants also advocated for access to FAQs relating to use of the app, and for an option to provide feedback to the developers.

Among the participants, there was no unanimity on how best to download the app (domain 4). Some thought it would be easier to download it through an official channel, while others considered anonymity more important. Through discussion with the app’s developers, it was decided to offer the download link through an external portal, the most secure method for deploying the app.

These findings informed the first version of the app, called Salus. Salus is now ready to be tested in a pilot stage, during which it will be assessed whether the app, or similar technology, and its implementation have the potential to be part of an effective prevention programme. The app uses machine learning and conventional techniques (such as keyword or URL matching for detecting known imagery of child sexual abuse) to block CSAM from appearing on the screen of the user’s device.
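To make the “conventional techniques” concrete, the sketch below illustrates, purely as an assumption about how such matching might work and not as a description of Salus itself, how a blocklist of known URLs, keywords and image digests could be consulted on the device before content is rendered. The lists and function names are hypothetical, and the machine-learning classifiers the project describes are not shown.

```python
import hashlib
import re
from typing import Optional

# Hypothetical, illustrative blocklists. In a real deployment these would be
# supplied and updated by hotline/watchdog organisations, not hard-coded.
BLOCKED_URLS = {"example.invalid/known-abuse-page"}
BLOCKED_KEYWORDS = [re.compile(r"\bexample-banned-term\b", re.IGNORECASE)]
BLOCKED_IMAGE_HASHES = {"placeholder-digest"}  # digests of known illegal imagery


def url_is_blocked(url: str) -> bool:
    """Return True if the URL matches a known-bad URL or keyword pattern."""
    normalised = url.lower().strip()
    if any(blocked in normalised for blocked in BLOCKED_URLS):
        return True
    return any(pattern.search(normalised) for pattern in BLOCKED_KEYWORDS)


def image_is_blocked(image_bytes: bytes) -> bool:
    """Return True if the image's digest matches a known-imagery hash.

    A production system would more likely use a perceptual hash (robust to
    resizing and re-encoding) alongside an ML classifier; a plain SHA-256
    digest is shown only to illustrate the matching step.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in BLOCKED_IMAGE_HASHES


def render_decision(url: str, image_bytes: Optional[bytes] = None) -> str:
    """Decide whether to display content or show a blocking message instead."""
    if url_is_blocked(url) or (image_bytes is not None and image_is_blocked(image_bytes)):
        return "BLOCK: show warning message instead of the content"
    return "ALLOW: render content normally"
```

One design implication consistent with the privacy concerns raised under domain 1 is that matching of this kind can run entirely on the device, so that browsing data and images need not be sent to a server for checking.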

The Protech project relies on voluntary participation, and participants will be able to make informed decisions throughout the pilot, with transparency about how the app functions, how onboarding works and how to withdraw from it. There are, and always will be, methods by which users could circumvent technology such as this, but the important characteristic of the Protech target group is that these are individuals who want to take part, who want to get help and who have volunteered to do so.

The three-month pilot will be rolled out to 30-50 participants in a treatment programme and will consist of an intervention group, which will test the app in practice. All participants will be sent a survey at two time points, with questions regarding their wellbeing, their (online viewing) behaviour and their opinions on the safety tech.

The final step of the project will draw on the survey findings and evaluate the effectiveness and outcomes of the pilot. Furthermore, project partners will evaluate what has been achieved and learned, as well as address any questions regarding the technology, its implementation and the pilot that emerge from the project.

If effective, this novel use of technology to help at-risk individuals control their viewing of CSAM could serve as a blueprint for using safety tech for perpetrator prevention across the EU, but it is by no means a silver bullet. Ideally, any such technology would be implemented as part of a wider, holistic package of measures intended to help prevent child sexual abuse and exploitation, further protecting victims and alleviating the workload of law enforcement.



[i] The Protech project members are Charité – Universitätsmedizin Berlin; Tilburg University; The Lucy Faithfull Foundation; Offlimits/Stop it Now Netherlands; the University Forensic Center/ Stop it Now! Flanders at University Hospital Antwerp/University of Antwerp; SafeToNet; the International Policing and Public Protection Research Institute (IPPPRI) at Anglia Ruskin University and the Internet Watch Foundation.