Thursday, May 8, 2025

Evolutions and Challenges in Preventing Online Child Sexual Abuse: A Conference Report

By Minne De Boeck, Larissa Van Puyvelde, & Kasia Uzieblo

Introduction

Online sexual abuse is a global issue, and traditional approaches are increasingly inadequate in addressing its evolving forms. Various stakeholders are encountering limitations in their ability to contain the exponential rise of, among other forms, child sexual abuse material (CSAM) (Lee et al., 2020). In recent years, another major challenge has emerged: artificial intelligence (AI). For this reason, the University Forensic Center (UFC), together with PROTECH and funded by the European Union, organized a conference at the University of Antwerp on January 28, 2025, focusing on new evolutions and challenges in online sexual abuse. This blog presents findings from studies on individuals who have engaged in CSAM offending, hereafter referred to as CSAM offenders for brevity and clarity in presenting group-level data.

Research on CSAM Offenders

Nina Vaaranen-Valkonen from Protect Children and Hanna Lahtinen from the University of Eastern Finland presented an overview of their various studies on online child sexual abuse. The first study aimed to gain deeper insight into the motivations of CSAM offenders. More than 72,000 individuals who searched for CSAM on the dark web participated in the study. Findings showed that 70% of these individuals were searching for CSAM for the first time, 42% had previously contacted a minor, and 55% wanted to change their behavior. Given that over half expressed a desire for behavioral change, it is essential to offer accessible and appropriate support.

They also studied the channels used to access CSAM. The study found that 77% of respondents encountered CSAM on the open web, 32% via pornography websites, and 29% through social media platforms. Image sharing occurred via social media in 32% of cases. These findings debunk the idea that online CSAM is mainly or exclusively found on the dark web. Moreover, contact with minors mainly took place via social media, messaging apps, and online games (Insoll et al., 2024a).

The 2KNOW project also investigated factors contributing to illegal sexual behavior. A total of 4,549 respondents completed the survey on the dark web, of whom 68% were men. Seventy percent of participants first encountered CSAM before the age of 18. Most participants sought material featuring girls between the ages of 11 and 14, and nearly half reported having contacted a minor. The study identified several factors contributing to CSAM offending among individuals active on the dark web, including motivations (e.g., paraphilias), facilitating and situational factors (e.g., pornography exposure, internet anonymity), and certain barriers (e.g., deterrence campaigns, preventative interventions) (Insoll et al., 2024b).

Finally, the 2KNOW project also explored differences among CSAM offenders (n = 3,782). The research distinguished between (1) individuals convicted of sexual offenses against minors, (2) those convicted of sexual offenses against adults, and (3) individuals with no convictions. The main findings indicate that most participants sought material featuring girls, and a significant portion reported a sexual interest in children aged 1 to 3 years.

The majority of convicted offenders sought CSAM based on a sexual interest in children. The non-convicted group showed higher levels of desensitization to adult pornography, and some reported seeking inappropriate or illegal physical contact with minors. The study further confirmed distinctions between the three groups. Convicted offenders had more extensive criminal histories, were more likely to engage in grooming and direct contact with minors, and showed stronger preferences for young children (Lahtinen et al., 2025).

Investigation and Law Enforcement

Kevin Reulens from the Federal Judicial Police (FGP) elaborated on the digital evolution of online sexual abuse from an investigative standpoint. The sexual development of minors often occurs online, which is also exploited by offenders, with the internet acting as a catalyst. This leads to the rapid spread of self-generated material and the eventual loss of control by the victim, resulting in a vast amount of CSAM becoming unintentionally available.

Technological developments also involve generative AI, which is increasingly used to create CSAM, with growing realism, more extreme content, and at greater volume. Additionally, all information on how to use this technology is being shared on abuse platforms, increasing accessibility.

Extended Reality (XR) was also discussed, highlighting the rise of platforms that integrate VR devices with audio and video to simulate, for example, touch. The conclusion: these technological developments are unprecedented and evolving at an extremely rapid pace, posing complex challenges for law enforcement, prevention efforts, and evidence-based responses.

Prevention and Treatment (Innovations) in CSAM Offenders

Ida Oeverland from the Lucy Faithfull Foundation (LFF) presented their treatment approach for CSAM offenders. LFF aims to create a safe group setting, offering information about (legal) consequences, psychosocial support, and practical guidance. The content of the programs covers various topics including offending behavior models, self-care, relationships, and relapse prevention. Sessions follow a cognitive-behavioral approach, with a focus on individual responsiveness. In 2023, 296 people participated. Participants generally reported a better understanding of their own behavior patterns and progress in managing their risk of offending. A positive impact on mental health was also noted.

Hannes Gieselier from Charité explained the STOP-CSAM project, which uses therapeutic chat interventions with adult CSAM offenders. These involve four conversations covering topics such as understanding offending behavior, acceptance of sexual preferences, personal values, emotion regulation, problem management, and a safety and change plan. Research comparing intervention and control groups showed that these interventions had a positive effect on reducing illegal viewing of CSAM. Key takeaways included the importance of targeted, accessible, and rapid therapeutic interventions to facilitate behavioral change. However, maintaining privacy and anonymity is crucial for effective reach.

Minne De Boeck from the University Forensic Center (UFC) and Stop it Now! Flanders presented their one-on-one online support approach for CSAM offenders. The Rethink project uses an AI-based chatbot linked to a deterrence message when users search for banned terms on legal porn sites. This aims to improve referral to support services. A blended care approach has been developed, linked to the online self-help module for individuals who have viewed CSAM. Topics include expectations, problem definition and insight, safety planning, and future support needs. De Boeck also explored the potential role of AI in specific interventions related to CSAM prevention. Benefits include improved accessibility, better referral, faster and more efficient care, and support for training. However, challenges remain, such as the sensitive nature of the topic and target group, risk assessment, and various ethical dilemmas (e.g., lack of human contact, safety evaluations).

Finally, Larissa Van Puyvelde from UFC introduced the PROTECH project, which developed an app called Salus with input from CSAM offenders themselves. The app uses machine learning (an AI application) to prevent the illegal viewing of CSAM. It is voluntarily installed on users' devices and monitors internet traffic for CSAM or, where relevant, legal porn use. When detected, a blocking message appears and the content is blocked. Built with privacy and anonymity in mind, the app was tested in four countries at treatment centers and through Stop it Now!, with 38 CSAM offenders over a period of three to six months. In addition to using the app, participants were asked to complete surveys at fixed intervals and an interview at the end of the pilot. Most participants reported that Salus functioned as expected and did not interfere with device use. Some technical difficulties were noted (e.g., issues with other apps, slow internet, false positives). Still, the majority found Salus helpful, necessary, and supportive, and believed it could be beneficial in therapy.

Conclusion

Rapidly evolving technological developments clearly present various challenges in terms of detection and prevention. But at the same time, these developments offer new opportunities in terms of innovative prevention strategies and accessing otherwise hard-to-reach groups. Since online technology knows no boundaries, it is essential that we, as professionals, also know no boundaries in developing collaborations and exchanging ideas to strengthen each other in the fight against online sexual abuse.

References

Insoll, T., Soloveva, V., Díaz Bethencourt, E., Ovaska, A., & Vaaranen-Valkonen, N. (2024a). Tech Platforms Used by Online Child Sexual Abuse Offenders. Research Report.

Insoll, T., Soloveva, V., Díaz Bethencourt, E., Nieminen, N., Leivo, K., Ovaska, A., & Vaaranen-Valkonen, N. (2024b). What Drives Online Child Sexual Abuse Offending? Understanding Motivations, Facilitators, Situational Factors, and Barriers. 2KNOW project. https://www.suojellaanlapsia.fi/en/2know-research-report

Lahtinen, H., Honkalampi, K., Insoll, T., Nurmi, J., Quayle, E., Ovaska, A. K., & Vaaranen-Valkonen, N. (2025). Investigating the disparities among child sexual abuse material users: Anonymous self-reports from both charged and uncharged individuals. Child Abuse & Neglect, 161, 107299. https://doi.org/10.1016/j.chiabu.2025.107299

Lee, H.-E., Ermakova, T., Ververis, V., & Fabian, B. (2020). Detecting child sexual abuse material: A comprehensive survey. Forensic Science International: Digital Investigation, 34, 301022. https://doi.org/10.1016/j.fsidi.2020.301022
