Danielle Williams, Resolver


Since the advent of the internet, extremist groups and child sexual abuse (CSA) perpetrators have exploited digital platforms to recruit, communicate, and commit offences, with a growing shift toward more enclosed spaces such as gaming-adjacent platforms (e.g., Discord, Twitch, and Steam) and the Dark Web (e.g., forums like the now-dismantled Boystown). These actors use such environments to evade detection and build communities, as seen in high-profile cases such as the Christchurch mosque shooter, who engaged with extremist ideologies via gaming and fringe platforms, and in operations such as the FBI's Operation Pacifier and the UK's Operation Delilah, which exposed large-scale online CSA networks. Despite successful interventions, including Europol's takedown of Boystown and Microsoft's Project Artemis for detecting child grooming, law enforcement agencies and tech companies continue to face challenges posed by encryption, anonymity, and the speed at which harmful content spreads. Drawing on research and investigatory case studies, this panel examines how policing agencies, tech firms, and researchers can collaborate to counter online radicalisation and child exploitation, and explores the effectiveness of strategies such as AI-driven monitoring, undercover digital operations, and community-based prevention.