The deployment of warning messages is a critical strategy in the fight against child sexual abuse material (CSAM) online. Warning messages are a form of situational crime prevention, which focuses on reducing opportunities for crime by altering the environment in which potential offenders operate. When users encounter warning messages while searching for or attempting to access CSAM, they are reminded of the legal and ethical implications, which can discourage them from proceeding. This approach aligns with situational crime prevention principles, which include increasing the effort required to commit a crime, increasing the risks associated with the crime, reducing the rewards, and removing excuses for the behaviour.
By strategically placing warning messages in online environments such as search engines, websites, and social media platforms, the likelihood of individuals engaging in CSAM-related activities is reduced. This contributes to a safer digital space and enables technology companies to fulfil their duty of care to their users and the wider community.
A key practical advantage is that warning messages are inexpensive to deploy. Detection can be automated and the cost per message is extremely low, allowing the intervention to scale to platforms with millions or billions of users without imposing a substantial financial or computational burden on the platform.
Primary, Secondary, and Tertiary Interventions
Before examining the specific contexts for deploying warning messages, it is essential to understand the broader framework of interventions for CSAM prevention:
- Primary Interventions: These are proactive measures aimed at preventing the onset of CSAM-related behaviours. They include public awareness campaigns, education programs in schools, and parental guidance initiatives that inform children and adults about the dangers of CSAM and how to stay safe online.
- Secondary Interventions: These measures target individuals at risk of engaging in CSAM-related activities, or at higher risk of becoming a victim. Warning messages fall into this category: in relation to offenders, they aim to deter individuals from accessing or distributing CSAM by increasing their awareness of the legal and ethical consequences. For potential victims, warning messages can alert them to offending behaviours and inform them of support and reporting mechanisms.
- Tertiary Interventions: These interventions focus on individuals already engaged in CSAM-related activities. They include therapeutic programs, legal actions, and rehabilitation efforts designed to prevent reoffending and support victims.

Warning messages can be deployed in a wide range of contexts, as shown in the image above. They can be triggered when users attempt to locate CSAM (e.g. via search) or access it (e.g. via a URL); images and videos can be scanned, on upload or as they are shared, to determine whether they contain known CSAM; and other automated approaches to detecting grooming and related harmful behaviours can also serve as triggers.
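The "known CSAM" trigger is typically implemented by comparing uploaded material against a hash list of previously identified content. As a minimal sketch, the snippet below checks an upload's cryptographic hash against a hypothetical hash list and returns a decision to show a warning message on a match; production systems instead use perceptual hashing (e.g. PhotoDNA) so that near-duplicates also match, and the hash value, function names, and decision labels here are illustrative assumptions, not any platform's real API.

```python
import hashlib

# Hypothetical hash list of known harmful material, as distributed to
# platforms by child-protection organisations. The single entry below is
# a placeholder: it is simply the SHA-256 digest of the bytes b"test".
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def check_upload(data: bytes) -> str:
    """Compare an upload against the known-hash list and decide whether
    to block it and display deterrence messaging, or allow it through."""
    if sha256_of(data) in KNOWN_HASHES:
        return "WARNING_MESSAGE"  # block upload, show the warning message
    return "ALLOW"

print(check_upload(b"test"))         # matches the placeholder hash list
print(check_upload(b"holiday-photo"))  # no match, upload proceeds
```

Note that exact cryptographic hashing, as used here for simplicity, fails if a file is re-encoded or resized even slightly, which is why deployed systems rely on perceptual hashes that are robust to such transformations.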
Warning messages and other interventions provide an opportunity for the user to reflect upon their intention and behaviour. The situational nature of their deployment, within a context where the user is attempting to commit a crime, creates friction and gives the offender a chance to change course. Therapeutic-themed messaging uses this opportunity to direct users to support services, such as Stop It Now. Tertiary interventions can also be paired with warning messages; the added friction reminds the user of the harm they are causing and of the ethical and legal implications of their behaviour.