The Prevalence of CSAM Offending

Child sexual abuse material (CSAM) is a severe and pervasive issue that concerns communities globally. CSAM offending has escalated dramatically with the advent of the internet and digital technologies, making it a critical area of concern for law enforcement, policymakers, NGOs, and technology companies. This summary outlines the frequency of CSAM offending, particularly in Australia, to provide context for the work of the CSAM Deterrence Centre.

The Scale of the Problem

The scale of CSAM offending is alarming. According to the US-based National Center for Missing & Exploited Children (NCMEC), there were 20.5 million reports of suspected child sexual exploitation in 2024, including 73,370 reports from Australia.

The Australian Centre to Counter Child Exploitation (ACCCE) recorded 58,503 reports of online child abuse in the 2023-24 financial year, averaging 160 reports per day (4,875 reports a month) and representing a 45% increase from the previous year’s total of 40,232 reports.
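
As a quick consistency check, the short Python sketch below reproduces the quoted averages and the year-on-year increase from the raw annual totals (assuming the 366-day 2023-24 financial year; a 365-day year gives the same rounded results).

```python
# Consistency check on the ACCCE 2023-24 reporting figures quoted above.
reports_fy2324 = 58_503   # reports of online child abuse, FY 2023-24
reports_fy2223 = 40_232   # previous financial year's total

per_day = reports_fy2324 / 366    # FY 2023-24 includes 29 Feb 2024
per_month = reports_fy2324 / 12
yoy_increase = (reports_fy2324 - reports_fy2223) / reports_fy2223

print(f"~{per_day:.0f} reports per day")        # ~160
print(f"~{per_month:.0f} reports per month")    # ~4875
print(f"{yoy_increase:.0%} year-on-year rise")  # 45%
```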

The global reach of CSAM offending is further evidenced by the increasing number of reports received by international organisations. For instance, the Internet Watch Foundation (IWF) received 424,047 reports in 2024, an 8% increase on 2023. Over the past five years, the IWF has acted to remove more than one million webpages containing CSAM.

These statistics highlight the rapid growth of CSAM and the growing burden on law enforcement agencies. The volume of reports consistently outpaces law enforcement capacity, underscoring the urgent need for stronger preventative measures and the involvement of technology companies and other organisations in combating this issue.

Children at Risk

According to the Australian Child Maltreatment Study (ACMS), 1 in 4 Australians experienced sexual abuse during childhood. The ACMS survey of 3,500 individuals aged 16–24 found that 17.7% of children and young people surveyed had experienced online sexual solicitation by an adult, with roughly 1 in 4 girls (26.3%) propositioned, a significantly higher rate than boys (7.6%). Of those who had experienced solicitation, 80% said it started by age 15 and 25% reported it began before age 12, primarily from individuals they had met online. The same survey found the national prevalence of non-consensual sharing of sexual images of a child before age 18 to be 7.6%.

Australian children are increasingly at risk of becoming victims of CSAM through sexual extortion, a form of online blackmail in which offenders coerce victims into sending sexual images and then threaten to share those images unless their demands are met. The ACCCE received an average of 93 reports of sexual extortion per month in the first half of 2024. This crime often involves organised crime syndicates targeting Australian teenagers, exploiting their vulnerability and trust. The psychological impact on victims is profound, leading to severe emotional distress, anxiety, and long-term trauma.

Offender Profiles and Behaviours

Research suggests that many individuals who use CSAM (but do not also sexually abuse children offline) first encounter it accidentally. Surveys of anonymous CSAM offenders found that half of respondents said their first exposure to CSAM was accidental (Insoll et al., 2021; Napier et al., 2025), in many cases before they turned 18. This accidental exposure can act as a gateway to further engagement with CSAM, highlighting the importance of early intervention and prevention strategies.

CSAM offenders come from diverse backgrounds, and their behaviours vary widely. In a 2023 study by UNSW and Jesuit Social Services of 1,945 Australian men, approximately one in six (15.1%) reported experiencing sexual feelings towards children. Across the sample, 9.4% reported having ever sexually offended against children, 2.5% had knowingly viewed CSAM as an adult, 1.8% had engaged in sexual activities via webcam with a child, and 1.7% had paid for online sexual interactions, images, or videos involving a child. These figures indicate that a substantial number of men in the general community report behaviours linked to CSAM offending.

On a per-capita basis, Australians accounted for the highest number of suspicious financial transactions linked to child sexual abuse in the Philippines over 2020–22. In the Philippines, an estimated 1 in 100 children were trafficked in 2022 to produce CSAM for paying online offenders from countries including Australia, often via livestreamed video.

Technological Advancements and CSAM

The proliferation of digital technologies over the last few decades has contributed significantly to the increase in CSAM offending. Advances in hardware, such as digital cameras, mobile phones, and personal computers, have made it easier to produce and distribute CSAM. Software advancements, including artificial intelligence (AI), have further facilitated the creation of new CSAM content. AI technologies can be misused to create non-consensual intimate imagery, de-age individuals, generate explicit content from innocuous images, and produce new images based on existing CSAM, compounding the trauma of children who have already been abused.

The ease of accessing and distributing CSAM has also increased with technological advancements. CSAM can be found on both the open web and the dark web, and it is distributed across various platforms, including mainstream social media, messaging platforms, gaming platforms, legal pornography sites, peer-to-peer networks and search engines. Data reported by NCMEC highlight the frequency with which mainstream technology platforms are used to distribute CSAM. The adoption of private messaging platforms with end-to-end encryption has made it harder to detect CSAM and to identify the networks of users sharing it.

The Role of Technology Companies

Technology companies play a crucial role in combating CSAM. Their platforms are often used to distribute and access CSAM, making their involvement essential in prevention and disruption efforts. By collaborating with the CSAM Deterrence Centre, technology companies can help design, implement and evaluate effective warning messages and other preventive measures. These strategies can deter individuals from accessing CSAM and fulfil the companies’ duty of care to protect users.

Moreover, technology companies can leverage their technological capabilities to identify and disrupt CSAM networks. Advanced technologies such as AI and machine learning can be used to detect and remove harmful content swiftly, making it increasingly difficult for offenders to operate.
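
One widely used building block is hash matching: comparing a fingerprint of an uploaded file against a list of fingerprints of previously verified material supplied by clearinghouses such as NCMEC or the IWF. The Python sketch below shows a minimal exact-match version using SHA-256; it is illustrative only, the KNOWN_HASHES set and check_upload function are hypothetical, and production systems additionally rely on perceptual hashes (such as PhotoDNA) that survive resizing and re-encoding.

```python
import hashlib

# Hypothetical set of SHA-256 digests of known, verified material,
# populated in practice from a vetted hash list (e.g., NCMEC/IWF).
KNOWN_HASHES: set[str] = set()

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, streaming to bound memory use."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check_upload(path: str) -> bool:
    """Return True if the upload byte-for-byte matches known material.

    Exact matching only catches identical copies; perceptual hashing
    is needed to catch re-encoded or cropped variants.
    """
    return sha256_of_file(path) in KNOWN_HASHES
```

Exact cryptographic matching is deliberately the simplest case; the design question for platforms is how to combine such deterministic checks with machine-learning classifiers that can surface previously unseen material for human review.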

Conclusion

The growing prevalence of CSAM offending is a critical issue that requires a coordinated and comprehensive response. The rising number of reports, the global reach of CSAM, and the role of technological advancements in facilitating offending underscore the need for innovative prevention strategies and strong collaboration between stakeholders. The CSAM Deterrence Centre, in partnership with technology companies, law enforcement, and other stakeholders, aims to create a safer online environment by implementing effective prevention measures and disrupting CSAM offending behaviours and networks. By leveraging the combined expertise and resources of all involved parties, we can make significant strides in reducing the prevalence of CSAM and protecting vulnerable children from harm.