
Child Predators Are Exploiting AI to Generate Sexual Images of 'Stars'

Written by
ArticleGPT

Reviewed and fact-checked by the HIX.AI Team

4 min read · Jun 20, 2024

In a Nutshell

The rise of AI technology has enabled offenders to fixate on specific child victims, known as "stars," and produce more harmful content featuring them.

Child predators are using artificial intelligence to create sexually explicit images of children, focusing on specific victims, according to child safety experts. These predators, active on the dark web, are engaging in conversations that involve generating new images based on existing child sexual abuse material (CSAM).

AI Is Creating New CSAM and Recovering Old CSAM

Survivors of CSAM like Megan have expressed growing anxiety over AI's potential to further exploit their images. Megan emphasizes that AI manipulation could create false impressions that diminish the pain caused by abuse, or even suggest that victims enjoyed it.

The use of AI has empowered these offenders to obsess over specific child victims (the "stars") and produce more content featuring them. This disturbing trend has alarmed survivors of CSAM, who fear that their images may be manipulated with AI and circulated, threatening their personal and professional lives.

Advocates are urging legislation to stop predators from making CSAM; however, they doubt that a ban on generating new CSAM can be enforced effectively now that predators operate through encrypted messaging services.

Cultural reform is also necessary to tackle the root causes of abuse and prevent such incidents in the first place.

Technology's Role in AI-Generated CSAM

Dark web chat room conversations reveal the predatory fascination with AI-generated CSAM. Offenders expressed excitement about the prospect of using AI to create new material based on familiar victims.

These discussions ranged from recreating images of former child pornography stars in specific settings to digitally remastering old, low-quality abuse material. AI's advancements have provided predators with easy access to tools that can generate increasingly realistic abuse imagery.

Predators place a particular emphasis on "star" victims within their online communities. Similar to Hollywood celebrities, these victims are ranked and cataloged, and their images are meticulously manipulated to create different poses or scenarios.

Law Enforcement Responses and Calls to Action

Letter to U.S. Congress by Attorneys General

In a letter addressed to the U.S. Congress, the attorneys general urged members to intervene and study the harmful impacts of artificial intelligence (AI) on child sex abuse material.

Urgency for Preventative Measures

In June, the FBI released a public alert emphasizing the prevalence of "deepfakes": fake content created by altering innocent images or videos to depict explicit sexual activity. Predators often source content from social media platforms or other websites, altering it to target victims.

The FBI has received reports from minors and non-consenting adults whose images were altered and shared on social media or pornographic websites for harassment or extortion.

The National Crime Agency (NCA) in the United Kingdom has identified up to 830,000 adults who pose a sexual risk to children. NCA Director General Graeme Biggar emphasized that the viewing of these images, whether real or AI-generated, significantly increases the risk of offenders progressing to sexually abusing children.

The agency has already observed hyper-realistic AI-generated images and videos, which pose challenges in identifying real children in need of protection while further normalizing abuse.

Based on 3 sources

Child predators are using AI to create sexual images of their favorite ‘stars’: ‘My body will never be mine again’

Predators active on the dark web are increasingly using artificial intelligence to create sexually explicit images of children, fixating especially on “star” victims, child safety experts warn.

Predators are using AI to sexually exploit children, FBI says. Here’s what we know

As the use of artificial intelligence grows, officials across the world are expressing their concerns about its use in the creation of child sex abuse material.

Predators using AI to exploit children

Sexual predators are using a new tool to exploit children: AI (artificial intelligence) image generators.
