
Alaska man reported someone for AI CSAM, then got arrested for the same thing


If you are going to contact the police to report someone who has shared child sexual abuse material (CSAM) with you, having the same material on your own devices is probably not the best idea. Neither is consenting to another search so law enforcement can collect more information. But that is exactly what a man from Alaska allegedly did, and it landed him in police custody.

404 Media reported earlier this week on the man, Anthaney O’Connor, who was arrested after a police search of his devices allegedly uncovered AI-generated CSAM.

From 404 Media:

According to newly filed charging documents, Anthaney O’Connor contacted law enforcement in August to alert them to an unidentified airman who had shared child sexual abuse material (CSAM) with O’Connor. While investigating the crime, and with O’Connor’s consent, federal authorities searched his phone for additional information. A review of the electronics revealed that O’Connor had allegedly offered to make virtual reality CSAM for the airman, according to the criminal complaint.

According to police, the unidentified airman shared with O’Connor a photo he had taken of a child in a grocery store, and the two discussed how they could incorporate the minor into an explicit virtual reality world.

Law enforcement claims to have found at least six explicit, AI-generated CSAM images on O’Connor’s devices, which he said he had intentionally downloaded, along with several “real” images that were unintentionally mixed in. During a search of O’Connor’s home, law enforcement also discovered a computer and several hard drives hidden in a vent; a review of the computer allegedly revealed a 41-second video of a child rape.

In an interview with authorities, O’Connor said he regularly reported CSAM to internet service providers, “but the images and videos still gave him sexual gratification.” It is unclear why he decided to report the airman to law enforcement. Maybe he felt guilty, or maybe he genuinely believed his AI-generated CSAM wasn’t breaking the law.

AI image generators are typically trained on real photos; this means that images of children “generated” by AI are fundamentally based on real images. There is no way to separate the two. In that sense, AI-generated CSAM is not a victimless crime.

The first known arrest of an individual for possessing AI-generated CSAM came just this past May, when the FBI arrested a man who had used Stable Diffusion to create “thousands of realistic images of prepubescent minors.”

Proponents of AI will say that it has always been possible to create explicit images of minors using Photoshop, but AI tools make it exponentially easier for anyone to do so. A recent report found that one in six members of Congress have been targeted by AI-generated deepfake porn. Many products include protections against the worst uses, similar to how printers refuse to copy currency, and putting such hurdles in place prevents at least some of this behavior.
