Federal prosecutors secured the first conviction under the Take It Down Act after a months-long investigation in Ohio uncovered a campaign of AI-generated sexual images and harassment. The defendant pleaded guilty to producing explicit AI material, sending it to victims and their contacts, and distributing altered images and videos that targeted adults and minors. Authorities say hundreds of images were posted or stored, including files tied to child sexual abuse sites, and that the case relied on a coordinated effort between local police and the FBI. The conviction highlights how the new law is being used to punish nonconsensual AI-driven abuse, and how it compels platforms to remove illegal intimate material quickly.
A Columbus man, identified in court records as James Strahler II, 37, admitted producing and sharing explicit images and videos created with AI tools. Prosecutors describe a deliberate operation: multiple AI platforms were installed, and more than a hundred web-based models were run to craft fake intimate content and realistic deepfake videos. The scheme allegedly stretched from late 2024 into mid-2025 and involved at least six women, who were targeted with nude images, threats, and repeated harassment at their workplaces and elsewhere in their lives.
Investigators found that some files superimposed local boys' faces onto sexual content, and that those files were then posted online, an especially disturbing dimension of the offense. More than 700 files connected to a child sexual abuse site were identified, and another 2,400 images and videos were recovered from the defendant's phone. Many of those files were flagged by authorities for nudity, alteration, or violent content, underscoring the scope and cruelty of the campaign.
The legal framework used in this case is the Take It Down Act, signed last year, which outlaws the nonconsensual posting of intimate images, including AI-generated fakes. The statute imposes criminal penalties—up to two years for adults and up to three years when minors are involved—and obliges websites to remove illegal material within 48 hours of notice. Prosecutors portrayed the law as a necessary tool to address a modern form of exploitation that weaponizes technology against private citizens.
“We believe Strahler is the first person in the United States to be convicted under the Take It Down Act,” said U.S. Attorney Dominick S. Gerace II.
“We will not tolerate the abhorrent practice of posting and publicizing AI-generated intimate images of real individuals without consent. And we are committed to using every tool at our disposal to hold accountable offenders like Strahler, who seek to intimidate and harass others by creating and circulating this disturbing content.”
The investigation began with local complaints that were referred to the FBI’s Cincinnati Division, and agents built the federal case over several months before the arrest in June 2025. Court documents detail outreach to victims, the circulation of altered videos at workplaces, and threats that referenced addresses and daily routines. Prosecutors say the defendant sent material not only to the targets themselves but also to coworkers, family members, and others who would amplify the harm.
Digital forensics revealed the scope of the operation: dozens of AI programs and hundreds of model runs used to assemble images and videos tailored to harass specific people. The recovered files included both genuine photos altered with AI and fully synthetic images matched to real victims. That combination compounded the emotional and reputational damage, forcing victims to respond both to real exposures and to convincing fakes that suggested broader dissemination.
Under the new law’s removal mandate, platforms face a strict timeline to take down illegal content after notice, an enforcement tool meant to limit ongoing harm. This case tested those provisions in a federal prosecution setting, with authorities signaling they will use the Take It Down Act aggressively. The sentence the defendant will receive is pending, but the guilty plea marks a clear precedent for how prosecutors will handle AI-enabled abuse going forward.
Melania Trump acknowledged the conviction on social media, noting it as the first under the law and thanking federal prosecutors for their work. Republicans who supported the legislation argue that it fills a legal gap by targeting nonconsensual synthetic intimate content and holding both creators and distributors accountable. The case in Ohio is being presented as the kind of practical enforcement that supporters said the law would enable.
As courts and investigators adapt to AI’s role in crafting harmful material, law enforcement officials emphasize coordinated responses among local police, federal agents, and digital forensic teams. The Take It Down Act gives prosecutors a clearer path to charge offenders who weaponize AI against private citizens, and this conviction will likely shape how similar cases are prosecuted in the months ahead.