Federal authorities say Army soldier used AI to create child abuse images

A U.S. soldier stationed in Alaska used artificial intelligence to generate child sexual abuse material, a case that highlights the lengths to which online predators will go to exploit children, federal prosecutors said this week.

Seth Herrera, 34, used AI chatbots to create pornography of minors he knew, the Justice Department said. Court documents say he also viewed tens of thousands of images depicting brutal sexual abuse of children, including infants.

“Criminals considering using AI to commit their crimes should pause and think again – because the Department of Justice is prosecuting AI-enabled crimes to the fullest extent of the law and will seek increased penalties wherever warranted,” said U.S. Deputy Attorney General Lisa Monaco.

Earlier this year, the FBI issued a public notice regarding child sexual abuse material, pointing out that all such images and videos, including those created using artificial intelligence, are illegal.

The arrest comes as federal officials warn of a rise in sexual abuse content driven by AI that allows perpetrators to create images and videos on an exponentially larger scale, according to the Department of Homeland Security. The technology presents new challenges for law enforcement in combating this content, but can also serve as a tool to quickly and accurately identify perpetrators and victims, the department said.

Court documents detail chat groups used to traffic child pornography

According to a detention memo filed in the U.S. District Court for the District of Alaska, Herrera joined online messaging groups dedicated to trafficking child sexual abuse material. The soldier, stationed at Joint Base Elmendorf-Richardson in Anchorage, stored “secret recordings” of minors undressing in his home and then used AI chatbots to create exploitative content from them, according to federal court documents.

He also used images and videos of children posted on social media to create sexually abusive material, the memo said.

According to court documents, Homeland Security Investigations agents executed a search warrant on Herrera's home, where he lives with his wife and daughter. Three Samsung Galaxy phones contained tens of thousands of videos and images, dating back to at least March 2021, that depicted rape and other sexual abuse of children as young as infants, the memo said. Herrera stored the material in a password-protected app disguised as a calculator on his phone, prosecutors said.

According to the memo, Herrera also sought out sexually abusive content depicting children around the same age as his daughter; six children lived under the same roof in the four-family home on the military base.

According to court documents, he admitted in an interview that he had viewed child sexual abuse content online over the past year and a half.

“No child should have to suffer from such travesties, and no one should feel immune from the exposure and prosecution of these crimes by HSI and its law enforcement partners,” said Katrina W. Berger, deputy director of Homeland Security Investigations.

Herrera was arrested Friday and is charged with transporting, receiving and possessing child pornography. He faces a maximum sentence of 20 years in prison. His first court appearance is scheduled for Tuesday.

A public defender listed in Herrera's court records did not immediately respond to USA TODAY's request for comment Monday.

Combating child sexual abuse in the age of artificial intelligence

The arrest is the latest as federal officials across the country grapple with sex offenders' use of new technologies.

“Federal law prohibits the production, advertising, transportation, distribution, receipt, sale, access with intent to view, and possession of any CSAM (child sexual abuse material), including realistic computer-generated imagery,” the FBI said in a public notice.

Officials say they have also been able to use the new technology to catch perpetrators. In 2023, Homeland Security Investigations used machine learning models to identify 311 cases of online sexual exploitation. The three-week mission, called Operation Renewed Hope, resulted in the identification or rescue of more than 100 victims and the arrest of several suspected perpetrators, according to HSI.

Suspected production of child sexual abuse content, including AI-generated material, can be reported to the National Center for Missing and Exploited Children by calling 1-800-THE-LOST or online at www.cybertipline.org. Reports can also be made to the FBI's Internet Crime Complaint Center at www.ic3.gov.
