Monday, March 9, 2026

Boys use AI to fake nude photos – a lawsuit could stop it

Nearly a year after nude photos of high school students, created with the help of artificial intelligence, turned a community in southern Spain upside down, a juvenile court this summer sentenced 15 of their classmates to a year's probation.

But the artificial intelligence tool used to create the malicious deepfakes is still easily accessible on the internet and promises to "unmask any photo uploaded to the site within seconds."

There is now a new attempt to ban these and similar apps in California. The city of San Francisco this week filed the first lawsuit of its kind, which experts say could set a precedent but will also face many hurdles.

“The proliferation of these images has exploited a horrifying number of women and girls around the world,” said David Chiu, San Francisco’s elected city attorney, who filed the suit against a group of highly trafficked websites based in Estonia, Serbia, Britain and elsewhere.

“These images are used to harass, humiliate and threaten women and girls,” he said in an interview with The Associated Press. “And the impact on victims is devastating: their reputation, their mental health, their loss of autonomy, and in some cases they have even triggered suicidal thoughts.”

The lawsuit, filed on behalf of the people of California, alleges that the services violated numerous state laws covering deceptive business practices, non-consensual pornography and child sexual abuse. But it can be difficult to find out who runs the apps, which are not available in phone app stores but are still easily found online.

One service contacted by the AP late last year claimed via email that its “CEO is based in the U.S. and travels all over the U.S.,” but declined to provide evidence or answer other questions. The AP is not naming the specific apps being sued to avoid promoting them.

“There are a number of websites where we don’t know exactly who those operators are and where they’re operating from right now, but we have investigative tools and the authority to issue subpoenas to investigate,” Chiu said. “And we will certainly exercise our authority as this litigation progresses.”

Many of these tools are used to create realistic fakes that “nudify” photos of clothed adult women, including celebrities, without their consent. But they have also appeared in schools around the world, from Australia to Beverly Hills, California. Typically, boys create images of female classmates that are then widely shared on social media.

In one of the first cases to draw widespread attention, in the Spanish town of Almendralejo last September, a doctor whose daughter was among a group of girls victimized last year, and who helped raise public awareness of the incidents, said she was satisfied with the severity of the sentence her daughter's classmates face following the court ruling earlier this summer.

But it’s “not only the responsibility of society, the education system, parents and schools, but also the responsibility of the digital giants who profit from all this garbage,” said Dr. Miriam al Adib Mendiri in an interview on Friday.

She welcomed San Francisco’s move but said further efforts were needed, including from larger companies such as California-based Meta Platforms and its subsidiary WhatsApp, through which the images were circulated in Spain.

While schools and law enforcement agencies try to punish those who create and share deepfakes, authorities are struggling to deal with the tools themselves.

In January, the European Union’s executive arm said in a letter to a Spanish member of the European Parliament that the app used in Almendralejo did not fall under the bloc’s sweeping new rules bolstering online safety, because the platform is not big enough.

Organizations monitoring the rise of AI-generated child sexual abuse material will be closely following the San Francisco case.

The lawsuit “has the potential to set a precedent in this area,” said Emily Slifer, policy director at Thorn, an organization that campaigns against the sexual exploitation of children.

A Stanford University researcher said it would be difficult to bring the defendants to justice because so many of them are based outside the United States.

Chiu “has a tough time in this case, but may be able to get some of the websites taken down if the defendants who operate them ignore the lawsuit,” said Riana Pfefferkorn of Stanford.

She said that could happen if the city wins the case by default judgment in the defendants’ absence and obtains injunctions affecting domain name registrars, web hosts and payment processors “that would effectively shut down those sites even if their owners never appear in the litigation.”
