Saturday, November 23, 2024

Google has hosted over 100 YouTube videos promoting AI deepfake porn

For people curious about creating artificial nudes using AI, YouTube is a place to begin.

While YouTube does not host any “nudifier” tools itself, the video-sharing platform, used by 2.5 billion people, has hosted more than 100 videos with millions of views promoting how quickly these AI apps and websites can remove clothes from images of women, a review by Forbes found.

Some of the YouTube videos provided tutorials for an app that high school students in Spain and New Jersey had allegedly used to create nude photos of their classmates. Students targeted by the images reportedly suffered bullying, public shaming and panic attacks.

Another website featured in several YouTube videos was cited in court documents for a 2023 case involving a child psychiatrist sentenced to 40 years for using artificial intelligence to create depictions of child sexual abuse and for sexual exploitation of a minor. He was accused of using the tool to alter images of his underage high school girlfriend by removing her clothing. “In this digital age, it is frightening to realize that images of me, innocent images, can be taken and twisted for illegal and vile purposes without my consent,” his former girlfriend testified in court.

“When I see or hear AI, it’s in the back of my mind.”

A victim of an AI nudifier app, speaking about being targeted by a child psychiatrist.

It’s “unthinkable” that Google makes it easier to use these apps, Signy Arnason, deputy executive director of the Canadian Centre for Child Protection, told Forbes. “It is easy to find educational videos or services with titles that obviously promote these types of applications on YouTube and even in Google search results,” she added. She said her organization is increasingly hearing from schools whose students have fallen victim to AI-generated nude images.

Google’s AI nudifier problem doesn’t stop at YouTube. Forbes identified three Android apps offering clothing removal from photos, including a “nudity scanner photo filter” with more than 10 million downloads; a Spanish-language app that lets the user “swipe their finger over what they want to delete, for example a swimsuit,” with more than 500,000 installs; and Scanner Body Filter, which offers the ability to add a “sexy body image” to photos, also with half a million downloads.

Forbes also found 27 ads promoting “deep nude” services in the Google Ads Transparency Center. One promoted a website with the word “baby” in the URL. The National Center on Sexual Exploitation (NCOSE) provided information on four more, including a nudifier website that openly offered to create AI nude photos of Taylor Swift.

After Forbes asked whether the videos, ads, and apps violated Google’s policies, the company removed all 27 ads, and YouTube removed 11 channels and over 120 videos. One of those channels, fronted by a male AI deepfake, was responsible for over 90 of those videos, most pointing to Telegram bots that undress women. The Scanner Body Filter app was also made unavailable for download, though other Android apps remained online.

Tori Rousay, corporate advocacy program manager and analyst at NCOSE, said Google has created a “continuous cycle of profit” from nudifier apps by accepting advertising money from developers and taking a cut of advertising revenue and one-time payments when the apps are hosted in the Google Play Store. By comparison, Rousay said, Apple was quick to remove nudifier apps when NCOSE highlighted a number of them hosted on the App Store.

“Apple has listened… Google needs to do the same,” Rousay added. “Google must develop responsible practices and policies regarding the spread of image-based sexual abuse.”

AI-generated deepfake porn is on the rise, including among children. The National Center for Missing and Exploited Children told Forbes this week that it had received 5,000 reports of AI-generated child sexual abuse material (CSAM) in the last year. Earlier this year, a Wisconsin man was charged for allegedly using the AI-powered image generator Stable Diffusion 1.5 to create CSAM.

In the case of the convicted child psychiatrist, other victims besides his childhood girlfriend also testified in court about the lasting trauma he caused by using AI to undress them in their childhood photos.

“I fear that if he has created child pornography using my image on the Internet, others will also have access to this image. I fear that colleagues, family members, community members or other pedophiles will have access to this image,” said one of his victims. Another added: “It’s because of him that I’m afraid of artificial intelligence, and when I see or hear AI, it’s in the back of my mind.”

MORE FROM FORBES

AI nude photos of celebrities like Margot Robbie and Selena Gomez are for sale on eBay
How real people are caught up in Reddit’s AI porn explosion
Etsy has hosted deepfake porn of celebrities
AI is driving an influx of child sexual abuse images, data shows
