Deep Fake Pornos

As a follow-up to yesterday's blog, I provide the complete Boston Globe article about fake nudes and the damage they cause, especially to young girls. This is AI not at its best, to be sure. We should never underestimate the depths to which scoundrels and demons will stoop. This shameful and destructive behavior is the tip of the iceberg for those who prefer evil to good. By comparison, the Pope-in-a-coat image is benign.

(begin) AI fake nudes are booming. It’s ruining real teens’ lives. by Pranshu Verma of The Washington Post, November 5, 2023

When Gabi Belle learned there was a naked photo of her circulating on the internet, her body turned cold. The YouTube influencer had never posed for the image, which showed her standing in a field without clothes. She knew it must be fake.

But when Belle, 26, messaged a colleague asking for help removing the image, he told her there were nearly 100 fake photos scattered across the Web, mostly housed on websites known for hosting porn generated by artificial intelligence. They were taken down in July, Belle said, but new images depicting her in graphic sexual situations have already surfaced.

"I felt yucky and violated," Belle said in an interview. "Those private parts are not meant for the world to see because I have not consented to that. So it's really strange that someone would make images of me."

Artificial intelligence is fueling an unprecedented boom this year in fake pornographic images and videos. It’s enabled by a rise in cheap and easy-to-use AI tools that can “undress” people in photographs — analyzing what their naked bodies would look like and imposing it into an image — or seamlessly swap a face into a pornographic video.

On the top 10 websites that host AI-generated porn photos, fake nudes have ballooned by more than 290 percent since 2018, according to Genevieve Oh, an industry analyst. These sites feature celebrities and political figures such as Representative Alexandria Ocasio-Cortez of New York alongside ordinary teenage girls, whose likenesses have been seized by bad actors to incite shame, extort money, or live out private fantasies.

Victims have little recourse. There’s no federal law governing deep-fake porn, and only a handful of states have enacted regulations. President Biden’s AI executive order issued Monday recommends, but does not require, companies to label AI-generated photos, videos, and audio to indicate computer-generated work.

Meanwhile, legal scholars warn that AI fake images may not fall under copyright protections for personal likenesses, because they draw from data sets populated by millions of images. "This is clearly a very serious problem," said Tiffany Li, a law professor at the University of San Francisco.

The advent of AI images comes at a particular risk for women and teens, many of whom aren’t prepared for such visibility. A 2019 study by Sensity AI, a company that monitors deep fakes, found that 96 percent of deep-fake images are pornography, and 99 percent of those photos target women.

“It’s now very much targeting girls,” said Sophie Maddocks, a researcher and digital rights advocate at the University of Pennsylvania. She said targets are “young girls and women who aren’t in the public eye.”

On Sept. 17, Miriam Al Adib Mendiri was returning to her home in southern Spain from a trip when she found her 14-year-old daughter distraught. Her daughter showed her a nude picture of herself.

"Look, Mom. What have they done to me?" Al Adib Mendiri recalled her daughter saying.

She'd never posed nude. But a group of local boys had grabbed clothed photos from the social media profiles of several girls in their town and used an AI "nudifier" app to create the naked pictures, according to police.

The application is one of many AI tools that use real images to create naked photos, which have flooded the Web in recent months. By analyzing millions of images, AI software can better predict how a body will look naked and fluidly overlay a face into a pornographic video, said Gang Wang, an expert in AI at the University of Illinois at Urbana-Champaign.

Though many AI image-generators block users from creating pornographic material, open-source software, such as Stable Diffusion, makes its code public, letting amateur developers adapt the technology, often for nefarious purposes. (Stability AI, the maker of Stable Diffusion, did not return a request for comment.)

Once these apps are public, they use referral programs that encourage users to share these AI-generated photos on social media in exchange for cash, Oh said.

When Oh examined the top 10 websites that host fake porn images, she found more than 415,000 had been uploaded this year, garnering nearly 90 million views.

AI-generated porn videos have also exploded across the Web. After scouring the 40 most popular websites for faked videos, Oh found more than 143,000 videos had been added in 2023, a figure that surpasses all new videos from 2016 to 2022. The fake videos have received more than 4.2 billion views, Oh found.

The Federal Bureau of Investigation warned in June of an uptick of sexual extortion from scammers demanding payment or photos in exchange for not distributing sexual images. While it’s unclear what percentage of these images are AI-generated, the practice is expanding. As of September, over 26,800 people have been victims of “sextortion” campaigns, a 149 percent rise from 2019, the FBI said.

Belle, the YouTube influencer, is still unsure how many deep-fake photos of her are public and said stronger rules are needed to address her experience.

“You’re not safe as a woman,” she said. (end)

Deacon David Pierce