
As AI porn generators get better, the stakes get higher

Porn generators have improved while the ethics around them have become stickier

Image Credits: Unstable Diffusion

As generative AI enters the mainstream, so, too, does AI-generated porn. And like its more respectable sibling, it’s improving.

When TechCrunch covered efforts to create AI porn generators nearly a year ago, the apps were nascent and relatively few and far between. And the results weren’t what anyone would call “good.”

The apps and the AI models underpinning them struggled to understand the nuances of anatomy, often generating physically bizarre subjects that wouldn’t be out of place in a Cronenberg film. People in the synthetic porn had extra limbs or a nipple where their nose should be, among other disconcerting, fleshy contortions.

Fast-forward to today, and a search for “AI porn generator” turns up dozens of results across the web — many of which are free to use. As for the images, while they aren’t perfect, some could well be mistaken for professional artwork.

And the ethical questions have only grown.

No easy answers

As AI porn and the tools to create it become commodified, they’re beginning to have frightening real-world impacts.

Twitch personality Brandon Ewing, known online as Atrioc, was recently caught on stream looking at nonconsensually deepfaked sexual images of well-known women streamers on Twitch. The creator of the deepfaked images eventually succumbed to pressure, agreeing to delete them. But the damage had been done. To this day, the targeted creators receive copies of the images via DMs as a form of harassment.

Indeed, the vast majority of pornographic deepfakes on the web depict women, and they're frequently weaponized.

A Washington Post piece recounts how a small-town schoolteacher lost her job after students' parents learned about AI porn made in the teacher's likeness without her consent. Just a few months ago, a 22-year-old was sentenced to six months in jail for taking underage women's photos from social media and using them to create sexually explicit deepfakes.

In an even more disturbing example of the ways in which generative porn tech is being used, there’s been a small but meaningful uptick in the amount of photorealistic AI-generated child sexual abuse material circulating on the dark web. In one instance reported by Fox News, a 15-year-old boy was blackmailed by a member of an online gym enthusiast group who used generative AI to edit a photo of the boy’s bare chest into a nude.

Reddit users, meanwhile, have been scammed with AI-generated porn, sold explicit images of models who don't exist. And workers in adult films and art have raised concerns about what this means for their livelihoods and for their industry.

None of this has deterred Unstable Diffusion, one of the original groups behind AI porn generators, from forging ahead.

Enter Unstable Diffusion

When Stable Diffusion, the text-to-image AI model developed by Stability AI, was open sourced late last year, it didn’t take long for the internet to wield it for porn-creating purposes. One group, Unstable Diffusion, grew especially quickly on Reddit, then Discord. And in time, the group’s organizers began exploring ways to build — and monetize — their own porn-generating models on top of Stable Diffusion…

Read the full article at TechCrunch.
