AI deepfakes, women, and the liberating imagery of feminist sexual vengeance

In 1968, performance artist Valie Export put on a pair of crotchless pants and strolled into a Munich cinema. Legend has it she was wielding a machine gun, pointing it at the heads of male viewers as she stalked the aisles of the pornographic theater. Without firing a shot, she bared her hairy genitals at eye level, moving row by row and asking if anyone wanted the real thing, as the men began scrambling out of their seats toward the door.

Export’s legendary performance piece, “Action Pants: Genital Panic,” came to mind this week when 404 Media reported that yet another internet forum has popped up for guys who create AI-generated deepfake porn, and that this one actually offers bounties for deepfake porn of ordinary women.

I imagine those men in the theater, sweating and stunned into silent panic. I imagine the heft of metal in Export’s hands, her face shape-shifting into those of a thousand women.

“It’s a nice story but it wasn’t a pornographic cinema. I don’t think I’d be sitting here now if I had gone into a porn cinema with a machine gun,” Export said in a 2019 interview when asked about her performance.

Alas.

“‘Genital Panic’ was not violent,” she said in another interview. “I walked through the rows of cinema chairs, but the visitors were afraid to see female genitals; to see the vulva and be too close to it. The fear of the vulva is present in mythology, where it is depicted devouring man. I don’t know if this fear has changed.” 

That fear hasn’t changed, only the medium by which men now seek to control the power of that image. Last month, Wired reported that deepfake porn is “out of control,” with at least 244,625 videos uploaded to the top 35 sites for this garbage, and 113,000-plus uploaded in the first nine months of 2023. That’s 54% more than 2022’s total of 73,000 videos.

A 2019 report by deepfake monitoring firm Sensity AI found that 96% of deepfake images were non-consensual pornography, and 99% of those targeted women. The hype we all fell for was that AI deepfakes would target political races. Maybe someday. But Sensity AI found only 35 images of politicians, and 2020’s election-deepfake scare turned into a nothingburger.

US and South Korean female celebrities account for a big chunk of the deepfakes found by Sensity, but experts note that more men now target young girls. One New York man was caught running a depraved sex scheme targeting 14 women. Many said their stolen and manipulated photos were from middle school and high school. And high school boys are learning to do the same — like those who deepfaked 34 female classmates.

But who needs homemade deepfakes when you’ve got bots? An investigation into deepfake porn on Telegram found guys distributing manipulated stolen images of more than 680,000 women and girls. The most common user request? “Familiar girls.” 

The master’s tools 

We’ve been inundated with coverage about how pervasive and destructive deepfake porn is, and now we’ve even got documentaries about it. Tech news has tracked the rise of this digital sewage for the last several years, and the increasing ease of its circulation with stupid-simple software. Since 2014, we’ve seen the kinds of believable faces that could be created with AI GANs (generative adversarial networks, the tech that makes deepfakes so convincing). By 2017, Nvidia’s first deepfake videos emerged, and so did the first deepfake porn. We’ve known from the start that this is what it would be used for.

Samantha Cole’s 2017 coverage for Motherboard was among the early signal flares on open-source deepfake AI porn. Her reporting on porn sites using facial recognition software to detect and root out deepfakes likewise remains a prescient rebuke to those who propose letting the fox guard the henhouse; just look at all the good it did us when Google banned “involuntary synthetic pornographic imagery” back in 2018. And remember when top AI researchers were “racing” to detect deepfakes in 2019? Or how about when Facebook’s deepfake detection showed “promising results” in 2020?

In 2024, it’ll be 10 years since deepfake images hit the scene, seven years since deepfake porn emerged. And op-eds keep coming, each describing some new humiliation or psychological devastation inflicted by this sexual harassment. The Centre for International Governance Innovation calls this doxxing, Zoom-bombing and deepfaking of women the “shadow pandemic.”

We’re not likely to see a sudden feminist revolution in AI tech, either. In 2019, women accounted for only 25% of the tech industry worldwide. You and I will never have enough cash to strongarm Google or Amazon Web Services into removing deepfake porn content with as much gusto as they do DMCA-protected .mp3 files. Let’s face it: If the internet’s corporate giants and Congress actually wanted to stop deepfake porn, they have the power, and they would have done so already.

States can’t do this alone either. Vigorously enforced state laws make a difference in setting privacy standards (thank you, Illinois) — but that’s mainly for corporations. Miring Facebook’s legal department in compliance paperwork isn’t stopping deepfake porn. Already, 46 US states have laws banning revenge porn, and I’d love to see judges take those for a test drive. But only Virginia and California laws include deepfakes, and none have stopped the rising tide.

In his recent executive order, President Joe Biden included clauses that flirt with regulating AI-generated deepfakes. The political concern has largely been couched in language about how AI could be weaponized for attacks against the US’ critical infrastructure, or lead to nuclear war. But Biden’s directive to have the Commerce Department study deepfake identification does absolutely nothing to slow the pace of this fake-flesh industry, and barely constitutes more than a nod at the problem. It also misses the point entirely: It’s not just the medium of AI.


Pimpin’ ain’t easy

Before AI, it was Photoshop and airbrushing. In the 1980s, it was the Hustler magazine dillholes who patched together fake nudes of female readers (and were subsequently slapped with lawsuits). Every high school girl who’s seen a boy’s crude drawing of her passed around a class knows what I’m talking about.

“This is the point I kept coming back to: We have to pay attention to the spirit of deepfakes as it started,” Cole wrote in 2018. “We can … talk at length about ethical uses of artificial intelligence, fake news literacy, and who has access to powerful tools like machine learning. But we must first acknowledge that the technology that could start a nuclear war was born as a way for men to have their full, fantastical way with women’s bodies.”

She’s right. She’s so right, in fact, that it’s boring how right she is. It’s boring because it’s so painfully predictable. 

The historical problem of media which extends the male gaze is at the heart of deepfake porn: it’s a male reduction of women to passive parts, an assertion of the viewers’ relational and psychological power over the subject being viewed. And it doesn’t take a genius to know that wherever women have historically faced a choice between enduring patriarchal sexual humiliation or taking radical and potentially dangerous political action, a delusional reformist with counterrevolutionary commercial aims is never far behind to offer a less-lethal third option. 

It won’t be long until a misguided liberal-feminist proposes some commerce app as capitulation, promising “consensual” deepfake sales as “empowerment.” An iron fist demanding surrender will be wrapped in the velvet glove of its coercive premise: you can fight men about this and they’ll do it to you anyway while you go unpaid, or you can comply with men’s demands for it and maybe some of them will throw you a few bucks for the convenience of your voluntary submission. (Of course, we’ll be expected to make sure daddy-app gets his cut. Pimpin’ ain’t easy, after all.) 

That’s the choice we’re headed toward because that’s the choice they always lay on us: one whose core assumption is that our sex-class won’t meet the threat of male violence with an equally lethal reply. And when we arrive at the next iteration of this choice, no amount of misappropriated feminist language about body-positive girl-boss empowerment will transmute our coerced participation in sexual self-subjugation into the material reality of our liberation. They aren’t going to suddenly clear the rape-kit backlog, promote us to CEOs and put pockets on our jeans if we concede politely and let them cut-and-paste our kids’ faces onto their porn. 

Ladies, let’s remember what our therapists have told us about honoring outdated survival mechanisms so we can let them go and escape abuse cycles: We survived patriarchal violence (both physical and psychological) by grasping onto any tiny thread of relational-social power we could find and adjusting the trajectory of that violence — even if only via the slightest, life-saving margin — by using our trauma-heightened pattern-recognition skills to anticipate its rhythms, and our ingenious subversion of socialized gender roles to soften its blows. 

Women steering violence that we can’t outrun by offering anticipatory compliance is a survival strategy that obviously precedes current western tech-capitalism, but never has this self-preservation instinct and strategy of ours been more conveniently and profitably exploited by more men, more quickly, and at grander scale for the sake of male mass-surveillance and sexual voyeurism. 

We taught our daughters what we learned from our mothers: lean into his punches so they don’t hurt as much and so you don’t die from them. I love us for that. But now it’s time to teach our daughters a strategy once attributed to Andrea Dworkin: “Harden your hearts and learn to kill.”

Get some

I wouldn’t be the first to describe Export’s arthouse performance as a brilliant and defiantly apotropaic act of anasyrma; to look at her 1969 photo series is to be stared down by a leather-clad punk rock sheela-na-gig with a machine gun. As the Tate notes, the keystone wasn’t just Export’s unapologetic ownership of a fetishized body, but her occupation of a public space filled mostly with men.

“It was important for me to present my works to the public, in the public space and not within an art-conservative space,” Export said. “But also aggression was part of my intention. I wanted to provoke, because I sought to change the people’s way of seeing and thinking.”

I wouldn’t hang Artemisia Gentileschi’s “Judith Slaying Holofernes” in the living room, but I could stare at Janina Baranowska’s “Actaeon Devoured by His Hounds” every evening over dinner. And I could watch women creators today birth from the screaming AI abyss a thousand deepfake revenge fantasies against the same men whose fearful masturbatory obsessions run unchecked over our daughters.

I’m not saying we should stalk data centers with machine guns and crotchless pants (though if ever executive-ordered to, I’d lock and load). But we’d be wrong to deprive male deepfakers of the consequences of their actions, including our anger. That begins with reclaiming public space, arming ourselves with the digital weapons wielded against us — and, in our own way, asking viewers, “Who wants a piece?”
