Imagine sitting in your office, minding your business, when a co-worker draws your attention to a pornographic video — with your face on someone else's body. Thanks to deepfake technology, that's now something that could happen to anyone. Over the past few weeks, HuffPost's Jesselyn Cook interviewed several women who've become victims of deepfake porn; we asked her about the resulting story.
How did this story come about?
While doing research for an earlier explanatory article on deepfakes, I was disappointed by the lack of media coverage and public conversation about the ways these videos are harming women. There's been a lot of concern about the political problems deepfakes could hypothetically cause in the future, but deepfake porn is already upending women's lives, and almost no one was talking about that.
What was the hardest thing about reporting, writing or editing this piece?
I had to spend a lot of time digging through deepfake porn forums, and I'd often see users anonymously dumping photos of the women — co-workers, exes, friends — they were hoping (and typically paying) to have inserted into erotic videos, without the women's knowledge or consent. It was nauseating to see such a brazen sense of sexual entitlement to women's bodies.
But the hardest part was definitely contacting the women themselves to ask if they'd be willing to talk to me about such a sensitive issue, and hearing their stories. One woman told me how she'd learned that she was featured in deepfake porn when a colleague showed her the video in front of her other co-workers. Can you imagine?
What did you find that most surprised you?
There's this really disturbing kind of camaraderie on deepfake porn forums, where users cheer each other on for creating and sharing believable videos, and seem willfully blind to how their actions affect the women who are featured. One creator, who sells his videos on multiple forums, told me deepfake porn is "no different from a photoshop manipulation or artist drawing," and described his video-making as a "service" he provides to others. It honestly was surprising to see just how adamantly people tried to justify their clearly abusive behavior.
What do you want readers to take away?
A couple of things: First, for most victims of deepfake porn, there's essentially nothing they can do to get the videos taken offline. This can damage their reputations, their personal relationships, their job prospects, and so many other parts of their lives. These are serious problems that are happening now — and have been for some time — and they should be included prominently in the public discourse about the impacts of deepfakes and what should be done about them.
Second, there's fear that holding tech platforms legally accountable for wittingly hosting and amplifying malicious deepfakes could lead to broader censorship. But the reality is, deepfake porn is already silencing women. Right now, women are having their photos and videos distorted and used against them in incredibly harmful ways without consequence. As one expert told me: “There’s a massive chilling effect that deepfake pornography has on women’s speech, because the way to make yourself safer is to censor yourself.”
What were you left wondering?
I'm interested to see how Congress will decide to handle deepfakes. In the meantime, I'm nervous to see how quickly the technology will continue to advance.