Deepfake Pornography and the Downplay of Image-based Sexual Abuse
I didn’t recognize Atrioc when I saw his teary-eyed apology for watching deepfake porn of his fellow female streamers, women who had been under the impression that they shared a mutual respect as colleagues of sorts. However, the language of the online discourse surrounding this scandal was something that I, a survivor of image-based sexual abuse, recognized far too well.
Deepfakes are videos that have been digitally altered to make one person appear to be someone else. In this case, the faces of female streamers were plastered seamlessly onto already existing pornography. To anyone who does not know better, they truly appear to be real videos of these women. And while deepfake pornography isn’t a novel issue pushed into the cultural zeitgeist by this scandal, the reality that the subjects of this digital manipulation were friends with the individual caught seeking out the material adds another dimension to the issue. Among the tweets perpetuating the age-old falsehood that overt sexualization without consent is “just part of being a woman” that women must learn to endure was another, relatively newer argument: that digital sexual abuse isn’t sexual abuse at all. Again, an argument I know far too well.
As a matter of fact, this flawed argument is exactly why I got into this field in the first place.
After intimate images of myself were spread around my high school without my knowledge or consent and ended up online, I saw firsthand just how downplayed image-based sexual abuse truly is. The perpetrators of my abuse created an entire group chat dedicated to collecting and distributing explicit images of their classmates without their knowledge or consent, as though they were trading cards. I felt completely violated, unsure how many people had seen my body without my permission, including people I had gone to school with. People I sat next to in class. Danced with in my choir shows. Shared notes with. People who, when they saw me in person, pretended like everything was normal, but then went home and had their way with me on their phone screens. When speaking with mutual friends of mine and my abusers, I was told I “went too far” by labeling what they did as sexual abuse, because to them, it wasn’t. I have seen other survivors of image-based sexual abuse describe it as digital rape, which is exactly what it is. Because, when someone is viewing pornographic images, what are they doing with them? What is going on inside their head?
Whether or not this is a fully conscious thought that can be articulated doesn’t really matter; the viewer of pornography is imagining in their mind that they are physically engaging with the person depicted. They have visual material to base these sexual thoughts on. And when this material is viewed without the consent of the person depicted, these sexual acts are non-consensual.
Engaging with pornography is inherently a sexual act. When you do not have the consent of the person depicted, it is sexual abuse. Simple as that.
However, the simplicity of this argument gets called into question when discussing deepfake pornography. Based on the online discourse surrounding the Atrioc scandal, it is clear that many people downplay the victims’ feelings just because, technically, it was not their bodies. However, speaking as a survivor of image-based sexual abuse, whose real body was involved in the abuse, I can say that while the images being of my own body definitely adds an intense layer of pain to my experience, it is far from the only thing that caused me to suffer emotionally. There is a deep sense of betrayal involved when you learn that people who posed as well-meaning, platonic friends actually sexualized you in secret and acted upon those sexual desires by seeking out a way to engage with you sexually without your consent. It is a betrayal of trust so severe that it is difficult to trust anyone else ever again.
And, I’ll say it again, that is sexual abuse.
In their minds, those of us at the center of their covert sexualization will “never know,” making their behaviors “okay.” From their perspective, it is a victimless crime. When confronting my abusers, one of them said something that I can never forget, something that perfectly articulates why any of us are here discussing this in the first place. He said:
“When I got those pictures, I didn’t see you as a real human being, can’t you understand that?”
This was someone I had gone to school with for six years before he received those images. He berated me until I agreed with what I now recognize as an acknowledgement that the sexualization of women’s bodies often goes hand in hand with our dehumanization. There is a compartmentalization that happens when an individual engages non-consensually with real or doctored pornography of someone they know. One compartment is that of their friend, acquaintance, or colleague, who is a real human being. The other is that of a sexual object, intended for their sexual gratification. To disregard someone’s humanity in order to benefit from them sexually is predatory behavior and, surprise, is sexual abuse. One cannot engage in the non-consensual sexualization of someone else without first and foremost viewing them as an object intended for their pleasure. This cuts deep when the person sexualizing us is someone we trusted, or at least believed respected us as a human being with rights.
In short, deepfake pornography is another form of image-based sexual abuse. Just because the body of the person depicted in deepfake pornography isn’t actually theirs doesn’t mean that the violation is negligible. There are multiple emotional injuries that come with image-based sexual abuse. One of those injuries, and perhaps the most impactful, is the fact that someone you know felt so entitled to engaging with you sexually that they sought out material that sexualized you without your consent.
Written by: PAVE’s Director of Content Development, Elle de los Reyes