Deepfakes: AI's Dark Side and the Fight for Survivors' Rights

There have been many conversations lately surrounding the rise and use of artificial intelligence (AI) technology, and whether it does more harm than good. One of the ways AI has been used is to create deepfakes. According to Forbes, a deepfake is an AI technique that superimposes images and videos onto source footage using a variety of machine learning methods (Wang, 2019). Deepfakes differ from Photoshopped or otherwise edited images and videos because they tend to look seamlessly authentic.

This technology has been grossly misused, and its most common application today is pornographic videos. Many of these deepfake videos feature images of people who did not consent to being involved and who often have no knowledge that their image and/or likeness is being used in this way. Not only is this a major violation of the right to privacy, it can also be extremely harmful and traumatizing to the survivor. Non-consensual deepfake pornography is a form of image-based sexual violence and should be treated as such.

Combating deepfake pornography is a grueling, drawn-out process that can leave survivors even more traumatized, and often there is no clear route for survivors to take. Very few states have legally addressed this issue, which only makes it harder for survivors to get justice. As of 2023, only four states have deepfake laws: California, Georgia, New York, and Virginia (CCRI, 2023). Other states have some form of ban on revenge porn, but the four listed above are the only ones that explicitly cover faked images.

This lack of meaningful legislation only creates more hurdles and strife for survivors who are attempting to seek justice. Karolina Mania, a legal scholar who has written about the issue, says, “This leaves only a smattering of existing civil and criminal laws that may apply in very specific situations. If a victim’s face is pulled from a copyrighted photo, it’s possible to use IP law. And if the victim can prove the perpetrator’s intent to harm, it’s possible to use harassment law. But gathering such evidence is often impossible” (Hao, 2021). For the vast majority of cases, there seem to be few, if any, legal remedies available.

However, that does not have to be the case. With the ever-increasing use of this technology, legislators will eventually have to confront this issue, but they will not do so of their own accord. Although it is difficult, and often re-traumatizing, it is crucially important for survivors to speak up and share their stories. This can be a daunting task, but it can also do a great deal of good. Sharing personal experiences can be empowering both for the survivor and for other survivors, who may feel more inclined to share their own stories after hearing experiences that resemble theirs. Additionally, sharing these stories can help legislators truly understand the scope of the problem and push them to act much sooner.

There is clearly a lot of work to be done to ensure that survivors of deepfake pornography can get the justice they deserve through the legal system, and it may at times feel hopeless. However, there are still plenty of resources available, and many organizations are pushing for justice. Here is a short list of those resources and organizations:

And to any survivor of image-based sexual violence: we see you, we hear you, we believe you, and we support you.

Written by: Summer 2023 Intern Cora Gordon

Sources:

Wang, C. (2019). “Deepfakes, Revenge Porn, and the Impact on Women.” Forbes. https://www.forbes.com/sites/chenxiwang/2019/11/01/deepfakes-revenge-porn-and-the-impact-on-women/?sh=2f181e221f53

Hao, K. (2021). “Deepfake Porn Is Ruining Women’s Lives. Now the Law May Finally Ban It.” MIT Technology Review. https://www.technologyreview.com/2021/02/12/1018222/deepfake-revenge-porn-coming-ban/

Cyber Civil Rights Initiative (CCRI). (2023). “Deepfake Laws.” https://cybercivilrights.org/deep-fake-laws/
