While there have been attempts to create laws and guidelines surrounding A.I., its rapidly expanding capabilities mean these policies constantly need to be revised and updated. As we've discussed many times throughout the semester, the use of A.I., specifically to create deepfake images of people (often women) in the nude or in sexual contexts, can be extremely harmful, yet remains widely unregulated.

As Natasha Singer wrote for the New York Times, this abuse impacts women and girls of all ages, and it no longer requires the time-consuming, technically complicated methods we saw used in Another Body; there are now apps that do this, downloadable in a single click. Regulations around these apps and the images they produce remain murky.

During the Q&A portion of Another Body, someone asked a question that, although well-intentioned, came across to me as rather harmful. They asked, “What can women do to protect themselves from these deepfakers?”

Maybe this is a helpful question to answer (keep your accounts private, be careful about what you post, and so on), but it does nothing about the root of the issue. The FBI has already warned against creating and sharing these images, but despite the policies and laws we have, there will always be people who break rules and do harmful things. Still, the answer isn't to tell these girls to stop living their lives.

As Singer's article notes, something as simple as allowing a classmate to follow you on Instagram can lead to them creating deepfake nudes of you and spreading them around. Girls excited to share photos of their prom dresses find their faces placed on nude bodies that are A.I.-generated yet 'convincing.' This is 'image-based sexual abuse,' but many schools dealing with the issue don't know how to treat it. Is it harassment? Cyberbullying? Singer describes a situation in which a school official, asked why she did not report to the police the deepfakes of students that had been created and spread around, responded, "what was I supposed to report?"

Try as we might, the internet is forever; this type of thing never fully goes away, online or in our minds. The concept is somewhat abstract: even though there has technically been no harm to, or photography of, one's actual, physical body, knowing that someone violated your privacy and autonomy to create a 'digital body' with the intent of doing harm can be just as damaging. It's not you (it's an A.I. image with your face edited on), but at the same time, it's still you. This type of harm isn't some abstract, made-up issue. It's very real and very damaging, and thus should be treated seriously.

I genuinely wonder: what can a deepfake nude app be used for that isn't harmful? Apple has banned games like The Binding of Isaac: Rebirth for depicting harm towards children. Why not ban apps that are used to do actual harm to actual children?

In Computer Chess, we see only one woman, Shelly (Robin Schwartz), at the tech conference. She is tokenized: the event's host makes comments about being so glad to have their first-ever woman at the competition, and Papageorge (Myles Paige) behaves inappropriately towards her. The film was made in 2013 and set in the '80s, but the environment for women in technology has barely changed.

As we've discussed, bringing in diverse perspectives can start to reduce bias in A.I., but in the case of these deepfakes, it's not the A.I. itself using its dataset to create harmful content; it's people behind a screen using the A.I. to do harm. This comes from a culture of intense misogyny, and the solution is a complex one. Women shouldn't have to hide, and the tech industry should certainly be a more welcoming one. There is no clear path to this change, but while people work at it, policy should exist to protect people from the negative potential of A.I., and A.I. tools that will indubitably be used to cause harm must be regulated.

