When Reuben Hamlyn and Sophie Compton delved into the world of anonymous digital abuse of women, they opened a major can of worms. They found “Taylor”, a young student who appears in the film as a deepfake for her own protection, describing the living hell she’s endured since someone she knew posted porn videos of her online, superimposing her face onto other women’s bodies. The ripple effect of harm became a tidal wave. Taylor felt life wasn’t worth living; she feared the images would ruin her chances of a normal life and career. And no authorities did anything to help her. What She Said’s Anne Brodie spoke with Hamlyn about the scope and prevalence of this growing problem, a product of advancements in AI.
Reuben, the artifice of living a machine-made life, and the very real dangers as you present them in Another Body, are brutal.
We had no idea of the scope of Taylor’s story when we got in contact with her; she had only become aware of the videos two weeks prior. It was the really early stages – she was still at a complete loss as to what was going on, and she was shaken. She hadn’t made any progress with the police. So it was unknown, to her and to us, how her story would unfold.
It’s entirely possible that any of us could be inserted into a video of any kind – not just sexual but criminal, political, or cultural. How terrified are you that AI is getting so advanced?
Over the course of production, there were many moments when we thought we wouldn’t get the material for the feature, and we didn’t know where the story was going. There were points when it seemed like Taylor was going to give up, and we certainly didn’t know she would discover who the perpetrator was. So when we began, we had no idea of the scope of the story, and it was riveting to watch this roller coaster unfold through production. That being said, we did have an extensive research process, in which we spent a really regrettable amount of time on 4chan trying to make sense of how these communities operated, and also looking for potential stories. So we had a strong sense of the pervasive misogyny and hatred that was ever-present in these forums.
The real-life consequences for your main, and very eloquent, victim are horrific – the videos could ruin her professional opportunities and reputation. How successful would lawsuits against the poster be?
These posters hide behind the moniker “Anonymous”, but they are all real people. And as is the case with, I think, most sex crimes or crimes of violence against women, more often than not it is someone who is known to the victim – to the target. So in this situation, we assumed that, more likely than not, it would be someone known to Taylor.
There’s no police help for victims because no existing laws are broken. The law isn’t up to date on the life-altering use of deepfakes to victimise people, and AI advances are racing forward.
I’m not terrified by AI in and of itself. I think AI has a lot of potential for good, a lot of potential to better society, and I really want to caution against blanket fear-mongering around AI. What’s really important at this point – we are at the very beginning, the very early stages of a technological revolution, the early stages of this wave – is for people to do what they can to educate themselves and try to understand what different AIs are. Because AI is not one thing; it’s many different technologies, a category of technologies, and it has been in our lives for a number of years now – even your Spotify algorithm is a form of AI. That being said, what worries me about AI is that there’s so little regulation at this point, and it seems to be so poorly understood by lawmakers.
Is there any way to protect ourselves from being deepfaked?
If you look back, the last technological paradigm shift in our lives was probably the explosion of the smartphone – suddenly having instant access to an infinite amount of information in your pocket, and instant connectivity to each other. Again, it’s something that had so much potential for good and has obviously done a lot of good for society.
But because these companies were racing to be dominant, racing to extract as much capital from their users as possible, without much oversight or proper regulation – because the policymakers, again, didn’t really understand how these applications worked – we got really horrific negative side effects. We have a generation growing up with massively increased anxiety, and we have an internet which is a totally unsafe landscape, very hard to police and regulate.
And so I worry that we’re falling into the same situation with AI technologies, because all of these new AI companies want to be the dominant one – they want to be the next Apple. So they’re racing ahead; they don’t have time to stop and think about the ethical implications of everything they’re doing, because if they do, someone else will just leapfrog them.
And so we’re in a situation where AI is poorly understood by the public and poorly understood by policymakers, and these companies, working in somewhat arcane ways, are basically deciding how our future is going to unfold and how society is going to work, because we’re about to have an explosion of a new, revolutionary technology. We need to make sure it’s not businesses whose sole priority is maximizing capital who are the people deciding how our future is going to look. Everyone needs to slow down, educate themselves, and decide what future we want – and how we’re going to make sure AI is oriented towards that future. Otherwise there will be consequences.
What about lawsuits? Surely something good could come out of civil suits brought by the victims?
The situation with lawsuits is a tricky one because, as the film articulates, creating deepfake pornography is not a criminal offence. In the film we hear Taylor say that the detective told her the perpetrator hadn’t done anything wrong – there’s no law against it. And in truth, with a civil lawsuit you might be able to get a restraining order, which is not going to be successfully enforced on the internet, or you can try to get damages. But what use is getting cash from this guy who graduated a year ago? He doesn’t have any money, and it won’t even cover your legal fees. So there really isn’t much recourse for targets of this abuse at this point in time in the United States – in most states, I should clarify. There’s no federal law.
“Taylor” deepfaked in Another Body
Do you hope that a broader conversation might begin and move toward solutions when people see your documentary?
Yes, I do have hopes for our documentary getting this conversation going, but the main reason is that we haven’t just made a documentary – our impact campaign is very unorthodox. We spoke to so many women who had been targeted by nonconsensual intimate images being uploaded online. They wanted to share their stories; they wanted to be one piece of the puzzle in changing things, in making the internet a safer landscape for women, but they didn’t know how.
And so we started collecting their testimony, created a databank of that testimony which we gave to research bodies, and began connecting them with journalists who could publish their stories, either with their names or anonymously. Through this we built a coalition of victims and survivors of image-based sexual abuse, which we called #MyImageMyChoice. And we’ve been doing a lot of work over the course of production.
Our researchers have been fairly pivotal to the laws being changed in England and Wales; the UK Law Commission did a report on intimate image abuse. There’s still a lot of work to be done in that regard. We want to get this film into schools, because really it’s young men who need to see this and need to understand that virtual crimes like this have real-world impacts that can really mess them up in life. That’s very important. And we want to target the tech platforms that are not just enabling this form of abuse – they’re enabling businesses to be built on it.
Another Body at Hot Docs Ted Rogers Cinema and on TVOD