For many people across the world, this nightmare has already become a reality. A few weeks ago, nonconsensual deepfake pornography claimed the world's biggest pop star as one of its victims, with the social-media platform X blocking users from searching for the singer after a proliferation of explicit deepfake images. Yet Taylor Swift is just one of countless women to suffer this humiliating, exploitative and degrading experience. Last year's State of Deepfakes report revealed a sixfold increase in deepfake pornography in the year to 2023. Unsurprisingly, women were the victims in 99% of recorded cases.

[Image: Taylor Swift is one of the latest high-profile victims of deepfakes. Photograph: M Anzuoni/Reuters]

Technology now allows a 60-second deepfake video to be created from a single clear image in under 25 minutes – at no cost. Often using images lifted from private social-media accounts, every day more than 100,000 sexually explicit fabricated images and videos are spread across the web. Referral links to the companies providing these images have increased by 2,408% year on year. There is no doubt that nonconsensual deepfake pornography has become a growing human rights crisis. But what steps can be taken to stop this burgeoning industry from continuing to steal identities and destroy lives?

Britain is ahead of the US in having criminalised the sharing – but not the creation – of deepfakes, and has some legislation designed to bring greater accountability to search engines and user-to-user platforms. But the legislation does not go far enough. And no such protection yet exists in the US, although a bipartisan bill was introduced in the Senate last month that would allow victims to sue those involved in the creation and distribution of such images.

While introducing regulation to criminalise nonconsensual sexual deepfake production and distribution is obviously crucial, it would not be enough. Experts on images created with artificial intelligence (AI) concur that for the proliferation of sexual deepfakes to be curtailed, social media companies, search engines and the payment companies processing transactions – as well as the businesses providing domain names, security and cloud-computing services – must hit the companies making deepfake videos where it hurts: in their wallets. The whole system enabling these businesses must be forced to take responsibility. Male-dominated AI companies appear to incubate a culture that fosters a profound lack of empathy towards the plight of women online.

Sophie Compton is a founder of the #MyImageMyChoice campaign against deepfake imagery and the director of Another Body, a 2023 documentary following female students seeking justice after falling victim to nonconsensual deepfake pornography.