Deepfake technology endangers us all

Comment

23 May 2024

It's not just politicians and global celebrities who suffer the consequences of AI-generated images: they can affect anyone with a social media presence.

By Sarah Manavis

The past year has been a wake-up call about the prevalence and sophistication of deepfakes. Be it the fake porn created using Taylor Swift’s likeness that spread across social media, or deepfake audio of Sadiq Khan speaking about the war in Gaza, AI-generated content is becoming more convincing – and dangerous. In what looks to be an election year in both the US and the UK, the threat such images pose to our democracy feels more tangible than ever (deepfakes of Joe Biden and Donald Trump are everywhere – both Rishi Sunak and Keir Starmer have already been targeted).

Politicians and global celebrities are the people we most often describe as being at risk from deepfakes. But another demographic is being targeted more than any other: social media influencers, particularly women. When the social media agency Twicsy surveyed more than 22,000 influencer accounts on Twitch, TikTok, Instagram, YouTube and Twitter/X in March, it found that 84 per cent had been the victim of deepfake pornography at least once (89 per cent of the deepfakes found were of female influencers). These weren’t small accounts – each had a five-figure follower count. And in the space of just one month, some of these deepfakes had received more than 100 million views.

Influencers make good subjects for deepfake technology. They upload thousands of images and videos of themselves in short spaces of time, often from multiple angles (you only need one high-quality image to create a convincing deepfake). They speak in similar cadences to one another to fit algorithmic trends, meaning their voices can be straightforwardly mimicked. They might use filters that leave them looking smoother and more cyborg-esque than any person you’d encounter in real life. And there is a litany of apps available for anyone to download to create deepfakes – such as HeyGen and ElevenLabs – which only require users to upload a small number of images in order to make something that looks very real.

Influencers also make easier targets than your average celebrity. While there is also a wealth of images of, say, pop stars and athletes, these public figures typically have the money and resources to be litigious about deepfakes. Influencers, by comparison, have limited means to do anything about the videos and images created using their likeness. Platforms, too, are far more likely to respond to celebrity deepfakes than to those of less famous individuals. When the pornographic deepfakes of Swift went viral on Twitter earlier this year, the site blocked all searches of her name, stemming the spread almost immediately. It’s difficult to imagine the reaction would have been the same for an influencer with only a few thousand followers.

Deepfake pornography is likely the most concerning problem for famous women. But this technology can be used for many other nefarious purposes beyond the creation of humiliating sexual content. Influencers’ likenesses are now increasingly used to create fake advertisements to sell dodgy products – such as erectile dysfunction supplements – and to push propagandist disinformation, such as the deepfake of a Ukrainian influencer praising Russia.

Even beyond using deepfakes of already-popular influencers to make ads they never agreed to, we are starting to see how – via a scrapbook of images of multiple media figures – tech entrepreneurs can build entirely new fake influencers, generated wholesale by AI. These accounts are full of hyper-realistic, computer-generated images in which the fake influencers discuss their fake hobbies and share their fake personality quirks, all while securing very real and lucrative brand deals. Some have gained hundreds of thousands of followers and generate thousands for their male creators every month. Entrepreneurs can also fabricate deepfake influencers who embody sexist stereotypes of the “perfect woman” to appeal directly to male audiences – fabrications that could become more popular than the real, human influencers whose likenesses were used to make them.

Of course, this affects many more people than just social media influencers: it threatens the livelihoods of anyone who does creative work, be they musicians, artists, actors or writers. Just this week, the actor Scarlett Johansson claimed that OpenAI, the tech organisation behind ChatGPT, asked to use her voice for a chatbot and, after she declined, mimicked her voice anyway. OpenAI pulled the voice but claimed it was not an imitation of Johansson.

It’s easy to vilify influencers as shallow and attention-seeking, and as promoters of over-consumption and narrow beauty standards. But this trend shows us the danger deepfakes (and other technologies that can be used misogynistically) present to all of us – especially women. Anyone who has shared an image of themselves online is now at risk of having a deepfake made of them by anyone with malicious intent and an internet connection. If there is any digital representation of you online – an image as common as a Facebook profile picture, a professional headshot on LinkedIn, a video, even your voice or your written work – then you are susceptible. This reality should make plain why deepfake technology needs immediate legislation: holistic, wide-reaching laws that address the risks deepfakes pose to all of us.
