“I was the target of an AI nudes scam, and it could happen to you too – this is what everyone needs to know”
By Becca Caddy
When technology journalist Becca Caddy received an email containing AI-generated naked photos of herself, she was shaken. She shares her story of what happened next, along with helpful information about this increasingly common type of scam.
I’ve used trending AI photo apps before – to see what I’d look like at 90, if I switched genders, if I was a fairy princess – and it was always lighthearted fun. But nothing had prepared me for seeing my face on an AI-generated naked body in a scam email.
It was a Sunday afternoon and I was making myself a cup of tea. Like many work-obsessed millennials circling burnout, I opened my emails while waiting for the kettle to boil. I saw a message from an unfamiliar email address with a subject line that read: Becca – Photoshoot. I’m a tech journalist, and I’d recently filmed some content for a brand, so I opened it, assuming it was related to that.
The first thing I saw was a photo of myself that I recognised from a few years ago. It’s one that’s been used by online publications I’ve written for, as well as on my literary agent’s website. But something was off.
I scrolled further and saw that it looked like I was either wearing a tiny vest top or nothing at all. This struck me as bizarre: it’s a photo I know well, and I was aware that I was wearing a long-sleeved black top when it was taken. So why was I now seeing bare shoulders?
I continued scrolling and saw another photo. It’s a recent favourite, taken by my mum when we were having a festive hot chocolate at a local coffee shop after getting a Christmas tree. I’m smiling, holding a big pink mug up to my face, and wearing my favourite blue fleece. I’d shared it on Instagram Stories and Facebook shortly after it was taken.
But as I scrolled, I saw that instead of my blue fleece, there was skin. I wasn’t wearing anything at all. Instead, my head was attached to a body with big breasts and a tiny, distorted waist.
There was a follow-up email, sent from the same address. It said that the images in the previous email would be sent to my friends and family on Facebook, to my LinkedIn contacts and to men in my industry and city. It warned me of what this would do to my personal and professional life and my mental health. It said I had 12 hours to send Bitcoin to the provided address to prevent the images from being shared.
I couldn’t wrap my head around what I was seeing and reading. I wanted to cry at seeing my likeness violated this way. I wanted to hide because, coupled with the photos, the threats felt far more menacing than those in a typical scam email. I also wanted to laugh at the absurdity of it all – definitely my face, absolutely not my body.
It felt like a scam that I should just delete. But I had flashes of fear that the images, or similar ones, might be shared before I’d had a chance to tell anyone about them. So, just in case, I decided to talk to close friends and family.
I then shared the email – and the images – on Instagram Stories and X (formerly Twitter). I wanted to take the power away from the scammer and show other people that this can happen. The more I’ve learned about other people’s experiences since, the happier I am that I shared and talked about it right away.
Scam emails and altered images are nothing new. But I still struggled to describe to friends, family and the police what had happened. I spoke to Clare McGlynn, a professor of law at Durham University specialising in sexual violence and online abuse. How would she describe what happened to me?
“The term ‘revenge porn’ is not right and really problematic,” McGlynn tells Stylist. “It’s a well-known term, but it doesn’t even cover this type of conduct – it’s not a malicious ex-partner sharing something for ‘revenge’.”
That’s why, McGlynn says, she has worked with colleagues to develop the term ‘image-based sexual abuse’. Other terms she suggests include intimate image abuse, deepfake sexual abuse and AI sextortion.
It’s important we know how to talk about these incidents, because similar cases are on the rise, and all kinds of people are being targeted – from celebrities to children. In January 2024, many non-consensual deepfake images of Taylor Swift were shared online.
But McGlynn tells me this problem is everywhere. It’s a growing issue in schools, where other children and teens are making deepfake images to humiliate and scam their peers. “Young boys are being targeted with AI nudes of them from social media and are being threatened,” McGlynn says. “Often for small amounts of money, so it’s realistic that a teenager could transfer the funds.”
It’s difficult to know how prevalent this kind of deepfake sextortion scam is. McGlynn says there are probably many cases like mine that go under the radar because people don’t want to talk about it.
Non-consensual deepfakes are becoming more common as they become easier to fabricate. Whereas creating realistic-looking fakes once required photo-editing software and a degree of skill, AI is now built into easy-to-use tools.
People are already familiar with many of the popular image apps, and while some AI tools prevent users from making explicit content, not all of them do. Some non-consensual deepfake porn apps, known as ‘nudify’ apps, are easy to access and designed specifically for making explicit content and ‘stripping’ the clothes from photos.
Kathryn Goldby, helpline manager at The Cyber Helpline, explains that there is likely a person behind these crimes, but that ‘bots’ capable of crawling for images and sending extortion messages could soon let scammers extend their reach. How can the law keep up?
Laws currently in place can deal with phishing scams and blackmail, but only recently have laws been pushed through in the UK that apply to deepfakes. Earlier this year, several of the biggest deepfake pornography websites were blocked from the UK. It’s now also a criminal offence for someone to create a sexually explicit, non-consensual deepfake.
These are positive moves, but some campaigners and researchers worry they are only first steps and that the law still isn’t comprehensive enough. “The criminal and civil law should be amended to clearly prohibit creating and sharing deepfakes,” McGlynn says.
“Platforms need to do more. The ‘nudify’ apps, for example, are advertised on Instagram and X. People are accessing them through TikTok. Google returns deepfake porn websites and nudify apps at the top of a search for ‘deepfake porn’,” McGlynn tells me. Tech companies may not be making deepfake tools, but they are giving people a way to discover and use them.
People have asked me if I’ll be sharing less online to protect myself. But the most explicit deepfake of me was fabricated from a photograph of me sitting in a coffee shop drinking hot chocolate. Why should I have to stop posting these happy moments?
Deepfake sexual abuse is often used to silence women
“In reality, there is no way to protect ourselves,” McGlynn says. “Almost everyone has a social media presence of some sort and many cases of deepfake sexual abuse and the use of nudify apps are by friends and followers.” She also explains that it’s important we don’t place the responsibility for these forms of abuse on victims.
It’s a personal choice, of course, but I don’t want to hide. Especially as McGlynn tells me that’s sometimes the primary goal. “Deepfake sexual abuse is often an attempt to silence women,” she says. “To push us out of public life.”
If you’re affected by a similar scam, there are recommended steps you can take. I asked Charlotte Hooper, operations director at The Cyber Helpline, what the best plan of action is.
“If this happens to a child, immediately report to the police, to Internet Watch Foundation (IWF) and report the content on the website [that it appears on],” she says. “But be careful of gathering evidence, eg saving screenshots or downloading the content, as this is an offence in and of itself.”
If it happens to an adult, report it to the police or Action Fraud and contact a helpline like the Revenge Porn Helpline and The Cyber Helpline for guidance. As with all scams, the key is to try and remain as calm as possible. “Seek help and try your best not to panic. Receiving these emails might seem terrifying, a threat to your privacy and safety, and violating. However, remember that support is out there and you are not alone,” Hooper says.
Hooper also advises against paying the ransom. “Criminals will often continue threats after the ransom is paid or add you to a database of individuals who have paid in the past and target you again,” she says. “Paying doesn’t necessarily prevent them from posting the images either.”
Finally, remember you’re not alone. McGlynn tells me that people who have had sexually explicit deepfakes made from images of themselves struggle in the same way. Know that your feelings are real and valid.
Helpful resources:
- The Cyber Helpline provides free, expert help and advice to people targeted by online crime and harm in the UK and USA.
- The Internet Watch Foundation (IWF) helps identify and remove online child sexual abuse imagery globally.
- Revenge Porn Helpline is a UK service supporting adults (aged 18+) who experience intimate image abuse.
Images: courtesy of Becca Caddy