The Deepfake Epidemic — It’s Deeper Than Digital

By Gabriella Cavalcante

Four billion.

More than double India’s population, and 11 times that of the United States. That is how many views pornographic deepfake videos amassed in 2024 on the most popular websites. Behind every video is a face, and as the audience for these deepfakes continues to grow, so does the number of victims. It could be you, your sister, mother, neighbor, or a 17-year-old from Australia.

The number of deepfake videos is growing rapidly.


Chart: new videos cataloged annually, 2016 through 2023. 143,868 new videos were cataloged in 2023 alone.

Source My Image My Choice, “Deepfake Abuse: Landscape Analysis,” October 2023. Data covers cataloged nonconsensual deepfake videos through October 2023. The 2023 figure is partial-year (January through October).

Noelle Martin, a lawyer, Ph.D. candidate and advocate against deepfake abuse, was just 17 years old when she found sexually edited pictures of herself online. Martin’s selfies were taken from social media platforms and doctored. Her face had been lifted from her body and placed atop pornographic images of adult actresses.

“It started off as a very shocking, very unexpected, extremely violating experience,” Martin said. “Especially being a non-public figure, just you know, an ordinary person who found out about this.”

Martin discovered these doctored images in 2013 after using Google’s reverse image search, a feature that allows users to search for exact or related image matches across the web.

Gone are the days of meticulously cutting out pictures of women and attaching them onto the bodies of adult stars in magazines. Now, advancements in AI allow people to undress girlfriends, classmates or complete strangers from the comfort of their phone.

Celebrities, students, influencers and ordinary people are increasingly falling victim to this same exploitative practice. Regular pictures are taken, fed into AI technology that warps them into sexualized images and videos, then passed around forums and private messages.

“The scale of the problem is beyond anything I thought could happen,” Martin said. “There’s a whole ecosystem. There’s an industry; there’s economies. It’s profitable.”

Since her discovery, Martin has dedicated much of her life to fighting deepfakes. She has delivered speeches across the world, written a case study about online safety and served as an advisor for the Pathways to Digital Justice project with the World Economic Forum, among other work. As the years went by, what started as photoshopped images turned into sophisticated, hyper-realistic content. Martin believed it was punishment for shining a light on the developing issue.

“A target is put on your back,” she said. “They will continue to target you, and new people will find out, and new people will target you.”

Sexual exploitation through media is not new; some instances date back well before the first person pressed enter on a keyboard. But AI has made the practice more accessible to the masses.

The rapid improvement of deepfakes is having impacts that reach far beyond the screen.

Deepfake legislation is here — but is it enough?

“I think that the law moves slowly,” Assistant District Attorney Melissa Scannell said. “In this case, we were on the defensive. We were reacting.”

When Patrick Carey was arrested in 2021 after creating and publishing deepfakes of several women online, including his ex-classmates, the Nassau County District Attorney’s Office in New York found itself trying to fit a square peg into a round hole. At the time of his arrest, there was no law in New York targeting this kind of digital sexual exploitation. Scannell, along with Assistant District Attorney Jared Chester and others, spent months examining copious amounts of AI-generated pornography on Carey’s devices.

“[It] was wildly time consuming,” Scannell said. “Specifically for him [Carey], because of the volume of what he was creating, what he was posting, what he was collecting.”

Carey spent years collecting images from the women’s social media accounts, some of which were childhood photos, then used free apps to undress them. He boasted of his deeds on various discussion forums, named the victims he posted and encouraged other users to harass them.

“He was manipulating [images of] his classmates,” Chester said. “Showing photographs of them wearing braces and them in bathing suits… and manipulating them so they looked like they were engaged in sexual conduct.”

Carey was caught red-handed. His devices were littered with deepfake pornography of over a dozen women. But it was not until Scannell found an image of real child sexual abuse material (CSAM) that prosecutors felt justice was on the table. Carey pleaded guilty to multiple charges, was sentenced to six months of incarceration and 10 years of probation, and is now listed on the sex offender registry.

Carey was out of prison before legislation caught up. In 2023, nearly four years after he began creating deepfakes, the New York State Legislature passed its first deepfake law.

“Technology advances quickly,” Chester said. “And the law changes slowly.”

Legislation is not just slow-moving; it is also bound by state lines.

Like the hydra’s monstrous heads, deepfake websites keep coming back. Since 2024, the city attorney’s office in San Francisco has been locked in litigation against 16 deepfake websites to prohibit them from operating in California, whose laws bar the creation and sharing of nonconsensual deepfake pornography. The office does not want just the head, but the whole monster.

“Our lawsuit is really unique in the sense that it's trying to go after the websites themselves,” Deputy Press Secretary Alex Barrett-Shorter said. “That is the ideal. That this content isn't even created in California.”

Despite California’s strict laws against deepfakes, individuals who create and host websites dedicated to them were not deterred. In the first half of 2024, the 16 sites named in the lawsuit accumulated over 2 million views. Before the City Attorney’s Office could get these websites taken off the internet, it had to identify the figures behind the operations. Barrett-Shorter calls them “doe defendants”: secretive people or companies who hide in dark corners of the internet. As in the Carey case, the investigation was extremely time-consuming. Without putting names to these websites, justice was not on the table.

“To be honest, it was quite difficult,” Barrett-Shorter said. “It was not easy to figure out who owned these websites, and I think that was intentional, because what they were doing was wrong.”

Eventually, the office identified all of its doe defendants; some even opted to take their websites down entirely. But no matter how hard prosecutors hack at the hydra’s heads, their reach ends at the state line.

“It’s very pervasive,” she said. “Our lawsuit is trying to be part of the solution, but it certainly can’t solve the problem by itself… This is a worldwide issue.”

Federal law does not seem to be keeping a lid on the problem either. In 2025, the federal Take It Down Act followed the states in criminalizing nonconsensual deepfakes and mandated that social platforms remove reported deepfake content. Yet deepfakes have continued to proliferate: DeepStrike, a cybersecurity firm, projected that 8 million deepfake files would be shared in 2025 alone.

The bipartisan legislation has faced scrutiny from many sides for its reactionary nature.

“Once it’s out there, it’s out there,” Scannell said. “They can invoke it… but other people have it, and they're going to put it other places.”

After nearly a decade of fighting against deepfakes, Martin is tired of playing an endless game of whack-a-mole. Take-it-down policies, she says, should not be where advocacy ends.

“You're not going to tackle it in any meaningful way if you just try and take things down,” she said. “Because the problem doesn't stop. The problem continues.”

Since the Take It Down Act was signed into law, only one person has been convicted under it.

The people behind the deepfakes, and the cost of surviving them.

It is impossible to determine how many people have had their pictures turned into pornography.

Similar to Martin and the victims of Patrick Carey, people may not know they have been violated until they stumble across it. However, some groups are much more likely to be impacted. Pornography made from women’s stolen images comprises 99% of all deepfakes created, according to a study by Security Hero, a cybersecurity firm.

Young girls are being increasingly victimized by their male peers.

Many victims never find out who was responsible for the creation of their deepfakes. While Carey was identified by Nassau County detectives, Martin has yet to discover the people behind her deepfakes after nearly a decade. Initially she believed the deepfakes were created for sexual purposes, but now she is convinced their creators’ motives have changed.

“The motivation is deep power and control, and misogyny,” she said.

The anonymity the internet provides is a comfort for those who create deepfakes, but for survivors, the fear of the unknown often seeps into real life. Martin’s life changed the day she uncovered her deepfakes.

“When you live it, and you breathe it,” she said. “You realize… how much it impacts you in terms of employability, interpersonal relationships, your emotional, mental health.”

The impact of the sharing of nonconsensual images stretches far beyond a mere embarrassing moment. Survivors carry the weight of the digital violation through their everyday lives. Many suffer mental health effects similar to those of sexual assault victims, including anxiety, depression, post-traumatic stress disorder (PTSD) and thoughts of suicide. Martin experienced extreme paranoia when she first discovered her deepfakes. The uncertainty of who had created them ate at her, but she learned to compartmentalize that fear. She believes she had no other choice.

“If I'm worried about what's happening online and also in person, I couldn't function,” Martin said.

She is also worried that the never-ending stress has affected her life expectancy.

“The stress, you can’t even avoid it,” she said. “The stress of everything, it never ends.”

Martin’s fears are not unfounded. Chronic stress – a form of stress that is prolonged and has no end point – has been found to have serious physiological effects, some of which could contribute to a shortened life. Christine Guardino, Ph.D., a health psychology researcher and professor at Stony Brook University (SBU), has researched the havoc chronic stress can wreak on the body. It can exacerbate depression, anxiety and autoimmune conditions, all of which disproportionately affect women.

Autoimmune disease

Up to four in five Americans (roughly 80%) living with an autoimmune condition are women.

Source Stanford Medicine, “Why are women more prone to autoimmune diseases?” February 2024. Estimate reflects baseline gender disparity across diagnosed autoimmune conditions; not a measure of stress-related onset.

The immune system responds to stress in the same way it responds to sickness. When people with immune disorders are under chronic stress, the body responds by heightening immune activation, thus potentially leading to worsened autoimmune conditions. Guardino believes that women are overlooked when it comes to cardiovascular disease, another way chronic stress causes physical harm to the body.

“You’re running away from the lion,” she said. “That classic fight or flight response. But when that response is repeatedly activated, or when it’s turned on for too long and doesn’t turn off, it results in wear and tear on the cardiovascular system that then increases risk of cardiovascular disease.”

Martin has not let her concerns of a shortened life stop her from living. While advocating for justice against deepfakes, she was admitted as a lawyer in 2020 and is currently a Ph.D. candidate at the University of Western Australia Law School. She also says her love of solo traveling has given her the strength she needs to keep up the fight against deepfakes.

“You find comfort in your own solitude,” she said. “You also find strength in it. Fighting this requires sometimes being alone in the fight or experiencing things in a quite lonely way.”

Deepfakes, meet the real world.

Women’s sexuality has served as the origin of many well-known websites. Before Facebook, there was Facemash, a website created by a college-age Mark Zuckerberg to rank women based on their attractiveness. Google Images was developed after Jennifer Lopez’s revealing, deep-plunge dress became the most searched query Google had seen. The term “deepfake” was coined on Reddit, a popular discussion forum, when a user of that name used AI to place female celebrities’ faces onto pornographic content.

In his research, Matthew Salzano, a Stony Brook University assistant professor with a Ph.D. in Communication, found that real world abuses can be recreated online. He believes the origins of Google Images serve as a key example of this.

“The fact that this image of Jennifer Lopez created such a now vital part of Google’s search infrastructure is a testament to the way that our media replicate our sort of societal norms and biases and systems of oppression,” Salzano said.

Women are becoming increasingly anxious about falling victim to deepfakes, but Salzano says it is the men who make deepfakes whose behavior on the internet will change most.

“The group it [deepfakes] probably changes most is incels, or sexually aggrieved men online, who feel that they have a right to women's bodies that they're not getting to exercise,” he said.

Incels, or involuntary celibates, are men who blame women for their own inability to find romantic or sexual partners. They gravitate toward relatively anonymous online spaces, like the “doe defendants” in the San Francisco litigation, to express their grievances. These men are a smaller part of the manosphere, a vast online ecosystem that claims to promote the wellbeing of men, often at women’s expense. The manosphere is most associated with figures like Andrew Tate, a self-proclaimed misogynist whose ideology aligns with incel beliefs and, increasingly, appeals to young men. His online content promotes financial, social and sexual domination over women, all of which the creation of deepfakes can serve.

Telegram, a messaging app known for its high level of anonymity, has become a hot spot for deepfake distribution and other illegal activity. In a six-week investigation, AI Forensics, a technology-focused investigative nonprofit, analyzed 2.8 million messages across 16 groups and channels and found that thousands of mostly young men were using the app to sell deepfakes, dox women and promote hacking services. The same misogynistic, violent language espoused by incels and members of the manosphere was rampant in these communities.

While men use online spaces like Telegram as an outlet for misogyny and criminal behavior, these acts are not confined to the screen. Like the manosphere and its mascots, deepfakes are becoming more popular among school-aged boys. The Center for Democracy & Technology (CDT) found that 15% of students – roughly 2.3 million young people – knew of deepfakes depicting someone in their school.

“We see this happening in high schools and schools all the time,” Assistant District Attorney Jared Chester said. “It's very easy to type into a computer and say ‘Hey make this person naked’ and then send it to their friends and be like, ‘Look what I did.’”

Chester believes the rapid advancement of this technology has accelerated its normalization. Creating deepfakes used to require legwork to seek out capable software. Now, deepfake services are advertised on most major porn sites.

“Now, it's so easy that literally anyone can go on and type in, ‘Hey, create this image for me, create this pornography, create this naked stuff,’” he said. “And that makes it so people think ease means it's not illegal or it's not a problem.”

Going forward.

It is safe to say generative AI is not going anywhere. Data centers are popping up across the world, an estimated $2.5 trillion will be spent on AI in 2026, and deepfakes are as popular as ever.

AI spending

Worldwide AI Spending by Market, 2025-2027.

Market 2025 2026 2027
AI Services $439,438,000,000 $588,645,000,000 $761,042,000,000
AI Cybersecurity $25,920,000,000 $51,347,000,000 $85,997,000,000
AI Software $283,136,000,000 $452,458,000,000 $636,146,000,000
AI Models $14,416,000,000 $26,380,000,000 $43,449,000,000
AI Platforms for Data Science and Machine Learning $21,868,000,000 $31,120,000,000 $44,482,000,000
AI Application Development Platforms $6,587,000,000 $8,416,000,000 $10,922,000,000
AI Data $827,000,000 $3,119,000,000 $6,440,000,000
AI Infrastructure $964,960,000,000 $1,366,360,000,000 $1,748,212,000,000
Total AI Spending $1,757,152,000,000 $2,527,845,000,000 $3,336,690,000,000


Source Gartner, Worldwide AI Spending by Market, 2025-2027, January 2026. Original figures were reported in millions of U.S. dollars; table values are shown here as actual U.S. dollars.

The Nassau County District Attorney’s Office is bracing for impact. They have updated their internet safety program, S.T.O.P. Then Send, to include discussions about AI and the risks of posting online.

“It's another way that we get out and meet children and people where they are,” Deputy Communications Director Nicole Turso said. “To be able to speak to them in a language they understand, that's appropriate for their grade level to understand what internet safety really looks like.”

Some have chosen to use deepfake technology to their advantage. In a Wired article, adult content creators divulged they have signed their likeness over to AI companion companies like OhChat, Joi AI and SinfulX AI. Their digital doppelgangers are used to create whatever content paying customers desire, barring illegal activities.

As more and more people create deepfakes, the struggle over the truth has only increased.

“The hardest part is the erosion of truth,” Chester said. “You don't know what's real, and that's why the harm to the victims is so real.”

Battles against this digital sexual exploitation are being fought online, in the courtroom and beyond. New laws are being created, and old ones amended. States like California are bringing the fight to deepfakes. Since Martin first discovered her own deepfakes, she has spent over a decade turning her own violation into advocacy, research and law.

However, despite her degrees and awards, Martin believes her biggest achievement to date is being alive.