Nonconsensual AI Porn Will Make Victims Out of All Of Us: Can The Law Keep Up?

2025/12/20 05:43
11 min read
This article was copublished with The 19th, a nonprofit newsroom covering gender, politics, and policy. Sign up for The 19th’s newsletter here.

More than two dozen students at Westfield High School in New Jersey were horrified last year to learn that naked images of them were circulating among their peers. According to the school, some students had used artificial intelligence (AI) to create pornographic images of others from original photos. And they’re not the only teenage girls being victimized by fake nude photos: Students in Washington State and Canada have also reported facing similar situations as the ability to realistically alter photos becomes more broadly accessible through websites and apps.

The growing alarm around deepfakes—AI-generated images or videos—was amplified even further in January, as one involving the superstar Taylor Swift spread quickly through social media.

Carrie Goldberg, a lawyer who has been representing victims of nonconsensual porn—commonly referred to as revenge porn—for more than a decade, said she only started hearing from victims of computer-generated images more recently.

“My firm has been seeing victims of deepfakes for probably about five years now, and it’s mostly been celebrities,” Goldberg said. “Now, it’s becoming children doing it to children to be mean. It’s probably really underreported because victims might not know that there’s legal recourse, and it’s not entirely clear in all cases whether there is.”

Governing bodies are trying to catch up. In the past year or so, 10 states have passed legislation to criminalize the creation or dissemination of deepfakes specifically. These states—including California, Florida, Georgia, Hawaii, Illinois, Minnesota, New York, South Dakota, Texas and Virginia—outlined penalties ranging from fines to jail time. Indiana is likely to soon join the growing list by expanding its current law on nonconsensual porn.

Indiana Rep. Sharon Negele, a Republican, authored the proposed expansion. The existing law defines “revenge porn” as disclosing an intimate image, such as any that depict sexual intercourse, uncovered genitals, buttocks or a woman’s breast, without the consent of the individual depicted in the image. Negele’s proposed bill passed through both chambers and is now awaiting the governor’s signature.

Negele said she was motivated to update Indiana’s criminal code when she heard the story of a high school teacher who discovered that some of her students had disseminated deepfake images of her. It was “incredibly destructive” to the teacher’s personal life, and Negele was surprised to see that the perpetrators could not be prosecuted under current law.

“It started with my education of understanding the technology that is now available and reading about incident after incident of people’s faces being attached to a made-up body that looks incredibly real and realistic,” Negele said. “It’s just distressing. Being a mom and a grandmother and thinking about what could happen to my family and myself—it’s shocking. We’ve got to get ahead of this kind of stuff.”

Goldberg, whose law firm specializes in sex crimes, said she anticipates more states will continue expanding their existing legislation to include AI language.

“Ten years ago, only three states had revenge porn or image-based sexual abuse laws,” Goldberg said. “Now, 48 states have outlawed revenge porn, and it has really created a tremendous reduction in revenge porn—not surprisingly—just as we advocates had said it would. The whole rise of deepfakes has filled in the gaps as being a new way to sexually humiliate somebody.”

In 2023, more than 143,000 new AI-generated videos were posted online, according to The Associated Press. That’s a huge jump from 2019, when “nudify” websites and applications were less commonplace; even then, nearly 15,000 of these fake videos were online, according to a report from Deeptrace Labs, a visual threat intelligence company. Those videos—96 percent of which featured nonconsensual pornography of women—had already garnered over 100 million views.

Goldberg said policymakers and the public alike seem to be more motivated to ban AI-generated nude images specifically because virtually anyone can be a victim. There’s more empathy.

“With revenge porn, in the first wave of discussions, everyone was blaming the victim and making them seem like they were some sort of pervert for taking the image or stupid for sharing it with another person,” Goldberg said. “With deepfakes, you can’t really blame the victim because the only thing they did was have a body.”

Amanda Manyame, a South Africa-based digital rights advisor for Equality Now, an international human rights organization focused on helping women and girls, said that there are virtually no protections for victims of deepfakes in the United States. Manyame studies policies and laws around the world, analyzes what’s working and provides legal advice around digital rights, particularly on tech-facilitated sexual exploitation and abuse.

“The biggest gap is that the U.S. doesn’t have federal law,” Manyame said. “The challenge is that the issue is governed state by state, and naturally, there’s no uniformity or coordination when it comes to protections.”

There is, however, currently a push on Capitol Hill: A bipartisan group of senators in January introduced the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024—also known as the DEFIANCE Act. The proposed legislation aims to stop the proliferation of nonconsensual, sexually explicit content.

“Nobody—neither celebrities nor ordinary Americans—should ever have to find themselves featured in AI pornography,” Republican Sen. Josh Hawley, a co-sponsor of the bill, said in a statement. “Innocent people have a right to defend their reputations and hold perpetrators accountable in court.” Rep. Alexandria Ocasio-Cortez has introduced a partner bill in the House.

According to new polling from Data for Progress, 85 percent of likely voters across the political spectrum said they support the proposed DEFIANCE Act—with 72 percent of women in strong support compared to 62 percent of men.

But younger men are more likely to oppose the DEFIANCE Act, with about one in five men under 45 (22 percent) saying they strongly or somewhat oppose legislation allowing subjects of explicit nonconsensual deepfakes to sue the creator.

Danielle Deiseroth, executive director of Data for Progress, said this issue showed one of the “more sharp contrasts” between young men and women that she’s seen in a while.

“We can confidently say that women and men under 45 have diverging opinions on this policy,” Deiseroth said. “This is an issue that disproportionately impacts women, especially young women, who are more likely to be victims of revenge porn. And I think that’s really the root cause here.”

Goldberg said that creating policies to criminalize bad actors is a good start but is ultimately insufficient. A good next step, she said, would be to take legal action targeting the online distributors, like the App Store and Google Play, that are providing products primarily used for criminal activities. Social media platforms and instant messaging apps, where these explicit images are distributed, should also be held accountable, Goldberg added.

The founders of #MyImageMyChoice, a grassroots organization working to help victims of intimate image abuse, agreed that more should be done by private companies involved in the creation and distribution of these images.

The founders—Sophie Compton, Reuben Hamlyn and Elizabeth Woodward—pointed out that search engines like Google drive most of the total web traffic to deepfake porn sites, while credit card companies process their payments. Internet service providers let people access them, while major services like Amazon, Cloudflare, and Microsoft’s GitHub host them. And social media sites like X allow the content to circulate at scale.

Google changed its policy in 2015 and started allowing victims to submit a request to remove individual pieces of content from search results and has since expanded the policy to deepfake abuse. However, the company does not systematically delist image-based sexual violence and deepfake abuse sites.

“Tech companies have the power to block, de-index or refuse service to these sites—sites whose entire existence is built on violating consent and profiting from trauma,” Compton, Hamlyn and Woodward said in a statement to The 19th. “But they have chosen not to.”

Goldberg pointed to the speed at which the Taylor Swift deepfakes spread. One image shared on X, formerly known as Twitter, was viewed 47 million times before the account that posted it was suspended. Images continued to spread despite efforts from the social media companies to remove them.

“The violent, misogynistic pictures of Taylor Swift, bloody and naked at a Kansas City Chiefs football game, is emblematic of the problem,” Goldberg said. “The extent of that distribution, including on really mainstream sites, sends a message to everybody that it’s okay to create this content. To me, that was a really pivotal and quite frightening moment.”

Given the high-profile nature of the victim, the incident sparked pronounced and widespread outrage from Swift’s fans and brought public attention to the issue. Goldberg said she checked to see whether any of the online distributors had removed products from their online stores that make it easier and cheaper to create sexually explicit deepfakes—and she was relieved to see they had.

As the country’s policymakers and courts continue to try to respond to the quickly developing and increasingly accessible technology, Goldberg said it’s important that lawmakers continue deferring to experts and those who work directly with victims, such as lawyers, social workers and advocates. Otherwise, she added, lawmakers regulating abstract ideas or rapidly advancing technologies risk creating a “recipe for disaster.”

Manyame also emphasized the importance of speaking directly to survivors when making policy decisions, but added that lawmakers need to think more holistically about the problem and not get too bogged down in the specific technology—at the risk of always being behind. For example, Manyame said the general public is only now beginning to understand the risks posed by AI and deepfakes—risks she helped document in a report back in 2021. Looking ahead, Manyame is already thinking about the Metaverse—a virtual reality space—where users are starting to reckon with instances of rape, sexual harassment and abuse.

“A lot of the laws around image-based sexual abuse are a little bit dated because they speak about revenge porn specifically,” Manyame said. “Revenge porn has historically been more of a domestic violence issue, in that it is an intimate partner sharing a sexually exploitative image of their former or existing partner. That’s not always the case with deepfakes, so these laws might not provide enough protections.”

In addition, Manyame argued that many of these policies fail to broaden the definition of “intimate image” to consider diverse cultural or religious backgrounds. For some Muslim women, for instance, it might be just as violating and humiliating to create and disseminate images of their uncovered head without a hijab.

When it comes to solutions, Manyame pointed to actions that can be taken by the app creators, platform regulators and lawmakers.

At the design phase, more safety measures can be embedded to limit harm. For example, Manyame said there are some apps that can take photos of women and automatically remove their clothing while that same function doesn’t work on photos of men. There are ways on the back end of these apps to make it harder to remove clothes from anyone, regardless of their gender.

Once the nefarious deepfakes are already created and posted, however, Manyame said the social media and messaging platforms should have better mechanisms in place to remove the content after victims report it. Many times, individual victims are ignored. Manyame said she’s noticed these large social media companies are more likely to remove these deepfakes in countries, such as Australia, that have third-party regulators to advocate on behalf of victims.

“There needs to be monitoring and enforcement mechanisms included in any solution,” Manyame said. “One of the things that we hear from a lot of survivors is they just want their image to be taken down. It’s not even about going through a legal process. They just want that content gone.”

Manyame said it’s not too big of an ask for many tech companies and government regulators because many already respond quickly to remove inappropriate photos involving children. It’s just a matter of extending those kinds of protections to women, she added.

“My concern is that there’s been a rush to implement A.I. laws and policies without considering what some of the root causes of these harms are. It’s a layered problem, and there’s many other layers that need to be tackled.”


Credits

  • Mariel Padilla for The 19th

Illustration

  • Rena Li

Editing

  • Flora Peir for The 19th

Also published here

Photo by Steve Johnson on Unsplash

