House Passes Take It Down Act to Combat Revenge Porn and Deepfakes

by Bridget Luckey | Apr 29, 2025

The House of Representatives passed bipartisan legislation Monday to criminalize the nonconsensual sharing of sexually explicit images and videos, including AI-generated deepfakes, and to require online platforms to remove such content. Known as the Take It Down Act, the bill passed by a vote of 409 to 2 and now heads to President Trump, who is expected to sign it into law.

The legislation seeks to combat "revenge porn" by mandating that social media companies and other online platforms take down explicit images within 48 hours of being notified. It marks the first significant internet content law to clear Congress since 2018, when lawmakers passed a bill aimed at curbing online sex trafficking.

Introduced by Senators Ted Cruz (R-Texas) and Amy Klobuchar (D-Minnesota), the Take It Down Act brought together a rare bipartisan coalition. The bill passed the Senate unanimously in February and gained momentum after President Trump and First Lady Melania Trump expressed support, with Mrs. Trump citing her focus on youth mental health and online bullying.

Senator Klobuchar emphasized the devastating effects of nonconsensual image sharing on victims, particularly teenagers. "This was one of the first times that we’ve actually gotten something done on consumer tech issues that is meaningful," she said.

Lawmakers from across the political spectrum described the urgent need to address the proliferation of deepfake technology and the harmful use of sexually explicit content online. Representative María Elvira Salazar (R-Florida), who introduced a companion bill in the House, highlighted the growing trend of young girls being victimized through manipulated images circulated among peers.

Although every state except South Carolina has laws against revenge porn and at least 20 states have laws addressing sexually explicit deepfakes, the federal legislation creates a uniform national standard and places clear obligations on tech platforms.

The law's passage reflects mounting frustration in Congress with large technology companies such as Meta (parent of Facebook and Instagram), X (formerly Twitter), and TikTok over their perceived failure to protect users, especially minors, from online harms. At a January 2024 congressional hearing, tech executives faced intense scrutiny from lawmakers, and Meta CEO Mark Zuckerberg publicly apologized to families affected by online exploitation.

Despite broad support, some civil liberties advocates have raised concerns about the law's potential impact on free expression. Becca Branum of the Center for Democracy and Technology warned that the measure could lead to "weaponized enforcement" and the suppression of legitimate content.

Republican representatives Eric Burlison of Missouri and Thomas Massie of Kentucky voted against the bill. In a post on social media, Mr. Massie expressed concern that the legislation represented a “slippery slope” that could be “ripe for abuse” and lead to “unintended consequences.”

Nevertheless, supporters argue that the Take It Down Act is a crucial first step in regulating online harms and protecting vulnerable users from image-based sexual abuse in an increasingly digitized world. The bill's overwhelming passage signals growing bipartisan willingness to impose new responsibilities on internet platforms after decades of largely unregulated growth.


Bridget Luckey
Bridget studied Communications and Marketing at California State University, Long Beach. She also has experience in the live music events industry, which has allowed her to travel to festivals around the world. During this period, she acquired valuable expertise in branding, marketing, event planning, and public relations.
