President Donald Trump has officially signed the Take It Down Act into law, which criminalizes the unauthorized sharing of intimate images, including AI-generated deepfakes. The law mandates that social media platforms swiftly remove such content upon notification.
The bill received broad support from technology firms and parent and youth advocacy groups, along with an endorsement from first lady Melania Trump. However, some critics, including activists who work to combat nonconsensual image distribution, have raised concerns that the bill’s implementation could inadvertently harm the survivors it aims to assist.
Under the new law, the dissemination of nonconsensual intimate images — whether authentic or generated by artificial intelligence — can lead to penalties of up to three years in prison and potential fines. Additionally, social media platforms must establish processes to ensure the removal of such content within 48 hours of receiving complaints and are required to make “reasonable efforts” to eliminate duplicate copies. The Federal Trade Commission (FTC) will oversee enforcement of these regulations, granting companies one year to comply.
“I’m going to use that bill for myself, too”
Had this legislation been introduced under a different administration, it would likely have faced significant opposition from organizations such as the Electronic Frontier Foundation (EFF) and the Center for Democracy and Technology (CDT). These groups caution that the law’s takedown clause could lead to censorship of a broader range of content and pose risks to privacy-enhancing technologies such as encryption, since services relying on encryption cannot monitor or remove conversations between users. Comments made by Trump during the early days of his presidency have intensified fears that the law might be misused against political opponents. In a recent congressional speech, he stated, “I’m going to use that bill for myself, too, if you don’t mind, because nobody gets treated worse than I do online. Nobody.”
The Cyber Civil Rights Initiative (CCRI), an organization advocating against image-based abuse, has supported the criminalization of nonconsensual intimate image distribution. However, the CCRI has expressed reservations about the Take It Down Act, arguing that it could give false hope to survivors. In a post on Bluesky, CCRI President Mary Anne Franks described the takedown provision as a “poison pill” that may ultimately harm victims more than it helps.
Franks elaborated that platforms confident in their alignment with the current administration—such as those “unlikely to be targeted by the FTC”—may disregard legitimate reports of nonconsensual distribution. The law could also invite an influx of false complaints that disrupts platforms’ ability to function effectively. “It’s going to be a year-long process,” she warned in an interview with Technology News. “The FTC will become selective in how it addresses complaints regarding non-compliance. The focus won’t be on empowering individuals depicted in these images to have their content taken down.”
During the bill’s signing ceremony, Trump brushed aside concerns regarding its potential First Amendment implications. “People talked about all sorts of First Amendment, Second Amendment… they talked about any amendment they could make up, and we got it through,” he remarked.
According to Becca Branum, deputy director of CDT’s Free Expression Project, immediate legal challenges to the law’s more contentious provisions may not arise because of its vague language. She said it may be difficult for courts to determine when enforcement would be unconstitutional before platforms are required to implement the rules. Ultimately, users could pursue litigation if their lawful content is wrongfully removed, or companies might seek court intervention if the FTC imposes penalties for noncompliance; much will hinge on how quickly enforcement begins.