Congress has passed a bill that forces tech companies to take action against certain deepfakes and revenge porn posted on their platforms.
In a 409-2 vote on Monday, the U.S. House of Representatives passed the "Take It Down" Act, which has received bipartisan support. The bill also received vocal support from celebrities and First Lady Melania Trump. The bill already passed the Senate in a vote last month.
The Take It Down Act will now be sent to President Donald Trump, who is expected to sign it into law.
First introduced by Republican Senator Ted Cruz and Democratic Senator Amy Klobuchar in 2024, the Take It Down Act would require that tech companies take quick action against nonconsensual intimate imagery. Platforms would be required to remove such content within 48 hours of a takedown request. The Federal Trade Commission could then sue platforms that do not comply with such requests.
In addition to targeting tech platforms, the Take It Down Act also carves out punishments, which include fines and potential jail time, for those who create and share such imagery. The new law would make it a federal crime to publish — or even threaten to publish — explicit nonconsensual images, which would include revenge porn and deepfake imagery generated with AI.
Digital rights groups have shared their concerns regarding the Take It Down Act. Activists have said that the bill could be weaponized to censor legally protected speech, and that legal content could be inaccurately flagged for removal.
Despite these concerns, the Take It Down Act even received support from the tech platforms it seeks to police, such as Snapchat and Roblox.
Congress isn't finished addressing AI and deepfakes this year either. Both the NO FAKES Act of 2025 and the Content Origin Protection and Integrity from Edited and Deepfaked Media Act of 2025 have also been introduced this session. The former seeks to protect individuals from having their voice replicated by AI without their consent, whereas the latter looks to protect original works and require transparency around AI-generated content.