Taylor Swift may be the most famous victim of AI-enhanced pornography, but the creators of these nude “deepfakes” have spread this vile and terrifying new form of online abuse across the country. The Washington State Legislature enacted protections earlier this year, but Congress needs to act.
Those targeted – primarily women and teenage girls – have little recourse in many parts of the country. In Swift’s case, the image was viewed 47 million times on X, the site formerly known as Twitter, before being removed, according to The Guardian.
The Senate Commerce Committee, chaired by Sen. Maria Cantwell (D-WA), is considering legislation that would make these deepfakes a federal crime and assure victims that such content will be removed quickly from the internet. Congress should act swiftly to pass it.
Washington state is one of at least 14 states that already impose penalties for AI-generated deepfakes. Earlier this year, Caroline Mallett, daughter of state Senator Mark Mallett, bravely testified about this deeply disturbing, but increasingly common, trend: One of her classmates took photos of girls at homecoming and distributed fake images that had been digitally manipulated with an AI app to include nudity. Lawmakers voted unanimously to treat possession of these images the same as possession of child pornography under state law, while also creating a way for victims to sue creators and publishers in court.
But the internet doesn’t stop at state lines, and a federal law is the only way to ensure uniform protection for victims of this humiliating new form of online exploitation in all 50 states and U.S. territories. Under a bill pending in Congress called the TAKE IT DOWN Act, publishers would also be required to remove images or face penalties from the Federal Trade Commission.
The bill – formally the “Tools to Address Known Exploitation by Immobilizing Technological Deepfakes On Websites and Networks Act” – would do two things. First, it would make AI-generated fake nudes punishable by prison time: up to two years if the victim is an adult and three years if the victim is a minor. Second, it would require publishers, whether small website operators or large social media companies like Meta, to remove such images within 48 hours of being contacted by the victim.
Senator Cantwell will have the opportunity to advance the bill through the Senate Commerce Committee. While she is not the lead sponsor, the effort is bipartisan, with 10 Republicans and seven Democrats signing on. Cantwell has been a vocal advocate for digital privacy protections for Americans, and recently told the Editorial Board that she supports the bill.
The Editorial Board also supports comprehensive digital privacy protections in a bill that Cantwell introduced with Rep. Cathy McMorris Rodgers (R-Spokane) earlier this year.
Beyond this legislation, there is a desperate need to “TAKE IT DOWN.” Everyone – from Taylor Swift to teenagers growing up in an era when AI can create harmful and damaging content – deserves that right.
Members of The Seattle Times Editorial Board are editorial page editor Kate Reilly, Frank A. Blethen, Melissa Davis, Josh Farley, Alex Fryer, Claudia Lowe, Carlton Winfrey and William K. Blethen (professor emeritus).