Senators introduce COPIED Act to push for better watermarking on AI content


A bipartisan group of senators introduced a new bill to make it easier to authenticate and detect artificial intelligence-generated content and protect journalists and artists from having their work gobbled up by AI models without their permission.
The Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act) would direct the National Institute of Standards and Technology (NIST) to create standards and guidelines for proving the origin of content and detecting synthetic content, such as through watermarking. It also directs the agency to create security measures to prevent tampering, and requires AI tools for creative or journalistic content to let users attach provenance information and prohibits that information from being removed. Under the bill, such content also could not be used to train AI models.
Content owners, including broadcasters, artists, and newspapers, could sue companies they believe used their materials without permission or tampered with authentication markers. State attorneys general and the Federal Trade Commission could also enforce the bill.
It’s the latest in a wave of AI-related bills as the Senate works to understand and regulate the technology. Senate Majority Leader Chuck Schumer (D-NY) led an effort to create an AI roadmap for the chamber but made clear that new laws would be worked out in individual committees. The COPIED Act has the advantage of a powerful committee leader as a sponsor: Senate Commerce Committee Chair Maria Cantwell (D-WA). Senate AI Working Group member Martin Heinrich (D-NM) and Commerce Committee member Marsha Blackburn (R-TN) are also leading the bill.
Several publishing and artists’ groups issued statements applauding the bill’s introduction, including SAG-AFTRA, the Recording Industry Association of America, the News/Media Alliance, and Artist Rights Alliance, among others.
“The capacity of AI to produce stunningly accurate digital representations of performers poses a real and present threat to the economic and reputational well-being and self-determination of our members,” SAG-AFTRA national executive director and chief negotiator Duncan Crabtree-Ireland said in a statement. “We need a fully transparent and accountable supply chain for generative Artificial Intelligence and the content it creates in order to protect everyone’s basic right to control the use of their face, voice, and persona.”