Senate Introduces New Bill to Protect Artists From Deepfakes

A bipartisan group of U.S. senators on Wednesday formally introduced a long-awaited bill that aims to protect people's voices and visual likenesses from being exploited in artificial intelligence-generated re-creations without their permission.

Sens. Chris Coons (D-Del.), Marsha Blackburn (R-Tenn.), Amy Klobuchar (D-Minn.) and Thom Tillis (R-N.C.) introduced the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act on Wednesday, about eight months after a discussion draft of the bill was first released last October.

The NO FAKES Act, similar to the No AI FRAUD Act introduced in Congress earlier this year, would establish stronger protections for individuals' right of publicity, giving them firmer legal claims against unauthorized uses of their voice and likeness. As the bill notes, this right does not end with someone's death, but instead passes to the individual's heirs or the executor of their estate.

The need to protect people's likenesses from AI has only been underscored since the NO FAKES discussion draft was released, with cases such as the controversy over OpenAI's use of a voice that sounded "eerily similar" to Scarlett Johansson's after the actress declined an offer to lend her voice to a conversational AI program the ChatGPT maker was developing. Deepfake pornography has also become a major concern, with celebrities and students alike falling victim to the disturbing trend. Rep. Alexandria Ocasio-Cortez has taken the issue head-on, and the Senate unanimously passed her bill to combat it last week.

AI remains a particularly hot topic in the music industry, where AI voice cloning has already turned up on recordings. It has happened with an artist's permission, as when Randy Travis used the technology to release his first song since suffering a stroke a decade ago, and without permission, as when Drake used an AI-generated Tupac Shakur voice on "Taylor Made Freestyle," a diss track aimed at Kendrick Lamar. Drake, whose own voice was cloned on Ghostwriter977's "Heart on My Sleeve" last year, removed "Taylor Made Freestyle" after Shakur's estate sent him a cease-and-desist.

The industry has expressed cautious support for the use of AI in music creation, but only when it is done with the permission of rights holders. The major record labels sued AI music-making companies Suno and Udio in June, alleging that the companies used thousands of artists' songs without authorization to train their software.

The NO FAKES Act has received significant support from the music industry, with the Human Artistry Campaign, the RIAA, the NMPA and the Recording Academy all voicing their endorsement on Wednesday.

"The Human Artistry Campaign applauds Senators Coons, Blackburn, Klobuchar and Tillis for crafting strong legislation that establishes a fundamental right giving every American control over their voice and face against a new onslaught of hyperrealistic voice clones and deepfakes," said Dr. Moiya McTier, senior advisor to the Human Artistry Campaign, in a statement. "The NO FAKES Act will help protect people, culture and art, with clear protections and exceptions for the public interest and free speech."


The bill has also received praise from prominent entertainment figures, including SAG-AFTRA President Fran Drescher and the heads of talent agencies CAA, UTA and WME.

"In the coming decade, legislation like the NO FAKES Act is urgently needed to protect Americans from falling victim to technology that can imitate our image and voice," said Drescher. "Thank you to Senators Blackburn, Coons, Klobuchar and Tillis for standing up for human rights by introducing the NO FAKES Act. People and communities must be protected even as innovation advances."


