California’s AI bill awaits governor’s signature


An aerial view of the California State Capitol in Sacramento, Calif. (Justin Sullivan/Getty Images)

A new bill to protect performers from unauthorized AI replicas now heads to California’s governor, who will decide whether to sign it into law.

The use of artificial intelligence to create digital replicas is a major concern in the entertainment industry, and AI use was a point of contention during last year’s Hollywood strike. Other national proposals offering AI protections to all Americans are also in the works.

California Assembly Bill 2602 would regulate the use of generative AI for performers — not only those on-screen in films and TV/streaming series but also those who use their voices and body movements in other media, such as audiobooks and video games. The measure would require informed consent and union or legal representation “where performers are asked to give up the right to their digital self,” according to the bill.

The bill passed overwhelmingly in both the California Senate and the Assembly this week. The legislation was also supported by the union SAG-AFTRA, whose chief negotiator, Duncan Crabtree-Ireland, points out that it drew bipartisan support and was not opposed by industry groups such as the Motion Picture Association, which represents studios including Netflix, Paramount Pictures, Sony, Warner Bros. and Disney. A representative for the MPA says the organization is neutral on the bill.

Crabtree-Ireland says the new law would mean performers could no longer be forced to relinquish their rights to their likeness. “Good riddance to that practice,” he said from the picket line outside of Warner Bros. Games on Wednesday, where striking video game performers are pushing for more AI protections.

“The concept that each of us should have the right to say yes or no to any kind of replication of our face, voice, body movement, etc., should be a no-brainer,” he said.

Some tech companies have opposed regulation of AI use. In a statement sent to NPR, a spokesperson for the video game companies wrote, “Under our AI proposal, if we want to use a digital replica of an actor to generate a new performance of them in a game, we have to seek consent and pay them fairly for its use.”

But performers whose body movements are used to animate video games argue that, unlike voice actors, they are treated by the companies as doing “motion capture” work rather than giving performances.

Last year, SAG-AFTRA members went on strike against the major studios and streamers for months. In the end, they claimed victory when language in their new contract offered performers the right of consent and fair compensation for the use of their digital doubles. Just before the final contract ratification vote, Crabtree-Ireland said he was the victim of an AI-fabricated social media post.

“Some unknown party created a deep fake video of me,” he recalled, saying the video manipulated his face and voice “to say a bunch of false things about our contract and was encouraging people to vote no on the contract.”

Crabtree-Ireland said Instagram voluntarily took down the deep fake video, but there was no legal requirement to do so. He said he’s hoping legislation will outlaw this kind of misinformation. “If that can happen to me,” he said, “it can happen to anybody.”

Other proposed guardrails

In addition to AB 2602, the performers’ union is backing California bill AB 1836 to protect deceased performers’ intellectual property from digital replicas.

On a national level, entertainment industry stakeholders including SAG-AFTRA, The Recording Academy and the MPA are supporting the NO FAKES Act (the Nurture Originals, Foster Art, and Keep Entertainment Safe Act), which has been introduced in the Senate. That bill would make it illegal to create an unauthorized digital replica of any American.

Around the country, legislators have proposed hundreds of laws to regulate AI more generally. For example, California lawmakers recently passed the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act (SB 1047), which would regulate large AI models such as those behind ChatGPT.

“It’s vital and it’s incredibly urgent because legislation, as we know, takes time, but technology matures exponentially. So we’re going to be constantly fighting the battle to stay ahead of this,” said voice performer Zeke Alton, a member of SAG-AFTRA’s negotiating committee. “If we don’t get to know what’s real and what’s fake, that is starting to pick away at the foundations of democracy.”

Alton says that in the fight for AI protections for digital doubles, Hollywood performers have been the canary in the coal mine. “We are having this open conversation in the public about generative AI and using it to replace the worker instead of having the worker use it as a tool for their own efficiency,” he said. “But it’s coming for every other industry, every other worker. That’s how big this sea change in technology is. So what happens here is going to reverberate.”

Editor’s Note: Many NPR employees are members of SAG-AFTRA but under a different contract than the performers.
