THE BUZZ: IMITATION GAME — It’s not just actor Scarlett Johansson who’s worried about being copied by AI. The rapid growth of artificial intelligence has opened the possibility that the voices and likenesses of entertainers — even dead ones — could be replicated without their knowledge or consent. That’s raising alarms among California’s lawmakers and labor groups, pitting them against Hollywood and Silicon Valley.

Hollywood studios have already been pushing to use digitally replicated — or, in some cases, entirely synthetic — copies of performers, which was a major point of contention in last year’s negotiations between studios and SAG-AFTRA. But the recent dispute between Johansson and OpenAI, which she claims used her voice for the latest version of ChatGPT after she said no, animates a complicated debate playing out in Sacramento. Lawmakers here are considering dozens of AI bills, including two that would limit how and when companies can digitally replicate a person.

“This is what we’re afraid of,” said Assemblymember Rebecca Bauer-Kahan, who is authoring a bill that would penalize those who create digital replicas of deceased people without the consent of their estate.

A proposal from San Jose Democrat Ash Kalra, co-sponsored by the powerful California Labor Federation, would prevent studios from replicating a person’s voice or likeness unless that performer gave informed consent. It would also require terms to be negotiated with an attorney or union representative present. Supporters argue the bill is critical to make sure artists aren’t unknowingly signing away their likeness in perpetuity. Opponents, including the Motion Picture Association of America, are playing to state officials’ fiscal worries, arguing it would burden the state’s judicial system in a tight budget year.

Hollywood, one of California’s marquee industries, has increasingly butted heads with the state’s powerful labor groups as it seeks to capitalize on advancements in technology. Lorena Gonzalez, head of the California Labor Federation, said Johansson’s experience highlights a problem that lawmakers may have to address down the road.

“She didn’t agree to this, and that in itself is a huge issue,” Gonzalez said. “We knew that this was coming, so how do we regulate this? How do we allow performers to be able to sue on behalf of their own likeness?”

Bauer-Kahan’s bill, which passed out of the lower chamber on Monday, highlights a more unusual — and to some, unnerving — AI dilemma: the posthumous use of someone’s image or voice. The Orinda Democrat recently described it as “the right to not be reanimated without their consent.”

That proposal’s opponents include the Electronic Frontier Foundation, which argues it doesn’t allow for important uses like plays, films and news commentary. “And it does all of this not to protect any living person, but only those who hope to grow rich exploiting their identities long after they are long gone,” the group argued in its written testimony.

Johansson’s situation could help Kalra make the case for the protections he’s proposing as his bill awaits a vote on the Assembly floor this week. “At the end of the day, people should have self-determination in how their voice or image or likeness is used,” he said.

GOOD MORNING. Happy Wednesday. Thanks for waking up with Playbook.

You can text us at 916-562-0685 — save it as “CA Playbook” in your contacts. Or drop us a line at lkorte@politico.com and dgardiner@politico.com, or on X — @DustinGardiner and @Lara_Korte.
WHERE’S GAVIN? On his way back from Europe.