Deepfakes vs Right of Publicity: Navigating the Intersection Between Free Speech and Protected Rights

The Briefing by the IP Law Blog - A podcast by Weintraub Tobin - Fridays


The rise of deepfakes is a growing concern within the entertainment industry. Scott Hervey and Jamie Lincenberg discuss this and the intersection between free speech and protected rights on this episode of The Briefing. Watch this episode on the Weintraub YouTube channel here.

Show Notes:

Scott: Deepfakes and AI-generated likenesses are not just the concern of striking actors. Just ask Drake, The Weeknd, and UMG: a Drake and Weeknd collaboration that broke the internet in May of this year wasn't real. It was generated by AI and made to sound like the performers. Where do current right of publicity laws work, and in what situations do they fail to address the scenarios presented by deepfakes and AI-generated images? We are going to talk about this next on The Briefing by Weintraub Tobin.

Let's first identify the types of AI output that trigger right of publicity concerns: visual likenesses and appearances, and voices or vocal likenesses. California's right of publicity statute, Civil Code Section 3344, prohibits the use of another's name, voice, photograph, or likeness on or in products, merchandise, or goods, or for the purpose of advertising or selling such products, merchandise, or goods, without that person's prior consent. California also has a common law right of publicity that is a bit broader than the statute. But where a celebrity's likeness isn't being used on or in products, merchandise, or goods, or for the purpose of advertising or selling such goods, California's right of publicity statute really isn't applicable. As for a common law claim, even though common law provides broader right of publicity protection than the statute, the First Amendment may prevent any recovery. Generally, a claim for common law appropriation will not stand in the case of an expressive work due to First Amendment concerns.
Jamie: So, Scott, it seems well settled that where a celebrity's likeness, whether visual or vocal, is used in connection with the advertising or sale of goods or services, the consent of that celebrity is required. The void seems to be where that celebrity's likeness is used in an expressive work.

Scott: That's true, Jamie, and "void" is a good way of putting it, since it's not clear whether this void is a shortcoming or some type of legal failure, or rather reflects the greater importance of the First Amendment. Take, for example, the AI Drake song. Section 114(b) of the Copyright Act permits sound-alikes. A publication of the U.S. Copyright Office specifically says that under U.S. copyright law, the exclusive rights in a sound recording do not extend to making independently recorded sound-alike recordings. Copyright protection for sound recordings extends only to the particular sounds of which the recording consists and will not prevent a separate recording of another performance in which those sounds are imitated. The imitation of a recorded performance, no matter how similar to the original, would not constitute copyright infringement, even where one performer deliberately sets out to simulate another's performance as exactly as possible. To extend a state right of publicity to cover the use of a celebrity's vocal likeness in an expressive work like the AI Drake song would put a law in effect that directly conflicts with the Copyright Act.

Jamie: Let's talk about New York's right of publicity statute, in particular Section 50-f of New York's Civil Rights Law, which took effect in 2021. This law addresses and prohibits certain uses of AI-generated lookalikes or digital ...
