The ELVIS Act Protects Artists’ Voices but Creates Risks for Media Companies
Tennessee’s new right of publicity law has broad effects, including for media companies. Right of publicity claims are often directed at companies that use another’s name, image or likeness for an unauthorized commercial purpose. The ELVIS Act, a new Tennessee law, expands liability to broadcasters that air material that previously would have created liability only for the companies advertising their goods or services. The Act also expands liability for generative AI companies that provide the tools for voice cloning.
The ELVIS Act goes into effect on July 1, 2024. In March, Governor Bill Lee signed the Ensuring Likeness, Voice, and Image Security Act into law. The ELVIS Act amends the Tennessee Personal Rights Protection Act by expanding the law to prohibit the unauthorized commercial use of an individual’s voice. More significantly, it creates secondary liability for those that provide the technology that allows for the creation, distribution or transmission of the infringing likeness or voice.
Tennessee is home to a large portion of the recorded music industry, especially in the country and gospel genres. The ELVIS Act seeks to protect that industry. To help record labels, the Act allows them to file suit on behalf of their recording artists. Specifically, the Act clarifies that where an individual has entered an exclusive contract as a recording artist or granted an exclusive license to distribute sound recordings featuring the individual’s voice, either the individual or the record label may enforce the artist’s rights under the Act.
The Tennessee Right of Publicity statute did not previously protect a person’s aural likeness. However, many other states already provide this protection. After Bette Midler and Tom Waits each declined a major company’s request to perform in its advertisements, the companies went ahead with the advertisements using sound-alike singers. Both Midler and Waits prevailed on right of publicity claims and were awarded $400,000 and $2.5 million, respectively.
The second part of the Act creates two new forms of civil liability where a person or company:
- “Publishes, performs, distributes, transmits, or otherwise makes available to the public an individual’s voice or likeness, with knowledge that use of the voice or likeness was not authorized by the individual.”
- “Distributes, transmits, or otherwise makes available an algorithm, software, tool, or other technology, service, or device, the primary purpose or function” of which “is the production of a particular, identifiable individual’s photograph, voice, or likeness, with knowledge” that the use “was not authorized by the individual.”
The first provision likely targets social media companies but can apply to traditional media as well. A television or radio station that broadcasts an ad containing a voice clone could be liable under this new theory if the station knew that the use of the voice was unauthorized. Given this change in the law, media companies may want to consider putting in place processes to vet ads featuring celebrity voices before airing them.
The second provision impacts technology companies, especially those that offer generative AI tools whose “primary purpose or function” is “the production of a particular, identifiable individual’s photograph, voice, or likeness” without authorization from the individual. The law does not define “primary purpose or function.” For now, it is unclear whether general-purpose generative AI tools could be liable because they allow for voice cloning, or whether only tools specifically designed for voice cloning could face liability.
Both of these new claims also expand the underlying conduct that is actionable. The existing law made it unlawful to “knowingly use … in any medium” a person’s name, image or likeness “for purposes of advertising … or for purposes of fund raising, solicitation of donations, purchases of products, merchandise, goods, or services.” The new causes of action do not explicitly require a commercial element. Liability under the Act could therefore attach to non-commercial uses of another’s voice or likeness, creating tension with the First Amendment for expressive content.
What Remedies Are Available for Violation of the ELVIS Act?
The remedies available under the ELVIS Act are largely the same as those under the existing law. A claimant can seek injunctive relief to stop the unauthorized use of their name, voice or likeness. They may also seek the destruction of all materials made or used in violation of their publicity rights. Claimants can also obtain monetary damages, including actual damages and any profits attributable to the unauthorized use of their voice.
A Narrow Fair Use Defense
The ELVIS Act narrows the fair use defense that previously existed, potentially expanding liability for media companies. The pre-Act law provided that “if the use of a name, photograph, or likeness is in connection with any news, public affairs, or sports broadcast or account,” it does not violate the law. The ELVIS Act newly provides that this exemption only applies to the extent “such use is protected by the First Amendment to the United States Constitution,” which raises the burden on companies that wish to invoke this defense.
The ELVIS Act also narrows the existing third-party advertising exemption. Previously, that exemption provided that the mediums used for advertising, such as television and radio stations, newspapers, and magazines, were not liable for publishing or disseminating any advertisement in violation of the Act unless those entities had actual knowledge of the unauthorized use. The new law takes this provision one step further by excluding from the exemption any company that has “knowledge or reasonably should have known of the unauthorized use.”
Issues for Media Companies
- Previously, the fair use exemption for news, public affairs and sports coverage was broader than First Amendment protection alone. Now that the exemption reaches only uses “protected by the First Amendment,” broadcasters should be extra cautious when reporting on voice cloning incidents.
- Previously, media companies could be liable for running advertisements or sponsored content only if they had actual knowledge of the misappropriation of a person’s likeness. The Act lowers that knowledge standard, imposing liability where the media defendant “should have known” of the voice cloning. This means broadcasters should evaluate advertisements and sponsored content more closely than before.
How Can Broadcasters Protect Against Liability?
- Be sure all advertising and sponsored content agreements provide indemnity for voice cloning and other right of publicity liability.
- Consider adopting processes to identify name, image and likeness cloning.
- Explore the use of technology to identify deep fakes.
Please contact Andrew Coffman, Ashley Heilprin or any member of Phelps’ Media and First Amendment Law team if you have questions or would like further guidance.