The generative AI market is growing rapidly and is estimated to increase from $40 billion to $1.3 trillion by 2032. Generative AI refers to artificial intelligence that learns the patterns of content created by people and uses that information to create new content, such as music and videos. Generative AI is touted to boost efficiency, improve customization, and expand creative expression.

However, AI's expansion of creative expression conflicts with intellectual property rights and the right to control one's persona. For example, in October 2023, Tom Hanks warned fans that he "had nothing to do with" a dental commercial that appeared to feature him, but was actually created by a third party using AI.

Similarly, the AI-generated song "Heart on My Sleeve," posted by an anonymous TikTok user, featured voices that mimicked Drake and the Weeknd without their authorization. The video of the song was viewed millions of times, and the song was even submitted for a Grammy. Universal Music Group was able to get the song removed from platforms like Spotify, TikTok, and YouTube because it included a copyright-protected producer tag. After this incident, Universal Music Group issued a statement asking "stakeholders in the music ecosystem [which side they] want to be on: the side of artists, fans and human creative expression, or on the side of deep fakes, fraud and denying artists their due compensation."

Although Universal Music Group relied on copyright to get this song removed, had the producer tag not been included in the song, Universal Music Group would not have had a copyright claim, because no other content was actually copied. Additionally, although voices nearly identical to those of Drake and the Weeknd were featured in the song, it is unlikely that the stars' rights of publicity were violated under current applicable law, because the use was likely not commercial in nature.

The Right of Publicity

As social media use has exploded and the power of influencers and celebrity endorsements has grown exponentially, protecting the right to control the use of one's persona is more important than ever.

Currently, the right of publicity generally protects individuals' rights to control the "commercial" use of their names, images, and likenesses, and in some states, other aspects of their persona, such as voices or signatures.

The right of publicity is governed by state law. Only 38 states recognize the right of publicity (some by statute, some by common law, and some by both), and only 20 states recognize a post-mortem right of publicity. These state laws often conflict; therefore, whether an individual's rights in his or her persona are protected, and the scope of any such protection, depend in large part on where the individual resides.

The right to control one's persona does not trump the First Amendment. Therefore, right of publicity cases often turn on whether the particular use was commercial in nature, and each state that recognizes the right of publicity has its own test for balancing individuals' rights to control the use of their personas against the First Amendment right to free expression.

First Amendment defenses have been successful in certain cases where a third party used an individual's persona in connection with entertainment programs, news, public affairs, sports broadcasts, political campaigns, or in connection with parodies. For example, Cardtoons, L.C. v. Major League Baseball Players Ass'n held that trading cards featuring caricatures of Major League Baseball players and humorous commentary were parodies and not violative of the right of publicity.

Moreover, First Amendment defenses have also been accepted where significant transformative components have been added to the use of the persona. Winter v. DC Comics determined that the use of comic book characters evoking musicians Johnny and Edgar Winter did not violate the musicians' rights of publicity where the images were of distorted cartoon characters that were half-human, half-worm. In addition, No Doubt v. Activision Publ'g, Inc. rejected a video game manufacturer's First Amendment defense against the band No Doubt's right of publicity claim where the manufacturer used "computer-generated recreations of the real band members, painstakingly designed to mimic their likenesses," in its video game, and the creative elements in the game were nothing more than conventional images of the band members.

No Fakes Act

Given that right of publicity laws vary from state to state (with some states not even recognizing such a right), there are obvious gaps in protection. These gaps will become more pronounced when applied to new technologies like AI.

In an effort to "protect the voice and visual likeness of all individuals from unauthorized recreations from generative AI," Senators Chris Coons, Marsha Blackburn, Amy Klobuchar, and Thom Tillis have proposed legislation in Congress entitled the Nurture Originals, Foster Art and Keep Entertainment Safe Act of 2023, or the No Fakes Act. Applicable in all states, the No Fakes Act would create a uniform, federal right of publicity with respect to uses of an individual's image, voice, and visual likeness in sound recordings and audiovisual works. Any additional rights provided under state law would still be available, as the Act does not preempt state law. Organizations like SAG-AFTRA have applauded the effort to protect performers' right to control their "most valuable assets."

Specifically, if enacted, the No Fakes Act would:

  • Create a post-mortem right of publicity that would protect one's image, voice, and visual likeness for 70 years after death;
  • Prohibit the production of digital replicas or computer-generated electronic representations that are nearly indistinguishable from the actual voice or visual likeness of individuals without their consent;
  • Prohibit the distribution, publication, or transmission of digital replicas that one knows are unauthorized; and
  • Create civil liability for violations equal to the greater of $5,000 per violation or the damages suffered by the injured party, with punitive damages and attorneys' fees available for willful violations.

Much like the affirmative defenses under state law based on First Amendment grounds, the No Fakes Act does not extend to digital replicas that are: (i) used as part of a news, public affairs, or sports report, (ii) used in a documentary or historical work if the digital replica of the applicable individual is used as a depiction of that actual individual, (iii) used for purposes of comment, criticism, scholarship, satire, or parody, (iv) used in an advertisement or commercial announcement for the purposes in (i)-(iii) above, or (v) de minimis or incidental.

As more content is created that includes AI-generated images, voices, and/or likenesses, particularly of individuals who were not involved in, and did not consent to, such inclusion, a push for a more robust set of rules governing the right of publicity is inevitable.

The No Fakes Act would create a uniform law to protect individuals' rights in their personas from unauthorized use in digital replicas and would fill in some of the gaps where state laws fail to address the use of generative AI. Notably, unlike the various state laws, the Act does not specifically require that the unauthorized use be commercial in nature. As such, if the Act ultimately becomes law, First Amendment considerations will likely be litigated, and the constitutionality of the law may be challenged.

Originally published by Association of National Advertisers, December 15, 2023.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.

Brianne L. Polito
Lowenstein Sandler