On September 17, 2024, California Governor Gavin Newsom signed into law a pair of artificial intelligence (AI) bills, Assembly Bill 2602 (AB 2602) and Assembly Bill 1836 (AB 1836), which introduce new regulatory requirements and compliance obligations aimed at ensuring transparency, accountability, and the ethical use of AI. These new laws have significant implications for businesses utilizing AI technologies, which must now account for the additional, novel rights granted to Californians, and to actors in particular, under these laws.
The governor’s endorsement of the new laws follows Executive Order N-12-23, which builds on the White House’s AI Bill of Rights. This executive order outlines how California, which is home to 32 of the world’s 50 leading AI companies, should deploy generative AI “ethically and responsibly . . . protect and prepare for potential harms and remain the world’s AI leader.” Signed by Governor Newsom last year, this order underscores the state’s commitment to remaining at the vanguard of AI regulation.
AB 2602 renders a contract provision between an individual and any other person, solely for the performance of personal or professional services, contrary to public policy if the provision:
(i) allows for the creation and use of a digital replica of the individual’s voice or likeness in place of work the individual would otherwise have performed in person;
(ii) does not provide a “reasonably specific” list of all proposed uses of the digital replica; and
(iii) was negotiated without the individual being represented by either legal counsel or a labor union.
Under the law, “digital replica” means a “computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual” that is embodied in a digital work in which the individual did not actually perform or in which they did perform, but the “fundamental character” of the performance has been “materially altered.”
The law applies only to new performances fixed on or after January 1, 2025.
The law will prevent businesses from using AI to replicate performers’ and actors’ voices or likenesses without their permission. As a result, businesses that currently use standard form agreements for the engagement of actors, influencers, and other talent, and that wish to secure rights for digital replicas (now or in the future), will need to modify those forms. Broad, customary language such as “all media known or hereafter devised” will not suffice for digital replicas. In fact, standard forms should exclude digital replicas unless the rights are specifically negotiated with the applicable talent and their counsel, or the talent is engaged under a union contract, as reflected in the form.
Businesses should also include a certification that talent has consulted with counsel prior to executing any agreement or release that includes digital replica rights. This law will likely have implications for the negotiation of talent agreements: simple releases will be more extensively considered on the front end to account for any digital likeness rights that the business requires, and such rights must be specifically set out in the agreement.
Businesses may also need to include further downstream licenses of such voice or likeness rights to large language models in the release negotiations, which could create additional delays in the process. Duncan Crabtree-Ireland, the executive director of SAG-AFTRA, stated in support of the law last August that “voice and likeness rights, in an age of digital replication, must have strong guardrails around licensing to protect from abuse, this law provides those guardrails.” However, as with other forms of new regulation, the impact on the industry remains to be seen.
AB 1836 amends Section 3344.1 of the Civil Code to include a prohibition on creating or distributing a digital replica of a deceased personality’s “voice or likeness in an expressive audiovisual work or sound recording” without receiving prior consent from the personality’s estate.
Under this law, “digital replica” means “a computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual” embodied in a sound recording, image, audiovisual work, or transmission in which the individual did not actually perform or in which they did perform, but the “fundamental character” of the performance has been “materially altered.” However, the definition specifically carves out “the electronic reproduction, use of a sample of one sound recording or audiovisual work into another, remixing, mastering, or digital remastering of a sound recording or audiovisual work authorized by the copyright holder.”
The law aims to curb unauthorized uses of digital replicas and, except as set forth above, encompasses any audiovisual work or sound recording linked to performances delivered by artists while they were alive. Noncompliance with the digital replica requirements is subject to a fine in an amount equal to the greater of $10,000 or the actual damages suffered by the performer’s estate.
It is not clear what, if any, impact this amendment to the posthumous rights of publicity in California will have on the rights of publicity for living individuals.
Businesses that use, create, or rely on digital replicas of persons (such as AI content creation businesses, interactive entertainment companies, and social media content creators) should carefully review their existing policies and forms governing the creation of digital likenesses in light of these new laws.
The enactment of AB 2602 and AB 1836 marks a significant step in regulating the use of AI in California, emphasizing the importance of negotiating with talent for the desired rights to use AI-generated likenesses and of reflecting those rights specifically in their agreements. Businesses utilizing AI technologies must now navigate these new compliance requirements to ensure that they do not infringe the rights granted by these laws.
The regulatory landscape surrounding AI may continue to change in California as various bills are proposed by California legislators. These bills include SB 1047, which would, among other things, require that developers of an AI model implement the capability to enact a full shutdown of the AI model and adhere to several audit requirements, and AB 2013, which would, among other things, require developers of generative AI to publicly disclose a summary of datasets used in the development and training of the generative AI.