Following the post-strike AI provisions negotiated by SAG-AFTRA and the Writers Guild of America, the entertainment industry is entering a new phase of grappling with artificial intelligence. Performers and writers demanded meaningful limits on the use of AI to replicate their voices, likenesses, and performances, and now the burden shifts to lawyers to operationalize those limits with clarity and foresight.
Studios and unions are rolling out AI protocols, residual consent frameworks, and standardized definitions of “digital likeness” that were hard-won at the bargaining table. These evolving contractual regimes will directly shape the major IP deals handled by Texas-based counsel representing talent, tech developers, and rights holders.
For transactional attorneys—especially those managing endorsement deals, production agreements, and IP licenses—digital replica clauses are no longer exclusive to Hollywood guild contracts. These provisions, which govern the AI-driven recreation of a performer’s likeness, voice, or performance, have become standard in influencer deals, podcasting agreements, and video game voiceover contracts.
Updating contract templates to include boilerplate AI clauses is the easy part; the real challenge is drafting for a technology that is outpacing the law. To help other lawyers stay ahead of the curve, here are my drafting strategies for navigating this shifting landscape:
1. Use specificity when drafting
When negotiating a digital replica clause, it is prudent to define exactly what is being replicated. Is it the talent’s voice, face, body, mannerisms, biometric data, past performances, or something else? You should also clarify which technologies are permitted, such as machine learning, generative AI, synthetic media, voice synthesis, or deepfakes. Depending on the scope of the engagement and the nature of the talent, you may also wish to expressly prohibit specific uses (think political, religious, or illegal uses, or a vice category such as gambling or smoking).
A poorly defined digital replica clause can inadvertently grant rights far broader than intended or leave gaps that invite disputes as technology evolves. If the agreement is silent on training data, for example, could the company use an actor’s performance to train a proprietary voice model? If these risks aren’t addressed upfront, the resulting ambiguity invites litigation. Plus, once your client’s performance is baked into a machine learning model, it’s nearly impossible to put the genie back in the bottle.
2. Address indemnity and downstream liability risk
If your client is a brand or production company, the risk may come not from the performer but from the vendor creating the marketing assets. Many AI tools are trained on scraped or disputed data sets. Because of this, ensure the contract includes robust representations and warranties regarding lawful training data, indemnity provisions for IP and right-of-publicity claims, and a clear allocation of responsibility for third-party infringement.
Texas companies deploying AI-generated content in advertising should be particularly cautious about downstream liability under right-of-publicity statutes and unfair competition claims. If your AI-generated “generic model” looks a little too much like a real person, you could be opening the door to a misappropriation claim. Texas courts have held that liability may arise where a defendant derives a commercial benefit from the unauthorized use of an individual’s likeness, even where the resemblance was unintentional.
3. Negotiate approval rights for AI-generated performances
We are seeing a rise in AI-generated performances that actors never physically filmed—a notable example is the late Ian Holm’s role in Alien: Romulus, made possible through his heirs’ approval of generative AI. When representing talent open to this type of licensing, it is critical to ensure the contract includes robust approval rights and restrictions on potentially controversial future uses. For example, explicitly restricting use in political or defamatory contexts protects both parties: it shields the brand from potential tort-based or reputational claims and helps ensure that the talent’s name, image, and likeness remain consistent with their established public persona.
4. Address data security and biometric considerations
Voice prints and facial scans may implicate biometric privacy laws, depending on the jurisdiction. Even where Texas law is less expansive than that of other states, multi-state exposure is common in digital distribution. Accordingly, negotiate storage and security standards for such data and strictly limit or prohibit the sublicensing of biometric data. Finally, ensure the agreement requires the complete deletion of the data upon termination, preventing it from being held in perpetuity for future use or unintended technological applications.
5. Avoid boilerplate that collides with AI language
As with any newly added provision, practitioners should review the agreement holistically when inserting a digital replica clause. Traditional “work made for hire” language and broad license grants covering “all media now known or hereafter devised” in one IP section of a deal can undo the negotiated guardrails of a carefully crafted clause elsewhere, creating ambiguity and increasing the risk of future disputes. To avoid these pitfalls, ensure that AI-specific restrictions are explicitly stated to prevail over any conflicting general grant of rights.
Final takeaway
For Texas lawyers advising clients in media, sports, advertising, and tech, the takeaway is practical: AI risk is now a drafting issue. A single, generic “AI clause” is unlikely to be sufficient. Instead, thoughtful definitions, clear consent structures, indemnities, and data-use restrictions must work together to allocate risk intentionally. Until legislatures and appellate courts provide clearer guidance, contracts remain the primary safeguard—and the difference between a controlled business decision and a costly test case.
Disclaimer: The views expressed here are strictly my own—they don’t reflect the opinions of my employer or anyone else I’m affiliated with.
Natalie LeVeck is Senior Counsel at Google, where she negotiates high-profile celebrity endorsement deals and oversees some of YouTube’s largest sponsorship and co-marketing partnerships. Outside of work, she teaches Entertainment Law at SMU Dedman School of Law, produces independent film and television projects with her husband, and watches her three children play piano and sports in North Dallas.