Artificial intelligence has arrived on set—whether we’re ready or not. From scriptwriting and voice cloning to visual effects and even casting, AI is no longer just a future-facing buzzword. It’s here, and it’s changing the game for filmmakers, especially in the documentary and independent film world, but also in podcasting, digital video, and throughout the media and entertainment fields.
But with innovation comes legal and ethical complexity. And if you’re not keeping pace, you may be exposing your production (and yourself) to serious risk.
Here’s what smart creators need to know:
1. Using AI to Recreate Real People? Proceed with Caution
Whether it’s a reenactment with a digitally aged actor, a voice generated by AI, or a “synthetic” character modeled after a real person—this raises red flags.
You could be violating right of publicity laws, treading on defamation territory, or facing ethical backlash. Consent and clear licensing are non-negotiables here, especially if your subject is a public figure or, worse, a private individual.
2. AI-Assisted Editing and Audio Cleanup Can Backfire
Using AI to clean up poor audio or generate missing footage might sound like magic. But in the documentary world (including the true-crime podcasting genre), authenticity is everything.
If your use of AI misrepresents someone’s words, tone, or intent—even subtly—you could be facing a claim of false light or misrepresentation. Be transparent. Better yet, be explicit in your guest releases and appearance consents.
3. Who Owns the AI-Generated Content?
This is the copyright question of the decade. If your AI-generated b-roll, script, or even music cue was created with the help of a tool trained on copyrighted works, who really owns it?
Courts are still figuring that out, but recently the U.S. Copyright Office and the courts have concluded that works generated entirely by AI, without human authorship, are NOT entitled to copyright protection—and therefore can't be owned by anyone. That opens up loopholes for infringers to leap through.
As litigation over AI tools continues, there are more questions than answers. So, until there's greater clarity, it's wise to treat AI tools with caution—read the terms of service, keep human authorship involved, and consult your lawyer before commercial release.
4. Update Your Contracts—Yesterday
Most production agreements, contributor releases, and crew contracts weren’t drafted with AI in mind.
Your contracts need to address:
- Whether and how AI can be used on the project
- What rights are granted (and retained) if AI modifies a contributor’s work
- What limitations exist on reuse of their image, voice, or performance via AI
The bottom line? If it’s not in writing, it’s not protected. Get help from an experienced media and entertainment lawyer to make sure your deal memos, contracts, and releases cover these issues properly.
5. Transparency Isn’t Just Ethical—It May Soon Be the Law
Some jurisdictions are already proposing legislation that would require disclosure when AI has been used in media production. Platforms like YouTube have already begun requiring disclosures for AI-generated material.
So, it's wise to include clear disclosures letting your viewers and listeners know when and how AI was used. Even if disclosure isn’t mandatory yet, audiences (and platforms) are demanding more transparency. Don’t let a lack of disclosure damage your project's credibility—or its distribution potential.
Conclusion (for now)
AI is not the enemy—but blind use of it can be. As filmmakers, your job is to tell compelling, truthful stories. My job is to help you do that legally and ethically, without stepping on any landmines.
Need help updating your contracts or reviewing your project for legal exposure? Let’s talk. The best time to get your legal house in order is before the cameras roll—or the lawsuit does.