AI Parallel

People Are Watching Dead Actors and Talking to AI Athletes

Val Kilmer died of pneumonia in April 2025. He’s starring in a new movie in 2026. Not archive footage. Not a cameo stitched together from deleted scenes. A full AI-generated performance in a film called As Deep as the Grave, sanctioned by his estate and produced under SAG-AFTRA guidelines. His daughter Mercedes signed off. The audience will pay $15 for a ticket. And most of them won’t think twice about it.

Meanwhile, if you’re a WNBA fan, you may have already had a one-on-one conversation with Kelsey Plum — except it wasn’t Kelsey Plum. It was her “verified digital twin,” built by Talk2Me Inc., trained on her voice, her advice, her personality. The AI talks about training, mindset, and resilience. It sounds like her. It responds like her. She approved it. But she’s not in the room. This is the new normal, and sooner or later it will extend well beyond celebrities.

Everyone’s Getting a Clone Now

The celebrity examples grab headlines, but the real story is the infrastructure being built underneath them. An entire ecosystem of platforms now exists to let anyone — literally anyone — build an AI version of themselves:

  • HeyGen / Tavus: video avatars from a short recording (for sales teams and content creators)
  • ElevenLabs: voice clones from minutes of audio (for podcasters, narrators, anyone)
  • CustomGPT: chatbots trained on your knowledge base (for consultants, coaches, and authors)
  • Synthesia: enterprise-grade AI avatars (for corporate training and HR)
  • IgniteTech MyPersona: digital twins of internal experts (for enterprise IT and HR departments)
  • CloneOps.ai: AI agents handling customer service calls (for logistics and operations)

This isn’t experimental. Forbes covered five ways to clone yourself with AI back in late 2024. Tavus already requires users to read an on-camera authorization statement before their likeness can be replicated — which tells you the volume is high enough that they need a consent protocol baked into onboarding.
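To make that consent protocol concrete: a minimal sketch of what an onboarding gate like Tavus’s might look like, where the user must read an authorization statement on camera and the transcript has to match before cloning is unlocked. Everything here is illustrative — the statement text, function names, and threshold are assumptions, not any vendor’s actual API.

```python
# Hypothetical consent gate: the spoken transcript of the on-camera
# authorization must closely match the required statement before a
# likeness can be cloned. Illustrative only, not a real vendor API.
from difflib import SequenceMatcher

REQUIRED_STATEMENT = (
    "I authorize this platform to create and operate a digital replica "
    "of my likeness and voice."
)

def consent_verified(spoken_transcript: str, threshold: float = 0.9) -> bool:
    """Return True if the transcript matches the required statement closely enough."""
    norm = lambda s: " ".join(s.lower().split())
    ratio = SequenceMatcher(
        None, norm(REQUIRED_STATEMENT), norm(spoken_transcript)
    ).ratio()
    return ratio >= threshold
```

The point of baking this into onboarding is evidentiary: the platform keeps a recording of the user saying the words, not just a checkbox click.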

The use cases aren’t hypothetical either. The internet coaches of the world are deploying AI versions of themselves to handle client intake. Sales reps are sending personalized video pitches that are entirely synthetic. Ray Dalio — the billionaire investor — has been building an AI clone trained on his principles, decision-making rules, and years of recorded conversations. He says it tested at 95% as effective as speaking with him personally. That’s not a gimmick. That’s a replacement.

The 18-Month Forecast: Your AI Parallel Is Coming

Here’s the trajectory. Connect the dots:

  • Late 2024: Forbes publishes mainstream guides on self-cloning. The tools are consumer-ready.
  • January 2026: IgniteTech debuts MyPersona at CES — AI twins for every employee, not just executives.
  • March 2026: Kelsey Plum becomes the first professional female athlete to launch a verified AI twin. Val Kilmer’s estate announces a full posthumous AI performance.
  • April 2025 (yes, last year): Reid Hoffman used AI to build a functional LinkedIn prototype from a single text prompt.

That last one matters more than it looks. The co-founder of LinkedIn demonstrated that AI can replicate platforms, not just people. If the infrastructure for digital identity is itself being automated, the gap between “tech demo” and “default feature” is about 18 months. Within that window, expect:

  • LinkedIn and professional platforms to offer AI avatar features — your profile answers recruiter questions while you sleep.
  • Dating apps to deploy AI versions of users for initial screening conversations.
  • Customer-facing businesses to replace “About the Founder” pages with interactive AI twins.
  • Content creators to run 24/7 engagement channels powered entirely by their digital clone.

Having an “AI parallel” won’t be a celebrity flex. It’ll be a LinkedIn premium feature. And if you think that sounds dystopian, consider that you’re already watching a dead man act in a movie and nobody called a press conference about it.

The Questions Nobody Wants to Answer

This is where the story shifts from “cool tech” to “who owns you.”

Who Owns Val Kilmer Now?

Kilmer lost his natural speaking voice to throat cancer in 2014. AI recreated it for Top Gun: Maverick in 2022. Now, a year after his death, AI is generating his entire performance. His estate approved it. His daughter approved it. SAG-AFTRA’s rules require consent and compensation for digital replicas of deceased performers. But here’s the thing: there is no federal law governing post-mortem rights of publicity in the United States. It’s a patchwork of roughly 25 state laws, and the protections vary wildly:

  • California: 70 years after death; property right (inheritable)
  • New York: 40 years after death; requires anti-deception provisions
  • Indiana: 100 years after death; property right
  • Most states: no post-mortem protection at all
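The patchwork above can be reduced to a lookup. A small sketch, using only the durations from the table (the function is illustrative and obviously not legal advice):

```python
# Post-mortem publicity rights by state, per the table: 70 years in
# California, 40 in New York, 100 in Indiana, zero in most states.
POST_MORTEM_YEARS = {"California": 70, "New York": 40, "Indiana": 100}

def likeness_protected(state: str, death_year: int, use_year: int) -> bool:
    """True if a deceased person's likeness is still protected in `state`."""
    duration = POST_MORTEM_YEARS.get(state, 0)  # most states: no protection
    return use_year <= death_year + duration
```

Run the numbers and the arbitrariness is stark: an estate in California controls a likeness for seven decades, while the identical use one state over may face no restriction at all.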

Robin Williams placed explicit limits on the use of his likeness in his will. Most people don’t have that foresight — or those lawyers. And even Williams’ protections have time limits. The uncomfortable precedent is already set. Peter Cushing was digitally resurrected as Moff Tarkin in Rogue One back in 2016. Paul Walker’s brothers stood in for him in Furious 7 while CGI mapped his face onto their bodies. Carrie Fisher appeared in Rise of Skywalker from unused footage. Harold Ramis showed up in Ghostbusters: Afterlife. Audrey Hepburn and Bruce Lee have appeared in commercials — decades after their deaths.

Audiences consistently say they hate this. Reddit threads are full of words like “gross,” “sad,” and “ghoulish.” The core objection: a computer-generated performance isn’t acting. It’s puppeteering a corpse. And yet every one of those movies made money. Audience discomfort hasn’t translated into box office resistance.

What Happens When Kelsey Plum’s Contract Expires?

Plum’s AI twin is positioned as the ethical, consent-based model. She participated directly. She has oversight. But the AI was built by Talk2Me Inc. The training data — her voice, her personality patterns, her advice — lives on their servers.

When that contract ends, what happens to the model? Is the data deleted? Can Talk2Me license the underlying architecture — not “Kelsey Plum” per se, but the conversational patterns derived from her? What if a competing platform reverse-engineers something functionally identical from her public interviews and social media posts?

These aren’t paranoid hypotheticals. The legal frameworks don’t have answers yet. A license to use someone’s likeness doesn’t automatically grant rights to copyrighted content they appeared in. But it’s entirely unclear whether “personality patterns” extracted from public data constitute a protectable likeness at all.
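One way to see the gap is to write the license down as a data record. A hypothetical sketch — every field name and value here is an assumption for illustration, not Plum’s actual contract — showing what such agreements typically pin down versus what current law leaves open:

```python
# Hypothetical likeness-license record. The fields marked "unclear" are
# exactly the questions the legal frameworks haven't answered yet.
from dataclasses import dataclass

@dataclass
class LikenessLicense:
    licensor: str
    licensee: str
    expires: int                         # year the contract ends
    delete_raw_data_on_expiry: bool      # voice recordings, training data
    covers_derived_models: bool          # "personality patterns": legally unclear
    covers_public_data_retraining: bool  # rebuilding from interviews: unclear

plum = LikenessLicense(
    licensor="Kelsey Plum",
    licensee="Talk2Me Inc.",
    expires=2028,                    # hypothetical date
    delete_raw_data_on_expiry=True,  # hypothetical term
    covers_derived_models=False,     # the open question in the text
    covers_public_data_retraining=False,
)
```

The last two fields are the trend line: a contract can compel deletion of the recordings, but whether it reaches the conversational patterns distilled from them is an open question.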

What Happens When Your Clone Goes Rogue?

For the average person building an AI twin on CustomGPT or HeyGen, the risks are less Hollywood and more mundane — but no less real. Your AI clone says something defamatory in a customer interaction. Your likeness gets stripped from a training video and used in a deepfake. Your “cognitive legacy” chatbot, trained on your documents, gets acquired in a company merger you didn’t consent to.

The commodification of self is the real trend line. Your face, your voice, your personality — these are becoming detachable, replicable, marketable assets. Once they exist as data, they follow data rules, not human rules. They can be copied, transferred, licensed, and leaked.

The Strike That Already Happened — and What It Actually Settled

SAG-AFTRA went on strike for 118 days in 2023. AI protections were a central demand. The result was a contract ratified in December 2023 with provisions that looked strong on paper:

  • Two categories of digital replicas: “Employment-Based” (created during a gig) and “Independently Created” (built without the performer present — i.e., resurrecting the dead).
  • Consent and compensation required for both.
  • Synthetic performers — AI characters built from multiple actors’ likenesses — trigger union notification and potential bargaining.
  • Background actors can’t be replaced wholesale by digital replicas.

Here’s what actually happened: the contract applied to SAG-AFTRA members working under SAG-AFTRA contracts. It didn’t cover your face on a HeyGen video, a coach’s voice clone on ElevenLabs, or a logistics company’s AI phone agent built by CloneOps. It covered actors in union productions. Period. The FTC is making noise about synthetic voices in marketing. California and Illinois have passed laws around deepfakes and biometric data. But there is no comprehensive federal framework for AI likeness rights. Not for performers, not for athletes, and especially not for you.

SAG-AFTRA fought for guardrails. They got guardrails — for their members, on their projects. Meanwhile, the rest of the economy kept building. The audience kept watching. And the tools got cheaper, faster, and easier to use.

That’s not a labor story. That’s a market signal. The question was never whether AI replicas would happen. It was whether anyone would opt out. So far, the answer is no — not the studios, not the platforms, not the consumers, and increasingly, not the individuals building clones of themselves to scale their hustle. The dead are acting. The living are being cloned. And the only people who drew a line were union actors — who got a contract that covers a fraction of the problem. You’re next. You just haven’t been asked to sign yet.
