
It's time to add AI protections to your will

By Eric, December 4, 2025

In a world increasingly shaped by technology, the emergence of AI-powered avatars and digital likenesses is prompting individuals to reconsider their legacies in unprecedented ways. A recent advertisement for the app 2wai illustrates this shift, showcasing a pregnant woman interacting with her mother through a video call, with the implication that her mother’s digital avatar could continue to provide guidance long after her physical presence is gone. This concept, often referred to as creating “deadbots” or digital resurrections, is no longer confined to celebrities; it is now accessible to everyday people. As AI technology becomes more sophisticated, it raises critical questions about privacy, consent, and the ethical implications of allowing one’s likeness to be used posthumously.

Experts like Sarah Chavez from the Order of the Good Death emphasize the importance of proactive planning regarding one’s digital legacy. With cases like the use of AI to resurrect the voice of a deceased victim for a public statement, the legal and social ramifications of digital likenesses are becoming increasingly relevant. Individuals are encouraged to take inventory of their digital assets, assign digital fiduciaries, and clearly state their wishes regarding the use of their likeness in their wills. This includes considerations about whether they want a synthetic version of themselves to exist and how their data will be utilized by tech companies. The current lack of comprehensive legislation surrounding the use of digital likenesses further complicates matters, making it imperative for individuals to articulate their desires explicitly to avoid potential misuse.

As the conversation around AI and death evolves, it also intersects with deeply human experiences of grief and memory. Emma Payne, a bereavement researcher, warns against replacing authentic human connections with artificial simulations, arguing that the process of mourning is essential for healing. The rise of “Death Tech” prompts society to grapple with how technology can aid in memorialization without overshadowing the genuine human experience of loss. As we navigate these uncharted waters, it becomes crucial to reflect on what a meaningful legacy looks like—one that aligns with personal values, honors the memory of loved ones, and respects the complexities of grief. The future of digital legacies may be bright with possibilities, but it also requires careful consideration and ethical deliberation to ensure that technology enhances rather than diminishes the human experience of life and death.

https://www.youtube.com/watch?v=gq9oJ51zMHs

A visibly pregnant woman stands in the middle of a bright, modern kitchen, rubbing her belly and speaking to someone on the other end of a phone. The phone screen turns. It’s a video call. And it’s not just anyone, but her mom, wearing a bright sweater and giving advice.
Ten months later, grandma is telling the toddler a bedtime story. She’s wearing the same sweater from before. Ten years go by, the preteen is telling grandma about his day at school. We see that red sweater again. Hm. The grandson is 30 now, he’s about to be a dad. Grandma hasn’t aged a day.
The scene is an advertisement, selling you the services of 2wai, an app currently in beta that turns a short video clip into an AI-powered avatar. They’re one of many companies trying to win people over to creating AI versions of themselves to be used after they die.


No longer is the fear of deepfakes and AI-powered legacy projects (frequently called resurrections or “deadbots”) the sole worry of famous celebrities. It is here, for the average person, in the hands of your family and friends.
So what if you don’t want a synthetic version of yourself giving advice to your descendants in perpetuity? Or your AI replica being used in advertisements, art, or by corporations who have access to your data?
It’s still uncharted territory, but you have options to ensure your digital likeness stays offline. And there are many reasons, not just legal or financial, why you might want to do it. Here’s how.


Start thinking about AI before you die
There’s one thing that needs to be stated right off the bat: Everyone should be planning for their death. 
“We invest so much time and consideration into milestones like weddings and having children, but very little thought is given to how we want to live our final months and years,” said Sarah Chavez. Chavez is the director of the Order of the Good Death, a global network of advocates and professionals working to reframe death and dying.
So alright, you know you need to make sure your digital ducks are in order before you get too old. But do you really need to think about AI, deepfakes, and digital likenesses, of all things?   
If you had asked Chavez this question a year ago, she would have had an entirely different response. That’s rapidly changed. “AI has become so prominent in our everyday lives, not just professionally and personally,” Chavez explained. “We’re also starting to see the dead used in a way that can have legal and social impact, too.” She points to the case of Chris Pelkey, a victim of a road rage incident whose voice was resurrected by his family to give his own victim’s statement. Chavez recalls the viral Shotline project, too, which used AI audio deepfakes of gun violence victims to urge politicians to pass common-sense gun reform legislation. Similar tech was used to create an AI likeness of Parkland shooting victim Joaquin Oliver.
There’s a high degree of risk associated with allowing digital versions of yourself to exist online, with no parameters. Could your digital likeness be used as a tool for scammers, for example, to con your family and friends or even strangers? What about the legal and social ramifications of a chatbot created in your image, one that may become embroiled in the same courtroom battles currently faced by ChatGPT and others? Another big question: What about your personal data privacy? Are you okay with your loved ones providing a tech company or AI developer with the massive amount of data needed to personalize an AI version of you?
“It’s important to remember that these tools are created by for-profit tech companies, which raises a number of concerns about ownership of that data and how it will be used,” warns Chavez. 
Regular people, not just celebrities or those who make headlines, are seeing the fallout of unfettered access to generative AI, like targeted scams and growing misinformation. Just a handful of bullet points in your will could decide whether your digital legacy is mired in the same controversies. If there was ever a time to start planning for the end of your life, it’s now.


AI, your death, and the law 
Cody Barbo, the founder of digital estate planning tool Trust & Will, suggests people use estate planning to better control their digital footprint. The service is like TurboTax but for writing a will, and he says he built it to help regular people who may be avoiding the conversation completely. It’s also a way to bring tech into an industry that has been slow to adopt it, even as AI poses huge security and estate questions.
“Over the past decade, end-of-life planning regarding tech has primarily focused on encouraging people to include information about what they want done with their cell phone, email accounts, and social media platforms, and making sure they’ve provided passwords and login information for their accounts,” Chavez explained. With AI emerging as a dominant technology, the industry needs to catch up.
“We’re just at the entry point,” Barbo said. “We’re dipping our toes in the water of what an AI version of ourselves could look like. [But] we want people to know that you can be in control.”
How does that work in practice? “The challenge with trying to protect something that is so new, that is so innovative, is that there’s no legislation to help you,” explained Solomon Adote, the chief information security officer of The Estate Registry and former chief security officer for the state of Delaware. “Some states say you cannot violate certain privacy protections, but nothing that explicitly says that you cannot abuse this person’s likeness, image, or other aspects of their representation.” In the background, a patchwork of state legislation is trying to address these concerns by extending privacy laws, which would better protect your digital assets, including data privacy, after you die.
For now, individuals have to turn to proactive estate planning. 
What are you trying to protect?
First task: Take a digital asset inventory. This involves surveying and noting all your digital accounts, log-ins, and data: social media pages and bank log-ins, but also cloud-based drives, and even text messages or DMs. It also means defining exactly what your digital likeness includes. Is it just depictions of you as an adult? Does it include your voice and physical mannerisms? What version of yourself can or cannot be turned into AI?
Some people may want to solicit the services of a digital identity trust, Adote said, which can help manage your online identity and intellectual property. 
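If it helps to picture what such an inventory might look like once it’s written down, here is a minimal, purely illustrative sketch in Python. It is not taken from this article or from any of the services mentioned in it; every field name and example entry is a hypothetical placeholder. The point is simply to capture, in one place, which accounts exist, where their credentials live, and which ones hold the photos, video, or voice recordings that could be used to train an AI likeness.

```python
# Purely illustrative sketch: one way to organize a digital asset inventory.
# Nothing here comes from the article or from a real estate-planning tool;
# all names and entries are hypothetical placeholders.
from dataclasses import dataclass, field

@dataclass
class DigitalAsset:
    name: str                # e.g. "Primary email"
    category: str            # "email", "social", "banking", "cloud storage", "messages"
    access_notes: str        # where credentials live (a password manager entry, never the password itself)
    likeness_material: bool  # does it hold photos, video, or voice that could train an AI likeness?
    instructions: str = ""   # what your digital fiduciary should do with it

@dataclass
class DigitalInventory:
    assets: list[DigitalAsset] = field(default_factory=list)

    def likeness_sources(self) -> list[DigitalAsset]:
        """Accounts that hold the raw material for a digital likeness."""
        return [a for a in self.assets if a.likeness_material]

# Hypothetical example entries
inventory = DigitalInventory(assets=[
    DigitalAsset("Primary email", "email", "password manager: personal vault", False,
                 "Close once estate matters are settled"),
    DigitalAsset("Family photo drive", "cloud storage", "password manager: shared vault", True,
                 "Archive for family; do not share with third-party AI services"),
])

for asset in inventory.likeness_sources():
    print(asset.name, "->", asset.instructions)
```

However you keep it, the inventory is only useful if the person you designate below knows where to find it.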
Who will help you protect it?
Next: Assign a digital fiduciary and know the (albeit limited) law. This is a person (or persons) given designated access to your digital assets, including online accounts. You can grant access to specific assets only, or limit access entirely, through both your will and your fiduciary. You can also provide them with guidance for your digital likeness, which is itself a digital asset, Adote explained.
The boundaries of digital fiduciaries are covered under the Revised Uniform Fiduciary Access to Digital Assets Act (RUFADAA), which has not been passed by every state. Under this law, a person assigned as a digital fiduciary can legally provide or gain access to someone’s online accounts after death or even incapacitation. But only trustee executors can access the content of said accounts, and only if the person who died consented. Tech companies, like Google and Meta, also operate under RUFADAA (that’s why we have things like Facebook legacy accounts and contacts now). If you don’t assign a fiduciary, your accounts default to the tech company’s Terms of Service. 
What will you allow and who will benefit?
Once you’ve assigned a fiduciary, you need to have a direct conversation with them about what they should and should not allow. With your “explicitly written and validated position” on AI use, Adote said, fiduciaries can more easily take legal action, like issuing cease and desist orders on intellectual property.
You can, quite simply, state in your will that you do not consent to anyone creating an AI-generated likeness of you, experts said.
You may want to phrase this as “living on in AI form” or the “publication of an AI-generated, synthetic version” of yourself. You may also want to be clear about data usage: “I do not consent to the use of my personal data to create an AI-powered digital likeness of myself.” Adote suggests your will should show clear intent, with phrasing like “I do not authorize my image or likeness to be used in any way, form, or fashion.”
Go over these with an estate attorney, as everyone’s situation and end-of-life needs are different — and state laws vary.
You can also stipulate very precise cases for how your digital likeness can be used, if it’s not a hard no. But be conservative and narrow with this language, other experts suggested. Write down, for example, exactly who is allowed to use or release it, just as you would with other assets or accounts. Explicitly list any charities or companies that are allowed to use your likeness as well.


If your likeness is in any way attached to your livelihood — that includes influencers — be clear about potential financial gain that could be generated from a personal AI, and decide where that money will go. 
These directives should be expressly written down in your will or another document that is accessible after you die. It comes down to just a few clear bullet points, experts say.
AI, grief, and memory
There are a few non-legal things to consider, too, especially if you are raring to live on in AI form. What are your values, and what is best for those who will miss you?
You may have ethical concerns about the use of AI — like its environmental impact or the political and financial motives held by its developers — and you’ll want to account for those at the end of your life too, said Chavez. 
Or maybe you want to curb any general use of your digital likeness, but still leave room for a digital version of yourself to be used by your family, for example. Consider what that entails. “While a griefbot can be trained with your own writing, and voice, it’s still selective or biased data used to create an inauthentic version of the deceased,” said Chavez, who also warns that prolonged interactions with the AI version of a person may fundamentally change the way they are perceived and remembered. 
Emma Payne is a bereavement researcher and the founder of Help Texts, a text-based grief support subscription service. Payne is concerned not just with the typical ways that AI has infiltrated posthumous legacies, like AI deepfakes and chatbots programmed to mimic your loved ones, but also with how technology is encroaching upon our social relationships. To her, memory matters. But imitation is an entirely different thing.
“End of life is a deeply human time and a massive opportunity for human connection and caring. So pushing it out, and trying to say that it’s not the end, worries me. Think deeply about what the end is for you,” Payne recommends. “By trying to extend or mitigate or transform that experience, knowing that you’re in the most human of times, are you helping the people you leave behind or are you actually hurting them?”
Take the recent words of Zelda Williams, director and daughter of actor Robin Williams, who took to the internet to decry AI-generated content of her father and other late celebrities: “To watch the legacies of real people be condensed down to ‘this vaguely looks and sounds like them so that’s enough’, just so other people can churn out horrible TikTok slop puppeteering them is maddening… If you’ve got any decency, just stop doing this to him and to me, to everyone even, full stop.”


Bereavement is a complicated process, but there are a few solid truths. First, the bereaved must accept the person’s death. Second, they need to find appropriate ways to memorialize them. Anything that tries to replace a real person and their memories with a pretend, future version, Payne says, is missing the entire point of healthy grief.
AI is becoming a bigger player in death, even behind the scenes. But even players in the industry who have embraced AI technologies are hesitant to incorporate them fully into the realm of end-of-life planning. Zack Moy is the co-founder of Afterword, a tech company that provides AI-powered infrastructure for funeral planning. Moy says he doesn’t build tech-based solutions unless he’s sure they’ll better the human experience. He’d never replace grief with a bot, for example, but he can use AI to make it easier to execute a person’s wishes after death.
“The vast majority of funeral directors we work with care about what they’re doing and deeply care about that family experience, and we followed their example,” Moy said. “The technology isn’t going to make the suffering any easier. We can’t make death not suck.” 
As a technological society, we are skirting close to a grief precipice, a social reckoning with death and memory that’s been expedited by what is now referred to as “Death Tech.” With the rise of generative AI, tech isn’t just helping account for digital assets or speeding up funeral planning in order to make the grief of our loved ones a little lighter. It’s trying to change our lives post-mortem. Now we must come to terms with how we will be memorialized, mimicked, or even mocked by our very own likenesses at the hands of strangers and loved ones.
“We all have a ‘legacy’ to consider,” said Chavez. “Just as we ask people what a ‘good death’ looks like for them, we need to ask ourselves what does a good legacy look like? Actions that align with your values and beliefs? Authenticity?”
