The New York Times recently published an article entitled "What Should Happen to Our Data When We Die?" The focus centered on the use of deceased actors', athletes' and celebrities' likenesses in newly created movies and other video. Peter Cushing, for example, posthumously reprised his role in Rogue One: A Star Wars Story, and Carrie Fisher hers in Star Wars: The Rise of Skywalker, only because the technology is sophisticated enough to construct new scenes.
But these issues transcend the famous. That same article quotes Carl Ohman, a digital ethicist, who estimates that Facebook could have 4.9 billion deceased users by this century's end. Artificial intelligence therefore represents a huge sociological shift. For centuries, only the rich and famous were documented. Now all of us might be.
From an ethical standpoint, who owns the rights to our emails, online search history, images, likenesses, and social media posts? Should we specifically mention them in our wills? Various state laws, Florida’s included, detail what rights our beneficiaries have to access these online resources, as do the “User Agreements” we never read but click through in order to transact business on the web.
But what about recreations? In California, for example, digital likenesses are protected for up to 70 years following death; in New York, the period is 40 years.
What, if any, right to privacy does our estate enjoy? Robin Williams prohibited the use of his voice or likeness for a period of 25 years following his death in 2014. That restriction helps prevent someone in a state other than California from attempting to profit that way.
Some individuals are busy creating their own A.I. selves using apps and services. HereAfter is focused on family history and, for a fee, interviews clients about critical moments in their lives. The answers are apparently used to create Siri-like chat bots. If your descendants want to learn about your first job or early adulthood, they could ask the bot, and it would answer in your voice!
The NY Times article also mentions Replika, a chat bot app that creates avatars mimicking its users' voices. Last year, Microsoft filed for a patent that combines 3D imagery with chat bot-like voice replication. Imagine Gunsmoke actor James Arness standing outside a steakhouse, urging customers to step in and enjoy a meal there.
I don't know whether artificial intelligence will replicate many of us after our deaths. But there's certainly a danger: just as the technology can be put to good use, it can also be abused. Consider how fraud might be perpetrated. I envision an adult child creating a video of a deceased parent explaining how much they disliked a sibling, and why that sibling was written out of a will. Today the video technology probably isn't good enough to fool anyone, but what about in the decades to come?
There are other scams that may entrap more people. Many have heard about a recent one, where an elderly relative is called by someone posing as a grandchild, pleading for money because they're supposedly in jail and must make bail. The worried grandparent quickly sends money on an app like Venmo. How many more will fall prey when the voice is an A.I. version of the grandchild's actual voice? It isn't beyond reason that someone could lift audio files from social media to reconstruct the grandchild's voice.
It's amazing to me how fast the technology improves. It's both exciting and troubling. I'm sure the next Heckerling Estate Planning Conference I attend will include discussions and a review of the law on these issues.