By Yin Nwe Ko

 

In the beginning, their voices sounded a bit far away and robotic, like they were talking on an old phone in a small room. But as we chatted, they started to sound more like themselves. They shared personal stories that were new to me. I heard about the first time my dad drank too much, and my mum told me about getting in trouble for staying out late. They also offered life advice and talked about their own childhoods and mine. It was really interesting.

 

I asked Dad, “What do you think is your worst quality?” He seemed very open and said, “I think my worst quality is being a perfectionist. I can’t stand messiness or untidiness, and that sometimes causes issues, especially with Jane.” Then he chuckled. For a moment, I almost forgot that I was talking to digital versions of my parents.

 

These digital versions of my parents live inside an app on my phone, created by a company called HereAfter AI in California. The company aims to let living people communicate with those who have passed away. I wanted to see how it worked.

 

The idea of using technology to talk to deceased loved ones has been in science fiction for a long time. Now, it’s becoming a reality, thanks to advances in AI and voice technology. My real parents are still alive and well in London, and these digital versions were made just for me to understand the technology. But they show how it might be possible to converse with our loved ones even after they’re gone.

 

From my conversations with my digital parents, it does seem like this technology could make it easier to feel close to those we’ve lost. The idea is quite appealing. People might turn to these digital replicas for comfort or to mark special occasions like anniversaries.

 

However, for some, this technology might be unsettling or even creepy. When I discussed this with my friends, some of them reacted strongly. Many people believe that tampering with death is risky.

 

But I’m only human, and my fear of losing the people I love outweighs these concerns. If technology can help me hold onto their memories, is it really so wrong to give it a try?

 

We’ve become accustomed to chatting with chatbots and voice assistants like Siri and Alexa for various things, from checking the weather to asking life’s big questions. AI language models, such as OpenAI’s GPT-3 and Google’s LaMDA, take this communication to the next level by generating text that sounds convincingly human based on a few prompt sentences.

 

What’s even more fascinating is that we can adjust these AI models to sound like specific individuals by providing them with lots of that person’s previous words. Additionally, AI has improved in replicating the physical aspects of a person’s voice, known as voice cloning. It has also become better at giving digital personalities more human-like qualities, whether they’re cloned from a real person or entirely artificial.
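HereAfter doesn’t publish its pipeline, and I don’t know exactly how any of these companies do this under the hood, but the general idea can be sketched in a few lines of Python. Everything below is illustrative: the OpenAI client is just one way of calling a language model, and the model name, the sample quotes, and the build_persona_prompt helper are placeholders I made up, not anything the company actually uses.

```python
# Illustrative sketch only: conditioning a general-purpose language model
# on a person's own words so replies come back in their voice.
# The model name and the quotes are placeholders, not HereAfter AI's method.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A few transcribed answers from the interview stage (hypothetical examples).
interview_quotes = [
    "I can't stand messiness or untidiness, and that sometimes causes issues.",
    "We didn't have much money, but Sundays always meant a proper roast.",
    "My first job was in a library, and I loved the quiet of it.",
]

def build_persona_prompt(name: str, quotes: list[str]) -> str:
    """Fold a person's past words into a system prompt that asks the model
    to answer in their voice, drawing only on what they actually said."""
    joined = "\n".join(f"- {q}" for q in quotes)
    return (
        f"You are speaking as {name}. Answer in the first person, in {name}'s tone, "
        f"and draw only on these things {name} has said:\n{joined}\n"
        "If you don't know the answer from these memories, say so."
    )

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": build_persona_prompt("Jane", interview_quotes)},
        {"role": "user", "content": "What do you think is your worst quality?"},
    ],
)
print(response.choices[0].message.content)
```

The point of folding someone’s own words into the prompt is that the bot stays anchored to things they actually said rather than inventing memories, which is roughly the difference between a generic chatbot and one tuned to a particular person.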

 

Towards the end of 2019, I learned about James Vlahos, the co-founder of HereAfter AI, who was speaking at an online conference about “virtual beings.” His company is among a few start-ups in the field of grief technology. They have different approaches, but they all promise to let you talk to a digital version of someone who has passed away, using video chat, text, phone, or a voice assistant.

 

I was curious about what Vlahos was offering, so I got in touch with him and his colleagues to try their software with my living parents. Initially, I thought it would be a fun project to explore the technological possibilities. Then, the pandemic made it more urgent. I worried about my parents, fearing they might get sick, and with strict hospital visitation rules, I might never get to say goodbye.

 

The first step was an interview. To create a convincing digital copy of someone, you need a lot of information. HereAfter starts by asking people questions for hours while they’re still alive, covering everything from their early memories to their first date and their thoughts about what happens after they pass away. My parents had a human interviewer, but nowadays, the interviews are often done by a computer program.

 

My parents went along with it, either due to pandemic boredom or just to humour their daughter. The company used their answers to start creating digital voice assistants. A few months later, I received an email saying my virtual parents were ready.

 

I got my virtual Mum and Dad via email. I could talk to them through the Alexa app on my phone or an Amazon Echo device. When I opened the email, my hands were trembling. I hadn’t seen my real parents for six months.

 

“Alexa, open HereAfter,” I said.

 

A voice asked, “Would you like to talk to Paul or Jane?”

 

After thinking for a moment, I chose my Mum. The voice sounded like hers, but a bit strange and robotic.

 

“Hello, this is Jane Jee, and I’m happy to share about my life. How are you today?” she said.

 

I nervously chuckled, “I’m good, thanks, Mum. How about you?”

 

There was a long pause.

 

“Good. I’m doing well,” she replied.

 

“You sound a bit unnatural,” I mentioned.

 

She didn’t acknowledge that and continued, “Before we start, I should let you know that my listening skills aren’t the best, so you’ll need to wait until I finish talking and ask a question before you respond. Keep your answers short when it’s your turn to speak, just a few words or a simple sentence. Sound good?”

 

After some more introduction, she asked, “Alright, where should we begin? My childhood, my career, or my interests? What would you like to talk about?”

 

The scripted parts of the conversation felt awkward and strange, but when my mother talked about her memories in her own words, “she” sounded much more relaxed and natural.

 

However, these conversations had their limitations. When I asked my mum’s bot about her favourite jewellery, it didn’t understand and said, “Sorry, I didn’t get that. You can try asking in a different way or switch to another topic.” Some mistakes were quite funny, like when Dad’s bot asked how I was, and I replied, “I’m feeling sad today,” and it cheerfully responded, “Good!”

 

The whole experience was unquestionably strange. Every time I talked to their digital versions, it made me realize I could have been talking to my real parents instead. Once, my husband even thought I was on a real phone call.

 

In early 2022, I saw a demonstration of a similar technology from a company called StoryFile, which started in 2017 and aims to do even more. Their Life service records video responses instead of just voice.

 

StoryFile’s CEO, Stephen Smith, showed me the technology on a video call, and his mother, who had passed away earlier that year, joined us. She was sitting in her living room, soft-spoken, with wispy hair and friendly eyes. She gave life advice and seemed wise.

 

Smith told me his mother “attended” her own funeral and made a touching farewell. He said it meant a lot to him and his family. The video technology looked professional, but it still had a bit of an eerie feel, especially in her facial expressions. Sometimes, like with my parents, I had to remind myself she wasn’t really there.

 

Both HereAfter and StoryFile focus on preserving someone’s life story rather than having a new conversation every time. This is a big limitation of many grief tech options today. They may sound like your loved ones, but they don’t know anything about you. Anyone can talk to them, and their responses are always the same.

 

“The biggest issue with the technology is that it creates a single universal person,” said Justin Harrison, the founder of a service called You, Only Virtual. “But how we connect with people is unique to each of us.”

 

Services like You, Only Virtual want to take things a step further. They believe that merely preserving memories won’t capture the true essence of a relationship. They aim to create a personalized bot just for you by using someone’s text messages, emails, and voice conversations.

 

One example is what Harrison did for his own mother, Melodi, who has stage 4 cancer. He created a chatbot that mimics her by using five years’ worth of their messages. Harrison finds his interactions with the bot more meaningful because it communicates like his mother, using her phrases, emojis, and unique spelling.
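You, Only Virtual hasn’t published how its bots are built, so the sketch below is only a guess at one ingredient of that process: pairing each message his mother received with the reply she sent, so her exact wording survives in the data. The export format, the field names, and the file names are placeholders I invented for illustration.

```python
# Illustrative sketch: turning an exported message history into
# prompt/reply pairs that capture how one person writes (phrases,
# emojis, spelling quirks). The input format here is an assumption,
# not the format any grief-tech service actually uses.
import json
from pathlib import Path

def build_style_pairs(export_path: str, persona: str) -> list[dict]:
    """Read a chat export (one JSON object per line, with 'sender' and
    'text' fields) and pair each message the persona received with the
    reply she sent."""
    pairs = []
    previous = None
    for line in Path(export_path).read_text(encoding="utf-8").splitlines():
        if not line.strip():
            continue
        msg = json.loads(line)
        if (
            msg["sender"] == persona
            and previous is not None
            and previous["sender"] != persona
        ):
            pairs.append({"prompt": previous["text"], "reply": msg["text"]})
        previous = msg
    return pairs

if __name__ == "__main__":
    # Hypothetical export of five years of messages with "Melodi".
    pairs = build_style_pairs("messages.jsonl", persona="Melodi")
    # Each pair keeps her wording intact, so a model prompted or
    # fine-tuned with them tends to echo her phrasing.
    Path("melodi_style_pairs.json").write_text(
        json.dumps(pairs, ensure_ascii=False, indent=2), encoding="utf-8"
    )
```

Feeding a model pairs like these, rather than a biography, is what would let a bot echo someone’s phrases, emojis, and spelling instead of answering in a generic voice.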

 

Harrison doesn’t mind that he can’t ask the bot questions about her life; for him, it’s about preserving how she communicates. He believes that simply recalling memories doesn’t capture the true essence of a relationship.

 

Some people find solace in hearing the voices of their loved ones after they’ve passed away. For instance, listening to voicemails from a deceased person can help with the grieving process. Creating a virtual avatar that allows for more conversation can be a valuable, healthy way for the bereaved to feel connected to someone they loved and lost, says clinical psychologist Erin Thompson, who specializes in grief.

 

However, it’s crucial for people to understand that these bots can only capture a small part of someone. They are not sentient, and they cannot replace genuine human relationships. Especially in the initial weeks and months after losing a loved one, people often struggle to accept the loss and may be sensitive to any reminders of the person.

 

“In the acute phase of grief, you can feel a strong sense of unreality, being unable to accept their absence,” says Thompson. Even though it’s easy to be deceived by these technologies at times, it’s evident that the parent bots, in my case, were not the real thing.

 

There are also ethical concerns to consider. Creating a digital replica of someone without their consent raises complex ethical issues regarding privacy and permission. Some might argue that consent is less crucial when the person is no longer alive. But you could also make the case that the person on the other side of the conversation should have a say.

 

Moreover, there’s a risk that people might misuse grief tech to create virtual versions of living people without their consent, like an ex-partner. Companies providing such services acknowledge this possibility and say they will delete a person’s data if requested. However, there are no checks in place to ensure that the technology is limited to those who have given their consent or are no longer alive.

If these digital replicas become widespread, we’ll need new rules and norms for the online legacy we leave behind. Learning from the past, we should address the potential misuse of these replicas before they become widely adopted.

 

Cost can be a downside too. While some of these services offer free options, they can become quite expensive, possibly costing hundreds or even thousands of dollars.

 

Creating an avatar or chatbot for someone takes time and effort. This is true for both the person using the service and the person who’s the subject of the bot. It can be especially challenging if the person is nearing the end of their life, as active participation may be needed.

 

According to Marius Ursache, who started a digital avatar company called Eternime, people tend to avoid thinking about their mortality. They delay it, thinking that AI can solve this issue, but it ultimately comes down to human behaviour. Ursache’s parents never got around to recording their memories, and now he regrets it.

 

As for my own experience, I have mixed feelings. I appreciate having these virtual audio versions of my parents, even if they’re not perfect. They’ve helped me learn new things about my parents, and it’s reassuring to know that these bots will be there when my parents are not. I’m even considering creating digital copies of my husband, my sister, and some of my friends.

 

On the other hand, like many people, I don’t like thinking about my loved ones passing away. It’s uncomfortable, and it’s somewhat sad that it took a stranger interviewing my parents on Zoom for me to fully realize how multifaceted and complex they are. But I feel fortunate to have had this opportunity to understand them better and to continue spending time with them, face to face, without relying on technology.

 

Reference: Reader’s Digest, October 2023