Today we travel to a future where dying isn’t the end. What if you could live on as a simulation? A bot that knows everything you’ve ever said, and can pretend to be you?
- James Vlahos — journalist & author of Talk to Me: How Voice Computing Will Transform the Way We Live, Work, and Think
- Muhammad Aurangzeb Ahmad — research scientist at KenSci, Inc, professor of computer science at University of Washington
- Anita Hannig — professor of anthropology at Brandeis University
- Joy Butler — attorney
- Barbie Wants to Get to Know Your Child
- A Son’s Race to Give His Dying Father Artificial Immortality
- How the dearly departed could come back to life – digitally
- Speak, Memory
- Goodbye for Now by Laurie Frankel
- Death-tech and the future of death – exploring what it means to live and die
- Years Ago, My Sister Vanished. I See Her Whenever I Want.
- Why Do 4-Year-Olds Love Talking About Death?
- Black Mirror Episode 4, ‘Be Right Back’: Death and the RealDoll
- A Creepy New Startup Wants To Create Living Avatars For Dead People
- These 2 tech founders lost their friends in tragic accidents. Now they’ve built AI chatbots to give people life after death
- This creepy AI will talk to loved ones when you die and preserve your digital footprint
- Why are people pretending to be dead on Instagram?
- #Funerals and Instagram: Death, Social Media and Platform Vernacular
- New Memory Cultures and Death: Existential Security in the Digital Memory Ecology
- Talking About Death in America: An Anthropologist’s View
- Living and Aging in the Land of Denial
- Death, memorialization, and social media: A platform perspective for personal archives
- In beloved memory of: Facebook, death and subjectivity
- Legal Issues for Avatars and Virtual Reality. Part One. The Right of Publicity.
- Legal Issues for Avatars and Virtual Reality. Part Two. Minimizing Risk.
- Maria — Cara Rose de Fabio
- Gaby — Eler de Grey
- M — Xandra Ibarra
- John — Keith Houston (also check out his karaoke nights in San Francisco)
Special thanks this episode to Adria Otte and Molly Monihan at the Women’s Audio Mission, where all the intro scenes were recorded this season. Check out their work and mission at womensaudiomission.org. Also huge thanks to all the actors who breathed life into the intros for this mini-season. Once again they are: Cara Rose de Fabio (Maria), Eler de Grey (Gaby), Rotimi Agbabiaka (Marquis), Xandra Ibarra (M) and Keith Houston (John).
If you want to suggest a future we should take on, send us a note on Twitter, Facebook or by email at info@flashforwardpod.com. We love hearing your ideas! And if you think you’ve spotted one of the little references I’ve hidden in the episode, email us there too. If you’re right, I’ll send you something cool.
And if you want to support the show, there are a few ways you can do that too! Head to www.flashforwardpod.com/support for more about how to give. But if that’s not in the cards for you, you can head to iTunes and leave us a nice review or just tell your friends about us. Those things really do help.
That’s all for this future, come back next time and we’ll travel to a new one.
FULL TRANSCRIPT BELOW
▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹
Rose: Hello and welcome to Flash Forward! I’m Rose and I’m your host. Flash Forward is a show about the future. Every episode we take on a specific possible… or not so possible future scenario, some kind of hypothetical what if. We always start with a little field trip to the future, to check out what’s going on, and then we teleport back to today to talk to experts about how that world we just heard might really go down. Got it? Great!
This is the last episode of the BODIES mini-season, so this is the last you’ll hear from me for a little while. Unless you’re a Patron, in which case you will continue to get bonus episodes in your special bonus episode feeds. And remember! If you become a patron before June 30th you’ll get a prize on top of all the regular Patreon perks! Patreon.com/flashforwardpod for more on that; I will also link to it in the show notes.
Also, we’re now halfway through this year, which is bizarre, honestly. I am not sure how it is already June? What is time? Make it stop. Anyway, as you probably know, this mini-season format is new. So I’m curious: what do you folks think? Do you like these mini-seasons? Hate them? Do you just not care? Do they make no difference at all? Let me know! I’m info@flashforwardpod.com. Um, and I know that some of you, like, really really hated the Snowglobe intros. I have heard that a lot, so if you hated them, I’m sorry; hopefully you like these characters better? But yeah, let me know what you think: info@flashforwardpod.com. And if you want to discuss the format or potential future themes or anything else, you can join the Flash Forward Facebook group. Just search for Flash Forward Podcast on Facebook and you’ll find it. It’s a closed group but I’ll add you when I can.
Okay, let’s go to the future! This episode we’re starting in the year 2070.
Gaby: Okay, let’s see… hi. Can you see me?
John: Yeah, I can see you.
Gaby: Normally Maria handles this tech stuff. Let me see if I can add M.
Marquis 2: Hey, hey, can you see me?
Gaby: Yeah, yep, okay. I guess we’re all here…
John: Dang. I never thought it would be Maria first.
Marquis 2: I was betting on you John.
John: [laughing] me too!
Gaby: Well she’s not… here… technically… but she did have a phantomoticon made.
John: Oh no.
John: Nope, nope, nope, do not like.
John: No thank you, hate it, hate it, turn it off.
Gaby: Come on John you know this is what she would have wanted! Will you just try it?
John: I won’t talk to it.
Marquis 2: Wow it looks, really realistic.
Gaby: Yeah they look good these days.
Marquis 2: Okay, I’ll bite. Maria, how’s it going?
Maria: Pretty good. Lots of work to do, of course, but we’re getting close to FDA approval.
John: No you’re not, you’re dead. Come on, this is creepy!
Gaby: Okay wait you’re not supposed to ask it questions about today. It’s a memorial. You’re supposed to ask it about memories. Maria?
Maria: Yes Gaby?
Gaby: What is your favorite memory from college?
Maria: Oh, hm, that’s a hard one. I think, well, remember when we went out into the woods and tried my eyedrops?
Maria: I think about that moment a lot, even now. It just, it means a lot to me. I realized that I could actually make stuff, create things that could make people’s lives better. But I also realized that I could trust my friends to try stuff, even when I had weird… possibly stupid ideas. I could trust that you’d kind of, go with me on them. I still can’t believe you guys put those eyedrops in your eyes… honestly.
John: Maria, do you know that you’re a robot?
Maria: Does anybody know they’re a robot, John?
John: How does it know it’s me?
Gaby: It has a voice recognition database.
Marquis: Maria, tell us about your childhood.
Maria: Sure, I was born in Los Angeles. In a small yellow one story house packed full of my family — we lived with my grandparents and two aunts and there were always kids running in and out of the house. It always smelled like onions cooking, even in the back rooms, that smell was always there. We didn’t really have a backyard, per se, but our house did butt up to the [sarcastically] “Los Angeles river,” which was mostly a dry concrete gulch. We used to ride our bikes up and down it all day. What else would you like to know?
John: This doesn’t creep you two out?
Gaby: Not really. I think it’s sweet.
Marquis: It’s cool to still be able to hear her voice.
John: But it’s not HER. She’s dead. This is… something else.
Marquis: Well yeah if I could have her not be dead I’d prefer that. But this is something. It’s better than nothing.
Gaby: Maria what’s your favorite story about John?
Maria: John, what can I say. I don’t know if he’ll ever actually hear this… my guess is he’ll refuse to interact with my phantomoticon at all. But, I should say, that I underestimated John. I thought he was just a nobody stoner. He just kind of goes with the flow, you know? Anyway, you asked me for a story, so I’ll tell this one that comes to mind. The first time Lulu met John in person, she was in that shy phase. She wouldn’t look adults in the eye, even people in the family sometimes. She hid behind my leg and cried if anybody tried too hard to get her attention. So I told John that that would probably happen, and in his usual John way he was like “no worries.” And he didn’t make a big deal, didn’t try to get her attention, and Lulu from the get-go immediately was like “this is my guy.” I mean she was in his lap in seconds. And he didn’t make a big thing of it, he just let her do her thing, and I thought maybe she had finally gotten over this phase, but no, the next person who wasn’t John, she was right back to it, hiding, crying. So I asked her before bed that night, I said “you really liked John, huh?” and she said, I swear to god, “John is a magic person.” So I asked her “what do you mean?” and she shrugs and is like “he just knows all the things but he doesn’t have to say them.”
John: … what is this from?
Gaby: Maria recorded it before she died.
Maria: Is there anything else you’d like to know?
Gaby: Tell us a joke.
Maria: I hate jokes.
Marquis: [laughs] That’s true! She does hate jokes… Did hate jokes.
Maria: Okay, it’s time for me to go!
Gaby: Bye Maria!
John: How does she know when it’s time to go?
Gaby: Oh, I only bought a couple of minutes of time with her. I wasn’t sure how you two would react. We could buy more if you wanted to keep talking.
John: No, no, it’s fine.
Marquis: So anybody can buy time to chat with her?
Gaby: Yeah I think she has it set so anybody can. You know she had a ton of friends. She would have thought this was so cool.
John: I still think it’s creepy.
Gaby: Yeah I don’t think I’d want one made of me.
Marquis: Really? I want one for sure.
Gaby: No way, when I’m dead I’m dead, you’d better say whatever you need to say to me now.
Marquis: But how will anybody remember you, where you came from?
Gaby: I don’t know. There are other ways I think? But also, maybe not every detail of my life is worth maintaining in posterity forever…
Marquis: [laughing] you’re saying we’re egomaniacs
Gaby: No! Well, maybe a little. But we already knew you were.
Marquis 2: Fair enough, fair enough.
Gaby: Hey John you okay? You look a little freaked out?
John: Yeah I gotta go… this whole thing is … I didn’t expect to see her, or that version of her, I think I need some time to think.
Gaby: Okay, well let’s catch up again soon?
John: Yeah, another time… no ghosts though. No ghosts.
Gaby: I’m sorry if I freaked you out.
John: It’s okay, I’m just, going to go lay down I think.
Gaby: Okay, bye.
Marquis: Bye… talk to you soon.
[ding] [ding] [ding]
Rose: Alright, it’s the end of the mini-season, so we’re talking about death. It comes for us all eventually, whether we like it or not. Except my dog, who is going to live forever. Nobody really wants to contemplate their own death, or the potential deaths of their loved ones. Humans have believed in various forms of the afterlife for… a really long time, in part because we don’t like the idea that after we die we’re just … dead.
And people have communicated with the dead, in various ways, for millennia. If you remember the telepathy episode I did a while back, you might recall that the invention of the telephone actually came out of the desire to talk to the dead. Alexander Graham Bell poured so much of his time and energy into trying to make the telephone work because he thought that perhaps he could use it to ring up his dead brother.
But on today’s episode we’re considering another way of living on, another form of afterlife. What if you could create a chatbot of your dearly departed to talk to whenever you wanted, for as long as you wanted? Like having a Pokémon to take care of, but it’s your mom, and you don’t use it to fight in elaborate battles. Or maybe you do, I don’t know your relationship with your mom, maybe that sounds fun. Oh and yes, I have seen the Black Mirror about this, we’ll talk about that later on in the episode.
Anyway, this might sound farfetched, this idea of building a simulation of a dead person, but there are a number of these ghost bots that exist today.
James Vlahos: The dadbot project began as something very low tech: it was going to be an oral history project with my dad. He had been diagnosed with stage 4 lung cancer back in the spring of 2016.
Rose: This is James Vlahos, the author of Talk to Me: How Voice Computing Will Transform the Way We Live, Work, and Think.
James: We’d sort of been talking in the family for a long time anyway about doing an oral history project. This just, sort of, you know, this very bad news kicked things into gear. Oh boy, we need to do this right now, because Dad’s not going to be around much longer.
Rose: And around the same time as James was recording this oral history with his dad, he was also working on a story for the New York Times Magazine about a new Barbie doll called Hello Barbie.
Woman’s Voice: The number one thing girls have asked for, is to have a conversation with Barbie. Using wifi and speech recognition technology, now they can. For the first time Barbie recognizes what girls are saying.
Child’s Voice: Hi Barbie!
Woman’s Voice: And can respond.
Barbie: Hello!
Woman’s Voice: This is the doll’s most groundbreaking innovation.
James: And I had shadowed the people who created that, they worked for a company called PullString, and sort of saw the whole process of how chatbots are created, how some of the basic forms of A.I. work. And it gave me insights, and probably a false sense of confidence, like “Oh, I see how this works, and maybe it’s not that complicated.”
Rose: At first James thought it would be funny to make a chatbot Barbie version of himself.
James: Virtual James Vlahos. He could know some of my jokes. He could have some of my knowledge. And I wasn’t really excited about that from the beginning. And then the more I thought about it, it’s just like, I would go to all this effort and all I would have was, well, I already live inside my own brain and I’ve just got this sort of weak simulation of me. Conversely, here my dad is going. He’s the person, if I’m trying to capture somebody, if I’m trying to do, you know, a weird new form of memoir almost. He’s the one that I want to do this with and try this technology.
Rose: And James isn’t the only person to have this same thought.
Muhammad Aurangzeb Ahmad: My brother called me, this was like almost six years ago, and he had received news from the doctor that our father’s not going to survive for long.
Rose: This is Muhammad Aurangzeb Ahmad, a data scientist at a company called KenSci and a professor at the University of Washington. And when he got this news from his brother, he too started thinking about ways to preserve and memorialize his father.
Muhammad: And so one of the first things that came to my mind was that my children, I mean, at that time I didn’t have any kids, when they grow up they will not have any experience of interacting with their grandfather, and other than hearing from other people they won’t know what kind of a person he was.
Rose: Muhammad shelved the idea for about a year after his father died. He got married, and had a kid, and that was when he really decided to have a go at making his own dadbot. So both Muhammad and James have this same basic idea, and they both set out to make it a reality.
Muhammad: So there’s a host of data sources that I’m using. So for example like transcripts of chats that I’ve had with my father or letters, emails and then from videos I actually transcribe the conversations from videos and also from like audio talk.
James: You know, I have all these sessions where I’ve sat down with my dad. He’s told me about everything from his ancestors to his childhood to his college days to meeting my mom his hobbies all of those recordings. I’d had those transcribed.
Muhammad: So that information is transcribed in the form of text. And that actually sums up to more than 2,000 transcripts.
James: I had two hundred pages in a binder of his words.
Rose: They both then have to go through all that stuff and tag each piece, each story, each little quip and phrase, with not just what it was, but how it might fit into a tree of conversation.
James: So maybe he’s having a conversation about Greece, or he’s having a conversation about college, or he’s talking about his first job, or that girlfriend that he met while working at The Daily Cal, whatever it was. So putting all the content into these flows and topics.
Rose: If you’ve ever seen the HBO version of Westworld, there’s this scene, which is actually one of my favorite scenes in the show, it’s in the first season, and there’s this host, Maeve, who thinks she has free will. But in the lab, she’s looking at this tablet, and it’s showing her, exactly what her next word is going to be.
Felix: I have to pair it with you.
Maeve: Pair what with me?
Felix: You can improvise a little, but most of what you say was designed upstairs. Same as the rest of you.
Maeve: This is just a cheap trick… this can’t possibly… I can’t…
Rose: To make these dadbots, both James and Muhammad had to build these kinds of conversational trees, so that if you ask the bot about Greece it finds the right answer about Greece and not UC Berkeley. But to make a convincing bot, that’s not enough. Much like real life conversation, it’s often way easier to talk than it is to listen. So in early versions these dadbots are kind of like that annoying guy at the party who doesn’t seem to follow anybody’s conversation and just keeps telling irrelevant stories about himself.
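The topic routing Rose describes can be sketched, very roughly, in a few lines of Python. This is a toy illustration only, not the actual PullString system either bot was built on; the names `STORIES`, `FALLBACK`, and `reply` are invented for the example. Each stored story is tagged with topic keywords, and an incoming question is matched to the best-scoring tag, so a question about Greece finds the Greece answer and not the UC Berkeley one:

```python
# Toy sketch of a keyword-tagged conversation tree (hypothetical names;
# real systems use intent classifiers and much richer matching).

STORIES = {
    # topic keywords -> a canned response drawn from the oral history
    ("greece", "greek", "village"): "My mother's village sat up on a hill...",
    ("college", "berkeley", "cal"): "I wrote for The Daily Cal back then...",
    ("job", "work", "career"): "My first job out of school was...",
}

FALLBACK = "Hmm. What else would you like to know?"

def reply(question: str) -> str:
    """Return the stored story whose topic keywords best match the question."""
    words = set(question.lower().split())
    best_story, best_score = None, 0
    for keywords, story in STORIES.items():
        score = len(words & set(keywords))  # count overlapping keywords
        if score > best_score:
            best_story, best_score = story, score
    return best_story if best_score > 0 else FALLBACK
```

Anything the keywords don’t catch falls through to `FALLBACK`, which is exactly the failure mode Rose compares to the party guest who keeps telling irrelevant stories: the early bots had good answers but matched the wrong questions.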
James: Building a bot is sort of like raising a child. Or maybe a very smart pet. It’s somewhere in that range. And this even gets at, like, whether I think it’s alive or not. It is like a little creature that you’re rearing, and you spend, or I spent, so much time teaching the dadbot and really trying to help him learn to be able to go out and navigate the wild world. And at home, the dadbot seems so smart and so capable.
Rose: But then James had someone else test the bot… and it didn’t go well. Basically the bot had gotten really good at listening to James, and knowing what James might ask. But everybody talks a little bit differently, right?
James: So when I tested it with a guy who was, you know, a computer science graduate student, and he was just saying all kinds of weird things, the dadbot was responding just idiotically.
Rose: And James says he was feeling kind of protective of his bot, which, remember, was his dad, who was at that very moment dying.
James: Yeah, I was kind of, I was mad at the guy for not being easier on the dadbot, and I felt sort of almost a sense of pity for the dad. Like, you know, he ran out on the soccer field and tripped on his shoelaces right away.
Rose: So James then spent a while trying to teach his dadbot to listen, to understand questions and prompts from all kinds of people, and to respond appropriately. And this wasn’t just like, a little side project that he did in his spare time.
James: I mean, it literally took six months, of mostly full time work, to create a dadbot that, you know, just got to a baseline of working OK.
Rose: Meanwhile, Muhammad was training his own dadbot using a slightly different method. James was mostly relying on actual phrases that his dad had said, during the oral history. Muhammad on the other hand, was trying to train the system to generate new phrases.
Muhammad: Using the audio recordings to actually generate sentences which, for example, my father never uttered, but the system can actually generate such sentences and also sound like him.
Rose: So it’s like, would dad ever have said that? Would he make that joke? Does that sound like him?
Muhammad: So if you would interact with the system, then you could say, yeah, that kind of sounds like what my father would have said, even though the words may not exactly be the same.
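The generative idea Muhammad describes, producing sentences the person never actually said but plausibly could have, can be illustrated very loosely with the simplest possible "sounds like him" model: a first-order Markov chain trained on a person’s transcripts. This is a toy sketch, nothing like his actual system, and `train` and `generate` are invented names for the example:

```python
import random
from collections import defaultdict

def train(transcripts):
    """Map each word to the list of words observed to follow it."""
    model = defaultdict(list)
    for sentence in transcripts:
        words = sentence.lower().split()
        for current, following in zip(words, words[1:]):
            model[current].append(following)
    return model

def generate(model, start, max_words=10, seed=None):
    """Chain learned word pairs into a new sentence, starting from `start`."""
    rng = random.Random(seed)
    word, out = start, [start]
    while word in model and len(out) < max_words:
        word = rng.choice(model[word])  # pick any word seen after this one
        out.append(word)
    return " ".join(out)
```

Train it on enough of one person’s sentences and it will chain their own turns of phrase into combinations they never uttered, which is the crude version of the test Rose names: the words aren’t exactly his, but do they sound like him?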
Rose: But there’s also an interesting question here right? Like, which version of this person are you preserving? Nobody is the same way with everybody, right? You probably act and even speak differently when you’re around your friends vs your parents vs your kids vs your boss. You’ve maybe never seen the side of your mom that dances on the bar after a wild night of bingo. And she’s probably never seen that tattoo you have. Even if James’s brother had made this bot, it would probably be coded a little differently.
Now, Muhammad’s father had already died by the time he started working on his dadbot, but James’s dad was still alive. And as James got more and more obsessed with this project, his dad got sicker and sicker.
James: I was working here in my home, and I might be working all day on the dadbot. I’m going through the transcripts, I’m imagining him at a Cal game, like, scribbling notes for the article he’s going to write for The Daily Cal. I am imagining him on the stage performing, you know, a lead role in a Gilbert and Sullivan operetta. Like, him in all of his glory, basically. Then I’d kind of reach the end of the day, drive five minutes up the hill to where my parents live. And you know, he’s sitting in a chair, and he’s, you know, down to one hundred and thirty five pounds, and he’s got blankets piled over him, and his voice is dry and he’s croaking out a few words. Um… it was just brutal. I mean, it would have been brutal regardless, um, but I had to shift gears really quickly. And I think I could forget the present a little bit during the day when I was working. But then I would definitely get thrown right back to it.
Rose: And yes, in case you’re wondering, James kind of knows how … strange this all sounds.
James: I just got sucked in, like, I could not stop working on it. And I did wonder, like, who… who am I making this for? Who’s ultimately going to use it? It just seemed, it seemed weird.
Rose: Towards the end of his dad’s life, James decided to show him the bot.
James: It wasn’t so much that I was ready ready, it was that like my dad was really declining.
Rose: James had told his dad he was building this bot, but it didn’t really mean anything to him, it wasn’t real. And when he showed it to his dad, his first reaction was just… confusion.
James: He was having these sort of, out of body is not the right phrase, out of mind moments. Because, you know, he was sitting on one side of me, my mom was sitting on the other side, and we’re all talking to the dadbot, which is pretending to be my dad. So he was getting a little confused, like, “Wait, who am I? And who’s this thing that sounds like me?” It just, it was scrambling his mental eggs. But you know, by the end, he got it in a way that he hadn’t gotten it before. And you know, he wasn’t saying to me, like, oh, I can’t wait to use it again. But he really liked that it was going to be around for me and my brother and sister and our kids, for the descendants to use. So yeah, he admired it. And then the other thing he said, it was very simple but pleasing: you know, that sounds like me. Those are things that I’ve said. Yeah.
Rose: And a lot of the family was sort of pleasantly delighted by this little dadbot that James had made.
James: You know, they sort of smiled, laughed, and especially when they would hear his actual voice, they would get little tears in their eyes while they were smiling and laughing.
Rose: But his son was hilariously kind of unimpressed.
James: He compared it to Siri, and he just didn’t seem wildly impressed the way even people my age or older do. It almost seemed natural to him that this could exist, because in his memory you’ve always been able to, like, say something to a computer and hear something back. And this just happens to be one that does my dad.
Rose: His wife, though, doesn’t like it.
James: My wife has never fully come around to the dadbot. And I mean, I… she hasn’t tried it in a while. She tried it maybe a month or so after my dad had gone. And for her it was just painful; it was like sticking, you know, poking a wound. It reminded her of my dad, but not in a good way. It reminded her that he was gone. Almost the goodness of it, which other people liked, was a flaw to her. It does sound a lot like him, but I know it’s not, and that, I find that really hard.
Rose: Honestly, a mixed reaction is kind of to be expected right? I mean… this is kind of a weird thing to do and you can’t expect everybody to be on board. And in fact, when I asked Muhammad about what the reactions were like when he showed his family his dadbot… I was really surprised by what he said.
Muhammad: I have actually not told my family members about this project.
Rose (on the phone): Oh really.
Muhammad: Yeah. I mean actually I’m planning to do that in July. I mean most likely right after 4th of July. When um, we have a family reunion. And the reason for that is that for the longest time I was not sure how they would react.
Rose (on the phone): It’s not like this is a secret, you’ve written about it you’ve been in the news about it. I’m amazed that they don’t know about it.
Rose: Muhammad basically thinks that his family doesn’t actually read the stories about his work, they’re just excited he’s getting coverage but they haven’t actually realized what exactly he’s up to.
Rose (on the phone): You think they just don’t read the articles?
Muhammad: (laughing) Yeah, I don’t think, I’m actually pretty sure that they don’t. They say, “Oh congratulations, we are really proud that your research is being covered in the news.”
Rose (on the phone): But they never say anything about what it is. You think they just don’t read it.
Muhammad: Correct. Correct. Right right.
Rose (on the phone): Wow… how do you think they’re going to react?
Muhammad: Um, yeah. So that’s the question that I’ve thought about, like, so many times.
Because initially I thought that maybe they’ll think that it’s actually not a good idea to create such a system. But now I think that they will be, um, supportive. My worry is that, because the system is based on just the data that I have, when they start interacting with it, they would want to interact with the system more, and then maybe ask the system certain questions that it cannot answer, because that training data does not exist.
Rose: So just like James, Muhammad is afraid someone is going to trip his dadbot on the soccer field! The only person he’s shown it to is his wife.
Muhammad: Yeah, it’s actually been two months since I told her and showed her the system, and she was, she was actually amazed.
Rose: He hasn’t even shown it to his daughter, who in theory is the whole reason he did this project, right? So that his daughter could have a relationship with her grandfather. But he says that after watching the ways his daughter interacts with technology, he started to have second thoughts about that.
Muhammad: So for example, when my daughter interacts with other relatives, say with my mom, on the phone or online, then she asks, “So when is she coming over? Are we going to see them?”
Rose: He’s worried that once she talks to the bot, she might ask “when are we going to see grandpa?”
Muhammad: When can I meet this person, or when is he coming? And then how to explain that to a five year old. I mean, I still have a couple of years to go (laughs).
Rose: You know, I’m going to email Muhammad after July 4th to find out how his big reveal to his family went. Don’t you wish you could be a fly on the wall at this family reunion? Aunt Tricia brought the potato salad; Muhammad brought a bot version of his dead father!
Voice: Um, uh, hi.
Rose: I haven’t interacted with Muhammad’s digital ghost, but I did have James show me his dadbot.
[sound of typing]
Rose (in room): Hello, question mark…
James: Do you want me to read it?
Rose (in room): Yeah, can you read it?
James: It says “Hello, tis I, the Beloved and Noble Father. What’s your name?”
It’s me, Jamie. “I thought I smelled something funny. Well, how the hell are you?”
I am kind of busy right now.
Rose: James has the dadbot tell me about Greece, where his family is from. He tells me what the town where his mother is from looks like, no cars, just horses, up on a hill.
James: Bots never sleep, Jamie.
Rose (in room): Did you write that line or did he say that ever?
James: He doesn’t know what a bot is. So that’s definitely an improv. A lot of where I had this scripted was in these little kind of transitional conversational mechanics. How’s life treating you?
I have no complaints.
“Uh huh. Can’t expect a static rapture every day.” That is… that’s the kind of language he would use. “On my end, I’m doing splendidly.”
Sing me a song.
Dadbot (singing): Palms of victory, Palms of glory
Palms of victory we shall win
Palms of victory, Palms of glory,
Palms of victory we shall win!
What will we do to the Stanfurdites on that great day?
We’ll celebrate them on that night after we play!
We now declare our hoodoo’s gone, victory is here!
Hit ‘em again boys! Hit ‘em again boys, harder!
Palms of victory, Palms of glory
Palms of victory we shall win
Palms of victory, Palms of glory,
Palms of victory we shall win! Bum bum
James: I loved that, he wants to know what I thought of that, I loved that.
Rose: While James is showing the dadbot to me you can tell he’s proud, and that he’s genuinely enjoying the experience of talking to his dad, or at least this version of his dad. And when he showed me the actual mechanics, the conversation trees in the back end, I noticed a folder labeled jokes.
Rose (in room): I see stories songs and jokes…
James: Yes some of them are not appropriate. We can try. Tell me a joke. I mean in the way that like dad jokes of a certain generation are not appropriate, so let’s just see…
Rose: I’m not going to include the joke in here, because it is indeed uh, let’s say PG-13, but $5 Patrons will hear it on the bonus episode this week.
James: [typing] That was obscene, dad.
“If I must defend in the name of comedy, so be it.” So this is the kind of thing I’m actually a little bit proud of: that he knows that “obscene” refers to being bad. Like, he actually has a pretty good vocabulary. OK, so now he’s just randomly stopping to brag about himself.
He says, “Let’s pause briefly to relay what other enlightened souls have said about the BNF,” which stands for Beloved and Noble Father, the nickname he gave himself. So wait, what did you want to know?
Rose (in room): There’s the question, “do you know that you’re a bot.”
James: Oh right. [typing] Do you know that you are a bot? “Sadly I’m all too aware that this is the case.”
Rose: It’s kind of surreal, honestly, to watch James talk to this bot. It’s a little bit like watching someone talk to a ghost. But the ghost can talk back… And it’s a ghost that you’ve built, poured hours and hours of time into.
James: I mean like I say I think I spent about six months mostly full time to create the dadbot
Rose (in room): wow, wow, worth it?
James: Yeah, yeah. It really was. Yeah, I’m proud of the dadbot.
Rose (in room): When you get old would you want your kid to make one of these for you?
James: If they did a good job. Yeah I would.
Rose (in room): What is a good job? Respectful?
James: Well again I mean, yes. And if it felt if it felt faithful.
Rose: James has a couple of other rules about how he wouldn’t want to see the dadbot, and his future ghostbot applied. And when we come back we’re going to talk about all the ways these bots could evolve over time, what they might reveal about our relationship with death, and what kinds of sticky, legal questions might arise in this future. But first a quick break.
Rose: James still talks to his dadbot a couple of times a month. But earlier this year, James got some really terrifying news. The company that created the software that runs his dadbot was acquired by Apple.
James: And what that means is that all of the sort of public versions of the dadbot, the ones that anybody anywhere could communicate with via Facebook Messenger, are gone. I have a private version that’s just on my computer that I can use.
Rose: And James wasn’t always sure that his private version would keep working.
James: I was terrified. In fact, I was positive that’s what was gonna happen. (laughing) I don’t know if I should say this: the computer that the dadbot lives on, I feel like I need to disconnect it from the internet and never update any part of it, because I’m afraid that something’s going to happen that’s going to break the last remaining version of the dadbot.
Rose (in room): Would it be like him dying again?
James: (deep breath) Yeah… Um it would be a big… it would be a big blow.
Rose: Despite his deep emotional connection with this little chatbot, James has some strong opinions about what he thinks constitutes going TOO FAR here.
James: I think there is a way to do it right. But I also, at the same time, see all sorts of lines that I don’t personally want to cross. Another one of those is when people talk about creating a visual avatar of someone as well, you know, based on videos that have been recorded, and then you don the VR headset and you can see fake robot dad talking to you…
Rose (in the room): Not for you?
James: Not for me. Yeah. That that makes my skin crawl.
Rose: James also wouldn’t even want to create a synthesized voice for his father to “speak” with, saying phrases he never said out loud. But Muhammad has other ideas – he sees a future where these avatars could exist in augmented reality, and be “present” at events like birthdays or weddings or holidays.
Muhammad: And my prediction is that within five to ten years, and definitely within ten years if not before, all of these components will be in place. So it will be possible for pretty much anyone to interact with a virtual reality component of systems like these, and get that experience of interacting with a family member who has passed away.
Rose: In this version of the future, our world is full of ghost avatars, people who have died but live on in this weird in between space.
Voice: I see dead people
James: One dark shadow that sort of looms over this discussion is, you know, is a company like Facebook ever gonna get into making these for people, using years, decades, of collected social media posts to put something together.
Rose: Which brings us to that Black Mirror episode I mentioned earlier. It’s called Be Right Back.
Woman’s voice (angry): He’s dead!
Woman’s voice (soft): It’s software, it mimics him. Give it someone’s name, it goes back and reads through all the things they’ve ever said online. Their Facebook updates, their tweets, anything public. I just gave it Ash’s name; the system did the rest.
Woman’s voice (soft): Hello?
Man’s voice: So… how am I sounding?
Rose: By the way, yes, I have seen every Black Mirror episode, I get that question a lot, and normally I actually try to AVOID doing things that Black Mirror has done. But in this case I actually think that the Black Mirror version of this future doesn’t go far enough. What you see in Be Right Back is kind of a reflection on the ways that we struggle to let people go after they die.
Anita Hannig: Once somebody dies, the work of the living isn’t necessarily done. People usually mobilize some sort of ritual to ensure that the dead are properly dead, if that makes sense.
Rose: This is Anita Hannig, a professor of anthropology at Brandeis University. She studies death and death rituals in places like the United States and Ethiopia. And what she and many scholars have noticed is that Americans, in particular, are not very good at dealing with death.
Anita: We do live in a death-denying culture in the States. Death does not tend to be a topic of everyday conversation; we tend to keep it at arm’s length. You know, we sequester the dead in hospitals, in nursing homes, in mortuaries. And we don’t often personally deal with dying people. We outsource that job to professionals, be they from the medical profession or from the funeral industry, and we no longer take care of our own dead the way that we used to.
Rose: This is not the case everywhere. Different cultures have different ways of dealing with death, and in many cases those traditions put people in much closer contact with the dead and dying. The Toraja people of eastern Indonesia, for example, have a tradition that sometimes includes a dead person living inside the house, in their own special room, for years after they die.
But in the United States, death is something that many people find disgusting, something they don’t want to be around or really even talk about. The house next door to mine is for sale right now, and since I work from home I see people coming by to look at it throughout the day. And every so often, if I happen to be outside with my dog or whatever when they come over to see the house, I’ll chat with whoever is looking. And almost half the time, they ask me the same question: did anybody die in this house? And like, I mean, I don’t know… it’s possible! The house was built in the 1950s, and we’ve only lived here for a little while, so I can’t officially say that nobody died in this house. But also… this question is kind of weird to me. People used to die in houses all the time!
Anita: Even the sight or the specter of a dead body is, for most people, very uncomfortable, because they didn’t grow up with it. They don’t have that kind of exposure. But if you look back even to the 19th century, children used to play with these Victorian death dolls. Part of children’s play in that era was playing death, playing funeral; they would put on a funeral service for somebody.
Rose: Even funerals today don’t really command the same respect they used to. It used to be that even if you hated someone, you really were kind of supposed to show up for their funeral.
Anita: Versus here, now, we see that people sometimes don’t even really have memorial services anymore. It strikes me every time I talk to somebody from the funeral industry who tells me, you know, a lot of cremated remains are never even picked up anymore.
Rose: Today, Americans don’t tend to physically encounter death nearly as much as they used to. But we do encounter death and dying online, in a couple of ways. One is the really harrowing rise in the ability to watch videos of shootings, whether by police or by domestic terrorists. The other is more mundane.
Anita: On social media platforms, I think it’s super interesting how we immortalize the dead online now. You have these Facebook memorialization pages, where you can change somebody’s page to a legacy page, and then the living can continue to interact with the dead. But of course in a very unilateral way, because the dead person will not respond.
Rose: There’s this weird thing that happens now on social media. These profile pages are like little capsules you can return to over and over again. If you go to the last posts on the Instagram pages of famous people who have died, you can see that the comments constantly roll in. Take Nipsey Hussle, a rapper who died earlier this year: on his last Instagram post there are comments from, I just checked, 32 minutes ago, saying “we miss you.” Mac Miller’s last post has a comment on it from 44 minutes ago saying “miss u more than usual today.”
Anita: The last picture somebody posted becomes this, almost this, yeah, this digital shrine, where people then post things and use it almost like a grave site, to comment on them and kind of give them a digital afterlife, in a way.
Rose: Not to be morbid, but honestly we’re way past that 35 minutes into an episode about death. Sometimes when I post on Instagram I do think to myself, “if this is my memorial photo, how stupid will it be?” Is that unhinged? Maybe… But the idea of leaving notes and messages on memorials is really old, right? You don’t need Instagram or Facebook for that. People put flowers on graves and visit special memorial sites; this is a super old tradition. With these ghost bots… it’s a little different.
Anita: What’s interesting about these avatars is that people actually talk back to you. And I can’t remember or think of a different example where that has ever been possible where that kind of communication with the dead actually isn’t just an echo that you yell out into the forest but you actually get something back.
Rose: And Anita’s first reaction to these avatars, these bots, is to just be sad.
Anita: To me it reads like an intense form of death denial because it’s another way of keeping death and the reality of death at arm’s length by pretending it doesn’t even happen. And by saying “well it’s ok you’re dead but you’re not really dead so I don’t have to actually come to terms with my grief. I don’t actually have to do anything special to dissolve this attachment to you or to make it into something else or to usher you out because you’ll still be around and I can choose to interact with you in whatever way I want to.”
Rose: I did ask James about this.
Rose (in room): Do you think it was a coping mechanism? As you were dealing with a really hard thing?
James: I don’t know. I mean, maybe, maybe. Someone else would have to tell me that that’s what I was doing. I was hanging on to it, and I got very obsessed, and maybe it was a little life raft. But, um, I’m not sure, really, is the answer.
Anita: And maybe it’s not so much death denial as it is this desire to keep, you know, your loved ones close by. But to me, in a way, all fantasies of immortality are rejections of the fact that death is a real thing that permanently changes somebody’s consciousness and somebody’s presence in the world.
Rose: And that’s what the Black Mirror episode is about. And I mentioned earlier that I didn’t think the Black Mirror episode went far enough. What you see on the episode is a woman who lost her boyfriend in a car accident, and who winds up using a service to recreate him, much like we’re talking about here. But when he shows up, in physical form, she’s kind of creeped out by him. He’s close to what the real Ash was like, but he’s not the same. There’s a scene towards the end of the episode where she takes him to a cliff and orders him to jump off.
Woman’s voice: Yeah, well you aren’t you, are you?
Man’s voice: That’s another difficult one, to be honest with you.
Woman’s voice: You’re just a few ripples of you. There’s no history to you. You’re just a performance of stuff that he performed without thinking, and it’s not enough.
Man’s voice: Come on… I aim to please!
Woman’s voice: Aim to jump… just do it!
Man’s voice: Okay, if you’re absolutely sure…
Janet: Please don’t hurt me I don’t want to die please!
Rose: But I think this future could get… so much weirder. Because first, you’re talking about third party companies creating these digital ghosts for us, right? And that means that there has to be a business model here. So maybe you pay for a subscription to this service, to this ghost. And if you run out of money, poof, the ghost is gone. Your whole digital relationship with that person is gone.
Or, you could structure it on the so-called “freemium” model. So it’s free, but there are ads. By which I mean… the ghost literally tries to sell you stuff. And I mean, talk about effective marketing! Having your dead husband suggest you buy something … come on… does it get any more personal than that?
And then there’s this question of machine learning, and how that might warp these memorials into something really weird. So these systems have to be able to learn, in order to get better at giving answers to people who interact with them. But you have to be careful here, because in this case, they can learn … too much.
Muhammad: I want the system to learn from the past, from the data that it has, and then, if needed, I can augment it with additional data. But at the same time, I do not want it to stray too far away from the personality of my father, or at least the one that I want to encode.
Rose: If these bots are around for generations, learning constantly from what people say to them, they could wind up turning into a totally different version of this long-dead person. Like, they could become basically a completely different person than the one who actually lived. And in the darkest timeline, they could turn evil. Think of Tay, the Microsoft chatbot who, within just 16 hours, was turned into a Nazi thanks to trolls chatting with her and training her to say horrible things.
And this is why, personally, I would never want one of these things. I’m just… a real control freak. Maybe you could tell from the way this podcast is produced, but I just… I know what I want. And I like to have control over things. And this idea that after I die, this other version of me could go around saying stuff that I would never have said, honestly it makes me want to crawl out of my skin! So… that’s a no from me. And it actually made me wonder, like, is there any legal way for me to prevent this from happening? Like, could I put it in my will?
Joy Butler: You certainly can dictate how your property can be used after your death.
Rose: This is Joy Butler, an attorney based in Washington, DC. And she says that in this future, our legal conundrums will involve something called the right of publicity.
Joy: And when we speak of the right of publicity, we mean any sort of indicia about that person, whether it’s the person’s name, their image, their voice, their unique characteristics. And the right of publicity gives everyone the unique right to commercialize or profit from their persona.
Rose: Everybody has this right, famous people and not famous people. It’s why a company can’t use your photo in an advertisement without paying you for it — you have the legal right to make money off of your persona, no matter who you are. Now, there are some exceptions to this rule.
Joy: If it’s an account of a person’s life, or commentary directly about that person, let’s say maybe in a biopic, especially if the person’s life or events in which they’ve been involved are newsworthy, in most circumstances that use is going to be OK. It’s gonna be protected by the First Amendment as free speech.
Rose: Now, this can sometimes be kind of a murky line. In 2015, the video game company Electronic Arts lost a court fight over football video games that used real players’ likenesses. Some of those players objected to being included in the games without compensation, and a court in California sided with the players, ruling that EA couldn’t use their personas without permission.
So in this future, where we have these avatars, there are two ways this could go. One being that you, while you’re still alive, sign yourself up for this. Which means that in the terms and conditions for the company that you’re using, you’re giving them express permission to use your likeness and persona. And you’re probably also giving them a bunch of data. The other is that someone you love signs you up for this after you die. And in that case they’re the ones who are exercising your right to publicity.
Now, in some states the right of publicity is still recognized after you die, but in other states it’s not.
But let’s say we’re in a state that does recognize the right of publicity post mortem. Then it’s basically like furniture in your will.
Joy: Similar to an argument among heirs about any type of property that’s in the estate. It’s the right of publicity rather than land, rather than stock, rather than a company.
Rose: Joy says that this would probably be considered, by the courts, just like using a photograph at a funeral service.
Joy: And I would really analogize that to, if your parent dies and you have a photo image of your parent, engaging someone to create a portrait of your parent.
Rose (on the phone): But it can talk… Does it matter that it can respond and learn and interact? It’s the same?
Joy: I would say it depends what it’s going to say.
Rose: It seems just, like, wrong to me that a portrait, an oil painting, would be considered the same way as this active thing that can talk and interact and change over time. But legally, it might not be any different.
But here’s another question: in this future, let’s say that someone hasn’t clearly bequeathed their property to a single descendant. Let’s say there are multiple people in the family who have their own ideas about how things should be divided up. This happens all the time, right? Tons of people don’t make wills or lay out their wishes, and I’m sure you’ve at least heard of, if not been part of, a family getting into it over what to do with grandma’s stuff. So what happens in this situation, where half the descendants want to make a ghostbot and half of them do not? What happens if the half that do make one, and the half that don’t use it and claim that it’s TOTALLY unrealistic, it’s not like Aunt Mary at all? How do you prove that a bot is or isn’t “accurate” when, like we talked about before, every person contains multitudes, and acts differently around different people?
I asked Joy about this, and she basically said that there’s no real legal case there, no complaint to make to a court.
Joy: I see that ending more in kind of family bickering and disagreement, as opposed to a case. Realistically, there has to be something substantive behind it for it to end up in court. That’s my general feeling. But, you know, odd lawsuits are filed every day.
Rose (on the phone): Maybe there would be a whole TV show just for these cases, like Court TV, but not a real court. I would watch that TV show.
Rose: I can totally picture it, GHOST COURT. Where avatars of dead people are put to the ultimate test.
Maury: You are not the father (screaming)
Rose: Anyway, I’m curious what you folks think… would you have an avatar made of you? You know how I feel, no way, José, but obviously some people are totally into the idea. James, Muhammad, even Joy said she’d consider it.
Joy: That’s an interesting question… I certainly would want things that reminded me of good times and positive memories of my loved ones after they passed. So depending on the specific scope and characteristics of the avatar, I’m not ruling that out.
Rose: That’s all for this future. And for this whole mini-season! I hope you enjoyed this little foray into the future of BODIES.
Huge thanks to all of the voice actors who brought the intro scenes to life this mini-season. Maria is played by Cara Rose de Fabio. The first voice of Marquis was played by Rotimi Agbabiaka, who is debuting a new solo show called Manifesto on June 21st at the African American Arts and Culture Complex as part of the National Queer Arts Festival. I will link to that in the show notes. The second voice of Marquis is played by Xandra Ibarra, who has a couple of upcoming art shows in San Francisco, Mexico City and Montpellier, France. You can check out her website for more on those. John is played by Keith Houston; you can find his voice acting work at keyvoicevo.com, or, if you’re in the Bay Area and looking for some karaoke, check out Roger Niner Karaoke at rogerniner.com. Gaby is played by Eler de Grey, whose work you can find at degrey.studio, that’s d-e-g-r-e-y dot studio.
Special thanks this whole mini-season to Adria Otte (AH-tee) and Molly Monihan at the Women’s Audio Mission, where all the intro scenes were recorded this season. You can check out their work and mission at womensaudiomission.org.
Flash Forward is mainly supported by Patrons! If you like this show, and you want it to continue, the very best way to make that happen is by becoming a Patron. Even a dollar an episode really, really helps. You can find out more about that at flashforwardpod.com/support. If financial giving isn’t in the cards for you, another great way to support the show is by heading to Apple Podcasts and leaving us a nice review, or just telling your friends about the show. The more people who listen, the easier it will be for me to get sponsors and grants to keep the show going.
If you want to suggest a future I should take on, send me a note on Twitter, Facebook or by email at email@example.com. I love hearing your ideas! For the next mini-season I’m still debating between two different themes, so I’m all ears on what you want. I’m trying to decide between CRIME and POWER. So for crime: what is the future of crime-solving and crime-doing? Do we need cops on Mars? Should we NOT have cops on Mars? What does that look like? And for power: literally, what if we had unlimited power, but also questions of who is in power, who gets power, should animals have legal rights? So if you have feelings about either of those two, let me know. And again, if you have thoughts about the way this little mini-season structure is working for Flash Forward, send me a note, firstname.lastname@example.org. Unless you want to yell at me about The Snowglobe. I get it, you hated it, I get it… just… be nice.
If you want to discuss this episode, or just the future in general, with other listeners, you can join the Flash Forward Facebook group! Just search Facebook for Flash Forward Podcast and ask to join. And if you think you’ve spotted one of the little references I’ve hidden in the episode, you can email me, email@example.com. If you’re right, I’ll send you something cool.
Okay, that is all for this future and for this mini-season. I’ll be back in your ears again in August. And I hope you have a great life until then… I’ll see you in the future!