In this series, we’re taking a look at some of the real science, policy, economics, law and ethics that inspired the events of Vanguard Estates. Today: can robots really help the aging? What are the pros and cons of these devices, and how do you evaluate their safety?
- Victor Wang: CEO of care.coach
- Dr. Amanda Lazar: assistant professor at the University of Maryland College of Information Studies
- Dr. Clara Berridge: associate professor at the University of Washington School of Social Work
- Laurie Orlov: founder of Aging and Health Technology Watch
- Kate Swaffer: activist & author, co-founder of Dementia Alliance International
- Nikki: care partner & founder of A Log Cabin in Brooklyn
- Dr. Tia Powell: psychiatrist and the director of the Center for Bioethics at Albert Einstein College of Medicine, author of the book Dementia Reimagined: Building a Life of Joy and Dignity from Beginning to End.
- The Grey Dawn (Flash Forward episode)
- What Happens When We Let Tech Care for Our Aging Parents
- Robots could replace real therapy dogs
- Pilot testing a digital pet avatar for older adults
- Negotiating Relation Work with Telehealth Home Care Companionship Technologies that Support Aging in Place
- Aging and Health Technology Watch
- Age-related decrease in sensitivity to electrical stimulation is unrelated to skin conductance: an evoked potentials study
- Let’s Talk Tech
- Investigating the Potential of Artificial Intelligence Powered Interfaces to Support Different Types of Memory for People with Dementia
- “Taking care of myself as long as I can”: How People with Dementia Configure Self-Management Systems
- Breathing Room in Monitored Space: The Impact of Passive Monitoring Technology on Privacy in Independent Living
- A systems approach towards remote health-monitoring in older adults: Introducing a zero-interaction digital exhaust
- Hawthorne effect
- Tavour: Tavour is THE app for fans of beer, craft brews, and trying new and exciting labels. You sign up in the app and choose the beers you’re interested in (including two new ones DAILY), adding them to your own personalized crate. Use code: flashforward for $10 off after your first order of $25 or more.
Flash Forward is hosted by Rose Eveleth and produced by Ozzy Llinas Goodman. The intro music is by Asura and the outro music is by Hussalonia. The episode art is by Mattie Lubchansky. Amanda McLoughlin and Multitude Productions handle our ad sales. This is a companion episode to the Vanguard Estates series, which you can hear in the Flash Forward feed.
That’s all for this future, come back next time and we’ll travel to a new one.
FULL TRANSCRIPT BELOW
▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹
Vanguard Estates: “Can Robots Really Help the Aging?”
[Flash Forward intro music – “Whispering Through” by Asura, an electronic, rhythm-heavy piece]
Hello and welcome to Flash Forward. I’m Rose, and I’m your host.
Today we are going to talk about the technology piece of “Welcome to Vanguard Estates.” Again, this is one of five episodes about the real stuff that inspired that story, so if you haven’t listened to it and you care about having certain elements of the story spoiled, now is the time to hit pause on this episode and go listen to the series.
Okay? Great, now let’s talk about tech. And let’s start with Missy.
[clip from Vanguard Estates Episode]
MISSY: Hello Marcus. Nice to meet you.
Missy was loosely inspired by a real service called Care.coach.
We coach people to improve self-care and healthcare outcomes. And the dot means that we do it through technology.
Well, it looks like an adorable little dog or cat most of the time. That avatar is the face for an entire global team of empathic, and intelligent, and caring people who we call health advocates. And we hire them, like, in the Philippines and Latin American countries for Spanish, and staff the avatar with all of these great people, and thereby partially solve the caregiver crisis.
Basically, what the app does is it consolidates a whole team of people into one avatar that the user engages with. And that cat or dog does everything from medication reminders, to talking people through physical therapy, to just general social conversations.
And they picked the dog and cat avatars for a couple of reasons. For one thing, there is evidence that animal therapy has benefits, even if that animal is a robot. But there is another reason, too.
People also tell their dog all sorts of stuff they don’t even tell their own family members. We had a guy wake up and his little dog avatar… I think he called him Buddy. People call their avatars all sorts of things like Sparkle, El Capitan, and things like that. I think he was Buddy. He was like, “Hey, Buddy, I fell in the shower. Kind of banged up. I think I’m okay, but can you stay with me and make sure I’m okay? But don’t tell my daughter because she’s going to put me in a nursing home if she finds out I’m falling in the shower.”
Now, I asked Victor, “Okay, in that situation, did you tell the daughter? Or does Buddy keep the secret?” And the answer is kind of complicated.
A lot of those decisions are coded into the system.
Care.coach isn’t something you can easily just go buy at the store or download on your phone. They usually work with specific health plans and insurance programs. And each of those health plans has specific rules about what gets reported to who.
So there’ll be certain types of things that they want to escalate, certain types of things they want to go to, like, an after-hours nurse line. They might have a suicide prevention line they want us to direct certain things towards. They might have like the “social determinants of health” type of pathway. And certain organizations like to leverage family members as much as possible. And so they might actually want us to directly go to a family member.
In our story, one of the things that the dad confesses to Missy is technically a crime. And I asked Victor, in real life, if someone told their Care.coach avatar that they had committed a crime… what would actually happen?
I think that technically depends on the organization’s policies and the nature of the supposed crime. There’s an element of common sense, you know, like are we talking about, like, somebody smoked marijuana in a state that doesn’t allow marijuana? Or are we talking about, like, they’re in the process of burning the building down? So there’s a… You know, in a lot of cases it’s going to be a judgment call.
In our story, Marcus, the dad, gets really connected to the person on the other end of his robotic cat. And that is a thing that really does happen with people who use these kinds of systems.
A couple of people completely bypassed the interface.
This is Dr. Amanda Lazar, an assistant professor at the University of Maryland College of Information Studies. And a few years ago, she did a study on these kinds of avatars. In the study, one participant said that they could tell who was on the other end based on the sense of humor that was coming through the little dog avatar. Another participant said that she would ask certain questions that she knew some teleoperators wouldn’t answer, so that she could get to the one that she wanted to talk to.
I think this is a really interesting thing to think about, because often we assume that older people are just going to simply sit back and use the devices exactly the way they’re told to. But of course, nobody does that, right? We all find ways around certain elements of our technologies to try and get what we want. And often, what older folks want is relationships. Contact. Social interaction. And they will use technology to get them in all kinds of interesting ways.
DR. CLARA BERRIDGE:
“I’m socially isolated and I need some connection. So I’m going to turn my temperature in my apartment up above 92 because I understand that that’s the point at which a Telecare operator calls me and I’m going to have a social chat with that telecare operator.”
This is Dr. Clara Berridge, an Associate Professor at the University of Washington School of Social Work.
I asked Victor what would happen if someone like the dad in our story asked for a worker’s name or location.
Yeah, our health advocates are trying not to go there.
And this was something that the users in Amanda’s study found kind of frustrating.
Some people were just so focused on the human interaction that they would kind of play around with the system so they could get to those people that they developed relationships with, or be less interested because they knew there were people on the other end that were not giving them, they felt, an authentic interaction, but they were expected to give an authentic interaction of their own.
One participant in Amanda’s study said, “The digital pet can’t really be a friend to me because the people that I talk to on the other end can’t tell me anything about their personal lives… When someone asks me a question, I answer the question. Then I ask back, ‘Well, how about you, and what do you do, and how do you feel, and what do you like?’ Well, that didn’t go over very big because they’re not supposed to tell me who they are, where they are, or what their family’s like, how many children they have, and all of that.”
We forget, I think, when we design technologies for older adults that people don’t want to just be cared for. Many people do not want a relationship just to be comforted or reached out to in their social isolation. They also want to make genuine connections with other people, and help them, and care about them; ask them questions about their lives, and their families, and things like that.
So, that was also something with the robotic pets we found, where some people described how they wanted to be able to care for it, too, right? Whether that’s like changing the batteries in, like, a more meaningful way. They were happy they didn’t have to clean the poop anymore. Like, there are certain things that are like, “Great, I’m happy not to be doing that.”
Of course, robotic avatars are not the only kind of technology that exists to help seniors.
Communication and engagement is the first one I talk about. Health and wellness, learning and contribution, and safety and security. So those are the four big categories.
This is Laurie Orlov, the founder of Aging and Health Technology Watch. She’s been tracking technologies related to aging for over a decade.
I’m probably the first one who identified it as such.
Her four categories of technology encompass a whole bunch of types of devices, apps, and services.
Email, virtual reality software, applications, games, video, cell phones, smartphones, tablets, smart speakers, voice assistants and hearables. Mobile health applications, telehealth, medication management, disease management, fitness trackers, voice assistants, and health-related wearables. Shall I keep going?
Home security systems, which are the basics in technology for people who want to age in their own home. Voice-enabled help capabilities from smart speakers, cameras which are increasingly smart, fall detection technologies, home and activity monitoring sensors and radar.
So, lots of stuff. From her perspective as an industry analyst, Laurie is watching a few sectors in particular. One of them is sensors.
This can be everything from a basic motion-sensor light that turns on when you walk by it, all the way up to much more complicated detection devices.
Sensors that can tell you about whether the stove has been left on. Sensors about your health, your heart rate, blood pressure, your personal temperature. And camera-based sensors that can detect what you’re doing, match it up to some information about you, and make a suggestion… through voice for example, make a suggestion about something you should change in your behavior.
In the first episode, you heard Nikki talking about using an Amazon Echo to set up the lights and music to keep her mom calm and relaxed during sundown. She also uses other sensor technology.
So the camera’s phenomenal. I love it. Because when my mom was in her natural space, she would do things that I would… my mind would be blown. Like, “Why are you hiding cookies in the closet? What do you think? I’m not going to give you more?” Like, you know, so when I was able to find her doing those things, it kind of helped me with assessing how I should run her day-to-day, like maybe give her an extra snack because obviously she’s still hungry but doesn’t know how to communicate that she’s hungry.
I do have cameras that also assist with caring for her because they let me know… they notify me through the app when her body temperature has gone down or when she has moved a little bit too much. So I’m able to use technology to, kind of, ease all of the heaviness and the weight of caring for someone who is actively dying.
Another thing Laurie is watching, along with almost everybody else in the tech industry in general, is AI.
Software that can actually learn something about your behavior, assemble the data, and predict some possibility of change in the future, I think that’s the most interesting thing. And it’s not all that well evolved this year, but hopefully next year it will be more well evolved and will be increasingly part of technology offerings that serve older adults.
One potential application is helping to decode and understand non-verbal kinds of communication. Many folks with dementia struggle to verbalize, at some point or another. But there are lots of ways a person can show what they do or don’t like.
The application, I think, for AI that’s most interesting is actually, kind of, detecting people’s non-verbal signals, because we think someone who can’t say, “Yes, I would like to do that, please,” or “I’m having so much fun right now,” that, you know, maybe they’re not able to understand anything. But then, if you look at the clinical literature, there’s all these observational measures of engagement. Like is someone leaning forward during an activity?
For his part, Victor and his team are already incorporating AI into Care.coach so that the system can work faster and more autonomously.
Like, for example, if somebody says something to you, it takes you a moment to be like, “What is a thoughtful, empathic, supportive thing that I should say? Hmm. Okay, let me say it,” and then you have to type it out, and then “Oh, I typed it wrong.” Fix that, and then you have to hit enter. And meanwhile, your clients or the person on the other side is waiting for you and their avatar to respond. So, we’re leveraging some really cutting-edge techniques to take all the training data that we’ve built and automate all of that, make that faster.
But in the future, there might be apps and devices where there isn’t a human involved at all.
How do you leap to that future where AI is actually able to do this kind of thing, and build this type of trust, and completely solve the caregiver shortage?
There are some big questions here that involve discussions of algorithmic bias and what it really means to care for someone. If the users in Amanda’s study wanted a real, authentic connection, they’re not going to get one from an app. You can’t have a two-sided conversation about your kids or your lives with an algorithm.
And what about those sensors we talked about? What are the ethics when it comes to things like cameras installed in people’s spaces? Can someone with dementia consent to something like that? How do you have that conversation?
ROSE (on call):
Let’s say your husband was like, “I want to install a camera in our house to, like, keep track of, like, safety-wise, making sure, like, the stove’s not on.” How would you feel about that?
I’d tell him to bugger off. Yep, yep. And I think there’s been a lot of tracking implemented that’s being called “for health,” that’s actually being used for surveillance.
We’re going to talk about that, and how to ethically design and deploy some of these things, when we come back.
ADVERTISEMENT: TAB FOR A CAUSE
This episode is sponsored in part by Tab for a Cause. Tab for a Cause is a browser extension that lets you raise money for a charity while doing your thing online. It is incredibly simple. Whenever you open a new tab, you will see a beautiful photo and a small ad. Part of that ad money goes towards a charity of your choice. That’s it. That’s how it works.
You can join Team Flash Forward by signing up at Tab.Gladly.io/FlashForward.
Okay, so technology can be really useful in some situations. Nobody is arguing that it can’t. But what’s the right way to design and use this stuff?
Let’s start with design. A lot of technology simply forgets that seniors even exist as a market or user base.
What does our tech industry look like, right? Like, who is designing them and what are their, kind of, mental models in forming things? Who are they designing things for?
That’s Dr. Amanda Lazar again, and in her work, she does a lot of thinking about what older adults actually want out of their technology. And Laurie Orlov, our industry analyst, says that there are so many examples of tech that is clearly not designed with seniors in mind.
Where to tap on an iPad. I would say that’s my best example. There’s a lot of screen area on an iPad. The other one is the Apple TV remote. There’s no clue on the remote where to touch, what it means, what does that big circle button thing mean? And an iPad is another one where there’s a lot of blank screen. And I’ve seen older people pounding at various parts of it, trying to figure out which part is going to wake it up.
When we get older, our skin and our sensitivity to touch change, which can make it physically harder to use touchscreen devices. And this kind of thing contributes to the idea that older adults are “bad” at technology, or aren’t interested in technology, or can’t understand technology. But is that really true?
Older adults, like the people we are considering older, have experienced the most technological change of, like, anyone ever, right? Like, the advances that they’ve seen and kept up with in technology are, like, a ridiculous amount of change.
Many of today’s seniors were born before credit cards, before commercial television, before flu shots. They’ve learned a lot about a lot of new technologies over their lives. So the idea that they simply can’t learn, probably isn’t true, right? And yet, I’m sure a lot of you have probably had the experience of trying to walk an older family member through tech support, to varying levels of success? So what is going on here?
There are a couple things to say about this. The first is baseline education. People who are younger have been taught, either in schools or in their workplace, how to use a lot of things. Younger people have access not only to actual classes in school on how to use computers, and the internet, and all that jazz, but also to tech support teams in offices. For a lot of you listening, if there is a new bit of technology that you need to use for work, you have an actual team of people whose job it is to teach you how to use it at your company. Older folks often have none of that.
Since new technologies are entering the market at all times and old technologies become obsolete, the question really is how to stay current and how motivated are people to stay current? What is the training cycle, for example, to learn a new smartphone? Is it even worth it to get one?
That last question, is it even worth it, is also a good one. Because honestly, sometimes it’s not? For some people who have lived through a hundred different new bits of technology, staying up on the latest cell phone is just not that interesting.
The Covid-19 pandemic actually provides a pretty good example of this when it comes to video chatting.
Older people start using Zoom and, you know, are still using it to connect with family members. And that’s because the old ways of doing things weren’t working. Maybe the reason they “didn’t know how to use it” before was because they didn’t have to. Everything was working, why do you have to learn this new thing?
On top of all of that, you have the fact that often when seniors do try to learn to use something new, they are treated like they are incompetent babies. Which isn’t fun. So why even bother?
Then you add in dementia, and you get another layer of assumptions that those who have cognitive decline definitely cannot understand what is happening with technology.
I asked Nikki, for example, if she had ever asked her mom about the cameras and how she felt about them.
I don’t know if she would understand what it meant, having a camera there. And, you know, having the camera around, I had to do it in the middle stage. So that was like from stage 4 to 5, because that’s when things started picking up. You know, she’s moving things, and she might be defecating on herself and I don’t know. Or she might be wandering into a place that is not technically safe for her. So I don’t know if she understood what camera meant or videotaping meant at that time.
I didn’t start off studying dementia, I worked with older adults not living with dementia. And when I would present findings, the most common question is, “Yeah, but this is all out the window when it’s dementia.” And so that’s sort of why I started to turn towards dementia.
That’s Dr. Clara Berridge again. When Clara did start looking at people with dementia, she found that, actually, that’s often not true.
And in my own research, I found that people, adult children, for example, would say, “No, I probably wouldn’t involve my mother, for example, in the decision about putting a camera in, or a sensor, or location tracking, because I don’t think she’d understand it.” I would then interview the parent, you know, the older adult. And you know, everybody understood it. Everybody was capable of comprehending the basic function of these technologies.
Of course, Clara has an advantage here. Part of her research is about finding the best ways to explain these tools to older folks and folks with dementia. Not everybody has that expertise. It’s not necessarily easy. But that is something that she’s hoping to help change. In the last couple of years, Clara has been working on something called Let’s Talk Tech, which is essentially a method of walking both people with dementia and their care partners through various technologies.
So we piloted it with 29 people living with mild Alzheimer’s disease and their care partner, and most of them are spouses, we had one adult daughter, who lived together. And so, we actually found that it was successful on all of our measures, and we had really great findings.
We were able to significantly improve the care partner’s knowledge of what the person living with dementia wanted. We were able to significantly improve their comprehension of the technologies, both the care partner and, on a couple of the technologies, a person living with dementia.
We talk on this show all the time about how important it is to involve users in your design practice. And that’s true of folks with dementia too, especially if your app or service or device is supposed to be for them.
In fact, people with dementia have probably already worked out some cool uses of technology that you don’t even know about. Amanda Lazar did one study talking to people with dementia about their technology use, and the participants all described their own bespoke, often very clever, systems. The study also included ideas that these folks had for technologies they would actually like to use.
Take getting ready for the day. How do you know what to wear? You or I might look at the weather, or think about the context of an event; is it work, or social, or some combination? But those things can be really hard to do when you have dementia. So one participant wished for a device that could provide what she called “social background information” including “how I need to be presented so that I can feel I can participate like everybody else.”
And the study also showed that, sometimes, folks with dementia actually tailor the technology they use, not based on their own desires, but based on what they think their loved ones want and need.
One participant in this study said she was comfortable with, like, a Geofencing application. She wanted to use that, but she thought it’d be too hard for her daughter. So she didn’t. Because, like, emotionally for her, her daughter wasn’t ready for that.
The consequences of these assumptions, the assumption that people with dementia, or just older people in general, can’t possibly understand questions about technology, are very real. Often, in facilities, stuff is installed without talking to the residents at all.
I interviewed residents once at a high-end nursing home. This was a few years ago. And they described to me this device above their beds and they did not know what it was. They had not been consulted about it or informed if it was even in use. And they found it really disturbing. And from their description, I think it was probably a sensor over their bed.
Now, often cameras and sensors are installed in facilities as a money-saving effort, not necessarily because they are best for care.
Residential care agencies, primarily at that time in the world of intellectual and developmental disability services in the home, like adult family homes, for example, were putting cameras in residents’ bedrooms and removing their staff from the building at night so they can monitor the feed from multiple cameras all at once and then ideally rush in if there was a problem, send somebody. And then the drive, of course, to do that was cost savings and workforce shortages.
We’re going to talk more about this, and the ways that surveillance in the workplace impacts care, next week when we talk about the economics of this world.
But it’s not just facilities that make executive decisions about technology on behalf of people. Other times, it’s the families that are making decisions without their loved ones’ consent.
One woman in particular that really stands out for me, she had presented the idea to her mother of using a sensor system that her HUD senior housing program was offering. And her mom said, “I don’t think I’d like that.” And her sisters, the adult daughter’s sisters, agreed with their mother and said, “Absolutely, that’s an invasion of our mom’s privacy.” But because this adult daughter was a power of attorney, she decided to use it regardless.
So, she decided to get a 360-degree web camera and put that in her mom’s apartment. And then she, you know, pulled out her phone and showed me the interior of her mom’s apartment. And so it just made me realize, “Wow, her mom didn’t even want the sensors and she ended up with this camera.”
We talk on this show all the time about the importance of consent, and privacy, and being able to make your own, informed decisions about what kinds of information is being gathered and shared about you. So why are so many people quick to throw all of that out the window once someone is older? And the potential drawbacks here aren’t just some kind of theoretical violation of someone’s wishes.
The consequences of someone else making decisions for you are really steep in dementia because it could involve, for example, you no longer living in your home and being sent to an assisted living facility or a memory care unit.
So for example, there are sensor systems that claim to be able to help track someone’s progress as they age. These sensors could detect changes in someone’s movements, or voice, or habits, and decide that they now fall into a new category — either a new diagnosis or into a new stage of dementia, for example. And that might change what they are allowed to do or where they are allowed to live.
We’re like, “Oh, AI, this is going to detect the progression of dementia. It’s going to be so great, or detect the presence of dementia.” Which is like, great, in terms of, you know, maybe looking at pharmacological treatments and understanding more. But it’s very, very charged when you’re a person with dementia, kind of like, coping with life, right?
Sometimes we’ll talk to people and we’ll use the word dementia, but they’re actually in a living situation where if they have, you know, advanced cognitive impairment or dementia, they’re going to be kicked out and sent somewhere else.
Or what if the person in question has secrets that they don’t want their kids to know?
So I remember… I think she was about 85, an 85-year-old woman who was living with acquired disabilities and she needed significant assistance from her daughter, with whom she was actually very close. And when I was talking to her… I think it was about the sensors. I interviewed her about various technologies, and she said, “Well, what if this hypothetical person you’re telling me about, this hypothetical older adult who needs the sensors, what if she’s in love? And what if she doesn’t want her daughter to know?”
And then at the end of my interview with this woman, she indicated that she’s very much in love with another woman. And she feels like… I remember because it was such a random, specific age. She’s like, “I feel like I’m 36 again.” And she was beaming and so happy talking about how being in love makes her feel. So it became clear to me that she didn’t want her daughter to know about this. She was close to her daughter. But there are things that older adults still, for whatever reason, their own reasons, want to keep private.
In our series, there is a storyline in which our narrator accidentally snoops on their father having sex. And this is a thing that happens! And when it does, it brings us back to those questions of agency that we talked about on the first episode.
If you are able to express preferences about sexual behavior, then you’re an adult and we should really kind of stay out of your way. That’s obviously not true towards the end, and it could easily be exploitation of people with dementia. But our traditional default in nursing homes and for older people was just no, which is a very kind of Calvinist sex-negative idea about what it means to care for somebody and protect them.
That’s Dr. Tia Powell again, you heard her last week. People are entitled to their secrets, to keep certain things to themselves for whatever reason they choose. And when people know they are being watched, they change their behavior.
I learned about people rushing in the bathroom, deciding not to take long afternoon naps anymore because too much inactivity might be detected, having to account for behavior that deviates from their routine, being found out that you own a pet that you’re not supposed to, being found out that you’re dealing with incontinence, or that you like to take long baths. And just having to answer to somebody, and you know, whether that be a building social worker, you know, frontline staff, or a family member about it, that’s something that most people don’t expect to have to do. I think it’s really not hard to imagine the ways living under constant surveillance could impact somebody’s well-being.
I mean, you can imagine this, right? Imagine if every single thing you did was recorded and transmitted to your parents, or your friends, or a doctor that is supposed to watch you. Again, there are reasons why cameras could be useful and could make people safer, but we also have to weigh the tradeoffs too, right?
And there’s some things, including with your technology, that we can really look at that would help this person stay home. Like, what about these cameras? If you live alone, can we set up a camera that just takes photographs, like only films like the bottom 12 inches of a room? So if you’re lying on the floor, we see you. If your feet are walking around, that’s all we see. But you actually need that. And I agree that privacy is important, but privacy that puts you faster in the nursing home is probably not what people are asking for.
The point here is not that you should never ever use cameras or sensors. Plenty of people have great experiences using these devices to make their lives, and their loved ones’ lives, better. The point is that their use needs to be something that everybody understands and agrees upon. Which, again, is totally possible! Tia suggests a kind of technology Genius Bar for older adults and their families.
And asking every family again and again, what’s hard for you? What do you need help with? Because we have, like, a Genius Bar. We have a sort of Genius Bar over here where you can say, “Well, he’s figured out how to undo the locks and goes out of the house in the middle of the night. So, okay, how can we figure that out? Give me some solutions. What’s affordable? What doesn’t make him feel like he’s in prison? What’s not going to be a fire hazard?” All that kind of stuff. But you need, like, a Genius Bar for everybody where they can go and do some problem solving and figure out, “He’d like to stay home. We can’t go and live with him. What do you have for us?”
It is tricky. And I have heard loads and loads of positive stories about that type of use of cameras. And a friend of mine in Canada who wasn’t able to be an in-person support or care partner for one of his parents or family members back in Asia, Malaysia, somewhere, he set up a camera system and that was incredibly successful.
That’s Kate Swaffer again. And Nikki says that, for her part, she really tried to respect her mom’s privacy and agency.
It’s about us. What are we going to do to make this work? How are we going to have this new lifestyle? You know, I want to make sure that we’re both feeling seen and we’re both feeling safe. I think that’s what cameras are about.
Having the conversation earlier, rather than later, is a key part of this. Talking to someone about what they want, what they don’t want, and what would work best maybe as a compromise.
In the series, we tried to explore some of these concerns and questions: from kids making decisions for their parents, to the complicated conversations you might have with a parent or partner about what they do or don’t want to know about technology. This stuff is complicated, and it’s hard to navigate.
I don’t have all the answers here. But I do think that we can begin to get to a better possible future by reconsidering our assumptions about the capacity of older adults, including those with dementia.
I’m really interested in thinking about technology changing attitudes toward dementia and I do think there’s space there, right? Because then instead of focusing on the person who’s affected by all this, like, stigma and layers of discrimination and, you know… We’re thinking about the people who, if they think about things a little differently, we might open up some more space.
Instead of jumping to invent a million devices to “solve” one problem or another, we could talk to people, ask them what they want and need, and then think about why that is. When a senior turns her thermostat up to 92° because she’s lonely and the only human interaction she gets is that telecare worker… maybe the next step isn’t to send in a robot or AI-driven app, or to fix the thermostat. Maybe it’s to consider why we’ve set up our structures such that the people we care for are aging in such secluded, lonely places in the first place.
There’s no one answer to that question, but one of the reasons certainly has to do with money. Specifically, the cost of care. And that is what we are going to talk about next week.
[Flash Forward closing music begins – a snapping, synthy piece]
This is the third episode in our series that explains the science, tech, economics, and policy behind the “Welcome to Vanguard Estates” series. If you haven’t already checked out the big, choose-your-own-path story, please go do that! We worked very hard on it, and we’re very proud of it. You can find it in your podcasting app, or, if you prefer, as a text game online at FlashForwardPod.com/Vanguard.
Okay, see you next week!
[music fades out]