
Dollars For Data

July 21, 2020

Today we travel to a future where you can sell your personal data directly to a company, to get a better deal on a car or a house or a latte. What could go wrong? 

Guests:

Richie Etwaru, founder and CEO of Hu-manity.co
Sarah Jeong, lawyer and journalist
Abigail Echo-Hawk, chief research officer at the Seattle Indian Health Board and director of the Urban Indian Health Institute

Voice Actors:

Brian Downs, Keith Houston, and Shara Kirby

Further Reading:

Episode Sponsors:

PNAS Science Sessions and MOVA Globes

Flash Forward is produced by me, Rose Eveleth. The intro music is by Asura and the outro music is by Hussalonia. The episode art is by Matt Lubchansky. The voices from the future this episode were provided by Brian Downs, Keith Houston, and Shara Kirby.

If you want to suggest a future we should take on, send us a note on Twitter, Facebook or by email at info@flashforwardpod.com. We love hearing your ideas! And if you think you’ve spotted one of the little references I’ve hidden in the episode, email us there too. If you’re right, I’ll send you something cool. 

And if you want to support the show, there are a few ways you can do that too! Head to www.flashforwardpod.com/support for more about how to give. But if that’s not in the cards for you, you can head to iTunes and leave us a nice review or just tell your friends about us. Those things really do help. 

That’s all for this future, come back next time and we’ll travel to a new one. 

FULL TRANSCRIPT BELOW

▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹

FLASH FORWARD

S6E08- “Dollars for Data”

[Flash Forward intro music- “Whispering Through” by Asura, an electronic, rhythm-heavy piece]

ROSE EVELETH:

Hello, and welcome to Flash Forward! I’m Rose, and I’m your host, and I’ve realized recently that I say the exact same thing on every episode intro, and maybe I should change it up, so… hey. (small laugh) It’s me. 

I’m Rose. I’m the host of this show. This show is Flash Forward. Flash Forward is a show about the future. Every episode we go to a specific possible, or, sometimes, not so possible future scenario. We always start with a little audio field trip into the future to check out what’s going on, and then we teleport back to today to talk to experts about how that world that we just heard might actually happen.

Sometimes people ask me why I always say that at the top of every episode, that we do fiction first and then journalism, and the answer is that, in the early days of the show when I was piloting it and I sent it to people, if I didn’t say that at the top, they thought that the whole thing was made up, like all the experts were also actors- which is not the case. (small laugh) So, we do fiction first, then we do journalism, and, together, we get a better understanding of the future. Got it? Great!

Oh, and today’s episode starts with a little fictional sketch about a murder, so if you want to skip that, head to about six minutes into the episode.

This episode, we are starting in the year 2022. 

[Intro music fades out]

FICTION SKETCH

[Music- A lazy, easy listening piece. Think elevator music, but less aggressive melodically. It’s got a good beat.]

[A machine is turned on, and it starts whirring. Something is uncapped]

[Someone begins typing extremely fast. Things beep! and ding! in the background]

[Then a voice comes in. It’s warm, friendly, soothing. Trustworthy. It’s also an ad.]

[It’s backed by new music: warm strings coming in over some light guitar strumming. It’s designed to feel comforting, to put you at ease.]

ADVERTISEMENT:
Hello Monica. Do you have a moment to talk about your data? (slight glitching) It is- yours, after all.

[Monica ignores it and keeps typing]

ADVERTISEMENT (cont’d) (slight glitching):

And just like you, it’s valuable.

Did you know that you could make hundreds of dollars a month, just from your data? Lots of places offer services to sell your data, but only Omicron offers complete customization.

Sell only what you want- nothing more, nothing less- at Omicron. We see you for you, not for data.

MONICA:

Oh shut up

[She promptly turns off the ad, then goes back to her typing. She keeps at it for a second, but then:]

[Knock-knock-knock-knock-knock— someone’s at her door]

[She stops, gets up, and walks over to the door. Opens it.]

COP:
Cynthia Bell?

MONICA:
Uh… no.

COP:
What’s your name? 

MONICA:
Monica Pugh

COP:
Can I come in?

MONICA:
Uh, (slight laugh) why?

COP:
I just- want to ask you some questions.

MONICA:
Do you have a warrant?

COP:
No, and I’m not here to search your apartment, I, I just- just want to ask you some questions.

MONICA:
I didn’t do anything. 

COP:
Great. Can I come in?

MONICA:
Uh, no! 

[The cop sighs]

COP:

Monica, you, you said your name was? I know that’s not your actual name. And this will be much easier if you talk to me, before my boss gets here and drags you into the station. 

[Beat]

MONICA (sighing):
Okay.

[She lets the cop in and closes the door. They walk down the hall to the table, where they both sit]

[In the background are the faint strains of upbeat video-game type music]

COP:

Alright. Do you recognize this person?

[He slides a piece of paper across the table to her. She picks it up]

MONICA:

No.

[The cop slides another photograph over]

COP:

How about now?

MONICA:

Jesus, no; what happened to him?!

COP:

He was murdered last weekend, coming home from his shift at Wendy’s. 

MONICA:

I have no idea who that is or why you’re asking me about him.

COP (sliding another paper over):

Do you recognize this person?

MONICA:

No.

COP:

Monica, that’s you. 

MONICA:

No, it’s not. You think I look like this person? 

COP:

Yeah. 

MONICA:

Are you kidding me? This doesn’t look like me at all, and I would never wear that!

COP:

Well, you fit the description. African American woman, five four, dark hair.

MONICA:

My nose doesn’t look like that! You seriously think that’s me? You really can’t tell Black people apart, huh? That’s not me. (flips the photo, softer) That’s not even a real person. 

COP:

An unreal person doesn’t get caught on CCTV and have facial recognition systems match them to you, Cynthia. 

MONICA:

No, you don’t understand, this- “person” never existed!

COP (disbelieving laugh):

Well, we traced their GPS data to the scene of the crime.

MONICA:

Look, o-okay… I do recognize this picture- because I made it. This is a computer generated image; Cynthia Bell is not a real human. I made her up. All her data is generated on the servers in my back office. She “walks” around the city generating data that I then sell, to make money. 

I have a bunch of them, but they’re not real okay? They’re just code. They don’t go anywhere, they definitely don’t murder anybody!

COP:

Well.. that’s fraud, first of all, but, uh, that’s, that’s not really my problem.

MONICA:

I didn’t murder anybody. You know, I want a lawyer.

COP:

Well, you’ve already confessed to a crime, so you, uh, you forfeited that option.

MONICA (nervous half-laugh):

I- I don’t think that’s how the law works? Uh- 

Look, fine, take the servers, do whatever, just- (steadying inhale) I swear to god I had nothing to do with this! I was just trying to make some money to pay off my student loans, okay? Jesus christ!

COP:

You expect me to believe that there is a woman who fits the description, yeah? Whose data signature matches the murderer, and whose location data proves that she was there at the scene of the crime, but (short, sarcastic laugh) yeah, it’s all coincidence?

MONICA:
This woman doesn’t exist. She’s not real!

COP:

I’m pretty sure I’m looking right at her.

MONICA:

No- (laugh of disbelief) it’s not me! It’s just data. I was here coding all night!

COP:

Uh huh. Yeah. Yeah, riight.

MONICA:

Okay, look, I’m not saying anything else. I want a lawyer. 

COP (sarcastic, dismissive):

Oh! Well, maybe.. you can code yourself one. 

[He walks out]

[Music- MUSIC DESCRIPTION (6:05)]

SKETCH ENDS 

ROSE:

Okay, today I want to talk about a proposal that I have been seeing pop up here and there for maybe something like ten years now:

This idea that since we, you and me, generate all this data, and data is what makes companies like Facebook and Amazon and Google all their money, that we should be entitled to a cut of that money. That we should be paid for our data.

RICHIE ETWARU (recording):
So to me, the value of data has nothing to do with the size of the data. It has a little bit to do with the quality of the data, but it has a lot to do with what is that data going to be used for? 

ROSE:
This is Richie Etwaru, the founder and CEO of a company called Hu-manity.co.

RICHIE:
So if I’m selling you my healthcare data just to be able to do some broad ranging randomized study of something, by all means, that’s probably the lowest y- the lowest value of the data. But if you’re using that data because I’m a very unique individual, and I have very specific things in that, that piece of data is needed for you to make a multi-billion dollar drug where you’re gonna make- hundreds of billions of dollars of profits on it? That data might be priced a little bit differently.

ROSE:
And Richie has been thinking about monetizing personal data for a long time.

RICHIE:
In my early 20’s I founded my first company that.. that bought driver records from the Department of Motor Vehicles, uh, legally, and sold those driver records to trucking companies so that trucking companies can check the driver history of those drivers to make sure that they’re safe, and to comply with federal motor carrier safety laws.

And to make a long story short, I was twenty-four, I was co-founder of a startup, you know, young and excited. And, and I remember writing code one night- (stumbling) you- which is how you develop software, and stopping and saying, you know, I’m seeing people’s suspensions. I’m seeing them being suspended for not paying alimony. I’m seeing their addresses as I’m processing this DMV data. And more importantly, I’m buying this data for X amount of dollars, and I’m selling it for X, you know, times five.

And, and it just occurred to me at the moment that, you know- this is too good to be true, like, how can I be making money? The drivers don’t know me. They certainly don’t know this is going on. And, uh, that- you know, I’m either doing something really illegal or this is, like, the best country in the world. (laughing) And, and obviously, I went to the lawyers and they said, no, it’s perfectly legal. This is the way it works. 

ROSE:
But even if it was legal, Richie said it still did not feel- great. 

RICHIE:
It was at that moment that I realized that at some point in the future- and this was 20 years ago, right; it’s nothing compared to data today. At some point in the future, the creating source of that data, where the data came from, given that it has value, is going to be a part of the economic question. 

ROSE:
Today, he is an advocate for a personal data marketplace of sorts. For you to be able to control exactly who gets your data, what they use it for, and most importantly, understand what you get out of it.

RICHIE:
We’re on a mission to make sure that we create a world where data ownership is, is so embedded in society that it feels like a human right.

ROSE:
And he means “human right” very literally. A few years ago, Richie and his company launched a campaign to add data ownership to the list of official human rights. Right now there are thirty of them, as defined by the UN Universal Declaration of Human Rights. Things like “No one shall be held in slavery or servitude; slavery and the slave trade shall be prohibited in all their forms.” 

RICHIE:
We actually went ahead somewhat, um, you know, bravely or courageously and defined what a thirty-first human right would look like. There are only thirty human rights in the world today, and we define it as human beings should have the right to negotiate choice, control, and consumption of their data as they do around any property that they own today. Just like you own your car or your home, you would own it as property.

ROSE:
So far, there hasn’t been a ton of traction on the idea of adding this to the official list of thirty human rights. And, to be honest, I don’t actually think that Richie would necessarily disagree with this; the push to make a thirty-first human right around data is kind of a publicity stunt to bring attention to this conversation.

And- it worked. When they proposed this, they got a lot of press.

Today, Richie is not campaigning for that right directly; that’s not what he spends most of his time doing. Most of Hu-manity.co’s work focuses on building tools for companies to use that let them hand you more control over your data. So, when you sign up for an app, and you go to click “I Accept” at the bottom of a very long document that you probably didn’t read, Richie wants to add a few more choices.

RICHIE:
The way that Terms & Conditions are presented to the consumer in the data privacy section, now, instead of having a simple- a single “I accept” button, they now have choice, where they can use- they can choose different versions of privacy. You want the silver level, the gold level or the platinum level. 

And, as you select silver, it shows you what data you’re giving up and what you’re getting in exchange; as you give up- as you select gold, what you’re giving up and what you’re getting in exchange; and in platinum, what you’re giving up on, what you’re getting in exchange.

ROSE:
And of course, if you pick the lowest level of privacy, you get the best deal.

RICHIE:
You know, one car company that we’re working with wants to put it in their leasing agreement where, if you leased a car and you give them the highest level of access to the data that comes off of the car, so that they can use that data to train their autonomous driving algorithms, which is their self-driving A.I., to drive the cars, they will give you a discount on the car lease. 

ROSE:
Richie sees this additional level of choice as the first step towards protecting people’s rights. 

RICHIE:
And eventually we get to a point where, look, this is just the way we behave. It’s like a human right. You got it. You, you- we just don’t abuse people’s data anymore. 

ROSE:
And, to get there, Richie, and other people who are in favor of this way of thinking about and handling our personal data, have pushed for laws all over the US that establish, essentially, a marketplace for your personal data, one that you can opt into and participate in, for money or discounts. And these laws have seen political support in states like Oregon and California. Democrats like it because they see it as a way to rein in big tech companies, and Republicans like it because it proposes a marketplace solution to a problem. 

And as this marketplace develops, in Richie’s version of this future, we all start to make real money from our data being passed around. 

RICHIE:
So far, we’re a two-income species. Primary income. Yes, there are many secondary sources of income, right, you know, all sorts of stuff. But primary is IP, which is your, your creativity, uh, labor, which is, which is your productivity. And I think data is about to become the third income stream. We’re about to become a three-income species where your data, which is your activity, will generate income for you.

ROSE:
And if you go even further down this road, Richie thinks that your data could in fact become your primary income, replacing your labor and your creativity. 

RICHIE:
So if your data is being used to over-automate and your data is being used to take the friction out of society, and we can all actually just go off and live on Gilligan’s Island and don’t have to worry about eight hour jobs per week, where do we get money from? 

Well, we have to get the money from the thing that’s coming from us that is actually used to automate the jobs that we should have done to get money. And I see that as socialism and capitalism coexisting, where the data- (laugh) that the money from the data actually becomes universal basic income.

ROSE:
If this idea sounds familiar, it could be because former presidential candidate Andrew Yang also started talking about this recently.

But not everybody is excited about this idea. The ACLU, for example, has come out solidly against most of these bills, and has specifically argued that Richie’s company, Hu-manity.co, should be considered a threat to privacy, not an advocate for it. They argue that Hu-manity.co is simply trying to invent a new kind of middleman who can siphon money off the already existing data marketplace by inserting themselves into the process instead of fighting against data abuse itself. In a statement on their site, the ACLU says that, quote, “Hu-manity.co is a for-profit company that promotes a data-as-property model in order to artificially generate market demand and substantial profits through government action.”

Other people have asked questions about how, exactly, we’re going to set the value of this data. In California, they have stipulated seven possible ways to calculate the value of data, and when you look at these methods side by side this problem becomes really clear: For the same data, one method of valuation says an account might be worth $140, and another method says it’s worth only fractions of a cent. 

There are also a few ethical and legal critiques of this idea.

ROSE (on phone):
Do you in fact own your data, legally? 

SARAH JEONG:
No..? That’s actually- so that depends on how you define data, right? Sorry, I, I can’t believe I just did the “that depends” thing; that’s what all lawyers do. 

ROSE:
You really are a lawyer! (laughs)

SARAH:
Yeah, um-

ROSE:
And, when we come back we’re going to talk about why this idea might seem alluring- but has some pretty serious caveats. 

ADVERTISEMENT BREAK

ROSE:
This episode of Flash Forward is sponsored in part by PNAS Science Sessions.

Science Sessions are short, in-depth conversations with the world’s top scientific researchers. In less time than it takes to drink a cup of coffee, you can learn all kinds of incredible facts. For example! One recent episode featured two researchers who study ancient predators to learn about evolution.

On this particular episode, you will learn how to pronounce incredible words like crocodylomorph and thalattosuchians, which are these amazing extinct crocodile relatives that researcher Steve Brusatte describes as, quote, “essentially like a croc version of a killer whale,” which is amazing to try and picture.

And these killer croco-whales, which is not a scientific term but rather one that I just made up, evolved to dominate the ocean from land-dwelling ancestors. By studying fossils and, in particular, the structure of their inner ears, researchers can better understand how a species goes from land to sea.

And you can learn all about that in just eleven minutes.

Listen and subscribe to Science Sessions on iTunes, Spotify, Google Play, Stitcher, or- wherever you get your podcasts.

ADVERTISEMENT END

ROSE:
Okay, so I totally get why this idea is enticing. Google’s parent company, Alphabet, is worth a trillion dollars. A trillion, with a T! And a lot of that money has its origins in the data that you and I generate. Yes, Google does smart things with it; they analyze it, they package it, they make stuff with it, all that, but it all starts with us, with our information. So why shouldn’t we get some small cut of that? 

Well, for one thing, you don’t actually own your data. Not legally speaking at least.

SARAH:
In general, let’s talk about, like, your movements, right. So, your, your GPS coordinates over a given period, like, your browsing habits, your biometric data, for instance. So, like, the sum total of your blood type, plus.. how often you get your period plus your medical history plus, like, your medications- all that stuff. No, you don’t own it. No one owns it, in fact.

ROSE:
This is Sarah Jeong, a lawyer and a journalist who writes about technology. And she says that data, like say your GPS coordinates, they’re considered facts. And you can’t own a fact.

SARAH:
There is no copyright, for instance, in the phone book. There is no copyright in geographic locations. 

So, like, even though mapmaking is an incredibly, like, difficult process, very onerous, no one can assert copyright in, like, the shape of a landmass, even though it took a lot of work to sketch it out in the first place. 

ROSE:
In honor of maps, I’m going to take us on a short detour here. (softer) Ba-dum ching! (regular volume) Okay, so, you might have heard that maps have so-called “copyright traps” in them. Fake streets, or even sometimes fake towns, placed to catch people who might be copying a mapmaker’s hard work.

This is true, some map makers do do this. And in researching this, I came across a truly incredible story that I cannot resist sharing with you, so bear with me here. 

In the 1930s a guy named Otto G. Lindberg of General Drafting Co. was looking at a map made by one of his competitors, Grand McNally [sic]. And he noticed something on that map that was very suspicious: In upstate New York, between the towns of Rockland and Beaverkill, there was a little town on the map called Agloe. 

Otto could not believe his eyes: He knew that Agloe didn’t exist, because he had placed Agloe on his map as a trap. The story goes that the name Agloe was a mashup of the first letters of his assistant’s name, Ernest Alpers, and his own name, Otto G. Lindberg: AGLOE. Otto had caught Rand McNally in a blatant act of plagiarism.

So, Otto took the giant mapmaker to court, arguing that obviously they had stolen his map because Agloe did not exist. And Rand McNally’s lawyers countered that in fact, it did.

Their mapmakers had driven along the route, and encountered a general store. The Agloe General Store. Apparently, some locals had seen the name “Agloe” on a map and decided to name their general store after it. And thus, a trap became a real-ish place. Otto, according to the stories at least, lost the court case. 

I do want to add one thing about this story, which is that I’ve seen it told and reproduced all over the Internet, but none of the stories about it actually provide any original evidence. It has some signs of being kind of like an urban legend— where the story is told in almost the same way everywhere you find it? 

So, I tried to do some additional sleuthing to figure out if this (laughing) actually ever happened, and I searched some old newspaper archives for the name Agloe, and it doesn’t appear in any of the newspaper archives that I have access to. I also couldn’t find any records of this court case that supposedly happened, which isn’t necessarily surprising, given that it happened in the 1930s and records from this time are really hard to find.

But from there, I went down a little bit of a (small laugh) research rabbit hole, which we won’t get into right now, but if you want to hear about that process, you can check out the bonus podcast this week, which you can get access to by becoming either a Patron or a member of the Time Traveler Club. 

And you can get more on both of those things at flashforwardpod.com/support

So it is true that some maps have these copyright traps in them. But! You might be wondering, as I was, how that actually works- what’s the point of putting a copyright trap in them?  

The answer is that these copyright traps don’t work, at least not in a copyright case in a court of law.

In 1992, a map company called Nester’s Map & Guide Corporation took a rival to court after they caught a copycat using this kind of trap. And the court decided that these fake facts are not protected by copyright either. The judgment included the following explanation, quote: “To treat ‘false’ facts interspersed among actual facts and represented as actual facts as fiction would mean that no one could ever reproduce or copy actual facts without risk of reproducing a false fact and thereby violating a copyright.”

One last fun fact about these traps is that people used to do the same kind of thing in the phone book too. Phone book companies would put fake names into their books to catch copycats. But again, that did not hold up in court.

The basic gist here is that people can potentially claim copyright over the order, selection, and presentation of facts, but not the facts themselves. So while I could write an opera based on, say, the fact that NASA once sent bullfrogs into space on a mission called the “Orbiting Frog Otolith” (which is real), I cannot copyright that fact itself.

SARAH:
Lin-Manuel Miranda, for instance, does pay the author of the Hamilton biography that he based Hamilton on. But arguably he wouldn’t need to, if he, like, really didn’t want to, because that guy doesn’t have a copyright in the facts of Hamilton’s life. 

ROSE:
I cannot own the fact that I used four gallons of gas while driving yesterday. Or the fact that I had to make four U-turns because I could not find the address I was looking for. And this makes selling those facts hard, because, well, you usually can’t sell something you don’t actually own. And if you did try to start applying copyright to your personal data, things could get kind of weird. 

SARAH:
If you just look at sort of the copyright context, you can kind of see all of the ways in which this could, you know, go wrong. Like, in the copyright context, we have, you know, an established industry in which creative people, like, sell their art for money as they engage in this marketplace where there is a law that makes it so that they can make their own art into property, and then they can sell it to people. 

ROSE:
So if you start to think about your data as if it is your creative output- your data is your album, for example- you can then start to imagine what some of these contracts and deals and fights might look like.

SARAH:
But then you have stuff like- Taylor Swift’s whole deal with her old label where she couldn’t use her own masters. 

ROSE:
If you are not familiar with this, the basic outline is that— when she was fifteen years old, Taylor Swift signed a deal with a record company called Big Machine. That label now owns all six of her multi-platinum albums, including the masters, which are the original recordings of a song or album. Swift left the company in 2019, and that means that she doesn’t have any right to use her own original recordings anymore. So if she wanted to remaster these songs, which is a thing that artists do all the time, she would have to pay the record label to be able to use them, and they could say “No.”

SARAH:
Taylor Swift is a pretty sophisticated actor, and.. we’ve had an American music industry for a really long time with, like, a lot of precedent for setting value and so on and so forth. And it still feels really weird to take these master recordings and no longer give Taylor Swift access to them. I- It’s, I think, much more dangerous when we’re talking about ordinary people and their personal information.

ROSE:
So, let’s consider this scenario: you sell your genetic data to a company that wants to use it for drug development. Sweet. You get paid, awesome. They now own that data. Some number of years down the line, you get sick, and you want to undergo, for example, a gene-based therapy from a rival drug company. For that to work the drug company needs your genetic data. But.. you sold it to company #1. You don’t own it anymore, and if they don’t want to play nice and share with drug company #2, they do not have to. 

SARAH:
Oh, totally. Like that, that is, like- that is, kind of, unfortunately, a plausible dystopian future. Like, that’s probably one of the more plausible ones. If you want to go with implausible, there’s actually one of the Gitmo cases- The United States was asserting classified status on the contents of one of the detainees’ heads, their actual thoughts and memories.

And yeah, like, if you want to really go full dystopian, it’s like- it’s this idea that the thoughts in your brain can be alienated from you and sold. (slight laugh) And they no longer belong to you. 

ROSE:
Sarah wrote a piece a while back for the New York Times arguing that not only can you not really, legally, sell your data, but also that you shouldn’t be able to. There are some things that we don’t let people sell, like organs.

SARAH:
The idea is that, because of the fundamental inequalities in society, you don’t want them to be able to sell certain things. So.. your body parts is one of them. Your liberty, like you’re not allowed to sell yourself as a slave, for instance. Like, that is- I mean, that’s illegal on so many levels besides being morally repugnant. But the idea is that people cannot meaningfully consent to certain things, and if you did indeed consent to X thing, the structure of society is such that people are so likely to get taken advantage of. We should forbid it. 

ROSE:
We talked about this a little bit on the head transplant episode, the question of harm and consent and what we should be allowed to sell or click “I accept” to. 

The moral hazard argument says that because we don’t live in a utopia where everybody’s basic needs are met, the most marginalized among us might be pushed to do something genuinely risky and harmful just for money. Like selling an organ. So we should forbid that. 

When you ask Richie about this critique he gets… uh, let’s say passionate.

RICHIE:
I grew up in a tremendously poor neighborhood as a tremendously poor person. And there has been many times, many times where I was in a store, especially when I used to go buy groceries, when I was in the store and I got to the line in the front, and I realized that I didn’t have enough money to pay for the things that was in my bag [sic], like I had to put some oranges back, or I had to put some flour back, or I had to put a bread back, because I just didn’t have enough money; I was that poor. 

Let me tell you something. When you’re that poor, if you can get money for your data, I goddamn believe you should get that money for that data, as opposed to arguing that, no, no, you shouldn’t get money because rich people can actually not sell their data. I think that is such a low IQ argument. I think it is an injustice to mankind, and it’s an abuse of power. It’s an abuse of a political bias to bring that argument into the mainstream. It’s shameful. 

ROSE:
Richie argues that there is no real hazard to selling your data. And besides, he says, this kind of data stealing is happening anyway. 

RICHIE:
You have my data! You’re using my data! You’re using my data for free! Okay? All I’m saying is, hey, man, give me a piece of that action. 

It’s not like poor people’s data are not going to be abused. They’re going to be abused equally. Why not give them money? 

[Pause; Rose, on the other end of the line, begins to ask another question, but cuts off as Richie continues]

RICHIE (cont’d):
And when you take the data from me, it’s not like you’re taking my kidney. 

ROSE:
This is an argument that I think is actually pretty common: This idea that well, it’s happening anyway. And, hey- I mean, how much harm can there really be if I give my data away?

And sure, your GPS data is not your kidney. No one is arguing that those things are equivalent. But to say that there is no potential harm in sharing your data is also a little bit naive, I think. 

SARAH:
I, I think people would be shocked at how much you can extrapolate out of stuff that isn’t necessarily protected by HIPAA.

So, there’s the- really, the example that everyone trots out is, is the example in the Charles Duhigg piece from years and years ago in The New York Times Magazine, where a family receives a flier from Target advertising pregnancy products, and the father of the family calls Target up and screams at them for sending it to them when they’re not expecting a child in the family.

And then Target reaches out again later, after apologizing profusely. And the father’s like, never mind. Sorry about that. It turns out my teenage daughter is indeed pregnant.

ROSE:
Target was probably able to figure this out based on all kinds of data points— not just what she bought, but smaller, more granular data points, like how long she spent in an aisle of a store. What she might have looked at, which shelf she might have looked at. Those kinds of really specific pieces of information. 

And, with that, companies can come to conclusions, not just about you, but about your community, and even the nation. When they have that kind of data on a hundred or a thousand households in each area across the country, companies can start to make really precise and accurate predictions about our behavior, our desires, and our private lives. 

SARAH:
This is why people think their phones are secretly spying on them to serve up advertisements. 

ROSE:
Okay, we’re going to take a really quick left turn here because this is actually something that I get asked about all the time.

Probably, like, once a week I get a text from a friend being like “no, but really, my phone is listening to me right?” and detailing some example of how they had been talking about a product with their friend, or their partner, and then all of a sudden saw an ad for that product on their phone or when they were browsing the Internet. 

So let me just clear the air here: Your phone is not listening to you. At least, not like that.

SARAH:
Yeah, no, it’s actually- it’s, it’s a hundred times worse than your phone listening to you. Right, like, it’s this idea that all of this, all these tiny pieces of information. And it’s not just like the sites you visit, it’s the sites that other people visit while they’re on the same Wi-Fi network as you or in the same location as you. Like I and my boyfriend don’t need to log onto the same computer ever in order for data brokers to be able to link us together, because we’re on the same Wi-Fi, because we’re often at the same location. We have each other’s phone numbers in each other’s, uh, phones. So there’s all these little discrete pieces of information that link us together.

And when one of us makes a purchase, there are all these other inferences that data brokers can make about the other person. And the same goes for your friends, your extended family, your neighbors, other people that data brokers think are in your same sort of social class. 

There’s so many things that can be pulled together to create this predictive map that- now suddenly you think your phone is listening to you when in fact you’re just giving away a lot of information about yourself, and if you aren’t, your family is, or your friends are.

ROSE:
We are, and I include myself in this, really predictable. And we’re exuding revealing data about ourselves all the time. Like PigPen, just constantly leaving behind a cloudy trail of data dust.

And when you combine those two things, it’s really easy for an advertiser to guess that you might want this particular brand of toothpaste or shirt or workout equipment. Plus, you never notice it when it doesn’t match what you were just talking about, so you can add onto this whole thing a healthy dose of confirmation bias. 

And this gets us back to the question of data selling, because your data isn’t just yours. I want you to imagine, again, PigPen, right, with all the dirt swirling around him? So that dust cloud is your data. And we are all constantly surrounded by it. And when you and I get close to one another, our data clouds mingle, and I take some of your dirt with me as I go by and you get some of mine. So when I decide, then, to sell my data, I’m also kind of selling yours.

SARAH:
I mean, like, just take health data, for instance. When you give away information, for instance, about your blood type, you’re giving away information about your parents’ blood type and your children’s blood type as well. Not their co- the complete information, right, but you’re giving away like a strong likelihood of what your parents’ blood type is or what your children’s blood type is. You- Whatever you do, you give away something about someone close to you who hasn’t necessarily consented to having that given away, even if you have.

ROSE:
This is all extra true for health and DNA information— which is something we’ve talked about in the past on this show. If you decide to do 23andme or some similar DNA sequencing service, you are basically consenting to give the genetic information of a lot of your family away, too. 

This idea that this data isn’t really all that dangerous to sell or pass around to get a better deal on a car, is, again, in my opinion, kind of naive. It ignores the ways in which law enforcement tracks protestors, for example. It ignores the kinds of tracing and data gathering that ICE does.

Even if you personally don’t care that much about whether- Target knows what you’re shopping for, it doesn’t take a super dystopian brain to think about situations in which your data could put other people in your life at risk. Maybe your family is fine with you being queer or trans, but what if your data winds up accidentally outing a friend to their far less supportive family? 

There are documented cases where sex workers, who use a completely different name and identity, will log into Facebook and see their clients recommended to them as possible “Friends.” Which- likely means that their clients were seeing the same thing, and could suddenly know their real names and identities, which can be really dangerous for people.

Even if it is technically anonymized, data can still lead to negative outcomes. And, it’s worth saying, as data processing techniques get more sophisticated, the idea of anonymized data becomes more and more implausible. One recent study was able to re-connect people with their so-called “anonymous” data with a 95% accuracy rate.

Even Richie agrees that the ways that companies pass your data around these days is creepy and weird.

RICHIE:
I know my doctor has my data. But, but why- I, I don’t- why does, why does my toothpaste company suddenly know to advertise to me because I went to the dentist yesterday? Why did that happen? How did that happen? And that is where every consumer is a tinfoil consumer at that point, right. Every consumer wakes up when they realize that the issue is not with the first-party company that they’re dealing with; it’s with the second-party, third-party, fourth-party downstreams who might be using that data in a completely inconsistent way that was, that was part of sort of that unspoken contract with the first-party company.

ROSE:
I guess my question is, like, why not then just stop that from happening, instead of trying to kind of turn it into a marketplace? 

RICHIE:
Yeah.

[Pause]

RICHIE (cont’d):
So that is a very high IQ, um, point of view, and I’ve dealt with this point of view before, and I think it is, it is.. particularly enticing if we believe that, that human beings are not driven by greed. But, as I think many of us would agree, (slight laugh) human beings are driven by greed, vanity and hedonism. 

ROSE:
If you are a long time listener of the show, you know (slight laugh) that I wouldn’t actually agree with that. 

But this is an argument I hear a lot from folks who work in Silicon Valley, and I think that this is actually a great encapsulation of why some of the products and services and ways of thinking that come from the tech industry can feel so brutal: Because they are operating from this mindset of greed, and they assume that everybody else is too. 

But I don’t think that’s the right assumption to make. In fact, there is behavioral research that shows that most people want to help each other, not purely get a leg up at any cost. That in many cases, when people could pick the greedy selfish option, they opt instead to help others. I will link to some of that research in the show notes, and talk about it in more detail on the bonus podcast this week. 

If you assume that the main driver of human behavior is greed, then, yeah, I guess this model of capitalism meets private property makes sense. 

RICHIE:
This thing is going to become a marketplace. It’s gonna be transacted. Anything of value in human history has always become transactionalized, and that’s because we’re driven on a premise of value exchange; it’s how we civilized and how we grow.

ROSE:
But I don’t think it has to be this way. I don’t think that this is a natural way of thinking about our personal information, or about anything, really. Capitalism is an invention, not a law of nature. 

Again, I totally get why this idea appeals to people. It feels like these mega companies are making all this money off of us, and we should do something to take some power or control back from them. But there are, in fact, other ways to do that.

SARAH:
What you want to do when companies are too big and too powerful and- aren’t giving back to society, is- you want to break them up and you want to tax them more. 

Like, this is.. kind of one of the weird things about the era that we’re in, where we keep talking about all of these alternative proposals to deal with.. this wild technological dystopia that we’re entering, and- we just keep skirting around the question of antitrust and the question of greater taxation, when those are the obvious answers to pretty much every social ill that we’ve got on the table, here. I- I think that, like, people think that getting money back from Facebook is somehow going to be justice, when really what we need is some kind of regulatory control so that people aren’t exploited.

ROSE:
And what if, along with those regulatory controls, we could also shift the way we think about data more fundamentally? What other options do we have, aside from saying things like “data is the new oil”?

When we come back, you’re gonna hear from someone who thinks about data in a totally different way. 

A quick note: after the break we’re going to talk a little bit about violence against Indigenous communities, so if you want to skip that, now is a good time to hit “stop” on this episode. 

ADVERTISEMENT BREAK

ROSE:
Today’s episode is supported in part by MOVA Globes.

MOVA Globes are the perfect piece of conversation-starting decor. With no batteries or messy cords, these globes rotate with the power of light and Earth’s magnetic field. As long as there’s ambient light around, the globes spin, with the help of hidden magnets.

MOVA has forty different designs, from outer space to famous artworks to classic old maps. 

Since this is the last MOVA ad, for now, if you all buy a bunch of globes, maybe they will come back and advertise again!

I was trying to think of something interesting to talk about that is globe-related, and weirdly, the first thing that popped into my head was the globe scene in the 1940 Charlie Chaplin movie The Great Dictator? If you’ve never seen it, there’s this iconic scene where Chaplin, who plays the dictator of a fictitious country called “Tomainia,” dances with an inflatable globe while fantasizing about taking over the world.

The Great Dictator is a really interesting film that came at a really interesting time. It was Chaplin’s first “talkie,” for one thing, and it became his most commercially successful film. But Chaplin himself later wrote that he would not have made the film at all if he had known at the time how bad things were in Germany.

The film is satirical; Chaplin is very clearly playing Hitler, and the whole film is an indictment of Nazism. But in his autobiography, years later, he wrote, quote, “Had I known of the actual horrors of the German concentration camps, I could not have made The Great Dictator. I could not have made fun of the homicidal insanity of the Nazis,” which I think is a really interesting comment on where satire stops being useful as an artistic strategy, and it got me thinking a lot about our current moment and how you know when you’re at that point. How do you know when satire is no longer appropriate?

Anyways, the globe scene is iconic, and it’s on YouTube, and you can watch it, or you can watch the whole movie at the Internet Archive. This doesn’t really have anything to do with MOVA Globes, but- you’ve just gotten a little bit of an insight into how my brain works.

And if you want your own globe to dance around with- (slight laugh) but hopefully, not to try and conquer- you are in luck. Flash Forward listeners get a special offer. Visit MOVAGlobes.com/flashforward and use coupon code ‘FLASHFORWARD’ at checkout for 10% off your purchase.

That’s MOVA, M-O-V-A, Globes.com/flashforward, and use the code FLASHFORWARD, all one word, at checkout for 10% off your purchase.

ADVERTISEMENT ENDS

ROSE:
If anybody knows what it’s like to have your data gathered and then used in ways that don’t benefit you, it’s Indigenous people. 

ABIGAIL ECHO-HAWK:
When I first started in this field, the first thing that I recognized is that every time somebody talked about my community, about American Indian, Alaska Native people, they talked about us in a very deficit way. They wanted to say how bad off our communities were. 

ROSE:
This is Abigail Echo-Hawk.

ABIGAIL:
I’m a citizen of the Pawnee Nation of Oklahoma, and I am the chief research officer at the Seattle Indian Health Board.

ROSE:
Abigail is the director of the Urban Indian Health Institute.

ABIGAIL:
I am really blessed to direct this tribal epidemiology center that works to ensure the representation of American Indians and Alaska Natives living in urban settings across the United States in the data. 

ROSE:
And that representation is important, because Abigail says that, since colonization, the data that has been gathered about American Indians and Alaska Natives has always been from this perspective of deficit. Researchers would come in, ask questions that didn’t always make sense, and then go off to make statements about these communities that didn’t match their experiences- and certainly didn’t help them. 

ABIGAIL:
And so for many native communities, there’s hesitancy to participate in government-funded data gathering and state-funded and county-funded data gathering, because that information has always been used against us, never for us. 

ROSE:
This history runs so deep that when Abigail started working in the field, she got a lot of pushback from her own community.

ABIGAIL:
They looked at me and were like, why are you this, like, Native person? 

[Rose laughs lightly]

ABIGAIL (cont’d):
You’re an Echo-Hawk. You’re a Pawnee. Why are you participating in something like this? And I had to go back to my communities and to my elders, and reflect on how do we gather this information up in a good way?

ROSE:
Those conversations and that work have led to what Abigail calls “decolonizing data”: working with data in a way that doesn’t just resist the colonial narratives, but actually builds a way of thinking about data that honors community, tradition and history.

ABIGAIL:
How do we go back to our traditional value systems of what data is and how it is meant to be used, and reclaim that we have always been data gatherers, we have always been the scientists who analyzed the data, and we have always been the community that participated in the analysis of that to ensure the well-being of our future generations? And so all of our work is centered in that idea. 

ROSE:
Where Western colonial science often describes data as cold, hard, impersonal, raw, a decolonial perspective sees data differently.

ABIGAIL:
We know that data is living, breathing, speaking entities to us in a cultural and spiritual way. And with that comes deep responsibility. So, when I talk about- I don’t say data points, when I talk about a graph I may have on, uh, maternal mortality. Instead, I talk about the mothers. I talk about the families. I talk about the children. I talk about the grandparents. 

We see each- data point in our health data as an individual, as a loved one, as a community member. 

ROSE:
To give a specific example of what this looks like— in 2018, Abigail collaborated with a researcher named Annita Lucchesi to publish the first ever study on missing and murdered Indigenous women and girls in the United States. 

ABIGAIL:
There is an absolute crisis, ongoing crisis of missing and murdered Indigenous women and girls across the United States and across the world where Indigenous populations live. Here in the United States, there is a legacy of violence against femme-identifying Native people, very focused on sexual violence, on murder, and kidnapping. And this started way back, at first colonization, and has continued to today.

ROSE:
For years, tribes would go to the FBI, the Department of Justice, to their state politicians.

ABIGAIL:
And these tribal leaders were saying, Our women are going missing, our women are being murdered, and.. nobody is doing anything about it.

ROSE:
And every time they heard the same thing: 

ABIGAIL:
Well, you have this story, and you have two stories, maybe you have three stories, but where’s the data?

ROSE:
When they asked for someone to try to collect that data, they were told wellll, it will take time and we’ll need a bunch of money from Congress. And of course, nothing happened. 

So, in 2017, Abigail connected with Annita, a PhD student, who was working on gathering this information herself.

ABIGAIL:
I self-funded that study; I raised the money by going out and doing speaking engagements. And it cost me twenty thousand dollars.

ROSE:
Because they approached the study with this decolonized framework, they knew the questions to ask, and they were able to get the data they needed. Participants were willing to talk to them, and they knew how to frame the questions to understand the problem. And in just a year, with a very small budget, they put together the study.

ABIGAIL:
And as a direct result of that, we have seen more than fifteen pieces of legislation passed across this country and numerous states. We’ve seen it talked about on the floors of Congress. We’ve seen it written into federal pieces of legislation and stated that data with twenty thousand dollars, and the support of a young PhD student, and my organization, working together, to elevate and to understand that as Native people we have a responsibility to those women who have gone missing, who continue to go missing, and who have been murdered and continue to be murdered in this country.

ROSE:
Something that they were told would take millions of dollars from Congress happened because they knew how to make it happen. 

So, when I asked Abigail how she felt about this idea of creating a marketplace for data, a way that you and I could sell our data to get a better deal on a car lease, you probably won’t be shocked to learn that she was not all that excited about it.

ABIGAIL:
Our greatest fear is that folks are going to attempt to monetize and to share what we see to be sacred knowledge. 

Our DNA is sacred knowledge. Our way of being, acting, driving, living is sacred knowledge. And the only folks who should and do have any purview over that is our communities, and so we live in a society as Indigenous people where we think as a community. And how do we as individuals benefit that community as a whole?

ROSE:
You might have noticed that Abigail comes back to that last point a lot— that the data must be in service of the people, the community. This is kind of the opposite of the “everybody is driven by greed” argument. This way of thinking says no, instead of seeing each person as an island, as a source of capital, you should think about all the ways we are connected, and care for one another.

ABIGAIL:
We are not here for individual wealth. We are here for community wholeness, and in this bidirectional relationship that we have with the land that we live on, with the people that we live next to, and the communities that we have been entrusted by our ancestors to hold in a way that benefits those future generations, monetization of the data is not what an Indigenous community is. 

ROSE:
And I think this is one of the lessons that we could all try to learn from this idea of decolonizing data— that instead of looking out for just you, and trying to make a few extra dollars, what could it look like if we looked out for our community and thought about what tools and ideas could benefit us all as a whole. 

This core idea, that the community should drive the data collection and processing, is one that everybody could learn from.

ABIGAIL:
The techniques that we use absolutely should be being applied and looking at data across other populations of people. In fact, what if the data that was gathered in the African-American community was gathered in the Latinx community, was gathered in the Pacific Islander and Asian communities, was done from the perspective in which we do it? 

We are gathering our data because we love our people. We are gathering our data because we want a better future for the next generations and for the people living in this country today. What if all data was gathered for those reasons? What would it look like?

ROSE:
And these aren’t hypothetical questions, either. These days, Abigail spends a lot of her time asking the CDC for data on COVID-19 so she can help her community. 

ABIGAIL:
And I have been fighting the CDC for data access that, as a tribal epidemiology center, as a tribal public health authority, I have a right to by federal law, and the CDC still will not give me all of the information that I asked for.

And right now, we know that American Indian/Alaska Native people are disproportionately impacted by COVID-19. I need that data. And so every single day right now, I am fighting one of the biggest federal agencies in the country to give me what I have a right to by law, so that the health and well-being of my people is not so disproportionately impacted by COVID-19.

ROSE:
So, what if, instead of turning our data into a commodity that we sell, we treated our data as a resource that could help our community? I think that future could look a lot brighter.

ABIGAIL:
An ideal future for data from an Indigenous perspective is one that is grounded in our relationship to the land, to the air, to the sun, to each other, to the plants, to the animals, to.. the humans that we live next to. That is all about a natural ecological system of responsibility. 

And- when we think about the future of data, what that means is that all information is being gathered for the purpose of creating that better world that looks at that ecological system.

So I know that our data practices will always be grounded in our traditional scientific knowledge systems. And as I continue to fight towards equity for our people, and data collection and data analysis and data storage and anything related to data, I am always reminded of my ancestors who survived that I may thrive. 

And across this world, Indigenous people are coming with that exact same idea. And when I think about this Indigenous future of data, I absolutely know we are going to achieve it. I know that it is coming and we are just that one small step in getting there. And so this will be data for Indigenous people by Indigenous people grounded in our knowledge systems and always done for the love of native people.

[A snapping, synthy music comes in- the Flash Forward closing theme]

Flash Forward is produced by me, Rose Eveleth. 

The intro music is by Asura and the outro music is by Hussalonia. The episode art is by Matt Lubchansky.

The voices from the future this episode were played by Brian Downs, Keith Houston and Shara Kirby. You can find out more about them in the show notes, and please do check out their work. Keith does a regular radio show on Sundays, and Shara has worked on some amazing projects. I will link to all of those things in the show notes.

If you want to suggest a future that I should take on, you can send me a note on Twitter, Facebook, or by email at info@flashforwardpod.com. I do love hearing your ideas! 

And if you think you’ve spotted one of the little references that I’ve hidden in the episode, you can email us there too. If you’re right, I will send you something cool. And if you want a hint about where to look for references, one tip I will give you is to not skip the advertisements.

If you want to discuss this episode, or some other episode, or just the future in general, you can join the Facebook group! Just search ‘Flash Forward Podcast’ and ask to join. There is a question you have to answer, just to weed out bots. It’s a very easy question, but please do answer it, because if you don’t, I will not let you in.

And if you want to support the show, there are a few ways you can do that, too! Head to www.flashforwardpod.com/support for more about how to give. If you become a Patron or a member of the Time Traveler Club, you get a bunch of cool things, like a special newsletter, a bonus podcast, a book club, a goody-bag, so- check that out if you’re interested.

If financial donations are not in the cards for you, you can head to Apple Podcasts and leave the show a nice review, or- you can just tell your friend about the show. Literally just pick a friend. Tell them, be like, “Hey, I like Flash Forward, it’s this podcast; have you listened to it?” That really does help. 

That’s all for this future. Come back next time and we’ll travel to a new one. 

[Music fades out]
