
Unreel

April 4, 2017

This month on Flash Forward, we go to a future where anybody can make a video of you doing anything they want. And that technology is cheap and easy to access. What happens?

This episode we start by talking about the technology as it exists now. Hamed Pirsiavash joins the show to explain his research into generating videos using algorithms. Here’s a little video of how it works.

And let’s just take a second to appreciate how creepy those baby videos are.

So that’s where the technology stands now. But once it gets better, there are all kinds of applications. Hal Hodson, a tech reporter at The Economist, tells us about how it could be used in movies. Right now, movie-makers use CGI to project faces onto other faces. Recently, in the latest Star Wars, the faces of Princess Leia and Grand Moff Tarkin from the original trilogy were projected onto actors’ faces for a few scenes in the new movie. Here’s a look at how they did it.

But in the future, they might not have to do any of this. They could simply generate the video they need using images of Leia and Tarkin’s faces. Which also means that movie stars could wind up being in hundreds of movies a year, since they don’t have to actually be there, on set, to act. And they could keep acting in movies long after they’ve died, too.

That’s a fun thing to think about. Here’s a less fun thing to think about: how people would use this technology to seek revenge and ruin people’s lives. And to talk through the legal implications, I called Carrie Goldberg, a lawyer who specializes in revenge porn cases. She explains how these generated videos of the future would actually get around today’s revenge porn laws.

Then, to wrap it all up, I talk to Jenna Wortham, a writer for the New York Times Magazine and the co-host of an amazing podcast called Still Processing. In a world where online identities are not only personally valuable, but economically valuable, what does this do to us? When anybody can torpedo your finely crafted online persona with a fake video, do we all just give up? Do we try to erase everything from the internet about ourselves? Or do we lean into this and start making wild aspirational and experimental videos? Or maybe all of the above?

Bonus: You will also find out what butter, the Falkland Islands, and Snakes on a Train have in common. According to Rose.

Some further reading for this episode:

Flash Forward is produced by me, Rose Eveleth, and is part of the Boing Boing podcast family. The intro music is by Asura and the outtro music is by Hussalonia. Special thanks this week to Wendy Hari, Jacki Sojico and Dan Tannenbaum. The episode art is by Matt Lubchansky.

If you want to suggest a future we should take on, send us a note on Twitter, Facebook or by email at info@flashforwardpod.com. We love hearing your ideas! And if you think you’ve spotted one of the little references I’ve hidden in the episode, email us there too. If you’re right, I’ll send you something cool.

And if you want to support the show, there are a few ways you can do that too! We have a Patreon page, where you can donate to the show. But if that’s not in the cards for you, you can head to iTunes and leave us a nice review or just tell your friends about us. Those things really do help.

That’s all for this future, come back next time, and we’ll travel to a new one!

▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹▹

TRANSCRIPT

Rose: Hello and welcome to Flash Forward! I’m Rose and I’m your host. Flash Forward is a show about the future! Every episode we try to really overthink a possible tomorrow, how would it work? What would it look like? Who lives? Who dies? Who winds up with a closet full of robots?

 

We always start with a trip to the future, a little bit of fiction to set the mood and let you know what future we’re headed towards. Then, we come back to today to talk with experts about how that future might really go down. Got it? Great! Let’s start this episode in the year 2034.

 

[SCENE]

 

Person 1: Hey, did you see this video?

Person 2: No… what is it?

Person 1: Hang on, let me start from the beginning… Isn’t that you?

Person 2: No I’ve never been to an Outback Steakhouse.

Person 1: Well it really looks like you.

Person 2: Yeah… it does… but it’s not me.

Person 1: Weird

 

[phone rings]

 

Person 2 (on the phone with parent): Hi mom… No, I know… No, it’s not me… Where did you even see that video? Yeah, no, it’s not me, mom. I know… I know, but it’s not. I don’t really know what to tell you. Okay, love you too, bye.

 

[text message noises]

 

Person 2: Jesus Christ…

Person 1: I mean it REALLY looks like you.

Person 2: Yeah thanks.

 

[text message noises]

 

[phone rings]

 

Person 2: Hello?

Oh, no sorry you’re mistaken that’s not me. No… no I know… but it’s not. No thank you. No, really, no thank you.

 

Person 1: Who was that?

Person 2: The local news… they think it’s me.

Person 1: I mean it REALLY looks like you!

Person 2: Can you play it again?

 

Person 2: Oh my god. I know who did this…. Give me my phone?

 

Brian!

Yeah I’ve seen it.

Yes it does look like me.

No I’m not impressed, do you know the news just called me?

Take it down.

Brian this is not funny. Take it down.

What?

No, not for a date… take it down come on.

Fine, if you take it down I will go on another date with you. Just take it down.

Okay

Bye.

 

Person 1: Um, what?

Person 2: Remember that weird computer art guy I dated? He figured out how to make videos of people just with computer graphics and their Facebook photos.

Person 1: So … he basically just blackmailed you into another date…

Person 2: I guess?

Person 1: What makes you think he won’t just do this same thing again?

Person 2: I don’t know…

 

[text messages]

 

[/SCENE]

 

Rose: Okay, so this is a future in which anybody can make a video of you, doing anything they want. They just give the system some photos of you, and say, hey, make a video of Rose dancing in the middle of Times Square, or whatever it is.

 

Just as a quick word to anybody who might be listening with kids, this first half of the episode is totally kid friendly, but after the ad break we get into some adult stuff involving sex and porn. I’ll warn you again when we get there, but just an FYI! In case you’re doing dishes and don’t think you’ll get to that pause button in about fifteen minutes. But this first part is fine! Don’t go yet!

 

Okay, anyway, video that anybody can generate. We’ll call it generative video in this episode; that’s not a technical term, but it kind of encompasses what we’re talking about. Now, this might seem impossible, but… it’s not. Just this weekend, an MIT researcher gave a talk about an AI system that could generate realistic videos of politicians giving speeches that they never gave. And there are other researchers working on algorithms that can generate videos from still images. Like this guy.

 

Hamed Pirsiavash: My name is Hamed Pirsiavash. I am an assistant professor at University of Maryland, Baltimore County.

 

Rose: Hamed and his team are working with researchers at MIT to figure out how to generate video using an algorithm.

 

Hamed: We have an artificial intelligence algorithm which is basically a deep learning algorithm. And what it does is it is producing a video which doesn’t exist but is trying to make the video as close as possible to natural video. Basically when a person is looking at the video the person can not tell if this is a natural video or computer generated.

 

Rose: Here’s how it works: first, the algorithm has to learn. And to teach it, the team fed the system two million unlabeled videos. That’s about two years’ worth of video material. Then, they set up two different neural networks. Those two networks were basically competing with one another. One of those networks would generate videos, and the other one tried to determine whether those generated videos were real or fake.

 

Hamed: So what is happening is there is a competition between these two, it is like a game. One of them is pushing to do the task, the other one is pushing to generate outputs so the second one can not do the task well.

 

Rose: So researchers would take an image, feed that image to algorithm #1, and ask it to generate the next few seconds of video. Then, algorithm #2 would watch that generated video and try to decide: wow that looks fake. Or, looks good to me! The idea here is that having an actual human watch all of the videos that a machine puts out to see if they’re realistic takes a ton of time and money. So this system allows them to train a machine to make better and better videos much faster using another machine.
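
If you want to see what that competition looks like in code, here’s a rough sketch of one round of the game, written in PyTorch. To be clear, this is just an illustration: the tiny flattened “frames,” the layer sizes, and the learning rates are all made up for the example, and the real research systems are much bigger and work on actual video.

```python
# A toy adversarial pair: network 1 (the "generator") makes fake frames,
# network 2 (the "discriminator") tries to tell fakes from real frames.
import torch
import torch.nn as nn

LATENT = 64       # size of the random "seed" the generator starts from
FRAME = 32 * 32   # one tiny flattened grayscale "frame", for simplicity

generator = nn.Sequential(
    nn.Linear(LATENT, 256), nn.ReLU(),
    nn.Linear(256, FRAME), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(FRAME, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCELoss()

def training_step(real_frames: torch.Tensor) -> None:
    batch = real_frames.size(0)
    real_label = torch.ones(batch, 1)
    fake_label = torch.zeros(batch, 1)

    # Discriminator's turn: reward it for calling real frames real
    # and generated frames fake.
    fakes = generator(torch.randn(batch, LATENT))
    d_loss = (loss_fn(discriminator(real_frames), real_label) +
              loss_fn(discriminator(fakes.detach()), fake_label))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator's turn: reward it for fooling the discriminator.
    g_loss = loss_fn(discriminator(fakes), real_label)
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

# Stand-in "real" data; the actual research trained on frames
# from two million real videos.
training_step(torch.rand(16, FRAME) * 2 - 1)
```

Run that step over millions of real frames and the generator slowly learns to produce frames the discriminator can’t flag, which is exactly the competition Hamed is describing.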

 

I’ll post some images of what they’ve been able to make so far on flashforwardpod.com, but let’s just be clear that right now these videos are not realistic. Some of them, you can kind of see what the algorithm was going for. There’s one of a train going past a station where you’re like, yeah, that’s okay. Maybe if you were really drunk, that’s what it would look like. But some of them are incredibly creepy. They have one where they asked the computer to generate a baby, the face of a baby, and it is… terrifying.

 

And in part that’s because humans are really hard to model, because we’re so wiggly.

 

Hamed: So human is not a rigid body, right? Humans can bend, it’s a flexible body. So the motion of a human is much more complicated compared to the motion of a train. We know that a train can move only along the tracks, and it is very rigid. It doesn’t have legs, it doesn’t have arms. It is just a box that is moving. But it is not the case with humans, humans have a lot of different types of motion. Like the skeleton of a human can move in many different directions. And handling that is much more difficult.

 

Rose: It’s also hard because we’re much better at picking out fake humans than we are at picking out fake trains.

 

Hamed: The human brain is very sensitive to humans. We have actually learned a lot of things about the human face, human bodies, and all this because we have seen a lot of them around us and those are the very important things that a human brain needed to process. So that’s why human is very sensitive to the irregularities on those lines. Even if you see a person walking, if he isn’t walking perfectly, your brain is going to catch it very easily. If the gait of the person is not perfect, you’re going to realize something is wrong here. But it may not be the case for animals, or it may not be the case if the train doesn’t move perfectly.

 

This is actually much deeper in the human face, particularly. If you look at the human face, if you move a few pixels, like a little patch on the human face, it is very obvious for the human that this is not a perfect face; it is manipulated. So generating a face is not actually very easy.

 

Rose: I mean the baby one is … so creepy.

 

Hamed: Exactly, exactly. If you look at the baby ones, there is no way that you can say it is natural. It’s really difficult to generate such videos.

 

Rose: And right now the videos they can generate are really small, but they’re also making pretty swift progress.

 

In the end, in theory at least, you could feed a particular algorithm an image and it could generate the next several seconds of incredibly convincing video for you. These teams are also experimenting with giving the algorithm a foreground and a background to use, to put two different images together into one video. This is something that movies have done a different way for a long time.

 

Hal Hodson: Things like Gollum are pretty advanced CGI.

 

Rose: This is Hal Hodson, he’s a tech reporter at The Economist.

 

Hal: And the way that they did the Moff Tarkin face in the new Star Wars is actually exactly the same tech as they used to put Gollum’s face on Andy Serkis. But just with a human face instead of a Gollum face.

 

Rose: And the movie business is definitely interested in these generative video techniques.

 

Hal: The idea is that they can generate exactly what you want. So instead of this kind of slight uncanniness that we get when we look at Tarkin or Leia in the new movies, or a lot of other CGI, these generative tools would be able to build something that doesn’t have that. But it’s sort of yet to be seen if they’re going to work.

 

Rose: Once this tech can go from rooms to people, it could totally change the movie business. If you could cast Marilyn Monroe in any movie now, because you can just generate her, that totally changes things.

 

Hal: One thing that I’ve been thinking about a lot is what happens to actors’ identities. Because that’s kind of the ultimate identity; it’s the ultimate personal brand. If this exists and you can totally capture your own data and propagate yourself into other roles, doesn’t that mean that you can massively grow the extent of your influence over space? So you wouldn’t necessarily just be famous in the US; you could sell your face to China and they could make Chinese-market movies using your famous face. But not just space, also time. You could sell yourself into the future, so that future people can watch new movies made for them and their time with John Wayne’s face on them.

 

Rose: I wonder if then you have two tiers of movies, where you have quote-unquote real movies where the real actors are there. I wonder if there are different categories in awards shows. A real actor doing a real thing, doing a new thing. Maybe that’s more valuable, or feels more authentic, quote-unquote. Or maybe film snobs are just like, oh no, it’s better if it’s on reel-to-reel and with real humans instead of regenerative people.

 

Hal: I think that offers a lens to think about how you would do verification in this weird world where everything is everything and nothing is real. I’m pretty sure those films would need some kind of Kitemark or a verification strategy to say, yes, I’ve been following this shoot the whole time and all of the scenes in this movie were shot with a hundred percent Grade-A human flesh. Otherwise, if we assume that they’re going to get to the point where you couldn’t really know, then how would you know? Not just human flesh, but the identity in question. You know, I’ve been to their trailer, I have asked them if they like coffee. I am highly confident that whoever, George Clooney, really was in this movie.

 

Rose: I can see that also being like a huge scandal. Like a Russian doping scandal.

 

Hal: And cue a thousand think pieces about why does it really matter, man. Everything is relative, and if you enjoyed it, that’s all that counts.

 

Rose: This all got me thinking about this movie production company called The Asylum, whose whole business model is making ripoff movies of big blockbusters. And they give these ripoffs really similar names, essentially hoping that people in Blockbuster, or online, will get confused and accidentally rent or buy their movie instead of the actual blockbuster that they’re looking for. So when the big movie “War of the Worlds” came out, The Asylum made a movie called “H.G. Wells’s War of the Worlds”. When “The Da Vinci Code” came out, they made “The Da Vinci Treasure”. When one of the “Pirates of the Caribbean” movies came out, they made “Pirates of Treasure Island”. And my very favorite: they released a movie called “Snakes on a Train” two days before “Snakes on a Plane” hit theaters.

 

Anyway, I bring all of this up because The Asylum is probably going to have a field day with this technology. Now, instead of actually hiring actors to make these so-called “mockbusters,” they can just computer-generate them. They can even make their computer-generated characters look ever so slightly like the actors in the actual films they are ripping off.

 

So companies are ready to exploit this, and so are scam artists. Who, of course, have been faking stuff like this for a long time. You’ve probably been fooled by a fake viral video on YouTube already. But I also want to tell you about this faked piece of media that goes back way before YouTube or even really the internet.

 

In 1982, the British were fighting a war with Argentina over the Falkland Islands. Margaret Thatcher was the British prime minister at the time, and at that point she was really unpopular. Then, in May of 1982, the Argentinians sank a British ship called the HMS Sheffield, killing twenty people on board. Thatcher responded really quickly, and many believe that swift response is what won her reelection the next year, in 1983.

 

But just after that election, a tape surfaced that cast some serious doubt on Thatcher’s decisions in the Falklands, and on her reelection. The tape seemed to be of Margaret Thatcher and Ronald Reagan talking to each other on the phone. And it seems like Thatcher is implying to Reagan that she actually sacrificed the HMS Sheffield for political gain: to escalate the Falklands war and have an excuse to really step in. Which is a huge accusation.

 

It turns out, though, that the tape was fake. It was in fact a series of clips from various Thatcher and Reagan speeches, spliced together to make it sound like a conversation. Once they figured out that the tape was fake, people thought that maybe the Argentinians or even the KGB were responsible. But it turns out it was created by an anarchist punk band called Crass.

 

So this is all to say that fake tapes have been around for a long time. And they nearly caused at least one international crisis. Fake video will likely be a whole other level of chaos.

 

Which has really interesting implications for journalism. A few episodes ago, we took on the question of a future where nobody trusts the news, and here we are again. If you can make a video of anybody doing anything, that could totally wreak havoc on the news cycle. People already get fooled all the time by fake images online, including journalists. Fake video is certain to be the same way. How will journalists vet and confirm videos? If a video pops up from someone that depicts a politician doing something… how do you tell if it’s real? I think that the answer is that a lot of people won’t be able to.

 

Hal: I think it might make people start asking the kinds of questions that journalists ask about these media things. You know all those threads, like the one of the giant jellyfish and a scuba diver swimming along next to it, that go viral and everyone says wow this is the coolest thing ever. And then someone, I think with this one it was a science journalist, says, I don’t think the jellyfish get that big. I think that we would be better off if everybody asked those questions. But probably there will be a horrible transition phase where everything sucks and people get fired because people make fake things of them. That’s probably already happening somewhere.

 

Rose: If we’ve learned anything from the past year or so, it’s that people are not good at determining what is real and what is fake. And when it comes to images and videos, that has real impact. There’s a whole field of research out there about the ways that images and videos change people’s memories. And there’s one study in particular that I think is really terrifying, honestly.

 

In this study, researchers asked participants to play a gambling game. Then, these same participants were called back for the second part of the experiment. Before they were brought into the room to play again, they were pulled aside and shown doctored footage of their partner cheating at the gambling game. Now, the partner didn’t cheat. The video was completely fabricated, so the subjects could not have seen this person cheat. But even though there was no way for them to have seen their partner cheating, 20% of participants were willing to sign a witness statement saying that they had. And even after they were told that the footage was doctored, some participants refused to relent. They really believed that they saw something that never happened, because they had been shown the footage.

 

So to prevent that from happening, news outlets need to vet their stuff. But how? Can they even do that?

 

Hal: If generative video does become a really big thing, and anybody can just click a few buttons and generate an embarrassing video of you, then the government, or official channels, may start using higher resolutions and more immersive content. Maybe all official government decrees come out in VR, which can’t be generated yet. Maybe there becomes this arms race between technological truthiness and our scepticism.

 

Rose: Or maybe there’s some kind of law that says that fake videos have to have some kind of marker or indication that they are fake.

 

So here’s a kind of weird analogy. In the 1870s, margarine arrives in America from Europe. Yes, we are now talking about butter substitutes and not high-tech video; stick with me here, I promise there’s a payoff. So margarine shows up, and American dairy farmers are extremely worried about this, because margarine is cheaper than butter. And this results in a huge battle between dairy farmers and margarine manufacturers. And a lot of the fight centered around the fact that the average consumer might not actually realize the difference between real butter and fake butter. And the margarine makers knew this; they would actually dye their product that classic yellow butter color. So if you didn’t realize that margarine was a thing, or you didn’t read the packaging carefully, you might buy fake butter thinking it was real butter.

 

So the solution that the dairy lobby came up with, to combat this fake butter and the potential consumer confusion, was to push laws that said that margarine could not be dyed yellow. By 1902, 32 states had rules about what color margarine could be. In Vermont, New Hampshire, and South Dakota there were laws saying that margarine had to be dyed pink. Other states opted for red, brown, or even black. Can you imagine buying black butter? That sounds awful. And those margarine-color laws stuck around for way longer than I realized. Wisconsin only repealed its margarine-color law in 1967.

 

So maybe we have “fake video” color laws. See, I told you I would bring it back. Maybe these fake videos have to be dyed pink, or have some kind of watermark on them. Or maybe they have little pixels embedded in them that are invisible to the naked eye but can be detected by video verification software.
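
Just to make that last idea concrete, here’s a toy version of an invisible marker, sketched in Python with NumPy. It flips the lowest bit of every pixel, a change of at most one brightness level out of 255, far too small to see but trivial for software to check. This is only an illustrative sketch; a real forensic watermark would have to survive compression and re-encoding, which this one would not.

```python
# Toy "invisible pixels" watermark: hide a synthetic-video flag in the
# least significant bit of each pixel of a frame.
import numpy as np

MARKER = 1  # low-bit value meaning "this frame is computer generated"

def tag_frame(frame: np.ndarray) -> np.ndarray:
    """Set the lowest bit of every pixel; invisible to the naked eye."""
    return (frame & 0xFE) | MARKER

def frame_is_tagged(frame: np.ndarray) -> bool:
    """Verification software just reads those low bits back."""
    return bool(np.all((frame & 1) == MARKER))

# A random stand-in for one 32x32 grayscale video frame.
frame = np.random.randint(0, 256, (32, 32), dtype=np.uint8)
tagged = tag_frame(frame)

assert frame_is_tagged(tagged)
# The largest change to any pixel is one brightness level out of 255.
print(int(np.abs(tagged.astype(int) - frame.astype(int)).max()))
```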

 

Hal: You could imagine video broadcasts being signed by the government publicly using cryptography to make sure the video is verifiable.
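
Here’s roughly what Hal’s idea could look like with tools that already exist, sketched with Python’s cryptography package and an Ed25519 key pair. The video bytes below are a placeholder; in practice the broadcaster would sign the actual file and publish the public key somewhere people trust.

```python
# Sketch of publicly verifiable video: the broadcaster signs the video bytes
# with a private key, and anyone can check the signature with the public key.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # held only by the broadcaster
public_key = private_key.public_key()       # published for everyone

video_bytes = b"...stand-in for the contents of an official video file..."
signature = private_key.sign(video_bytes)   # distributed alongside the video

# A viewer (or a platform) verifies before trusting the footage:
try:
    public_key.verify(signature, video_bytes)
    print("Signature valid: this footage matches what was broadcast.")
except InvalidSignature:
    print("Signature invalid: altered footage, or not from this source.")
```

The catch is that this is the margarine problem in reverse: a signature can prove a video is official, but it can’t prove an unsigned video is fake.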

 

Rose: But of course, videos aren’t like margarine… which is not something I ever thought I would say… but managing an endless stream of fake videos is way harder than managing margarine production. Videos spread across the web faster than any truck of fake butter can move, and it’s way harder to enforce rules about online video than it is to enforce rules about food color.

 

Even if there were a rule about pixels or encryption, it might not matter. Even today, there are tons of fake images that are VERY OBVIOUSLY fake and easy to fact-check, and I still see reputable journalists sharing them. Journalists, stop it! All those cheeky London tube signs are fake! ARGH!

 

Okay, maybe now is a good time for a break, and I’m going to calm down. When we come back, we’re going to talk about the legal side of all of this. Can you sue someone for making a video of you like this? And how does this change our sense of identity in the future? That, and more, in a second. But first, a word from our sponsors.

 

[[ADS]]

 

Rose: So in this future you can create a video of anybody doing anything. And I will admit that my first instinct was to think: oh my god, revenge porn. So, this next bit of the episode has some adult stuff in it, so if you’re listening with little kiddos that you don’t want to hear about some pretty gross revenge porn stuff, now’s the time to turn it off….

 

… Okay, so like I was saying, my first thought when I heard about this kind of technology, was to think about the ways that it could be used against people. I mean if you could really just make a video of anybody doing anything, you know there would be people making videos of politicians in all kinds of crazy sex acts. Or making videos of their ex girlfriends doing drugs or having sex with people. Or making videos of their bosses killing puppies, or something like that.

 

And to talk us through the legal side of this kind of thing, I called Carrie Goldberg.

 

Carrie Goldberg: I run a law firm in Brooklyn and we focus on victims of online harassment, sexual assault, and blackmail. And I got into this because I was somebody’s target myself and I had difficulty finding a lawyer to help me and I became the lawyer that I really needed.

 

Rose: Carrie has represented people in all kinds of cases that fall under a phenomenon called “revenge porn,” or “nonconsensual porn.” Even if you don’t know someone who’s been targeted personally, you’ve probably heard about celebrity cases, where nude photos of actresses like Emma Watson and Jennifer Lawrence are leaked to the internet. And in the past few years, a lot of states have actually adopted revenge porn laws to try and deal with these kinds of cases.

 

Carrie: We now have 35 states that have non-consensual porn laws. We have a federal bill that’s also being introduced. And then all states have video voyeurism laws, defamation laws, and there’s a growing trend toward e-personation laws.

 

Rose: So revenge porn is something people are dealing with already. And in fact, even today some of those cases involve faked photos. One thing Carrie constantly tries to explain to people, is that it doesn’t actually matter if you’ve never taken a nude photo in your life. You are still vulnerable to this kind of attack. Even without fancy computer generated graphics.

 

Carrie: Whenever I give a talk or come across somebody who doesn’t really understand what I do, they always say something like, oh well, you know, I just tell everyone not to take nude pictures in the first place. And there’s a smugness that goes along with that, because people wrongly think that it could never happen to them, that they can never be the victim of non-consensual pornography if they just don’t take naked pictures. However, I have lots of cases where someone’s head is photoshopped onto a body, or they never consented to the picture in the first place; there was video voyeurism.

 

Rose: And you don’t even need any nude photos for online impersonation to be scary. One of Carrie’s clients had an ex post a fake profile of him on a gay dating site called Grindr.

 

Carrie: And over a thousand people have come to his home and his workplace expecting to have sex with him. And the profiles are very, very crude, saying things like, my client on all fours with his ass lubed waiting to be pounded. And we’re really very, very frustrated by the app’s refusal to intervene, and their claim that they don’t have the technology to forbid people from using their site.

 

Rose: Carrie has also represented people who have had their faces photoshopped onto porn stars’ bodies, or people who’ve had their videos spliced together with a porn star’s video.

 

Carrie: You know, I had an interesting case a while ago where the client resembled a porn star. So the offender had videos of the client, and then interspersed the videos of the client with the porn star engaging in sexual acts. And then sent those around to the client’s friends and family and coworkers.

 

Rose: So this kind of stuff is already happening even without a computer that can generate any video you want. But in this future where you have fully computer generated video, the legal recourse you have as a victim is a little bit different from what victims do now. Right now, the most effective way to get revenge porn taken down is by leveraging this thing called the DMCA.

 

Carrie: It stands for the Digital Millennium Copyright Act, and it’s one of the few laws that is really taken seriously by websites all over the country and internationally. And it relates to situations where somebody’s image is on a website and the website is infringing on the copyright. We use it a lot to get removal of revenge porn. If it’s a selfie, then our client owns the copyright. And so we can notify the website and say that there’s been a copyright violation. And if the website refuses to honor that takedown notice, then we can actually sue them for copyright infringement.

 

Rose: It’s kind of ridiculous and sad, but most sites don’t actually care if someone is posting private photos to get revenge on you. They do, however, care about violating copyright law. But in the future we’re talking about, since the video is being generated by an algorithm, copyright claims won’t work.

 

Carrie: The copyright would probably be owned by the person who created the video. I think that this is a pretty brilliant workaround of the non-consensual porn laws. They’re just not designed for fake images or fake situations.

 

Rose: Instead, people would probably have to turn to libel and defamation laws.

 

Carrie: So let’s say, yeah, I want to use a picture and create a generative video of XYZ giving Donald Trump a blowjob. If that never happened, then I really do think that the defamation laws would apply.

 

Rose: But here’s a key difference. In the United States, defamation cases are civil suits, not criminal suits. Which is a big distinction.

 

Carrie: Defamation is a civil action. So it would be the harmed person versus the offender.
The 35 states that have revenge porn laws, those are criminal actions. And so it’s the state, or, in some cases, if we get the federal law passed, the United States, versus the offender. And it’s up to prosecutors and law enforcement to work through those cases. Defamation is never going to be a criminal matter, so it’s always going to be up to the victim to raise it.

 

Rose: Which means that a defamation case is WAY more expensive for the victim than a criminal revenge porn case.

 

Carrie: Lawyers are expensive, and unless the offender has money, lawyers are not going to take it on contingency.

 

Rose: But regardless of what kind of court the victim can use to seek justice here, the results of revenge porn — computer generated or otherwise — are the same.

 

Carrie: The results would be exactly the same. I mean, a person would be equally humiliated and attacked on the Internet. And if these were to show up on the Internet, they’re just as likely to go viral. They’re just as likely to impact the person’s search engine results. They’re just as likely to be sent around to all of her friends and family in an act of vengeance. Even if it’s not real and they know it, you can’t unsee something. If you see somebody engaging in some sort of sexual act, you might know that they never did that, but you still have that image in your head.

 

Rose: One of the things I’m really interested in, in this future, is how it syncs up with the way that our online identities have changed over time. It used to be that having a profile online was kind of a fun thing to do, but it wasn’t crucial to work or relationships. Today, in a lot of industries, you kind of have to have an online profile of some kind, whether it’s LinkedIn or Twitter or whatever it is.

 

In fact, there is currently a case in front of the Supreme Court here in the United States that considers whether or not access to Twitter and Facebook is a constitutional right. The case looks at a law in North Carolina that bars sex offenders from accessing websites like Facebook and YouTube. The lawyers involved argue that not allowing someone access to those sites violates their constitutional rights. During oral arguments, the justices pointed out that not having access to a site like LinkedIn has serious consequences for people, and that without access to Twitter, citizens miss out on statements by their elected representatives. Being a part of these online communities isn’t just a fun luxury for people anymore; it’s how they get jobs and communicate with potential employers. It’s how they meet new people, talk to their representatives, and hear about local news.

 

Carrie: I talk about that all the time, about the value, the monetary value, of our online identities. And there really is no line of demarcation between our online selves and our selves. I mean, I would never hire somebody or go on a date with somebody if I didn’t look them up online. I’m renting out my apartment right now, and I won’t even let anybody look at it until I’ve done a Google search on them. So it’s basically the big validator: if you meet anybody, they’re going to look you up online. If there are fake videos of you killing puppies, then even if somebody is like, oh, that’s clearly fake, they at least know, well, this is somebody who is a high-drama person, because there is somebody who hates her enough to make this video.

 

Rose: So for a lot of people, online identity is incredibly valuable, I would argue more valuable than it ever has been. So, in this future, where anybody can make a video of anybody else doing anything they want, what does that do to the value of that identity?

 

Jenna Wortham: I think online identities have become this really interesting form of currency. I mean, it’s the way you communicate your worth. It’s the way you communicate who you are and what you have to offer.

 

Rose: This is Jenna Wortham, a staff writer for The New York Times Magazine and co-host of an amazing podcast called Still Processing. And Jenna is probably the best writer and thinker I know about our online lives.

 

Jenna: But I think we’ve gotten to the point where we use all of the different facets of the way we present ourselves online to relay what’s interesting about us as people, and our aesthetic, and why you would want to get to know us, or date us, or hire us. So we’ve come to use them as calling cards, which is a really basic way of just saying that they already exist as avatars for who we think we are or how we want to market ourselves.

 

Rose: And in some ways, the ability to create these videos might play into the facade people are putting up. If you want to be the kind of person who does yoga in Bali, you can simply make a video of yourself doing yoga in Bali. And we might not know it’s not real.

 

Jenna: It would definitely go both ways. But I also think, too, that that type of technology has the power to really expose our notion of authenticity anyway. The truth is that whenever you’re making an image of yourself, or you’re presenting something about yourself and putting it online, it’s still mediated, it’s still being filtered through a lens. It’s never really authentic.

I think we’ve already seen some of that happen, when there have been those CGI apps that have come out where people can use images of their face and then put themselves into little cartoon avatars that dance and do funny things. I think people like to push the boundaries of what they can do or how they imagine themselves. I was thinking about it just in terms of entertainment. How fun would it be? I’ve always wanted to go surfing, so what does it look like if I made a video of myself surfing? I want one of those cool GoPro-on-a-surfboard type videos of myself popping up and surfing on a cool wave in Hawaii. I’ve always wanted that. So I was thinking it could be really fun; we would see this whole new genre of fan fiction emerge. And I got really excited about the possibilities and creativity emerging.

 

But it is interesting: how will we delineate between what’s real and what’s fake, and what happens to social media when you can’t always tell what’s fantasy and what’s reality? To me it feels very exciting, because we already are blurring that line anyway. So to see it actually happen in reality would be really interesting, and really disruptive in a way that I would find so thrilling.

 

Rose: So what happens in this world where anybody can totally pervert your online identity just like that?

 

Jenna: When you first sent me the info about this episode that you were doing, the first thing I thought was, finally, we’ll start to really value anonymity again. Because I think for a little while there has been this idea that if you aren’t online then you must be hiding something. And then maybe it will get to the point where the only way to be functional is to hide everything, which actually really excites me. Because I think we’re at this really dangerous intersection, where companies like Facebook and Twitter and LinkedIn have convinced all of us that in order to be seen as successful and viable you have to have a presence on their platforms. I mean, it’s really unusual to not have a LinkedIn, or not even have a Facebook, and still be seen as fluent, or in possession of the skills that are necessary to be functional in today’s world. Which is really clever; it’s the ultimate mind trick. In a lot of ways it’s true, but it’s also completely not true. And so there is something to me really interesting about getting to a point where we don’t have to have an online presence to be seen as functional or successful or popular. I find that to be really tantalizing.

 

Rose: There’s this pushback happening online right now, where people are starting to draw back, to pull away and try to get some of their privacy back.

 

Jenna: There is this thing happening right now where opacity has become very much a word of the moment, and there are a lot of discussions happening around what it means to be opaque, which I think is a stand-in for mystery. Mystery is very highly fetishized, and extremely valuable, I think. We’re interested in what we don’t know, and there’s so much about our social media climate that really pushes people to share everything, so we know so much about people. So the more removed people can be, the more interesting it is. I’ve been seeing this word opacity emerge as the polar opposite of transparency, and as a highly prized quality.

 

Rose: And there are real world equivalents to this too. In our episode about facial recognition we talked about a future where everybody wears bold sparkly makeup to fool facial recognition algorithms.

 

Jenna: So, you know, I’m in L.A. for work and it’s really bright here. The sun is very, very, very bright. And there was this woman walking around at this book fair. She had a hoodie that was unzipped, but you could see that there were these two big protective panels on the side of it. So basically she zips the whole thing up and it covers her face entirely, but she can still see out. And I was thinking, that’s a really cool thing. That’s smart, because it’s so bright here, and with the light reflecting off the smog, you can protect your eyes and your skin and still see the world. But then I was also like, that’s so wild, because you’re basically just in disguise all the time. I was watching this woman walk around the book fair and just thinking to myself, I wonder if that will be the trend, where we have clothing or fashion or outerwear that evolves with designs so that you can’t be observed and you can’t be tracked, because that’s also what it does. That’s working on a lot of levels, and it’s kind of ingenious, and it looks amazing. I was very in awe.

 

Rose: So in response to some of this stuff, you might wind up with people totally expunging themselves from the internet entirely.

 

Jenna: Maybe instead of Inbox Zero, it’s Digital Footprint Zero. There’s not enough information about you online to make one of these videos, and then that becomes the thing that’s highly, highly, highly prioritized. It would be impossible for this person to ever even have this kind of video, because there isn’t enough data on them online.

 

Rose: I’ve been trying to think of what I would do, if someone decided to make a series of terrible videos of me. I could go with this privacy option, and just totally erase everything about me from the internet. But… that would include this podcast. And I like doing this podcast! So I think the thing I would actually do, would be to use whatever generative video system is out there and make a TON of weird videos of myself. Just totally flood the internet with videos of me doing all sorts of weird embarrassing stuff. Videos of me playing an iguana as a violin! Videos of me underwater with a school of fish that are actually tiny naked Idris Elbas. And also maybe make a bunch of bad stuff, because I can’t possibly have a sex tape with EVERY celebrity, right? But if they all existed on the internet, and I just bombard the world with tons of stuff that couldn’t possibly be real, but has my face on it, then maybe people are already primed to think “oh that’s probably one of those weird fake videos of Rose.” I have no idea if this would work, but I’d probably try it!  

 

Jenna: Just the amount of confusion, it would just be so exhausting, it would be so exhausting.

 

Rose: But also maybe kind of liberating because you’re like well, I guess I don’t have to worry about that anymore.

 

Jenna: You’re right. It’s true. You’re going in for a job interview and they’re like, we’ve found these videos of you online and you’re like, oh you know how it goes though. Who knows when those were made. You do have the ultimate trapdoor excuse for anything wild about you that gets posted online; it’s all fake nudes, it’s all fake persona.

 

Rose:  What would you make a video of yourself doing if you had this?

 

Jenna: Oh my God. I would do… I would… oh my gosh. I would probably make a really cool video of myself onstage at a concert. Nothing that wild. Or surfing or skydiving. I would just want to see myself in really strange and new landscapes. I think I would get a real kick out of that. Putting myself in really bizarre and unusual places. I don’t think I would use it at all to try to create an image of myself that wasn’t grounded in reality. I think it would just be really fun, just from a personal or an artistic standpoint. Just to be like, oh what would it be like if I scaled a building or broke into a bank or… I don’t know, I would just want to have a lot of fun with it. And I would probably make a YouTube video series of just crazy Jenna adventures for entertainment and delight and just totally go HAM. It would be so fun.

 

Rose: I was thinking about this… so, I am afraid of birds. And I think I would try to use this to get myself to not be afraid of birds. If I made all these videos of me hanging out with birds and everything’s totally fine… It does feel like it could be super relaxing. Here’s a video of me, just chill, not freaking out about something.

 

And that’s how you’ll know a video of me is fake, everybody. If you see a video of me just like, relaxing. Totally stress free. Just… hanging out. Just so you know, that is definitely fake.

 

[Music up]

 

Rose: That’s all for this episode! Head to flashforwardpod.com to see links to stuff we talked about, and examples of some computer generated videos.

 

Flash Forward is produced by me, Rose Eveleth. The intro music is by Asura and the outtro music is by Hussalonia. Special thanks this week to Wendy Hari, Jacki Sojico, and Dan Tannenbaum. The episode art is by Matt Lubchansky.

 

If you want to suggest a future we should take on, send us a note on Twitter, Facebook or by email at info@flashforwardpod.com. We love hearing your ideas! And if you think you’ve spotted one of the little references I’ve hidden in the episode, email us there too. If you’re right, I’ll send you something cool.

 

And if you want to support the show, there are a few ways you can do that too! We have a Patreon page, where you can donate to the show. But if that’s not in the cards for you, you can head to iTunes and leave us a nice review or just tell your friends about us. Those things really do help.

 

See you in the future!


Comments

Martin May July 11, 2017 at 11:54 am

One aspect you didn’t consider was the opportunity to re-create movies with the originally intended cast, or to replace actors who dropped out for whatever reason. You could even re-cast an entire movie!

Disa J Marnesdottr October 9, 2018 at 1:30 am

I know I’m late to the game, here, but I wanted to second that. Another use might be continuing series that have had gaps in the real world but not in the series setting. Young onscreen forever. 🙂

Christopher Sunseri July 16, 2017 at 7:04 am

I really enjoyed your podcast. Thanks

