07. THE BEGINNING

 


15% of all human consciousness that has ever existed is alive and present today: as we speed up and grow ever more connected via the Internet, what are the implications of this massive digital shift?

In this episode, renowned Harvard Law professor Lawrence Lessig discusses the present state of the digital landscape and its impact on notions of freedom and democracy. For good or ill, how will this new era play out in the years to come?

The last episode in Season One looks resolutely into the near future.

 

For this final episode of Season 1 of The Life Cycle, the starting point is the very recent past. The story of Cambridge Analytica is a good place to start when looking at how Big Data is being used for political ends: https://www.theguardian.com/news/series/cambridge-analytica-files


On this point, and on how our lives online are feeding into new paradigms of control and government more broadly, the term that has been coined is 'surveillance capitalism', the title of a landmark book by Shoshana Zuboff. Listen to her on this British podcast to find out more:

https://www.talkingpoliticspodcast.com/blog/2019/144-the-nightmare-of-surveillance-capitalism  


And of course make sure to check out Prof Lessig's website, which has links to lots of material, including his latest publication: https://www.lessig.org/about/

 
 
 

Featuring Prof. Lawrence Lessig: Harvard Law professor, founder of Creative Commons, and author.


 
 
 

07. THE BEGINNING

JOHN HOLTEN: Hey Eva.

EVA KELLEY: Hi John!

JH: How did you sleep last night?

EK: Good! I’m sleeping well. How about you?

JH: I’m also sleeping well. Although I did wake up a bit sad as we’re coming to the end of the season. It’s been quite a ride.

EK: It really has. Today we’re going back to the beginning, actually. Today’s episode features Lawrence Lessig.

JH: Lawrence Lessig, where to start? Lawrence Lessig is a professor of law at Harvard Law School, former director of the Edmond J. Safra Center for Ethics at Harvard University, and he was also a candidate for the Democratic Party’s nomination for president of the United States in the 2016 election.

EK: Oh my god, really?

JH: Yeah, he was a single-issue candidate.

EK: We are super excited to have him advise Klang on the political structure of Seed, the video game...

JH: Computer game.

EK: Right. The computer game that is being created in the offices we’re working from. Just to sum up again, Seed is a game taking place on a new planet. Players are sent there in pods after our world on Earth ends. The players are meant to settle there and form communities. Lessig creates frameworks and theoretical scenarios for those same communities. In addition to the positions and titles John just mentioned, he is also a member of the American Academy of Arts and Sciences and the American Philosophical Association. Lawrence Lessig has received numerous awards, including the Free Software Foundation’s Freedom Award and the Fastcase 50 Award, and has been named one of Scientific American’s Top 50 Visionaries.

JH: He is something of a visionary. This episode feels slightly different in that it is almost a parable from the future. But really, it’s just Lessig talking about the here and now.

EK: It was one of those interviews that gave rise to a whole essay in itself.

JH: He speaks in beautiful sentences merging into whole, well-rounded paragraphs. Our conversation with Lawrence Lessig started to overlap with the backstory and lore of SEED, so we have filled it out and now present it to you, to round out the first season of The Life Cycle.

EK: So this is the last episode of Season ONE of The Life Cycle.

JOHN and EVA: The Beginning!

LAWRENCE LESSIG: Sure, so I’m the oldest person in the organization by far. Or actually not that old, but still, I first came to Berlin in 1982. And have been coming back to Berlin regularly since 1982. I spent a year here in 1999 at the “Wissenschaftskolleg,” and then a year at the American Academy in 2006. My wife is German and we want to spend as much time as we can in Berlin. So it was a happy coincidence when I met Mundi in Iceland that his incredible company was happening in Berlin. It’s not just because of Berlin that I became involved with Klang, but it’s really great that Klang is part of Berlin.

JH: What you’ve just heard is the very first recording for The Life Cycle podcast and was actually made a year before our first release. It was a time when we were really just starting out, when the podcast was just an idea and it was not yet clear how we would go about making a podcast about the future. It was a warm and sunny day, one of those beautiful Berlin summer days when the whole city is buzzing, and alive, and full of the promise of fun. But the promise of fun somehow quickly dried up in the room. We were recording in the flat of our main man David Magnusson, our sound engineer and editor – we had yet to set up a studio in the Klang office – and the tone somehow shifted, somewhat unexpectedly. I got the sense that Lessig was tired, fed up, and somehow maybe not in the mood. Maybe it was the summer heat or the last-minute grind of edits on his latest book, or perhaps it was my bumbling, inexpert line of questioning. I wasn’t sure if he, in fact, wanted to talk at all. Or perhaps it was just the exhaustion of the present moment, the current state of affairs in politics, in society, and where we might have gotten to in this moment of history. But keep talking he did.

LL: I wrote that essay at the moment when it was revealed the United States was separating children from parents at the border, and after that article, it was revealed that they had basically lost the connection between the parents and the children. So there are children who can’t find their parents, and parents who can’t find their children. At that moment, a general outrage was beginning in America at just what this meant about who we were.

So I’ve been forever thinking about this question of how we understand who we are and what we are responsible for. I never expected in my life we’d get to a moment as poignant and pronounced as it is in the United States right now. Many people overlook the similarities in the evolution of the countries, Germany and the United States – Germany in the 1930s and the United States right now – because they think of the endpoint of the German evolution, which was Hitler. And they think, “Well, we are never going to have Hitler.” That’s true, we are never going to have Hitler. But you can do a lot of really evil things long before Hitler, and what terrifies me right now is to see the acceptance of some of the outrageousness of what’s happening, the unwillingness to stand up for a country over a party, and to do something to stop it. It doesn’t require sacrificing your life. Republican members of congress who are empowering and enabling this president wouldn’t lose anything other than maybe their position in congress if they did the right thing. So if you’re not willing to take even that small step in the face of what’s threatened, it’s hard to be hopeful about what the future is going to hold for us.

EK: Why have a podcast about the future at all? What is it exactly we’re trying to find out? Where do we expect to go on such a journey? Because in actual fact, to talk about the future, one has to look at and understand the present. That’s what we have done to date on The Life Cycle. From the idea of life extension and the very real sense of apocalypse in everyday life to notions of brain interfacing or brain implants, these things are happening already today. They are thought about, and studied, and advanced. What connects them? What wider meaning, if any, can be seen between them? What are the links that we can draw that unite them? The challenge is to investigate the present, because the future is already here among us. The story of SEED could have its beginning here, in the interconnected work and research into the human mind, consciousness, artificial general intelligence, and the use of data to map the terrain of humanity in all its myriad glory.

JH: The first years of the 21st century have seen the dawn of a great age of disruption and innovation, the disruptor in chief being the internet. Those in the West have adjusted to an internet enjoying a certain amount of apparent autonomy, those in other countries less so. Slowly it has crept up to channel almost all aspects of life, from communication to social relations, entertainment, romance, and politics. People buy life insurance via the internet. They find their future husbands and wives on the internet. They find whichever particular belief they might have backed up and supported on the internet. At first declaring their intention to document and organize all of human knowledge, and not be evil while doing so, big tech ended up manning the access gates. All of us search the network and are in turn searched ourselves. And stating their goal as just simply to connect us, the social networks found themselves in possession of our deepest secrets and subconscious biases. The network, interconnected webs of data, the residue and fingerprints of the internet’s users themselves, grows ever bigger. The network has become everything, and in so doing, so many of us take it and its independence wholly for granted.

LL: The important thing about network neutrality is to recognize its relationship to the original architecture of the internet. The way the internet is originally architected, nobody has the power to control who gets to see what. The network doesn’t place any or much intelligence in the network itself. All the intelligence is at the edge. So that means that if you want to offer content or applications, you can put it up on the internet, and there is nobody on the network who gets to decide whether other people get to see it. The internet, as it originally is architected, is the ultimate neutral network. And what wins is just what is popular. From the very beginning, there have been people who have resisted this architectural design. For example, telephone companies who used to sell long-distance telephone service. When internet innovators developed voice-over IP, they had developed an application that radically undermined the capacity of the telephone companies to earn money from long-distance phone calls. And those telephone companies didn’t like that. What they wanted to do was be able to say, no you can’t do voice-over IP across our wires. But in the United States, the government was pretty clear from the very beginning that you can’t exercise that power to block certain applications because we want the internet to encourage all sorts of innovation and creativity on top of the network. Same thing with cable television and YouTube or Netflix.

Right now, in the United States, most broadband comes across cable television wires. If the owners of cable television were able to say, you can’t watch Netflix using your internet service or you can’t watch YouTube, and if you want to watch films then buy them from HBO – obviously, that would radically undermine the capacity of the internet to create innovation around video services. The principle of neutrality said that the network owner, the people who own the wires, couldn’t exercise that discrimination over applications or content. So in the United States, when it was pretty clear we weren’t going to have competition among providers of wires – right now, basically, cable is the most dominant provider, and most jurisdictions have only one provider – then people trying to preserve the competitive character, the open network of the internet, adopted a different regulatory response. Which was to say, ok, if you’re a network provider of the internet, you can’t discriminate on the basis of applications and content that go across your network. The reason you want to discriminate would be to favor certain kinds of content that benefit you or favor certain applications that benefit you. But we don’t want the network owners to exercise that power, because that reduces the opportunity of people to innovate. The people who would pay would be the incumbents, and the next generation of the innovators, the next YouTubes, the next Twitters, would not be in the same kind of position to be able to do that.

The objective of network neutrality was to preserve an open network for innovation and creativity, so that everybody is in the same position for innovation and creativity, whether you come from AT&T or you come from Tanzania. If you can preserve that value of open innovation, then you encourage a wide range of innovation that otherwise would be chilled. Because they know: why would I build for this network if I need permission to be able to deploy it? That’s the value network neutrality is trying to protect.

EK: In the 20th century, mass surveillance of citizenry was commonplace and well documented. Each generation moved through different technological means of being snooped upon for various political subversions. In the beginning of the 21st century, a generation came of age in the West who thought they were free of such ideological oversight. That it was the preserve of other cultures, or that, indeed, it didn’t matter as everything on the network was inherently private. People would say, I have nothing to hide. The network carried the social, the romantic, the intimate; and these domains, it was thought, had nothing to do with any government, any power, any company – until the time came when they did.

LL: It is really striking and completely unrecognized how radically different the world is in just twenty years. Twenty years ago, I published my first book, Code and Other Laws of Cyberspace. It was reviewed by David Pogue in the New York Times and he called me a »digital Cassandra«. He said I didn’t have any real evidence there was going to be this internet of surveillance, constant monitoring, and the end of privacy. It seemed to me when I wrote the book it was kind of obvious that this is the way things are going to go. I’m sorry that in fact that’s the way things have developed. Because twenty years ago you could still get around without your life being monitored, you could do all sorts of things with the presumption that what you were doing was not surveilled by others in any form. Today, it’s almost impossible to live a »normal life« outside the network of surveillance. You have a telephone, you have the internet, you are going down streets, and you have cameras capturing everything you do. In London, your car is constantly monitored based on license plates. All these technologies have taken us from a world where surveillance was the exception to a world where surveillance is the norm. What we haven’t developed is a robust sense of what privacy is or should be in a world where surveillance is the norm. What we have done is we have taken all the ideas of privacy from the world where surveillance was the exception, and we have just tweaked them a little bit for this new world.

But I think that there is something much more fundamental that has happened, and what is threatening about it is that the business models for data are so overwhelming and irresistible. Twenty-five years ago, the United States basically made a decision that the internet could become commercialized, and therefore an ad-driven model for the internet became the presumption. The ad-driven model for the internet has been an incredibly radical and important change in our society. Because what an “ad-driven model” for accessing information means is that we are constantly being surveilled and profiled for the purpose of being able to sell ads better. All sorts of applications are solely for the purpose of figuring out how to sell you ads better – that’s called Facebook. Facebook is a technology for the purpose of being smarter at selling you ads than Google is.

I don’t know if they are smarter. I think all of those guys initially – the Google boys and Zuckerberg – developed their technology originally not really understanding. But Zuckerberg was quicker than Google was in recognizing this was going to be an extraordinary engine for selling data, which means basically selling advertisement information for people who want to sell products. But it drives so much of the behavior. So the fake news dynamic is driven by the fact that there is a very high return coming from enabling people to spread all sorts of crazy stuff, and the more you learn about them as they spread their crazy stuff, the more sophisticated you are in being able to sell information about them to advertisers. That was a decision that was made, and if it had gone the other way, it would have built a completely different internet. I’m not sure it would have been better – I’m sure in some ways it would have been better, in some ways worse. But what that reveals is that we are not very good at thinking about the consequence of these decisions very far down the road. I know from my own experience that whenever you paint too far down the road, people very quickly glaze over and don’t really want to engage. The problem is that it may just be too late to do anything about it. How could you go back on an ad-driven network? What would you do to make the internet no longer ad-driven? And what have we given up now that we have walked down the rabbit hole of perpetual data to sell us more stuff?

JH: As the years go by the network grows, the disruption more flagrant. Life, private life, crashes into the screen, and the real and the virtual, the personal and the digital, collapse together and become one. The surveillance economy uses our thoughts and impulses as an asset to profit from. Data-driven systems offer the illusion of agency, but in fact their predictive models make free decision-making more and more difficult, render pure freedom ephemeral, and transform coercion into a pleasurable dopamine hit. Politics becomes more fractured, flamboyant in the shoddiness of the politicians, the brazenness of their lies and short-termism. Manipulators sit waiting to pull strings, to steer history to their own ends. The data fields are ready to be harvested.

LL: With ad-driven networks, the natural model is going to be to micro-target down to the person, down to the time of the day of the person. With AI-driven targeting, that’s going to be completely possible, especially in the context of political advertisement.

There is the presumption of what politics was. The politician just stands up in the public sphere and says, this is what I think, and the other side says, this is what I think, and the public decides. But now, the politician has the capability to say a million different things to a million different people. To micro-target messages to a million different people, and nobody even knows he is saying different things to you than he is saying to me. This is the filter bubble brought to politics, and we have no structure for fleshing it out, because we don’t even have a way to automatically know what the politicians are saying to all these different people. The people who don’t see much to worry about say, well, you know, politicians could always do that. They could sit in a meeting with you and say, I support the bridge, and then sit in a meeting with him and say, I don’t support the bridge. But I think what that under-appreciates is how consequential it is to be able to drive that with AI technologies in a massive way across a whole population.

The American prosecution of corporations has changed dramatically over the last twenty years. It used to be when you prosecuted corporations, you’d prosecute the individuals in the corporation. You would send them to jail. Today, that isn’t done anymore. Today, it’s basically: the corporation as a whole is prosecuted, it has to pay a fine, nobody has to admit guilt, and nobody goes to jail. A corporation is considered an individual, so that’s why you can do it. But it’s a pretty dramatic change in the effect of the prosecution. Why would that happen? And the answer is because the salary for being a federal prosecutor has not kept up with the salary of being a private attorney. It’s harder and harder for people to stay as a prosecutor for their whole career. So what they need to do is they need to leave prosecution and go work in the private sector. The best way for them to leave prosecution and go work in the private sector is to have good friends, people who like them or trust them in the private sector. So what they want to do is to signal to these law firms that they are a »good prosecutor«, namely a prosecutor who is not going to screw their clients as harshly as they would if they were a »bad prosecutor«. So this fact that we don’t pay prosecutors, or that they aren’t paid as much as regular attorneys, has this effect of shifting the way we prosecute, away from an effective prosecution that would actually force corporations to obey the law to this wildly ineffective kind of prosecution where corporations see law-breaking as just a cost of doing business. And they just pay that price, and they move on.

Nobody ever intended that. Nobody ever planned it. It wasn’t a product of some evil genius deciding, here is how I’m going to undermine effective prosecution. But it is nonetheless a corruption of the institution of law enforcement, and in America it is partly what explains the crash of 2008. It is the reason nobody inside any of those institutions feared anything except that the money would stop flowing in at a certain moment.

EK: Some of the worst aspects of humanity have been hardwired into the computers it has created. Artificial intelligence is more and more an outgrowth of humanity, seeding new ways for humanity to see itself and, at the same time, to become unknowingly beholden to the tendencies it unleashes. Because it is created by humans, it is inherently flawed. The network, it turns out, could be incredibly, irredeemably biased.

LL: AI in the context of targeting for advertising raises really difficult questions if the AI is able to produce infinitely differently targeted kinds of politics. What we saw in 2016 in America was that AI was surfacing characteristics of Americans we shouldn’t be proud of. It was doing it as a function of the efficient way to advertise to Americans. I was on a panel with one of the Cambridge Analytica senior people a week after it was revealed that Facebook had been selling ads to »Jew haters«. Of course, the category »Jew haters« had never been crafted by any engineer at Facebook. It was an AI-created category. This guy from Cambridge Analytica who had been in the Trump campaign said, “Oh yeah, we always had to monitor for that kind of stuff.” Because you always would produce these outrageous things you had to turn off or tune down.

The AI in some sense is revealing the dark underbelly of society. Again, it raises the question of: what kind of politics should we be encouraging? The kind feeding off private prejudice and hatreds, or one channeled in a place where those prejudices and hatreds just aren’t permitted, where they wouldn’t be able to operate in the open? Right now, they are going to do whatever is most efficient and most effective, and that turns out to be looking deeply into who we are and exploiting that.

JH: The dissemination and collection of information morphs and changes. Journalism has changed. No longer is it possible to say who should be the gatekeepers to any one doctrine. Everything shifts. The perspective and verifiable truth of any one statement becomes almost impossible to determine. People become trolls. People find parts of themselves on the network that they never knew existed. Society finds part of itself it never knew existed. Nobody is present to put a check on it. Emotions are increasingly deployed at the expense of truth.

LL: Yes, I think that is tied largely to the money. But then, how does that relate to the internet? The internet is Janus-faced. There is one part of the internet that absolutely enables all sorts of people to do things they could never have imagined doing twenty years before. You can rally a whole bunch of people in the context of the internet to do things you could never have afforded to do before. But on the other hand, the internet also enables money to have a much more effective power than it had before. So it’s an arms race of what you might think of as good against what you might think of as not good.

What we discovered about the internet, which none of us writing about this twenty years ago were really focussed on, was the consequence of a news environment where there were no effective editors. In the old days, we used to say the internet was going to eliminate the censors. But what that means is that the internet was eliminating the editors – not technically, but economically: the economic freedom to make a judgment about what you’re going to cover versus what you’re not going to cover. In the fall of 2015, for reasons we can leave off the table for now, I was spending a lot of time in the studios of cable television news networks. They were whining all the time about how much Donald Trump was on television. Donald Trump was on television 24/7. I asked, how can you whine about this? You’re putting him on television, you’re the ones doing it. And they said, oh, we have no choice. Because if we don’t put him on our station, people just turn to the next station and watch him on the next station. They were economically incapable of editing, of making a judgment about whether Donald Trump should have been the sort of person to be on television. You know, in 1976, Donald Trump could never have been a candidate for president. Because the New York Times, the Wall Street Journal, ABC News, and CBS News would have said, this guy is a crazy man, we are never going to cover him. In some ways you could say it’s bad they could exercise all that power, but the flip side of it is to look at the world we have right now, where you have these incredible authoritarian populists who can leverage the editor-free platform to spread their message of hate or ignorance in a way that really has resonance, that before would never have had this kind of resonance.

This struggle is not going to go away, and we are not going to reestablish editors in the context of the internet. That requires the development of what you might think of as a different kind of populism, a kind of populism that recognizes the way the architecture works and can leverage it to ends that are not this hate-filled separation – the fifty-year-old white male’s perspective on the world – but instead a perspective more like what we thought the world was ten years ago.

EK: The story of the last number of decades can be seen as a humanist-centered liberal story, a story of progress, of the triumph of liberal democracy and technological efficiency. It was very persuasive. A simple story, as it were; it would sometimes look at the world around us humans, the ecological world, the animal world, but it never really strayed too far from the main headline: humanity was advancing exponentially. Since 2016, it has been collapsing, splintering apart, morphing into something we have no name for. Out of this debris, a new story will emerge. The angel of history will turn to us from the future and cry out our name. Destiny will take hold of us.

LL: We have to recognize that the idea of democracy, which pretty much is taken for granted around the world right now, is under extraordinary attack. Because increasingly, everybody is sceptical of democracy. Even the people are sceptical of democracy. Most people look at democracy in developed nations and think it’s really a trick the elites have deployed to make the government help the elites, not about helping the people. And then, when they have that feeling, they react in the populist mode, electing people like Trump or any number of examples around the world. The real challenge is to find a way to recover the integrity of, confidence in, and affection for democracy. Because it’s really an exhausted idea right now. It’s terrifying because there are alternatives that are really being pushed by people of power. I’ve been at dinners of billionaires in the United States, and they sit around talking about how stupid the people are, and what we have to do is become more like the Chinese, or have a technocratic elitist structure of governance. Even inside of democracies you have the people who look at what it has produced, and they say, this is terrible, this is why I’m going to support this authoritarian populism.

It’s a really urgent need to find a way to recreate a system that people feel has integrity, and is actually representing the people. In the United States, this is the critical fight. I don’t think the critical fight is defeating Donald Trump. The man is obviously pathological in all sorts of ways, but that’s not the problem. The problem is the reality that created Donald Trump. The problem is the democracy that felt like he was the solution to something. And to figure out how to solve the problems which led the democracy to make that choice. That is a much harder problem to solve than the problem of defeating Donald Trump. Politicians know how to defeat Donald Trump. But defeating him alone will not get us away from a democracy that is eager to elect people like Trump.

That problem is for me the hardest problem right now. I’m working in the United States to try to help build movements that can work on the really constitutional politics, like how do you restructure something that feels like a democracy with integrity. But I don’t think anybody has the solution around the world right now.

JH: We need new solutions then for a future forming around us quicker than any one of us can realize. We feel the need to reach toward collective efforts at organization and understanding. Big data and social media, which we all feed, promised freedom and knowledge. But the challenge became not to relinquish freedom and knowledge at the same time as they were being created. Digital connection should not simply mean that someone else or some corporation is going to make money from these connections.

EK: Everything is to be played for. Because every time we pick up our phones or open our laptops, we’re contributing to a vast system that extends out across the world. The beginning starts here, in our every individual social interaction. And from interpersonal interaction it moves out into our interaction with the world of things, automatic, connected, and at times natural. We talk about the weather, not because it cuts us off from the natural cycle of the world. As the artist Roni Horn put it: “Weather is the key paradox of our time. Weather that is nice is often weather that is wrong.” The nice is occurring in the immediate and individual, and the wrong is occurring system-wide. We can only sip our margaritas as the swimming pool dries up.

JH: Over the course of recording these episodes, we’ve come to see that in the not too distant future, there will not only be metaphorical vampires feeding off every aspect of our daily existence and experience, our moments of self-expression even, but real vampiric figures that could extend their lives indefinitely through the ability to explore new uncharted technological domains.

EK: We have seen how we will interact with each other. The modes of communication are themselves coming online. In turn, we’re building robots we must learn to talk to and, in turn, to listen to. All of this comes back to the solutions needed in the face of ongoing political, societal, and technological challenges. All of these things revolve around each other. The path of this journey into the future is what we’ve been looking at on The Life Cycle. The future lies up ahead. In the swirl and mix of shimmering data sets and political agency lies one possible starting point of SEED.

Everything is to be played for. The future is ours. The beginning starts here. The network, automation, artificial intelligence, the internet of things, big data, cloning, robot/human interaction, climate change, the political organization of our very communities and societies. This is the story we tell our children. Together, this growing matrix is one possible starting point of SEED.

EK: Time for credits! What an ending to the season. Thank you all so much for listening.

JH: What a journey we’ve all been on. Do like and subscribe. If you like what you’ve just heard, or any of the episodes, or all of the episodes, just tell people – that’s how these podcast things work: spread the word.

EK: Please email us at contact@thelifecyclepodcast.com with any thoughts, requests, anything. We read every email. Promise.

The Life Cycle podcast is produced by Klang, and written, hosted, and produced by John Holten and me, Eva Kelley. This episode has also been produced by our main man David Magnusson, who did the mix and sound engineering.

JH: Special thanks to David. He has been a great co-producer and sound engineer, making the sound all good, and nice, and proper. Special thanks to Professor Lawrence Lessig, of course, as well as to Jonathan Baker, who is in charge of PR and strategy for the podcast.

EK: Also thanks to Samet Kuru who made our website and Svenni Davíðsson for additional visual identity.

JH: We would also like to thank our executive producer Mundi Vondi for giving us the opportunity to do this. This episode was recorded at Klang headquarters in Berlin-Kreuzberg, and of course in the home of David Magnusson.

EK: We’ll be back real soon with the next season, so watch this space.