Shoshana Zuboff: Surveillance capitalism and democracy

Captions
Ladies and gentlemen, welcome to our lecture series Making Sense of the Digital Society. I'm glad to see that you've turned out in such big numbers and have come to our event tonight. We decided to continue this lecture series next year, and I can tell you we look forward to seeing you again then. Since December 2017, in cooperation with the Humboldt Institute for Internet and Society, we've been examining how digital technologies are transforming our society and what a European perspective on those transformation processes could look like. We've had several lectures in the series, with speakers like Manuel Castells, who talked about power in the digital society; Eva Illouz, who talked about capitalist subjectivity and the internet; and, in the last lecture, Armin Nassehi, who asked: for what problem is digitalization a solution? So after all these lectures I'm very happy to announce that we will continue the lecture series with good speakers next year. I hope you'll keep coming and contribute to exploring societal change, to discussing it and to influencing it with us.

In these days, when we're remembering the peaceful protests in the GDR in 1989 and the fall of the wall, some of the key questions within this lecture series remain of great importance. How does power in the digital society express itself, and how is it distributed? Are we witnessing a revival of democracy through increased transparency and participation, or do we witness its downfall as a result of fragmented publics and the limitation of the private sphere? Thirty years ago, citizens of the GDR fought peacefully to be able to express their opinion freely and to rid themselves of omnipresent surveillance. Today a majority of people is aware of the potential danger of constant surveillance, but they readily surrender to the ecosystems of different digital platforms and thereby become data suppliers and, to put it in an exaggerated way, become the products of a few digital companies. We have to continue to analyze these developments and debate them in society in order to shape digital change together. Against this background it's a great honor for me to welcome Shoshana Zuboff as a speaker here. She wrote an impressive book, The Age of Surveillance Capitalism, which exposes the threat to autonomy and democracy posed by the monopoly power of the internet corporations. Thank you very much for agreeing to take part in our lecture series. I'd now like to hand over to Christian Katzenbach from the Humboldt Institute, and it remains for me to wish you a fascinating evening. Enjoy. [Applause]

Good evening. It's my pleasure to welcome you all to this special evening, in the name of both the Humboldt Institute for Internet and Society and the German Communication Association. This evening is special for two reasons. It is special, of course, because of our esteemed guest, Shoshana Zuboff, who is without doubt one of the leading critical analysts of our time. In her work she scrutinizes the contemporary transformations of capitalism that profoundly change how we organize and live our social, economic, political and private lives. With that theme and scope she's the perfect speaker and highlight for the lecture series Making Sense of the Digital Society. We started this series almost two years ago with a lecture by Manuel Castells and have since hosted almost a dozen exciting and prolific speakers. The series aims to foreground readings of the emerging digital society that address the big-picture questions but at the same time are based on rigorous empirical research, and you can see from our book that we have done our homework in that respect. These kinds of lectures, as we see today, seem to hit a nerve, given the turnout for each and every one, although today definitely constitutes a new high in that respect. I'm happy and grateful to see this series continue into the next year, its third year. So thank you to the Federal Agency for Civic Education, namely Thomas Krüger and Sascha Shia, for this wonderful cooperation and your support.

But this evening is also special because it constitutes a first in this lecture series: this lecture is not only part of the series, it also kicks off an academic conference that takes place over the next two days in Berlin. So I also have the pleasure of warmly welcoming the participants and speakers of the conference Automating Communication in the Networked Society: Contexts, Consequences and Critique, who are all among the audience, so you all have the chance to conduct probably even more informed conversations over your after-show drinks than usual in this series. This conference is the annual conference of the Digital Communication section of the German Communication Association, and this year it is jointly organised by the Weizenbaum Institute for the Networked Society and the Humboldt Institute. I'm profoundly happy to see this cooperation between the two Berlin-based German internet institutes; we jointly contribute to the city's remarkable, growing research community addressing the digital transformation. For the conference we chose automation as a theme in order to give the current heated debates about AI and algorithms more historical depth and context. And given the tight connection between capitalism and automation, who could be better prepared to kick off this conference than an expert in capitalism and its digital transformation? So, Shoshana, we are all more than happy and honored to have you here in Berlin today. But before you enter the stage I need to hand over to our wonderful moderator, Tobi Müller, who will guide us through the evening, and I promise your patience will be rewarded: Tobi is a true master at properly introducing our speakers. Tobi, the floor is yours. [Applause] [Applause]

Don't get your hopes up too high now. Good evening, everybody — as much as I can see of you; ah, now I can see a little bit better. About a year ago, tonight's much-anticipated guest thought she would be on the road for maybe three or four weeks, as she told me on the phone. It turned out to be much more: it has been eleven months, mostly away from home, due to the great success of her book The Age of Surveillance Capitalism. But you will find the idea of home, as opposed to exile, in her writings too — as a space that is relatively safe from violence, exploitation or powerlessness. She asked that question of home or exile 31 years ago in her first book, In the Age of the Smart Machine: The Future of Work and Power, and today she says that work turned out to be the opening chapter in what became a lifelong quest to answer the question: can the digital future be our home?

As you have heard, tonight's event is also part of the series Making Sense of the Digital Society, and of course we have talked a lot about the impact of digital technologies on society. But we have also talked about a notoriously hard-to-define term: capitalism.
Late capitalism, neoliberal capitalism, platform capitalism — this term is like a giant rock, or several giant rocks actually: rather amorphous, dark, and very hard to move for a train of thought to pass without difficulty, for a conversation to take shape. Tonight is the night where those two come together — the impact of technology, and capitalism — and take center stage, because our guest has worked for several decades on how quite different forms of capitalism define the quality and shape the future consequences of said technological impact. Her increasingly warning voice was also heard in this country and given space accordingly by the late Frank Schirrmacher, co-publisher of the Frankfurter Allgemeine Zeitung and head of its Feuilleton — a mix of French and German for the feature pages, the arts section — which published translations of her essays to convey them with the discursive power Schirrmacher's paper had at the time. Her body of work is impressive, as is her career — she was one of the first women to gain tenure at Harvard Business School — but after eleven months of hearing her CV read out to her every other night, she asked me not to do that and to be a bit less formal. I find that very refreshing. Most of you have probably checked Wikipedia anyway, or, as I hope, have read her book.

Speaking of refreshing websites: we're going to try out something new tonight in terms of audience participation. We're trying to get a little bit more focus and a little bit more diversity in gender and age when we converse here after Shoshana's talk, and it is a tool called Slido — may you please show us the slide — slido.com. You can type in your questions there, and you'll all have the chance to vote those questions up or down, so we can guarantee that other people are also interested in your questions, and those questions will be read out to us by somebody from the Humboldt Institute for Internet and Society. We do have, I think, two microphones here on the floor, but as you can see it will be quite hard to pass a microphone around, and we don't really like to give out the microphone. So if you don't feel comfortable with those devices you do get the chance to ask your question, but that will be a minority tonight — please try to use the tool. It's not part of the surveillance-capitalist complex, I hope; it's actually made for something quite different. And we are very many tonight, so please — time is limited — no co-lectures or co-speeches. That, of course, should apply to me too.

So before our guest takes the stage I will let others speak of her. Here is some praise for the book. The New York Times wrote: "Light on prescriptive notions, Zuboff does propose a right to sanctuary, based on universal, if ever more threatened, humanitarian principles like the right to asylum. But she's after something bigger: providing a scaffolding of critical thinking from which to examine the great crisis of the digital age. Through her we learn that our friends to the north were indeed correct: Facebook is the problem, along with Google, Microsoft, Amazon and others. This is the rare book that we should trust to lead us down the long, hard road of understanding." End quote. The Guardian in London lists the book as one of the hundred most important of the 21st century, and the London-based writer Zadie Smith wrote that Zuboff is concerned with the largest act of capitalist colonization ever attempted — but the colonization is of our minds, our behavior, our free will, our very selves. Yes, it's not an anti-tech book; it's anti unregulated capitalism, red in tooth and claw — it's really this generation's Das Kapital. End quote, Naomi Klein. Last but not least: the hour is late and much has been lost already, but, as we learn in these indispensable pages, there is still hope for emancipation. Klein therefore points to the remedy section, so to speak, which will play a certain role tonight, I think. But you're not going to hear an abstract of The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power; you're actually going to get a glimpse of the ideas of her next book, with Cambridge, which will focus on the epistemic inequality produced by surveillance capitalism and why this is a threat to democracy. I'm extremely pleased she is with us tonight. Please welcome Shoshana Zuboff. [Applause]

Oh my goodness. Thank you so much for that beautiful introduction, Tobi, and your appreciation of that work — I'm so grateful. And thank you, Christian; thank you, Thomas. What a wonderful night this is for me, to be here in Berlin to talk about this work. Tobi mentioned my friend Frank Schirrmacher, and I do want to dedicate my talk tonight to Frank. It was through Frank that I became introduced to Berlin and fell in love with Berlin, and I feel like there's a piece of my heart that lives in this city and always shall. I wish he could be with us tonight, but I'm so excited to finally be here and to be able to share this work with you.

There are a few things that I want to do tonight. I want to talk about some of the ideas that are in the book, but also move that forward — driving forward into implications, driving forward into more careful thinking about remedies, the word I was using with Tobi. What are some of the things that we can start contemplating as the way that we come together to move through and beyond this age of surveillance capitalism? Which, as you know if you've read the book — or at least maybe read the first page and the last page — on the last page I say: the age of surveillance capitalism, may it be a short one. And that, of course, is up to us.

So, funnily enough, Tobi brought up the New York Times, and I'm going to start with the New York Times. This is a piece from the New York Times. You know what the Federal Trade Commission is — that's the agency in the United States that now has most of the jurisdiction over commerce on the internet, so when we think about regulating the surveillance capitalists, we're talking about the Federal Trade Commission. Here's a New York Times reporter describing what he calls an unusually heated debate about privacy, individual rights and law at the Federal Trade Commission. He says that, of course, industry was represented there and civil society was represented there, and the industry executives were arguing, quote, that they were capable of regulating themselves and that government intervention would be costly and — remember this word, because we're going to come back to it later — counterproductive. All right, so that's the executives. The civil libertarians were warning that the companies' data collection and analysis capabilities posed, quote, an unprecedented threat to individual freedom. Then there was another advocate there, someone else from a civil society organization, and this is what he said, quote: we have to decide what human beings are in the electronic age — are we just going to be chattel for commerce? Finally, one of the commissioners asks the following question: where should we draw the line?
Now, all of this sounds familiar to you, doesn't it? A familiar debate, familiar points of view. What's so interesting about this article is that it was published in 1997. So I think we know the outcome of this story — who won the argument. The executives won the argument, and they got their way. They got everything they asked for — in the United States more than in Germany and more than in Europe, but still, relatively speaking, a near absence of law constraining what they wanted to do. Surveillance capitalism is the fruit of this victory, of a battle whose lines were already drawn in 1997, at the dawn of the internet.

So what is surveillance capitalism? It rests on the discovery that private human experience was to be the last virgin wood available for extraction, production, commodification and sales. People — that means us — we did become chattel for commerce. That's exactly what happened. And the results are shaking democracy to its core. They're transforming our daily lives, they're challenging the social contracts that we've inherited from the Enlightenment, and indeed threatening the very viability of human freedom. Under siege though it may be, the only possible remedy for all of this is democracy, and that's why we're here tonight, of course.

I think about it this way. You know the story of Alice in Wonderland, yes? Everybody knows the story of Alice in Wonderland, and you remember the White Rabbit who had the clock and was rushing — "I'm late, I'm late, for a very important date" — and he goes down the rabbit hole. Well, the way I think about it is: two decades ago we were all Alice, and we encountered the White Rabbit, and he was rushing down his hole, and just like Alice we rushed after him. We followed the White Rabbit into Wonderland. What happened in Wonderland? In Wonderland there are various things that we learned, and it took us two decades to learn them.

First of all, we learned that we can search Google. We search Google — but now, two decades later, there is a fragile new awareness dawning, and it's occurring to us that it's not so much that we search Google; it's that Google searches us. In Wonderland we assumed that we use social media, but now we've begun to understand that social media uses us. We thought that these are great free services, while these companies were thinking: these are great people who are free — free raw material for our new operations of analysis, production and sales. We barely questioned why our television sets or our mattresses came with privacy policies, but now we're beginning to understand that privacy policies are actually surveillance policies. We admired the tech giants as innovative companies — innovative companies, by the way, that occasionally made some big mistakes, and those mistakes violated our privacy. The difference now is that we're beginning to understand that those mistakes actually are the innovations. The mistakes are the innovations. In Wonderland we learned to believe that privacy is private. We failed to reckon with the profound distinction between a society that cherishes principles of individual sovereignty and one that lives by the social relations of the one-way mirror. Privacy is not private. Privacy is a collective action problem. Privacy is a political challenge. Privacy is about the kind of society that we live in. Finally, our most dangerous illusion of all: in Wonderland we believed that the internet offered us unprecedented access to knowledge, but in the harsh glare of surveillance capitalism we have come to learn that proprietary knowledge now has unprecedented access to us.

The digital century was to have been democracy's golden age. Instead we enter the third decade of the 21st century marked by an extreme new form of social inequality that threatens to remake society as it unmakes democracy. This new inequality is not based on what we can earn but on what we can learn. It represents a focal shift from ownership of the means of production to ownership of the production of meaning. This is what I call epistemic inequality, defined as unequal access to learning, now imposed by private commercial mechanisms of information capture, production, analysis and sales, and best exemplified by the growing abyss between what we know and what can be known about us. Unequal knowledge about us produces unequal power over us, and so the abyss widens further, marking the distance between what we can do and what can be done to us. These growing asymmetries ensure that epistemic inequality will be a critical social contest of our time. Twentieth-century industrial society was based on the division of labor, and it followed that the struggle for economic equality would shape the politics of that time. Our digital century shifts society's coordinates from a division of labor to a division of learning, and it follows that the struggle over access to knowledge, and the power conferred by such knowledge, will shape the politics of our time. These contests pivot on three essential questions about knowledge, authority and power, and these frame the fight for epistemic rights and epistemic justice. Three questions: Who knows? Who decides who knows? Who decides who decides? The answers to these questions will determine the fate of equality after Wonderland.

All right, let's talk a little bit about surveillance capitalism, because this inequality is forged in the backstage operations of surveillance capitalism — its one-way-mirror operations, engineered for our ignorance, wrapped in a fog of rhetorical misdirection, euphemism and mendacity. Invented at Google at the turn of the digital century, surveillance capitalism begins with the secret theft of private human experience, now declared as free raw material for translation into behavioral data. These flows of behavioral data are conveyed through complex supply chains — devices, apps, third parties — into a new kind of factory: computational factories called artificial intelligence, machine intelligence, where the data are manufactured, as in all factories, into products. But these are specific kinds of computational products: behavioral predictions, predictions of what we will do soon and later.

In case you think I'm exaggerating: there is a leaked Facebook document — and I draw your attention to the word "leaked"; it's crazy how much we have to depend upon leaked documents and whistleblowers to understand what's going on in these backstage operations — a leaked Facebook document that came out about two years ago, and maybe some of you read about it. It's a document about Facebook's computational factory, which they call their, quote, "prediction engine". They describe what happens in this artificial intelligence hub, and they note that their machine intelligence, their AI hub, is now capable of ingesting trillions of data points every day, and that the company is now able to produce six million predictions of human behavior each second. That's what's happening inside the factory. So these predictions are about us, but they're not for us. Where do they go?
They're sold to business customers. It turns out that businesses are very interested in what we're going to do; they're very interested in our futures. So the predictions are sold to business customers in a new kind of market that trades exclusively in human futures — our futures. Just as we have markets that trade in pork belly futures and oil futures, we now have markets that trade in human futures. In other words, surveillance capitalists sell certainty. That means they're competing with each other on the quality of their predictions, and this is a form of trade that has birthed the richest and most powerful companies in history.

All right, so this was invented at Google. The invention process began in 2000, 2001, and we didn't really start to learn anything about it until the company went public, which was 2004, and they had to make public their initial public offering documents. Here's what we learned from those documents, and this is really crazy, so listen to the number I'm about to say: between 2000 and 2004, their revenue line... Now let me underscore something. Why did they invent surveillance capitalism? You remember what 2000 was? I can't see you — ah, now I can see a little bit better. A lot of people in this room don't remember 2000, because you either weren't born or you were too little. 2000 was the time of the dot-com bust. Everybody in Silicon Valley was going broke, and they were all panicked. That's when Google announced a state of emergency; they declared that famous state of exception, where they were going to let go of all of their previously held values and principles, and that's how they invented surveillance capitalism. The point here is that they were in a financial emergency in 2000 because they couldn't figure out a way to monetize, and their own venture capitalists were threatening to pull out. Okay, so that's the background.

Let's get back to the story. Between 2000 and 2004 — and of course these are the years when they invented this new logic and started applying it, so everybody clear on this; now we're finally going to get to the punch line — their revenues increased by three thousand five hundred and ninety percent. That's a very big number. So what is that? This is a startling number, and it represents something that I call the surveillance dividend. That number would not be there were it not for this new logic of surveillance capitalism that I've just described to you: the surveillance dividend. And what did that do? Literally overnight it raised the bar for every investment, first in Silicon Valley, in the tech sector, but eventually, of course, this has had effects through all economic sectors, across our economies. Now imagine you're a venture capitalist, an investor, a Wall Street analyst. You can invest in a company that can increase its revenue in four or five years by three thousand five hundred ninety percent, or you can invest in a company that's going to do innovation the older way, like Henry Ford, and actually invent a product that everybody wants. Which one are you going to invest in? The answer is obvious: the surveillance dividend.

So what do we learn here? The surveillance dividend is the center of this: surveillance capitalism produces the surveillance dividend, which has driven this logic not only through the tech sector but through our economies. Surveillance capitalism is not the same as technology. Surveillance capitalism is not an inevitable consequence of digital technology. Surveillance capitalism is not restricted to technology companies; it redefines businesses in every sector.

So I'm going to tell you a great story about this. It's a story about chasing the surveillance dividend and what's happening inside our economies. And just to make this perfectly symmetrical, let's go back to the beginning of the 20th century and the Ford Motor Company, the birthplace of mass production as we know it. You remember the Model T? Henry Ford, the Model T — the most successful product ever sold, until the iPod. Today we have a Ford Motor Company and a new CEO — not Henry Ford — and this CEO, Jim Hackett, is facing what some of you may know as a global slump in auto sales. Auto sales are down and they're not coming back. What is the CEO of Ford Motor Company to do? Well, if you were Henry Ford you might say: hey, I know, let's invent a car that will actually compel people to buy it — how about a car that's completely affordable and doesn't burn any carbon? That's a good idea. That's not what Ford Motor Company is up to. Mr. Hackett says: I want to attract investment the same way that Facebook and Google do, so what I need to do is find data. Wait a minute, I've got a great idea, he says. There are a hundred million people driving Ford vehicles, so let's stream data from all those people. Then we can combine it with the data we have in the Ford Credit business, where, he says, we already know everything about you. Now we have a data set, we have data flows that are on a par with Google and Facebook — who would not want to invest in us? Chasing the surveillance dividend. No more cars, he says; now we have a transportation operating system. Chasing the surveillance dividend. And here's what a Wall Street analyst says about it: listen, this is a great idea, Ford could make a fortune monetizing these data flows. They won't need engineers, they won't need workers, they won't need factories and they won't need dealers — pure profit. They can make a fortune.

Okay, so you've got the picture. Now we're following the money. Follow the money — that is the whole point here: an economic logic, human-made. Let's follow the money and see where it leads us. Ready? All right. To follow the money, what do we have to do? We have to look at the competitive dynamics inside this kind of marketplace. Remember what kind of marketplace it is: it trades in human futures. So what are the competitive dynamics in this kind of marketplace? I know this is Berlin and you're not used to audience interaction — I'm an American, what can I say; I want to hear from you too. I have to know that you're hearing me, I have to know that you're with me. All right. We said surveillance capitalists sell certainty, so they're competing on their predictions. Let's reverse-engineer these competitive dynamics and see what we find. Number one: everybody knows an AI needs a lot of data, right? Everybody knows that. So the first thing is economies of scale, which drive them toward totalities of information: we need data at scale. Okay, that's an easy one. Competing on scale is good, but not good enough, because eventually they realize: hey, you know what, we need a lot of data, but we also need varieties of data.

I just wanted to mention that I have some very nice bottles of water here, but they're not open, so now I have to do this in front of all of you, and you are going to see how completely hopeless I am. Oh god, I've got to do this. Please, let me do this. Oh, I did it! Well, okay, good. Excuse me one second, I'm going to get a glass.
I guess we ran out of glasses. Okay, thank you. As you can tell, I kind of have a cold, so water is good.

All right. So now we know that we need economies of scale, but we also need varieties — so we need economies of scope, different kinds of data. Now, even though you're not old enough to remember the dot-com bust, many of you are old enough to remember the mobility revolution, right? This is the idea that we give you a little computer, you put it in your pocket and off you go — we'll call it a phone, what the heck — and it will go everywhere with you, and now we can get economies of scope: where you are, what you're talking about, who you're with, what transactions you're making, maybe where you're eating and what you're eating, who you're emailing or texting, what kind of browsing you're doing while you're walking in the park or walking through the city. We can get your voice; we can get all kinds of things now. Oh, and don't forget — what's the most important thing of all that we can get with this new computer? We can get your face. We can get all your faces.

Okay, so we've got economies of scale and economies of scope. Prediction continues to evolve and competition continues to intensify, and pretty soon there's a new realization: the most predictive data comes from intervening — intervening in your behavior, intervening in the state of play, in order to actually nudge, coax, tune, herd your behavior in the direction of the outcomes that we are guaranteeing to our business customers; herding your behavior in the direction of our revenues and ultimately our profits. So this is something new. This isn't just scale and scope, with which we're familiar. This is something new, and it tracks a process that data scientists talk about: the shift from monitoring to actuation. That shift is the point in systems management where you have so much information about a system that the information cascades over a tipping point, and with that cascade you can begin to remotely control the system — you now know so much about it that you can remotely control it. That happens in the management of machine systems. But now the idea is: how do we make this work in the management of human systems? Human systems: monitoring to actuation. So the idea now is, we've got to figure out how to do this — this has never been done before at scale — automate it, at scale. This is what I call economies of action. Economies of scale, economies of scope — familiar. Economies of action: how do we automate the remote control of human behavior at scale?

This is a whole new experimental zone, something that has never been done before. It's hard to learn about it because, as I said at the beginning, these are backstage operations. But it turns out some of these experiments are hiding in plain sight, and we can learn something about them. One of them you probably read about — now I know you're old enough to know about this — the Facebook massive-scale contagion experiments, as they call them. They published one in 2012 and another one in 2014. The first was to see if they could change people's voting behavior — not necessarily who they voted for, but just to get them to go vote rather than not voting at all. The second was to see if they could change people's emotions, make them happier or sadder. When the researchers wrote up these experiments, in both 2012 and 2014, they celebrated two findings. Number one: we now know that we can manipulate subliminal cues and social comparison dynamics on Facebook pages to change real-world behavior and emotion. We know we can do that. Number two: we now know that we can do this while bypassing user awareness. It's undetected; they never know that we're doing it. That's what makes successful economies of action. Why? Because awareness is friction, and friction is expensive. If I know about it, I might refuse; I might look for a way to hide; I might look for a way to camouflage. So awareness is friction; awareness is the enemy. These kinds of systems have to be designed to bypass awareness.

Okay, so much for the contagion experiments. Now we're on to an even more sophisticated zone of experimentation, and this one I am certain you know about. How many people in this room went out in the streets of Berlin and played Pokémon Go with your friends and family? Come on, audience participation — you can be honest, we're all friends here. Oh, don't be shy. I know this isn't true, I know you're not telling the truth. All right. Well, did you know that Pokémon Go came from Google? Did you know? Is that because you read my book? Pokémon Go was incubated in Google. Now, of course, Germany was famous for being the first country to contest Street View, right? And Pokémon Go was invented by the same guy who was the boss of Street View, who was the same guy who invented Google Earth before that — it was called Keyhole, and it was invested in by the CIA before Google bought it. So this is a man, John Hanke, who has a long history of figuring out how to fill the supply chains — how to fill the supply chains on their way to the new factories. John Hanke had a little shop inside Google called Niantic Labs, and that's where they incubated these new augmented-reality games, including Pokémon Go. When they brought it to the market, of course, they distanced themselves from Google: Niantic Labs became an independent little company and brought it to market that way, so no one would know that it came out of Google.

It turns out that when you were playing Pokémon Go, you were actually playing a little game within a bigger game. Let's go back to the first round of surveillance capitalism. What was the first really, really successful prediction product? It was the click-through rate. We think of it as a click-through rate, but you only have to think about it for another couple of seconds to realize that the click-through rate is a computational fragment that predicts a piece of human behavior. And what were the first markets in human futures, where these click-through rates, these predictions, were sold? That first, insanely lucrative market in human futures was called online targeted advertising, and it's still insanely lucrative. However, we now see that same structure juxtaposed, translated to the real world. In Pokémon Go, Niantic Labs had established its own human futures markets. They had business customers, not online but in real life — McDonald's and Starbucks, the real shops, the real establishments, or Joe's pizza and Harry's bar. These businesses paid them not for guaranteed click-through but for guaranteed footfall: people's real feet falling on the real floor of real places. Guaranteed footfall. And so the idea with Pokémon Go was how to use gamification — the rewards and punishments of gamification — in order to herd people through the cities to the places where their feet were guaranteed to be.
So that's another phase in the work of economies of action and figuring that out. Here's another phase; it comes a little bit later, and now we're back at Facebook. This comes from another leaked document, this one written by Facebook executives for its Australian and New Zealand customers — I've always kind of assumed that the executives were Australian, but I don't know that for a fact. This is a report that sells its business customers on the following idea: we have so much information now — remember, monitoring to actuation — we have so much information on 6.4 million young people, high school students, college students and young adults in Australia and New Zealand, that we can now predict their emotional state on a daily and weekly basis. We can see their emotional cycles across the seven days of the week, and we can predict where they are going to be in this cycle. We can predict things like whether they feel stressed, defeated, overwhelmed, anxious, nervous, stupid, silly, useless or a failure. And with these predictions we can alert you to the exact moment of maximum vulnerability, when, if you send a message that contains a confidence boost, you will be successful. So, for example, let's imagine that you have a sexy black leather jacket to sell. We can tell you when to sell it, how to sell it, what to say in your message — and by the way, make sure they know you're going to sell it on a Thursday night, because that's when they're most anxious, because the weekend is about to appear. Tell them you can have it delivered for free to their door the next morning, throw in a little price discount, and we can guarantee you success. All right, so that's another phase: economies of action, monitoring to actuation.

Finally, we're seeing the next phase unveil itself now, literally as we speak. Some of you who follow smart-city developments might know that just the other day the officials in Toronto made some decisions about Sidewalk Labs. Sidewalk Labs is the subsidiary of Google/Alphabet that specializes in its smart-city work. They used to call it the Google city, but they don't do that anymore; they call it the smart city. They're trying to get the waterfront area in Toronto rebuilt as a Google city, and this dynamic has been going on for a couple of years and has become very contested, with many citizens getting involved. Just the other day some of these officials in Toronto actually made a very good decision and curtailed the development of this plan substantially. But the key point here is what you see when you look at the documents behind this Sidewalk Labs proposal — and in fact just last week The Globe and Mail found some secret documents, documents that really hadn't been reviewed by the public, and finally made them public. It's fascinating what you see there, because all of these documents, if you read them with what we've just been talking about in mind, are a clear declaration of epistemic dominance, and of the intention to use that dominance for behavioral modification at scale. I'm not going to go into the details, but you can trust me on that.

All right. Sometimes I hear people saying to me: you know, Shoshana, I take your point, but really, businesses, advertisers, commerce have always tried to persuade people, always tried to change people's behavior and get them to buy something that they didn't want to buy, so really, Shoshana, there's nothing new about this. And of course that's true: there is nothing new about our desire to persuade each other to do things that we might not have otherwise done, or maybe to do things that we don't even want to do. There's nothing new about human persuasion. But let's not lose our bearings, because what is new here is that at no other time in history have the wealthiest private corporations had at their disposal a pervasive global architecture of ubiquitous computation able to amass unparalleled concentrations of information about individuals, groups and populations, sufficient to mobilize the pivot from the monitoring to the actuation of behavior, remotely and at scale. This, my friends, is unprecedented.

What is this new power? It works its will through the medium of digital instrumentation. It's not sending anybody to our homes at night to take us to the gulag or the camp. It's not threatening us with murder or terror. It is not totalitarian power, but it is a new and unprecedented form of power, just as totalitarianism presented itself as a new and unprecedented power in the 20th century. This new power is what I call instrumentarian power. It works its will remotely, it comes to us secretly, quietly, and if we ever know it's there, it might actually greet us with a cappuccino and a smile. Nevertheless, it represents a global means of behavioral modification and is the engine of growth for surveillance capitalism.

Okay, so here we've now climbed a mountain. We've climbed the mountain of the division of learning and we've peeked inside the fortress, into the AI hub, into these backstage operations. And what have we found? A frontier operation run by geniuses, funded by immense amounts of capital. Are they solving the climate crisis? Are they curing cancers? Are they figuring out how to get rid of all those plastic particles that are now detectable even in the Arctic snow? No, they're not doing any of that. Instead, all of that genius and all of that capital is dedicated to knowing everything about us and pivoting that knowledge to the remote control of people, for profit. I don't like that. This is how the age of surveillance capitalism becomes an age of conquest.

You know, we're meant to sleepwalk through all of this. We're meant to be ignorant — this is engineered for our ignorance. Mark Zuckerberg says privacy is the future. Very confusing. They just really think that we're stupid. And because we're meant to sleepwalk through this, when something actually rises up out of the fog to send us a message, well, it really gets our attention. This is what happened with Cambridge Analytica, isn't it? Chris Wylie — here's the whistleblower. Chris Wylie says: this is what we've been doing. That really got our attention. Let's just take a minute and look at what Chris said Cambridge Analytica was doing. He said, quote: we exploited Facebook to harvest millions of people's profiles, and then we built models to exploit what we knew about them and target their inner demons. Does that sound familiar? Does it? He says the objective was behavioral micro-targeting: influencing voters based not on their demographics but on their personalities. Does that sound familiar? He says: I think it's worse than bullying, because at least with bullying people know what's being done to them; they have some kind of agency. With what we do, he said, people don't even know what's being done to them.
He says: if you do not respect the agency of people, then anything you're doing after that point is not conducive to a democracy. Well, yeah, that's for sure. Then he concludes: Cambridge Analytica was information warfare — correctly acknowledging that information warfare originates in epistemic inequality. Information warfare is impossible to prosecute without that information dominance, that information advantage. But what remains poorly understood, even today, is that Cambridge Analytica only repeated the mechanisms and methods that represent everyday life for every self-respecting surveillance capitalist. What more apt description of the treatment of those young people in Australia and New Zealand, whose social anxieties were manipulated for profit, than to say: we built models to exploit their inner demons? How apropos is that? So here is this political consultancy that got the world's attention, and still has the world's attention, when actually all it was was a parasite — a parasite in a host. And the host body was not just Facebook; the host body was surveillance capitalism itself.

It's surveillance capitalism that provided the three things that the people who study information warfare say are essential for its success: the conditions, the weapons and the opportunity. It was surveillance capitalism that provided the conditions, through the ubiquitous datafication of human experience. It was surveillance capitalism that provided the weapons: the data, the methods and the mechanisms, the predictive analyses, the intimate simulations of individuals, the behavioral micro-targeting, the techniques for subliminal influence and the manipulation of social comparison dynamics, the mastery of hidden real-time experimentation — all of it pioneered in surveillance capitalism. The weapons. And finally, it was surveillance capitalism that provided the opportunity: the opportunity being the fact that all of these mechanisms can be applied while completely circumventing human awareness. It can all be done in secret, and that provides a massive opportunity for successful information warfare. The conclusion can only be that what we have failed to recognize is not that Cambridge Analytica represents information warfare, and not that information warfare is strictly a function of the state — or increasingly even of non-state political actors. It turns out that surveillance capitalism, and its illegitimate conversion of knowledge into power, is best understood as the normalization and institutionalization of information warfare for profit. That is the world we are living in today.

Okay, I want to conclude with just a couple of thoughts that will allow us to turn the lights on in a minute without everybody feeling really depressed. Because — you may not be able to tell right now from my voice, which is a little worked — I'm actually very optimistic about our ability to change this. And in fact, I'll be very candid with you: some of my optimism comes from your country. Some of my optimism comes from seeing how the generations in your country and in this city learned to confront and internalize the lessons of totalitarianism and completely changed the fabric of your culture, your institutions and your laws, and I have so much respect for that. I think it reminds us of a larger pattern here, which is that as democratic societies we have confronted grave problems in the past and we have overcome them. We ended the Gilded Age. We overcame totalitarianism. And in fact we have used the levers of our democracy to ensure that the post-war world became a prosperous world for ordinary people, that the post-war world was the age of the middle class, and that market capitalism could actually promote, and itself be strengthened by, democracy. That was part of the legacy of the post-war years.

So now we're living in a time when we understand that privacy is a collective action problem, and we have to look to only one source for remedies, and that source is democracy. That means law, and that means new regulatory paradigms — and when I'm talking with Tobi we can get into more details on this — but I want to call your attention to at least two things that I think are immediately important, and once we start talking about them and begin to get used to them a little bit in our imaginations, they won't sound as strange as they might sound when I say them right now. The key thing that confronts us here is to interrupt the incentives for the surveillance dividend. We essentially need to outlaw the surveillance dividend. Once we do that, we open up the competitive space for the thousands and hundreds of thousands and indeed millions of young people, entrepreneurs and companies who want to produce digital products and services that will address climate, that will address our real needs, that will cure the cancers that plague us, that will do all of the things that we once expected from the digital — but they will be able to do them without having to compete on the surveillance dividend. That's what we need.

So there are two things I want to suggest: one is that we interrupt supply, and the other is that we interrupt demand. By interrupting supply I mean that the illegitimate, secret, unilateral taking of human experience for translation into data should be illegal. [Music] The surveillance capitalists have fought this fight — the one you heard about from 1997 — and it continues literally every day. They have fought for the right to take our faces whenever and wherever they want to. They take our faces on the street, they take our faces in the park, they take our faces when and wherever they want to. Our faces go into their facial recognition systems; facial recognition systems train data sets; and those data sets, we now find out, are often sold to military operations, military divisions, including those operations that are imprisoning members of the Uighur minority in China in an open-air prison whose only walls are facial recognition systems. That is what I mean when I say privacy is not private.

Okay, so we interrupt supply. The next thing we can do is interrupt demand, and that means we eliminate the incentives to sell predictions of human behavior. How do we do that? We make markets that trade in human futures illegal. Other markets are illegal. Markets that trade in human organs are illegal. Why? Because they have predictably destructive consequences for people and for democracy. Markets that trade in human slaves are illegal because they have predictably destructive consequences. Markets that trade in human babies are illegal because they have predictably destructive consequences. Markets that trade in human futures should be illegal: first, because they are the enemies of human autonomy, because their competitive dynamics require economies of action for which human agency is the enemy; and second, because they inevitably produce the extreme asymmetries of knowledge, and the power that accrues to knowledge, that create epistemic inequality and epistemic injustice.

Okay, so now we're at the end.
What do I want to say to you? Now the question is not whether you are old enough to know this, but whether you are young enough to know this. Greta Thunberg — Greta Thunberg says: our house is on fire, succinctly framing the climate cataclysm. Our house is on fire. So I'd like to suggest that global warming is to the planet, our house, what surveillance capitalism is to society, our home. Not only is our house on fire, but our home is on fire. This fire, though, is not kindled in the implacable physics of the climate crisis; it's kindled in a human-made logic, a human-made economic logic. Anything that humans make can be unmade. All we have to do is decide — like the Berlin Wall: you decided, and ultimately it came down. Surveillance capitalists are rich and powerful, but they are not invulnerable. They have an Achilles heel. Do you know what it is? They fear law. They fear lawmakers who are not confused and intimidated. But ultimately they fear you. They fear citizens who are ready to demand a digital future that we can call home. Thank you. [Applause] [Applause]

I just want to say, I did have my watch up there — it didn't do me much good, though, did it? Thank you so much for that; that means the world to me, it really does.

I'm doing a lot worse than you with the bottle, actually. Oh — [Laughter] — behold, well done. Thank you so much, Shoshana, for that very interesting outline, once again, of surveillance capitalism, and for the call to arms, so to speak, at the very end. I would actually like to start this conversation by referring once again to the city we're in. Even I am old enough — that may come as a surprise to most of you — to have witnessed the fall of the wall, though not here in Berlin; that's because I'm not German, I was in another small European country. But apparently everybody knows that. Obviously the GDR was a state that relied heavily on analog surveillance: it was estimated that roughly 200,000 people were on the payroll of the secret police, the Stasi, in a nation that had, what, 16 million inhabitants. So there is a conversation about how surveillance changes behavior when it comes to analog surveillance — it's still a difficult conversation to have in Germany, actually, but the conversation is there; we have talked about this for quite a while in terms of speech, public movement, action taken or actually not taken, things like that. But I'm wondering: since 2013, after the Snowden leaks, after that story broke all over the world, is there as much of a widespread discourse on how mass surveillance changes and modifies our behavior, and actually commodifies the traces of our behavior? What do you think — in what ways has surveillance capitalism already changed our behavior so far? Because I don't hear a lot about that.

Well, I think it has changed our behavior, and probably, you know, the younger one is, the more one's behavior is likely to have been changed — and the less one has any possibility of even knowing that. So we can see changed behavioral patterns, but it's not necessarily within the self-consciousness of a young person that their behavior has, unknown to them, been changed. But, you know, this kind of entrapment in the psychology of adolescence, the life of the hive — the idea that we're living in a way that is so hyper-attuned to one another — it appears that young people really have been drawn into this kind of operation so profoundly.
age group I love talking to university students probably some some people in this room I had a really moving experience was late 2017 and I I went up to visit do some lecturing and teaching at King's College Ontario up in Canada and Kingston and my favorite thing is being with undergraduates you know college students and we had this wonderful conversation about Erving Goffman you remember Erving Goffman you know a great social theorist of the mid 20th century presentation of self in everyday life and golf man you know part of that group like Stanley Milgram you know the great studies that came out of the war in the post-war environment and Goffman talked about presentation of self and he wrote about this idea backstage and as you know essentially he said like if you don't have a backstage you're gonna go crazy because backstage is where you get to be yourself and you get to replenish yourself and and that's where everyone's just hanging out no one's judging anybody and and so forth and without backstage you go nuts and so I I this group of this this group of students is a big class and they had been reading these theorists like Kaufman and I said well everything we're talking about that you do on Facebook and so on curating your persona for different audiences and so forth isn't that just 21st century golf man isn't that just presentation of self in the digital world so there was a debate about that and then this young woman begun began to speak her name was Helen and and she was feeling and thinking in real time and she began to speak and she said it's not the same and I've just I've just realized it's not the same because we have no backstage there is no place I can be that is backstage this morning I was walking across the campus and I thought I was backstage lost in my own thoughts and I looked up and I saw somebody over there with their phone taking my picture there's no backstage and the whole room was a big big room not quite as big as this but it was one of those big classes everybody went quiet photography is a very interesting executive to talk about privacy actually I think the notion of privacy you know that I think is so central to a lot of your arguments you're making you know the mistakes are the innovation you said the beginning of your talk and that's of course an invasion of privacy or meant at that time I think an invasion of privacy now if you look at that little historically that's what I mean with press photography in the late 19th century right which led to a canonical text of course you know you know it in the history of privacy lost a right to privacy by Samuel Warren and Louis Brandeis that when was the eighty nineteen actually in the Harvard Law Review the philosopher Raymond Kois professor emeritus at Cambridge tells the story who won actually inspired this nowadays canonical text when it comes to the right destroy all my good stories of a story like this yeah the wife of I think it was Samuel Warren's wife it was a rich society lady was very upset about this he was very upset about this right that press reporters actually sort of invaded or reported on the parties he or his wife or the two of them threw together so it was a very concrete instance actually that led to this canonical text nowadays people post the pictures of their parties you know free of well I mean nobody forces him to do that so what I want to get at historically in other words privacy is a very unstable concept as we can see now when it comes to photos but what about today do you think we can 
frame a universal concept of what privacy is? How would you describe it; what would privacy mean today?

You know, I was reading some of my materials today, just reminding myself of that great quote from Mark Zuckerberg. I think the quote was from 2011, when he said: we just decided that there would be no privacy, that would be the new social norm, and we went for it. So what we've seen is an assault on what had been a sort of evolutionary process. Privacy, of course, is an idea that is only relevant to the growth and evolution of the psychological individual and the progress of individualization through history, which is an arduous evolution and one to be celebrated, because with the concept of the individual came the concept of rights, and with the concept of rights came the concepts of democracy and equality. So privacy is part of this nest of values and sensibilities that are so essential to the way of life associated with the growth and health of our democracies, with the very possibility of our democracies.

What's fascinating is that Facebook simply decided there would be no privacy. And what happened initially? The world exploded in outrage. We didn't just accept it; people all over the world were really, really angry. It's like Street View, which we talked about before: nobody actually knows how many lawsuits there were against Google because of Street View, but they were coming from almost every country on earth. In Germany it started in Hamburg. There was outrage and protest. But these lines were crossed, and there is habituation, and there is normalization, and there is what I call psychic numbing. This goes back to what we were just talking about a minute ago, to what Helen said: there is no backstage. If there is no backstage you start to go crazy in a certain kind of way, and what do you do to protect yourself from that feeling of craziness? You go numb. There is a lot of psychic numbing right now, because we are all increasingly experiencing this world of no escape, and in order to protect ourselves from these feelings of going crazy we get numb and we stop thinking about it. Or we console ourselves with privacy browsers and ad blockers and things like that, or with more encryption, or maybe you're into camouflage, those special materials you can buy to put over your face and confound facial-recognition cameras, and so forth.

So I don't know that we've given up on privacy so much as that we yearn for it. It has become unavailable, and that makes us feel kind of sick and crazy, so we protect ourselves from that until we can figure out a way through. In which case, and this is something you and I have talked about a little bit, Toby: I happen to believe that if we can get rid of the surveillance dividend and really open up the competitive playing field, we are at a moment in history where any business that comes on stream giving us the things we actually want, the way we want them, without the overhang of the externalities that come with the surveillance dividend, these anti-democratic and anti-egalitarian externalities, that company or those companies really have the opportunity to have every person on earth as their customer. Because according to literally every survey in Europe and every survey in the United States, nobody wants this stuff,
nobody likes it, nobody wants it; there is just very little choice.

You mentioned that twenty years ago we thought the internet was going to be about empowering the consumer, and now we have ended up in the era of the end of the consumer, as you put it in the book: the consumer becomes the raw material for the profit. Other people call it labor, even, which is yet another concept. But what I'm interested in for the moment, before we open this up, is the tipping point. What happened in the evolution from, say, managerial capitalism to distributed capitalism and then advocacy-oriented capitalism that made that fight come loose, so that surveillance capitalism won by actually destroying the consumer? What kind of forces are at work? Is this about continuity or discontinuity? In other words, how much of surveillance capitalism is the capitalism that has always been there, or is this really a break, a discontinuity?

You know, I would say it's both. Surveillance capitalism replicates the age-old pattern of capitalism: claiming things that live outside the market dynamic and bringing them into the market for commodification, for production and sale. But what's crazy about this era of that pattern is that it is not simply about taking nature, as in the case of industrial capitalism, to be reborn as commodities for sale and purchase; now it is about taking human nature, about taking private human experience, and that sets into motion all the other dynamics we have been talking about. Having said that, there are also, I think, very powerful ways in which surveillance capitalism diverges from the history of capitalism. Let me put it this way: surveillance capitalism breaks with the history of capitalism because it no longer has to sustain organic reciprocities with its own societies, with its own populations, for two reasons. One is that it does not need us as its source of customers; we are not its customers, as has already been pointed out, so I won't repeat that. But it also does not need us as a source of employees. It is not accurate to say we supply the labor, because if we were labor then we would be the workforce, and that is a source of reciprocity. Don't forget democracy: the amplification of democracy in Great Britain in the late nineteenth century was directly linked to the dependence of the elites on the working class. They understood that if they did not expand the franchise, the people who worked for them were likely to burn down their factories. Those reciprocities are not just an abstraction; that is real life. The same with consumption. There is wonderful historiography looking deeply at the American Revolution as a consumer revolution: the colonists were finally molded into a sense of shared political interests across disparate colonies because they were all outraged at how they were being treated as consumers, and when they wanted to protest they said, we will cease to buy your products; I'll go without a winter coat, I'll go without tea. So this is very real stuff that is intrinsic to the history of democracy. That they do not need us as consumers and they do not need us as workers is a really big deal, a tremendous difference from the history of capitalism, and, I think, a key reason
why democracy and capitalism have found a way to cohabit. Especially from the late nineteenth century on, and especially in the middle of the twentieth century, market democracy turned out to be a very powerful and prosperous kind of structure, but that is very unlikely to hold as long as surveillance capitalism is the dominant market form.

Let us switch to the new tool, Slido. Natalie, do you have the questions for me? Yes, we are trying this out this evening; it is the first time we are using it. It was not there for the Castells lecture, I'm sorry; we are trying it out tonight. Okay. [Music] They are still trying it out. All right, this is like Berlin in the nineties, right? It's coming back. But we are trying out this tool, so please bear with us. We should start with the questions from it. We have received more than 150 questions, so we had to group them into thematic blocks. The first block is about regulation, about lawmakers. For example: should big tech companies be broken up, as Elizabeth Warren proposes, and is this a viable way forward? And on the other side, what policies should the EU put in place to limit the power of companies like Facebook and Google? Which forms of regulation can be used for that?

Okay, well, great questions, and I'm so glad you asked. One hundred and fifty questions; I think we're going to have to get our sleeping bags out. When we talk about antitrust law, the whole body of law on anti-competitive behavior, that was a very fertile and creative legislative period that invented antitrust law. It did not just happen, as I'm sure many of you realize. There were cartels and there were monopolies, and they were a scourge on society, and there was tremendous contest and violence, and these laws and regulatory visions were developed by progressives and by legal scholars over a period of decades. It was a tremendously creative act. There are wonderful histories of regulation, and one of the lessons of these histories is that regulatory efforts fail when they are unable to frame regulatory strategies that are carefully based on an understanding of the industry they are trying to regulate. I think the same can be said, on a larger scale now, for surveillance capitalism.

So let me connect those two points. Number one, it simply cannot be a question of taking the very creative legislative work that came out of the twentieth century and applying it to a wholly new set of mechanisms and methods, problems and phenomena, in the twenty-first century. Do we have anti-competitive behavior among these companies? Yes. Do we have monopoly behavior? Yes. Will addressing those monopolies undo, interrupt, and outlaw surveillance capitalism? No, not in my view. What breaking them up, quote unquote, is more likely to do is run the risk of creating more surveillance capitalist companies, intensifying competition among surveillance capitalists, and therefore intensifying the drive toward certainty and predictability that I have just been describing to you. So what I believe is that we need to stand on the shoulders of twentieth-century antitrust law and twentieth- and even early twenty-first-century privacy law, and we need to build on that with a specific understanding of surveillance capitalism's logic, its methods, its mechanisms, its imperatives, in a new creative effort that produces the insight, the legislation, the
regulatory vision that will interrupt and outlaw what is unprecedented here.

Okay, thank you. Next question. I can't see you; can you raise your hand? There you are, okay, thank you. You're welcome. The second block is more about what we can personally do to combat these threats. For example, Anonymous asks which actions we can take to make sure we are not entirely influenced by companies such as Google and Facebook, and also, on a more personal level, which actions you personally take to minimize the data you provide to tech companies.

Well, I have argued that privacy is not private, so there are a couple of things here. One is that, in my personal view, withdrawing from such dependency and enmeshment in these systems will give you a better life and better mental health. Even before the rise of the digital I was never the kind of person who was very attuned to others, and I would certainly avoid the amplification of that kind of dynamic that we see now in online media. So I think we can do things for ourselves by withdrawing, by not being so dependent on these systems, by not being so mediated. I do believe in eye contact, I do believe in actual recognition, and I do believe in being in the presence of trees and things like that. But what does it mean that privacy is not private? Does it mean that we have no power as individuals? No. It means that our real challenge as individuals now is political. To say that privacy is a collective action problem means, friends, that we need new forms of collective action. In the nineteenth century, and especially in the first part of the twentieth century, this was about the right to have a union, the right to bargain collectively, and the right to strike. It was about making sure that children did not go to work when they were seven or thirteen, but went to school or stayed with their families, because that was consistent with the aspirations of a democratic society. So what are the forms of collective action that will define the challenges of our time? Collective action means that we need to discover ourselves not as anonymous "users," which is their name for us, but as citizens of democratic societies with shared interests, not only economic but political and social and psychological. We have to come together in those interests and create the new forms that will ultimately be the vehicles that put pressure on our lawmakers, that mobilize our lawmakers into this next era of creativity and a new regulatory vision. So we do have work as individuals, and that work is coming together to create these new movements, and these will be movements defined by the fight for epistemic justice.

Thank you. Okay, just a small announcement: I know we are running a little bit late. I thought we would have a conversation up here for twenty minutes and then it would be your turn for twenty minutes, and I think we are going to stick with that, so we have about another twelve minutes. Maybe this evening will end a little bit late, but the subject is so huge and our speaker is so great that I think we can all deal with that; I certainly can. So there will be one more question from Slido for the moment, and then we will go to the microphones on both sides. Please, Natalie. We have a top-voted question, by far, which is more about the organizational part of the event: the event is co-organized by the Humboldt Institute for Internet and Society, which receives millions in funding from Google. Do you think that this is a
problem? [Laughter] Absolutely. You know, years ago, when I was a little girl, there was a wonderful thinker, Herbert Marcuse, who wrote a book called One-Dimensional Man. People don't really read it that much these days, but I recommend it. This is our version of the one-dimensional society. Google is out there whitewashing itself in every way imaginable. If you look at the list of Google fellows year after year, it is all kinds of people from civil society institutions that should be dead set against Google, and instead they are Google fellows. They fund conferences and they fund civil society organizations, and this is part of the whitewashing that is intended to keep us confused. It's like I said before: Mark Zuckerberg just told us that the future is privacy. What did you just say, Mark? They think we're stupid, and we are supposed to be confused, and we are supposed to think that they are really nice. But I don't think that, and I wish they were not a sponsor of this event, to be perfectly honest. I didn't know this until a little while ago, a few days ago. But I have been at other events that I was very proud of, including the CPDP conference in Brussels where we launched my book in January, and Google is a sponsor there too, and there are some folks who refused to go to that meeting because it was sponsored by Google. But it is a calculation, a choice, a cost-benefit: better to go and be with you and share a message, or better to stand on my principles, not come here, and be silent? That is how I calculate the cost-benefit. But I'm not happy about it.

Maybe I have to say something here. I am not on the payroll of the HIIG, so to speak, but I can vouch for them. Just to give a little background information: the Institute was founded by different universities here in Berlin, not by Google, and I think on the board of five or six people there is one person from Google, just to give you an impression. I think it is 1.5 million euros a year that the Institute gets from Google; in good years that is about half of the budget, 1.5 at the most, and the other half is funding they are able to gather elsewhere; sometimes it is less. Just to give you a couple of numbers on that. There is a microphone on this side; is there one on the other side too? You see, that's the problem: this room is so packed and we don't have aisles, so it is really hard to handle the microphones. That is one of the reasons we opted for Slido, like back in the day at Kino International with Castells. Okay, how are we going to do this?

Hi, thank you so much for your talk. I'm Daniela Stockmann, a professor here at the Hertie School of Governance. I am wondering whether you could speak to alternative business models to surveillance capitalism for tech companies to use in the future.

Well, I think, for example, it could possibly be quite useful to have some of the devices that are characteristic of what people call a smart home: managing energy efficiency, communicating with friends and family, making sure old folks are safe aging in their homes, and all these things. There were folks in the early 2000s who were developing these models of a smart home, and when they did, they drew schematics of how it would work, and it was a closed loop: it had devices in the walls that were
generating information, and all that information was channeled to the occupants of the home, who received it on a little wearable computer. It was a closed loop. Then you fast-forward to 2017 and a very interesting legal analysis of one Nest thermostat. Nest, the smart home device company, was bought by Google, and now there is no more Nest; it is all Google home devices. For one Nest thermostat, these legal scholars figured out that if you are even just a casually vigilant consumer and you install one thermostat, it is recommended that you review a minimum of 1,000 privacy contracts. So I don't think we have to look that far for business models. The problem is that without the surveillance dividend it is going to be very hard to monetize a refrigerator that is not secretly sending data to my health insurance company, just as Ford Motor cannot figure out how to make money without streaming data from the folks driving its cars, so that that data becomes the basis for revenues and profit. I think if we could actually see our way to eliminating that overhang, the business model is the least of our problems. I have at least a hundred messages in my inbox, out of several hundred every day, from people who are working on great ideas, this piece of technology, this kind of product, this kind of service that is not surveillance capitalism: how do I do it, how do I get it funded? I don't think we have any problem with business models. The question is making space in the competitive landscape so that those business models can get the funding and the capital they need to be sustained.

Thank you. I think we have one more question from before. Yes, my name is Rafael. I was actually Daniela's student, and I used to be a Google fellow, but I swear I am not defending any side. I work for civil society nowadays, and we look into the impact of social media around elections in the world, not only in Europe but in Asia, in Africa, in many countries. What is interesting to see is that the decision to counter the risks around each election is basically a PR campaign: you are of course going to invest a lot of money in the U.S.,
in Brazil, in India, because these are big democracies and the effects of any bad PR will come back to the company, but they do not look into other countries that are smaller, or that are smaller markets. So basically you have this asymmetry of a US-based company deciding which democracy is worth more than the others. In that sense my question goes in two directions. You mentioned this infrastructure generating predictions about behavior; inside those companies probably 98 percent of the effort goes toward that and two percent toward countering its negative effects. First of all, is regulation enough to really solve this, if you consider the global scale of these companies? And second, how can we really make sense of these rising global inequalities in terms of democratic standards? I wanted to ask more, but that's fine, thank you.

Okay, that last part ran a little together: the idea that some democracies are valued more than others, and how do you make sense of that. What's your name? Rafael? Rafael, I love that name; it's my son's middle name. Look, if they were choosing to look out for democracy in India and Brazil, they didn't do a very good job. If that is what we get when they are actually trying to do something, then I don't buy it. All they had to do was intervene in WhatsApp in Brazil, and it is clear that the outcome would have been different, or at least I think there is compelling evidence to suggest that the outcome would have been very different. So I am not sure that they are really looking out for democracy anywhere, on that part of your question. What was the other part of your question? Oh, yeah. [Laughter]

So look, this is what I am talking about: regulatory vision and creativity for a new era. We are talking about a war now between computational governance and democratic governance. The computational governance side says democracy is too slow, democracy limits innovation, democracy can never keep up. And the democratic side says: you bet democracy is slow. Slow is good. Slow means we are deliberating and we are creating some kind of consensus. And the idea that democracy is bad for innovation? That is a really crap piece of propaganda that was actually invented by J. P. Morgan in the late nineteenth century. J. P. Morgan loved to say that capitalism doesn't need law: we have the law of supply and demand and we have the law of survival of the fittest; we don't need any more laws. So the idea that law is somehow bad for innovation, bad for business, is just an old piece of propaganda that has been recycled. The idea that market actors should be completely free maybe had a certain amount of merit when Adam Smith was writing, because when Adam Smith was writing the hand really was invisible, though not for long. Today the hand is not invisible; these cats know everything, so they actually know too much to qualify for that freedom. That argument is gone. So the point is that part of a democratic regulatory vision is to say: it is not okay, for example, to have political advertising that is patently false; it is not okay to have videos of elected officials that are known to be false; it is not okay to have a virus without a vaccine. These things are simply not compatible with how democracies have to run. Can we regulate that? Yes, we can. Can we do that without it being turned into censorship? I believe we can. That is where the imagination comes in now; that is the
front line now. But we see the alternative, and it is unacceptable.

Thank you, Shoshana. What we usually do in this series... please. [Music] Not quite done yet, not quite done. [Applause] What we usually do at the very end of this series is ask about the European perspective. In Germany there is the NetzDG, the Network Enforcement Act, which goes colloquially by the name of the Facebook law and basically makes it easier to force platforms to remove hate speech, for instance; it is a set of compliance rules, more or less. Then there is the right to be forgotten at the European level, and things like that. Now, there is the usual European conceit, which borders on feelings of moral superiority, and I'm sure you know what I mean; and there are those who think: forget it, we do not have enough power in this new emerging world order anyway. What is your take, from the US, on the European perspective and on Europe's agency in this matter?

Well, you know, when I first met Frank Schirrmacher and began writing for the Frankfurter Allgemeine in 2013, I told him then that I wanted to write for him because Europe was the front line, and Europe still is the front line. If we are going to make progress on the things we have been talking about this evening, I do believe that progress will be made here first. And I don't know that we have to call it moral superiority; I simply think that Europe has had a different experience than America, and the Second World War is an important part of that. What our societies learned from the devastation of that time is different, and it has played out differently in our European and American civilizations. I do believe that in many ways Europe understands that democracy has to be defended and fought for in every generation.

I was doing a book signing a few months ago in America; there was a long line and I had my head down, and a beautiful young woman came up, and as I signed I asked, what's your name, and she told me her name and said: I'm very depressed, your talk really made me sad. And I said, oh my, I'm worried; why are you depressed, what happened, why did I make you sad? And she said, well, I've lost faith in democracy. We talked about it a little bit, and I said: you know, you've just given me an insight that I didn't have before. For someone your age, especially coming of age through eight years of Obama or whatever, it's like you're looking at democracy as if it were a mountain, a boulder: it's there when you're born and it will be there when you die, exactly the same. But that's not what democracy is. Democracy is a creature of human imagination and will, and every generation has to keep it going. It's like this: before there were toys, kids would roll a hoop, and you had to run alongside the hoop to make sure it didn't wobble and fall over. Well, democracy is like that hoop: you have got to run after it and make sure it doesn't wobble and fall over. It is everybody's responsibility in every generation. And I think Europeans have learned that in a way that perhaps Americans have not had to learn it. My father learned it, but I'm not sure that Americans have had to learn it generation after generation in the same way. So I do think that Europe has a unique vantage point, and Europe has been and will continue to be the vanguard in this work, and that is an important reason why I have been so committed to this work in Germany and in Europe.

I didn't
know I was fishing for this, but it is sure nice to hear. So thank you so much for being with us despite your illness. Thank you for being with us, Shoshana Zuboff. [Applause]
Info
Channel: Alexander von Humboldt Institut für Internet und Gesellschaft
Views: 283,490
Rating: 4.8349895 out of 5
Keywords: Shoshana Zuboff, Surveillance, Capitalism, Making Sense, Berlin, lecture, Google, Platform
Id: fJ0josfRzp4
Length: 127min 4sec (7624 seconds)
Published: Mon Nov 11 2019