From Social Pariah to Social Fabric: What Is the Future of Digital Media?

Video Statistics and Information

Captions
Hi everyone, welcome. I'm Willow Bay, dean of the USC Annenberg School for Communication, and welcome to what I know is going to be a fantastic conversation today. You know, it wasn't so long ago that we talked about the promise and the peril of the internet. Today I think we understand, with great specificity and no small measure of emotional intensity, just what promise and peril look like in 2020. As we sit here today, our connected devices are providing us a lifeline to necessary and vital work, school, and information, but also to human connection. At the same time, we are tethered to these devices and the media they deliver, concerned about their impact on us as individuals and on society as a whole. And as one of the members of our audience in the chat offered, maybe we should be asking: utopia or oblivion?

So as we consider both the impact and the future of digital media, we have an incredible panel with us today, bringing together the perspectives of scholars and technologists, with research, cybersecurity, policy, and legal expertise. Let me introduce them: Hany Farid, professor, UC Berkeley School of Information; Tristan Harris, co-founder and executive director of the Center for Humane Technology; Karen Kornbluh, senior fellow and director of the Digital Innovation and Democracy Initiative at the German Marshall Fund of the United States; and Niloofar Razi Howe, senior fellow, Cybersecurity Initiative, New America Foundation. Welcome to all of you; great to have you with us.

I'd like to begin by asking each of you a question that really situates, within the context of your expertise, where you think we are today. Tristan, I think we'll start with you. You went from being an industry insider, a Google employee, a design ethicist, and now, with The Social Dilemma, your hit documentary, you've been a very vocal critic of social media's hijacking of human attention. This is really, in many ways, a call to arms, a call to action, and certainly a powerful vehicle to increase public awareness. First and foremost, what do you want people to know and understand about today's digital media, but specifically today's social media?

Yeah, well, thank you, Willow. It's always good to see you; I think this is the third time we've done something together. And you know, with the film The Social Dilemma, which just came out this last month on Netflix, and which is really about sounding the alarm on the business model, I want to be really clear: I think we don't have to be against technology at all. What we really want to be against are business models that profit from the problem. A business model that profits off of engagement and human attention profits from conspiracy thinking, profits from personalizing us into separate, narrower and narrower echo chambers of reality, profits from addiction and harms to teenage mental health, profits from cyberbullying activity, profits from the outrage-ification of our politics. So what I really wanted people to get from the film is that, so long as we are the product, we the humans, our predictable behavior, are the product, we're being domesticated into the kind of product that's most suited for these companies. Just like we take cows and domesticate them to have the most milk and the most meat for us to consume, we the humans, when we are the product, are most profitable when we are addicted, outraged, anxious, attention-seeking, polarized, and disinformed than when we are thriving citizens of a democracy.
So what we really have to look at is the business model. And I think one of the things we're going to be talking about later: there are all sorts of policy changes always happening at the tech companies, where they'll categorize some new piece of content as a kind of bad apple, and they'll take out the whack-a-mole stick and whack the bad apple and say: look, we solved the problem, QAnon; look, we solved the problem, flat-earth conspiracy theories; look, we solved the problem, Holocaust denial. But is the problem that these mysterious bad apples just showed up, or is their intrinsic business model a bad-apple farm, whose soil rewards unregulated virality? The thing that keeps us so hooked is the promise that when you post something, no matter what it is, it can go viral in a heartbeat to hundreds of millions of people. That's the essential business proposition of these social media companies, and it is fundamentally misaligned with the cohesion of our society. Because right now, every problem in the world, whether it's inequality, poverty, racial justice, or climate change, is a coordination problem: it requires us to come together, see the same reality, and come to agreement about what we want to do about it. If we live with a business model that profits by giving us each our own reality, one that makes us more and more certain that we're right and the other side is wrong, while living on different facts, that's setting our societal IQ down to zero, because we can't solve any of our core problems. And that's what I hope we get into later.

Tristan, thank you, and yes, we are going to talk more about the business model. Niloofar, I'd like to turn to you. You've been a tech entrepreneur, an investor, an executive, and you've been focused on cybersecurity. Tristan talked about cohesion and misalignments. How did the structure of the internet arrive at a place where, for a couple thousand dollars and some infrastructure, somebody can launch a disinformation campaign that does serious damage to a country, a superpower no less?

Thank you, Willow. Well, if we look at how the internet came into being, we need to understand that the original vision of the internet was utopian: universal connectivity, universal interoperability. And as we rushed to operationalize every aspect of our lives in this domain, we didn't spend time thinking through the implications, and especially the malfeasance that could take place. So here we are. We didn't predict how fast this technology would change our behavior as human beings. And the truth is, historically we've had a lot of time to adapt to technological innovation. The first consequential invention that affected communication media was the printing press; we had 200 years to adapt to that globally. We had 200 years to adapt to the agricultural revolutions, 100 years to adapt to the industrial revolution, 70 years to adapt to the tech revolution of the early 20th century. But today, we're out of time to adapt. The AI revolution, the genetics revolution, and the energy revolution are all happening at the same time, following quickly on the heels of the internet revolution of the 1990s. And the problem with this pace of change is that our civic institutions, our political processes, and our legal and regulatory frameworks just don't have the agility it takes to thrive in this world. The other issue, as one of the audience members pointed out, is that we assumed a world of tech utopianism, right?
The assumption was that the good tech brings far outweighs the bad. And so when we initially designed the internet, we very purposefully did not build in security. We didn't design in security because we wanted to enable rapid adoption of the technology, and we somehow assumed that, either through self-regulation or through a communal body that could set norms the world would adhere to, we would have sufficient protections in place. And I think David Bowie is probably the only person who pointed out the naivete of that assumption in the 1990s. Also, we didn't predict the categories of harm that could come from this new communication medium. As Tristan just said, from psychological harm to economic harm to national security to threatening the very foundations of democracy, not only can't our legal, regulatory, and political processes keep pace with the innovation, but we also can't keep pace with the creativity, the patience, and the persistence of our adversaries and the miscreants in cyberspace, who are motivated by some combination of lust, greed, and power to do us harm.

Now, you layer on top of that the fact that Western liberal democracy is in a pretty fragile state, plus the destabilizing economic trends that are taking place, and we have a tinderbox, right? We have massive income disparity: 70 percent of US households have had declining income over the past 15 years, and there was an Oxford University study that said 47 percent of jobs in the US are at risk of replacement or reduction by robotics and AI over the next two decades. Because of these economic disparities, we live in an angry world that feels disconnected from its government, discounted in its electoral process, and disenchanted by the news media and its abdication of responsibility when it comes to reporting facts. And the world is just getting angrier. Now layer on top of all of that this free and open internet. So what could go wrong? The internet has democratized communication; it's enabled universal connectivity. So now you have a situation where every device is a supercomputer aimed at your brain, and you don't have a chance against it. Every application is an attack vector that can be exploited by all manner of bad actors, including the tech platforms. And as Dan Geer, the father of cybersecurity, famously said, every sociopath is now your next-door neighbor.

So that's the environment we live in. And the truth is that defense is a lot harder than offense in this world. Bad actors have an asymmetric advantage, because they don't have to be right every time; defenders have to be right every single time. The consequences for bad acts are pretty minor these days: we prosecute only a small fraction of illegal activity, let alone shutting down hateful speech. And bad actors are crowdsourcing their capabilities and reusing the same tools and techniques, so it doesn't take a whole lot of investment. In fact, the Russian disinformation and influence campaign that targeted our last presidential election didn't take a whole lot of money to be pretty effective. So the sheer scale of the problem at this point is daunting. Even today, one of my colleagues at New America used the Twitter streaming API and found that on any given day, between half a percent and one and a half percent of all tweets qualify as violent or hate speech, which equates to hundreds of millions of tweets a year encountered by hundreds of millions of users. And conspiracy theories have significantly more engagement than non-conspiracy content.
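[Editor's note: a minimal sketch of the arithmetic behind a prevalence estimate like the one just described, assuming a sample of tweets has already been labeled by some hate-speech classifier. The sample counts and the 500-million-tweets-per-day volume are illustrative assumptions, not figures from the panel.]

```python
import math

def prevalence_estimate(n_sampled, n_flagged, daily_volume=500_000_000):
    """Extrapolate hate-speech prevalence from a labeled sample of tweets.

    n_sampled: tweets drawn from a (hypothetical) streaming-API sample
    n_flagged: how many of those a classifier marked violent or hateful
    daily_volume: assumed total tweets per day (illustrative, not official)
    """
    p = n_flagged / n_sampled
    # 95% normal-approximation confidence interval on the sample proportion
    se = math.sqrt(p * (1 - p) / n_sampled)
    lo, hi = max(0.0, p - 1.96 * se), p + 1.96 * se
    return {
        "sample_rate": p,
        "rate_ci_95": (lo, hi),
        "est_per_day": p * daily_volume,
        "est_per_year": p * daily_volume * 365,
    }

# A 1% hit rate on a 10,000-tweet sample extrapolates to roughly
# 1.8 billion tweets per year at the assumed daily volume.
print(prevalence_estimate(10_000, 100))
```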
So we have a structural problem that comes from the way the internet came into existence, and from really not having anticipated where all of this would go and the uses the internet would be put to.

All right, thank you, Niloofar. You talked about defense. Karen, let's talk about offense a little bit. You have a policy background: you were involved in the early framework for the internet in the Clinton administration, and you developed policy in the Obama administration. How did we get to a place where these technological advances have outstripped our legal and policy guardrails? How did we lose control of the offense?

That's exactly right; that's a great question. And I so rarely get to be the optimist on a panel, so thank you, everybody. I do completely agree with what both previous speakers have said: when the internet was born, there was this assumption that it was going to give voice to the voiceless and power to the powerless, and so the policy framework that was created really was built around that assumption. And what's happening now (we've just seen it in the last week, with Facebook and YouTube finally coming out against Holocaust denial and against QAnon) is an example, just as Tristan said, of whack-a-mole. It's making us all feel exhausted, and it is the opposite of what we thought we were getting into with this wonderful new technology. But what I would say is that that sense of exhaustion and hopelessness is really not productive. We're running a real-time experiment on no regulation, no policy, and the wrong incentives for these companies. The experiment is not working, so we just need to take a different approach: we just need to change the incentives, as Tristan was alluding, and I don't think that's impossible. The reason I say that is because, as you alluded to, Willow, in the early days of the internet a lot of policy decisions were made, and while they haven't all turned out right, I think the fact that policy was so determinative should help us feel like we have agency to make new decisions and change the way things are operating.

So what do I mean? In the early days of the internet, there was a decision made to create competition for the underlying telephone network, and to create competition for broadcasting and cable, so policies were made to allow the early internet to connect. There were decisions made that information should flow freely across the wires, so this Section 230 that's become infamous was put in place. A lot of policy decisions determined what the internet was going to look like. If we want to change the policy framework, I think the first thing we need to do is say that some of the expectations we have for fairness, for transparency in campaign donations, these kinds of things we expect in the offline world, we should be able to expect in the online world too. That means updating a lot of our frameworks, and there are a lot of common-sense solutions we can adopt that would make the information space a lot more like the physical space. But we need to realize that the internet is not a policy-free zone; it never has been, and it shouldn't be. One of the things that always bothers me is when people look at these congressional hearings, where they haul the CEOs up to Capitol Hill and the members of Congress ask silly questions, and people say: well, it's hopeless, we can't do anything about policy, because the members of Congress don't understand how Facebook works and they're still using AOL.
What I would say to that is that they don't know how aircraft engines work, or how chemicals and drugs work, either. But we can all agree on some goals, the kinds of goals we're talking about here, and then we can hand it over to experts, and they can design policies that update our existing policies for this new world.

Karen, thank you, and I'm going to ask you more about some of those common-sense solutions and what they might be in a little bit. Professor Farid, Hany, you bring to this a background in computer science, brain and cognitive sciences, and information sciences, and you really believe social media has failed us. What does your work suggest about how we got here?

Yeah, so first of all, if you have not seen it, after this panel is over, go to Netflix and watch Tristan's movie, The Social Dilemma. And when you're done with that, if you're as outraged as I've been over the last decade, please go and delete your Facebook account, delete your Twitter account, and get off of social media. Not only will you be doing yourself a mental-health benefit, you'll be helping society and democracy. I essentially agree with almost everything everybody has said up until now; let me just add a few words. I don't think this snuck up on us. It's one thing to say there are unexpected consequences from technology, but I don't think that's the issue. We can go back to the early 2000s, at the birth of the modern internet we know now, and I will tell you, having been there, that we had a phenomenal problem even then, almost 20 years ago, with horrific child sexual abuse material being produced and distributed online. Look, we can have a disagreement about speech, we can have a disagreement about conspiracies, but we can all agree that there is no room in our offline or online world for content of eight-year-olds and four-year-olds and two-year-olds being sexually abused. And yet there they were, the early giants of the technology sector, saying: this isn't our problem; we are the wire; we just connect people. So this was not something that surprised us; this is something that has been a long time coming. The problems have escalated from child sexual abuse to terrorism, to hate, to misinformation designed to sow civil unrest, create violence in societies, and disrupt democracies, to selling illicit drugs that are killing tens of thousands of our citizens every year, to selling illegal weapons, to trading human beings. And this was foreseeable. The problem has already been laid out: we have a business-model problem, we have a lack of a regulatory model, we have sheer and unadulterated greed from the corporate titans, and we have a lack of investment in the safeguards we have known for almost two decades need to be in place. And I'm tired of hearing the argument: well, the internet is big, it is complicated. As Karen just said, airplanes are complicated, pharmaceuticals are complicated, the finance sector is complicated, and yet we managed, with relative safety, to run those types of platforms. So this idea that the internet should be immune from regulation, from policy, and from accountability is, I think, long over; frankly, it was long over 10 years ago. We have to do things on the regulatory front, on the policy front, on the technology front,
and we have to do things on the education front. The last thing I'll say on this: what is complicating all of it is now a virtual monopoly of Google, Facebook, and Twitter. Because if you want to have competition for a better business model, better privacy, better safety, better social responsibility, how do you do that when three companies dominate the space and squash any competition? So I think most of us agree on what the problem is, and most of us agree there are a number of different solutions that have to be put in place; but each of those has to be pushed, I think, equally in all dimensions.

Thank you for all of that. Let's talk a little bit about tech's response. We've seen a flurry of activity really over this month, as you all have noted: policies on content, policies on advertising, and most recently some content moderation happening by social media; it happens to be content moderation of the New York Post. So weigh in, if you would: first of all, can and should we expect social media platforms to be policing themselves? Tristan, I'm going to start with you on that.

Well, watching their behavior over the last few years, it's always too little, too late, as much as we like that they're stepping up and taking bigger responsibility now than they have in the past. I often think recently of the metaphor of epidemiology, because now, with COVID, we're all very familiar with it. Essentially we have a kind of reverse CDC: instead of something trying to prevent mass infections, whether of hate speech or conspiracy theories or these kinds of things, we have the maximum infection of other people, because the whole premise is based on unregulated virality. Much of what the platforms are doing is moving from a position of: we're not responsible, people just have to make their own choices; we're just handing out huge megaphones, or, in the epidemiology example, huge infection vectors, to let people infect as many other people as possible, and it's up to you to get your own hazmat suit; let alone the fact that it dumps all this digital fallout onto the balance sheets of society. But we're starting to see very small moves now. People talk about circuit breakers: Twitter, I think, is now asking, "are you sure you want to share this?" if it knows you haven't actually read the article. That's kind of like wearing an epistemic mask: it's slowing down the rate of transmission of false information. Educating each of us that there's actually more false information out there is kind of like dosing all of us with vitamin C, making us more cynical consumers. I think we need to ask what would make this entire ecosystem epistemically resilient to a corrupt information ecology. Because one thing we talk about in the film The Social Dilemma is that this isn't just some recent thing where there are a few new bad apples, and if we could only get the perfect set of mallets to do the whack-a-mole on today's bad apples, we would have a perfectly functional democracy again. The only way you fix a place with such low trust and so many problems is by actually rewinding the clock 10 years and saying: we are now 10 years into society going through this mass warping. It's not, as the technology companies will often claim, that they're just holding up a mirror to your societies.
If you don't like the conspiracy theorists you're seeing in the mirror, they say, we're really sorry to tell you, those are just your conspiracy theorists; if you don't like the hate speech or the bullying or the discrimination, we're just holding up a mirror to that. In fact, they're holding up a funhouse mirror that intrinsically expands and rewards those actors who operate in less constrained, less ethical, less thoughtfully regulated ways. And that's really the core problem: the entire system has been rigged for this mass warping effect. One of the things I'm excited about is less in terms of platform actions, even though they need to do a lot more. If I look at the growth rate of the harms (addiction, mental-health issues, conspiracy thinking, hate speech), and I look at the countries these platforms operate in and the number of languages they operate in (Facebook operates across, I believe, 80 elections this year), do you think they have engineers on staff who speak the hundreds of different languages in, let's say, Ethiopia, where there is thought to be genocide number two, after Myanmar was genocide number one, happening on Facebook? The growth rate of these harms is far outpacing the growth rate of the product changes we're seeing made, or the legislative changes we're seeing from government. So as those lines diverge, the thing I am more optimistic about, the thing I think scales to the challenge, is really culture: we need a cultural awakening, a recognition that we have been bombed by a business model. This is sort of the Pearl Harbor attack on the United States of America and around the world, not by one bad actor, but by leaving the doors open to the mass corruption of our society. We need to recognize what's happened to us and rewind the clock to a better time.

But my question is: is the activity we're seeing, however small and however late it may be, a sign that that cultural awakening is happening? Karen, let me ask you: we've been asking for a while for social media companies to police content in some way, and now they are. So should this be considered progress?

It's a great question, and I guess what I would say is that I would much rather they not be looking at individual pieces of content; not that they should minimize the individual pieces of content they need to look at, but that by starting the process much sooner and making systemic changes, they wouldn't wind up in this position behind the eight ball. So what do I mean by that? We just released some research this week showing that if you look at the outlets, the websites that pretend to be news outlets yet violate all the traditional criteria you would have for journalism, that repeatedly publish provably false content, for example, those outlets have doubled their interactions online. Another set of outlets we looked at don't gather and present information responsibly, violating basic journalistic standards; those have increased their interactions online 300 percent. The platforms know who these outlets are, and they're generating enormous amounts of traffic. It would be so much easier for the platforms to identify the repeat offenders, to look at the ones producing risky content that's going to cause harm and reduce the amplification so it's not spreading so widely, to look at some of the pages coordinating to promote this content,
and to look at some of the groups, before there's a danger of harm. An example of where they waited way too late was the fires in the Northwest. There was a rumor, started in part thanks to some Russian outlets, that antifa had started the fires. The local sheriff's office was overwhelmed by 911 calls as a result and had to put out a statement; people were staying in their homes because they were afraid antifa was going to steal their homes; militias were setting up checkpoints, not letting people leave, because they wanted to check if they were antifa. The FBI put out a statement, and then, only then, did Facebook say: uh oh, we think there's a risk of imminent harm. So instead of waiting for that moment when things are controversial, they need to root out the supply chain of disinformation a lot sooner.

Right, so thanks, Karen, for that. Hany, let me ask you: you talk a lot about, and study, algorithmic amplification. Is it possible to do what Karen is suggesting when you're fighting against this algorithmic amplification?

Sure. To your question, by the way, I do think social media has gotten better, but there's a playbook we've seen over and over again for two decades: you deny the problem exists; you admit it exists but claim it's smaller than people say; you admit the problem is there but say there's nothing you can do about it; and then eventually you get around to actually trying to solve the problem, but you do it half-heartedly, as a one-off, without really creating systemic change. Now, to your question: when Facebook and YouTube and the social media companies tell you how hard it is to deal with the massive amounts of data they have, would you please ask them: why is it then possible for you to profit to the tune of tens of billions of dollars doing fairly complex data analytics, highly targeted advertising, and highly sophisticated inference about people's behavior? You can't have it both ways. You can't do very, very good data analytics to profit and then say there are no data analytics to minimize harm. There are absolutely mechanisms that can be put in place to make sure viral content is not spreading before the harm is done. They have to do better at making sure bad actors don't simply come back to the platform. The problem is these platforms don't have any friction in them: bots can create accounts extremely easily and rapidly, over and over again; you get kicked off the platform, you come back again. There's very little friction in the system, and when there's no friction in the system, it is harder to manage. So they could add friction.
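[Editor's note: a toy sketch of the kind of signup friction being described, not anything the platforms are known to run. The per-IP limit, the window, and the fallback checks are invented for illustration.]

```python
import time
from collections import defaultdict, deque

# Hypothetical friction rule: at most 3 new accounts per IP address per day.
MAX_SIGNUPS_PER_IP = 3
WINDOW_SECONDS = 24 * 3600

signup_log = defaultdict(deque)  # ip -> timestamps of recent signups

def allow_signup(ip, now=None):
    """Return True if this IP may create another account right now.

    Expired timestamps are dropped from the sliding window first, so a
    burst of bot registrations from one address is cut off quickly while
    ordinary users are rarely affected.
    """
    now = time.time() if now is None else now
    log = signup_log[ip]
    while log and now - log[0] > WINDOW_SECONDS:
        log.popleft()
    if len(log) >= MAX_SIGNUPS_PER_IP:
        return False  # add friction: CAPTCHA, phone verification, or a delay
    log.append(now)
    return True

# Example: the fourth signup from one address within a day is refused.
for _ in range(4):
    print(allow_signup("203.0.113.7"))  # True, True, True, False
```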
But the problem, of course, is that we are asking them to do things that are counter to their financial interests. As Karen just said, they profit when these things go viral and get more and more engagement; that's the underlying business model. So we are asking them to do things that are against their business interest. And to your question specifically, Willow: should we trust them? Absolutely not. There needs to be oversight to make sure they are protecting societies, democracies, and individuals, because the last two decades have shown that they're either incapable or unwilling to do it.

You're bringing me to a question that came in from the audience, a great question and a direct follow-up to you, Hany; and anybody who wants to weigh in on this, please do. The question is from Ariel Yael: if standardized best practices, policies, and regulations were to be created and enforced across all platforms, who should be entrusted to sit at the table (Hany, to your point about lack of trust), and what should their priorities be?

Well, let me see if I can kick that off. First of all, I think it's a great question, and one of the things that complicates the answer is that these are not US companies; they're global companies. We in the US can't be making decisions for companies that have an impact on seven billion people around the world. So who do we need at the table? I want people who represent the free-speech, open-society, everything-goes-on-the-internet position; I don't agree with it, but I think they should be at the table. I want smart lawyers, I want ethicists, I want policy people, I want technologists, I want human rights workers. I want the people who span the space of implications of social media for individuals, societies, and democracies. And that has to be a global effort; it can't just be a US effort.

Niloofar, I think this brings us to you and your expertise, because you actually have experience with other countries working to, just as an example, contain the spread of disinformation. So what do you know about the ways other countries (sorry, I said companies, but I meant countries) have worked effectively, that might answer the question of who needs to be sitting at this table?

Well, I wish there were a silver bullet, a country we could point to that has effectively stopped this. And I agree, by the way: the documentary The Social Dilemma is phenomenal. If you think about what's happening with social media as a public health crisis, you can look at how we've managed past public health crises, and they can be very instructive as to how this could work. Look at tobacco in the 1980s, when the industry tried to self-regulate to avoid litigation and regulation: it was a complete and utter disaster. What they ended up doing was launching programs in dozens of countries that were nominally about educating folks about the ills of tobacco, but what they actually did was use the most ineffective tactics possible and suppress the effective tactics. In fact, Philip Morris, horrifically, convinced the California Department of Education to distribute tobacco-sponsored materials to California schools, which was nothing less than free advertising. At the end of the day, when all these industry programs were evaluated, they showed that not only did they not prevent smoking, they actually encouraged smoking. So self-regulation, I agree with Hany, absolutely doesn't work. And if you look at what the platforms have been doing, it's not sustained, it's after the fact, it's lacking urgency; the wildfires are a great example of trying to do something way after the fact. As Hany said, it is simply not in their economic interest to self-regulate. Now, because there's no silver bullet, I do think it's about more than regulation and policy here; I think there's individual accountability too; we need a whole-of-society response. If you look at what worked with tobacco, it was three things: explicit education about the hazards of smoking, including warning labels; regulating advertising and prohibiting advertising aimed at children; and taxing usage. Those three things are super interesting to me.
Education is critical to all of this. If you look at what happened in Ukraine and how Ukrainians have gone up against Russian disinformation campaigns: they developed, with an organization called IREX, a Learn to Discern program that the Ukrainian Ministry of Education uses to combat these campaigns. By the way, IREX is now bringing those programs to the US for teacher and school training. So education and individual accountability are critically part of this. Taxing usage, to me, is fascinating because of what it implies in the context of social media: we can't impose a cigarette tax; you can't just tax the user, because essentially it's free right now. So it requires changing the business model from an advertising model to a fee-based revenue model as a way to tax the usage. And part of what we're trying to do is tamp down the amount of crazy speech that's going up, and then control what's happening to our children. Hany brought up a great example with child sexual abuse material: there is one agreed-upon social norm online, which is that child pornography is bad. When we can align around a social norm, we can develop policies and regulation. When we don't align around social norms, it's very hard for a regulatory process, for our political process, to establish the right frameworks.

Let me ask you about that: how in the world, at this moment, do we think about establishing global norms, norms that cross borders and cultures and business models and all of it?

Should I try that, Karen, or do you want to? Sure, yeah. So I think one of the things the US did in the early days of the internet was that we really led. We developed a policy framework in the US, and then we sold it to Canada, and then to Europe; and through the OECD, where I was most recently the ambassador, and through the WTO, we convinced other countries to come together. If we decide to take a policy approach to this, other countries won't take our policy framework as a cookie-cutter, but they will learn from it and they will follow it. And I think it's really useful to think of two buckets of problems, maybe even three. The first two involve things that are not such a big First Amendment issue (or such a big free-expression problem, as you would say in the rest of the world). What do I mean by that? A lot of what we see on the internet is manipulative: it's laundering content to make it more credible to users, so they think it's coming from a friend when it's really coming from a bot, or they think it's coming from a newspaper when it's really coming from a content mill. This kind of thing is a classic consumer-protection issue. We can update our consumer protection laws so that dark patterns are a consumer-protection violation, and so that taking your data and tricking you in hidden ways is a consumer-protection violation. We can update our civil rights laws so that some of the same protections you can expect against discrimination offline, you can expect online: if you're an African-American woman and you need to be on Twitter to do your job, should it be okay that you can be harassed to the point where you can't be on Twitter without being threatened with murder? So update our civil rights laws. And there's a great bipartisan bill in Congress called the Honest Ads Act, which would update our campaign finance laws for the internet. So there's a bunch of stuff we can do that's less of
a free-expression problem and more just classic: let's realize that the internet is the real world, and update our offline protections for the online world. Then there's another set of issues where it's much harder; it gets into a whole bunch of free-expression issues, and there we sort of let the platforms do their content moderation on their own. I think what some folks are starting to talk about, with Twitter starting to model it, is: should there be some practices we agree to and are transparent about ahead of time, so we're tying our hands a little bit for when the real emergency comes? And should we say it's not just a question of whether it's going to cause violence tomorrow, and not just a question of whether it's absolutely provably untrue, but that this thing could be dangerous if it goes viral, so let's put in place some practices that will change that? The industry can come together and agree to some of those practices, with a little bit of concerted pressure, I think, from folks around the world. The third bucket gets at this idea of taxing. One of the problems with what goes on the internet is the vacuum that has been left by journalism. Journalism has been absolutely decimated, in part because the ad revenue that supported it has moved to the platforms. So I think we need to consider: could there be a tax on online ad revenue, creating some kind of support mechanism for real journalism, public-interest journalism, journalism that's not motivated only by political ideology or immediate clicks? That's something we need to think really hard about. So I would look at those three buckets, with the US playing more of a thought-leadership role in the world.

Tristan, what is your take? You've spent a lot of time in Washington talking to folks. What is your take on some of the solutions or suggestions that Karen is advancing, particularly this notion of getting the industry together to develop a set of practices they all agree to?

Yeah, this is an incredibly difficult area. I appreciate the comments Karen has already made, as well as Niloofar's comment about Big Tobacco. I think Big Tobacco is interesting to look at because it's a classic example of placing responsibility for the issue onto individuals. They even created a fake lobbying group called Citizens for Fire Safety, so that when apartments burned down due to fires from cigarettes, they could blame the furniture for being too flammable, as opposed to the cigarette. It's a classic shift of responsibility. Something that has not been talked about yet: I think what we need here is equivalent to the level of reform we had after the financial crisis. If you had asked back in 2008, what is the one bill or one law or practice that's going to fix all of this, the answer is that we needed comprehensive financial reform, akin to the Basel III accords and Dodd-Frank, to move away from a comprehensively over-your-skis, risk-creating, over-leveraged system of banks selling junk as if it were real assets. All of those issues had to be dealt with as a collective. And similarly, with technology, we have a comprehensively predatory, extractive, fragility-creating tech environment. An example of this: it's similar to how we regulate banks. We say, okay, there are acceptable risk ratios around how over-leveraged, how
far over your skis, you can be in terms of lending money out if you're a bank. If you were lending out a trillion dollars for every dollar you had, we would call you an unsafe bank; we would shut you down. One way I think about Facebook is that the premise of the automated business model, what makes these operators so profitable, is that they don't have to hire human editors who determine what's true or credible or real; they can just depend on the algorithms to sort it all for people. As well, they don't have to pay journalists to create content, because they dupe each of us into being narcissistic attention-seekers posting about stuff, which we'll do for free, and then they profit from that activity, again using automation. The way this nets out is that, as these systems create harm, they're lending out capacity of billions of items of content; they're billions of items over their skis for every one moderator they might have who speaks a given language. So, just as we have safe operating thresholds for banks and for other industries, we could have safe operating ratios for virality: how viral should you be able to make your product? How quickly should something be able to spread, if we know there are pathways by which very unsafe things, things that are literally leaving people to die, are going viral at high rates? And we could shut down surface areas that have too high a risk ratio. A clear example of this is when Facebook shut down Trending Topics. They used to have, on the right-hand side of the interface, the trending topics, the things trending the most. Within two days of firing all the human editors and curators, I believe two of the eight top stories were actually false trending topics, and instead of trying to fix it, they realized they couldn't fix it: it was systemically gameable; bad actors could manipulate it. Sacha Baron Cohen and some others have been pushing a campaign aimed at Twitter called #UntrendOctober: will they actually turn off these high-risk interfaces, the equivalent of those sections of bank activity that are far too risky, for sensitive periods, before the election and through all of October? Turning off unsafe YouTube recommendations, turning off advertising for different parts of the services, turning off trending topics, and certainly turning off suggested groups on Facebook. Why should we have algorithms recommending QAnon groups to people by accident, which they have been doing in droves for many years now? We should be turning off these high-risk surface areas. So that's one aspect of seeing it through a financial-crisis lens.
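[Editor's note: a toy sketch of the virality "circuit breaker" idea just described, in the spirit of the bank-style risk ratios; the growth limit and volume floor are invented numbers, not a real platform policy.]

```python
# Hypothetical safe operating threshold: if a post's reshares grow more
# than 10x hour over hour once it passes a minimum volume, pause further
# algorithmic amplification and queue the post for human review.
GROWTH_LIMIT = 10.0
MIN_VOLUME = 1_000

def circuit_breaker(shares_prev_hour, shares_this_hour):
    """Return 'allow' or 'pause_for_review' for one post's next hour."""
    if shares_this_hour < MIN_VOLUME:
        return "allow"  # too small to matter yet
    growth = shares_this_hour / max(shares_prev_hour, 1)
    return "pause_for_review" if growth > GROWTH_LIMIT else "allow"

# 50 shares last hour, 5,000 this hour: 100x growth trips the breaker.
print(circuit_breaker(50, 5_000))  # -> pause_for_review
```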
The second way: much like in the financial crisis, where we had junk assets being sold as real assets, in this case, in the attention economy, we started running out of attention, so we started slicing people's attention into thinner and thinner fake slices. We had you pay attention to three things at once: your television screen, your tablet, and your phone. So now we've tripled the size of the attention economy, but the quality, the grade, of that attention is junk. And we're selling that fake attention, with fake clicks from bots (essentially fake users), in fake reporting to advertisers; Facebook actually over-reported to advertisers how many views videos got, by something like 900 percent in one study. So, much like in the financial crisis, I think we have a systemically fragile attention ecosystem that has led to fragility throughout the system, and the way to clean it up is by using this lens: what is fragile, harmful, or unsafe, and what is the acceptable operating threshold?

I'd like to turn to the turn-it-off model; Hany, I see you nodding. The turn-it-off model sounds like something right out of the Dark Ages, like when you had kids who watched too much television and the very simple solution was to take the remote control and turn it off. Hany, I noticed you nodding.

So, first of all, I don't think it's out of the Dark Ages, and here's why. When the Boeing 737 MAX planes fell out of the sky, what did we do? We grounded every single one of those planes around the world. When we found E. coli in romaine lettuce, what did we do? We pulled every single head of lettuce off of every store shelf across the country overnight. When there are risks to society, whether from physical products or not, we should respond. I think very few people doubt that social media has been weaponized against individuals, societies, and democracies. So here we are, two weeks out from a national election: why not shut it down? Why not just say: for the two weeks before and the 72 hours after, we are concerned about how these platforms will be used to disrupt a national election, so we're going to shut it down and play it safe. Maybe we're overreacting; okay, let's overreact, let's play it safe. I don't think, by the way, that we're overreacting. So I don't think it's draconian, and I don't think it's out of the Dark Ages. And there is a precedent: in many countries in Western Europe, you are not allowed to have public discussion of elections on the airwaves for a week or two weeks prior to the election. So there is precedent for this, and I advocate for it very strongly.

Okay, so we know that reports from Gallup and Knight say that four in five Americans are concerned that misinformation on social media is going to sway the outcome of the 2020 presidential election. Eighty percent. Should that 80 percent be demanding that social media just turn it off? Is that a solution?

I think it is. By the way, there's no reason they can't do this, other than that they don't want to, because they'd lose money. And the real reason is that people would figure out their lives are better without social media. The problem with taking an addictive, awful product away from people is that when people go through detox, they come out the other end saying: I'm not going back to that. But I think it's a perfectly reasonable thing to do.

Talking about the election, if you don't mind, I'm going to stay on that topic a little bit. Niloofar, what's going to be different this year from prior elections? What have we learned? And do 80 percent of us have real cause to be concerned?

Well, a couple of things. First of all, I do think there's a way to put controls around political speech, as distinct from all speech. The problem with shutting it all down is that you're not just shutting down political speech, you're shutting down all speech, and there's the First Amendment that we have to contend with. And since everyone's been using the financial services industry as an example, I actually think the anti-money-laundering regime in financial services is pretty instructive here, because there, we as consumers, as bank account owners,
very specifically trade off our privacy in order to prevent money laundering: the banks report on our activities when they hit certain thresholds, but we're okay with that, because there's a very direct, palpable benefit to us. So part of the problem here, and this is where education comes in, is that people have to understand what the harms are. We need transparency, we need auditability, we need reporting, and we need to establish standards for researchers to be able to access all the platform data and expose it, in order to get to a place where we could shut things down and people would believe that's a good thing to happen. Now, for the election, what I'll tell you is that we know 2016 was not a great year: there were effective influence operations and widespread disinformation. In fact, between two of the presidential debates in 2016, The Atlantic reported that about a third of the pro-Trump tweets and about a fifth of the pro-Clinton tweets were generated by fake, automated accounts. And USC did a study with Indiana University showing there were about 48 million fake Twitter accounts, somewhere between nine and fifteen percent of all active accounts, that didn't belong to real people.

I'm interrupting: that's our professor, Emilio Ferrara, so thank you for the shout-out.

Exactly. So we know these operations are effective, they're cheap, and they're below the threshold of war, so we don't have a great escalation policy for retaliating against them. But with the midterms in 2018, the United States, and the intelligence community specifically, made it a top priority to stop that activity, specifically with respect to the Russians. The United States set up a special task force called the Russia Small Group, and it represented a pretty clear change in approach. The Russia Small Group pulled experts from both NSA and US Cyber Command to share indicators of compromise, to enable DHS to harden the security of election infrastructure, and to enable the FBI to bolster its efforts to counter foreign trolls on social media platforms. And it worked. In addition, as General Nakasone, who heads NSA and Cyber Command, recently testified to Congress, the other thing they did, which was fascinating, was to send Cyber Command personnel on several forward hunt missions, where foreign governments had invited them to search for malware on their networks. Thanks to those efforts, we were able to undermine the efforts of the Russians and basically stop influence operations in the midterm elections. All of that and more continues to happen; Cyber Command is continuing that work. In the last few days, we've seen a lot of headlines about both Cyber Command and Microsoft stopping TrickBot, which was an attempt to interfere with the elections. So there's no question that as a government we're starting to organize; we're not going to take it lightly when foreign adversaries want to interfere with our most precious democratic process. With all that said, the key players are the social media platforms, and at the end of the day, without their full participation, whether voluntary or compelled, there's only so effective we can be. But 2016 is not going to be repeated. By the way, for 2020 it's not just Russia anymore: we have China, we have Iran, we have a whole bunch of actors that have
learned the playbook and are trying to interfere, and I am incredibly proud of the work the intelligence community is doing to stop them.

And are you confident that they will succeed in stopping malicious actors in the next couple of weeks?

So, what I'm speaking about is the foreign actors, specifically the Russians, the Chinese, the Iranians. I am confident that we will be very effective against them. But we have a domestic problem as well, and it's indeed a problem, and that domestic problem is not something NSA and Cyber Command have any authority over.

There's a question coming in on the chat that I'm going to toss to some combination of Karen and Tristan, on this issue of what we should do. What are the first steps leaders can take, as government and corporate governance align with human well-being? That comes from Fey Feemi.

Well, one thing: I just feel I need to go back to our previous conversation. I don't think anybody meant this, but I want to stand up for internet freedom and against shutting down the internet. The US sets an example for the world, and for a lot of the world, access to the internet means access to information; it means getting around censorship; it means getting around government surveillance. But I think you're raising such an important point, which is that we can't have it at inexcusably huge amounts of risk, so we have to think about how we address the risk. We shouldn't be convinced that there's no way to have this wonderful jewel called the internet without all this harm; there has to be a way to mitigate the harm so that we can enjoy the benefits without risking our democracy.

Yeah, thank you; that's an important point, because to your point, we were not suggesting shutting down free speech.

Right, exactly. And is the issue of who should be at the table the question, Willow?

Well, there was an earlier question about who should be at the table; this one is more about the first steps leaders can take to make sure that governance is aligned with the well-being of humans.

Right. From my perspective, a lot of what we need to do is separate the folks who have a business model of getting more and more traffic from the people who are thinking about the user and, separately, not just the individual user but the democracy. Our old model came from the world of journalism: journalists devised, to their great credit, after World War II, a model of public-interest journalism that was going to serve the greater good, with a wall between the business side of journalism and the editorial side. But that arose at a unique time in our history, when we had a lot of newspapers, a lot of different outlets, and a lot of competition. In this environment, I think we really need some folks whose job it is to think about democracy, and to think about making sure the user isn't manipulated and isn't discriminated against. The government needs to be setting some basic ground rules, and then, when the platforms devise their policies, those policies need to be
transparent and, as somebody said before, auditable. We can't be in this black box. You know, we're all grappling for metaphors from other industries, and I like all of them, and I keep thinking about the black box after an airplane goes down. The people from the National Transportation Safety Board get to take that black box and find out what happened: why the crash happened, what was being said in the cockpit. Then they do a report, they go back to the regulators, and they say: here's what went wrong, here's what you need to fix. We have no insight into what's going on inside these platforms. The only reason we have all the data we do about 2016, so that we know the Russians were pitting one group against another, is because the Senate Intelligence Committee demanded it and got it, in that one instance. We're not going to get it again unless we have some transparency and auditability. I think that's going to be a key part of making sure the platforms do this kind of content moderation in a way that really reduces risks to society.

Is it clear that the future of digital media is a regulated one, given the climate right now? Hany? Tristan?

Yeah, can I jump in after Karen? So, I think there are several different structures here. Justin Rosenstein, the inventor of the like button, who's in the film The Social Dilemma, has talked about how, instead of a board of directors, we need a board of the people, because these platforms really are taking over vital life-support systems of democracy, whether it's our public square or children's education: parents are sitting their kids in front of YouTube, and YouTube isn't hiring child psychologists to have it be designed for children. They are now infused into every life-support system of democracy; they have become the new social infrastructure of society; they are our new digital habitats. I think this has never been more clear than in the coronavirus era, when we literally spend hours of our time on our screens, so any way in which technology is not aligned with society is going to be felt to a greater degree. It's almost like ergonomics: you don't really care about a badly shaped chair if it's slightly misaligned and you only sit in it for 10 minutes, but if you sit in it for 10 hours, you're really going to notice any little misalignment. And we're all really noticing the comprehensive misalignment of technology right now. So what could we have? We could have a board of the people instead of a board of directors, where the tech companies have to report to the people quarterly, with transparency as basic table stakes, where we the people get to say: okay, we care about addiction, we care about misinformation, we care about conspiracy theories; we get to define what the key problems are and get the measurements back from the companies. As everyone has said here, we often don't find out how big these problems are until it's too little, too late. It would be as if Facebook were the Exxon not of oil but of human anxiety, except that instead of just being the extractive oil company, they also own all the observation satellites that measure how much methane and CO2 is out there. You can't have vertically integrated measurement of the problem, where only you know about the problem and you also create it; that's totally not going to work. I wanted to note that Andrew Yang was the one candidate running for president who actually proposed an entire Department of
the Attention Economy at the federal level, which could convene these kinds of civil-society groups to come to the table together and ask: what are the values we care about, and what are the quarterly metrics we want to reduce, and how do we do that proactively, not reactively? Because, as Karen said, black boxes are wonderful in airplanes, but they come into play after the crash. We are going through a democracy crash, and we don't want to have only the black box after it has all happened. And I think, as Niloofar and I have both expressed at a similar conference in the past, the zoomed-out view here is the problem of humanity: as E.O. Wilson said, we have Paleolithic emotions, medieval institutions, and accelerating god-like technology. When you have accelerating god-like tech that's creating harms way faster than your medieval institutions can steer, you're going to crash; if your steering wheel is lagging behind your accelerator, you're going to crash. So we need a proactive steering wheel. We have to upgrade our medieval institutions, embrace our Paleolithic emotions and become self-aware of them, and then actually have the wisdom to wield more god-like technology. I think that can be done through a combination of a federal-government implementation and a board of the people, one that really has to speak for the people who are most harmed by these issues.

All right, this board of the people... You guys have a minute and a half left; go for it.

Let me add two things very quickly, if you don't mind. First of all, I love listening to Tristan; I could listen to him all day long; he's terrific. First: you can shut down Facebook and Twitter without violating our First Amendment rights. You don't have a First Amendment right to freedom of speech on Facebook and Twitter, and I think doing so 72 hours before and after the election is a modest step that doesn't infringe on our rights. And to your question, Willow: regulation is coming. We may disagree on why, or what form it will take, but it's coming in Europe, it's coming in the UK, and it's going to come in the US. It's just a question of whether we do it in a smart way that is effective, or in a way that does more harm than good.

Niloofar, go for it.

Just really quickly, I was going to throw in another favorite Professor Wilson quote: "people would rather believe than know" is one of the things he said, and that's one of the fundamental problems here. We need to teach people; they need to know; there needs to be transparency around what's going on, and education is a huge part of this. And given the polarized, tribal society we live in, and the fact that whatever regulation we come up with has to be multilateral, because this is a global commons, not just a US commons, I don't know how much faith everyone has in our ability to resolve these issues. This is where I think individual accountability, responsibility, and education matter so much; we can act on those much faster than we can get Congress to move or get nations to come together. We have to do those things as well, but education is 100 percent in our control.

I'll take it. On that note: education is in our control, and we're doing our best over here. Thank you all so very much. I'm really excited about next year's conversation with you all; let's see where next year's talk takes us. Thank you, Hany Farid, Tristan Harris, Karen Kornbluh, and Niloofar Razi Howe. Thank you all very,
very much. Thank you so much. [Music]
Info
Channel: Milken Institute
Views: 254
Rating: 4.33 out of 5
Keywords: Milken, Institute
Id: Kr36A0txGhs
Length: 61min 28sec (3688 seconds)
Published: Tue Feb 23 2021