Rappler Talk: Tristan Harris on fixing social media

Captions
MARIA RESSA: Hello and welcome, I'm Maria Ressa. Welcome to Rappler Talk. I'm really excited to be speaking with Tristan Harris. He is someone you would have seen in The Social Dilemma, but he also started as a design ethicist for Google, I believe, and he started the Center for Humane Technology — which was the first time tech people really started saying, "You know, what we've built is really dangerous." Tristan Harris, welcome to Rappler. It's so good to see you here.

TRISTAN HARRIS: Maria, it's really pleasant to be with you.

MARIA RESSA: Wonderful. Congratulations, first of all, on the success of The Social Dilemma — 38 million households. Tell us first, you know, what's happened with The Social Dilemma?

TRISTAN HARRIS: Yeah, well, it came out September 9th, and I think the filmmakers really wanted it to come out before the election in the United States, because everyone felt that it was very important that the world understand just what's happened to society. Why does it feel like authoritarianism is rising in so many places around the world? Why is everyone sort of polarized, disinformed, conspiracy-minded, outraged, with lower mental health, addiction, distraction — all of these features? Is this by accident or is it by design? The film came out on September 9th, and in the first 28 days — Netflix just released the numbers — 38 million households saw the film, which is probably closer to about 50 million people. The film was released in 190 countries and in 30 languages. So if you consider it as a precedent, I don't think there's been a documentary film that has resonated this broadly, this widely. We think of it like An Inconvenient Truth but for the tech industry, because it's not about the climate change of the planet; it's about the climate change of culture around the world, the climate change of democracy and mental health.

MARIA RESSA: Yeah, absolutely. I think it struck a chord with everybody because we've felt it, but this puts it out there. Tristan, let's talk first about the US elections, right? We've all been watching and been shocked at where it's gone. What role did the social media platforms play in this?

TRISTAN HARRIS: They were central, but I think the thing people need to get is that they have been central for the last 10 years. We are now 10 years into this mind-warp process. It really isn't about taking the whack-a-mole stick and asking, did we get all of the moles, did we catch this piece of disinformation Russia threw at us and that one China threw at us? That's really not the right conversation, because it's not about content moderation or whether we whacked the right moles. It's about whether the entire business model is about growing moles — putting fertilizer on the soil so that as many moles as possible grow. And that's really the problem. These services are, as you've said, behavior modification machines. They're an attention casino. The basis of the business model is to be able to post anything and have it give you the promise that it might go viral to millions and millions of people — regardless of whether it's true, regardless of whether you fact-checked it, regardless of whether you made it up or you're a bad actor trying to sow nefarious, conspiracy-minded thinking in society. I think that's the core problem: until we change the business model, we really won't have done anything. Now, to commend them, Facebook and Twitter specifically — more so than YouTube — had war rooms that did a much better job dealing with specific disinformation attempts that were raised. I believe Facebook had a metric for what they call incitement to violence, and one of the things that's alarming is that in the United States, I believe that metric went up by — I feel like I'm going to get the number wrong; it's either 450 or 45 percent — in the five days, I think, preceding the election.

MARIA RESSA: When I saw it, it was 52. That's right — they have some kind of score, from 400 to 700 or something like that.

TRISTAN HARRIS: So maybe that was what I was thinking; I think you're right, it's about 45 to 50 percent. You know, this is really problematic. But the overall thing that's especially becoming clear to me, since Biden has won the election, is that even if we no longer have an authoritarian use of social media — if Trump is no longer president come 10 weeks from now — there is still an information environment that has been broken.

MARIA RESSA: Yes.

TRISTAN HARRIS: Because even with, say, President Biden controlling the bully pulpit and commanding the attentional airwaves for what millions of people will be thinking about, he won't be using it the same way. I think one of the most pernicious things about the way Trump used social media is that the daily attention of the world was always on him, and this is what authoritarians do: they invade the private thoughts and feelings of the dinner table, of waking up in the morning, of driving everywhere — the big posters everywhere. That's what authoritarianism is. And social media has not changed the fact that, because of smartphones and social media, someone can use it in that authoritarian way, where it invades your daily thoughts and feelings in this oppressive and tyrannical way. So what I really worry about is that those conditions are still set. Even though Trump may be out of the picture, there is still a fundamental way in which social media rewards disagreement and outrage, because all of us are inconsistent. We'll say something, and we might have said something different four years ago, and with social media it's never been easier to build a caricature of what someone said four years ago, or two years ago, or a month ago, show that you're inconsistent, and then build a mob that will hate the cartoon version of you — because you're a walking contradiction. But guess what: that's the entire world. Social media has turned our information environment into one of grievances and cynicism and outrage and disagreeableness, and that is not going to enable any democracy to make choices, because you can measure a society by its ability to communicate and coordinate and make real choices together — and that's exactly what's being broken down. We'll get into all of this, I'm sure.

MARIA RESSA: No, it's perfect. So, of course, we know this firsthand, right? The bottom-up hatred and attacks on social media, top-down attacks mimicking the attacks on social media, and then the sandwich — death by a thousand cuts, killing democracy. I mean, it is really good to see that Biden and Harris won by enough of a margin, but still the effects are there, and we're coming up to our elections in two years' time — less than two years' time. So let me break that down into questions. The first is: we know now that lies laced with anger and hate spread faster and further. The Real Facebook Oversight Board and other groups that you and I are part of have already said that this is going to have to be changed — the design has to change. But the hate speech, the conspiracy theories that are seeded, that don't go viral immediately but then change the way we think — all of these things are still there. And all the people on both sides of, say, Black Lives Matter have changed the way they look at the world because of social media. You've pointed out the problem — what's the best way you can see moving forward?

TRISTAN HARRIS: Well, it's incredibly challenging, and as much as you and I both want to solve this immediately, I think the first step is that you can't solve a problem you don't fully understand. I think it was Charles Kettering who said that a problem fully understood is a problem that is half solved, and a problem that is not fully understood is unsolvable, because it means you're not reckoning with the actual space of the problem. I think we have to define this problem by the way social media intrinsically works: user-generated content, meaning anyone can participate, post anything, and have the promise of virality to millions of people. That power — an exponential broadcasting power now in the hands of everyone from 15-year-old girls with Instagram accounts all the way up to dictators, all the way up to regular people — has not been coupled with ethics or responsibility. And if you decouple the amount of power I have to influence other people from ethics, responsibility, or good-faith usage, that decoupling fundamentally produces a toxic environment. So, because it's a bottom-up system, we have to ask some really difficult questions. I think we have to ask: is a user-generated content model, where anyone can post anything and it can go viral to millions of people, and we don't verify who you are or whether your intentions are good or whether you're just trying to sow dissent — is that compatible with democracy? It's a very big question.

MARIA RESSA: Isn't that a basic one, though, Tristan? I guess because we're journalists — the gatekeepers were supposed to do that, right? Because people will lie. So is that basic question not something we know? Tech hasn't created a system like that, but do you think there is enough evidence now that tech will realize it needs to be a gatekeeper?

TRISTAN HARRIS: Well, there's tech that needs to be a gatekeeper, but then there's also all of us. Because, as you said — your background is as a journalist, right? — you said in our podcast together, on Your Undivided Attention, how the government had called you and wanted you to put the cameras on them, but you actually withheld the cameras, because you knew that would enable a coup to happen, I think is what you were talking about, right?

MARIA RESSA: Right.

TRISTAN HARRIS: That ethical discernment that says, "Should I do this or not? What will happen if I put attention on this?" — that's the job of a gatekeeper. Here's the problem: every single person who posts on social media is now a micro-gatekeeper contributing to a reality, except we are not acting like gatekeepers; we are acting like toddlers. Because the more you are a toddler, the more you yell and the more you scream, the more attention — rewards, likes, followers, comments — you will get. A system in which incentives, money, attention, and rewards flow to the loudest, most angry bully is not a system that will ever reward good faith, unless people stop acting like bullies and act like responsible journalists. But here's the problem: in the 1800s in the United States we saw yellow journalism, and it almost got us into a war, and we had to invent media ethics and journalism to prevent that from happening again. Now you can think of what's happening with social media as decentralized yellow journalism: each of us is incentivized with more rewards, more likes, and more hearts the more yellow we get, and it's giving us a sort of societal jaundice, where we're all turning yellow. And I'm saying this not to emphasize the problem over and over again; it's just that we have to reckon with the entire body of people participating in social media and how to get them not to participate as yellow journalists. And I see — I think you and I both do — if you look on social media right after the US elections, you see some of the top people in our society still yelling with outrage, still taunting: "Hey, you Trump supporters, you lost." The more that everyone in our society participates that way, the impossibility still stands, because those people will always get more attention than the calm, level-headed people who say very little — which is what I've done, for example. And I think that's a real thing we have to recognize. So how do we do it? We each have the power of gods now — god-like psychological influence. Each of us can broadcast to enormous audiences; I have 60,000 followers on Instagram or something as a result of The Social Dilemma, so I can reach as many people as a small TV station or a newspaper could just 20 years ago. But then I have to act like a newspaper or a broadcast TV station and carry that responsibility, and if I'm not acting that way, I probably don't deserve the privilege of freedom of reach. As we say, freedom of reach is not the same as freedom of speech. We are not all given a god-given right to a football-stadium-sized audience. One way to think about this: how would you feel if your ex-romantic partner was on the screen of a huge football stadium and could say anything they wanted without accountability? That's exactly what we have created.

MARIA RESSA: Right. So then the question is: do you do the gatekeeping before allowing individuals in — much more protective due diligence on the people who do get the power — or is it the platforms up top deciding, like the Section 230 debate over whether our platforms are media platforms? And I guess — you're right, Tristan — I'm very frustrated by this, and I want to see immediate solutions. So we went for quick, short-term solutions: fact-checking. That's why we're fact-checkers. But it's not enough. So, on both those fronts you're identifying — either everybody learns, which is an impossibility, or — tell me, which way?

TRISTAN HARRIS: Well, I think we're narrowing in on a few things here. For example, if we try to solve it bottom-up, then, as you said, we need to verify good-faith participants, and we can think about different ways of doing that. One is that Facebook shut down, I think, two billion fake accounts in one three-month period. They have three billion users, and if they shut down two billion fake accounts, you can definitely say, I'm pretty sure they didn't get all the right ones.

MARIA RESSA: Right, yeah.

TRISTAN HARRIS: So one of the problems is we have bad-faith usage because the accounts aren't real people; we have bot farms. If we add a gating function there — say we require a driver's license or a user ID, something that verifies you are who you say you are — that's one way of verifying authentic use. But then you could still have authentic people using it in inauthentic ways, which is exactly what we're seeing in many countries around the world. So how do you verify that people will use it in good-faith ways and act like responsible gatekeepers rather than yellow journalists? That is a problem I don't think we know how to solve. Alternatively, we can put the responsibility on the platform for, let's say, the most highly distributed pieces of content, but then people get upset at them because now they are the arbiters of speech, and if the Republicans don't like the way they censor certain people, you drive up even more polarization — political wars — which is exactly what we're having in the United States right now. And the more Facebook has, in many cases — you might think positively — taken down or at least restricted the distribution of inauthentic speech, it has bundled in some maybe good actors who were not bots, and then they wonder why they got censored. That drives up huge amounts of polarization in countries, because people think, "I'm getting censored." So if we make the platforms responsible, we run into that issue. If we make users responsible, we know that users really just don't act responsibly — even the most intellectual and accomplished people at the tops of our society still participate on Twitter in nasty, toddler-like ways, and I don't just mean Donald Trump; I mean people on all sides.

MARIA RESSA: Yes.

TRISTAN HARRIS: So I think we have to ask even deeper, more fundamental questions, which means recognizing that this is a machine that has hypnotized us into hating each other. I think that's the benefit of what The Social Dilemma accomplishes: it shows that it's not that the left is wrong or the right is wrong; it's that the entire system has made each of us see the worst parts of each other. If I play that forward, we can keep escalating toward civil conflict and then toward civil war, but that's not going to result in a better world. So we can either choose that, or we can choose to heal and de-escalate. But that means everyone has to choose it, and we need social media to change the design norms and the ways it promotes certain options and choices, to enable healing first. Instead of America first, let's have healing first.

MARIA RESSA: That's great. Let me take you then to the design. As you pointed out — Shoshana Zuboff wrote a 750-page book on surveillance capitalism — all of the problems we've identified stem from a business model that incentivizes hate. That's the incentive: the more hate, the more anger, the more conspiracy theories, the more dehumanization, the more otherness, the more money the platforms make. In The Social Dilemma, you actually show the creation of the machine from Ben's data — that was very well done, the way you guys did that — and that machine knows Ben better than Ben knows himself. Ben is the kid, guys, for those watching, whom the machine learning targets; he was radicalized. So this machine radicalizes us and, strangely enough, isolates us, right?

TRISTAN HARRIS: And those are layered effects, Maria, which you're so onto. The business model of attention and engagement doesn't just profit from hate and conspiracy theories; it also profits from addiction and isolation. So the more Ben — as you see in the film — gets isolated, it's a cascading effect: the more isolated you are, the more vulnerable you are to conspiracy thinking, because you're not interacting with real people, and the less you want to engage with real people, because they don't see the world the way you see it — you know the real answer and they don't, they're the muggles — and so the more you get radicalized. It's a self-rewarding, reinforcing process. And again, I think we can change this if we realize what heals us. I was talking with a friend recently about someone they knew who had previously been radicalized by one of these alt-right groups online, and what actually helped them get out of it was that they got a girlfriend — because it turns out what we're really seeking is connection and meaning.

MARIA RESSA: And love.

TRISTAN HARRIS: Yeah. This is a little bit of a non-linear solution, but imagine if each of us had deep connection and deep community and deep love, because we actually got that in the real world. We are living in a mind-body meat suit that needs touch; we need affection, we need eye contact, we need connection — and social media profits from the screen-based version of that, which is not the same as the fulfilling in-person thing. Especially in a COVID world, where we're more isolated and can't spend time together, that makes us even more vulnerable to these effects. But personally speaking, I spend time with a very small community, and that really does, I think, help us heal and not get sucked into all of these things, because you find it more ridiculous and you don't trust it as much.

MARIA RESSA: That's fantastic. So you've sorted out the human behavior part that we're looking for. The problem, though — and this will be our last question, because there's so much to talk about, I know — is that you've drilled it down to this: how do we get to that from where we are? I'm part of this Forum on Information and Democracy; we're coming out with four papers of recommendations for how to regulate tech. I've been waiting — I think when we first met, we were sitting next to each other, and I was like, oh my god, if they don't move fast, I will feel this. And now I feel like if I go to jail, it is partly because of the information ecosystem; it normalizes it, right? So how do you get from where we are to the vision that you have?

TRISTAN HARRIS: I just want to say that I wish the tech industry had acted a lot faster over the last few years, and I'm really sorry for everything you've had to deal with personally in all of this — the fact that your reputation, and what people even believe about your situation, is the product of what random people post and what gets the most clicks, and what people can assert without you being able to respond. As we say in the film, fake news spreads six times faster than true news.

MARIA RESSA: Which — before you go there — we haven't even talked about the countries that are manipulating us: Russia, China, Iran, Saudi Arabia. Facebook took down, on September 22, a Chinese network that was attacking us, that was pro-China and was starting a campaign for the presidential elections for President Duterte's daughter. So we already see it coming. But anyway, please — how do we get to a more ideal or safer place from where we are?

TRISTAN HARRIS: This is a climate-change-scale problem, and we need climate-change-scale solutions. On the national security side, we have to realize that there is a new information war happening. If I asked how much the United States or the Philippines spends on national security — on physical borders, weapons, border patrols, passport controls, ID systems — it's billions of dollars; the United States will spend trillions of dollars on its nuclear arsenal and F-35s. Meanwhile, they left their digital border wide open, because when you protect your physical borders and have passport controls and then install Facebook in your society, you've just opened the border to anyone — they can walk right across, and you're only as good as the number of people Facebook has hired to deal with the national security of that country. We've recently used the COVID metaphor: everyone now knows the flatten-the-curve idea — there are only so many ICU beds, so we have to keep the number of infected people low. Facebook only has so many ICU beds for the nations that are being attacked, and recently, with the US elections, it devoted most of its election war room resources to the United States — this huge blob of a country sitting on top of a very small number of ICU beds — so when Azerbaijan or the Philippines or Myanmar need help, it only has so many researchers, resources, data scientists, and people on the integrity teams to deal with threats, whether from China or North Korea or Iran or Russia, as you said. So that's one of the issues: they have to spend billions more dollars on being a kind of state department for the world. Now, we have to ask: is that what we really want? Do we want Facebook to be in charge of all this, building a state department for the whole world — and whose interests are they representing? That's what's really become the problem: they are unelected, authoritarian managers of the global consciousness. I'm not a fan of making them the global state department for managing the national security of the world, but we have no other choice at this point than for them to at the very least try to protect every single nation — or to not even allow them into nations where they are not equipped to protect people. An important fact for you as well: the cost to protect an English-speaking user in a Western democracy is relatively low, because they have content moderation farms that know how to deal with that, and most of Facebook's growth has already taken place in the major Western democracies. So the growth is going to come from other countries, where the cost for, say, an Ethiopian user is much higher, because they have to hire content moderators for languages they don't have coverage for. As they bring on users from these developing countries, it's going to cost them more per user than it did in the early stages of their growth. We have to put that cost on them. No matter how much it costs, Facebook and Google are trillion-dollar corporations; they can afford it, and we have to make sure they pay for the harms that are showing up on society's balance sheet and not their own.

MARIA RESSA: Absolutely. Facebook made 21 billion dollars in Q3, right? So let me ask you one last question. In the Philippines, this is the fifth year in a row that Filipinos spend the most time on social media globally. Any advice for us — what can we do?

TRISTAN HARRIS: Well, I think the first thing is to recognize the intrinsic toxicity and unverifiability of this system. People have to realize that, as much as this is the only way to reach large numbers of people — you are using Facebook right now to reach, with this conversation, thousands of people, hopefully millions soon — we, with our work, are trying to reform social media and these technology companies, and we still have to use them, because they have monopolized the mechanism by which we reach other people. So we have to be self-aware of that; we can't just tell everybody to get off of it. As you've told me, one of the reasons usage is so high in the Philippines is the connections people have to the rest of the world — because you have laborers all around the world, people have to use Facebook. So if we're going to be forced to use it, we have to use it with incredible skepticism: most of the comments you see, if they're not from a close friend, just assume they're completely made up. You have to realize that disagreeableness and outrage are rewarded, so before you post anything of that flavor, ask yourself: why am I posting this? What will I get from this? Choose healing over conflict, choose de-escalation over escalation — because if we don't each individually do that, the system will continue to reward the opposite. And even if every single person watching this right now chose healing over more conflict, there would still be people who don't, and we will all still be affected by them. So it really is about a global consciousness that we have to raise. And frankly, we need to put pressure on the tech companies to radically change and to minimize the kind of unchecked virality that is intrinsically unsafe and toxic — even though it's great that people like you, or maybe me, can go viral and hopefully spread positive messages, we have to reckon with the intrinsic unsafety of the system, especially in places where the companies don't have the attention, integrity teams, or ICU beds, as it were, for those countries.

MARIA RESSA: Definitely. You know, I went from being the strongest proponent to a real critic. It's scary — it's scary where this can take us. Tristan, your last words before we close Rappler Talk?

TRISTAN HARRIS: I just want to say that I am such a huge fan of the work that you're doing, and of your courage, Maria, and your heart in everything you're doing, because it is a difficult, difficult situation, and I feel incredible regret that the tech companies have not been able to recognize their responsibility in the world more quickly, and to have fixed or tried to address these problems before they got worse. I do think, though, that the one thing people should realize is that the more people understand this problem, the faster it will change. And I will say that, as many years as you and I have both been working on this, I've never felt more hopeful than I do right now — with the combination of a Biden administration getting elected, our ability to reform the tech companies, and the possibility that we have 40 to 50 million people around the world, which is bigger than the civil rights movement or the gay rights movement, potentially to really drive change. So we're trying to mobilize more people at the Center for Humane Technology. It's really a global movement, and I want the movement to be able to see itself, because it's already way bigger than the technology companies, I think.

MARIA RESSA: And we're with you. Fantastic. Thank you so much, Tristan Harris. Guys, if you haven't yet seen The Social Dilemma, go on Netflix, watch it, and join us — we've got to fix this problem. This is all on us. Thank you, Tristan.

TRISTAN HARRIS: Thank you so much, Maria. It's so good to be with you.
Info
Channel: Rappler
Views: 1,274
Keywords: philippine news, news, news philippines, philippine politics
Id: kzROWn68BTA
Length: 29min 12sec (1752 seconds)
Published: Wed Nov 11 2020