Tristan Harris Says Tech Companies Have Opened Pandora's Box

Captions
Back to our top story: Wednesday's Senate hearing on fighting online extremism. Facebook, YouTube, and Twitter are all feeling the heat from lawmakers. Facebook's head of product policy and counterterrorism touted the media giant's efforts; take a listen: "We now have more than 7,500 people who are working to review terror content and other potential violations. We have 180 people who are focused specifically on countering terrorism."

Now, Facebook's efforts against online extremism aren't the only thing the company is having to defend lately; the very nature of its business is being questioned. Our next guest called them, quote, "a living, breathing crime scene" for what happened in the 2016 election. Tristan Harris is a former design ethicist for Google and has been called "the closest thing Silicon Valley has to a conscience" by The Atlantic magazine. He joins us now here in the studio. Tristan, great to have you on the show. So, what did you hear today that makes you think things are any worse or better than perhaps you would have thought?

Well, I think no matter what these companies say they do, they've created and unleashed a Pandora's box. Facebook has five million advertisers cycling through the network every single day. There's no way to check how five million advertisers get matched, at hyper-micro-targeted characteristics, to each individual user. There are billions of channels, basically, on the new TV, and there's no way to know, or be accountable for, all of that complexity. So they've unleashed this civilization-scale mind-control machine, and they don't even know what thoughts it's pushing into two billion people's minds. As Roger likes to say: about two billion people use Facebook, which is more than the number of followers of Christianity; 1.5 billion people use YouTube, which is more than the number of followers of Islam. And these products have that much daily influence over that many people's thoughts.

It certainly is massive. What do you think of the changes Facebook recently announced about focusing more on content from friends and family in the news feed, as opposed to news?

Well, it's a step in the right direction. Zuckerberg said, both in his recent post and in the recent earnings call, that they are embracing "time well spent" as the future direction of the company. That concept came from myself and my colleague Joe Edelman; we've been incubating the concept of time well spent, and calling out the problems of a time-spent-based economy, for the last five years. So I think it's great that they're embracing the concept, but the real challenge is that it goes against the advertising-based business model. You can't ask someone whose entire stock price depends on telling Wall Street, "We have this many minutes of people's days." It's just a multiplication: this many minutes times the average ad rates, and you get the revenue. So if they're going to say, "We're going to cut down on how much time people spend," they can do a tiny bit, but not that much. The real question is: are they willing to examine the business model?

You've called Facebook "a living, breathing crime scene" for what happened in the 2016 election. That is a bold claim. What do you mean?

Well, the point is that no one actually has access to what happened in the election; only Facebook has that data. So now the question becomes: can we trust Facebook to tell us the truth? Look back at what they've said since literally a year ago, when Mark Zuckerberg first said it was "a crazy idea" that fake news had any impact on the election, and then they continued to withhold, delay, and defer the release of information. First it was a hundred thousand dollars in ads. But then, as Roger and I and many other researchers, like Renée DiResta and others at Data for Democracy, did lots of background research, it came out that the Russia campaign influenced a hundred and fifty million people, and Facebook did not admit that until the day of the November 1st hearings. So if they're telling us that we should trust them to self-regulate, they've not really won our trust, and in that way it's a living, breathing crime scene.

Now, you first started calling attention to this when you were at Google, is that right? You worked there from 2012 to 2016, you wrote the first "Google memo," something you sent within the company. What were you raising alarm bells about, and what was the response?

Well, what I basically said in this memo, in 2013, when I was a product manager, was that I was feeling kind of frustrated, because I didn't think we were taking our responsibilities seriously. I made a presentation that basically said: never before in history have 50 engineers, 20 to 30 years old, living in San Francisco, where we are right now, influenced what a billion people are thinking and doing with their time and their attention. We've enabled this channel that's exploiting people's cognitive biases, exploiting people's psychology, and we as Google, not small little startups that are going to fix the problem, but Google and other large technology companies, have a moral responsibility to address this problem. The presentation went viral; it spread to twenty thousand people and became the number one meme in the internal culture-tracking system. I had a talk with Larry, and I've ended up working on this topic ever since 2013. This was way before all this stuff about fake news and elections and everything else; it was an awareness that these technology companies have a larger influence on culture, elections, and children's development than almost any other actor, political or otherwise. So how do we start to have that conversation, and actually be ethical and careful about how we're steering two billion people's thoughts?

How did Larry respond?

You know, I think across the company there is a real seriousness, a taking-to-heart of the message. I think Google's actually a very ethical company; they really do have good intentions. The real elephant in the room is the business model. The advertising-based business model means all of these attention-based companies, YouTube, Snapchat, Twitter, Facebook, are in the business of capturing people's attention. YouTube's stated goal was: how do we get billions of hours of watching on this product, for as long as possible?

Did he acknowledge that? I mean, did he share your views, or sympathize with them?

I don't recall specifically from the meeting with Larry. But I think the conversation tends to get avoided, because it's an uncomfortable thing to look at. Usually it starts as an innocuous thing. I remember the Chrome web browser started measuring how much time people spend in the browser, just because they wanted to know how much time people spend on the web versus in apps. That's a useful thing to know. But as soon as you start measuring how much time people spend in the web browser, suddenly all these young twenty-year-olds go to work and try to maximize how much time people spend on the web. You manage what you measure, and we have to ask ourselves what we actually care about. Should these products be designed for addiction, which is what they're designed for now? And this has huge public-health consequences for children.

What are the consequences? We had Jim Steyer of Common Sense Media on the show yesterday, and there really isn't a lot of research on how tech impacts children, shockingly, given that there's so much concern about it. We actually don't know.

Well, there's a great article by Jean Twenge that got a lot of traction, called "Have Smartphones Destroyed a Generation?" It talks about many of the cultural and social impacts of how addiction to smartphones has changed children's relationships and children's dynamics: when are they having sex, when are they going out; people are more isolated, more depressed. Just look at the dynamics. Snapchat, for example, shows the number of days in a row that you, as a kid, have sent a message to each of your friends. They invented this number. It's a persuasive, manipulative design technique to keep kids on the hook, to keep this ball getting tossed back and forth every day; if they stop throwing the ball back and forth, they lose the number. So kids start defining the currency of their friendship based on whether they're sending this empty message back and forth. And Snapchat is literally the number one way for teenagers in the U.S. to communicate, so you have a hundred million teenagers out there basically throwing empty messages back and forth.

Is any of this designed to help us, or is it just addictive? It's interesting you mention Jean's article in The Atlantic, because I remember reading it, and I also remember there really was no stand taken; it was, in my opinion, fairly neutral.

Yeah, in part, I think, because we don't have a lot of the answers here. But what we can know is the motivations. Look at what thousands of engineers at Facebook go to work to do every day: do thousands of people wake up and say, "Gosh, how can we strengthen the public square?" No. Thousands of people go to work asked to drive up one number, which is how much people are engaging with, and increasing the time they spend on, these services. I want to live in a world where the tech industry is actually about helping humanity, and there are a lot of ways they can do that. So we started this nonprofit, Time Well Spent, that's basically about changing that: realigning technology with human values and with what technology is supposed to be for. Why in the world would we not have it be that way?

Have you heard from Facebook? Have you heard from Mark or Sheryl?

You know, I've had lots of conversations; there are people in the industry who are my friends. Friends of mine started Instagram. I think there has been a reluctance to admit the extent and scope of the problem, and that the business model is the problem. There are a lot of good intentions, but we have to get clear that the business model of advertising is fundamentally misaligned with democracy. If the business model is "I have to capture your attention," that means it's better for me to confirm your worldview and give you things that agree with what you're thinking than to show you things that disagree. I'll do worse in the attention economy if I show that reality is more complicated, or different from how you think it is.

Would a subscription model for Facebook eliminate all these conflicts?

Well, certainly it would change who the customer is. If all of us are paying for the product, then you have thousands of engineers who go to work every day, and who are they working for? They're working for us, for the people who pay.

What about the people who can't afford the subscription? This has really leveled the world's playing field.

That's right, and that's what they'll say: do you want to introduce an inequality into the system, where only some people can afford to pay? But I think the challenge is actually that the advertising business model has indebted us to a whole bunch of cultural externalities, so we have to ask how much those cultural externalities actually cost us. How much does it cost us in terms of extra data-plan usage, for example? Is the "free" business model really free if you add up all of the costs to society, and to our data plans, and all this other stuff? Half of the stuff that we download on our phones is probably ads; if you just cut that out, we would save a lot of money as consumers as well. I think we have to figure out what that price point is, and what we're willing to pay for, because we can't really afford the current model. The reason Roger and I are doing this, that's Roger McNamee, Zuckerberg's mentor...

And you and Roger actually met on this show.

We actually met here, yeah, and it started this big project.

Yeah, this crusade.

Yeah. And I think we have to get clear on what it will cost us. The reason he and I are doing this work, and we're not really paid for any of it, is that we don't think we can actually survive, or afford to live, on this model; we can't just keep plowing down this road.

How do you think the danger, if you will, that you believe Facebook presents compares to what Google or Apple or Twitter present?

Well, the thing about Facebook is that it's the product that reaches the largest number of people. It is the primary surface area for disinformation campaigns, for bad actors to manipulate the public. It's the scale of Facebook that makes it, I think, the predominant source of the problem, but obviously YouTube and Twitter are also influential actors: Twitter influencing more of the media, YouTube influencing political beliefs and radicalizing people with conspiracy theories. But I do think Facebook is the one that deserves the most scrutiny, just because of its influence. Whether they want it or not, it's hard for them to take any action that we're all going to be happy about, because they have so much influence that anything they do will affect, negatively and positively, a large number of people. But that's what I mean by Pandora's box. They have now unleashed this machine that's automatically throwing thoughts into two billion people's minds, and it's doing this in Indonesia and Bali and in Burma, and it's contributing to genocides in certain cultures around the world, and the engineers at the company don't even necessarily speak the language of the places where it's impacting politics or culture. At least here we have a free press; we can talk about fake news, we can talk about some of these issues. Imagine countries like Burma, where they just got the internet overnight; now Facebook is the number one way people communicate, and there's a genocide, and the Rohingya are actually crossing over the border into Bangladesh. So they've unleashed this thing, and now we have to get really serious about what it would mean to try to protect the public square, not just here in the West but around the world.

So what are you doing, aside from speaking to the media? How do you think you're going to make this change happen?

Well, the first thing is creating a cultural awakening. People don't even think to question their relationship with these products, whether it's harming them. So the first thing is just going out there and making sure people understand this misalignment of incentives; we did that with 60 Minutes and TED and things like this. The second is engaging employees, because I really think that employees at these companies feeling unsatisfied with their leadership not taking this issue seriously is going to be the number one driver of change. Look at the #DeleteUber campaign: it didn't kill Uber's business metrics, it didn't kill their revenue, but it totally changed the culture. There was a shunning effect; no one wanted to work at Uber after that campaign happened, there was a huge exodus of employees, and they had to change their practices. I think the same thing is going to happen with Facebook. It's going to be really hard to attract and retain the best people in the world if you're not taking these concerns seriously.

It's made a lot of people very rich.

It has, and it is hard to walk away from. But what Roger and I have found is that there are actually a lot of people, alumni from Facebook who did make a lot of money, who agree with Roger and me. They agree because it's not an opinion; it's just the truth. But it is hard to come out against the source of what funded you. Look at Chamath Palihapitiya, who was VP of growth at Facebook and said, hey, we've built a mind-control machine; Sean Parker said the same thing, that he can't believe what we're doing to children. But I think more and more voices are going to start to come out, because they have to feel like it's safe to do so.

All right, Tristan Harris, former design ethicist at Google, now on this new crusade. Thank you so much.

Thank you for having me.
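The one quantitative point in the interview is the revenue arithmetic Harris attributes to Wall Street: total minutes of attention times the average ad rate gives the revenue, which is why cutting time spent cuts revenue. A minimal back-of-the-envelope sketch of that multiplication, with every input a hypothetical number chosen only for illustration (not actual Facebook figures):

```python
def daily_ad_revenue(users, minutes_per_user, revenue_per_minute):
    """Estimate daily ad revenue as users * minutes per user * rate per minute.

    All arguments are hypothetical illustration values, not real platform data.
    """
    return users * minutes_per_user * revenue_per_minute


# Hypothetical inputs: 2 billion users, 50 minutes/day, $0.0002 per ad-minute.
estimate = daily_ad_revenue(2_000_000_000, 50, 0.0002)
print(f"${estimate:,.0f} per day")  # prints $20,000,000 per day

# The tension Harris describes: any drop in minutes per user lowers revenue.
smaller = daily_ad_revenue(2_000_000_000, 40, 0.0002)
print(smaller < estimate)  # prints True
```

Under this toy model, "time well spent" and the stock price pull in opposite directions exactly as the interview suggests: the only lever a pure attention business has is the minutes term.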
Info
Channel: Bloomberg Technology
Views: 33,012
Keywords: Bloomberg
Id: nay5w-FC08Q
Length: 14min 10sec (850 seconds)
Published: Wed Jan 17 2018