The Role of Technology in Society | STYT

Captions
I like that we're kicking off with you, Tristan, sort of reorienting our compass as we dig into the future of technology. First and foremost, we want to get a sense of where we stand at the moment in terms of technology's intertwining with humanity, for all the good and bad that might bring. I know you've got a few slides; perhaps you can show them?

Yeah, what I thought we'd do is show just a couple of slides to frame where we are and how we got here. Can we switch to the keynote? I'll make this super quick. There are a lot of things that seem to be going wrong in technology and society: there's sort of hurricane Cambridge Analytica, hurricane fake news, hurricane tech addiction, hurricane polarization, outrage-ification. It's a bit like climate change, where you could say, "Oh my god, the coral reefs, the wildfires," but the question is: is there something behind all these hurricanes that's causing all of them to happen at once? I'm going to give you a quick intro. E. O. Wilson said that the real problem of humanity is that we have Paleolithic — meaning ancient — emotions, ancient brains; we have medieval institutions; and we have godlike technology. This is the fundamental problem of humanity. E. O. Wilson was the father of sociobiology. These three are operating at different clock rates. Our ancient Paleolithic brains are not changing, and they're the weakest point in the system. While we were all watching out for the moment when technological progress crossed the line of human strengths and our IQ — that's when it takes our jobs, that's the singularity we're all obsessed with — there was a much earlier point, when technology overwhelmed human weaknesses. I want to argue that everything that's been going wrong, that we're actually feeling in our personal lives and our families and our children, starts from this point. The super-quick points are: when it overwhelmed our cognitive limits, that's when we got information overload — that was the sort of Marshall Islands of our limited brains getting overloaded by technology. When it overwhelmed our dopamine, we feel that as addictive or compulsive use, digital addiction. When it overwhelms our social validation and takes advantage of our self-image, we feel that as influencer culture. Our confirmation bias, and you get fake news; our outrage, and you get polarization. And when it overwhelms the limits and foundations of what we trust, you get bots and deepfakes. So if you look at the right-hand column, all of those effects together represent a system, a mutually reinforcing system, that we call human downgrading. We can talk about that later, but I wanted to give this context because it helps set up a framework for why all these things are happening: a collective poisoning of different parts of the human social fabric.

I like a phrase that we spoke about before: attention capitalism. Tell us what you think is driving the way technology is being built at the moment, towards these negative effects and repercussions.

Yeah, so there's a lot of focus on the data side of what's gone wrong with technology, but people tend not to focus on attention. If I say "attention capitalism," people say, "What's the big deal if you get my attention?" It doesn't seem like such a big deal. But the point is that in the same way that, in industrial capitalism, a whale is worth more dead than alive, and a tree is worth more as lumber than as a living tree, in the attention capitalism model a human being is worth more if they're addicted, outraged, polarized, disinformed and narcissistic, because that's better for producing effects in human attention. And in attention capitalism, just like capitalism generally, you're always looking for cheaper and cheaper ways of accomplishing a result. You can think of attention as a kind of labor. It's really expensive to pay you at Bloomberg to sit in an office, and there's a cost to these big newsrooms — man, that's really expensive. So we've got all this attention going to Bloomberg, and that's a really expensive way to generate human attention. Facebook points its eyeballs at this Bloomberg box and says, "Hey, I could replace that and generate the same amount of attention if I got human chimpanzees to share links with each other, back and forth, and made them feel good about how smart they are." It can generate the same amount of attention through social link sharing, because you'll do that work for free. All of us go to work creating attentional labor for free for Facebook, because we're willing to post these links to look good to each other, and we generate the same amount of human attention; then Facebook sucks all the money out. So that was the cheaper way of generating attention. A further, cheaper way of generating attention: instead of paying the top of the food chain — the editors at Bloomberg — to edit and curate what's true, what's meaningful, what's moral, what we value in society, it's much cheaper to have algorithms do that. So let's fire the editors and let algorithms decide what we show people. With YouTube, 70 percent of YouTube's traffic is now driven by the recommendation system. In each case we're hollowing out our media and sense-making institutions and getting rid of the employees. There are fewer fact-checkers in news organizations now than there have ever been, so the quality of the sense-making that's feeding us has been diminishing over time, and it's poisoned the information environment. You can think of this as happening everywhere all at once — a kind of poisoning, like the Flint water supply, but for our brains and our information.

On the role of the technology companies themselves: you were a design ethicist at Google, so clearly they were thinking about this. Facebook is talking much more about the investment they're making in human fact-checkers and the third parties they're involving. Clearly this is something that, due to public pressure or internal pressure, they're starting to respond to. How much responsibility lies within the business itself?

I think the point is that these are not just private companies whose products we use like a regular private business — where you walk over to that business, you pay for a service, you get a service, and they don't have responsibility for their impact on society when you go get that coffee from that one little coffee shop. Facebook has actually taken over, and that's what's happening in technology generally: they're taking over, in the classic Marc Andreessen sense of software eating the world. YouTube has eaten up the Saturday morning cartoons of children's development, so they've taken over children's development. Children spend six, seven hours a day now on screens. So no matter what great education you've got in that school, YouTube has an educational process conditioning their minds for about six hours, right? In general, technology is taking over more and more parts of children's development, political advertising, our information public sphere. As it gobbles up these parts of our social life, it has to take responsibility for them. If you gobble up Saturday morning cartoons — and we used to have regulations and protections that said what we show children and what we don't show children — when YouTube takes that over, it doesn't preserve those protections; it actually gets rid of them. When Facebook gobbles up election advertising — we used to have laws and regulations around equal-price campaign ads: it should cost the same amount of money for one candidate and another candidate to run an ad in the same place at the same time — but then Facebook gobbles that up, and they have an auction system, run by a private technology company, that gets rid of all those regulations. So in general, as software eats the world, deregulation is eating the world — the loss of protections is eating the world.

My comeback to that is: I have a two-year-old, and I see his addiction to my phone. He's really into Sesame Street on YouTube Kids, and I take it as my responsibility that he doesn't have six to seven hours of it, that he has twenty minutes, and that I can put the controls in place on YouTube Kids to ensure that it's perhaps the more worthy things he's watching. Equally, Facebook is now saying, look, we will make it very transparent who paid for a political advert, and try to inform in that way, because they don't want to take away freedom of speech or the freedom to have this debate in public. How do you think those responses are going?

So what you're getting at now is a question about responsibility. If there are people believing a fake political ad, is it our individual responsibility, or is it the company's responsibility? If people are addicted to YouTube, is it our responsibility that we didn't set up those time controls — that's our fault? It's sort of like BP. BP just did a tweet recently saying, "Do you want to know what your carbon footprint is? Use our carbon footprint calculator so you can manage your footprint." No: they're putting the responsibility on the individual when the system is what creates the problem, and it actually takes attention away from the system. The reason I'm so critical of the system is that I'm concerned about the objective raw results we're seeing. Facebook shut down 2.2 billion fake accounts in three months. They have 2.7 billion users — 2.7 billion real people on Facebook — and in three months they shut down 2.2 billion fake accounts. And I'm sure they got all of them, right? Now, they have 35,000 content moderators — that's the number they give — and in fact they're saying now that they spend more money on trust and safety — which is their entire trust, safety, policing, content-moderation process — than they made in all of their revenue when they went public in 2012. They actually spend more money on trust and safety — I'm giving their talking points now — than all of Twitter's revenue last year. So it sounds like a lot of money; they're really investing in this. But the raw results have not borne out very well. You saw just two days ago two Facebook employees basically rebelling and saying, look, this is actually producing bad results; we need stronger controls on political advertising. If you look at YouTube — let's take that example, because I know it better in terms of the raw stats — one of the problems is that we're on the outside; we don't actually know all the data about how bad it is on the inside. But with YouTube, for example, if you let their machines recommend what to show kids, there are many examples where it shows disturbing videos — Spider-Man with machine guns shooting up the Hulk, these kinds of sing-along songs, really, really disturbing stuff. There's stuff around child pedophilia: if you watch one video of a young person dancing in front of the camera, it'll start recommending, through comment trails, these other videos of young kids dancing. It sends people down the rabbit hole. TikTok is just like this too — not to put attention only on Facebook and YouTube. YouTube recommended Alex Jones videos — you know, Alex Jones, Infowars, conspiracy theories — fifteen billion times. And when I say "recommended," I don't mean fifteen billion times that people clicked on it and wanted it and searched for it; I mean fifteen billion times that it was recommended to people. So it's poisoning the information environment, because disinformation is out-competing information. There's an MIT study finding that on Twitter, fake news spreads six times faster than true news. And it makes sense, because fake things are an unconstrained organism: they can evolve in any direction, they can keep manufacturing good-sounding lies, and you can keep A/B-testing the things that work. The truth is very constrained — there are only so many things the truth can be, and it can't always be sexy or salacious or outrageous — so the truth is generally out-competed. Then you take another fact, which is that on YouTube, climate-denial videos out-competed — actually were recommended about 50 percent more than — the climate-consensus videos. So you wonder why everybody doesn't just believe in this obvious thing that we have to get ahead of. And then you take a third fact, which is that the best predictor of whether someone will believe in a conspiracy theory is whether they already believe in one conspiracy theory — and it's never been easier to believe in your first, sort of gateway-drug, conspiracy theory. When you add all this up — I'm going pretty fast — the cost is that it has poisoned our information environment at a time when we need to be making unprecedentedly accurate sense of the world and agreeing with each other about the world. It has been pulling us apart, and it's really, really serious. And it's not intentional — it's not because of bad people — but it is because of a business model, attention capitalism, that is trying to find the cheapest and most efficient way of generating that attentional labor with algorithms, and that produces this bad result.

The fix, then: you're obviously debating this internally every single day, and the companies are spending ferocious amounts of money to try and fix it. They are injecting humans back into the situation, but perhaps, from your argument, not quickly or successfully enough. How are you thinking about steadying the ship, to a certain extent?

Well, the core thing — and that's why I started with the opening slides — is that on the right-hand side, all those different problems, we take the whole thing as our problem statement, not just "let's fix the information environment": the shortening of attention spans, the polarization, the outrage, and the narcissism — which is an undervalued one, because teen mental-health issues have risen dramatically. What's called high depressive symptoms for ten-to-fourteen-year-old girls went up by a hundred and seventy percent, after two decades in decline — it went up by that much just in the last five, six years. I say that because, in terms of how we reverse this, we don't want to reverse just one of these issues; we need to reverse all of them. It's like climate change: you have to get to the root issue, and the root issue is the attention business model. The fastest way to do that is to simply not have business models where huge supercomputers are pointed at our brains with an incentive to create changes in our attitudes, beliefs and behaviors. Just as we went off the gold standard — where our entire financial system was pegged to how much gold we could scrape out of the physical earth — the entire stock prices of these big companies, one to one and a half trillion dollars of market value, are directly coupled to the attention standard: how much attention we can scoop out of the brains of children and the brains of our democracy. So we have to decouple those things, and that means things like Jaron Lanier's proposal of data dividends, so money flows back to us — if you're familiar with Jaron Lanier's work on data labor unions, this is stuff you can look up online — and also simply paying subscription fees. The quick answer is that we should be paying for services so that they're on our team instead of on someone else's team — rewarding the person for the data they exchanged. That's one part of it. Now, that sets up a perverse incentive, so let me be really honest about it: if Facebook is still an automated attention machine that's mapping lies into people's brains with high-precision accuracy, but now it's paying us as it does that — the digital Frankenstein is wrecking the world and it hands us fifty cents every month for it — that's not what we want. So we also have to actually align the goals of all the engineers to be on our side, because it's not just the ads that are the problem; it's the entire incentive system to modify your behavior. I think people don't really get that, in terms of the personal-responsibility thing: there's a supercomputer pointed at your brain every day when you use these systems. So when you scroll and you say, "I'm going to be done, I'll just scroll one more thing," and then find yourself scrolling a little bit more — we literally do this every single day, right? — we think that's our responsibility. But it's not, because on the other side of the screen is a supercomputer with an avatar, a voodoo-doll version of Caroline. The hair of the voodoo doll is all the likes you've ever made; the clothing of the voodoo doll is all the shares you've ever made. The voodoo doll looks and acts more and more like you, because the more data you feed it, the more accurate a model they can build of you. So when you think, "I'm going to scroll one more time and then I'm done," what actually happens is they can prick your voodoo doll with a million different possibilities; they know exactly what's going to happen when they show you the next three content items. It's like playing chess against a supercomputer that's seeing way more moves ahead: you're going to lose. Once we all realize that all human beings — being inside of a limited monkey-meat-suit brain — are on the same team, because we're all trapped inside of this system, it's like the Matrix, where the asymmetry is growing every day.

What pushed you out — what made you think, "I can't work for Google as a design ethicist anymore"? Because I feel like we're so keen to point out the negativity, the bad news, the bad ramifications of technology, and we forget the good: the things my child is learning through YouTube; the fact that I'm able to connect via WhatsApp with his grandmother away in London; the way we're able to go to a new small business and give money to them by buying well-manufactured goods and knowing the source, et cetera. Was it just that you'd had too much of the negativity, and that's what pushed you out?

I'm sure people can guess, given how I'm talking, why I wouldn't stay at Google for too long. But I want to say — I don't say I'm an optimist; Jaron Lanier has this quote that the critics are the true optimists, because those are the ones willing to see what's wrong so we can actually get to the world that is better. I used to be a technologist. In fact, I met someone on your staff backstage from when I was building a startup that got acquired into Google. So I've done the tech-entrepreneur thing; I know how the products are built. My friends started Instagram; we studied in the same labs at Stanford. The positives, I think, are unquestionable. The question is a problem of business model. If we didn't have business models that are about mining and extracting and treating human beings as resources to get things out of, but instead treated us with dignity — as the customer, not the product — that's the change we need to make. And I think that if we can get there, all this stuff looks so much better. I mean, it is magic that I can hit a button on my phone and thirty seconds from now walk out into the street and a car shows up and takes me where I want to go. We live in a magical world. I'm just advocating for a humane-technology world that's much more about that, and not about all the stuff that's destroying our society.

Does regulation change the business model, or change what the businesses focus on rather than our attention? Is that what needs to happen?

Yes. There are two avenues that are fastest for making this change. The fastest is actually what you saw just two days ago, with Facebook's own employees pressuring Facebook from the inside, saying, "This is our company" — if you saw the rhetoric in their note, it's very powerful — and, "We need even better standards for political advertising." In other words, if Facebook's or Google's or TikTok's own employees say, "I don't want to be participating in the poisoning of the mental health of a generation, or the poisoning of the information ecology," that is the fastest lever for change, because it's way faster to have an internal HR crisis you have to respond to than to respond to legislation that takes two years before it actually gets here. So the most exciting news is the way internal people who see it this way are waking up and saying, "I don't want to participate in this bad business model," and creating that internal pressure — and we've been working with a lot of those folks; I'd focus a lot on that. The second is legislation. Senator Mark Warner has actually been leading this quite a bit. He just released a bill, the ACCESS Act, to create more interoperability between platforms. One analogy: remember AOL's monopoly back in the day, when we all used AOL Instant Messenger? That was the chat client, and there was vertical integration between AOL and the messaging. One of the things that happened, as I understand it, is that AOL came under regulation that forced AOL Instant Messenger to become open, so they had to loosen it up and you could have different clients — iChat, Jabber, whatever — talking to that network. The ACCESS Act that Senator Warner just introduced is very similar: it creates more interoperability, so you can move a group of people — this is how it should work — from Facebook onto a new social network and have interoperability. It's like being able to click and migrate your phone number from AT&T to Verizon, or click and move your bank account from Wells Fargo to JPMorgan — that kind of thing. I think we need that kind of competition to force open and loosen the monopoly grip of the big companies.

Interesting. I want to get audience participation now, because I'm sure you have some questions of your own. We've got some roving microphones ready to dash to you, so put up your hand if you have a question for Tristan. Yes — the gentleman here, just a minute, please. Just say who you are and what your question is.

Sure. I'm Cordell Schachter, the CTO of New York City's Department of Transportation. Do you think that the Electronic Frontier Foundation has society's best interests in mind, as you describe them, or are they just another outlet for big tech?

Oh, I don't think the EFF is an outlet for big tech by any means, though I'm not an expert. They've tended, historically, to be on the sort of techno-libertarian side. The EFF is fighting against the Orwellian, 1984 dystopia — we want to protect civil liberties — but there's this other dystopia, the Aldous Huxley dystopia, where control of society comes not from restricting access to information or overwhelming you with surveillance, but from saturating people in trivia, ego-validation and noise, and that's the dystopia that we're protecting against. But I think the EFF is a credible organization; they do great work.

Thank you — great question. Anyone else? A quick question here.

Justin. You mentioned paying for subscriptions to reward the kind of information process that you want. What about the recent decision by Facebook, for instance, and Apple, to pay news organizations in order to have their full content on their platforms — does that change the dynamic, in your view? Is it like a DMZ between those two ends of the spectrum?

To be honest with you — things are changing so fast these days, which is part of the problem of the attention economy — I'm not actually familiar with all the details of Facebook's news announcements; that's the honest truth. In general, I almost think of it as a restorative-justice fund that they actually owe the news, media and journalism environments — repatriating money back into the media ecology that we've lost. I don't know what the effects are going to be. The challenge with these things is that whatever analysis one would do to figure out whether it's a good decision, Facebook has the actual data that would let us decide how good or bad it might be. From the outside it's very hard to tell and to do the economic analysis — that's one of the problems.

Ellen Graffman. First, sensational comments — extremely illuminating. My question is whether you draw any lessons from what's going on in, let's say, Russia and China. The future has been described in books like 1984, and just a few days ago the New York Times reported on activity in China. I'm curious whether you've seen any lessons that you're working hard to incorporate in your activity.

You know, the Chinese authoritarian model is really alarming, which I think is self-evident to people. The really big question, if you zoom out, is that right now digital closed authoritarian systems out-compete digital open systems, because digital open systems are vulnerable to anyone participating — a.k.a. Russian bots, Iranian bots, Saudi Arabian bots — poisoning the information ecology, spreading noise, distracting us, confusing us. The RAND Corporation has a name for this: "truth decay," the erosion of truth — the RAND Corporation, the military corporation. So the real challenge we have to figure out — and maybe this is something we can talk about later — is how a digital open system out-competes a digital closed system. That's the key question that, frankly, we in the West have to answer: what does an open new social network look like that enables participation, that enables conversation, that is not censorship, but that actually produces better sense-making and better choice-making amongst everybody participating? That's the big question we have to be asking.

Irena Maron, at Paul Hastings. Apropos of your comment about access to data and interoperability, what do you think about the promise of decentralized platforms to bridge that gap?

Yeah, I think decentralized platforms offer new capabilities that will help us, but there are a lot of reasons to be concerned too. Take Twitter, for example: if you think today is bad — where someone can dox you, or there's exponential hate speech, or you can spread lies, and Facebook eventually may take it down if you complain enough — in a decentralized world there is no police force, there is no one home, there's no one staffing a trust-and-safety team; you actually have no one you can go to. So decentralized platforms don't automatically solve these issues. There are a lot of people in the decentralized blockchain community dealing with the data issues, but again, I'd ask you: if you step into your privacy utopia, your data utopia, you'd still see shortening of attention spans, polarization, outrage-ification, narcissism, because the attention part wouldn't have gone away. So the way the decentralized community can really help is by creating platforms that decouple that incentive, so that the attention incentives are not connected to the design of the product — and there are some new design procedures we have to go through to do that.

Time for one last question, I think — the gentleman in the back.

Daniel Wise, Redeploy. You mentioned Jaron Lanier earlier, so it seems like you're positive on his idea of a new internet. To make that a reality, or just your vision a reality — there's a lot of power in this room, so I'm in favor of it — what can I, or anyone else here, do? Thank you.

It's a great question, and one I wish I had much better answers to. We're a nonprofit, by the way, that works full-time on these issues. We're trying to build a movement and mobilize people; there's a network. We say that this is sort of like the climate change of culture, but unlike climate change, only about a thousand people — regulators, top designers, VCs, some top product managers, leadership at companies — have to change what they're doing, instead of, as with climate change, hundreds of companies across hundreds of countries. So we're mobilizing people. We have a podcast called Your Undivided Attention, which we're using as a center point to walk through the problem with people — you can subscribe to it online — and we're pushing for things. I know Jaron; we've worked together on a couple of things. We haven't done a good enough job, frankly, between those of us in the activist community, of finding easy ways for people to sign on and pledge, so I think we need to do a better job of that, so we can invite more people into the conversation. But we do need your help, by leaps and bounds. So thank you for asking the question.

A call to arms. Tristan Harris, our time is up. I want to thank you so much for spending this morning with us — give it up for Tristan Harris.
Info
Channel: Bloomberg Live
Views: 1,769
Keywords: Bloomberg, technology
Id: FbIJ_zc0E2I
Length: 28min 3sec (1683 seconds)
Published: Wed Oct 30 2019