Stanford Seminar - Time Well Spent

Captions
Thank you all for coming. It is very weird for me to be here; I came here as a student and listened to many famous entrepreneurs give lectures. I remember the first one I came to: a venture capitalist, or actually more of a real estate guy, came up to me, gave me his business card, and said, "Here, if you ever need office space, call me." I said, "What do you mean?" "Well, you're going to start a company someday, so when you want some office space..." So my role in the world now is very different from what I thought it would be when I was at Stanford.

I graduated here in 2006 as an undergrad in computer science, but I took mostly Symbolic Systems classes. I was very interested in cognition, the mind, neuroscience, and psychology, and I got particularly interested in a lab here called the Persuasive Technology Lab. Does anybody here know the Persuasive Technology Lab? Some of you, okay. If you didn't know, the Persuasive Technology Lab is taught by BJ Fogg, a psychology professor who pioneered the field of persuasive design: how do you persuade people, how does that work, and how could technology persuade us for good? That was the question. The lab was filled with young engineering, computer science, business, and psychology students, many of whom went on to join the ranks of Facebook and the like. My project partner in the class, Mike Krieger, was here from Stanford and went on to found Instagram, actually using a lot of the techniques we learned about.

The thing that caught my attention in that class was the very last session of the semester, which was about ethical persuasion. What does it mean to be able to ethically persuade a human being? We talked about a thought experiment: what if you could get the perfect profile, every piece of information about how this particular human being's mind works, and use that information to persuade them toward anything? It was just a thought experiment we were running, but ten years later that question became the thing I now do for a living and am concerned about, and frankly one of the most important hidden and invisible topics of our time. You saw it in the last election, including today, when hearings in Congress had the major technology companies, Facebook, Google, and Twitter, testifying specifically about the ways their platforms were used to persuade the American public in the election, and about what Russia's role was in using those platforms. And if you didn't know, Cambridge Analytica is a company that the Trump campaign, and also the Brexit campaign, used to create personalized profiles of what would persuade people. So these questions that I was really interested in became the foundation of possibly one of the most important things going on in the world right now.

Why does ethical persuasion matter? This is really a conversation about values. When you're in that class thinking about what it means to ethically persuade people, you think: well, who's to say what would be good for someone else? You say, we want to help them, we want to make their world better, we want to make their world more open and connected, we want to help them sell stuff.
So we're starting to define these values: we want to persuade people to be more open and connected. But what I want to talk about today is where our ideas about making the world better meet the road of reality, and what it would mean to actually make the world better with persuasion.

I want to tell you a little story about how I got into this and got concerned. When I was a kid I was a magician, and I learned from a very early age that this instrument you're experiencing me through right now, your conscious experience, can be manipulated. There are limits to our attention, limits to what we can and cannot see, limits to how we can think about something. If I split this room in half and asked one group, "Is the number of countries in Africa greater than or less than 50?" and asked the other group, "Is the number of countries in Africa greater than or less than 150?", then just by anchoring the two groups on different numbers you would get different answers. What you start to realize as a magician is that the human mind is living inside a 24/7 magic trick. You're inside an instrument you can't see, the one through which you're experiencing me right now, and there are certain levers and strings you can pull to get people to think about things in a certain way. I say this because it was a childhood interest; I didn't do it for very long, I gave maybe one or two magic shows, but it gave me the foundations of that kind of thinking.

Later I came to Stanford, I took the Mayfield Fellows classes, I got very interested in entrepreneurship, and I started a small company called Apture, which was about helping people learn things on the Internet. We thought: what if we could apply all that stuff I learned at the Persuasive Technology Lab for good? Let's use it for good, and of course I know what good is, because I'm a 23-year-old smart kid from Stanford. We built this thing where The Economist would be one of our customers: you would highlight something on the page and we would persuade you to go deeper and learn about it. We called it "lighter fluid for sparks of curiosity." What could go wrong? We had pure good intentions; we were just trying to help people learn things, persuading them to learn, using all these little persuasive design techniques.

As I was building the company (we raised venture capital, we had a team of twelve people), what got me waking up in the morning was: gosh, I really want to help people learn things. That was about all I could think about. But at the end of the day, everything my company did was not actually about helping people learn. What really mattered for the business was whether we could tell The Economist that we were increasing how much attention, how much time, people were spending on its website. So I had this goal of helping people learn on the web, and we were going to provide this technology that made it easy to learn about things, but the way we sold it to the publisher, The Economist, was by saying we're going to help people spend more time on your website. I had twelve people working at the company, and I recruited them by being really passionate, classic "we're going to change the world, we're going to help people learn about things, it's going to be great," and they really believed it. I was very persuasive.
And as the CEO and founder of the company, I honestly couldn't admit that there was a gap between my positive intentions as a human being, waking up wanting to do good, and the ultimate thing we were measured by, which was capturing human attention. When you're young and you're raising money and all your friends are starting companies, there's this temptation to compare yourself: they raised this much money from DFJ and I only raised from this other VC; they recruited this brilliant guy from LinkedIn and we recruited these other people. There are all these dynamics when you're running something, after you've built a team and you're out in the world with your company and your product: how deeply will you be able to examine your own motivations and beliefs and see that maybe what I'm doing isn't about helping people learn at all? If that were true, would I be willing to see that it was true? Think about it: we had raised four million dollars, we had a team of twelve people, their families and livelihoods depended on me, I had a huge amount of responsibility, and I was just trying to make the business work. And here I was starting to question the entire premise, the values, the core of what I was doing every day: were we actually doing that?

Some friends of mine who were also founders of tech companies got together one weekend, because we all talked about the pressure you face as an entrepreneur: the pressure to succeed, to talk about how you're growing, how things are going better. You can never express your doubts. Everyone back home asks, "How's it going?" and you say, "Oh, we just raised this much money, we just closed ten more customers, it's going great." You can never just say, "I don't even know what I'm doing; I'm not sure I know why I'm doing this anymore." Where is it safe to utter that sentence? So my friends and I got together that weekend and formed a group called Doubt Club. Doubt Club was an acknowledgment that it's not safe anywhere else to talk with other founders about your doubts about your company, your product, or your life. It was hyper-confidential and very safe; the commitment was that this group comes first, even above your obligations to the employees who might work for you or the investors looking at your company, and in a small private group we shared our doubts about what we were doing. It was in that moment that I realized there was a huge gap between my positive intention (I really wanted to do good) and the actual thing we had to deliver every day with Apture, which was capturing human attention. Once I was able to see that, I had to figure out what I was going to do.

Tina told this story in a way that makes it sound like, oh, we sold the company to Google, what a successful entrepreneur. I'm going to tell you the real truth, which is that I was terrified about what we were doing. I felt really lost about why I was doing the thing I was doing, and we had to figure out what to do. We ended up in a situation where we could have raised more money, but we were also shopping the company around to see what would happen, and ultimately we soft-landed the company at Google.
So we had a nice out. Of the other two people in Doubt Club, one ultimately got acqui-hired for their company, as we did, and the other shut down his operation and decided to move to Berlin and do what he was passionate about. I say this because it is incredibly, incredibly hard to question what you're doing. Upton Sinclair wrote that you can't get someone to question the thing their salary depends on. It's even harder to get someone to question something they deeply believe is their purpose; it's like questioning someone's religion, or someone's identity. The reason I'm talking about this is that all of us in this room, I imagine, have good intentions for where we want the world to go and what it means to value the things we care about, but we're only as good as our ability to examine our own values.

So there I was at Google, recovering as an entrepreneur. I honestly felt like I had failed. I needed psychotherapy, I went to Burning Man, I had to let go of the life I had been living and realize there was more to life than starting companies, even though I still had all these friends who had started companies. I joined the Gmail team, working on some future-looking personal-assistant-type projects, and I was in the room with the people who make this product. I was fascinated: here is this Gmail product that something like a billion people actively use, hundreds of millions of knowledge workers; it might be open on some of your laptop screens right now for all I know. People live in this product, and it has this extraordinary influence on the thoughts that arrive in people's heads. If an email comes in right now, it pushes thoughts into your mind, and you don't get to choose whether that happens; it just happens. So I was sitting in this room, caring about products that deliver a real positive benefit to the world, with the people who would be thinking the most deeply about what it would mean to truly benefit people's lives, about what email should do and what the values behind email are. Of all the rooms in the world to be in, that design room is the room. And I was, I don't know, disappointed, because the conversations were about "let's make it really fun to use," "let's make it engaging," "what if it bounces up when you scroll," "what if it expands vertically instead of sliding horizontally." There were all these design questions like that, and I felt something was missing. I couldn't put my finger on it for a while, but I realized later: when does email actually add up to a real net positive difference in your life? Think about the emails you send, the love letters or the apartment searches, where you're getting some real, delivered life change, a real benefit, a real value. Then think of all the other stuff where we're just shuffling messages back and forth. I felt that, of all rooms to be asking this question, we were not asking the biggest question. After a year at Google I was kind of burnt out, and I decided I was basically going to leave; at least I thought I did.
Before I did, I took these concerns and made a presentation. The presentation basically said: never before in history have fifty designers in California at three tech companies influenced what two billion people will think and do, and we have an enormous responsibility as a company that shapes this screen, in terms of what we are causing people to do. The reason I was even thinking that way is my background as a magician and in the Persuasive Technology Lab, where I learned that minds really are steered by forces they don't see. I have to say that in making this presentation I thought I was going to get fired. It was not alarmist or angry or upset, but it was very critical and very existential: what does it mean for email to actually be a benefit to people's lives? I sent it to about ten people, and I was surprised. When I came into work the next morning I had emails back, I clicked through to the comments, and the presentation in Google Presentations showed a hundred and fifty simultaneous viewers. When I clicked on it later that day there were 450 viewers. It spread virally around the entire company, all the way up to Larry Page, who was in three different meetings that day where people brought it to his attention. Suddenly I was in this moment where it felt like the whole company had woken up to the question: are we actually influencing the world in a positive way?

That led me through the next three years. An executive at Google saw the presentation and generously offered to host me in a little corner of his lab in New York, where I could study ethical design. I gave myself the title of design ethicist, in this new field of ethical design, or ethical persuasion: what does it mean to ethically influence what two billion people are going to be thinking and doing? That's what I did for the next three years, and I went really deep into it. First, take the human evolutionary code: you're living inside a meat-suit mind and body that was tuned millions of years ago. Our predilections for sugar, salt, and fat were tuned when those things were scarce, and here we are now when they're abundant. You've got all these tunings, you're living inside them, and they can't really change; you don't have much optionality. So what if you made a map of every single string you could pull on this mind-body system to persuade it? How could you addict a human mind? How can you pull on its sense of belonging? How can you make it feel like it's missing out? How could you get it to do certain behaviors, to think about certain things, to make certain choices? A map of every single way a human being could be manipulated: that was the first part of the task. The second part was: what would it mean to ethically persuade? What constitutes ethically pushing this human animal around in the world? And the last question is values: what are we pushing it around for, and who's to say? How do we know? Do we have values we can stand on, so that we can actually persuade people in an ethical way? So I studied this topic, and I want to say that I didn't know what I was doing.
I was basically trying to figure out the answer to this question: whether we want it or not, Google is going to bump its elbow, Apple is going to bump its elbow, Facebook is going to bump its elbow, and a billion people are going to be pushed in these different directions. Because, just to set the context, two billion people wake up in the morning and the first thing they do is check their phone. We check it about 150 times a day: in the bathroom, in the coffee line, going to sleep. We spend a lot of time in these devices, and even when we're not looking at them, the thoughts in your mind right now are still partly set by the time you did spend looking. We have this 24/7 immersion in this environment. I didn't know what I was doing in studying this question; I just found it fascinating, important, and interesting. And here I am, literally, three years later: it's November 1st, 2017, and the US Congress is questioning Facebook, Google, and Twitter about exactly the stuff I've been interested in for the last three years. I honestly find myself right now at the center of one of the most important and invisible problems, I think, in history, which is that it's not just that there's this system bumping its elbows into people's psychology; we gave this system a set of goals that are causing tremendous harm.

What do I mean by that? The reason I told you the story about my positive intentions with Apture (we wanted to help people learn), about the Doubt Club process of figuring out what we were really doing underneath, and about having to examine my core beliefs, is that I think everybody in the technology industry, including all the companies that were in front of Congress today, has very positive intentions for the world. Everybody I know at Google, Facebook, and Twitter really, really cares about delivering the best possible world we can create. And yet a statement like "our mission is to make the world more open and connected" is just the same as me saying "I want to help people learn about stuff," because what is it actually about? What it's actually about is capturing human attention. Mark Zuckerberg faces exactly the same dilemma that I faced, except he raised a lot more money, his company is on the public stock market and is one of the most profitable companies in history, and it controls what two billion people will think every day. How could someone like Mark Zuckerberg question whether everything they do every day is actually about making the world more open and connected? If you think it's hard as a founder, when you're comparing yourself to your friends over who hired the good people and who raised money from impressive people, imagine what it would be like to run one of the most powerful corporations in history, influencing what two billion people in every language think and believe, and the terms of people's social relationships, and then to question whether the thing you think you're doing (because you know you're trying to do good) might be different from the actual result. And the first thing you have to do is pay attention to incentives.
I think this is really important as an entrepreneur, since many of you are going to go off into the world and start companies: pay attention to who is paying whom. What is the actual thing you're beholden to? With my company it was The Economist; they were our customer, and we had to do one thing for them, which was keep people on their website longer. Look at Facebook: their business model is advertising, and no matter what good they want to do in the world, their stock price depends on how much attention they capture. YouTube's contribution to Google's stock price depends on keeping people's attention. Twitter's stock price depends on keeping people's attention. Everything else they say is an intention that sits outside of that; it's a dream, something they'd like to have happen, but at the end of the day the thing they're beholden to is capturing human attention.

I want to talk about why this situation is what it is. I'm going to take you down a little journey which might leave some of you more alarmed than you intended to be, but I promise I'll turn it around at the end. I want to scare you for a moment about where we are, because I think we're in a much more dangerous situation than people tend to recognize. The goal of capturing human attention becomes an arms race for who's the better magician at pulling on the strings of the human mind. You have these products competing, everyone trying to figure out how to get more attention, and then my friend Mike Krieger at Instagram says: hey, just like Twitter, let's add the number of followers you have to our product. If we add the number of followers, then everybody has to log in every day to see how many they have, and they want to get more, so they have to come back to the product every day. This one tiny design choice: these products just evolved, like an organism that evolved a new hand, and that hand is really adaptive for the environment, which rewards whatever is good at capturing attention. We just invented this new persuasive thing. But what could the consequences be, and how would you, as an engineer or designer, be thinking about that? You might say, well, we're helping the world, because now people know who's following them, people know they can connect with people who find them interesting, you can see the list of who follows you; there are all sorts of positive things that could come from that. When you're a human being living inside the eyes and mind and beliefs of someone who makes Instagram or Twitter, you're thinking about it in terms of those positive things, because that's what you're trying to do. But how would you know it might cause a whole bunch of other externalities, like the number of people who now define their self-worth based on the number of followers they have? David Brooks wrote a book called The Road to Character, and he talks about the World Values Survey and how people have valued different things over time. One of the big changes is that people went from ranking fame eighteenth on the list to ranking fame number one, and that's happened over roughly the last ten years. I would argue that the reason literally billions of people now value fame higher on the list is actually because of tiny, seemingly innocuous things like putting the number of followers you have in the core of our software interfaces.
So we have this natural situation where everyone is competing for attention, and we are evolving these systems to get better and better at extracting human attention. Then it gets really competitive and we have to add something else. What do we add? We add AI. Now, instead of just offering a product to you, I have to predict, with huge amounts of data and machine intelligence, what is going to keep you on the screen. Instead of just offering some stuff you can click on, I'm going to predict from the millions of things I could show you: if I'm YouTube, which videos to put in front of you; if I'm Tinder, out of the millions of people I could show you, the perfect reason not to be with the person you're with; if I'm Facebook, of all the million things I could show you in the news today, the perfect thing that's going to get you to click or share. So you have an AI that has been given this goal.

I want you to put this in context. When we think about AI, we have to remember: when you point an AI system at chess, first it kind of wiggles around and makes some funny-looking moves, then it starts making smart moves as it gets better, then it makes surprisingly smart moves, and then it beats Garry Kasparov. And when it beats Garry Kasparov, it doesn't un-beat Garry Kasparov; it's now better than all human beings at chess. You take that same AI and point it at the game of Go, and that took thirty years: it made these funny-looking moves, and now, with Google's AlphaGo, it beat the best Go players, and when it beats the Go players, it doesn't un-beat them. So we built these AIs, and then we invisibly gave them a new target: we pointed them at our minds. We said: whatever gets this human being, play chess against this human being's mind, play twenty steps ahead of where their mind could possibly see, and show them the perfect next video on YouTube, the perfect outrageous news story on Facebook (which controls what two billion people will think every day), the perfect reason to cheat on your spouse on Tinder, the perfect political message. We can vary sixty thousand political messages, combining word choices, different contortions of politicians' faces, and colors of buttons, to perfectly animate a response from your brain stem. We're playing chess against ourselves, and with every click we give this system it gets stronger: every time you click or share, you're feeding it attention, which feeds it more dollars, which feeds it more resources, which feeds it more computing power, which means it's better at playing chess against your mind.
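To make that feedback loop concrete, here is a minimal, hypothetical sketch in Python of what "predict what keeps you on the screen" amounts to. It is not YouTube's, Facebook's, or Tinder's actual system; the model, the feature names, and the update rule are assumptions chosen only to show the shape of the loop described above: score candidates by predicted engagement, show the top one, and treat every click as more training signal.

```python
# Minimal, hypothetical sketch of an engagement-maximizing feed.
# None of this is any company's real code; the "model" stands in for the
# large-scale machine learning that predicts what will keep a given user
# on the screen, and the feature names are invented for illustration.

import math
from dataclasses import dataclass, field

@dataclass
class Item:
    item_id: str
    features: dict  # e.g. {"outrage": 0.9, "novelty": 0.4}, assumed features

@dataclass
class EngagementModel:
    """Stand-in for a learned model of P(user keeps watching | item)."""
    weights: dict = field(default_factory=dict)

    def predict(self, item: Item) -> float:
        # Simple logistic score: items resembling what this user engaged
        # with before score higher. A real system uses far richer signals.
        score = sum(self.weights.get(f, 0.0) * v for f, v in item.features.items())
        return 1.0 / (1.0 + math.exp(-score))

    def update(self, item: Item, engaged: bool) -> None:
        # Every click, share, or watch nudges the weights toward
        # "show more things like this": the feedback loop in the talk.
        step = 0.1 if engaged else -0.01
        for f, v in item.features.items():
            self.weights[f] = self.weights.get(f, 0.0) + step * v

def next_item(model: EngagementModel, candidates: list) -> Item:
    # The core move: out of everything that could be shown, pick whatever
    # the model predicts is most likely to keep this user on the screen.
    return max(candidates, key=model.predict)

# Tiny usage example: one user, two candidate items, one click.
model = EngagementModel()
feed = [Item("calm-explainer", {"outrage": 0.1, "novelty": 0.5}),
        Item("outrage-bait", {"outrage": 0.9, "novelty": 0.2})]
shown = next_item(model, feed)
model.update(shown, engaged=True)  # the click makes more of the same more likely
```

Notice what the objective never asks: whether the time was well spent. That single choice of metric is the misalignment the rest of the talk is about.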
The reason this should be alarming is that these systems are not neutral. We have a tendency to say: we've always had computers, video games, radio, TV; we always worry about new media, so why should we be so concerned this time? There are a few reasons this time is different, and the biggest one is the AI enhancement I just mentioned. A couple of others: no other medium could pull on your social psychology. No other medium could show you an infinite set of reasons why other people are living better lives than you, an infinite set of reasons why you should feel like you're missing out and other people are having fun without you, an infinite set of reasons why you owe people responses. Your TV never opened up and said, "You know what, you owe about a hundred people responses; you'd better start getting back to them." So there is so much harm, or damage, so much that is now being put on the human evolutionary animal, more than we've ever put on it, especially once you add in AI.

So now the question becomes: what is the goal of that AI? What does it actually want? If this is not a neutral product, if it isn't just sitting here but actually wants something from me, well, if you're Mark Zuckerberg, you think the thing you've programmed the AI to do is to make the world more open and connected, which you translate into "let's engage people, let's show people whatever engages them," and that's going to be very persuasive to someone living inside that mindset. But we have this problem, because the actual thing it's all based on is attention. It's the same contradiction that I felt. I don't know how much you talk about it here at Stanford, but there's all this discussion about runaway AI: what if in the future we were to build a runaway AI, like a paperclip maximizer, give it the goal of making paperclips, and it turns the whole world inside out just to create paperclips? How would we make sure that wouldn't happen? There are all these people working on AI safety. The amazing thing is that this basically already happened: we already built a runaway AI that is steering what two billion people think, and we hid it from society by calling it something else. We played a magic trick on the human mind, because if you call it a news feed, or YouTube's recommended videos, or Tinder recommendations, people won't even notice. That's how easy it is to fool the human mind.

What we're really dealing with is systems that have exponential impact. Facebook just disclosed in the hearings today and yesterday that there are more than five million advertisers on Facebook, and that means there are also potentially hundreds, thousands, or millions of campaigns per advertiser, so you get this combinatoric explosion. And if you have China or North Korea or Russia or someone else trying to spread manipulative advertising, not even for a politician but just for divisive issues or conspiracy theories or lies, the problem is that we've created an exponential system of persuasion without exponential guidance or exponential ethics to control it, to govern it.
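To see why "exponential persuasion without exponential guidance" is not just rhetoric, here is a tiny back-of-the-envelope calculation. Only the five-million-advertisers figure comes from the talk; the campaigns-per-advertiser count and per-campaign review time are assumptions for illustration, and even conservative choices put manual oversight far out of reach.

```python
# Back-of-the-envelope scale check. Only the 5,000,000 advertisers figure
# comes from the talk (the 2017 hearings); the campaigns-per-advertiser
# and review-time numbers are illustrative assumptions.
advertisers = 5_000_000
campaigns_per_advertiser = 100    # assumption for illustration
review_seconds_per_campaign = 30  # assumption for illustration

total_campaigns = advertisers * campaigns_per_advertiser
reviewer_years = total_campaigns * review_seconds_per_campaign / 3600 / 2000  # ~2,000 work hours/yr

print(f"{total_campaigns:,} campaigns = about {reviewer_years:,.0f} reviewer-years to vet by hand")
# -> 500,000,000 campaigns = about 2,083 reviewer-years
```

And each campaign can itself be varied into thousands of micro-targeted versions, which is the combinatoric explosion being described.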
So where we find ourselves is basically the situation that many of the people worried about runaway AI have been talking about forever. That's where we're at, and I don't mean to depress you, but that is actually where we are, right now, today, and Congress is just now waking up to the reality of what this system has the power to do. The cat is out of the bag. Now, you could say, hold on a second. In those future runaway-AI scenarios there's always this discussion of "let's put it inside an air-gapped computer and it won't get out," or "we can always pull the plug, we can always just shut the thing down, so it won't be a big problem." But think about what pulling the plug would mean today: it would basically be like turning the lights off at Facebook. It's not as if, from an existential perspective, you can't get out of your chair, walk out of the room, and tell the board of directors, "We're going to shut this thing off." You could physically do that, but there would be consequences. We can't turn off this system that we've created.

So the only way to deal with a runaway system pursuing its own goals is not to manage your relationship to it. I'm saying this because I want to add context: a lot of people think, if you've read any of the work I've done, that it's about better managing our relationship to our phones, being more mindful with our phones. That's like saying, when chess is kicking your butt, "let's manage our relationship to the chessboard." It's just totally overpowering where we are. The only way to solve this problem is to actually put the AI on the same side of the table as us, and to be really honest about what the lines of power are. Right now the thing is pointed at us; it's extractive. It basically says, "I need to extract as much attention out of you as possible," and the classic line is that it's a race to the bottom of the brain stem. It's not enough that I offer you the product; I have to reach deeper down into the brain stem and create an unconscious habit, so that you pull out the phone more often. That's not enough, so I have to reach even deeper into the brain stem and onto your social psychology, the way Snapchat basically owns people's social relationships. That's not enough, so I have to reach deeper still and get to your self-worth, and control your sense of how often people like you. I told you I'd go dark; I apologize.

So we have this extractive economy, and if you notice, it doesn't look that different from how the broader economy is structured when it's extractive. It's the same as with the environment: we have a runaway system that makes more money the more it extracts, without putting something back to keep it in balance. The fundamental situation is that we need to find a more ergonomic relationship with the boundaries of this architecture and the way it extracts from us. And if you don't care about us, let's talk about our kids, because this thing is eating kids alive for breakfast. I think this is just like the point where we realized that, after we extracted coal and created incredible economic prosperity by generating energy with it, we had also polluted the external environment. No one really saw that at the time; good-intentioned people created this bad stuff, and we only realized it later. We're at that point now: we're just realizing that the extractive attention economy, where we're capturing people's attention, is polluting the environment, the inner environment. Now we need to invent the solar and the green of attention: solutions that are regenerative, that replenish the human being. You can make all the same metaphors as with the environment, and that's going to mean that things don't grow the same way. One of the reasons advertising is such a great business model is that it grows incredibly fast and is super profitable. Who wants Facebook to be regulated when the stock price is through the roof right now? But then, who wants to make the changes we need to make for climate change to be abated, given that we're going to have to make huge sacrifices?
So the conversation actually comes back to values. Am I willing to value something besides money? Am I willing to value something besides status? This is an open question when it comes to corporations, which basically don't have a choice: we created these robots called corporations that just maximize profit, and they have to; there's no way for them not to. So we have to acknowledge when that misalignment exists. And it's the same for our own lives. Just because I can start a company, and I have friends who have personally sold a company for a billion dollars, which is very seductive (we can be pulled; it's a magic trick; we can be pulled by other people's success or by incredible amounts of money), the conversation always comes back to: what do we care about? What is this for? What is the problem that this technology is the solution to? Why am I doing this? If we ask those questions, that's really the essence of ethics. I have this fancy title of having been the former Google design ethicist; I'll tell you, I didn't study that much philosophy at Stanford. I think the deepest ethical move you can make is simply to keep asking questions and asking why. And with that, I will open it up for some questions. Thank you.

Audience: Are there examples of companies that are doing this well?

Yeah, are there examples of companies doing this well? We can get into the theory of change. There is a whole bunch of companies whose business models are aligned with us: anyone you're paying, where the money goes directly to the person serving you. On the Time Well Spent website we actually show a bunch of products where you pay for the product and it delivers value. Calendly, for example, helps you schedule meetings efficiently; basically any product where you pay, say on a monthly basis, and it delivers value to you. For some reason more examples are fleeing my mind, but there are all sorts of products like that. The main thing we have to do is recognize that the advertising-supported products are the ones whose incentive is to extract from you. That's always hard to acknowledge, because we also get benefits from advertising; we enjoy getting an ad for the Nike shoes we really wanted to buy. But the problem isn't the ad; it's the misalignment of incentives. And what if we paid for Facebook, for example?

Audience: Your talk is a magic trick in itself. There's the first part, where you present something ordinary: the humble CEO of a startup in the Valley. Then you have the turn, which is "here's what's behind the curtain," with Doubt Club and your journey of discovering the extraordinary on top of the ordinary. But the third and most difficult part of any magic trick is the prestige: how you bring that extraordinariness back into the ordinary. Your magic trick was about your journey and your insight, and about how that journey lets you create a kind of magic through design, or whatever it is you're doing. My question is: is the purpose of this to create people who believe in magic, or is it to create more magicians?
Because what I don't understand is that we're spending all this time trying to create the perfect system, the perfect Gmail, the perfect platform, which is in itself a magic trick, so that people keep using it and keep believing in its magic. Why don't we spend more time creating magicians, decentralizing design, or allowing people to design their own experience?

I appreciate your question, but I think you're taking the magic metaphor further than I was intending. It's not about the magic of ethics or values or those kinds of things. The point of the magic frame is that it exposes the fact that we're inside a system that can be pulled and steered, that it's more delicate, vulnerable, and manipulable than any of us would like to admit. Right now, none of our design institutions or metrics account for that model of human nature. They're governed by simpler ideas, like "we're giving people what they want," or "if you clicked it, that's what you wanted," which sounds convincing when you say it to someone. But then you can imagine asking: are there any cases where someone clicks something and it's not what they want, where they were outraged, they were upset, that kind of thing? Another question?

Audience: How do you persuade a company to remove one of these time-consuming features without jeopardizing its competitiveness, for instance to remove Snapchat streaks?

Yes. So the question is really: in the attention economy, and it doesn't even matter whether you're building Snapchat or a meditation app, because even meditation apps need your attention (they have to figure out how to put a habit inside your body so you use them every day), how do you tell any company not to go after people's attention, or to subtract some of these manipulative features from their products? The answer is that you obviously cannot get anyone whose business model is to maximize how much attention they get from you to stop doing that. You can't tell YouTube not to show people the next video; you can't get Snapchat to subtract the streaks feature. The way to do it is to go up a level, and you go up a level by going to the device and the platform. In other words, we can't ask Facebook to do something against its business model, but we can ask Apple, Microsoft, Amazon, or Samsung, companies whose business models set up the choice architecture: the interface between you and this army of things that wants your attention. Right now that choice architecture, because of this model of human nature that says people are clicking and choosing freely, is a Wild West. When you get a phone, Apple and Google basically give all of those apps open-door access to reach into you and do whatever they want. There are a bunch of ways the platforms could clean up and mediate that relationship. If we're jacking people into the Matrix, do we want to jack into their impulses, or into the top, reflective part of their minds? There are ways of designing the choice architecture on a phone (the home screen, notifications) so that it asks the most reflective part of ourselves what we want. To make that concrete (that's the theory), the concrete version is things like conversational interfaces: Siri, the AirPods, watches,
things that have supplemental, peripheral interactions, where you're doing something with the device rather than being given a bottomless bowl of things to scroll through and explore in an infinite world, which is what they're all based on now. So that's one theory of change, the platforms; the other one is governments, and we can talk about that more later.

Audience: That sounds like you're talking further up the chain. On the other side, down the chain, I'm wondering whether you're seeing a cultural acceptance that this is a real problem. Is it widespread, or is it just a small group that feels this way? And is that important for driving change?

So: is there a large group of people who agree or see the problem, and is that important for driving change? I'll say a couple of things. One is that I felt so nervous making that first presentation at Google. I honestly thought that if this problem existed, someone else would have thought of it and said it already, and I felt really nervous putting the idea out there, because I was basically saying humans don't choose freely, here are all the ways we don't, and I felt uncomfortable and very vulnerable saying that. It has been this process, over these last few years, of being repeatedly validated (people understand this really does happen, it really does work this way) that has made me feel more comfortable stepping into it. And I'll say that since then, more and more executives and alumni of the big companies have come out, at least privately to me, in agreement. I was on 60 Minutes earlier this year with Anderson Cooper talking about this problem; we called it "brain hacking," and it was mostly about the addiction part of the problem. Since that interview, Mark Zuckerberg's personal mentor Roger McNamee, who actually convinced Mark not to sell the company to Yahoo or Microsoft for a billion dollars and who now feels incredibly conflicted about his role in creating Facebook, has partnered up with me, and we've been doing all this great work together. More and more people have been coming out of the woodwork, because they understand the situation we're in. I would actually say that if someone in the industry, who works on these products, doesn't see the situation this way, what's happening is more of the Upton Sinclair thing: can you get people to question the thing their salary depends on? We always want to look for the positive.

Yes, the question was: is it true that tech executives send their kids to schools where they don't have phones, where there are limits? The answer is yes, and I think that's always a really telling signal: when the CEOs of companies won't feed their own kids whatever it is they're making. It's an extension of the golden rule: don't just do unto others as you would have done unto you; do unto others as you would do unto your own children. I know Steve Jobs limited his kids' phone and tablet access; Sheryl Sandberg, as I understand it, doesn't let her kids use social media; I know there are people very high up at Apple who send their kids to Montessori schools, and I think Google is very similar. And I'll just say, in parallel: when the CEO of the Lunchables food line, a billion-dollar food product,
would not let his own children eat Lunchables, you know there's a problem.

Audience: So what should we as consumers do if we don't want our minds to be pulled on?

The first thing to recognize, and this took me a long time to get across, is this. I've interviewed all sorts of people, the best experts in the field on persuasion and behavioral economics, and magicians. Years ago, when I started working on this, I had lunch with Danny Kahneman, one of the founders of the field of behavioral economics and a Nobel Prize winner in economics, a field that is partly about how our minds can be fooled. In his book Thinking, Fast and Slow he says: even though I know how all these techniques work, and how these cognitive biases work, they still work on me. The important thing to realize is that it's not like being told what your DNA looks like and then thinking you can now change your hair color from red to something else. The challenge is that we really are inside this experience; we're really inside our cognitive biases. If I do that number-of-countries-in-Africa thing and anchor you on one number versus another, then whether you want it to or not, it works; there are all these invisible influences guiding us. So the question becomes: how do you ethically use those forces, and again, for good, and for what values?

In terms of being concrete, what do you do? You can turn off all notifications on your phone except for when a person wants your attention. One of the big things in the industry is that just about every notification that comes at you is actually generated by a machine, and the machine's goal is usually "what will get your attention right now," so it's automatically sending you stuff. If you've ever tried not using Facebook for a week, watch what happens: suddenly they send you three thousand emails, like a drug dealer who wants you to come back. I hope I don't sound like I'm just trying to be against something; it's more that there's this model that's just not aligned with any of us. I don't think anyone wants the world this is heading toward; that's hopefully what I've communicated, how dangerous where we're headed is. And I want to say one more thing about a danger I didn't mention. I think the deepest existential risk here is that the thing that's best at capturing a single person's attention is to show them an individual reality that confirms their worldview, and that's going to be very different from the thing that would confirm our shared sense of reality. Think of Facebook: they gave it this innocuous, naive goal of "let's help people, let's engage people the most," and then that thing takes entire societies and puts them through a paper shredder, and out the other end you get filter bubbles and echo chambers, where people don't have shared facts and don't agree on the same reality. I think that's the deepest part of the problem; so if you weren't convinced by the threat to kids or the national-security side, that one is sort of step one for a good society.

Okay, so the question was: if you take a persuasive technique, how would you apply an ethical framework for deploying it? Let's take, and I love this example, let's take streaks
on Snapchat. Sorry if you've heard me talk about this before, but it's really important, because if you didn't know, Snapchat is the number one; actually, how many of the students here use Snapchat as your primary way of messaging? Okay, a good number. It was obviously invented by a Stanford alum with good intentions, but it uses this feature, a persuasive design technique, called streaks. For those who are older in the audience: it shows the number of days in a row that you've sent a message back and forth with every contact. So is that ethical or not? I actually want to hear what you think: if I put up the number of days in a row that two kids have sent messages back and forth, what makes that wrong, or good, or bad? Okay, so if it increases your quality of life and helps you feel more connected with your friends, that would be one reason to say it's good. Let me tell you how the people at Snapchat, when they made this feature (this is what I've heard through the grapevine), justified it: unlike these other social apps that connect you to all of your relationships at once, they help you focus on the relationships that really matter, because they show you the depth of your friendships. Do you see what I'm talking about? Again, you have a belief, a narrative; you step inside this belief and see everything through rose-colored glasses where this is deepening friendships. To the human being who believes those thoughts, they're inside a magic trick: they believe this is actually about deepening friendships. And there's definitely some evidence for it; I'm sure somewhere there are two kids who feel like, man, my friendship is way deeper because I can see the number of days in a row we've sent messages back and forth, just like there are magic moments on all these other applications. But there's still something that's not quite right about it.

Let me jump to the chase with Snapchat streaks. First of all, there's an asymmetric situation: the children don't know that there are a hundred or a thousand engineers on the other side of the screen, deliberately choosing this technique to use on them. So that's one thing: an asymmetry between the knowledge the persuader has and the knowledge the persuadee has. The second is that the goals of the persuader are different from the goals of the persuadee. When the people who make Snapchat do this, their actual goal is: how can I hook you into using the product every day? Because the streaks feature is super addictive; it works really well. That's their goal. And the goal of the person using it? Well, the usual challenge of ethical persuasion is that people don't know their own goals, so they just sit there, and the persuader's goal infects the persuadee. Now the persuadee is like someone in the Matrix: there's a hole drilled in the back of their head and a goal put in, "now I need to keep up with these streaks," and they actually want that, as if it were something they intuitively, independently wanted. That's successful advertising: the persuader's goal has become the persuadee's goal. Now they define their friendship based on whether or not they're able to keep up their streaks, and if they lose their streak, they're no longer best friends. And by the way, that's actually true: there are children walking around who think that the terms of their friendship are the streaks they have.
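Since this feature carries so much weight in the example, it is worth seeing how small the mechanic itself is. Below is a minimal Python sketch of a streak counter, under the assumption described above (consecutive days of mutual messaging, resetting to zero when a day is missed); it is not Snapchat's code.

```python
# Minimal sketch of a "streak" counter, not Snapchat's implementation.
# Assumption (matching the description above): a streak is the number of
# consecutive days on which BOTH people sent a message, and it resets to
# zero the moment a day is missed. The reset is what turns a friendship
# metric into a loss-aversion hook.

from datetime import date, timedelta

def streak_length(days_a_messaged: set, days_b_messaged: set, today: date) -> int:
    """Consecutive days, counting back from today (or yesterday), with mutual messages."""
    both = days_a_messaged & days_b_messaged
    # The streak is still "alive" if the last mutual day was yesterday,
    # which is exactly the window that creates daily pressure to reply.
    day = today if today in both else today - timedelta(days=1)
    streak = 0
    while day in both:
        streak += 1
        day -= timedelta(days=1)
    return streak

# Example: a 3-day streak that dies (back to 0) unless both kids message today.
a = {date(2017, 10, 30), date(2017, 10, 31), date(2017, 11, 1)}
b = {date(2017, 10, 30), date(2017, 10, 31), date(2017, 11, 1)}
print(streak_length(a, b, today=date(2017, 11, 1)))  # -> 3
```

The asymmetry isn't visible in the code: it lies in the fact that the people choosing to surface an expiring counter know what it does to a teenager's sense of obligation, and the teenager doesn't.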
But I want to do one more thing here, because it's a great example. You could take the same streak feature and put it on a meditation app: the same feature, now showing the number of days in a row you've meditated. Why does that intuitively feel better? I'll say it quickly: it's because, in theory, the person who's trying to meditate actually cares. Their goal is aligned with the persuader's goal; they actually want to mark off the number of days in a row they've done the thing they're yearning to do, whereas in the first case that's not true. And this is the challenge of advertising: advertisers make money whether or not you buy, because they still make money from the impressions, from seeding the ideas into your mind; that's still a success case for them. I know we're wrapping up (no, please), but the challenge here is that there are other metrics; in my first TED talk I went through some of them. The challenge is that any business or technology company whose business is to capture attention just has to do that; there's no other choice. YouTube just has to do that. But if you look at Vimeo, it's a counterexample: they don't autoplay the next video, they don't show you related videos and try to pull you into an infinite world, and that's because their business model is a little bit different. So you can imagine a version where you pay, where these choice architectures are not trying to bottomless-bowl you into infinity. So, yeah.

Host: Wow. I'm sure you'd all agree this was incredibly important and provocative. Please join me in thanking Tristan. [Applause]
Info
Channel: Stanford Online
Views: 17,318
Keywords: stanford seminar, stanford, etl, tristan harris, entrepreneurship, computer science, startup
Id: anEykhlBd-Q
Length: 57min 22sec (3442 seconds)
Published: Thu Nov 09 2017