Facebook execs Sheryl Sandberg and Mike Schroepfer | Full interview | Code 2018

Captions
Sheryl Sandberg COO of Facebook and Mike Schroepfer CTO of Facebook we are between all of you and a late dinner no no no we have lots to say lots to talk about thank you for coming first of all thank you for having us obviously a news-filled year for you all I told Sheryl this is going to be a tougher interview than usual I think we we won't bring it on bring it on excellent we're ready so why wasn't anybody fired at Facebook over the situation with Cambridge Analytica you should start with easy questions no no I think I'll start there nope and three parts why wasn't anyone fired who should have been fired and and that's enough okay well we'll do it there okay so Mark has said very clearly on Cambridge Analytica that he designed the platform and he designed the policies and he holds himself responsible the controls in the company on this are under me I hold myself responsible for the ones we didn't have and look Schrep and I are here we run the company we do fire people at Facebook we don't trot them out and make examples of them it's not how we are because we want a culture of responsibility at the top and we take it and the thing for us and I think what underlies your question is do we know that we were late not just on the data for Cambridge Analytica but on fake news on you know misinformation on elections and what are we doing about it and we definitely know we're late we have said we're sorry but sorry is not the point no what's the point your ads are ugly but go ahead well thank you okay but the point is the action we're taking and on all of these fronts we're really thinking about the responsibility we take in a very different way when you think about the history of Facebook and you've been following us and part of it for a long time you know for the last 10 12 years we've been really focused on social experiences and building those and enabling those what the world would look like if people knew it was your birthday I was just in Houston people found people and saved people through Harvey because they were posting publicly on Facebook those are good use cases but I don't think we were focused enough on the bad and when you have humanity on a platform you get the beauty and you get the ugliness and where we are now is really understanding the responsibility we have to more proactively see the problems and prevent them oh good I'm gonna play kind of good cop um that's hard that's Kara yeah I know she is isn't isn't the problem not that someone screwed up but that you've built this architecture that's fundamentally open to whether it's Cambridge Analytica the election stuff or any of the problems that have been surfacing the last couple years where it's built for scale it's built it's software it's built for sort of like minimal oversight and you want the humans to sort of populate it with with content and use its automated ad systems it seems like James Murdoch referred to this giant attack surface that you built this thing it's actually working the way sort of you initially thought it was gonna work you just didn't realize what you built I don't know one of you can start I mean I do think what you raise is a real fundamental tension between sort of giving tools that are easy for people to have free expression and then keeping people safe because yes if you you know really want to lock everything down you censor everything and have human reviewers read every single post someone puts on there but I don't think that's what people actually want and what we're trying to balance is like easy tools for you to be
able to post share photos links whatever you want with anyone you want but to make sure the really bad stuff the abuses hate speech bullying you know economic abuses or this election interference to get that stuff off the platform but with something that's fundamentally going to be built from software and automation that reaches two billion people around the world no I mean this is the this is what we're in right now is trying to do this really well and I think it's a combination of humans and technology to make this work and to figure out and you know in each society in each culture where's the line between political speech and hate speech and how do we make sure that we get things that work for everyone all across the world it's sort of an unprecedented scale let's go back a little bit more there when you you just said that we didn't know we didn't see the negative parts that people I I don't know if you've met human beings but I've known quite a few bad ones but the idea is what is in your culture that didn't see that like you're saying you know I know Mark you apologized it seems like everybody's doing an apology ad but what is it in the culture I can I can remember a meeting when Facebook Live happened where I actually when I was shown it I said well what are you gonna do about it when someone murders someone or commits suicide or beats someone up and I think the problem is like Kara you're so negative I'm like what like I'm sorry like humanity is really awful in many ways I think I think that is part of the tension and Live is a good example so when Live happened there was a lot of good it really caught on in terms of sharing people really enjoyed the experience but there were things that were wrong and we actually on that one I think moved very quickly we got down to human review of anything live within minutes which is actually hard to do operationally but we got there and that was really important and there have been things that were taken down off Live right away there have also been things that have happened on Live where we weren't able to really intervene and help people so with all that I think it goes to the point you were making which is a very good point we built an open platform it is a platform where so many people come on and share they are gonna do the good and they are gonna do the bad and it's not that we're ever gonna prevent all of it we will never say that but we can get better we can be more transparent we can put a lot more resources and a lot more thought both technology and automation and people we're also really working on being more transparent that we think is a huge part of the answer so content policy free expression is fundamental to Facebook it's a very deep value for us but so is a safe community and as Schrep was saying those values really rub against each other we've published our community standards but now we went ahead and published the internal guidelines that people use to judge because what is free expression for one person is hate for another we worked with over a hundred experts around the world we published those we had a lot of good feedback and we're going to keep iterating we also publish our results so we have out there now how many pieces of you know 1.3 billion fake accounts taken down in six months ISIS and al-Qaeda content we're getting 99% of it before it's reported sexual content we're getting 96% hate speech 38% before it's reported so we can see the areas where we need to invest and by being open
about that we think we can get people to help us so the overall plan right is in the near term we're gonna hire a lot of people Mark said we're gonna hire so many people to audit our political ads we're gonna lose money on political ads this cycle but X number of years out from now the the software will be good enough they're going to solve most of this you guys believe sort of top down that eventually you can solve the software problem with another software problem you can see this in the numbers that Sheryl just talked about because if we published you know numbers of for example you know objectionable content nudity and pornography identified by people first and reported to us versus identified by automated systems several years ago you know it would have been sort of zero percent and you know 100 percent you know generally reported by people and now it's 96 percent automated by you know AI systems and you're seeing this again with violence the number's 86 with hate speech it's 38 percent because it's harder it's more nuanced it's more on the frontier of development it's solvable but we've seen this goes back many many years of development to make these systems work and we see quarter-over-quarter steady progress I'm you know as a technologist I was very worried about some of these harder problems we've made more progress in the last six months than I thought was actually possible so that gives me a lot of optimism to do this from a technology perspective over time sorry Schrep but I think one of the things people worry about is do we think we can automate everything do we think we can be neutral no no no we want to get things when they're uploaded so the difference between something being uploaded and technology pulling it down before anyone sees it that's much better and that's what we're able to do with ISIS and al-Qaeda content we're able to do that more with photos and adult sexual content hate speech is language more nuanced so the automation which helps us get it down before it's seen that's great but there are humans building the technology and we understand that and there are humans making the decisions on the rules and I think at every stage there's gonna have to be some technology and some human review let's go back to how it happened because I I don't think that was answered in the congressional hearing they seemed riveted by your Terms of Service and what your actual business model was that was an impressive display of intelligence by our politicians and so funny you should run yeah for Congress oh I would and then you could yeah anyway don't get into that we'll get into that in a second when they were asking these questions one of the things they didn't get to was exactly what happened like exactly how did this occur and I'm really interested in that and one of the things Mark said a lot was we take a broader responsibility now why didn't you take a broader responsibility before what is it in the culture that creates that and I don't think it's malevolence I'm not accusing you of that I think what is it as part of the Facebook culture that didn't see this coming and walk through Cambridge Analytica first you want to go through the timeline you want the timeline yes let's talk about the actual timeline I have not I've heard some versions of it but I remember being at the 2008 event where you opened up the platform you needed to bring people in you needed subscribers you created this open thing you handed out the data what did you think was gonna what happened
there so walk through that and maybe you go all the way back to 2007 2008 when the platform first launched you know the idea was hey people you know are using a service Facebook they want to take their data with them to a third-party app to make it social to enhance it and you know over the years there's a lot of pressure to say hey you know don't don't be a wall guard and let people take their data and easily bring it to another application you only needed to give it to them to come on the play you needed some sort of candy to attract all these app developers in correct or something to get them to use well they wanted to build these great I mean many of these are name-brand companies now and they you know like many people thought that apps are better with your friends and and with with the social data in the early years it was a lot you know the idea and I think this gets back to your optimism versus pessimism for these things and I think for the entrepreneurs in the audience you'll know that as an entrepreneur you get told no and your idea is stupid you know nine times out of 10 during the day and so you have to some degree take it as optimistic attitude to bring something new into the world whether it's a new product a new company or a new feature in a product and so I think you start there when you're building these things and for the platform what we spent a lot of time on was look people are smart they're ultimately using a third-party app so whatever Facebook data they take they're also putting new data into that app so they have to trust that app and understand what it is our job is to kind of give them the notice on what's happening so we built these I remember spending iteration after iteration and how exactly do we design this dialogue to make it super clear exactly what data you're getting from Facebook and bring into this third-party app because that if the customer knows what's happening they can make informed decisions and that was really the focus of the platform in the early years the platform got bigger and things scaled this is one in 2014 we we said look we want to corner restrict access to these things we want to do more proactive review of applications so all new apps had to get get reviewed by our staff and then to get to your specific question about you know that timeline to happen here you know as app was built around that timeframe and then we heard in December of 2015 via media reports that you know an app developer had basically gotten Facebook data as people installed it and then resold it to a third-party wide via media reports you're super smart people I'm pretty certain so what where is the where does it break down there that you didn't know what was going on well the problem is we can't you know observe the actual data transfer that happens there right so we don't you know I don't actually even know physically how the data went from one to the other there isn't a channel that we have some sort of control over again as a consumer you're ultimately trusting a third-party with your data whatever data you brought from Facebook whatever data I mean you're taking these personality quizzes you know you're inputting new data in there that's a relationship with that developer that you have to trust that they'll be responsible with the data they're using whether it's on Facebook or some app you downloaded from an app store and so we didn't observe that until we heard about it through third-party reports and that's when kind of the events went into motion where we 
that's 2015 yeah and so the first thing we do is disable the app from the platform so it couldn't couldn't have further access you know and the goal was to figure out look who who had this data how do we make sure that that data is deleted and is safe and that's what kind of happened at that time frame and then the reason this has come up this year again in 2018 is you know subsequent reports that hey despite kind of agreeing to the fact that they had deleted the data they may not have and that's when sort of we resurfaced and looked through all of these things we had made many platform changes as I said in 2014 which made a lot of this not possible because you could no longer pull friends' data and that's when we've really just taken this much sharper more pessimistic view on everything in the company I mean it's the biggest cultural shift I've ever seen in the ten years I've been there which is just you know top to bottom you know not just what are all the great things that can happen but what are all the ways people can abuse this what are all the theoretical ways this could happen how do we make product changes how do we make policy changes how do we you know invest our resources differently both in security and content review and in product development you guys do that and that's like a process that's ongoing right I mean we've been working on this since then you've grown at an enormous rate it's not by accident you've created systems to help you grow right growth hacking is a term looking back do you wish that you had grown more slowly or reined in growth and been more thoughtful about the ways you're growing and how much did that contribute to all of the problems we have today I mean I do sorry I don't okay let's do Sheryl first gotcha I mean looking back we definitely wish we had put more controls in place so you know we got legal certification that Cambridge Analytica didn't have the data we didn't audit them and now we're waiting for the government we still want to we're going to you know we wish we had taken more firm steps I think you had those media reports why didn't you well we did go back and they said they didn't they legally certified they had deleted the data we did not go in on it why this is what we're doing now well it always looks obvious in hindsight we absolutely should have but to your question I think we can grow and we can continue to grow but we can also have controls in place and you know it does seem at least right now that you're at two billion people Mark says we're gonna we're gonna slow down and we'll be more thoughtful about it if you were cynical you might suggest it's easier to say this now than it was five or ten years ago well we always had some controls in place but I don't think they were enough so let's talk about data because that's what we were talking about we always had ways for people to control their data you always could go in and choose to share your data with apps and you could always delete it was always there what did we do now we put it at the top of everyone's news feed a very easy way here's all the apps you've connected to here's how you easily delete them so I think we are really building on what we did before the reason we were able to do that so quickly is all of those controls existed we already had all of those controls they were just harder to find for people and we made them easier so we are building on some of the controls we had before as we address some of these and in some of the areas we're going much further it's also the case
that threats change so let's talk about hacking let's talk about the election if you go back to 2016 and you think about what people were worried about in terms of nation-states or election security it was largely spamming phishing hacking that's what people were worried about a la the Sony hack people hacking into systems we run a really good tech team we were very protective on that and we didn't have problems a lot of other platforms had what we didn't see coming and I don't think we were alone in this but it's on us we didn't see coming a different kind of more insidious threat but once we saw it we did publish a white paper we found the ads and now we look forward to the next elections and we understand that threat and we're taking very strong steps you did know that you're gonna make a lot of money well we'll get to election spending in a minute but you want to go ahead on elections it's important so we realized we didn't see the new threat coming we were focused on the old threat and now we understand that this is the kind of threat we have we've taken very aggressive steps there are elections going on all over the world as we speak and also coming up to 2018 so fake accounts we publicly reported this we pulled down 1.3 billion in the last six months importantly we pulled down fake accounts in Alabama a Macedonian troll farm that we found that was trying to spread fake information in that election we worked with the German government pulled down fake accounts there and 30,000 in France that could have affected theirs so we are showing that we are able to meet those threats probably not perfectly you know we can talk as much as you want about fake news the really aggressive steps we're taking there we're now set up with third-party fact checkers with the AP in 50 states looking at local and state news coming into our midterm election and ads transparency which I know Senator Warner's here in that area we're not waiting for legislation he has a bill out there we're supportive of that bill but we built the tool that that bill requires and it's live so anyone can see any of the political issue ads so the issue for us is we were slow we are learning from our mistakes and taking action we're also pretty humble about this we understand that now we're protecting against this threat but we have to have a different mindset of trying to see around the corner and the next threat were you surprised when you read the Mueller indictment and saw how few people they committed to that effort how little money they had to spend to sort of fill Facebook with with spam it was not a it was not a giant amount even the IRA ads it was a small amount of money that went far and that is why we have to take such strong you know and going back a lot of the source of these things if you think about fake news or elections are fake accounts so fake accounts are a big part of this thing here the other thing is really disrupting the economic incentives so a lot of fake news a lot of it is politically motivated but it's also economically motivated people want to write outlandish headlines so that they can get clicks so that they can make money so we've taken very aggressive steps to go after the economic incentives kick people out of ad networks make sure they can't make money and in all of this we're gonna have to solve today's problems but also see ahead to the new ones and I think one of the most important things we're doing is around transparency for users transparency for people so right now it's
actually pretty amazing if you go in and look at it you can go in and see any ad running that has political or issue content directed in anyone so the problem before is that if you weren't in the targeting group you know if I were targeting Peter you couldn't see those ads but now it's open and transparent for everyone I think that is going to help people surface problems I think people are gonna find more things and that will help us learn pull them down and build the tools to prevent those sue in a more automatic that you're talking about before like there's it looks like there's this before and after like oh no we got woke over here at Facebook or something yeah all right after I think that's appropriate right but many people now have hostility towards Facebook and towards the tech industry because of it and many people yeah obviously people fair not blame Facebook for the election or its part in it I think it's a part of it and other things what's the from your perspective the overall impact on the tech industry and the in the country at large how do you what responsibility do you actually feel for what happened for what happened in the election or just in general in the electricals look the story of this election is going to be studied for a long time and I don't think any of us have perfect answers we're committed to helping to find those answers I think we're unique we've set up an election commission we're giving them access to data third parties they are researchers they are gonna report publicly on what they find and we're cooperating deeply we'll deep deeply with that I don't think we know what about I mean I think at the heart of this is you're asking a question about responsibilities yeah people have responsibility for the impact of the tools they built yeah just the existence it's the drum I like to be right and I think that the the days and tech I've just had I build these tools you know I'm not responsible for what happens with them or sort of over like you you really need to have a deep responsibility to think about not just the good that these tools can have but the bad and what are all the things we can do to guard against it and is the weight of these tools more on the positive than the negative and I think that's again the big cultural shift that I think a lot of people in tech have to have to make to really think about this in advance not just after the thing was created and I think tech has long been as an industry pretty insular and I think that's changing too yeah no I think it's yes so that's changing too so what's going right now with elections we're working much more closely like some of the stuff we found in Germany we found with the German government we worked with the French government we're working with local authorities around the world I think again opening up our community standards opening up to be more transparent that enables people to find things and everyone to work together because a lot of these threats we definitely take responsibility but bad actors will go from platform to platform and so the more we can cooperate and our industry is doing a better and better job at that when we find a bad actor we are cooperating on that and some of the legal changes have allowed that so that we can pull them down and so can everyone else here I think that's very much in the hearings and how comically sort of inept some of those questions seem to be yeah she'll run and then she'll be up were knowledgeable I mean if you ask two different people what's wrong with 
Facebook they're gonna give you different answers some are upset about the Russian ads or some Arabic said about diamond and silk the name do you feel like the US government and governments in general are really ready to engage in sort of a technical discussion about how to fix specific problems with Facebook or other person the tech business yeah because I mean a lot of people were like Oh mark did well largely cuz he didn't sweat apparently but it was ridiculous low bar but I every was marked it was like no they did badly like it wasn't like sure he did better than they did which was not very hard but that's right are they able to understand and and legislate well because some of the calls are for breaking you up for example so the question of regulation is a real one in a deep one and you know it's not a question of people say should Facebook be regulated should other companies our industry we regulated we are regulated we're regulated on privacy we're all regulated under gdpr which is there two other industries I think most and it's not really a question of if there's more regulation the question is what regulation we're working closely with regulators around the world Evan made a really important point that we feel deeply to which says the regulation often actually entrenches big companies yep so GDP are I think we've done a very good job complying with we've put up expensive to build systems and tools you have a lot of lawyers yeah but if you look at what we built if we were a start-up ten years ago we wouldn't be able to build all those settings and get them out and we're making those available to the world and so we're supportive of the right regulation that supports innovation that is based on an understanding of the technology and that is good for people and there are some we spent 20 minutes talk about what a complex system you've built and how difficult it was for you to figure out the problems and how you're going through it now I mean do you really imagine this is something where a bureaucrat where legislators are going to be able to sort of keep up to date with what's going on with your various platforms I mean look it's hard there are examples you know they're funny examples from history right and the United Kingdom when the car was invented they passed the law saying that in order to operate a motor vehicle you needed two people one behind the wheel and one walking in front of a car with a red flag good that will absolutely save lives but you don't get the car I mean I'll ask the audience a question who here answers a call if there's no caller ID if you don't see the number raise your hand if you won't answer that call a couple of you I'm going to call you but most people most people want when caller ID first came out the state of California tried to pass a law against it because it was considered a violation of the caller's privacy that you would know where that is so there are laws that are clearly either contemplated or passed that are bad ideas there are also laws that are good ideas I think people feel pretty good about gdpr and the controls it's given people and so it's our job to work closely with regulators and legislators all over the world so that if there's more regulation and when there's more regulation it's the right regulation so let me ask that sure first should Facebook be broken up a lot of its suddenly Google and Facebook should they be broken up look I think there's two things one is you know is there competition in the market and and you know if you 
look at many of the products we build if you want to share a video YouTube is a better place to do it if you want to have a public conversation Twitter is a great place to do it you want to send a message you know there's Snapchat there's WeChat there's Line there's there's any number of things out there that you can use iMessage just to send those messages so you know consumers are smart they use the products that they want you know we're you know a very small part of the overall ads business so I think you know we're honest when we say we feel like we feel competition all the time the other thing we are able to do in tackling a lot of these issues they are the same across the platforms you know we're able to take the same technology we're using on Facebook to deal with you know objectionable content hate speech and bullying and immediately apply it to Instagram at a massive scale and I think that's a really big benefit for where we are today so your answer is no no okay what about you no for all the same reasons yeah can you say more yes please yeah I mean look this is a question fundamentally about competition and one of the benefits to consumers of being together and I think it's what Schrep said I'll share a specific example you know if you are doing child exploitative content you know WhatsApp is encrypted but we know who you are from Facebook we can take your account down on WhatsApp too so there are real benefits and I think the real question is do consumers have a choice and I think along every product we have there is a lot of choice out there do you think you'll be allowed to buy another WhatsApp or another Oculus or do major acquisitions like that now in the way that you've been able to in the past but Microsoft for instance essentially was really restricted in terms of what they can buy Google there's much more eyes on them because of their size it seems like you guys are gonna be there now as well certainly as you got bigger there's more scrutiny of acquisitions and there should be so we'll see it really depends what it is if it was in something that wasn't core to what we were doing in a new area like Oculus was I think it would probably be allowed how are you getting along with your fellow tech giants I mean you've been competing with Google for a long time Tim Cook recently told Kara some things that yeah in the real world would be considered very mild dinner conversation but in Silicon Valley are considered a rough attack how are you getting along with Apple Google yeah what did you think of what he said you want that one yeah I mean look the conversation that you have with Tim and the stuff Apple's saying is important to them right they have a product they feel strongly about it won't shock you to know that Mark and I strongly disagree with their characterization of our product you know we're proud of the business model we built we have an ad-supported business that allows people all around the world to use a product for free and if you're trying to connect the whole world that's pretty important so we respectfully disagree okay what about you Schrep I mean I think that the thing that that I wish we could spend more time on is the substance of these issues like because there's there's you know times when you can get nice quippy sound bites and sort of kick someone when it's popular and they're down and that's us right now and I get it and we in many ways deserve it you didn't go on very cogent go ahead but but I think that there you know there's there's lots of
questions on trade off so on so you know how do you how do you build a product that the whole world can use like what are the different business models that work can ever consumer afford a $10 a month subscription or $700 device you know and for billions of people around the world like no not yet so so I think that there are trade-offs therein and all of these things and I think as an engineer what frustrates me is is you know there are deep issues and a lot of these things and mistakes that we've made and things that I really wish we had done differently but but in many cases you face these really hard trade-offs which is you can have more of something and then you're gonna have less of something else I can make you more secure they'd made you know we're gonna make some mistakes and take down some things that that we shouldn't have taken down or you guys that is about an all-call facebook that's ad free and or paid said I'm sorry working on a product we've looked at subscriptions and will continue to look at them but we're committed to continuing to provide a free service because it's core to the mission of what we do but how far are you along on a paid service we're looking we've always looked but really the heart of the product is a free service and again we think that's really important I'll try you how far are you along I don't like that I'm not trying to be one of the people that's fired over all of this tonight well no one's getting fired apparently what else would I matter not only sad you guys do sell some stuff everyone in the audience has an oculus go it's 200 bucks right yeah explain again why why you're in the hardware business and that's yeah you like it they're super sorry about the Russians so everybody gets an oculus go now I'm teasing you thank you no but let's yes let's talk about it why are you on stage talk about the area like why are you first of all I want us to question why are you making your own hardware and selling it into why are you in VR again Tim Cook says you should be an AR not okay great questions what we'll do the first one first which is if you see the oculus go it's $199 product that you pull out of your bag and you put on your head and you're in VR no headphones it's got built-in speakers it's at a price point that many people can actually afford doesn't require a PCE or your phone to dock it or anything else like that there's a tremendous amount of engineering that goes into making that product sellable at that price I know when we've set the target the team is like you can't do it right and so the only way we know how to do this by doing all the work ourselves so that we can make the right trade-offs needed to sell this product and build the ecosystem around it towards the bigger question of like hey why we are at all you know it's the only technology I can think of that's gonna build the closest thing we have to a teleporter or transporter from Star Trek which is you know I want to be somewhere else or with someone else very far away and I can't afford or don't have the time to take the long flight to get there and VR you know there's there's an app that's coming out that will take you to the Natural History Museum in London you can actually pick up specimens from the drawer that they are so sensitive the scientists themselves can't touch them cuz they'll break them their fossilized you can you know make them different sizes you can see what it's like to see a pterodactyl in flight right and that's an experience that we can bring to hundreds of 
millions or billions of people and children all over the world through VR I don't know if this is a sporadic as it has not taken off yet you think that's just a function of expense and difficulty and getting the stuff up I think you know this is an early early market and early product and so we're pushing the market forward here I think as you know someone in the audience here said is this device is the first one that didn't involve a bunch of asterisks on the end or just like then do this then do this and by that you like I'm done I got something else to do this is just put it on you are in Jurassic world looking at dinosaurs you're in the Natural History Museum you're seeing you know NBA game you know courtside right these are things that you can't scale out via any other technology and it's a different experience for you guys right to like get in early on a market and in to buy it spend a lot of money on a company that where the market doesn't exist as opposed to whatsapp Instagram you built these things they were widely used by the lots and lots of people I mean really bet I hope the onboarding Instagram and it was 10 million users when we thought it's grown quite a bit since then so so I think that but but you're right I think it's a new market there's a lot of new technology as amazing as the go is we have multi years of sort of rnd in the labs that were ready to bring to next subsequent products that can kind of take this even further and so you know when you look at a space and you say if I can build the product I know people will love it and then you ask the question can we build it I'm pretty sure we can in VR when you go to AR everyone's like yeah be amazing to have these super awesome glasses that give me this full 3d world and then you actually go when you look at the physics of it and battery life and all the rest of it and say there's a bunch of stuff that doesn't exist in the world yet that we need to go invent to make that happen so we're working on that too but I don't think that's coming to market in 2019 and sure oh was that a reaction I mean you all tried the phone and you others succeeded in at Google and Apple particularly did you is it a reaction to having to be in the hardware market or do you ever imagine Facebook going back into phones I don't think we're talking about going back into fence I think this is an exciting new area it's possibly a new platform it's a very can be a very social experience and yeah we're excited about it all right last question then we'll get to use it how do you think this has affected your business for the long term this past year and and you're not as public as as you are Sheryl how has it affected your image I don't think any of our individual images are the point the point is the responsibility we bear for the for the platform and protecting people going forward you know in terms of the business we don't make decision for the short run we don't have to and we shouldn't I don't think any company should have to but we have founder control and protections in place and we're very clear that we're gonna make the investments we need to make I don't think there's a trade-off between the business over anyway the most effective and the average you're the one that interfaces with advertisers the most yeah we've had a handful of advertisers pull some have already come back I don't think it's affected our short-term business your engagement you measure how they feel about it yeah I mean we've we've looked there certainly an impact but I don't 
think it's you know detrimental right now to the current business but it matters and we're investing because we want to do the right thing we've always wanted to do the right thing I think we were very taken with the social experiences and now we're very taken with the need to provide safety security integrity on our platform we also again approach this understanding that this is going to be an arms race this is going to be an arms race we're gonna do something someone else is gonna do something we're gonna have to do better and there are risks ahead of us and we have not yet seen and so we want to make sure we're working closely with other companies working closely with government closely with civil society around the world so that we deeply understand what's happening on our platform we also really want to protect the good I mentioned this I was in Houston I met this guy he owned a taco store when hurricane Harvey happened he had lots of food but no ability to to bring it to anyone he met a guy on Facebook alone de taco truck they were competitors they didn't know each other he put his food in the truck and they used Facebook to see where people are checking in control around feeding them that doesn't mean that every day on Facebook something happens and I know me not to be Pollyannish but it matters and we care we care about preventing that now those people were all able to be fed because they had shared publicly on Facebook where they are and so people have to trust us that they can share not just in an emergency but in a daily place during an election during a difficult time for them personally or a difficult time for a country people have to trust us and so the responsibility we take to earn people's trust and take real action to prevent the harm while protecting the good we're about as serious as we know how to be I'm gonna ask this one more time how is it affected you all I know it's not about you personally but you know Evan just talked about the difficulty of doing Wall Street stuff I want you each to say if you can if you have human emotions no I'm teasing I'm teasing young jeezy have you read my book yes I have yes yes yeah what how is it forget to say that what has it change that I know anybody what is it affect how is it affected you as an executive how about as an executive not as I am program to Hue minimize very advanced subsystem so look you've worked together forever here's something broke it's never fun to we all read the news every day and see everyone you know mad at us and upset at us and hating on us and that's as an individual that's not fun but I don't think anyone like I don't think we deserve any sympathy right because our job is to build this platform in a way that makes sense and you know the fact that there are some real issues there makes it harder because you know it's not just BS you can you can wave away but say like men like terrible stuff happens on the platform all the time and that's the stuff that really gets you down is this awful thing happened my gosh what are all the things I wish we could have done to fix it what are all the things we're going to do now and and that is it's sort of this it's not fun to be in but it's really important work and so I don't know if that helps it's yeah it's hard but it should be I mean it's hard because so what did you learn as an executive I learned that we needed to invest more in safety in security I learned that we needed to try to find the new threat and I think you're feeling pretty confident that we're doing 
a much better job than we were before on the threats we know of today and feeling a lot of you know need to figure out what the next threats are and knowing that we won't do it alone and knowing that we need to work in a much more transparent and open way because I think that's the only way we'll be able to find the next threat I think I did yes you know what I'm saying so I'm Don Graham of Graham Holdings I want to identify myself as for a long time and always a friend of Facebook but for years not an insider I don't know what's going on inside you were on the board I was up to three years ago Kara so I wanted to say and this is not the greatest compliment you'll ever have that the Kara and Peter questioning here is a much better version of the conversation that the senators and Mark had in Washington that's not you'll you'll get better compliments in your life but there's one thing that's happened since that conversation and that's that Facebook has actually announced to us users a series of what sound like very difficult changes that you've made on the platform and since you've recapitulated the conversation with Mark I wish Schrep and Sheryl would summarize the changes you've made and also tell this audience some of whom I think are on Facebook about product changes that you are planning to make in the next few months to address the questions that Kara and Peter raised the new parts coming what's kind of been announced still trying to get me fired tonight thank you Don it's a great question so there's initiatives in each of these categories so I'll try to just give a high level I think the fundamental of it as Sheryl said is a whole heck of a lot more transparency on what's happening and a whole heck of a lot more proactive taking these things down so you look at news it's it's you know down-ranking of clickbait articles disrupting economic incentives so if you're being sent to a site that's basically just an ad farm you know we figure that out and and down-rank that it's using third-party fact checkers as you mentioned in all 50 states you know showing up for local elections it's the about this article feature so you can get more information on the provenance of the article lots of things to help you know consumers better understand exactly what's happening and in news that's just you know a small sort of things overall in news in the broader platform you know there's been a number of places where we just looked at every nook and cranny of the platform and figured out where can we either just completely deprecate an API or require more review in all cases so that we're reviewing the application not just for what they do but making sure that there's sort of minimal use of data in all regards we put a notice in front of everyone on Facebook about what apps they've used so you can go in and see what apps you've used delete them if you don't want them if you haven't used an app in 90 days we'll auto-disconnect it from your account so it can't you know you use an app and three months later it pulls more data that's just stopped that was broken you know again and we put privacy controls now that's all stuff we launched in the last like two months all right are we gonna get new stuff clear history is the thing that we announced but haven't yet shipped right which is if I want to disconnect all Facebook data from my Facebook account you know like kind of clearing my history in my browser or clearing my cookies tonight can I extract my information from Facebook if I'm not a
Facebook user can I go to you and say what do you know about me and by the way can I have that data back the challenges we don't have a personal profile for you on Facebook so we don't actually even know how to identify use that data we guys thank you about how to solve that well well again in May cases you have cookie data from from a device or from a browser but I don't know which person this is associated with and so it's it's pretty hard to get that data back for an individual but there's you know we can I don't know that fully answers the questions or anything else we're clear history we're taking to GDP our settings and controls they're out in Europe obviously but they'll be coming to the rest of the world and the next number of months so a crop along the way things around data things around news things around elections things around fake accounts and country I don't ask them to disclose new product stuff hi I'm Sam from Comcast Sam Schwartz from Comcast we talked a lot about transparency especially around the source of ads Russian Bob's those kinds of things but I'm really confused about the the news feed algorithm itself right on Facebook and transparency around that I I see all kinds of stories online I never know why they're ranked in the order that they are and you know studies show that that newsfeed has the power to influence the moods of billions of people how do we grade you on that awesome responsibility to as a for-profit company I would assume some of that's done for my benefit and some of it's done for yours how do we grade your curation of the other news feed it's it's a great question I mean one of the things that when you look at you know first of all the content of your feet is dominated by who you friend and what pages you like so the easiest way to adjust the you know content in your newsfeed is adjust that then on a story basis you were introducing more and more controls to allow you to sort of mute a particular you know person to unfollow them so you can still be friends but maybe their stuff doesn't show up in feed and ads for example there a really useful control it says why am I seeing this ad that gives you pretty great detailed information on exactly why this ad was was shown to you so I think in every case what we're trying to do is give you very granular inline controls when you're looking at it's like why did I get this story we should help you answer that question on the spot which is what we found is sort of most effective to answer these sorts of things and in terms of hiring people to show you talked about this this crowdsourcing of what are new sources where is that the idea that you that you would that you would have your community rank sources so we've done a lot in news right probably the most important thing we've done with News which has taken down distribution across the board for newt for news news partners and news publishers is that we've really taken very strong action on clickbait on sensationalism and then we did meaningful social interactions really getting back to the heart of what facebook was which was really a place to connect with family and friends we heard from people that they wanted more friends and family less video less public content less news so those signals got taken into account we also really care about psychological well-being and so we started looking at this and we're gonna continue to look at this this research is ongoing and we found that when people are interacting with content where they're actively engaged 
friends family they like they comment they share that's very positive but it's not as positive when you're a passive consumer either so that also meant the signals went to more friends and family less news then within news we want news that's trusted that's real accurate information and we also really care about local and this is hard there's no perfect way to do this but what we did on trusted is we went out to the community at large and we asked people to identify news sources they were familiar with not that they read but they were familiar with because if you hadn't heard of them it's not fair to rank them and then do you trust them that was one thing and that was used to increase distribution for some news sources and decrease distribution for others and really hit I think some of the more sensational sources we're also prioritizing informative again working hard with third-party fact checkers to mark and really dramatically decrease the distribution of fake news and also prioritize local we've announced that we're going to be supporting local news we are going to make sure people see local news and hopefully accurate local news in their news feed someone's got to make that local news and someone's got to figure out a business model that makes that local news possible and that's I mean you guys are playing around the edges of that but it seems like if you just wanted to cut people a check that would help a lot of local newsrooms survive well we're thinking about what we do in local news and considering things Cambridge Analytica you know with Axios on Cambridge Analytica I get in 2015 they certified they deleted it and you thought they deleted it the question I haven't heard a clear answer to is when the Trump campaign you had people working within it when they suddenly had all this data on voters how was no one either higher up at Facebook or working with the campaign suspicious of where did they get that data yeah they they didn't have any data that we could have identified as ours to this day we still don't actually know what data that campaign or Cambridge Analytica had we are trying to do an audit the British government came in and put our audit on hold so they could do theirs but we did not see data that we thought was from Facebook otherwise we would have done that did you see a suspicious amount of data that they knew more about voters or that not really not really in Sri Lanka the government recently had to shut down Facebook because of a news story or fake news that led to violence and so I was just wondering practically on the ground in international locations what are you doing in order to combat those sorts of situations yeah you know there's been issues in Sri Lanka and Myanmar and others and it's you know as I talked about earlier this is the worst thing to see when people sort of weaponize this platform and it causes real-world harm you know the the challenge here is getting as you say people on the ground in the country who understand the landscape the cultural landscape the nuances of the language the NGOs to work with the folks to work with there to help understand where the issues are and where we need to intervene and so that's been our focus is to literally just get more people on the ground in each of these countries who can focus on that and then have product teams you know in the company who when they get feedback about changes we need to make there can deal with that we're also looking at technological solutions for a lot of the AI tools
that we built you know they require large amounts of training data for those familiar and you know that training get is readily available in in the bigger languages but in languages like Burmese they're just it's it's not as good and so it's actually one of the core focuses of our lab is to figure out how to take you know a classifier in one language like English and transmute it over to a language of very little data like Burmese so we could immediately deploy you know some of the technology we've built you know for other languages there we're kind of doing all these paths and parallel because we want to solve this as quickly as we can all right that's very nice for me do you feel like your company does understand the responsibility you have now do you think that has obviously mark you you does your whole management team feel that I think we feel it really deeply I think we're making huge investments really huge investments they'll hit our profitability we think those are the right things to do and I think we know it's an arms race that we sit here knowing what today's problems are feeling more responsibility for the future and knowing we need to protect people who are using our platform I mean as I said earlier it's the biggest cultural shift I've seen on the company in the whole time I've been there my pretty wife wasn't alright thank you so much both of you thank you
Info
Channel: Recode
Views: 47,158
Rating: 4.1740141 out of 5
Keywords: facebook, sheryl sandberg, mark zuckerberg, news feed, russia, cambridge analytica, snapchat, evan spiegel, kara swisher, peter kafka, interview, code conference, recode, code 2018
Id: i3QBy5T0qxw
Length: 47min 45sec (2865 seconds)
Published: Tue May 29 2018