>> NARRATOR: Tonight, part one
of a two-night special. >> We face a number of important
issues around privacy, safety, and democracy. >> NARRATOR: "Frontline"
investigates... Facebook. >> We didn't take a broad enough
view of our responsibility and it was my mistake,
and I'm sorry. >> NARRATOR: Told by company
insiders... >> It's possible that we haven't
been as fast as we needed to be. >> We've been too slow
to act on... >> We didn't see it fast
enough... >> I think we were too slow... >> NARRATOR: ...and former
employees. >> I mean everybody
was pretty upset that we hadn't caught
it during the election. >> NARRATOR: How Facebook was
used to disrupt democracy around the globe. >> I don't think any of us,
Mark included, appreciated how much of an
effect we might have had. >> NARRATOR: Correspondent James
Jacoby takes a hard look at the man who wanted
to connect the world. >> JACOBY: Is he not recognizing
the importance of his platform? >> He didn't understand
what he had built. >> NARRATOR: But is he
accountable for helping divide it? >> There is something wrong
systemically with the Facebook algorithms. In effect, polarization
was the key to the model. >> NARRATOR: Tonight on
"Frontline"-- "The Facebook Dilemma." (birds chirping) ♪ ♪ >> Are we good? >> Should I put the beer down? >> Nah, no, actually, I'm gonna
mention the beer. (laughing) >> Hard at work. >> So I'm here in Palo Alto,
California, chilling with Mark Zuckerberg
of the Facebook.com, and we're drinking out of a keg
of Heineken because... what are we celebrating, Mark? >> We just got three million
users. >> 11, 12, 13...
>> Whoo! >> Tell us, you know, simply
what Facebook is. >> I think Facebook is an online
directory for colleges. I realized that because I didn't
have people's information, I needed to make it interesting
enough so that people would want to use
the site and want to, like, put their information up. So we launched it at Harvard,
and within a couple of weeks, two-thirds of the school
had signed up. So we're, like, "All right, this
is pretty sweet, like, let's just go all out." I mean, it's just interesting
seeing how it evolves. We have a sweet office. >> Yeah, well, show us...
show us around the crib. (talking in background) We didn't want cubicles, so we got IKEA kitchen tables
instead. I thought that kind of went
along with our whole vibe here. >> Uh-huh.
What's in your fridge? >> Some stuff. There's some beer down there. >> How many people work for you? >> It's actually 20 right now. >> Did you get this shot,
this one here, the lady riding a pit bull? >> Oh, nice. >> All right, it's really all
I've got. >> That's cool. >> Where are you taking Facebook
at this point in your life? >> Um, I mean... there doesn't
necessarily have to be more. ♪ ♪ >> From the early days, Mark had
this vision of connecting the whole world. So if Google was about providing
you access to all the information, Facebook was about connecting
all the people. >> Can you just say your name
and pronounce it so nobody messes it up
and they have it on tape? >> Sure, it's Mark Zuckerberg.
>> Great. >> It was not crazy. Somebody was going to connect
all those people, why not him? >> We have our Facebook Fellow,
we have Mark Zuckerberg. >> I have the pleasure of
introducing Mark Zuckerberg, founder of Facebook.com. (applause) >> Yo. >> When Mark Zuckerberg
was at Harvard, he was fascinated by hacker
culture, this notion that software
programmers could do things that would shock the world. >> And a lot of times, people
are just, like, too careful. I think it's more useful to,
like, make things happen and then, like, apologize later, than it is to make sure that you
dot all your I's now and then, like, just not get
stuff done. >> So it was a little bit
of a renegade philosophy and a disrespect for authority
that led to the Facebook motto "Move fast and break things." >> Never heard of Facebook? (laughing) >> Our school went crazy for the
Facebook. >> It creates its own world that
you get sucked into. >> We started adding things like
status updates and photos and groups and apps. When we first launched, we were
hoping for, you know, maybe 400, 500 people. (cheering) >> Toast to the first 100
million, and the next 100 million. >> Cool.
>> So you're motivated by what? >> Building things that,
you know, change the world in a way that it needs
to be changed. >> Who is Barack Obama? The answer is right there
on my Facebook page. >> Mr. Zuckerberg...
>> 'Sup, Zuck? >> In those days,
"move fast and break things" didn't seem to be sociopathic. >> If you're building a product
that people love, you can make a lot of mistakes. >> It wasn't that they intended
to do harm so much as they were unconcerned
about the possibility that harm would result. >> So just to be clear, you're
not going to sell or share any of the information on
Facebook? >> We're not gonna share
people's information, except for with the people that they've asked for it
to be shared. >> Technology optimism was so
deeply ingrained in the value system
and in the beliefs of people in Silicon Valley... >> We're here for a hackathon,
so let's get started. >> ...that they'd come to
believe it is akin to the law of gravity, that of course technology makes
the world a better place. It always had, it always will. And that assumption essentially
masked a set of changes that were going on
in the culture that were very dangerous. >> From KXJZ in Sacramento...
>> For Monday, June 27... >> NARRATOR: Mark Zuckerberg's
quest to connect the world would bring about
historic change, and far-reaching consequences,
in politics, privacy, and technology. We've been investigating
warning signs that existed long before
problems burst into public view. >> It was my mistake,
and I'm sorry... >> NARRATOR: But for those
inside Facebook, the story began with an
intoxicating vision that turned into a lucrative
business plan. >> Well, the one thing that
Mark Zuckerberg has been so good at is being incredibly
clear and compelling about the mission that Facebook
has always had. >> Facebook's mission is to
give people the power to share. Give people the power to share. In order to make the world more
open and connected... More open and connected... Open and connected... More open and connected. (applause) >> JAMES JACOBY: How pervasive
a mission was that inside of the company? Give me a sense of that. >> It was something that... You know, Mark doesn't just say
it when we do, you know, ordered calisthenics
in the morning and we yell the mission to each other,
right? We would actually say it
to each other, you know, when Mark wasn't around. >> JACOBY:
And that was a mission that you really believed in? >> How could you not? How exciting. What if connecting the world
actually delivered a promise that we've been
looking for to genuinely make the world
a better place? >> JACOBY: Was there ever
a point where there were questions internally
about this mission being naive optimism? >> I think the short answer is
completely yes, and I think that's why
we loved it. Especially in a moment like when
we crossed a billion monthly active users
for the first time. And Mark's... the way I recall
Mark at the time, I remember thinking,
"I don't think Mark is going to stop until he gets
to everybody." >> I think some of us had an
early understanding that we were creating in some
ways a digital nation-state. This was the greatest experiment
in free speech in human history. >> There was a sense inside the
company that we are building the future
and there was a real focus on youth being a good thing. It was not a particularly
diverse workforce. It was very much the sort of
Harvard, Stanford, Ivy League group of people who
were largely in their 20s. >> I was a big believer in the
company. Like, I knew that it was going
to be a paradigm-shifting thing. There was this, definitely this
feeling of everything for the company, of this, you
know, world-stirring vision. Everyone more or less dressed
with the same fleece and swag with logo on it. Posters on the wall that looked
somewhat Orwellian. But, of course, you know, in an
upbeat way, obviously. And, you know, some of the
slogans are pretty well-known-- "Move fast and break things,"
"Fortune favors the bold," "What would you do
if you weren't afraid?" You know, it was always this
sort of rousing rhetoric that would push you to go
further. >> NARRATOR: Antonio Garcia
Martinez, a former product manager on
Facebook's advertising team, is one of eight former Facebook
insiders who agreed to talk on camera
about their experiences. >> In Silicon Valley,
there's a, you know, almost a mafioso code of silence
that you're not supposed to talk about the business in any but
the most flattering way, right? Basically, you can't say
anything, you know, measured or truthful about the business. And I think,
as perhaps with Facebook, it's kind of arrived
at the point at which it's so important, it needs to
be a little more transparent about how it works. Like, let's stop the little
(bleep) parade about everyone in Silicon
Valley, you know, creating, disrupting this and improving
the world, right? It's, in many ways,
a business like any other. It's just kind of more exciting
and impactful. (Daft Punk's "Harder, Better,
Faster, Stronger" playing) >> NARRATOR: By 2007, Zuckerberg
had made it clear that the goal of the business
was worldwide expansion. >> Almost a year ago, when we
were first discussing how to let everyone
in the world into Facebook, I remember someone said to me,
"Mark, we already have nearly every college student in
the U.S. on Facebook. It's incredible that we were
even able to do that. But no one gets a second trick
like that." Well, let's take a look at how
we did. (cheering and applause) >> JACOBY: What was the growth
team about? What did you do at growth? >> The story of growth has
really been about making Facebook available to
people that wanted it but couldn't have access to it. >> NARRATOR: Naomi Gleit,
Facebook's second-longest-serving employee,
is one of five officials the company put forward to talk
to "Frontline." She was an original member of
the growth team. >> One of my first projects was
expanding Facebook to high school students. I worked on translating Facebook
into over a hundred languages. When I joined, there were one
million users, and now there's over
two billion people using Facebook every month. >> JACOBY: Some of the problems
that have reared their heads with Facebook over the past
couple of years seem to have been caused
in some ways by this exponential growth. >> So, I think Mark-- and Mark
has said this, that we have been slow
to really understand the ways in which Facebook
might be used for bad things. We've been really focused
on the good things. >> So who are all of these
new users? >> The growth team had tons of
engineers figuring out how you could make the new user experience more
engaging, how you could figure out how to
get more people to sign up. Everyone was focused on growth,
growth, growth. >> Give people the power to
share. >> NARRATOR:
And the key to keeping all these new people engaged... >> To make the world more open
and connected. >> NARRATOR: ...was Facebook's
most important feature... >> News Feed. >> NARRATOR: News Feed, the
seemingly endless stream of stories, pictures, and
updates shared by friends, advertisers,
and others. >> It analyzes all the
information available to each user, and it actually
computes what's going to be the most interesting piece of
information, and then publishes a little
story for them. >> It's your
personalized newspaper, it's your "The New York Times"
of you, channel you. It is, you know, your
customized, optimized vision of the world. >> NARRATOR: But what appeared
in users' News Feed wasn't random. It was driven by a secret
mathematical formula, an algorithm. >> The stories are ranked in
terms of what's going to be the most important, and we
design a lot of algorithms so we can produce interesting
content for you. >> The goal of the News Feed is
to provide you, the user, with the content on Facebook
that you most want to see. It is designed to make you want
to keep scrolling, keep looking, keep liking. >> That's the key.
That's the secret sauce. That's how... that's why we're
worth X billion dollars. >> NARRATOR: The addition of the
new "like" button in 2009 allowed News Feed to collect
vast amounts of users' personal data that
would prove invaluable to Facebook. >> At the time we were a little
bit skeptical about the like button--
we were concerned. And as it turned out our
intuition was just dead wrong. And what we found was that the
like button acted as a social lubricant. And, of course, it was also
driving this flywheel of engagement, that people felt
like they were heard on the platform whenever they
shared something. >> Connect to it by liking it... >> And it became a driving force
for the product. >> It was incredibly important
because it allowed us to understand who are the people
that you care more about, that cause you to react,
and who are the businesses, the pages, the other interests
on Facebook that are important to you. And that gave us a degree of
constantly increasing understanding about people. >> News Feed got off to a bit of
a rocky start, and now our users love
News Feed. They love it. >> NARRATOR: News Feed's
exponential growth was spurred on by the fact that
existing laws didn't hold internet
companies liable for all the content being posted
on their sites. >> So, Section 230 of the
Communications Decency Act is the provision which allows
the internet economy to grow and thrive. And Facebook is one of the
principal beneficiaries of this provision. It says don't hold this internet
company responsible if some idiot says something
violent on the site. Don't hold the internet company
responsible if somebody publishes something
that creates conflict, that violates the law. It's the quintessential
provision that allows them to say,
"Don't blame us." >> NARRATOR: So it was up to
Facebook to make the rules, and inside the company,
they made a fateful decision. >> We took a very libertarian
perspective here. We allowed people to speak
and we said, "If you're going to incite
violence, that's clearly out of bounds. We're going to kick you off
immediately." But we're going to allow people
to go right up to the edge and we're going to allow other
people to respond. We had to set up some ground
rules. Basic decency, no nudity, and
no violent or hateful speech. And after that, we felt some
reluctance to interpose our value system
on this worldwide community that was growing. >> JACOBY: Was there
not a concern, then, that it could become sort of
a place of just utter confusion, that you have lies that are
given the same weight as truths, and that it kind of just
becomes a place where truth becomes completely
obfuscated? >> No. We relied on what we thought
were the public's common sense and common decency to police the
site. >> NARRATOR: That approach would
soon contribute to real-world consequences far
from Silicon Valley, where Mark Zuckerberg's
optimistic vision at first seemed to be playing
out. (crowd chanting) The Arab Spring had come to
Egypt. (crowd chanting) It took hold with the help of a
Facebook page protesting abuses by the regime
of Hosni Mubarak. >> Not that I was thinking that
this Facebook page was going to be effective. I just did not want
to look back and say that happened and I just didn't
do anything about it. >> NARRATOR: At the time, Wael
Ghonim was working for Google in the Middle East. >> In just three days, over
100,000 people joined the page. Throughout the next few months, the page was growing until
what happened in Tunisia. >> Events in Tunisia have
captured the attention of viewers around the world, and a lot of it was happening
online. >> It took just 28 days until
the fall of the regime. >> And it just created for me
a moment of, "Maybe we can do this." And I just posted an event
calling for a revolution in ten days, like we should all get to the
street and we should all bring down
Mubarak. >> Organized by a group of
online activists... >> They're calling it the
Facebook Revolution... (crowd chanting) >> NARRATOR: Within days,
Ghonim's online cry had helped fill the streets
of Cairo with hundreds of thousands of
protesters. (crowd chanting) 18 days later... >> (translated): President
Muhammad Hosni Mubarak has decided to step down. (cheering) >> They have truly achieved the
unimaginable. >> MAN: It's generally acknowledged
that Ghonim's Facebook page first sparked the protests. >> JACOBY: There was a moment
that you were being interviewed on CNN. >> Yeah, I remember that. >> First Tunisia, now Egypt,
what's next? >> Ask Facebook. >> Ask what?
>> Facebook. >> Facebook. >> The technology was, for me,
the enabler. I would not have been able
to engage with others, I would not have been able to
propagate my ideas to others without social media,
without Facebook. >> You're giving Facebook a lot
of credit for this? >> Yeah, for sure. I want to meet Mark Zuckerberg
one day and thank him, actually. >> Did you ever think that this
could have an impact on revolution? >> You know, my own opinion
is that it would be extremely arrogant for any
specific technology company to claim any meaningful role
in, in those. But I do think that the overall
trend that's at play here, which is people being able to
share what they want with the people who they want, is an extremely powerful thing,
right? And we're kind of fundamentally
rewiring the world from the ground up. And it starts with people... >> They were relatively
restrained externally about taking credit for it,
but internally they were, I would say, very happy to take
credit for the idea that social media was being used
to effect democratic change. >> Activists and civil society
leaders would just come up to me and say, you know,
"Wow, we couldn't have done this without you guys." Government officials, you know,
would say, "Does Facebook really realize
how much you guys are changing our societies?" >> It felt like Facebook had
extraordinary power, and power for good. >> NARRATOR: But while Facebook
was enjoying its moment... (man shouting, crowd chanting) Back in Egypt,
on the ground and on Facebook, the situation was unraveling. >> Following the revolution, things took a much worse
direction than we had anticipated. >> There's a complete split
between the civil community and those who are calling
for an Islamic state. >> What was happening in Egypt
was polarization. >> Deadly clashes between
Christians and military police. >> (translated): The Brotherhood
cannot rule this country. >> And all these voices started
to clash, and the environment on social
media bred that kind of clash, like that
polarization-- rewarded it. >> When the Arab Spring
happened, I know that a lot of people
in Silicon Valley thought our technologies helped
bring freedom to people, which was true. But there's a twist to this, which is Facebook's News Feed
algorithm. >> If you increase the tone of
your posts against your opponents, you are
gonna get more distribution. Because we tend to be
more tribal. So if I call my opponents names, my tribe is happy and
celebrating, "Yes, do it, like,
comment, share, so more people end up seeing
it." Because the algorithm is going
to say, "Oh, okay,
that's engaging content, people like it,
show it to more people." >> There were also other groups
of thugs, part of the pattern
of sectarian violence. >> The hardest part for me
was seeing the tool that brought us together
tearing us apart. These tools are just enablers
for whomever, they don't separate between
what's good and bad. They just look at engagement
metrics. >> NARRATOR: Ghonim himself
became a victim of those metrics. >> There was a page, it had,
like, hundreds of thousands of followers-- all it did
was create fake statements, and I was a victim of that page. They wrote statements about me
insulting the army, which puts me at serious risk because that is not something
I said. I was extremely naive in a way I
don't like, actually, now, thinking that these are
liberating tools. It's the spread of
misinformation, fake news, in Egypt in 2011. >> NARRATOR: He says he later
talked to people he knew at Facebook and other companies
about what was going on. >> I tried to talk to people
who are in Silicon Valley, but I feel like it was not,
it was not being heard. >> JACOBY: What were you trying
to express to people in Silicon Valley at the time?
>> It's very serious. Whatever that we... that you are
building has massive, serious unintended consequences on the lives of people
on this planet. And you are not investing enough
in trying to make sure that what you are building does
not go in the wrong way. And it's very hard to be in
their position. No matter how they try and move
and change things, there will be always unintended
consequences. >> Activists in my region were
on the front lines of, you know, spotting corners of Facebook
that the rest of the world, the rest of the company,
wasn't yet talking about, because in a company that's
built off numbers and metrics and measurements, anecdotes
sometimes got lost along the way. And that was always a
real challenge, and always bothered me. >> NARRATOR: Elizabeth Linder,
Facebook's representative in the region at the time, was also hearing warnings
from government officials. >> So many country
representatives were expressing to me a huge concern about the
ability of rumors to spread on Facebook, and what do you do
about that? >> JACOBY: How did you respond
to that at the time? >> We, we didn't have a solution
for it, and so the best that I could do
is report back to headquarters that this is
something that I was hearing on the ground. >> JACOBY: And what sort of
response would you get from headquarters? >> You know, I... it's
impossible to be specific about that, because it was
always just kind of a, "This is what I'm hearing, this
is what's going on." But I think in a... in a company
where the, the people that could have
actually, you know, had an impact on making those
decisions are not necessarily seeing it
firsthand. >> I think everything that
happened after the Arab Spring should have been a warning sign
to Facebook. >> NARRATOR: Zeynep Tufekci,
a researcher and former computer programmer, had also been raising alarms
to Facebook and other social media
companies. >> These companies were terribly
understaffed, in over their heads in terms of
the important role they were playing. Like, all of a sudden you're
the public sphere in Egypt. So I kept starting to talk to my
friends at these companies and saying,
"You have to staff up. You have to put in large amounts
of people who speak the language, who
understand the culture, who understand the complexities
of wherever you happen to operate." >> NARRATOR: But Facebook hadn't
been set up to police the amount of content coming
from all the new places it was expanding to. >> I think no one at any of
these companies in Silicon Valley
has the resources for this kind of scale. You had queues of work for
people to go through and hundreds of employees who
would spend all day every day clicking yes, no, keep, take
down, take down, take down, keep up, keep up,
making judgment calls, snap judgment calls, about, "Does it violate our terms of
service? Does it violate our standards
of decency? What are the consequences
of this speech?" So you have this fabulously
talented group of mostly 20-somethings who are
deciding what speech matters, and they're doing it in real
time, all day, every day. >> JACOBY: Isn't that scary?
>> It's terrifying. Right? The responsibility was awesome. No one could ever have predicted
how fast Facebook would grow. The, the trajectory of growth of
the user base and of the issues was like this. And of all... all staffing
throughout the company was like this. The company was trying to make
money, it was trying to keep costs
down. It had to be a going concern. It had to be a
revenue-generating thing, or it would cease to exist. >> NARRATOR: In fact, Facebook
was preparing to take its rapidly growing business
to the next level by going public. >> I'm David Ebersman,
Facebook's CFO. Thank you for taking the time to
consider an investment in Facebook. >> The social media giant hopes
to raise $5 billion. >> The pressure heading into the
I.P.O., of course, was to prove that Facebook was a great
business. Otherwise, we'd have no
shareholders. >> Facebook-- is it worth $100
billion? Should it be valued at that? >> NARRATOR: Zuckerberg's
challenge was to show investors and advertisers the profit that
could be made from Facebook's
most valuable asset-- the personal data it had
on its users. >> Mark, great as he was at
vision and product, he had very little experience in building a big
advertising business. >> NARRATOR: That would be the
job of Zuckerberg's deputy, Sheryl Sandberg, who'd done
the same for Google. >> At Facebook we have a broad
mission: We want to make the world more
open and connected. >> The business model we see
today was created by Sheryl Sandberg and the team
she built at Facebook, many of whom had been with her
at Google. >> NARRATOR: Publicly, Sandberg
and Zuckerberg had been downplaying the extent of the
personal data Facebook was collecting,
and emphasizing users' privacy. >> We are focused on privacy. We care the most about privacy. Our business model is by far
the most privacy-friendly to consumers. >> That's our mission, right? I mean, we have to do that
because if people feel like they don't have control
over how they're sharing things, then we're failing them. >> It really is the point that
the only things Facebook knows about you are things
you've done and told us. >> NARRATOR: But internally, Sandberg would soon
lead Facebook in a very different direction. >> There was a meeting, I think
it was in March of 2012, in which, you know, it was
everyone who built stuff inside ads, myself among them. And, you know, she basically
recited the reality, which is, revenue was
flattening. It wasn't slow,
it wasn't declining, but it wasn't growing
nearly as fast as investors would have guessed. And so she basically said, like,
"We have to do something. You people have
to do something." And so there was a big effort
to basically pull out all the stops and start
experimenting way more aggressively. The reality is that, yeah,
Facebook has a lot of personal data, your chat with
your girlfriend or boyfriend, your drunk party photos
from college, etc. The reality is that none of that
is actually valuable to any marketer. They want commercially
interesting data. You know, what products did you
take off the shelf at Best Buy? What did you buy in your last
grocery run? Did it include diapers?
Do you have kids? Are you head of household? Right, it's things like that,
things that exist in the outside world,
that just do not exist inside Facebook at all. >> NARRATOR: Sandberg's team
started developing new ways to collect personal data from
users wherever they went on the internet and when they
weren't on the internet at all. >> And so, there's this
extraordinary thing that happens that doesn't get
much attention at the time. About four or five months before
the I.P.O., the company announces
its first relationship with data broker companies, companies that most Americans
aren't at all aware of, that go out and buy up data
about each and every one of us-- what we buy, where we shop,
where we live, what our traffic patterns are,
what our families are doing, what our likes are, what
magazines we read-- data that the consumer doesn't
even know that's being collected
about them because it's being collected
from the rest of their lives by companies they don't know, and it's now being shared
with Facebook, so that Facebook can target ads
back to the user. >> What Facebook does
is profile you. If you're on Facebook, it's
collecting everything you do. If you are off Facebook,
it's using tracking pixels to collect what you are
browsing. And for its micro-targeting to
work, for its business model to work, it has to remain
a surveillance machine. >> They made a product that was
a better tool for advertisers than anything that had ever come
before it. >> And of course the ad revenue
spikes. That change alone, I think,
is a sea change in the way the company felt
about its future and the direction it was headed. >> NARRATOR: Sparapani was so
uncomfortable with the direction Facebook
was going, he left before the company's
work with data brokers took effect. The extent of Facebook's data
collection was largely a secret until
a law student in Austria had a chance encounter
with a company lawyer. >> I kind of wanted a semester
off so I actually went to California, to Santa Clara
University in the Silicon Valley. Someone from Facebook was a
guest speaker explaining to us basically how they deal with
European privacy law. And the general understanding
was, you can do whatever you want to
do in Europe because they do have data
protection laws, but they don't really enforce
them at all. So I sent an email to Facebook
saying I want to have a copy of all my data. So I got from Facebook about
1,200 pages, and I read through it. In my personal file, I think the
most sensitive information was in my messages. For example, a friend of mine
was in the closed unit of the... of a psychological hospital
in Vienna. I deleted all these messages,
but all of them came back up. And you have messages about, you
know, love life and sexuality. And all of that is kept. Facebook tries to give you the
impression that you share this only with
friends. The reality is,
Facebook is always looking. There is a data category called
"last location," where they store where they
think you've been the last time. If you tag people in pictures,
there's GPS location, so by that they know which
person has been at what place at what time. Back on the servers, there is,
like, a treasure trove just, like, ten times as big
as anything we ever see on the screen. >> NARRATOR: As Facebook was
ramping up its data collection business ahead of the I.P.O.,
Schrems filed 22 complaints with the Data Protection
Commission in Ireland, where Facebook has its
international headquarters. >> And they had 20 people at the
time over a little supermarket in a small town, it's called
Portarlington. It's 5,000 people in the middle
of nowhere. And they were meant to regulate
Google or Facebook or LinkedIn and all of them. >> NARRATOR: Schrems claimed
Facebook was violating European privacy law in the way
it was collecting personal data and not telling users what
they were doing with it. >> And after we filed these
complaints, that was when actually Facebook
reached out, basically saying, you know,
"Let's sit down and have a coffee and talk about
all of this." So we actually had a kind of
notable meeting that was in 2012
at the airport in Vienna. But the interesting thing is
that most of these points, they simply didn't have
an answer. You totally saw that their pants
were down. However, at a certain point,
I just got a text message from the data protection
authority saying they're not available to speak
to me anymore. That was how this procedure
basically ended. Facebook knew that the system
plays in their favor, so even if you violate the law,
the reality is it's very likely not gonna be
enforced. >> NARRATOR: Facebook disputed
Schrems's claims, and said it takes European
privacy laws seriously. It agreed to make its policies
clearer and stop storing some kinds of user data. >> So without further ado,
Mark Zuckerberg. >> NARRATOR: In Silicon Valley, those who covered the tech
industry had also been confronting
Facebook about how it was handling users'
personal data. >> Privacy was my number-one
concern back then. So when we were thinking about
talking to Mark, the platform was an issue, there were a bunch of privacy
violations, and that's what we wanted to
talk to him about. Is there a level of privacy that
just has to apply to everyone? Or do you think... I mean, you
might have a view of, this is what privacy means to
Mark Zuckerberg, so this is what it's going to
mean at Facebook. >> Yeah, I mean, people can
control this, right, themselves. Simple control always has been
one of the important parts of using Facebook. >> NARRATOR: Kara Swisher has
covered Zuckerberg since the beginning. She interviewed him after the
company had changed its default privacy settings. >> Do you feel like it's a
backlash? Do you feel like you are
violating people's privacy? And when we started to ask
questions, he became increasingly
uncomfortable. >> You know, it's... >> I think the issue is,
you became the head of the biggest social networking
company on the planet. >> Yeah, no, so... but I... the
interesting thing is that, you know, so I started this when
I was, you know, started working on this type of
stuff when I was 18. >> So he started to sweat quite
a lot, and then a lot a lot, and then a real lot. So the kind that... this kind of
thing where, you know, like "Broadcast News," where it
was dripping down, like... or Tom Cruise in that
"Mission: Impossible." It was just... it was going to
his chin and dripping off. >> You know, a lot of stuff
changed as we've gone from building this project in a
dorm room... >> And it wasn't stopping
and I was noticing that one of the people from
Facebook was, like, "Oh, my God," and was...
we were... I was trying to figure out
what to do. >> Yeah. I mean, a lot of stuff
happened along the way. I think, you know, there were
real learning points and turning points along the way
in terms of... in terms of building things. >> He was in such distress, and
I know it sounds awful, but I felt like his mother. Like, "Oh, my God, this poor guy
is gonna faint." I thought he was gonna faint,
I did. Do you want to take off the
hoodie? >> Uh, no.
(chuckles) Whoa. >> Well, different people think
different things. He's told us he had the flu. I felt like... he had had a
panic attack, is what happened. >> Maybe I should take off the
hoodie. >> Take off the hoodie.
>> Go ahead. What the hell? >> That is a warm hoodie.
>> Yeah. No, it's a thick hoodie. We... it's, um, it's a company
hoodie. We print our mission on the
inside. >> What?! Oh, my God, the inside of the
hoodie, everybody. Take a look. What is it? "Making the..." >> "Making the world more open
and connected." >> Oh, my God.
It's like a secret cult. >> JACOBY: From that interview
and from others, I mean, how would you have
characterized Mark's view of privacy? >> Well, you know, I don't know
if he thought about that. It's kind of interesting because
they're very... they're very loose on it. They have a viewpoint that this
helps you as the user to get more information, and
they will deliver up more... That's the whole ethos of
Silicon Valley, by the way. If you only give us everything,
we will give you free stuff. There is a trade being made
between the user and Facebook. The question is, are they
protecting that data? >> Thank you, Mark. >> NARRATOR: Facebook had been
free to set its own privacy standards,
because in the U.S. there are no overarching
privacy laws that apply to this kind
of data collection. But in 2010, authorities at the
Federal Trade Commission became concerned. >> In most other parts of the
world, privacy is a right. In the United States,
not exactly. >> NARRATOR: At the FTC, David
Vladeck was investigating whether Facebook had been
deceiving its users. What he found was that Facebook
had been sharing users' personal data with
so-called "third-party developers"-- companies that built games and
apps for the platform. >> And our view was that, you
know, it's fine for Facebook to collect this data,
but sharing this data with third parties without
consent was a no-no. >> But at Facebook, of course,
we believe that our users should have complete control
of their information. >> The heart of our cases
against companies like Facebook was deceptive conduct. That is, they did not make it
clear to consumers the extent to which their
personal data would be shared with third parties. >> NARRATOR: The FTC had another
worry: They saw the potential for data
to be misused because Facebook wasn't keeping
track of what the third parties were doing with it. >> They had, in my view,
no real control over the third-party app
developers that had access to the site. They could have been anyone. There was no due diligence. Anyone, essentially, who could
develop a third-party app could get access to the site. >> JACOBY: It could have been
somebody working for a foreign adversary. >> Certainly. It could have been somebody
working... yes, for, you know,
for the Russian government. >> NARRATOR: Facebook settled
with the FTC without admitting guilt and,
under a consent order, agreed to fix the problems. >> JACOBY: Was there an
expectation at the time of the consent order
that they would staff up to ensure that their users' data
was not leaking out all over the place?
>> Yes. That was the point of this
provision of the consent order that required them to identify
risk to personal privacy and to plug those gaps quickly. >> NARRATOR: Inside Facebook,
however, with the I.P.O. on the horizon, they were also under pressure
to keep monetizing all that personal information, not just fix the FTC's privacy
issues. >> Nine months into my first job
in tech, I ended up in an interesting
situation where, because I had been the main
person who was working on privacy issues with respect
to Facebook platform-- which had many, many, many
privacy issues, it was a real hornet's nest. And I ended up in a meeting with
a bunch of the most senior executives at
the company, and they went around the room,
and they basically said, "Well, who's in charge?" And the answer was me, because
no one else really knew anything about it. You'd think that a company of
the size and importance of Facebook, you know, would
have really focused and had a team of people and,
you know, very senior people working on
these issues, but it ended up being me. >> JACOBY: What did you think
about that at the time? >> I was horrified. I didn't think I was qualified. >> NARRATOR: Parakilas tried to
examine all the ways that the data Facebook
was sharing with third-party developers
could be misused. >> My concerns at that time were
that I knew that there were all these malicious actors who
would do a wide range of bad things,
given the opportunity, given the ability to target
people based on this information that Facebook had. So I started thinking through what are the worst-case
scenarios of what people could do
with this data? And I showed some of the kinds
of bad actors that might try to attack, and I shared it out with a
number of senior executives. And the response was muted,
I would say. I got the sense that this just
wasn't their priority. They weren't that concerned
about the vulnerabilities that the company was creating. They were concerned about
revenue growth and user growth. >> JACOBY: And that was
expressed to you, or that's something that you
just gleaned from the interactions? >> From the lack of a response,
I gathered that, yeah. >> JACOBY: And how senior were
the senior executives? >> Very senior. Like, among the top five
executives in the company. >> NARRATOR: Facebook has said
it took the FTC order seriously and, despite Parakilas's
account, had large teams of people working to improve
users' privacy. But to Parakilas and others
inside Facebook, it was clear the business model
continued to drive the mission. In 2012, Parakilas left the
company, frustrated. >> I think there was a certain
arrogance there that led to a lot of bad
long-term decision-making. The long-term ramifications of
those decisions was not well thought through
at all. And it's got us to where we are
right now. (cheers and applause) >> Your visionary, your founder,
your leader. Mark, please come to the podium. (cheers and applause) >> NARRATOR: In May of 2012, the
company finally went public. >> The world's largest social
network managed to raise more than $18 billion, making it the largest technology
I.P.O. in U.S. history. >> People literally lined up in
Times Square around the NASDAQ board. >> We'll ring this bell and
we'll get back to work. >> With founder Mark Zuckerberg
ringing the NASDAQ opening bell remotely from Facebook
headquarters in Menlo Park, California. >> NARRATOR: Mark Zuckerberg was
now worth an estimated $15 billion. Facebook would go on to acquire
Instagram and WhatsApp on its way to becoming one of
the most valuable companies in the world. >> Going public is an important
milestone in our history. But here's the thing: our mission isn't to be
a public company. Our mission is to make the world
more open and connected. (cheering) >> NARRATOR: At Facebook, the
business model built on getting more and more of users'
personal data was seen as a success. But across the country, researchers working for the
Department of Defense were seeing something else. >> The concern was that social
media could be used for really nefarious purposes. The opportunities for
disinformation, for deception, for everything else,
are enormous. Bad guys or anybody could use
this for any kind of purpose in a way that wasn't possible
before. That's the concern. >> JACOBY: And what did you see
as a potential threat of people giving up their data? >> That they're opening
themselves up to being targets for
manipulation. I can manipulate you to buy
something, I can manipulate you to vote for somebody. It's like putting a target...
painting a big target on your front and on your chest
and on your back, and saying, "Here I am. Come and manipulate me. You have every... I've given you
everything you need. Have at it." That's a threat. >> NARRATOR: Waltzman says
Facebook wouldn't provide data to help his research. But from 2012 to 2015, he and
his colleagues published more than 200 academic papers
and reports about the threats they were
seeing from social media. >> What I saw over the years of
the program was that the medium enables you
to really take disinformation and turn it into
a serious weapon. >> JACOBY: Was your research
revealing a potential threat to national security? >> Sure, when you looked at how
it actually worked. You see where the opportunities
are for manipulation, mass manipulation. >> JACOBY: And is there an
assumption there that people are easily misled? >> Yes, yes, people are easily
misled, if you do it the right way. For example, when you see people
forming into communities, okay, what's called
filter bubbles. I'm gonna exploit that to craft
my message so that it resonates most
exactly with that community, and I'll do that for every
single community. It would be pretty easy... it
would be pretty easy to set up a fake account, and a large
number of fake accounts, embed them in different
communities, and use them to disseminate
propaganda. >> JACOBY: At an enormous scale? >> Yes, well, that's why it's a
serious weapon, because it's an enormous scale. It's the scale that makes it
a weapon. >> NARRATOR: In fact, Waltzman's
fears were already playing out at a secret propaganda factory
in St. Petersburg, Russia, called the Internet Research
Agency. Hundreds of Russian operatives
were using social media to fight the anti-Russian
government in neighboring Ukraine. Vitaly Bespalov says he was one
of them. >> JACOBY: Can you explain, what
is the Internet Research Agency? (speaking Russian) >> (translated): It's a company
that creates a fake perception of Russia. They use things like
illustrations, pictures-- anything that would influence
people's minds. When I worked there,
I didn't hear anyone say, "The government runs us" or
"the Kremlin runs us," but everyone there knew and
everyone realized it. >> JACOBY: Was the main
intention to make the Ukrainian government
look bad? >> (translated): Yeah, yeah,
that's what it was. This was the intention
with Ukraine. Put President Poroshenko
in a bad light and the rest of the government,
and the military, and so on. (speaking Russian) You come to work and there's a
pile of SIM cards, many, many SIM cards,
and an old mobile phone. You need an account to register
for various social media sites. You pick any photo of a random
person, choose a random last name, and
start posting links to news in different groups. >> NARRATOR: The Russian
propaganda had its intended effect: helping to
sow distrust and fear of the Ukrainian government. (chanting) >> Pro-Russia demonstrators
against Ukraine's new interim
government. >> "Russia, Russia," they chant. >> Russian propaganda was
massive on social media. It was massive. >> There were so many stories
that started emerging on Facebook. >> "Cruel, cruel Ukrainian
nationalists killing people or torturing them because
they speak Russian." >> They scared people. "You see, they're gonna attack, they're gonna burn your
villages. You should worry." (speaking Russian) >> And then the fake staged
news. (speaking Russian) >> "Crucified child
by Ukrainian soldiers," which is totally nonsense. (speaking Russian) >> It was proven that those people were actually
hired actors. >> Complete nonsense. >> But it spreads on Facebook. >> So Facebook was weaponized. >> NARRATOR: Just as in
the Arab Spring, Facebook was being used to
inflame divisions. But now by groups working on
behalf of a foreign power, using Facebook's tools built
to help advertisers boost their content. >> By that time in Facebook,
you could pay money to promote these stories. So your stories emerge
on the top lines. And suddenly you start to
believe in this, and you immediately get
an immediate response. You can test all kinds of
nonsense and understand which nonsense
people do not believe... (man speaking Ukrainian) And which nonsense people
start believing. (chanting in Russian) That will influence the
behavior of a person receptive to propaganda, and then
provoke that person into certain actions. ♪ ♪ >> They decided to undermine
Ukraine from the inside... (gunfire echoing, shouting) ...rather than from outside. >> I mean, basically, think
about this-- Russia hacked us. >> NARRATOR: Dmytro Shymkiv,
a top adviser to Ukraine's president, met
with Facebook representatives and says he asked them
to intervene. >> The response that Facebook
gave us is, "Sorry, we are open platform,
anybody can do anything without... within our policy, which is written
on the website." And when I said,
"But this is fake accounts." (laughs): "You could
verify that." "Well, we'll think about this
but, you know, we, we have a freedom of speech and we are very pro-democracy
platform. Everybody can say anything." >> JACOBY: In the meeting, do
you think you made it explicitly clear that Russia
was using Facebook to meddle in Ukrainian politics? >> I was explicitly saying that
there is a troll factory, that there are posts and news
that are fake, that are lying, and they are promoted on your
platform by, very often, fake accounts. Have a look. At least sending
somebody to investigate. >> JACOBY: And no one... sorry.
>> No. >> JACOBY: No one was sent?
>> No, no. For them, at that time,
it was not an issue. >> NARRATOR: Facebook told
"Frontline" that Shymkiv didn't raise the
issue of misinformation in their meeting, and that their
conversations had nothing to do with what would happen
in the United States two years later. >> JACOBY: It was known to
Facebook in 2014 there was potential for Russian
disinformation campaigns on Facebook. >> Yes. And there were disinformation
campaigns from a number of different
countries on Facebook. You know, disinformation
campaigns were a regular facet
of Facebookery abroad. And... I mean, yeah, technically
that should have led to a learning experience. I just don't know. >> JACOBY: There was plenty that
was known about the potential downsides
of social media and Facebook-- you know,
potential for disinformation, potential for bad actors
and abuse. Were these things that you just
weren't paying attention to, or were these things that were
kind of conscious choices to kind of say, "All right,
we're gonna kind of abdicate responsibility from those things
and just keep growing"? >> I definitely think we've been
paying attention to the things that we know. And one of the biggest
challenges here is that this is really
an evolving set of threats and risks. We had a big effort around
scams. We had a big effort around
bullying and harassment. We had a big effort around
nudity and porn on Facebook. It's always ongoing. And so some of these threats
and problems are new, and I think we're grappling
with that as a company with other companies in this
space, with governments, with other organizations, and so I, I wouldn't say
that everything is new, it's just different problems. >> Facebook is the ultimate
growth stock... >> NARRATOR: At Facebook
headquarters in Menlo Park, they would stick to the mission
and the business model, despite a gathering storm. >> ...get their election news
and decision-making material from Facebook. >> The most extraordinary
election... >> NARRATOR: By 2016, Russia was
continuing to use social media as a weapon. >> ...Hillary Clinton cannot
seem to extinguish... >> NARRATOR: And division and
polarization were running through the presidential
campaign. >> Just use it on lying,
crooked Hillary... >> The race for the White House
was shaken up again on Super Tuesday... >> NARRATOR: Mark Zuckerberg saw
threats to his vision of an open and connected world. >> As I look around,
I'm starting to see people and nations turning inward, against this idea of a connected
world and a global community. I hear fearful voices calling
for building walls and distancing people
they label as others. For blocking free expression,
for slowing immigration, reducing trade and in some cases
around the world, even cutting access
to the internet. >> NARRATOR: But he continued
to view his invention not as part of the problem,
but as the solution. >> And that's why I think the work that we're all doing
together is more important now
than it's ever been before. (cheers and applause) >> NARRATOR: Tomorrow night, "Frontline's" investigation
continues. >> There is absolutely no
company who has had so much influence
on the information that Americans consume. >> NARRATOR: He's the man who
connected the world. But at what cost? >> Polarization was the key
to the model. >> NARRATOR: The global
threat... >> This is an information
ecosystem that just turns democracy
upside down. >> NARRATOR: The 2016
election... >> ...Facebook getting over a
billion political campaign posts. >> NARRATOR: And the company
denials... >> The idea that fake news
on Facebook influenced the election
in any way I think is a pretty crazy idea. >> ...Facebook CEO Mark
Zuckerberg will testify... >> ...and I'm responsible for
what happens here. >> NARRATOR: Is Facebook ready
for the mid-term elections? >> There are a lot of questions
heading into this midterm... >> ...the midterm elections... >> I still have questions if we're going to make sure that
in 2018 and 2020 this doesn't happen again. >> NARRATOR: Part two of
"The Facebook Dilemma." Tomorrow night on "Frontline." >> Go to pbs.org/frontline to
read more about Facebook
from our partner, Washington Post reporter
Dana Priest. >> For Facebook the dilemma is
can they solve these serious problems without
completely revamping their business model. >> Then watch a video explainer
about what Facebook knows about you and how. >> ...even though you never
signed up for it, Facebook now has data about you and stores it as
a shadow profile... >> Connect to the "Frontline"
community at pbs.org/frontline. ♪ ♪ >> For more on this and other
"Frontline" programs, visit our website at
pbs.org/frontline. ♪ ♪ To order Frontline's
"The Facebook Dilemma" on DVD, visit ShopPBS,
or call 1-800-PLAY-PBS. This program is also available
on Amazon Prime Video. ♪ ♪