Tonight, part one of a two-night special. We face a number of important issues around privacy, safety, and democracy. Frontline investigates Facebook.

We didn't take a broad enough view of our responsibilities, and it was my mistake, and I'm sorry.

Told by company insiders: It's possible that we haven't been as fast as we needed to be. Too slow to act. And I didn't see it faster. I think we were too slow. And former employees: Everybody was pretty upset that we hadn't caught it during the election. How Facebook was used to disrupt democracy around the globe: I don't think any of us, Mark included, appreciated how much of an effect we might have had.

Correspondent James Jacoby takes a hard look at the man who wanted to connect the world. Is he not recognizing the importance of his platform? He didn't understand what he had built. But is he accountable for helping divide it? There is something wrong systemically with the Facebook algorithms. How polarization was the key to the model. Tonight on Frontline: The Facebook Dilemma.

We get stuck with the beer down... No, no, actually, I'm gonna mention the beer. It's part of work, I said. I'm here in Palo Alto, California, chilling with Mark Zuckerberg of thefacebook.com, and we're drinking out of a keg of Heineken because... what are we celebrating? Three million users.

You know, what Facebook is... I think Facebook is an online directory for colleges. I realized that because I didn't have people's information, I needed to make it interesting enough so that people would want to use the site and want to, like, put their information out. So I launched it at Harvard, and within a couple of weeks, two-thirds of the school had signed up. You're sort of like, all right, this is pretty sweet, like, let's just go all out. So it's just interesting seeing how it all evolves. We have a sweet office. Yeah? Well, show us around the crib. We
didn't want cubicles, so we got IKEA kitchen tables instead. I thought that went along with our whole vibe here, huh? Right, some stuff, there's some beer down there. How many people work here? It's actually 20 right now. See this shot? This one here's for the ladies. Right, a pit bull. Oh, nice. You know, it's really all I've got at the school. And where do you take this from this point? I mean, there doesn't necessarily have to be more.

From the early days, Mark had this vision of connecting the whole world. So if Google was about providing you access to all the information, Facebook was about connecting all the people.

Could you just say your name and pronounce it, so nobody messes it up, and they have it on tape? Sure, it's Mark Zuckerberg.

It was not crazy. Somebody was gonna connect all those people. Why not him, you know?

We have our Facebook fellow, we have Mark Zuckerberg. I have the pleasure of introducing Mark Zuckerberg, founder of facebook.com.

When Mark Zuckerberg was at Harvard, he was fascinated by hacker culture, this notion that software programmers could do things that would shock the world. A lot of times, people are just, like, too careful. I think it's more useful to, like, make things happen and then, like, apologize later than it is to make sure that you dot all your i's now and then, like, just not get stuff done. So it was a little bit of a renegade philosophy
and a disrespect for authority that led to the Facebook motto: move fast and break things.

We started adding things like status updates and photos and groups and apps. When we first launched, we were hoping for, you know, maybe 400, 500 people. So you're motivated by building things that change the world, and changing it in a way that it needs to be changed? The answer is right there on my Facebook page.

In those days, move fast and break things didn't seem to be sociopathic. If you're building a product that people love, you can make a lot of mistakes. It wasn't that they intended to do harm so much as they were unconcerned about the possibility that harm would result.

So just to be clear, you're not going to sell or share any of the information on Facebook? We're not going to share people's information, except for with the people that they've asked for it to be shared.

Technology optimism was so deeply ingrained in the value system and in the beliefs of people in Silicon Valley, to the point that they'd come to believe it as akin to the law of gravity: that of course technology makes the world a better place. It always had, it always will. And that assumption essentially masked a set of changes that were going on in the culture that were very dangerous.

Mark Zuckerberg's quest to connect the world would bring about historic change and far-reaching consequences in politics, privacy, and technology. We've been investigating warning signs that existed long before problems burst into public view. But for those inside Facebook, the story began with an intoxicating vision that turned into a lucrative business plan.

Well, the one thing that Mark Zuckerberg has been so good at is being incredibly clear and compelling about the mission that Facebook has always had. Facebook's mission is to give people the power to share. How
pervasive a mission was that inside of the company? Give me a sense of that. It's not something that, you know, Mark just says while we do, you know, ordered calisthenics in the morning and we yell the mission to each other, right? We would actually say it to each other, you know, when Mark wasn't around.

And that was a mission that you really believed in? How could you not? You know, how exciting. What if connecting the world actually delivered on a promise that we've been looking for, to genuinely make the world a better place?

Was there ever a point where there were questions internally about this mission being naive optimism? I think the short answer is completely yes, and I think that's why we loved it. Especially in a moment like when we crossed a billion monthly active users for the first time. The way I recall Mark at the time, I remember thinking: I don't think Mark is gonna stop until he gets to everybody.

I think some of us had an early
understanding that we were creating, in some ways, a digital nation-state. This was the greatest experiment in free speech in human history. There was a sense inside the company that we are building the future, and there was a real focus on youth being a good thing. It was not a particularly diverse workforce. It was very much the sort of Harvard, Stanford, Ivy League group of people who were largely in their 20s.

I was a big believer in the company. Like, I knew that it was gonna be a paradigm-shifting thing. There was definitely this feeling of everything for the company, of this, you know, world-stirring vision. Everyone more or less dressed with the same fleece and swag with the logo on it, posters on the wall that look somewhat Orwellian, but of course, you know, done in an upbeat way, obviously. And, you know, some of the slogans are pretty well known: move fast and break things; fortune favors the bold; what would you do if you weren't afraid? You know, it's always this sort of rousing rhetoric that would push you to go further.

Antonio Garcia Martinez, a former product manager on Facebook's advertising team, is one of eight former Facebook insiders who agreed to talk on camera about their experiences.

In Silicon Valley there's, you know, almost a mafioso code of silence, that you're not supposed to talk about the business in any but the most flattering way, right? Basically, you can't say anything, you know, remotely truthful about the business. And I think, perhaps with Facebook, it's kind of arrived at the point at which it's so important, it needs to be a little more transparent about how it works. Like, let's stop the little charade about everyone in Silicon Valley, you know, creating, disrupting this and improving the world, right? It's in many ways a business like any other. It's just kind of more exciting and impactful.

By 2007, Zuckerberg had made it clear that the goal of the business was worldwide expansion. Almost
a year ago, when we were first discussing how to let everyone in the world into Facebook, I remember someone said to me, Mark, we already have nearly every college student in the U.S. on Facebook. It's incredible that we were even able to do that, but no one gets a second trick like that. Well,
let's take a look at how he did it.

What was the growth team about? What did you do at growth? The story of growth has really been about making Facebook available to people that wanted it but couldn't have access to it.

Naomi Gleit, Facebook's second-longest-serving employee, is one of five officials the company put forward to talk to Frontline. She was an original member of the growth team.

One of my first projects was expanding Facebook to high school students. I worked on translating Facebook into over a hundred languages. When I joined, there were one million users, and now there's over two billion people using Facebook every month.

Some of the problems that have reared their head with Facebook over the past couple of years seem to have been caused in some ways by this exponential growth. So I think Mark has said this: that we have been slow to really understand the ways in which Facebook might be used for bad things. We've been really focused on the good things.

So who are all these new users? The growth team had tons of engineers figuring out how you could make the new-user experience more engaging, how you could figure out how to get more people to sign up. Everyone was focused on growth, growth, growth.

And the key to keeping all these new people engaged was Facebook's most important feature: News Feed. News Feed, a seemingly endless stream of stories, pictures, and updates shared by friends, advertisers, and others.

Of all the information available to each user, it actually computes what's gonna be the most interesting piece of information and publishes a little story for them. It's your personalized newspaper. It's your New York Times of you, channel you. It is, you know, your customized, optimized vision of the world.

But what appeared in users' News Feeds wasn't random. It was driven by a secret mathematical
formula: an algorithm. The stories are ranked in terms of what's going to be the most important, and we designed a lot of algorithms so we can produce interesting content for you. The goal of News Feed is to provide you, the user, with the content on Facebook that you most want to see. It is designed to make you want to keep scrolling, keep looking, keep liking. That's the key. That's the secret sauce. That's how we... that's why we're worth X billion dollars.

The addition of the new like button in 2009 allowed News Feed to collect vast amounts of users' personal data that would prove invaluable to Facebook.

At the time, we were a little bit skeptical about the like button. We were concerned, and as it turned out, our intuition was just dead wrong. And what we found is that the like button acted as a social lubricant, and of course, it was also driving this flywheel of engagement, that people felt like they were heard on the platform whenever they shared something, and it became a driving force for the product. It was incredibly important because it allowed us to understand who are the people that you care more about, that cause you to react, and who are the businesses, the pages, the other interests on Facebook that are important to you. And that gave us a degree of constantly increasing understanding about people.

News Feed got off to a bit of a rocky start, and now our users love News Feed. They love it.

News Feed's exponential growth was spurred on by the fact that existing laws didn't hold internet companies liable for all the content being posted on their sites.

So Section 230 of the Communications Decency Act is the provision which allows the internet economy to grow and thrive, and Facebook is one of the principal beneficiaries of this provision. It says, don't hold this internet company responsible if some idiot says something violent on the site. Don't hold the internet company responsible if somebody publishes
something that creates conflict, that violates the law. It's the quintessential provision that allows them to say, don't blame us.

So it was up to Facebook to make the rules, and inside the company, they made a fateful decision.

We took a very libertarian perspective here. We allowed people to speak, and we said, if you're gonna incite violence, that's clearly out of bounds. We're gonna kick you off immediately. But we're gonna allow people to go right up to the edge, and we're gonna allow other people to respond. We had to set up some ground rules: basic decency, no nudity, and no violent or hateful speech. And after that, we felt some reluctance to interpose our value system on this one worldwide community that was growing.

Was there not a concern then that it could become sort of a place
of just utter confusion, that you have lies that are given the same weight as truths, and that it kind of just becomes a place where truth becomes completely obfuscated? No. We relied on what we thought were the public's common sense and common decency to police the site.

That approach would soon contribute to real-world consequences far from Silicon Valley, where Mark Zuckerberg's optimistic vision at first seemed to be playing out. The Arab Spring had come to Egypt. It took hold with the help of a Facebook page protesting abuses by the regime of Hosni Mubarak.

Not that I was thinking that this Facebook page was going to be effective. I just did not want to look back and say that happened and I just didn't do anything about it.

At the time, Wael Ghonim was working for Google in the Middle East.

In just three days, over a hundred thousand people joined the page. Throughout the next few months, the page was growing, until what happened in Tunisia. It took just 28 days to topple the regime. It just created for me a moment of: maybe we can do this. And I just posted an event calling for a revolution in 10 days. Like, we should all get to the street, and we should all bring down Mubarak.

Within days, Ghonim's online cry had helped fill the streets of Cairo with hundreds of thousands of protesters. Eighteen days later: President Muhammad Hosni Mubarak has decided to step down.

It's generally acknowledged that Ghonim's Facebook page first sparked the protests. There was a moment that you were being interviewed on CNN. Yeah, I remember that. First Tunisia, now Egypt. What's next? Facebook.

The technology was, for me, the enabler. I would not have been able to engage with others, I would not have been able to propagate my ideas to others, without social media, without Facebook. You're giving Facebook a lot of credit for this? Yeah, for sure. I want to meet Mark Zuckerberg one day and thank him, actually.

Do you ever
think that this could have an impact on revolution? You know, my own opinion is that it would be extremely arrogant for any specific technology company to claim any meaningful role in those. But
I do think the overall trend that's at play here, which is people being able to share what they want with the people who they want, is an extremely powerful thing, right? And we're kind of fundamentally rewiring the world from the ground up, and it starts with people.

They were relatively restrained externally about taking credit for it, but internally, they were, I would say, very happy to take credit for the idea that social media is being used to effect democratic change.

Activists and civil society leaders would just come up to me and say, you know, wow, we couldn't have done this without you guys. Government officials, you know, would say, does Facebook really realize how much you guys are changing our societies? It felt like Facebook had extraordinary power, and power for good.

But while Facebook was enjoying its moment, back in Egypt, on the ground and on Facebook, the situation was unraveling.

Following the revolution, things went into a much worse direction than what we had anticipated. There was a complete split in the community with those who were calling for an Islamic state. What was happening in Egypt was polarization. Deadly clashes between Christians and military police. And all these voices started to clash, and the environment on social media bred that kind of clash. That polarization... it rewarded it.

When the Arab Spring happened, I know that a lot of people in Silicon Valley thought our technologies helped bring freedom to people, which was true. But there's a twist to this, which is Facebook's News Feed algorithm. If you increase the tone of your posts against your opponents, you're gonna get more distribution, because we tend to be more tribal. So if I call my opponents names, my tribe is happy and celebrating: yes, do it, like, comment, share. So more people end up seeing it, because the algorithm is gonna say, oh, OK, that's engaging content, people like it, show it to more people. The
hardest part for me was seeing the tool that brought us together tearing us apart. These tools are just enablers for whomever. They don't distinguish between what's good and bad. They just look at engagement metrics.

Ghonim himself became a victim of those metrics.

There was a page, it had, like, hundreds of thousands of followers. All it did was create fake statements, and I was a victim of that page. They wrote statements about me insulting the army, which put me at serious risk, because that is not something I said. I was extremely naive, in a way I don't like, actually, now, thinking that these are liberating tools.

It's the spread of misinformation, fake news, in Egypt, in 2011. He says he later talked to people he knew at Facebook and other companies about what was going on.

I tried to talk to people who are in Silicon Valley, but I feel like it was not being heard. What were you trying to express to people in Silicon Valley? It's very serious. Whatever you are building has massive, serious intended and unintended consequences on the lives of people on this planet, and you are not investing enough in trying to make sure that what you are building does not go in the wrong way.

It's very hard to be in their position. No matter how they try and move and change things, there will always be unintended consequences. Activists in my region were on the front lines of, you know, spotting corners of Facebook that the rest of the world, the rest of the company, wasn't yet talking about. Because in a company that's built off numbers and metrics and measurements, anecdotes sometimes got lost along the way. And that was always a real challenge and always bothered me.

Elizabeth Linder, Facebook's representative in the region at the time, was also hearing warnings from government officials.

So many country representatives
were expressing to me a huge concern about the ability of rumors to spread on Facebook, and what do you do about that? How did you respond to that? Well, we didn't have a solution for it, and so the best that I could do was report back to headquarters that this is something that I was hearing on the ground. And what sort of response would you get from headquarters? You know, it's impossible to be specific about that, because it was always just kind of a: this is what I'm hearing, this is what's going on. But I think, in a company where the people that could have actually, you know, had an impact on making those decisions are not necessarily seeing it firsthand...

I think everything that happened after the Arab Spring should have been a warning sign to Facebook.

Zeynep Tufekci, a researcher and former computer programmer, had also been raising alarms to Facebook and other social media companies.

These companies were terribly understaffed,
in over their heads, in terms of the important role they are playing. Like, all of a sudden, you're the public sphere in Egypt. So I kept trying to talk to my friends at these companies, saying: you have to staff up, and you have to put in large amounts of people who speak the language, who understand the culture, who understand the complexities of wherever you happen to operate.

But Facebook hadn't been set up to police the amount of content coming from all the new places it was expanding to.

I think no one at any of these companies in Silicon Valley has the resources for this kind of scale. You had queues of work for people to go through, and hundreds of employees who would spend all day, every day, clicking: yes, no; keep up, take down; take down, take down; keep up, keep up. Making judgment calls, snap judgment calls: does it violate our terms of service? Does it violate our standards of decency? What are the consequences of the speech? So you have this fabulously talented group of mostly 20-somethings who are deciding what speech matters, and they're doing it in real time, all day, every day. Isn't that scary? It's terrifying, right? The responsibility was awesome.

No one could ever have predicted how fast Facebook would grow. The trajectory of growth of the user base and of the issues was like this, and of all staffing throughout the company was like this. The company was trying to make money. It was trying to keep costs down. It had to be a going concern. It had to be a revenue-generating thing, or it would cease to exist.

In fact, Facebook was preparing to take its rapidly growing business to the next level by going public.

I'm David Ebersman, Facebook's CFO. Thank you for taking the time to consider an investment in Facebook.

The pressure heading into the IPO, of course, was to prove that Facebook was a great business; otherwise, we'd have no shareholders. Facebook:
is it worth a hundred billion dollars? Should it be valued...?

Zuckerberg's challenge was to show investors and advertisers the profit that could be made from Facebook's most valuable asset: the personal data it had on its users.

Mark, great as he was at vision and product, had very little experience in building a big advertising business. That would be the job of Zuckerberg's deputy, Sheryl Sandberg, who had done the same for Google.

At Facebook, we have a broad mission. We want to make the world more open and connected.

The business model we see today was created by Sheryl Sandberg and the team she built at Facebook, many of whom had been with her at Google.

Publicly, Sandberg and Zuckerberg had been downplaying the extent of the personal data Facebook was collecting and emphasizing users' privacy.

We are focused on privacy. We care the most about privacy. Our business model is by far the most privacy-friendly to consumers. That's our mission, right? I mean, we have to do that, because if people feel like they don't have control over how they're sharing things, then we're failing them. That really is the point, that the only things Facebook knows about you are things you've done and told us.

But internally, Sandberg would soon lead Facebook in a very different direction.

There's a meeting, I think it was in March of 2012, in which, you know, it was everyone who built stuff inside ads, myself included. And, you know, she basically recited the reality, which is: revenue was flattening. It wasn't slow, it wasn't declining, but it wasn't growing nearly as fast as investors would have guessed. So she basically said, like, we have to do something. You people have to do something. And so there was a big effort to basically pull out all the stops and start experimenting way more aggressively.

The reality is, yeah, Facebook has a lot of personal data: your
chats with your girlfriend or boyfriend, your drunk party photos from college, etc. The reality is that none of that is actually valuable to any marketer. They want commercially interesting data. You know, what products did you take off the shelf at Best Buy? What did you buy in your last grocery run? Did that include diapers? Do you have kids? Are you a head of household? Right? It's things like that, things that exist in the outside world, that just do not exist inside Facebook at all.

Sandberg's team started developing new ways to collect personal data from users wherever they went on the internet, and even when they weren't on the internet at all.

And so there's this extraordinary thing that happens that doesn't get much attention at
the time. About four or five months before the IPO, the company announces its first relationship with data broker companies, companies that most Americans aren't at all aware of, that go out and buy up data about each and every one of us: where we shop, where we live, what our traffic patterns are, what our families are doing, what our likes are, what magazines we read. Data that the consumer doesn't even know is being collected about them, because it's being collected from the rest of their lives, by companies they don't know, and it's now being shared with Facebook so that Facebook can target ads back to the user.

What Facebook does is profile you. If you're on Facebook, it's collecting everything you do. If you're off Facebook, it's using tracking pixels to collect what you're browsing. And for its micro-targeting to work, for its business model to work, it has to remain a surveillance machine.

They made a product that was a better tool for advertisers than anything that had ever come before it, and of course, the ad revenue spiked. That change alone, I think, was a sea change in the way the company felt about its future and the direction it was headed.

Tim Sparapani was so uncomfortable with the direction Facebook was going, he left before the company's work with data brokers took effect.

The extent of Facebook's data collection was largely a secret, until a law student in Austria had a chance encounter with a company lawyer.

I kind of wanted a semester off, so I actually went to California, to Santa Clara University in the Silicon Valley. Someone from Facebook was a guest speaker, explaining to us basically how they deal with European privacy law, and the general understanding was: you can do whatever you want to do in Europe, because they do have data protection laws, but they don't really enforce them at all. So I sent an email to Facebook saying I want to have a copy of all my data. So I got from Facebook about 1,200
pages, and I read through it. In my personal file, I think the most sensitive information was my messages. For example, a friend of mine was in the closed unit of a psychiatric hospital in Indiana. I deleted all these messages, but all of them came back up. And you have messages about, you know, love life and sexuality, and all of that is kept.

Facebook tries to give you the impression that you share this only with friends. The reality is, Facebook is always looking. There is a data category called last location, where they store where they think you've been the last time. If you tag people in pictures, there's GPS location, so by that they know which person has been at what place at what time. Back on the servers there's, like, a treasure trove, just, like, ten times as big as anything we ever see on the screen.

As Facebook was ramping up its data collection business ahead of the IPO, Schrems filed 22 complaints with the Data Protection Commission in Ireland, where Facebook has its international headquarters.

And it had 20 people at the time, above a little supermarket in a small town. It's called Portarlington: 5,000 people, in the middle of nowhere. And they were meant to regulate Google or Facebook or LinkedIn and all of them.

Schrems claimed Facebook was violating European privacy law in the way it was collecting personal data and not telling users what it was doing with it.

And after we filed these complaints, that was when actually Facebook reached out, basically saying, you know, let's sit down and have a coffee and talk about all of this. So we actually had kind of a notable meeting. That was in 2012, at the airport in Vienna. But the interesting thing is that for most of these points, they simply didn't have an answer. You totally
saw that their pants were down. However, at a certain point, I just got a text message from the data protection authority saying they're not available to speak to me anymore. That was how this procedure basically ended. Facebook knew that the system plays in their favor, so even if you violate the law, the reality is it's very likely not gonna be enforced.

Facebook disputed Schrems' claims and said it takes European privacy laws seriously. It agreed to make its policies clearer and stop storing some kinds of user data.

In Silicon Valley, those who covered the tech industry had also been confronting Facebook about how it was handling users' personal data.

Privacy was my number one concern back then. So when we were thinking about talking to Mark, the platform was an issue. There were a bunch of privacy violations, and that was what we wanted to talk to him about.

Is there a level of privacy that just has to apply to everyone? Or, I mean, you might have a view of: this is what privacy means to Mark Zuckerberg, so this is what it's gonna mean at Facebook. Yeah, I mean, people can control this. Control has always been one of the important parts of using Facebook.

Kara Swisher has covered Zuckerberg since the beginning. She interviewed him after the company had changed its default privacy settings.

Do you feel like it's a backlash, or do you feel like you're violating people's privacy? And we started to ask questions. He became increasingly uncomfortable.

And, you know, I think the issue is, you became the head of the biggest social networking company on the planet. You know, so I started this when I was... you know, started working on this type of stuff when I was 18. So he started
To sweat quite a lot and then a lot a lot and then a real lot so the kind of this kind of thing were you know like broadcast news where I was dripping down like or Tom Cruise and that Mission Impossible he, was just he was going to his chin and dripping off. From. Building, this project in a dorm room and it. Wasn't stopping, and I was noticing that one of the people from Facebook was like oh my god and was we, were I was trying to figure out what to do yeah. I mean. You. Know a lot of stuff happened along the way I think um you. Know. There were real learning points, and turning points along the way in terms of, in. Terms of building. Things. He. Was in such distress and I know it sounds awful but I felt like his mother like oh my god this poor guy is gonna faint, I thought, he was gonna faint I did I take off the hoodie. Well. Different, people think different things he's, told us he had the flu I, felt. Like he, had had a panic attack is what happened. Is. A warm hoodie yeah no it's it's the hoodie we it's um it's a company hoodie we print our mission on the inside what oh really, the inside of the hoodie everybody, take. Off what, is it making. The making, the world more open and connected oh, my god like a Oh from. That interview, and from others I mean, how, would you have characterized. Marc's, view of privacy, well. I you. Know I don't know if you thought about that it's kind of interesting because they're very they're, very loose, on it they, have a viewpoint, that this helps, you. As the user to get more information and they will deliver up more sir that's the whole length of Silicon Valley by the way if you only give us everything, we will give you free stuff there. Is a trade being made between the user and Facebook, the, question is are they protecting, that that data. Facebook. Had been free to set its own privacy, standards, because, in the u.s. there are no overarching, privacy. Laws that apply to this kind of data collection. But. In 2010. 
authorities at the Federal Trade Commission became concerned.

In most other parts of the world, privacy is a right. The United States? Not exactly.

At the FTC, David Vladeck was investigating whether Facebook had been deceiving its users. What he found was that Facebook had been sharing users' personal data with so-called third-party developers, companies that built games and apps for the platform.

Our view was that, you know, it's fine for Facebook to collect this data. But sharing this data with third parties without consent was a no-no.

But at Facebook: of course, we believe that our users have complete control of their information.

The heart of our cases against companies like Facebook was deceptive conduct. That is, they did not make it clear to consumers the extent to which their personal data would be shared with third parties.

The FTC had another worry. They saw the potential for data to be misused, because Facebook wasn't keeping track of what the third parties were doing with it.

They had, in my view, no
real control over the third-party app developers that had access to the site. They could have been anyone. There was no due diligence. Anyone, essentially, who could develop a third-party app could get access to the site.

It could have been somebody working for a foreign adversary?

Certainly. It could have been somebody working, yes, for, you know, for the Russian government.

Facebook settled with the FTC without admitting guilt and, under a consent order, agreed to fix the problems.

Was there an expectation, at the time of the consent order, that they would staff up to ensure that their users' data was not leaking out all over the place?

Yes. That was the point of the provision of the consent order that required them to identify risks to personal privacy and to plug those gaps quickly.

Inside Facebook, however, with the IPO on the horizon, they were also under pressure to keep monetizing all that personal information, not just fix the FTC's privacy issues.

Nine months into my first job in tech, I ended up in an interesting situation. Because I had been the main person who was working on privacy issues with respect to Facebook Platform, which had many, many, many privacy issues... it was a real hornet's nest... I ended up in a meeting with a bunch of the most senior executives at the company. And they went around the room, and they basically said, well, who's in charge? And the answer was me, because no one else really knew anything about it. You'd think that a company of the size and importance of Facebook, you know, would have really focused and had a team of people, and, you know, very senior people working on these issues. But it ended up being me.

What did you think about that at the time?

I was horrified. I didn't think I was qualified.

Parakilas tried to examine all the ways that the data Facebook was sharing with third-party developers could be misused.

My concerns at the time were
that I knew that there were all these malicious actors who would do a wide range of bad things, given the opportunity, given the ability to target people based on this information that Facebook had. So I started thinking through what the worst-case scenarios were of what people could do with this data, and I showed some of the kinds of bad actors that might try to attack, and I shared that with a number of senior executives. And the response was muted, I would say. I got the sense that this just wasn't their priority. They weren't that concerned about the vulnerabilities that the company was creating. They were concerned about revenue growth and user growth.

And that was expressed to you, or is that something that you just gleaned from the interactions, from the lack of a response?

I would gather that, yeah.

And how senior were the senior executives?

Very senior. Like, among the top five executives of the company.

Facebook has said it took the FTC order seriously and, despite Parakilas's account, had large teams of people working to improve users' privacy. But to Parakilas and others inside Facebook, it was clear the business model continued to drive the mission. In 2012, Parakilas left the company, frustrated.

I think there was a certain arrogance there that led to a lot of bad long-term decision-making. The long-term ramifications of those decisions were not well thought through at all, and it's got us to where we are right now.

You're a visionary, you're a founder, you're a leader. Mark, please come to the podium.

In May of 2012, the company finally went public. The world's largest social network managed to raise more than 18 billion dollars, making it the largest technology IPO in U.S. history. People literally lined up in Times Square around the Nasdaq board.

We'll ring this bell, and we'll get back to work.

With founder Mark Zuckerberg ringing
the Nasdaq opening bell remotely from Facebook headquarters in Menlo Park, California, Mark Zuckerberg was now worth an estimated 15 billion dollars. Facebook would go on to acquire Instagram and WhatsApp on its way to becoming one of the most valuable companies in the world.

This is a milestone in our history. But here's the thing: our mission isn't to be a public company. Our mission is to make the world more open and connected.

At Facebook, the business model, built on getting more and more of users' personal data, was seen as a success. But across the country, researchers working for the Department of Defense were seeing something else.

The concern was that social media could be used for really nefarious purposes. The opportunities for disinformation, for deception, for everything else are enormous. Bad guys, or anybody, could use this for any kind of purpose in a way that wasn't possible before. That's the concern.

And what did you see as the potential threat of people giving up their data?

That they're opening themselves up to being targets for manipulation. I can manipulate you to buy something. I can manipulate you to vote for somebody. It's like painting a big target on your front, on your chest, and on your back, and saying, here I am, come and manipulate me. I'm giving you everything you need. Have at it. That's the threat.

Waltzman says Facebook wouldn't provide data to help his research, but from 2012 to 2015, he and his colleagues published more than 200 academic papers and reports about threats they were seeing from social media.

What I saw over the years of the program was that the medium enables you to really take disinformation and turn it into a serious weapon.

Was your research revealing a potential threat to national security?
So if you... when you looked at how it actually worked, you see what the opportunities are for manipulation. Mass manipulation.

And is there an assumption there that people are easily misled?

Yes. Yes, people are easily misled, if you do it the right way. For example, when you see people forming into communities, okay, what's called filter bubbles now, I'm going to exploit that to craft my message so that it resonates most exactly with that community. And I'll do that for every single community.

It would be pretty easy, then, to set up a fake account, or a large number of fake accounts, embedded in different communities, and use them to disseminate propaganda?

At an enormous scale, yes. Well, that's why it's a serious weapon, because it's an enormous scale. It's the scale that makes it a weapon.

In fact, Waltzman's fears were already playing out, at a secret propaganda factory in St. Petersburg, Russia, called the Internet Research Agency. Hundreds of Russian operatives were using social media to fight the anti-Russian government in neighboring Ukraine. Vitaly Bespalov says he was one of them.

Can you explain what the Internet Research Agency is?

It's a company that creates a fake perception of Russia. They use things like illustrations, pictures, anything that would influence people's minds. When I worked there, I didn't hear anyone say the government runs us, or the Kremlin runs us, but everyone there knew, and everyone realized.

Was the main intention to make the Ukrainian government look bad?

Yeah, yeah, that's what it was. This was the intention with Ukraine: put President Poroshenko in a bad light, and the rest of the government, and the military. You come to work, there's a pile of SIM cards, many, many SIM cards, and an old mobile phone. You need an account to register for various social media sites. You pick a photo of a random person, choose a random last name, and start posting links to news in different groups.
The Russian propaganda had its intended effect, helping to sow distrust and fear of the Ukrainian government.

Russian propaganda was massive on social media. It was massive. There were so many stories that started emerging on Facebook: cruel, cruel Ukrainian nationalists killing people or torturing them because they speak Russian. They scared people: you see, they're going to attack, they're going to burn your villages, you should worry. Fake, staged news: a child crucified by Ukrainian soldiers, which is total nonsense. It got proven that those people were actually hired actors. Complete nonsense, but it spreads on Facebook. So Facebook was weaponized.

Just as in the Arab Spring, Facebook was being used to inflame divisions, but now by groups working on behalf of a foreign power, using Facebook's tools, built to help advertisers boost their content.

By that time on Facebook, you could pay money to promote these
stories, so your stories emerge on the top lines. And suddenly you start to believe in this, and you get an immediate response. You can test all kinds of nonsense and understand which nonsense people do not believe, and which nonsense people start believing, which will influence the behavior of a person receptive to propaganda, and then provoke that person into action. They decided to undermine Ukraine from the inside rather than from outside.

I mean, basically, think about this: Russia hacked us.

Dmytro Shymkiv, a top adviser to Ukraine's president, met with Facebook representatives and says he asked them to intervene.

The response that Facebook gave us: sorry, we are an open platform; anybody can do anything within our policy, which is written on the website. And when I said, but these are fake accounts, you could verify that... well, we'll think about this, but you know, we have freedom of speech, and we are very proud to be a platform where everybody can say anything.

In the meeting, do you think you made it explicitly clear that Russia was using Facebook to meddle in Ukraine's politics?

I was explicitly saying that there is a troll factory, that there are posts and news that are fake, that are lying, and they are promoted on your platform, very often by fake accounts. Have a look. At least send somebody to investigate.

And no one, sorry, no one was sent?

No. For them, at that time, it was not an issue.

Facebook told Frontline that Shymkiv didn't raise the issue of misinformation in their meeting, and that their conversations had nothing to do with what would happen in the United States two years later.

It was known to Facebook in 2014 that there was potential for Russian disinformation campaigns on Facebook?

Yes. And there were disinformation campaigns from a number of different countries on Facebook. You know, disinformation campaigns
were a regular facet of Facebook abroad. And that's... I mean, yeah, technically that should have led to a learning experience. I just don't know.

There was plenty that was known about the potential downsides of social media and Facebook, you know, the potential for disinformation, the potential for bad actors and abuse. Were these things that you just weren't paying attention to, or were these things that were kind of conscious choices, to kind of say, all right, we're going to kind of abdicate responsibility for those things and just keep growing?

I definitely think we've been paying attention to the things that we know. And one of the biggest challenges here is that this is really an evolving set of threats and risks. We had a big effort around scams. We had a big effort around bullying and harassment. We had a big effort around nudity and porn on Facebook. It's always ongoing. And so some of these threats and problems are new, and I think we're grappling with that as a company, with other companies in the space, with governments, with other organizations. And so I wouldn't say that everything is new. It's just different problems.

At Facebook headquarters in Menlo Park, they would stick to the mission and the business model, despite a gathering storm. By 2016, Russia was continuing to use social media as a weapon, and division and polarization were running through the presidential campaign. Mark Zuckerberg saw threats to his vision of an open and connected world.

As I look around, I'm starting to see people and nations turning inward, against this idea of a connected world and a global community. I hear fearful voices calling for building walls and distancing people they label as others, for
blocking free expression, for slowing immigration, reducing trade, and in some cases around the world, even cutting access to the Internet.

But he continued to view his invention not as part of the problem, but as the solution.

And that's why I think the work that we're all doing together is more important now than it's ever been before.

Tomorrow night, Frontline's investigation continues. There is absolutely no company who has had so much influence on the information that Americans consume. He's the man who connected the world, but at what cost? Polarization was the key to the model. The global threat: this is an information ecosystem that just turns democracy upside down. The 2016 election: Facebook getting over a billion political... And the company denials: the idea that fake news on Facebook influenced the election in any way, I think, is a pretty crazy idea. And I'm responsible for what happens here. Is Facebook ready for the midterm election? There are a lot of questions heading into this midterm election. I still have questions if we're going to make sure that in 2018 and 2020 this doesn't happen again. Part two of The Facebook Dilemma, tomorrow night on Frontline.

Go to pbs.org/frontline to read more about Facebook from our partner, Washington Post reporter Dana Priest: for Facebook, the dilemma is, can they solve these serious problems without completely revamping their business model? Then watch a video explainer about what Facebook knows about you, and how, even though you never signed up for it, Facebook now has data about you and stores it as a shadow profile. Connect to the Frontline community at pbs.org/frontline.

For more on this and other Frontline programs, visit our website at pbs.org/frontline.

To order Frontline's The Facebook Dilemma on DVD, visit shoppbs.org or call 1-800-PLAY-PBS. This program is also available on Amazon Prime Video.
2018-11-03