The Future of Computational Journalism
Good evening, everyone. We were a little worried that we wouldn't have a good turnout because of the rain, but thank you all for braving that rough weather out there. I told my students today, who were all complaining, that at any other college they'd be dealing with this every day, but at Stanford we panic when it rains. I think we made it through the day, though, and thank you again for coming. I've probably been introduced to some of you before, because we have a series of events for alumni, so thanks to the Stanford Alumni Association for co-hosting this event. It's also sponsored by the School of Engineering and the School of Humanities and Sciences. In addition to the alumni we have here, graduate and undergraduate, we have members of the community, and we are also live-streaming to thousands of alumni and friends all over the world. This is the second in our series called Intersections, and the point of this series is to bring together faculty from the School of Engineering and the School of Humanities and Sciences to share insights on a common theme. Tonight's theme is journalism, and particularly how journalism is being influenced by big data and new computational tools. At Stanford in general, we like to recognize and emphasize that when we work on the world's thorniest problems, we don't do it as scholars in isolation; we do it by working together across disciplines, and today's panelists are a perfect example of that philosophy. So let me go ahead and introduce the panelists. We have Manish Agrawala. Manish is also an alum of Stanford, so you'll fit in with the group. He got an undergraduate degree in math in 1994, and he cites his most memorable course as an undergraduate math major as Introduction to Computer Graphics. That set him on the course to pursue a PhD in computer science, which he did here at Stanford; he received that degree in 2002. Then
he went across the bay to Berkeley, where he served on the faculty in the EECS department for nearly a decade. One of the interesting things he did during that time was spend a sabbatical in New York City, where, instead of just visiting another computer science department, he split his time between the graphics department at The New York Times and the public radio program Studio 360, with the goal of learning about audio and visual storytelling. In 2015 we were really delighted to recruit him back to Stanford to join the computer science faculty, where he is the Forest Baskett Professor of Computer Science. He's also the director of the Brown Institute for Media Innovation, which is a collaboration between Stanford Engineering and Columbia's journalism school, with the mission of developing new technologies and techniques for telling stories through the media. So I'm sure you'll enjoy hearing from Manish. We also have Jay Hamilton. Jay is the Hearst Professor of Communication, a senior fellow at the Stanford Institute for Economic Policy Research, and the director of the Stanford Journalism Program. His bachelor's and PhD were from Harvard. Before Stanford, Jay was faculty at Duke's School of Public Policy, where he directed the DeWitt Wallace Center for Media and Democracy. In 2015, the same year we recruited Manish, Jay launched Stanford's Computational Journalism Lab. Some of the questions he works on in his lab are, for example: how do we sustain the accountability function of journalism, and how can journalists use computational methods to benefit society? Both are extremely important topics in this era. Jay and Manish are not strangers to each other; in fact, they co-teach a course called Exploring Computational
Journalism, offered through both the computer science and communication departments at Stanford. Finally, we have our moderator, Janine Zacharia. She is the Carlos Kelly McClatchy Visiting Lecturer in Stanford's Department of Communication. From 2005 to 2009 she worked as chief diplomatic correspondent for Bloomberg News, based in Washington, DC; she traveled to more than 40 countries with then US Secretary of State Condoleezza Rice and with other senior administration and military officials. From 2009 to 2011 she was the Jerusalem bureau chief and Middle East correspondent for The Washington Post. She has reported widely throughout the Middle East, including Israel, the West Bank and Gaza Strip, Egypt, Jordan, Lebanon, Iraq, Bahrain, Saudi Arabia, the UAE, and Turkey. Did I miss anything? Not off the bat. She reported on the uprisings in Egypt and Bahrain as they began in early 2011, so that must have been quite exciting. She appears regularly on cable news shows and radio programs as a Middle East analyst. So please join me in welcoming our guests for what's sure to be a great discussion.

Thank you, Dean Widom. Can everybody hear me okay? In the back? Greetings to all of you here, engineering school alumni and people in the community who braved the atmospheric river to join us, and to those of you watching on the livestream. It's a pleasure to be your moderator this evening for what I think is an extremely timely discussion of the intersections between journalism and computation. A quick housekeeping note before we get into the discussion: you all should have received some index cards; if you didn't, they'll be coming around. I'm going to briefly try to frame the discussion and facilitate a chat between Manish and Jay for about 40 minutes, and then around 7:50 I'll start integrating those questions; someone will come around and collect your cards. So, looking
forward to weaving those into the discussion. What a wonderful moment to be exploring this topic. As most of you have probably noticed, the news industry over the past decade or so has been experiencing quite a period of convulsion, triggered in no small part by the way the Internet has transformed how people get their news, as well as which and what kinds of news they get, how much they're willing to pay for it, and so on. On the local level, news outlets have shuttered, leaving deficits in coverage and accountability journalism. National coverage remains,
but I'd say it's inadequate, and major news outlets, as you're well aware, struggle with budgets, paywalls, subscriptions, and the like, trying to raise enough revenue via digital advertising and other means to sustain a robust reporting staff. International reporting, which as you heard was my longtime former identity, has contracted. Take the Los Angeles Times as an example: in 1991 they had 28 full-time staff foreign correspondents. Those numbers held up pretty well through 2004. Today there are six, at least that was the number as of May. And investigative journalism, the most expensive and perhaps most consequential of all these subgroups, you could argue, may be in great peril. As Jay Hamilton writes in his award-winning book on the economics of investigative journalism, Democracy's Detectives (he's too modest, so I had to bring it to show you), which I commend to you all, especially so you can meet the most colorful North Carolina Pulitzer Prize winner, Pat Stith, in Chapter Seven, monitoring and analyzing and investigating can entail substantial costs, and there is a likelihood of accountability stories going untold. So we need some solutions for these problems, to keep the cost of public accountability journalism low, so this element of our democracy can continue. Because, put simply, without journalists trained in credible, fact-based news reporting and data mining to hold public officials accountable, we're in deep trouble. So to that end, we're going to discuss tonight the collaborations we see across the School of Engineering and the Department of Communication and throughout Stanford on developing some cutting-edge solutions to these problems. So it won't be all doom and gloom. Let's talk about this. Jay, let's start with you. When I used
to work at Bloomberg and The Washington Post, there was always the CAR reporter, the computer-assisted reporter, and all of us print journalists would freak out anytime we had to do anything in Excel, and probably a lot of journalists still are freaking out about Excel. But that was a long time ago, and now new
tools are being developed to simplify the data gathering and analysis process, and you've founded this Computational Journalism Lab. What is computational journalism?

Computational journalism is a really clunky term, and I think it's going to be defined, like data science, by the set of tools that journalists come to use. Right now I think it's using computation to change how journalists discover or tell or distribute or monetize stories, and in the realm of public affairs reporting, I tend to think of it as reporting by, through, and about algorithms. Stories by algorithm: if you think about The Associated Press, each time quarterly earnings reports come out, they write about four thousand stories by algorithm. In the old days, when humans wrote, they could only cover 300 companies; now, with those four thousand companies, they've expanded the set of people they can focus on, and that has actually affected the trading volume in these small companies, because they now get a story about them. Stories through algorithm: that's sort of like electronic tip mining, finding the basis of a story. This year was the first year the Pulitzer Prizes had a machine-learning story. The Atlanta Journal-Constitution wanted to write about doctors engaged in sexual abuse. They scraped all 50 states' medical societies and regulators, and they found a hundred thousand discipline cases. They couldn't read a hundred thousand cases, so they wrote a machine-learning algorithm that was able to estimate the probability that a case involved sexual abuse. That took the number they had to read from a hundred thousand down to six thousand, and then they were able to write that story about sexual abuse nationwide. So you have stories by, through, and then about algorithms; that last one is holding algorithms themselves accountable.
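The triage approach Jay describes can be sketched in miniature. This is purely illustrative Python, not the Journal-Constitution's actual pipeline: the keyword weights here stand in for the features a real trained classifier would learn from labeled cases, and the case texts are invented.

```python
# Illustrative sketch of machine-learning triage: score each disciplinary
# case by how likely it is to involve the topic of interest, and send only
# the high-scoring cases to human reviewers. These hand-picked keyword
# weights stand in for a trained classifier's learned features.

KEYWORD_WEIGHTS = {  # hypothetical feature weights
    "inappropriate": 2.0,
    "patient complaint": 1.5,
    "boundary": 1.0,
    "billing": -2.0,    # negative weight: signals an unrelated case type
    "paperwork": -1.5,
}

def score(case_text: str) -> float:
    """Crude linear score standing in for a classifier's decision function."""
    text = case_text.lower()
    return sum(w for kw, w in KEYWORD_WEIGHTS.items() if kw in text)

def triage(cases: list[str], threshold: float = 1.0) -> list[str]:
    """Return only the cases likely enough to merit human reading."""
    return [c for c in cases if score(c) >= threshold]

cases = [
    "Patient complaint alleging inappropriate conduct during an exam.",
    "License suspended over billing paperwork irregularities.",
    "Boundary violation reported; patient complaint filed.",
]
flagged = triage(cases)
# Humans now read the flagged subset rather than every case on file.
```

The point of the technique is the funnel, not the model: even a rough classifier that cuts 100,000 documents down to a few thousand turns an impossible reading task into a feasible one.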
If you think about ProPublica, they have a series called Machine Bias; that was another Pulitzer finalist this year. What they were able to do was look at The Princeton Review and the price discrimination The Princeton Review engages in. They simulated being from zip codes around the country, then looked at the prices that were quoted, and they found that, controlling
for things like income, the higher the Asian percentage in a community, the higher the price that students taking the SAT prep course were charged. That's not something a company is going to advertise, but reporters trying to reverse-engineer algorithms could tell you what's going on. So to me, that's what computational journalism is today.

And as a media economist, how did you fall into this? How did it become your mission?

It really is a mission. The reason I came to Stanford was that, if you look across the university, there are lots of people who are trying to say: I want to take unstructured data, turn it into structured information, and tell you about a pattern. That happens in political science, where people are trying to use big data to analyze politics. Some of you are probably familiar with the Paradise Papers, the investigation that came out from the International Consortium of Investigative Journalists. They used software that was developed in digital humanities work at Stanford. There was a project about five years ago called Mapping the Republic of Letters, about who wrote to whom in the 1700s, and that software was open-sourced; it was taken up and used by journalists, first in the Panama Papers. The data visualization software that used to show who was writing to whom became a tool for showing which offshore entities are associated with oligarchs from particular countries. So that's why I came to Stanford: there were so many people trying to think about how you can use data to understand and hold institutions accountable.

That really leads us, Manish, to the intersections between computation, CS, and communication. How do you see this relationship defining computational journalism's prospects for solving some of these problems?

Yeah, so, you know, I think Jay's definition is really great. It's about how stories are told by, through, and about algorithms.
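The price audit Jay described earlier follows a simple, reproducible pattern: request the same quote from many zip codes, then compare prices while holding confounders such as income fixed. Here is a minimal sketch in Python; the data are entirely invented, and coarse brackets stand in for the statistical controls the real analysis used.

```python
# Hypothetical sketch of an algorithm audit: collect quoted prices by zip
# code, then average them within (income, demographic) cells so that price
# differences inside an income bracket cannot be explained by income alone.

from collections import defaultdict
from statistics import mean

# (zip_code, income_bracket, demographic_share_bracket, quoted_price) - mock data
observations = [
    ("94301", "high", "high", 8400),
    ("94025", "high", "low", 7200),
    ("95014", "high", "high", 8400),
    ("94061", "high", "low", 7200),
    ("93701", "low", "low", 6600),
    ("91754", "low", "high", 7800),
]

def mean_price_by_group(rows):
    """Average quoted price per (income, demographic) cell."""
    cells = defaultdict(list)
    for _zip, income, share, price in rows:
        cells[(income, share)].append(price)
    return {cell: mean(prices) for cell, prices in cells.items()}

table = mean_price_by_group(observations)
# Within each income bracket, compare the "high" vs "low" demographic cells:
gap_high_income = table[("high", "high")] - table[("high", "low")]
gap_low_income = table[("low", "high")] - table[("low", "low")]
```

A persistent gap within every income bracket is the kind of pattern a reporter would then take to the company for comment; the real investigation used regression rather than brackets, but the logic is the same.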
And I can give a few more examples of those categories. Certainly we've seen many, many examples of using computer-assisted reporting to gather information and to analyze it, and that then goes into stories. When I think of telling stories through algorithms, I think a lot about the algorithms that have been designed to synthesize information and write the stories for you. There are companies now that have algorithmic approaches to writing stories that are published in newspapers and by The Associated Press. And there are other companies designing algorithms that will take media, audio and video, and put it together to tell stories in a much more visual way; you've probably seen these in your Facebook feeds and on other sites on the web. So that's synthesis and content creation through algorithms. Certainly holding algorithms accountable is a very important part of what journalism does. But when I think of computational journalism, I also think about a few other things. A lot of the ways in which journalism and stories are distributed is through the internet, and there are algorithms at play that help determine which stories you will see, for example, in your Facebook feed; on Google, the top stories are ranked by an algorithm. So distribution is another place where computation plays a big role. And then finally, funding: the business models behind journalism are also changing very significantly, and we've seen a big upheaval in the industry because of these changes.

What do you mean by holding the algorithms accountable?

So one of the questions that we face as a society is understanding some of the algorithms that are delivering information to us. We don't know how these algorithms are making the decisions that they make, and there are groups
of journalists and groups of computer scientists that are trying to reverse-engineer these algorithms, to understand what it is they're doing under the hood.

And you mean what the social media companies are doing?

Exactly, exactly.

Okay, so I want to come back to that, but first I want to get into this issue that I know you've worked on, which relates a little bit to online deception, which is also an area where I think there's a lot of collaboration going on, especially between the Department of Communication and Engineering.
It's weird to quote myself, but I wrote something a couple of weeks ago that kind of summed this up a little bit, the problem that keeps me up at night, and maybe some of you as well. I wrote this in a piece for the San Francisco Chronicle: a fragmented media landscape, populated by news outlets and imposter outlets that abide by different journalistic standards, has transformed what was once a basic task, reading the daily news, into a major challenge. In an era of unprecedented access to information, we're experiencing an unprecedented era of noise. Today I don't only have to encourage my Stanford students, many of whom are here tonight, to read the news; I need to teach them how to do so. And this sort of gets into some of the tools we were talking about. One of the things I pointed out in the piece was that it's not only print we have to worry about with fake news (and we're not going to spend the whole night on fake news), but new lip-syncing technology lets researchers put words in Barack Obama's mouth, doctored photos are flooding the internet, and human vision is struggling to keep up. So this is an area where we have a lot of expertise at Stanford, and I know, Manish, you've worked a little bit on this, right, in terms of the images? Can you talk a little bit about that?

Yeah. So a lot of my research is on developing new, better, simpler techniques for manipulating and editing audio, video, and photographs. And these techniques have many, many useful purposes. There are lots of instances in which you take a photograph of your family and you want to touch it up a little bit, so that a person becomes a little sharper, a little clearer; you might want to blur the background to put more emphasis on the foreground, and so on. So there are a number of tasks for which these editing tools are really useful. But
at the same time, these tools can be used for, well, for producing things that one would call fake news. In my lab, we've developed tools for editing video of talking heads. We'll have an interview of someone speaking to a camera, and we have tools that will let you get a text transcript of what the person is saying, time-align that with the video, and then do cut and copy on the text and propagate it back to the video in a way that's really seamless; you can't tell that the edit has been made. A place where this is useful is, for example, what I would say is touching up the video: if there's an "uh" or an "um," or someone
stutters or misspeaks a little bit, this is a place where it might be useful, in certain situations, to cut out those mistakes. But this can also be used for not-so-good purposes, and this is something that I think we all need to be really cognizant of: one, that there are technologies that allow you to edit video and produce something that didn't actually happen; but two, that we as journalists and authors, when we are making these edits, need to maintain a certain level of ethical responsibility in putting out what is a truthful video.

Jay, do you want to talk a little bit about either using algorithms to help surface what we're talking about, really high-quality, credible information, or maybe some of the other projects that are happening, like the class I know you're co-teaching with Dawn Garcia as well as Krishna Bharat, the founder of Google News, who is also one of your co-instructors. Is he here or not?

Yeah. When I look at Manish's tools, I think of them as democratizing storytelling, allowing people to tell a really engaging story, which raises the probability somebody will watch it. One of the things we're working on in our lab is story discovery, trying to actually find the story, and a favorite example of that is the Stanford Open Policing Project. Cheryl Phillips, who's a lecturer in journalism, two years ago had her students do Freedom of Information Act requests in all 50 states, asking for electronic versions of state police stop data. Two years later, we have a hundred and thirty million records from 31 states, and she combined with Sharad Goel in the engineering department to basically develop a set of algorithms that try to tell you what rule of thumb a police officer uses when he pulls you over, when he decides to search your car, and
what the outcome is. And I think it's a great example, because the data has all been made public; on a website they've allowed local reporters to download the information and look at how their state police are operating, and today that data has been used by The Economist, by NBC, by National Geographic, by the Marshall Project, and by Trevor Noah. Now, Trevor Noah didn't do the math: The Daily Show called Cheryl up and said, we'd like to look at broken taillights and how they may be used for disparate impact across racial demographics in the US, so she and Sharad ran the numbers and were able to generate that. They've also trained over a hundred journalists across the country, at journalism conventions, in how to use that data. So I think it's a good example of H&S professors and engineering professors at Stanford combining, because it has not only generated journalism: two PhD students in statistics have actually written new algorithms that try to estimate what rule of thumb police are using, and they have shown how, for African Americans and Hispanics in most states, police will pull you over with a much lower probability, or expectation, that they're actually going to find something.

So do you want to add to that at all, Manish, on this topic of that project or similar projects where they're collaborating in story discovery using data? I
think you're seeing it more and more all over the place. Computer science and the tools of algorithmic analysis, synthesis, and distribution are just being used everywhere. I think one of the better examples is this one with Sharad and Cheryl. And I think Reuters also came to Cheryl and said, we're going to look at these killings that are happening in the Philippines right now, and she had some students working on that as well.

I think one of the things we're talking about is the role of the university vis-a-vis journalism, vis-a-vis helping democracy. Jay, you're involved a little bit in the Stanford long-range planning proposals that are coming in on that front. Can you talk a little bit, both of you maybe, about what Stanford's role collectively is in all this?

Sure. If you go back to the Stanford charter, it says that we are supposed to help educate our students in part for citizenship. And if you look at Stanford's organization, we're a nonprofit, in part because we're supposed to generate public goods and positive spillovers for society. Right now, you've mentioned the economics of journalism: I think there is real tumult in the business model, and there are really five incentives that generate news. One is "I want to sell your attention to somebody": that's advertising. One is "pay me": that's subscription. One is "I want to change how you think about the world": that's nonprofit. One is "I want your vote": that's partisan. And one is "I just like to talk": that's expression, or social media. We're seeing a world where Google and Facebook do a tremendous job with targeted advertising, and that has shifted revenue away from news and put more pressure on the nonprofit and subscription motives. If you come back to Stanford, we can do the R&D for the industry, especially related to accountability reporting.
At its heart, public affairs journalism involves a market failure. We all have different information demands in our lives, as consumers, as workers, as audience members, and as voters, and the first three markets work pretty well, because if you don't seek out the information, you don't get the benefit. That fourth demand, the voter demand, is subject to rational ignorance: because my vote doesn't matter in a statistical sense, I often don't seek out the information to inform it. That sets up a gap between what we need to know as citizens and what we want to know as audience members. So I think Stanford, when it's looking toward the future, can really, through our students and through the type of research that Manish does and the type of research we're doing, help with that voter demand. Silicon Valley has been tremendous in our lives as workers or audience members or consumers, much less so on the voter side, and maybe a negative on the voter side, probably.
Manish, do you want to talk about that, in terms of the university's role in all this? Because you're also working with the Brown Institute, which partners Columbia and Stanford.

Yeah, you know, another major role of the university is to educate students. Computer science and journalism have only really gotten together in the last, I would say, 30 to 40 years, and there isn't a lot of curriculum at the university to try to bring these two disciplines together. One of the things we are really excited about, and I've been working with Jay and Dawn Garcia, the director of the JSK Fellows Program here, and Krishna Bharat, the founder of Google News, is thinking about ways of developing a curriculum around computational journalism, so that we can really develop students who are able to do that computer-assisted reporting and analysis of data, who can think about how to use computational tools to tell stories through algorithms, and who can also do the kind of investigative reporting that requires understanding how the algorithms work and reporting on their algorithmic biases or other issues in algorithms that are out there.

Yeah, and Janine, you asked about computational journalism. Another way to think of it is that it's like a Reese's Cup: it's data journalism and storytelling together. Among the students we have in our master's program, sometimes we have people with a quantitative background, so we have two folks who were CS undergrad majors and are really focused on learning how to tell a story, and we also have students who have always been storytellers, have been writers, but who see the advantages of using data to find a story that nobody else can see. So I think it's really working across both those types of skills that will get you the future of journalism.

So, algorithms have been in the news in the last couple of weeks, in
particular, what you referenced earlier, Manish, about the role that algorithms play in surfacing the news we see on social media. There was a set of hearings on Capitol Hill on November 1st, I don't know how many people saw them or paid attention, where congressmen and senators really grilled the general counsels from Facebook, Google, and Twitter about these algorithms and the role they played, unwittingly perhaps, in helping Russia spread disinformation and propaganda. One of the questions that came up was from Senator Kennedy, a Republican from Louisiana. He asked these companies: are you a media company or a neutral technology platform? And they answered "neutral technology platform" in each segment. So I wondered if you could, and I don't want to be too controversial, but spice it up a little: how do you see these companies? They are a place where, I think I read, something like 60% of us are getting at least some news, so they're becoming a larger portion of our news diet. What are they in this realm, and the
role that social scientists and engineers could play in finding technological solutions to this problem?

Yeah. So to me, the role these companies are playing, the Googles and the Facebooks, is that they're the ones really distributing the information to the eyeballs of the viewers and the audiences, and in that role they hold a lot of power. Their power comes from deciding which stories they're going to surface at the top of your news feed, at the top of the ranking when they give you news stories, for example. And one of the ways in which I think there can be some issues is that these companies don't reveal the algorithms they're using to do this ranking, or to show you what's in the news feed. Now, there are many good reasons for them not to share their algorithms in great detail, because as soon as they reveal their algorithms, there is the issue of gaming the algorithms, of spammers trying to get their fake news sites up to the top of the list. At the same time, there's so little transparency that we don't actually know in much detail what they're doing, and so there's no way to really audit those algorithms and figure out how they're making the decisions that they're making. We've put a lot of trust in these companies, and it would, I think, be helpful to get a little bit more transparency, so that we can understand better how it is they're making the decisions that they make algorithmically.

If a company generates content to wrap around advertising, that to me is a media company, and so to me Google and Facebook are both media companies: they're making decisions about how to engage you, about what's the priority. Two things to note: they're both dual-stock-structure companies. Historically, if you go back to the 1980s, there were two industries where individual
Or family ownership were predominant, sports. Franchises, and media outlets because in both of them the owners took psychic, income for, being part of their community, and contributing, if you look at say the souls, burger family in New York or the Grahams and wash in Washington.
They may have even provided more than an optimal or profit-maximizing amount of public affairs coverage because of that notion of civic duty. Fast-forward to today: if you read the annual report of Alphabet, it notes that voting control, about 58% of the voting shares, is held by three people; for Facebook it's one person. And both of their annual reports say they may take actions against the interests of their shareholders. What that means, though, is that they have the freedom to incorporate democracy or participation as a goal if they chose, because they've told people ahead of time that they may not maximize profits. When I look at Facebook, I think you could think of it pre-2016 as saying: we're going to maximize revenue, and we are going to redistribute through biomedical research and other things like that. To me as an economist, they were saying: we aren't going to think about our positive externalities; we're going to earn a lot of money and give it in philanthropy. But after 2016, I think the question became not "are you leaving some positive externalities or spillovers on the table" but "are you actually generating negative externalities by the way that you do this." And think about the scale of what they do: Google and Facebook each have R&D budgets that are bigger than the entire budget of Stanford University, if you pull out our Medical Center.

Together, or individually?

Individually.

So where are they? Is Twitter in there too?

Twitter has lost approximately 2.5 billion dollars in ten years.

Contrast that with Google, which had about 19 billion in profits last year.

Yes, so a different scale. I guess what I'm trying to say is that they could take into account having an impact on democracy. Actually, if you go to the 2010 midterm elections, Facebook did a fascinating study where, for 61
million people, in their news feed on Election Day, it showed you where your polling place was, and it showed you the pictures of up to six of your friends who had voted. When Facebook was still publishing social science research, what they told you was that they estimate they increased voter turnout by 400,000 people in 2010 through that experiment on 61 million people. Nobody talks about that study anymore, but if you look at 2016, the national election was decided by fewer than 120,000 votes. So I think they are in a realm where they should probably think about the impact that they're having.

So you're saying that Mark Zuckerberg has the ability to do it, he's cleared it with the shareholders; it's a question of whether he will do it, given the profit motive at the core of Facebook. And Manish, for you: is there an algorithmic, technical solution to some of these problems, the way there is, for example, for pornography and spam, which don't come up in my feed, whereas you could potentially see a lot of hoaxes and things like that?

Yeah, I think it's going to be a combination of technology and figuring out what the ethics are of these companies. They have a lot of power to decide what you see and what you don't see, and they are going to have to decide where they want to put their effort: is it profit maximization, or is it to really build a strong democracy? Where should they live?

And I think you have seen some changes post-election, for three reasons. Number one, advertisers: advertisers are experiencing some backlash for being associated with fake news or controversy, now that people are watching. Number two, employee morale: people within both those companies want to work in a way that contributes to society, and they were embarrassed by the performance on both counts. And number three, the owners are human beings, and they have actually started to see what the impact of fake
news and disinformation is.

Krishna Bharat, the person we're co-teaching with, along with Dawn Garcia from the Knight Fellowships: Krishna has a post on Medium that says, suppose you wanted to stop fake news from going viral. If you took a level like 10,000 shares, if you wanted to stop fake news at 10,000 shares, what you would want to do is look at things when they're at around 2,000, and if a story is suspect based on machine learning, show it to a human and have them do additional fact-checking, and then, based on what you see, you could slow it down. So Krishna, as the founder of Google News, I think has street cred, and I think that he has basically said it's a matter of will; it's not necessarily an engineering problem.

So then you're bringing a human element into this, which the companies seem a little allergic to, even though we know that there are lots of people working there who are creating this experience for us. So are you suggesting more of a sort of honest reckoning with this, that yes, we need humans, we are curating? Is that part of the solution?

I think humans are certainly part of the solution. I think the companies have all recognized this, and in fact they say it. The bigger question to me is how much transparency they will allow outside of the companies, so that third parties, we here at Stanford for instance, could understand a little bit of what they're doing and really, in a way, audit the algorithms and the techniques, and maybe the human-algorithm hybrid, that are used to make these decisions.

But I'd like to say: if you ask me what the biggest failure in journalism is, it wouldn't be distribution on the platforms; it would be the stories that go untold at the local level because of the collapse of the business model of local newspapers. So right now we are properly focused on what Russia did in the 2016 election, but if you look across the country, there are city councils that don't have a reporter covering them; there are school boards voting and making decisions, and nobody is watching. So I think that's something where computational journalism can really make an impact.

There's a saying: the future is here, but it's unevenly distributed. The best use of AI right now is in
business reporting. And actually we've seen this; you mentioned computer-assisted reporting. The business side of newspapers always gets the tech first, and then eventually it goes to the reporters. So right now at The Washington Post they've got amazing software that helps put a headline on something, that helps figure out where to push it on social media, that helps match it with your interests. But so far they haven't really used those tools to go inside, to discover the story, to find stories in different ways. So I think, if you have a strong interest in engineering and data: try to help us figure out the stories that go untold, especially at the local level. The smaller the radius of a story, the harder it is to generate the eyeballs to fund it. And that's where software, that's where engineering, that's even where economics comes in, because institutions break down in predictable ways, and we can use those signals. We could even look at Google searches. If you go back to Flint, Michigan: before any reporter wrote about water quality in Flint, there was an early spike in the search "why is my water brown." So there are people who are telling us things are wrong, but there's no reporter out there, not a live body out there.

So if we were trying to think about projects at Stanford that could help journalism, I'd say: telling the local journalism story, and telling the stories of low-income people. Because if you go back to what creates a story: low-income people aren't the advertising target for many companies, they vote at lower levels so they're actually less likely to be contacted, and they have a lower willingness to pay for information. So that generates news deserts across the country. I'm just showing you why economics is the dismal science; this is a little bit of a downer. But I think there is hope, because of the part of campus we're on, because of the tools that Manish is working on, because of the things that you all are probably capable
of doing, too, with your skills.

But Jay, say you're a community member and you're concerned about the news desert in East Palo Alto, or even in Palo Alto, though for them it's not really that bad; there's some coverage of Palo Alto, and we try to contribute through our own publishing side here, and our students work at the Peninsula Press. But there are these news deserts all around the country. So is there this, what would you call it, a market failure? I don't want to get too wonky about it, but it's the absence of local news reporting that is undermining, in some ways, democracy, is what you're saying?

It's funny: I worked as a consultant for the Federal Communications Commission on a study of the information needs of communities, and it was 400 pages; the words "market failure" were used only once, and that was in a footnote, because they forgot to take it out. They didn't want the notion, but they accepted the logic of rational ignorance; they accepted the logic that the market doesn't fully reward you for telling a story that changes laws. In my book on investigative reporting, I do case studies where I find that for $1 invested in some investigative reporting, you can generate several hundred dollars in net policy benefits when you change public policy. So we have suboptimal levels, because it's hard for the market to capture that. But if you can lower the cost of finding the story, you're going to get more of them. And so in our Exploring Computational Journalism class, one of the teams is looking at the question: if we looked at real estate data, what public policy story could you write, what anomalies could you spot? There's a great website called OpenSecrets that looks at campaign finance data. They have something called an anomaly tracker, and it looks at things like: does more than half of your elected official's money come from out of state, or is your elected official being lobbied by somebody who only cares about one bill in Congress? Because you can deduce that from lobbyist registration data. So at the national level, people are beginning to use anomaly tracking, and in the class that Manish and Krishna and Dawn and I are teaching, that's one of the things our students are taking on.

Manish, before we turn to the audience, can you talk a little bit about what's on your wish list right now in terms of projects in this area, or what you are most excited about, in the positive way?

Yeah. One of the things that we are excited about is building tools to aid in synthesizing stories. One of the areas that we work on in my lab is visualization: trying to take complicated datasets and turn them into visualizations that make the information a little easier to parse and digest and understand. And visualizations do this because they provide a lot of context
for the information that you're seeing. But designing really effective visuals for a given dataset is a challenging task for most people; even for people who deeply understand how to create good visualizations, it takes some time and effort to do. So the kinds of things that I'm really excited about are tools that will make it much, much faster and easier to produce high-quality visualizations that really tell the story, that highlight the important aspects and takeaways of the data that a reader, a viewer, should really focus on.

Another aspect of that is making these visualizations interactive, so that an interested and engaged audience member can go and work with the visualization and really learn more about the data, by filtering it, or by transforming the data and looking at it in different ways, to understand what the data is all about.

We're seeing lots and lots of efforts these days at making data more publicly available. There are websites where you can get government data, census data, all kinds of data, local data. And one of the problems with these sites is that, yes, the data is online, but it's very difficult to access. So what we need are tools that make it much more accessible to a wider range of the public, and to journalists for that matter, to really understand what's going on, to find the anomalies, and then build the stories around them.

So if you're an engineering alum, or a person concerned about journalism, in this intersection that we're talking about: is there a way, before we turn to questions, and we can start bringing those up, that you can get involved in all of this, in these solutions, here at Stanford? Jay?

Yes, agreed. A homework assignment? Yeah, yeah, exactly. Our students love to work on puzzles, they love to work with data, they love to have an impact on society. Any of them could be doing other things, but they are in our classes because they really care about democracy. Manish and I and Krishna and Dawn have been working on something that we call the Stanford Journalism and Democracy Initiative, and what we're trying to do is focus more people at Stanford, and in the broader Stanford community, on the challenges that are computational and that relate to data, that could be solved, and that would help journalism. So if you have ideas, please email me; I've held on to the same address across all the universities. And you can go to the Stanford Computational
Journalism Lab and sign up for our newsletter; you can go to the Brown Institute and sign up for their newsletter. We would love ideas, we would love data, and we actually partner with people in our classes on projects, too. So yeah, we're trying to crowdsource things here.

And let me just add: many of you, at least the audience here in this room, are in the Bay Area. Come visit us. We'd love to talk with you if you're interested in any of these issues around journalism, around computation and algorithms and their relationship to journalism.

And one more product placement: I have written a book called Democracy's Detectives. Because you came out tonight, if you send me an email, or come up to me afterwards and give me your card, I'll give you a free copy of my book. Of course, there is the opportunity cost of your time if you read it.

So, I used to run around the Middle East, as you heard, and now I'm fretting about how we're going to make sure that everybody gets a daily diet of credible, fact-based news, and more or less the same credible, fact-based news. To that end, I curled up with a study out of Yale the other day. Because researchers don't have the data from the companies, they have to try to, what is it called, reverse-engineer it. And it was kind of distressing, because one of these ideas that's out there is labeling: let's just label things true or false, verified or not verified, whatever language you want. And what happened in the study was an unintended effect: whatever wasn't labeled was assumed to be true. So if you label, that means you'd have to label everything on the entire internet for it to work. And then the other idea that's been floating around is: what if we just label more clearly what the news outlet is, The New York Times, Politico, and whatnot? It turns out that didn't matter at all for people.

And I've seen this a little bit with students I've talked to. We ran a little exercise, it wasn't a study, in a bigger lecture class that I went in and talked to; I asked the TA to ask the students what sources they considered the most reliable in news, and the list, it was a shameful list, was Apple News, Facebook, Twitter. In other words, some of us have conflated this notion of platforms and news. And so, to that end, there are these questions here from the audience, many versions of this question: are there algorithms or other tools to discern fake news from real news, and along those lines, what role could algorithms play in fact-checking? Manish, do you want to try that one?

Yeah. Let me start by talking about images and video. With images and video, one of the questions is: how much has a photograph been manipulated? Every photograph that you see has likely been manipulated a bit. Certainly people make brightness adjustments, contrast adjustments, and things like this. So manipulation is happening at every level; even forming the image is a form of manipulation. When you developed an image in the old days, in developer, you were manipulating the image to some extent; you were choosing the exposure of different parts of it. So one of the questions is: how much manipulation is okay, and where does it cross a line? And this is a deep ethical question that I think all of us need to answer.

Now, at a technological level, we could develop technologies that track all of the manipulations that are happening, and we could store all of that information in some trusted way, so that you have the full provenance information for exactly all of the manipulations that have happened. Whether that will work to get
an audience member to understand that the photograph has been manipulated is far less clear. A good example of this is advertising: the photos that you see in advertising, every single one of them has been manipulated a lot. And yet many of us don't really recognize how much these things have been manipulated. So there's a technological solution here that can go part of the way, but we also need to work on education, and think collectively about what is acceptable and what is not acceptable in these kinds of manipulations.

And some good news, just today, from a former Knight Fellow, Sally Lehrman. She's been working for several years on something called the Trust Project, with Richard Gingras from Google, and it was announced today at the Newseum that they have some 75 partners around the world who have agreed to label their articles with information about the reporter, so giving you a bio and background that this is a real person; their mission statement; how they are funded, which should be interesting for RT and others if they ever choose to participate; and also what type of article it is, analysis or opinion. And those indicators, you've talked about how they can sometimes be hard for a reader to see, but the platforms have also been involved in this discussion, and they can use those indicators as priorities. There's a separate JSK fellow, Frederic Filloux, who is here, and he's working on quality indicators for journalism. He's working with a million stories, and he's hoping his quality score will be used by advertisers who, maybe for reasons of brand, want to be associated with truth, something like that. So his quality scoring is another example.

Right. The thing with the Trust Project, correct me if I'm wrong, Jay: you have to be participating, though, so they're not going to score everybody, right?

Right; Breitbart is not participating, for example. But what will happen, if you self-identify and start ranking, is that it provides the platforms with a way, or a reason, to treat your content differently. And again, Krishna has also talked about how there are lots of signals that you can use about whether something is fake or not: how long has the account been open, does the person have a track record, things like that. So we've talked about the platforms in a negative way for part of this, but they can also use those signals and prioritize, if they choose. The difficulty, as you pointed out, and the reason Yale had to do field experiments, is that it's really hard to know. Even the people at PolitiFact, who are trying to do fact-checking for Facebook now, they've
said Facebook is not transparent; they don't know how their information is being used. So as a researcher it's hard to collaborate if you sign a non-disclosure agreement and still don't know what's going on.

Yeah. And this whole question of fact-checking is sticky, because there's a lot of skepticism around who the fact-checker is going to be. This came up, I think it was Trey Gowdy on Capitol Hill, who said: well, who are your liberal fact-checkers going to be? And that's one of the questions here: "Media and journalists are heavily liberally biased, mostly Democrats. It would seem that tools to find, create, and distribute stories based on big data will result in much more liberally biased media and journalism. Please comment." So how would you address those concerns?

A couple of things. One: I wrote a book called All the News That's Fit to Sell, which is about the media economics of public affairs coverage. And what you see in public affairs is product differentiation; you could think of media bias as product differentiation. If we asked you on a seven-point scale how liberal or conservative you are, each media outlet in the US has a mean ideology of its audience, and if you ask what is biased, I can predict what you think is biased by the difference between your ideology and the mean ideology of that outlet's audience. We have more outlets now, because the cost of having an outlet is lower, so people are going to have a higher probability of having their worldview reflected back at them. That's all a way of saying that one person's media bias is another person's nirvana.

And I would actually say I wouldn't necessarily agree that the media are liberal for ideological reasons. One of the things I showed in my book was that, if you look at network news, it has a liberal bias in terms of the issues that it covers because it was trying to target women: a woman in her 40s was the marginal viewer at the time I was watching, valuable for two reasons: more likely to make purchasing decisions, based on consumer data, and more likely to be on the edge of watching or not. That meant that the network news was more likely to talk about gun control, poverty, and issues of family and children, not because of any ideology, but because of greed, because it was profitable. So when people tell stories through the media, audiences often think of it as ideological, but it can also be driven by advertiser value.

So along these lines, you made a little reference, Jay, to the issue of polarization, and one of the questions here is about that; it's not a good night for algorithms. It seems that algorithms are accelerating
the polarization in our society, and is there a way to address this through computational methods?

I've got something hopeful here. There is a job market candidate in the political science department this week, Kevin Munger from NYU; he's a PhD student. He designed an experiment in 2016 where he looked at a hundred people on Twitter who were vitriolic and were Democrats, and a hundred who were vitriolic and were Republicans. He then created fake personas on Twitter. He was able to buy a thousand followers; there is a market for followers. And basically he had a Democratic and a Republican persona, and he found that if he sent a tweet after somebody was mean on Twitter, and the tweet was moralistic, saying, hey, remember, the person you're talking about is a human and has feelings, that actually caused the person to drop their level of vitriol, if the persona shared their ideology. So Democrats responded to Democrats, and Republicans responded to Republicans. So I agree that there is that tribalism, but sometimes it can be used for good, or social science, or tenure.

I can just add a little bit here. There are lots of people these days who are interested in engaging, in a thoughtful way, in a debate with people who hold ideologies different from their own. A Democrat wants to talk to a Republican, and vice versa, on individual issues, and really try to understand what the other side is thinking. And one of the great things about the internet is that it provides a way to connect people. So there are a number of groups, including one of the projects in our computational journalism class, that are really focused on trying to bring together individual people of different ideologies and serve as a moderated platform where they can really try to engage in a debate with the other side without leading to all the fire and vitriol that might occur if there is no moderation. So that also leaves me very hopeful.

So one of the other things that just launched at Stanford is something called the Global Digital Policy Incubator. It launched on October 6th, and Hillary Clinton was here for the launch. There was a whole day on this topic, and Timothy Garton Ash, who thinks about some of these issues, talked about freedom of speech being indispensable for democracy. He talked about needing freedom of expression and freedom of information, but also a certain quality of democratic discourse or debate, which, I think that's what we're talking about, is being eroded right now. So one of these questions from the audience is, and I don't know if it's apropos computational
journalism, but it's a very important core question, I think: should freedom of speech be re-examined in the age of Twitter bots and AI? Because one of the things that Sheryl Sandberg did when she went to Washington was say, if you're for free speech, you're all in; I'm paraphrasing what she said. And they are focused on the issue of authentication of the user, but less interested in the truth of the matter of whatever that Russian agent is putting out, as long as they identify as a Russian agent. So this question of freedom of expression: does that need to underlie some of these questions that we're talking about? And again, could Stanford be a place for this, because we've got people who think about that issue as well?

I think that we've been critical of the platforms, but if you think about the market for truth, the market for truth has always been slightly problematic, for the following reason. If I thought my car ran on sand, I wouldn't really get around much. But if I believe that Saddam Hussein was involved in 9/11 and that global warming is a Chinese hoax, I might get high-fives; I might even get elected president. So if you think about the market for truth, I think that we can't blame the platforms alone; I think we have to look at ourselves. And talking about free speech, if you go back to the founding: a lot of what we've been talking about tonight are imperfections in media markets, but they're really just a reflection of imperfections in ourselves. A shout-out here to the Federalist Papers, of Jay, Hamilton, and Madison. They said, if men were angels, because that's how they talked back then, if men were angels, there would be no need for government. And they said we need to design our institutions for flawed people: ambition should counteract ambition, and we should supply, by opposite and rival interests, the defect of better motives. That's all a way of saying that we should acknowledge that people are sometimes going to try to deceive us, and we should still stand up for things like the scientific method and free speech. And the reason that I put them together is that in the Stanford long-range planning process, when we heard from alums, when we heard from faculty, many of them said: we're concerned that people are now attacking the scientific
method itself. And there was a great article about behavioral economics and journalism that essentially said: guess what, the polar bears don't care who your friends are. By that they meant: global warming is happening, and in some areas of the country that's unpopular, and in some areas you might lose friends by saying it, but you should acknowledge that there are some things such as facts, such as the scientific method. And so if you're talking about defending free speech, I would also add a defense of the scientific method, and it's not just because I'm in the engineering building right now.

It's a free speech question too, and, you know, I think this is a question that, just as a country, as a world, we're going to be grappling with going forward. There have always been some limits to free speech, and figuring out where those limits are in this new age is always going to be something that we have to think about.

So, Jay, you mentioned the polar bears. One of the questions here is about how you incentivize good local news reporting, but twinned with that also is how you incentivize, and we've done that thinking mostly about broadcast, which is probably driven completely by ratings, stories that actually matter, as opposed to political horse-race reporting. I'm always amazed that we have 24-hour news networks that can only do one story, for not just a whole day but weeks, months, when there's a giant world of issues out there. How can we incentivize coverage of stories that actually impact people, that they need to know to be functioning in our democracy?

How many people here listen to KQED? Okay, another representative audience. How many... no, no... how many gave last year? Okay, so the rest of you are free riders. That is itself something you can do something about, because if you're basically asking other people to step up and consume public affairs, one of the things you need to do is also support it. So it goes back to those five incentives; sometimes I think it's hard. In the 1970s there were actually requirements, to get your free broadcast license, that you would broadcast in "the public interest, convenience, and necessity." What does that mean? That's actually a phrase from railroad regulation that they borrowed when they started regulating radio in the 1920s.
It's always been amorphous because of the First Amendment: we would like people to broadcast in the public interest, but as soon as we became specific about it, it would violate the First Amendment. I think it's hard to expect profit-oriented broadcasters to give us the spinach, in a way. What we have tried to do is children's educational programming: there's a Children's Television Act that says they have to do three hours a week of it. I sent my students in my media class to the local television station to figure out what it was claiming as educational programming for kids.

Geraldo? Beverly Hills 90210?

Yeah. What they found, and there was actually a federal form that one of my students discovered, was a claim that the Beverly Hills 90210 episode "Beach Blanket Brandon," about having sex at the prom, was educational. So when you tell a profit-maximizing person, maximize profits, and also tell us something educational, they'll relabel it. That's why I'm a stronger believer in nonprofit or subscription-based models. Advertising-supported media has always been problematic, because advertisers just care whether you watched, not how happy you were. And advertising is also biased against high quality, because again, it's just whether you saw it, not how much you enjoyed it; that's why Netflix is better than some of the other things. So when you pay for things through subscription, or when you give through philanthropy, that can generate the information. And the good news is that not everybody has to see it: facts can circulate for free, and facts and research can also get to legislators and staff members.

There was a question along those lines of advertising, which gets away from broadcast: sponsored news content, and whether, when you go in your feed and it says "sponsored news content," it takes you a second to adjust for that. Should we push back, as a society? Do we have a right to? Because, after all, these things are free products. So what do you think, Manish?

Yeah, you know, I think that the labeling is a great start. We have the ability to see that this is a sponsored news product. I would love to see more labels: why was I surfaced this particular story in my feed? Just the way, with certain advertising, you can actually click a button and get more information about why you got that advertising. I think you were telling me about this.

That's right. Fear of government regulation sometimes motivates self-regulation, so the industry does have this coalition; they put a little "i" on the ad, and you can click on it and see why you got targeted. I think labeling helps. I don't necessarily see sponsored content as deceptive, the way The New York Times does it, because it's clearly labeled. I would say another positive thing is that the media are starting to generate revenue through events. That's been a nice thing; it's not advertising, but the Texas Tribune, The New York Times, The Washington Post, they all have events, they charge money, and that's become a separate revenue stream. So, a little bit contr