Joan Donovan: Social Media Platforms and Their Irresponsibility, on The Open Mind


I'm Alexander Heffner, your host on The Open Mind. We've intently covered technologies that degrade democracy and civil society here on the program, and today we resume that examination. Joan Donovan is the project lead for media manipulation at Data & Society. She's conducted action research to map and improve the communication infrastructure built by protesters. She's also researched white supremacist use of DNA ancestry tests, and social movements and technology, specifically identifying information bottlenecks, algorithmic behavior, and organizations with like-minded networks. In The Guardian, she and danah boyd make the case for quarantining extremist ideas: when confronted with white supremacists, newspaper editors should consider what she and danah call strategic silence. Likewise, I would argue broadcasters must exert discretion in not covering Donald Trump's disinformational, bigoted rallies, which have provided an open megaphone to terrorize minority groups. In this spirit, I wanted to invite Joan to share her insights into this media downturn and how we can incentivize integrity in our modern-day communications. Thank you for being here, Joan.

Thanks for having me. It's really a pleasure.

That really seems to me to be the trillion-dollar question: how can we incentivize integrity, or good behavior? Isn't that part of it?

One of the things I like to do is keep the long view of the Internet in mind when I think about how we first began to build what could have been called a wide-area information system. The first iterations of the Internet were military installations, but universities were also thinking about how do we connect libraries, how do we share books, how do we share legal documents, how do we know the document is in the same place every time. And so as the Internet developed from this informational, knowledge-based space, we then saw the influx of commercialization. We saw a more open, participatory culture online: people were able to administer their own email networks, they were able to have their own chat boards, BBS was very popular. And as people moved online, there were different subcultures that people were sometimes ashamed to participate in in public space, or couldn't under threat of violence, LGBT people being one group that were early adopters of the Internet. We saw that the Internet really allowed for free play of identity and free play of information in a way that did build some really unique and important communities. And as we go through and think about that first iteration: before we got social media, there was social networking, right? The first movie about Facebook was The Social Network, and it was really about connecting people to people. It had very little to do with the distribution of news or the distribution of information; it was really about that people-to-people connection. And as we see older versions of the Internet, which were about knowledge creation and knowledge construction and about giving people information, collide with everybody being a potential author, we have a different system of communication entirely, one that hasn't built in some of the important lessons that journalists have learned from hundreds of years of working with print. I think that platform companies really need to take a hard look at the ethics of journalism, see why journalists developed standards, ethics, and protocols, and adopt those ethics and protocols into their algorithms as well as into their content moderation.

One of the most pernicious influences, in terms of what must be quarantined, are the algorithms on YouTube that are feeding people disinformation, misinformation, and in a lot of cases now outright bigotry. How are we to deal with that quarantine?

Yeah.
So we've been thinking a lot about this, and doing multiple case studies on YouTube to understand under what conditions you are served certain content. If you're a big fan of a certain music genre, you continuously watch bands that are part of that genre; people who also watch those bands click and like and subscribe to bands in that genre; it builds a subculture, and the algorithm learns, through machine learning, how to serve you more from that genre so that you stay on the platform longer. It works the same for mom-focused groups: if you're into stuff about babies, it'll continuously serve you stuff about babies. But in the accounts we have where we've been watching lots of far-right and white nationalist content, we find that it's very difficult to break out of those echo chambers. Even if you go onto your own account with the intention of watching something as stupid as cats eating pizza, the recommendations and the autoplay for the next set of videos are going to continuously try to serve you things the algorithm knows you like to watch. So even when you try to pivot out of these echo chambers, it'll continuously serve you more and more extremist or white nationalist content.

I've analogized it to carcinogens in the discourse: it's feeding you hate. And it's not different from what should be quarantined on Fox News, when you had an analyst say to a Black guest, "You're out of your cotton-picking mind." These are, on YouTube, very explicit, and on Fox News sometimes coded, ways of calling people the n-word. That's what's going on.

Yeah, so it's mainstreamed. We've looked at some of that in terms of the subcultures of white nationalists, and when they do figure out that the algorithm is flagging certain things, especially the n-word, they pivot. So we saw a big pivot during Black Panther. There was a lot of talk about the movie, and white nationalists wanted to participate in the discourse about it because there were nationalist themes in Black Panther. But what we saw on message boards off YouTube, because they knew the n-word was being flagged, was that they were saying, call them "Wakandans."
And so a lot of the videos from accounts that we follow that are known white nationalists would put "Wakandans" in the title, so that they could then search for this turn of phrase. We noticed very clearly that the subcultures are often experimenting with the algorithms. One of the difficulties, if you're coming at it from a content moderation point of view at the platform companies, is that they employ these trust and safety teams, but unless the trust and safety teams are looking off-platform for coordination, they're not going to see the effects of it on their platform very clearly. They're just going to see, potentially, an uptick in something like "Wakandans," and that makes sense, because of course everybody's talking about this movie; but they don't know that it's a code word for the n-word for these white nationalist communities. As well, we've tracked the uptick in the word "pit bull": they're trying to replace the n-word with the word "pit bull" to create a subset of discourse around black criminality.

And this deals with an emotional and psychological and societal issue: people having hate in their heart, or not having hearts, not having souls, in their desperate attempt to keep alive a bigoted sensibility.

Well, there's that, but then there's also the context in which we are trying to understand this, which is that we've now had about 30 to 40 years of identity-based movements getting very successful redistribution, not necessarily of wealth, but of resources and of rights, particularly the Black Lives Matter movement really making inroads in dealing with police brutality. We see the Dreamers movement with DACA. And so when we look at these young white men, what we see is a group of people trying to form an identity-based movement, but they are not thinking about it in the way that we see other identity-based movements thinking about equality, justice, inclusion. We see them wanting to retain white privilege. They want to retain, or return to, an imagined America where they are at the top. And through this, what we're trying to understand is how those different movement logics are manifesting in media, and manifesting in different coverage of these white supremacist movements, who really believe and know that media is the most important lever you can pull to get attention to your movement. All movements know this. And so we see them coming out into public spaces, we see them showing up in public parks. Of course, Charlottesville was one such rally where they got a lot of media attention, and they rode that wave for months and months and months thereafter.

YouTube, whose parent is Google; Facebook; Twitter: they don't want to operate in a society that has transcended racism, at least in terms of the contemporary norms. They want to continue to give life to the hateful rhetoric, in defense of this puritanical, absolutist First Amendment notion.

Mm-hmm.

So when we have you, or Tarleton Gillespie, or Zeynep Tufekci, my question always comes back to that: Google, and YouTube in this case in particular, and their unwillingness to assert that there are norms to which their members ought to subscribe.

Yeah, and it's really difficult, because within Silicon Valley culture you have a distorted lens of diversity. Sareeta Amrute writes about this in her work on understanding the difference between Asian and white in Silicon Valley: there are a lot of Asian people in Silicon Valley, and so Silicon Valley believes that it's authentically participating in diversity in the workplace. As well, there's evidence we've seen, and there's an organization called Coworker that's been mapping and tracking alt-right influences in Silicon Valley.
With this notion that you should be able to represent yourself and your people and your movement, we see in Silicon Valley a lot of people getting very defensive about the idea that we should be opening up design in a way that has both responsibility at the front and accountability at the back. Responsibility means that you think about your product, and you invest a lot of time and energy into understanding the ethics of the design and the deployment, just like you would do a market study for a drug, right? What are the potential side effects that I don't see? We see Facebook starting to develop a team that can do some of that work, but because the algorithms and the systems, and the management philosophy in these corporations, are difficult to unbox, we don't really know what happens in the midstream part of the machine. And then what we lack is any system of accountability, where someone, somewhere takes responsibility for saying, yes, I provided the platform and the space for these groups to organize, and show up in a place, and do this kind of violent destruction. And so in that kind of system, we start with responsibility, we take the time it takes to develop the products in an ethical way, and then we evaluate them. There are many civil society groups right now working with Facebook on what's called the civil rights audit; Color of Change is one group working on that, as is the National Hispanic Media Coalition. They're trying to understand that portion that very few researchers have insight into. And then on the back end, we need regulation, or some form of regulators, to step in.

When we get to that step, what we notice is a lot of finger-pointing between regulators and technology companies and congressmen, but nobody taking account of what's really happening.

Right. We've talked about that on this program, the Honest Ads Act in particular, sponsored by Amy Klobuchar, John McCain, and Mark Warner. But with respect to this question of quarantining racist content: from my perspective as a broadcaster, it was heartening to see at least one or two examples in recent weeks of most networks and cable outfits ignoring the Trump rallies, because of a collective acknowledgement that they were spewing hate and lies and that they were not news events. In fact, the only way you could interpret them as news events is that you're misinforming or disinforming a huge constituency of Americans. So at least when it came to CNN and MSNBC, for the first time since the campaign, there was an acknowledgment: we're not going to go live to this Trump rally. That's the way to quarantine it on television. We'll talk about what George Lakoff describes as a truth sandwich, as opposed to a lie sandwich: you start with the truth instead of starting with the lie, then correct the record, instead of the lead, the headline, being the lie. So we need truth sandwiches in the discourse. Quarantining Trump rallies is one way to do that, but how do you quarantine online?

Yeah, so this is part of our next step in our research at Data & Society. danah and I have been thinking about the best way to get platform companies to adopt a journalistic ethic and practice around choices in what to serve people as news. And it's a difficult thing, because we have so many people who have learned to write, who are authors in and of themselves, who are not necessarily mainstream media, which to me really means large-scale, million-plus viewers. We have a lot of people online who maybe have half a million viewers, whom we might even call citizen journalists, who have been incredibly important in serving the investigative function of journalism, now that we've seen quite a big decline in local journalism as well as in investment in investigative journalism. And so one of the tensions is that, of course, you can serve more of the big-market news and continue to perpetuate those problems, or you can focus on strengthening that core of journalists who are using Internet technologies. Jack Smith is a good example of someone who has a very broad audience but isn't breaking through to that large level; Mic, for instance, is a good example of that, or Vox. How do you get investment in those kinds of media companies that do interesting journalism? And then you have a bunch of people who are pretending to be journalists, right? They're incentivized by advertising and the money they can make from the clicks they can get; they're incentivized through pandemonium and mischief and the lulz; or they're incentivized to be cloaked: they're essentially cloaked political campaigns. This is what we saw happen a lot during the 2016 election, things that are self-described as news, masquerading as news, but that are definitely politically motivated campaigners. Maybe they're not endorsed by any election candidate, but they're invoking First Amendment rights and First Amendment protections of the press. And that's an interesting split, too, because the First Amendment protection of the press is different from the First Amendment protection of just any old citizen, right? The courts have been very clear that protection of the press is something that really matters to them, and so when a group or an organization has established themselves as press, they do get a little more latitude to say and do things that might be considered libelous or slanderous, because of the importance of the work.

We talk about quarantine, but we have to talk about deletion too: deletion of racist content, deletion of fictitious, fallacious content, like you said, masquerading as real news. On Facebook and Twitter they've had a real struggle with not just the monetization of the misinformation, but having gone through an entire campaign of verifying, with that blue checkmark, folks who were actually not legitimate news sources, and also being unwilling to remove content that was posing as real journalism. One example: if you google Tom Paine, or Thomas Paine, you see an account that was a political operation and not a newsgathering operation. If you're searching one of the most important figures in American history on Google, and the first or second indexed result is a disinformation campaign, one that actually poses as a Pulitzer Prize-winning or Loeb Award-winning reporting outfit, that's a problem.

When we see disinformation proliferate online, we see strategic, coordinated amplification using these tools, and a lot of those tools are right out of the marketer's box. For a long time, marketers online knew that one account is never effective, so they knew they had to be a hundred or a thousand people, right? And you could buy that technology, and it was fairly cheap, and you could run your own small-worlds network online to promote your products. That kind of technology just took a while for states to adopt, but it's a known quantity online, and anti-spam teams know exactly how to do it; groups that deal with lobbies online know how to spot these things. What I think was unexpected about the use of bots is that we also see what we might call cyborgs participating. These are people who will operate the account sometimes as a bot, but other times as a human being, so they might actually respond to you, and this is sometimes confusing for the ways in which content moderation is done. So there's one level of that, and then we see other nefarious people using those kinds of technologies. But we're still stuck in the mindset of one account for one person, and that's not true online. Online is very much a magnification of society; it is not a reflection of it. Yet we have a lot of researchers invested in studying Twitter in such a way that they believe it is a reflection of society, and so our research really hasn't even caught up with the technology being used to manipulate what we think of as information flows online.

Another interesting case study is Wikipedia itself, because Wikipedia is used for a lot of natural language processing algorithms, and when we see manipulation on Wikipedia, we know there are downstream effects for other AI and machine learning technologies. So when we're trying to map all of these things out, what we're really trying to understand is also what platforms can do. And unfortunately, the incentive for a lot of platform companies, because they're so big, is to stay big and to stay global. And when they want to stay global and stay big, they have to be very general about all of their rules, right? So if you do start targeting things related to white supremacist content, they start parsing at levels that are absurd to a researcher like myself. Recently there was a disclosure that in the Facebook content moderation slide deck there was a distinction between white supremacy and white nationalism, and my first question is, but why, right? It's because someone somewhere down the line said to them that white nationalism is an identity and white supremacy is sort of an ideology. I don't agree with that; they both reach the same ends. But when they get down to this level of content moderation and start parsing these things, they start making some very irrational decisions, because the labor involved in doing that correctly is enormous, and they can't just blanket it and ban all these words, because, as we were talking about earlier, the subcultures evolve and they move. So that's what's difficult about it: you can't have this radically open platform, and make money, and at the same time invest in the people power necessary to moderate all that content.

We only have a couple minutes, Joan, but what you're saying is depressing.
Because it shows just how far the companies are from where they need to be in a collective ownership of their airwaves.

One of the main things we've spent the last year really focusing on is understanding, from the platform perspective, the challenges they really face in building out products that essentially haven't changed that much in the last five years. There's just been a lot of attention to them in the post-2016-election period, but if we look to the last five years of activism online, we know a lot of these problems existed prior to the campaign. What we're also trying to understand is that when these companies do provide access to fascists and dictators and authoritarians, and promote it, and allow them to use their ad targeting networks, things of that nature, we do get governments that lean towards the right. So what we need is a much stronger network of people who are reacting to this, and who are networked in such a way that they are trusting of one another and are mindful of when they're being manipulated. And I wouldn't say that all persuasive media campaigns online are manipulation. What I really understand as manipulation is when there's forgery, hoaxing, trickery involved, lies, sock puppet accounts; these are the things that we tracked. The other is persuasion.

Yeah, persuasion. You know, good old-school propaganda might be a way that people talk about it, but with persuasion campaigns, you have to stand for something.

You do. And social media has given us incredible voice. You know, the discourses around clicktivism and slacktivism in 2011 and 2012 are maybe the wrong things to talk about when we talk about what it meant to do action, because online, what we do know is that access to information is paramount. Right now we're trying to understand the relationship between the government and society, and media organizes our beliefs about society, right? Media is our most important lens with which to view the world. Media tells us how to understand foreign policy, media tells us how to understand government, media tells us how to understand technology. And I do believe that there are many great journalists doing an excellent job of pushing these issues and keeping them at the forefront of our public discourse. But at the same time, and this might be a product of cable television, we do see a difference between the kinds of things that people share online and the kinds of things that they are asked to watch day in and day out on television. And I think you're right to point out that what we need is more coverage, and better coverage, on TV as well as on the radio, so that people are seeing a wide swath of information and are equipped with better and more journalism, rather than the echo chambers that the platforms are currently keeping us stuck in.

Thank you, Joan.

Thank you.

And thanks to you in the audience. I hope you join us again next time for a thoughtful excursion into the world of ideas. Until then, keep an open mind. Please visit The Open Mind website at thirteen.org/openmind to view this program online or to access over 1,500 other interviews, and do check us out on Twitter and Facebook @OpenMindTV for updates on future programming. Continuing production of The Open Mind has been made possible by grants from Ann Olynyk, Joan Ganz Cooney, the Angleton Family Foundation, the Alfred P. Sloan Foundation, the John S. and James L. Knight Foundation, the Joann and Kenneth Wellner Foundation, and, from the corporate community, Mutual of America.

2018-08-20
