Building trust in science and technology


Hello and welcome to this session on restoring trust in science and technology. COVID-19 has made science an everyday topic across the world, but it has also revealed how much mistrust there is in science, in technology breakthroughs, and in the media that report on them. What are the driving forces of this mistrust, and what needs to be done to restore public confidence? I'm Jeremy White, Executive Editor at Wired UK, and I'm joined here by an extraordinary panel that will be asking questions throughout the session. On my far right we have Shafi Goldwasser, co-Founder and Chief Scientist at Duality Tech and Director of the Simons Institute for the Theory of Computing at UC Berkeley.

We've got Pauline Paterson, co-Director of the Vaccine Confidence Project at the London School of Hygiene and Tropical Medicine; Rajaâ Cherkaoui, Professor of Nuclear Physics at Mohammed V University in Rabat and Member of the Hassan II Academy of Science and Technology; and finally, next to me, Denise Dresser, Writer, Political Scientist and Professor in the Department of Political Science at the Instituto Tecnológico Autónomo de México.

Welcome, all of you. Thank you very much for joining us. The first question we have is: what are the drivers of contemporary public mistrust in science and technology? Pauline, if I may start with you: what are the key public concerns around vaccination, and how does trust or distrust play a particular role?

Thank you, yes. For vaccination programs to be successful, it really does depend on trust and confidence, not just in the vaccine. Is the vaccine safe? Is it effective? But also trust in the health care providers: does your doctor have your best interests at heart? Is your doctor, or the health care provider giving you that vaccine, competent? But also trust in the process, trust in the policy. Is the policy based on evidence? Do people have trust in the policies that are there? And that's really key.

We conducted a study at the Health Protection Research Unit in Immunization exploring parents' views on whether they would accept a vaccine or not, and 90% said they would. So in the UK, for example, people are willing to be vaccinated when asked. But the reasons why some weren't so keen to get vaccinated were concerns around safety, concerns around effectiveness, or possibly not feeling at risk. So we don't just look at confidence and trust; we also look at convenience. Is it somewhere that's easy to get to? Is it far away? Is it at a time when you can go get vaccinated if you're working during the day? Can you go in the evening or at the weekend? There's also cost, and then there's complacency.

So do you feel at risk of the disease? We have seen that some people don't feel at risk of the disease because it does affect more vulnerable, more elderly people. So the young, we've seen, are less likely to want to get vaccinated because they don't feel at risk. How do you go about encouraging people to get the vaccination, the jabs, without alienating them? You're talking about making it convenient for them, but what about the people who don't want to have them, for example? It's really important not to stigmatize those that aren't vaccinating, and it's really important not to say that they're not vaccinating because they're uneducated or they don't know.

They don't know enough, and we need to educate them. It's really important to figure out why they are not vaccinating. Is the vaccine in a place where people can go? Is it convenient? It's about talking to them, not just communicating outward but engaging with people, figuring that out and then addressing those issues. It would be interesting to hear what has been the most successful effort either of you has seen in fighting vaccine hesitancy. Before I answer that specific question, I want to address some global trends, of which vaccine hesitancy is simply one more piece of evidence.

They have to do with declining trust worldwide, and this is measured by the recent Edelman Trust Barometer: decline in trust in governments, decline in trust in the media, decline in trust in journalists. And I think this is fueled by two parallel phenomena.

One has to do with the disillusionment with liberal democracy around the world. Yeah. Because of inequality, because of the rise in poverty, because of failure to deliver.

And you're seeing the rise of populists on the left and on the right who, because of the particular way in which they govern, tend to provide a narrative whereby they have alternative facts and people should trust in them as the embodiment of the people. And therefore they tend to discredit data, and scientists are part of a data-driven group in society. So there is that political factor.

And I think it also has to do with a second trend, which is the rise of social media as a way of conveying disinformation about vaccines and spreading conspiracy theories. And because of the distrust in traditional institutions, people are now getting their information via Twitter, via Facebook. And the hesitancy rises specifically from those sources of disinformation. Where are vaccines working well? Where is there more trust? In places where there's trust in government, trust in science, trust in institutions. Rajaâ, that actually leads me to ask you a question, leaving aside this polarization.

But do policymakers and politicians trust science and research, in your experience? This seems to be key to whether we actually impart trust to the general public, if the policymakers and lawmakers are brought on board. Is this your experience in your field? Yes. I think trust is mandatory to build a scientific project, and I can give my own example.

I work in fundamental physics. In 1996, as the head of a nuclear physics laboratory, I proposed to my minister that Morocco become a member of an international scientific collaboration. That's not easy for our country. You have to give all the arguments; you have to explain what benefits our country will have.

And it is really, really important to have the confidence of the ministry. And when we had the minister's agreement, after that... I can give some examples. Please. Yes.

I showed that, first, we would have young researchers, PhD students and post-docs, in a high-level scientific [role]. That's really, really important for our country. We would have technology transfer into our country. We would have innovative training. So all of that is very important.

And after that, the concrete and positive results established confidence that lasts to this day, despite changes of government. And this is really, really important. Widening out to other areas, Shafi, I wanted to bring you in to talk about AI applications, something that has been in the news a great deal over the last few years. We use more and more AI applications every day, and this is an area you work in specifically. So, do people trust or mistrust algorithms, and should they mistrust them? Yeah. It's a very good question, because as we know, in the last five years or so there's been more and more talk about machine learning and AI methods being used in every aspect of life.

And some of it is in things we don't even think twice about. The fact that we reroute traffic because we know where the traffic is going: that has to do with the fact that we have a lot of data about traffic, and we have smart algorithms that tell us how to drive faster and more safely. Similarly with energy rerouting. And obviously everybody knows about medicine: we have a lot more data about illnesses and their causes from all over the world, and if we collaborate and bring this data together, we can also learn insights about better pharmaceuticals and so forth.

So AI has been there. It's that we now have more data and more computing power, so we are able to actually take advantage of this data using these AI techniques. And obviously, that's a good thing. It would be very unfortunate if, because of mistrust, these algorithms were not employed.

So I think part of the mistrust - touching on your earlier question to the other panelists - is because everything has happened so quickly. Yeah. You know, even with the vaccine, if you think about it, some of the things the anti-vaxxers say make sense to all of us. It's so fast: this vaccine was developed so quickly, and we're supposed to take it. And who knows? The point is that we have no time, and that's why we're doing it.

So with AI, there are all these techniques, and they're incredibly powerful: language translation, the things we can do today, the illnesses we can cure because we have all this available. There is an amazing thirst to use it because it's so powerful. However, because it's quick, regulation lags, and not only regulation but the technology itself.

And these are things that I work on. You do have to pay attention. Is it accurate? How do you verify accuracy? Is it transparent? Why is a machine learning algorithm making one decision versus another, for example in things like bail? Is it robust? In other words, are there some outlier cases where the system is brittle, where it will break and affect people? Is it fair? There's a lot of use of this term "fairness." In other words, if you have a lot of data about people in general but you don't have a lot of data about minorities, and now you've changed a whole medical treatment.

It might affect those minorities badly. So this falls to scientists: the answer is more science, not less science.

More science will fix these problems, in my opinion, because you pay attention to the issues, you make the fixes, you come up with methods. But I guess my point is, it would be a real shame not to trust, because then we won't be able to take advantage of what we can do. There are legitimate reasons why people are hesitant, because of the speed at which science is moving.

But in my opinion, the way to fight it is with more science and education, obviously: an explanation to people that it's not that somebody is always trying to take advantage of them. Denise, let me bring you in here. I want to broaden this discussion, because I think we're headed very quickly to a situation where you have incredible advances in technology and in AI, and those raise all sorts of ethical issues and questions about how science intersects with democracy. Because those tools that scientists in your part of the world, at UC Berkeley, are very excited about can, as I'm beginning to see as a comparativist, be used by authoritarian governments, and not for good purposes.

I mean, you see the rise of AI for surveillance technology, and I think science should not be only for the few; it needs to integrate policymakers, and there needs to be a debate about the ethical implications of artificial intelligence. Even now with COVID, for example, what did we see? Twitter and Facebook, which were the platforms for the dissemination of incredibly valuable information, also became tools for the spread of disinformation. And only a year and nine months into the pandemic has Twitter started to move against those providing information that led people to vaccine hesitancy. Only now are they starting to remove accounts that were virulent spreaders of disinformation, Fox News being among them, for example.

So I think there's a responsibility for corporations, for platforms and for the scientific community to bring in the ethical dimension, and also for policymakers, because we see the weaponization of science, the weaponization of artificial intelligence. And I think that raises significant questions for the scientific community and for those of us engaged in the defense and promotion of democracy worldwide. How do we actually combat that, then? If we're talking about this disconnect between the science itself and how it's implemented, those are two separate things. It's almost like trying to decide whether fire is good or evil. So, when we're talking in those terms, how do we go beyond the point of just saying AI is evil, or AI is used for evil, and educate the public that it's not the science that's the problem?

It's how it is used, or who is using it, that is the problem. How do we change those attitudes if the public perception is always bound up with how the science is used rather than what the science is? In terms of vaccines, if I can add: we conducted a study looking at misinformation around vaccines, and we did see that if someone is exposed to misinformation around the COVID vaccine, they're more likely to decline vaccination. So there is that. But also, although people go to the internet and to social media, they don't necessarily trust it. So we can't blame social media alone for the reasons people aren't vaccinating. We also conducted sentiment analysis to look at what people are discussing around the COVID pandemic, for example.

And most of it is neutral or positive toward public health. The media do occasionally have headlines that make you worry and fear that most of the information online is not credible, and that people are going to try and change your mind about things. But it's a really small minority, that disinformation. But also, with the internet and social media, people can group together who couldn't group before, and you can find information that will reinforce your preexisting views. So if you're worried about something, you'll find someone else who's worried about it.

And actually, you can fall down that rabbit hole of worry and fear. But I think you're being very specific to the UK. If you look at examples around the world, and I'm not even talking about developing countries, look at what's happening in the United States, where it's not only that people receive bad information from the media; they are receiving bad information from elected officials, from governors who are vaccine reluctant, who don't want vaccine mandates, who keep insisting, after more than a year of the pandemic, that these are not times for obligatory masks in public schools, for example. I think the issue is bad leadership that does not pay attention to science. We have to take into account what's happening in many other countries of the world.

And if those countries aren't vaccinated, we're not going to be safe, globally speaking. Well, we haven't just been doing studies in the UK; we've been doing global studies, and we did a 19-country study where we looked at trust in information from government, and those who trusted were more likely to vaccinate. I think it's really important that policies are based on evidence, because people will then be more likely to trust them. But I agree that these factors of distrust will be amplified if your government is making decisions that aren't based on science.

On this idea of politicians taking science and perverting it in some way, do you think we should be going back to basics, showing the public what science is, so they can make their own distinctions? I think with the COVID pandemic and the urgency around it, everyone has forgotten that science needs time, takes time, and this is very important. Indeed, the scientific process is well known and requires several steps. First you give hypotheses and propose a model; after that you have to confirm those hypotheses by experiment or observation; and finally comes the publication of the results and review by peers.

So the general public and decision makers have to be aware of the time needed to ensure the outcome of science. I think that's really, really important. And with this pandemic, everyone has forgotten that.

The late Stephen Hawking wrote shortly before his death that as a scientist at Cambridge University he had lived in "an extraordinarily privileged bubble". Should scientists, therefore, have more of a responsibility to address the concerns of the real world outside of academia? Denise. Absolutely. I'm an academic, but at the same time I feel that my responsibility is to make very complex issues and academic jargon intelligible to the broader world, so that people understand what's at stake. And I think these panels are useful. They would be more useful if we actually had politicians on them, because they are the ones making the decisions.

I mean, you are all incredible scientists, and I trust and value your work intrinsically. But there is a big gap between your world and the people who are making decisions on the basis of electoral and political imperatives: keeping their party in power, winning an election. And they use science, or abuse science, or ignore science for their own purposes.

I think it's important for the scientific community to know how to communicate with the broader public, to integrate their work or discuss it with policymakers, and to develop the field of science journalism. For example, in the midst of the pandemic, the work that I found most useful was Ed Yong's incredible articles in The Atlantic explaining COVID to the broader world.

He won a Pulitzer Prize for that. And I think in every country you should really develop that field so that you can directly address the public and its concerns, often jumping over politicians who are distorting what you are doing and what you are saying. Should scientists enter the political arena to stop this happening? Denise. I want to take this opportunity to denounce something that is going on in my country, and I think it is happening elsewhere, which is the political persecution of scientists. Mm-hmm.

Well, I think in those specific circumstances science has to defend itself, and you need to forge an international community or coalition in defense of science, and people need to speak up and say this is wrong. We have to build relationships across communities, scientists, academics, policymakers, media, journalists and so on, at a time when in some places science is being applauded and in others it is being persecuted.

Our next question is about public trust in science: does it matter, and what are the solutions? We've already touched on some of these solutions; it's been nice that this discussion has woven between them all. But regarding the media, which we've mentioned a couple of times, and obviously, as a member of the media myself, this interests me: should we dumb down science in order to make it more understandable, more accessible? At what point does simplifying this communication vulgarize or degrade science too much? Is there a tradeoff somewhere, if it means more people understood? I saw that in France you have a Master's degree devoted to scientific mediation, and this is very important. And also, science is moving very quickly.

So international collaboration on each topic is very important. We have to change, to have a relation between... No, we shouldn't talk about Africa, Europe, America separately. We have to talk about a project, and we call on all the experts in the world to explain it.

We have to have the project, but also to prepare ourselves to communicate with the decision makers. It is not an easy exercise to communicate with the general public, and the general public includes those decision makers. It's very important.

And I think you have to shift toward just better communication: broader communication, clearer communication, and resources, governmental resources, devoted to that communication. I agree. But just to say something. You were asking whether journalists should vulgarize, or whatever; I mean, it's the wrong term. But actually, I think... there is no low.

- There's nothing low enough. - There's no low bar. No, I'm joking...

But I mean, you know, the "mad scientist" in "Back to the Future" is a wonderful character. It's good that there are these characters who are scientists and are sympathetic, which the media can use, even cartoons of scientists. The more, the better, because scientists being viewed as friendly characters matters, even for children who are going to grow up to be the adults who either respect scientists or don't. And that they're not, you know, nerds or people who cannot communicate or are not sympathetic. Obviously, the media can play a big part in making scientists, as characters and as a profession, more accessible, more desirable. I think the media actually does a great job in documentary films about science now.

There's a lot more understanding about phenomena, about climate, and that's a big service the media does. What about... sorry, Pauline. Yes, I'd love to add that in the UK we've got the Science Media Centre, and they say the media will do science better when scientists do the media better. They bring together scientists and media, and it's a great network, a great collaboration. Because, as journalists, you will know: you're in a rush, you need to write your piece, you've only got 24 hours.

I don't know how long you have. Less. And the Science Media Centre helps bring in the scientists and helps us write press releases as well. I think it works really well, and it's really important so that the media doesn't sensationalize or try to grab attention with incorrect facts.

So I do often talk to journalists, and it helps to clarify what is happening, because sometimes people, even scientists in other fields, don't understand what's happening, so they need to talk to someone who does. That's a very good point. Do you think the media should be more ready to follow that example and not go for a binary answer? It's a very interesting question you ask about the media, but that's your profession.

So being binary, or sensational, or decisive sounds like a good strategy, right? But it might not be the only strategy. Provoking thought is another strategy. It appeals to me, and my guess is that it appeals to a lot more people than the media believes. So, in other words, even about something very technical: you have an algorithm that makes recommendations, and you want to make it accountable. And then there's the question: are algorithms today accountable? A doctor, if he commits malpractice, you can sue him.

Can you sue an algorithm? Maybe yes, maybe not. How do we make algorithms accountable? So, by bringing up these questions, but not as an accusation: "Algorithms are not accountable; let's not use algorithms." No, no. That's not the end of it. It's an interesting question: how do you make an algorithm accountable?

Can it be done? So, if the media were able to provoke thought, wouldn't it be wonderful? I don't think it's an impossible task, but the easier task, of course, is to provoke sensationalism. And I would agree with you, because I think the media also has the responsibility of raising questions. - Yeah. - Food for thought. Right. You know, at the intersection of science, technology and ethics.

What are we going to do, for example, about the ethics of the artificial wombs that could be five years down the road? Or who gets to go to Mars, who gets to live on Mars? Should it just be private corporations that establish colonies, or is this the responsibility of governments, and is some regulation required? That's just two questions, but there are so many more. I think the media should be posing those questions as hypotheses, as exploratory issues, as food for thought. And I'd like to end the session there. We've covered a lot of ground today, so I want to leave you with some sort of takeaway from our distinguished panel. I think the best thing I can do is ask you: if you had to choose one thing, what's at the top of your list for restoring public confidence in science? We've covered many different factors, but you can only choose one, and I'm going to hold you to one.

And you can, you know, agree with each other, but I'm asking you to choose different ones. Shafi, if I start with you: what's at the top of your list to fix? - I think the media. - The media. Yes. Making the media less about sensation and more about exploration. Excellent. Pauline.

I had five on my list, but if I had to choose one, I would go with engagement. Engagement with the public. Listen. Don't assume. Discuss. Listen. Engage. Really key. Wonderful. Rajaâ.

I think scientific mediation: mediation to cultivate dialogue between the general public, decision makers and scientists. Wonderful. And lastly, Denise. Making science intelligible, making it easy to understand, communicating better, and understanding that scientists do not work in isolation from politicians and other members of society. You do need further integration, collaboration and communication. Wonderful. Thank you so much, indeed.

I'd like to thank our distinguished panel here, and I'd like to thank the audience as well. Thank you for joining us in this session on restoring trust in science and technology. Enjoy the rest of the sessions. Thanks again.

2021-12-13 15:51
