Fair technologies as a tool to safeguard public health?
What do people in Europe really think about human rights? In 2019, the EU Agency for Fundamental Rights undertook a major survey and asked 35,000 people from 29 countries about their views on fundamental rights. Nine in ten of you agree human rights are important for creating a fairer society. More than three in five of you think human rights can play a useful role in your lives. Should we be concerned? The following answers reveal some stark differences in what people think about human rights. 68% of you think some people take unfair advantage of human rights, but only 27% of you who are better off think this. And 27% of you think judges in Europe are not independent. Do you really think this is acceptable? Let's think about those figures. Let's use them for protecting our fundamental rights, for building a fairer society.

Good morning everyone, and welcome back to the second day of our series of events at the European Union Agency for Fundamental Rights, in cooperation with the Croatian Presidency of the Council of the EU and the EEA and Norway Grants. Yesterday, as you know, we unpicked FRA's survey into what fundamental rights mean to the people of the European Union; you've just seen a video summarising those findings. Today we're going to dig even deeper and look at something which is really topical right now, and that is data protection and privacy, in particular in the context of health. We've got a really interesting debate ahead of us, but first, for some welcoming words, I'm going to go to Norway's State Secretary, Jens Frølich Holte.

Thank you, thank you. I really appreciate the opportunity to speak to you today over a video link, as we have all become very accustomed to doing these days. I'm very happy to be here. This conference is part of the collaboration between FRA and the EEA Grants.
The main goals of the EEA Grants are quite wide-reaching. They are supposed to strengthen bilateral cooperation and reduce disparities in the beneficiary countries. With these grants, the donor states (Norway, Iceland and Liechtenstein) have financed thousands of projects led by NGOs in 14 countries in Europe, with a total budget exceeding 200 million euros, of which at least one third shall be allocated to democracy and human rights.

Respect for individual rights and freedoms is essential for a functioning democracy. The Norwegian government has made rights and freedoms a core part of our foreign policy, and even if we are in the middle of a pandemic, we will keep this as a priority and have also included it in our COVID-19 response internationally. We are concerned about the backsliding on fundamental rights that we observe in some of the EU member states, both in general and, of course, in particular now during the COVID-19 crisis. This tendency puts at risk the necessary trust on which bilateral and international cooperation has been built. And according to the survey that is being launched, we really cannot take for granted broad popular support in all countries of Europe for the principles of human rights. One third of those asked agree that the only people to benefit from human rights are those who don't deserve them; I would say that's quite an extraordinary finding. It is also very discouraging that a large number of citizens do not acknowledge that persons belonging to minorities should be entitled to the same rights and freedoms as enjoyed by the majority. What has European history taught us, through the Renaissance and the Enlightenment, and also from the horrors that we saw during World War Two? The importance of equal access to justice and of protection from violation of our fundamental rights. So in a sense, it's not only fundamental rights that we're talking about.
It's really core and historic European values that we need to stand for. So we have a very important job to do. I think we must find ways to communicate better how human rights are relevant to each and every one of us, and it must happen before those rights and freedoms come under pressure. The challenge is that we act too late, that we voice our concern only when rights and freedoms are being curbed. And you know, once access to information is limited and freedoms of assembly and association are gone, it is very difficult to have influence. So awareness of human rights is increasingly important, on a European level and also in Norway's foreign policy, and I really welcome the report that the Fundamental Rights Agency is presenting here.

Of course, human rights is a topic full of dilemmas and interesting discussions, and they are becoming more prominent in the light of COVID-19. For Norway, the bottom line is that any extraordinary measures introduced in order to combat the virus must be proportional and limited in time.
Such measures must also be weighed against other potential consequences for people and society at large. I wanted to highlight two dilemmas involving technologies that are particularly relevant to the situation we find ourselves in today.

The first dilemma is related to the use of surveillance technology to limit infections. Using tracking and tracing apps is surely a very effective way of responding to this pandemic, and also to future pandemics, but there is a risk of disproportionate interference in users' privacy, and even a potential for abuse. Norway had a track and tracing app. It was introduced very quickly in spring, and it was paused quite recently, in the wake of a statement by the Norwegian Data Protection Authority, which said that the app interferes disproportionately in user privacy, because we now have a very low transmission rate in Norway, which is a very good thing. So the success in handling COVID-19 in Norway also meant that the legal basis for the app could be questioned. The Public Health Institute paused it and is looking at other ways to do it more properly. But in any case, we still have the app in place, not actively being used; it shows that these dilemmas are very much alive in the situation we are in right now.

The second dilemma I wanted to mention concerns the increased flow of disinformation and misinformation related to the pandemic. There is a big debate about whether you should limit freedom of speech and media. The OSCE Representative on Freedom of the Media has in a number of cases expressed concern about disproportionate measures justified by the need to prevent the spread of disinformation and misinformation. But if the real motivation is to curb independent voices, this becomes hugely problematic.
While disinformation about the situation can kill, reliable information will save lives. So I think it's more important than ever to fight disinformation and to support independent journalism. Algorithms in social media might also be used to reinforce the spread of misinformation, which also puts a bigger responsibility on us as individuals not to contribute to its dissemination, not to like or share misinformation and disinformation. We need a collective push to raise public awareness of this responsibility, the responsibility we actually have as citizens.

So these are demanding times. Hopefully the current situation will teach us some lessons and make it easier to sort out these dilemmas in the future. But I'm convinced that a democratic system of governance, based on sound power-sharing principles, on transparency and on public trust, will lead to good outcomes in fighting the virus. And I think if you are in doubt, you can look at the numbers around the world, and you can see that the transparent democracies are in fact doing a very good job in combating what is a very demanding situation with this pandemic.
Transparency and openness from day one were a crucial aspect in stopping its spread. In Norway we have had a lot of dialogue with local governments, our cabinet ministers have been very accessible, we have had daily press briefings, and we have tried to create trust between governments on all levels and citizens, which we believe has made us better equipped to overcome the crisis. But you know, sound power-sharing principles and public trust do not appear overnight. They have emerged stepwise, piecewise, throughout the centuries into a kind of finely meshed system of governance, which I believe is a very true European value: to have a democratic, transparent system of governance where there is strong trust between the government and the voters, the people. And I think that's the strongest weapon that we as Europeans could have in the fight against the virus. Of course we are waiting for a vaccine, which will hopefully appear, but our system of governance is also a very key component.

I just wanted to end by saying that we appreciate the close cooperation we have with FRA and the other EU institutions and member states on these issues, and we know that supporting the rule of law, human rights and democracy is a priority also for the next three chairmanships of the EU. From Norway's side we want to do more to strengthen the rule of law and fundamental rights, and we are very happy to be a partner of FRA in this important work. So I'm looking forward to following the discussions this morning, and I thank you for your attention.

State Secretary Holte, thank you very much for those words. You've outlined very nicely some of the discussions we are going to be having today, and some of the dilemmas facing many European countries with the issue of how to protect rights and tackle this pandemic.
Now, no Fundamental Rights Agency event would be complete without a few words from its director, Michael O'Flaherty. He can't be with us in person today, but he has recorded a message, so before we go into the debate proper, let's just hear from him.

Hello, and welcome to day two of the Fundamental Rights Agency conference rolling out the findings of our Fundamental Rights Survey: the views of 35,000 people in 29 countries regarding what role human rights can and should play in their lives and in their countries. We asked them a whole range of questions about every aspect of their lives, including their attitude to tech, and from a human rights point of view, that's our focus today. We're coming right up to the present minute and looking at the role technology plays in our lives, and at what the rulebook is, and what the rulebook needs to be, regarding respecting human rights: privacy, freedom of expression, freedom of movement, freedom of assembly, and what it must look like today.
That includes, of course, the immediate context of COVID-19. As some of you will know, the Fundamental Rights Agency launched a bulletin on the topic of COVID-19, human rights and tech just a few weeks ago, and we saw there is some real value in, for example, tracing applications, but nevertheless a need for them to be rolled out in full respect for the fundamental legal principles of necessity, proportionality, non-discrimination and, most basic of all, legality. But let me leave it there. We have a fantastic panel lined up today to discuss these issues from every possible dimension. We also invite you to participate via the various means we are making available. Enjoy the conference, and thank you for joining us.

Thank you for those kind words. Now it's time to get our panel discussion underway. The aim is to unpick the statistics in this report, to ask questions about how we can protect fundamental rights, human rights and our own personal data, and at the same time face up to public health crises. And of course, as we know, the reason we're having this debate virtually is that we're in the middle of one of those at the moment. Now, we have a fantastic set of panellists to discuss this. We have Clayton Hamilton from the World Health Organization's digital health division. We have Wojciech Wiewiórowski, who is Europe's Data Protection Supervisor, Max Schrems of the European Center for Digital Rights, and Cornelia Kutterer of Microsoft, who works on digital privacy and regulatory policy in the European Union. And of course we have David Reichel, project manager at the Fundamental Rights Agency, and we're going to ask you to speak first, David. First of all, highlight some of the key findings of your report.

Thank you very much, and good morning everybody.
Yes, it's my pleasure to give you a very quick overview of some of the main findings of our report on data protection and privacy, which was just published and is based on the Fundamental Rights Survey. As was mentioned yesterday, the Fundamental Rights Survey covered 35,000 people in the EU, in North Macedonia and in the United Kingdom, and it is representative of the general population aged 16 years or older. It covers several areas of fundamental rights in the EU.
One set of questions we asked was linked to data protection and privacy, and the report that was published covers selected findings from these questions. I would like to give a quick overview and also share my impressions from doing the analysis. I think in general we have quite positive results when it comes to data protection in the EU; however, there were also some surprising results, and of course there are some issues that need particular attention. Let me highlight five points.

First of all, on the willingness to share personal data: obviously, and not surprisingly, the willingness to share data depends on the type of information. People find it much easier to share information related to their address, citizenship or date of birth, as compared to information linked to, for example, grounds of discrimination, such as sexual orientation, religion or belief, or political views. I found it striking that only 7% of people in the EU would be willing to share data on their political views with governments, and only 5% with private companies. And not surprisingly, people are really reluctant to share data linked to biometric information, such as fingerprints or facial images. However, it's important to keep in mind that there is a certain share of the population that is just not willing to share any of those data: 23% would not be willing to share any of those data with public administrations, and 41% not with private companies.

The second point is linked to knowledge of rights, and I think here we really have good results. Many people are aware of the General Data Protection Regulation, the GDPR, which I think is one of the most famous pieces of legislation in the EU, and many people are also aware of their rights, for example to access their personal data. However, we also observe a strong variation across countries, and in several countries the majority is still not aware of these basic rights.
The third point, and here I would also like to show you a slide, if it can be shown by the tech team, is linked to tech skills. Apart from awareness of your rights, some people just don't know, for example, how to check the privacy settings on their smartphone. It is 24 percent, one in four, who do not know how to check the privacy settings of any of the apps on their smartphone.
Relatedly, and also linked to the discussion now on contact tracing apps, one in five, 19% in the EU, do not know how to switch off the location settings on their smartphone. There is also variation across countries here, but in all the countries at least one in ten does not know how to do this on their phone.

The next point, and now we can stop sharing the slides, is linked to trust and concerns. I was surprised to see that there is much more trust in government as compared to private companies. This result is consistent across our different questions and different types of information. We heard in the panel yesterday that now, during the crisis, the level of trust in government is higher than ever before, but it also needs to be kept in mind that the survey data were collected before the crisis, so this pattern appeared already before it. People have concerns that their data are used without their knowledge: most of them, fifty-five percent, are concerned about people engaged in crime and fraud, but already 31 percent are concerned that advertisers or businesses use their data without their consent, just followed by foreign governments.

The last point I would like to make is linked to gender. In some of the questions on data protection we do find a gender difference, with women showing somewhat lower awareness of how to check the privacy settings on their smartphones and how to switch off location settings. And also in awareness of rights there is a gap between men and women. It's not a huge gap, but it's a significant gap, and I see that gender is not always present in discussions around data protection.

So these are just a few takeaways from the Fundamental Rights Survey. Our rather short report presents you with several statistics, but obviously statistics rarely speak for themselves, especially in the context of important societal challenges.
I think it's an important basis for policy discussions, and I really look forward to hearing what the panellists have to say and how they interpret these statistics. Thank you.

Thank you very much, David Reichel, for that overview of the key findings. Clayton Hamilton of the World Health Organization, I'm going to come to you first, because I know that unfortunately you have to leave us a bit earlier than the rest. Now, 23% of people in the European Union, it seems, don't want to share any personal data, not even their date of birth, not even their address. That's really quite a challenge at a time when governments are asking their populations to share information on public health grounds. What do you think governments can do? How can they use these technologies without compromising data protection and privacy?

Thanks very much, Imogen. Let me start by saying that since the beginning of the COVID-19 outbreak, United Nations human rights officials, the WHO Director-General and the WHO Regional Director for Europe have been stressing the importance of protecting the right to health for all people, and in no area is this message more pertinent than in the domain of digital health. What we have actually seen in the COVID-19 pandemic is an unprecedented call for digital technologies to be employed by countries in supporting national and international public health response actions. Digital health has in fact proven to be invaluable in bolstering the ability of health systems and primary health care to cope with massive surge scenarios, and at the same time in maintaining the continuity and delivery of essential health services.
We've also seen that new modalities of public health surveillance and dissemination of key public health information have opened up and received wide acceptance by populations. So essentially, we are already seeing digital health employed by member states across a wide range of contexts in responding to COVID-19, from awareness and prevention, through diagnosis and diagnostics, to supporting the relaxation of public health and social measures.

Now, the question that you posed, Imogen, is really about how governments can use technologies without infringing on the fundamental rights of people, and there are a number of very concerted actions that we can take. Firstly, what we're really talking about is underlying issues of trust and digital literacy. As we've seen from the statistics in the report, we're talking about one in five to close to one in four individuals who are unwilling to share their information, and right away that suggests to us that there are underlying issues of trust in the use of technology. I would also tend to suggest that there are literacy issues, people not really understanding exactly where data is stored in a technology and how it is secured, and also, from the government's perspective, insufficient communication about how that takes place. So really, what we need is a mandate that technologies such as track and trace or digital contact tracing tools, which have recently gained much prominence and notoriety, should only be used for the purpose of conducting anonymised public health contact tracing and exposure notification, and that they are distributed under the guidance of national health authorities.

We also certainly need an operational model that is on an opt-in basis, and I'm happy to say that in the majority of European examples of digital health use it is in fact an opt-in basis, with clear consent applied.
Where applicable, an individual should also have the ability to withdraw that consent, should they feel the need. The systems themselves need to be designed on the basis of the fundamental principle of what we call privacy by design, and that essentially means that the very inception of the tool itself takes privacy into account. In the context of digital contact tracing tools, that means there is no location tracking of individuals, for example; that's a very key point to ensure that trust and privacy are in fact respected. We of course also need to have strict controls and measures applied to the data captured through contact tracing apps and other tools, and in particular have it received, stored and governed by an independent third party who has oversight and accountability for its use. And then, finally, I would say an important measure is to establish criteria for sunsetting these types of tools following the peaks of the pandemic, and to specify permanent deletion of data when that data is no longer used by public health authorities. Those are the main actions I believe we can take to ensure that we can still benefit from the technologies being introduced without infringing on the fundamental rights of people.

Okay, just to follow up there, because I'm sitting in Switzerland, where the app is being rolled out today, and we all got a little notification on our phones this morning: welcome to the app, it's voluntary, please download it. It appears to tick all the boxes that you outlined there, and yet I know many, many people here who are saying, no way, I'm just not going to do that. So can these apps be useful if a significant proportion of the population just don't want to use them?

Thank you, yes, and I think that's a really important point.
In fact, there has been a lot of conjecture and discussion in the media about the level of adoption of these types of tools, and I've seen figures ranging from 30 percent right up to 80 percent of adoption quoted as being required for them to deliver a public health utility. I would say that's a misconception, and something that we need to address carefully. Firstly, if we look at the evidence and research that has been done, and I'm in particular referring to research done by the University of Oxford, they have actually modelled how the use of an app might reduce a virus's reproduction number, reducing that value of R zero, R nought, to less than one, and how that would impact how many people would essentially catch the virus. What they in fact found was that if about 56 percent of the population, or about 80 percent of all smartphone users, which is a very high level, used the app, that alone could reduce the reproduction number from a value of roughly three right down to less than one.
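The relationship between app uptake and the reproduction number that Clayton describes can be illustrated with a toy calculation. This is a minimal sketch under stated assumptions, not the actual Oxford model: it assumes a transmission can only be averted when both the infector and the contact run the app, and the function name and parameters are invented for illustration.

```python
# Illustrative toy model (NOT the Oxford group's epidemiological model) of how
# contact tracing app uptake can lower the effective reproduction number.
# A transmission can only be averted if BOTH parties run the app (probability
# uptake**2); 'efficacy' is the assumed fraction of those transmissions that
# prompt notification and quarantine actually prevent.

def effective_r(r0: float, uptake: float, efficacy: float = 1.0) -> float:
    """Effective reproduction number under digital contact tracing."""
    if not (0.0 <= uptake <= 1.0 and 0.0 <= efficacy <= 1.0):
        raise ValueError("uptake and efficacy must be in [0, 1]")
    return r0 * (1.0 - efficacy * uptake ** 2)

if __name__ == "__main__":
    r0 = 3.0  # reproduction number with no intervention
    for uptake in (0.0, 0.3, 0.56, 0.8):
        print(f"uptake {uptake:4.0%} -> R_eff = {effective_r(r0, uptake):.2f}")
```

Even this crude sketch echoes the two points made in the discussion: at 80% uptake among smartphone users the effective number lands near one, while a much lower uptake of 30% still yields a modest reduction rather than no benefit at all.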
Now, this model and the research they were doing actually assumed that people over the age of 70 remained in a lockdown situation, and that no traditional contact tracing was underway, so I think there is a very large margin for error. What we have really seen is that the media have picked up on this type of research, and it has in fact been restated as meaning that no impact will be seen until you reach these levels of 56 to 60 percent, and that is in fact not true. We can actually, I think, say now with some confidence that there is utility gained at much lower values, and if we have numbers that are simply around double-digit figures, I think we are actually going to see some benefit from their use. Obviously, the greater the percentage of the population that trusts, adopts and uses these tools, the greater the public health impact they will have; but it is certainly not the case that we need unrealistically high levels of adoption for them to return any value.

Okay, thank you very much. I think we will almost certainly return to the pandemic and the app later in the discussion, but you outlined some of the concerns that people have, Clayton, and they are around data protection. So I'd like to turn to you, Wojciech Wiewiórowski, because the GDPR is one of the key pieces of legislation that we've had over the last few years. 69% of people in the EU have heard of it, which is great, but that does actually mean that one in three haven't.
And so it would help people now, wouldn't it, if they knew about this legislation when they're thinking about this app; it would help if they knew what their data protection rights are. So why do you think one in three have not heard of it?

Thank you very much, and thank you for the invitation. I also liked the question you are asking, because I have to say that over the last two years we were rather accused of talking about the GDPR too often, and of putting it too much on top of the discussion about each and every topic in the social dialogue. So the fact that there is still a little lack of awareness about the General Data Protection Regulation is something that the data protection authorities should be very aware of, because it is their job to make this awareness higher. On the other hand, we have to say that people are mostly keen on the legislation that provides them with the rights to defend their privacy when their privacy is in danger. So sometimes it happens that those people who are not interested in the legal protection of their data are simply the people who do not fear, or do not feel, the problem itself. I think that what we actually get out of the survey shows quite a big awareness of the GDPR itself, and also of the rights that are derived from the GDPR, so I would not be that much surprised by the result.

What was very important, and I'm happy that David said it in his first intervention, is that this data was not collected during the COVID crisis but before it. So it would actually be good to check what the results are after the crisis, or after some time, because this data is the starting point for what we are right now seeing during the COVID crisis.
Of course, it also means that the whole trust the governments have gained was gained before the crisis, and of course it differs from country to country, depending on the political situation and the general attitude of the society towards the governments themselves.
It would once again be interesting to check, after a while, what the attitudes of the people are after the efforts that were taken by the governments, but also after the efforts that were taken by the companies and private business. That is something we would also be very interested in. So I'm happy that FRA is preparing surveys like this and also working further and doing more research on this. This is why, the day before yesterday, we signed a Memorandum of Understanding between the European Data Protection Supervisor and the Fundamental Rights Agency, and I had the possibility, with Michael O'Flaherty, to talk for a while about this future cooperation.

I think it's also important, when you think about the attitude of people towards privacy and data protection, to say that there are no general patterns to be followed. From the social point of view, I can see even in my family that the attitude towards sharing information about oneself can differ, even between me and my wife, and I will probably not share the same information as the other persons in my family. So even if we are coming from the same background, even if we are coming from the same country, even if we have a very similar education, the attitude towards things which are personal might be very personal, because they are personal. This is the intimacy that is taken into consideration here. And I hope that Max will be able to supplement this intervention about the awareness in Europe with the checks that were done by civil society.

We have maybe lost... okay, I hope that Max will be able to supplement it with the point of view of civil society on this research.

Yeah, absolutely, I'm about to come to you, Max. However, I just wanted to say that, since we've been talking about the GDPR, we've got a bit of a treat for people particularly interested in how that historic legislation came about.
Is a film called, democrats in mounted, art and that's democracy in the blizzard of data and, you. Can watch, it over the, next three, days free, online access, i think if you go to. The. Fundamental, rights agency's website you will find a link, to it they. Are, now. Max. Schrems civil. Society, the the. Part that that, keeps, everybody else on their toes, about. New technology, and data. Protection however. One. In three internet, users in the european union say they never read the terms and conditions, that that thing about cookies, they just click away from it without setting, any any, preferences. They. Don't read, many, of them the terms and services, or if they do and, we all know how long that they are. 27. Percent say, they don't understand. Them and one, in two half, find. It difficult, to consent. To personal, data usage. So. When, we come governments. For example to promoting. Applications. Like contact, tracing. Is. It, right. If if we don't know if governments don't know that the people have properly, consented. So, i was actually reading these results, and there was i think they're probably much worse than these numbers in reality, for. Example, if the question is finding. It easy to consent, it's usually pretty easy to click the big green button that says consent, but, the button that says i do not want this it's usually very hard to find and. I, guess, if in.
I guess the point I'm trying to raise here is that we're not going to win this via the users. It's going to be very hard for people to really understand how a tracing app works. I learned programming in school, I've been doing data protection for a long while, but it took me a week or two to understand how the different tracing apps really work and what, technically, they really do, and even then I realised that some of what I had assumed was wrong. So it's kind of hard to shift this responsibility to users, to inform users so that they make these choices. I think we'll have to get a different system, one that is probably based more on data protection authorities enforcing the law, building a system where we just generally trust digital services or apps, be they from governments or private parties, instead of asking the user to understand all of this, because the reality is it's way too complicated for an average person to really understand and to make these choices. Just to give you an idea: I've worked on Facebook for nine years and I cannot tell you fully how Facebook works. But the idea is still that people should be able to read all of this, that, you know, a 17-year-old mum who wants to see pictures of her kids or whatever should be able to read all of this and understand it, which I don't think is very realistic as an approach to privacy. Not that people shouldn't still be informed this way as much as possible, but I just don't think it's going to cut it in the end.

David Reichel, you had your hand up there while Max was speaking, so let's bring you in.

Yeah, thank you. That's very interesting input I'm hearing, and I would like to point to the numbers. As we said before, one in three is not reading the terms, and of those reading, 27% are not understanding them, which adds up to half of the population either not reading or not understanding what they're agreeing to.
And I hear from your interventions that there is, of course, variety across different people, and there are also some patterns, as I mentioned before, that we see in the results. There is a gender difference observable when it comes to data protection. We also saw in the discussion yesterday, when it comes to general knowledge of human rights related issues, that it is especially disadvantaged people in society, those with difficulties making ends meet, who have less understanding or awareness of their rights. And then, of course, we also have the variation across countries. This also links to what you said, Wojciech: we are now in the crisis and the data were collected before, and I think this way the data give a good baseline for discussion, because during the crisis people might change their minds quite quickly, depending on what numbers are reported on TV. So, based on the numbers we have at hand, I am wondering to what extent, because, as you said, Max, it shouldn't be shifted onto the people, this should inform policy actions, in the direction that they shouldn't be general but more targeted to certain groups and also to certain areas, me working on artificial intelligence, where it is really difficult for many people to understand what you are actually consenting to.

Thanks very much. Now, I am going to stay with this topic just for a moment, because, as I said, Clayton Hamilton from the WHO does have to leave us at around 11. So, Clayton, there is one thing; it is not entirely data protection or fundamental rights, but the tracing app is something that is in the headlines all the time, and it is being
described by some governments as a game changer: "this will solve our problem". Could you put that aspect of it, its actual role in public health, into context for us, so that when people are downloading it they understand, in health terms, what it can do and what it can't do?

Thanks, Megan. Yes, I think that is an extremely important perspective to bring into this discussion. Let me really make sure that we understand what we are talking about here. Digital contact tracing is an emerging technological approach to traditional contact tracing that utilises the inbuilt Bluetooth functionality of the smartphones many of us are carrying around, to record proximity interactions between individuals for the purpose of limiting transmission of COVID-19. If we dig deep into it, the working hypothesis of digital contact tracing is that community transmission of COVID-19 could be substantially slowed with the aid of a system that automates the human tracing process, by rapidly identifying and alerting those people who have been in close proximity to an individual who at a later point in time tests positive. So the key public health objective of digital contact tracing is reducing the time between the proven primary infection and the identification of potential secondary cases, in order to interrupt the chain of transmission. What makes this more pertinent is the high number of asymptomatic COVID-19 cases and the large volume of suspected and confirmed cases; these are important drivers for introducing digital solutions to support national contact tracing teams, as is, admittedly, the highly limited amount of human contact tracing resources in countries to adequately address scenarios of rapid propagation of disease. So we are really looking at a matter of resources and time. We know that most member states are doing manual contact tracing; however, this process is not perfect, and there is scope to do better. This is where apps come into the picture. We may not remember who we sat next to in public, on the bus or on the train; we may not know who we were standing next to in the supermarket; and there are issues with recall as well. Recollection is particularly difficult, interestingly, for those individuals who are asymptomatic. So apps can address this gap. There is also the perspective that individuals may not want to give up close contacts for fear of guilt or recrimination. I would also say that governments are seeking a range of different means to support the easing of the public health and social measures and, essentially, to kickstart economic activity once again, so digital contact tracing is certainly seen as an important factor within that context as well.

Thank you very much. I think that puts it, to a certain extent, in perspective for us: it is a complementary tool, but it can't replace what countries are having to get on with now, which is the hard graft of tracing and tracking and following up, and testing and tracing and tracking and following up again.
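The mechanism Clayton describes, phones recording Bluetooth proximity events and later alerting the contacts of someone who tests positive, can be sketched as a toy model. This is only an illustration of the decentralised variant that comes up later in the panel; the class, the 14-day window and the rotating hex identifiers are illustrative assumptions, not the design of any specific national app:

```python
import secrets


class Phone:
    """Toy model of one handset in a decentralised digital contact tracing scheme."""

    def __init__(self, owner):
        self.owner = owner
        # One rotating random identifier per day, broadcast over Bluetooth.
        self.broadcast_ids = [secrets.token_hex(8) for _ in range(14)]
        self.heard = set()  # identifiers received from nearby phones

    def near(self, other, day):
        # A proximity event: each phone records the identifier
        # the other phone broadcast on that day.
        self.heard.add(other.broadcast_ids[day])
        other.heard.add(self.broadcast_ids[day])

    def check_exposure(self, published_positive_ids):
        # Matching happens on the device itself: the server only ever sees
        # identifiers voluntarily uploaded by users who tested positive.
        return bool(self.heard & set(published_positive_ids))


alice, bob, carol = Phone("alice"), Phone("bob"), Phone("carol")
alice.near(bob, day=3)  # Alice and Bob share a bus ride on day 3; Carol meets neither.

# Alice later tests positive and uploads her broadcast identifiers.
print(bob.check_exposure(alice.broadcast_ids))    # → True: Bob is alerted
print(carol.check_exposure(alice.broadcast_ids))  # → False: Carol is not
```

The privacy-relevant point the panel keeps returning to is visible even in this toy: no location is recorded anywhere, only the fact that two random identifiers were once near each other.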
Cornelia Kutterer now, because we want to move on to another aspect of the research, and that is facial recognition. Interestingly, I hear that UN Human Rights in Geneva is about to publish a report on this later this afternoon, exactly on the aspects of new digital technology and the protection of human rights, particularly in political protests. Now, our survey shows that 17% of people would be willing to share their facial image with public authorities, though just 6% with private companies, and that 31% of internet users these days think that companies, commercial companies, advertisers, will use the data that they shared for commercial purposes without the individual's permission. So, a first question: were you surprised by these findings, and by the apparent lack of trust there, particularly about facial recognition technology?

Thank you, Megan. First of all, let me thank the Fundamental Rights Agency for inviting Microsoft to this panel; we are thrilled to participate, and thanks also to the supporters of this event. I want
to say that in particular the introductory remarks from State Secretary Jens Frølich Holte were absolutely spot-on for this discussion. And before I go into the specifics of one particular artificial intelligence technology, let me at least say that what we learn from this public health crisis is that data is indispensable to address it. We have learned quite a bit over the last couple of months; we could almost say that the digital transformation that would have happened in the next two years happened in the last two months: getting businesses, public administrations, societies and schools online to continue daily activities. In this context, Microsoft's role has been broad, and we have in particular worked a lot with first responders, with hospitals and public administrations, to address the first reactions, for example by providing chatbots to hospitals so they can manage the increased demand for information from concerned citizens during the crisis. We are working with the World Health Organization to create a global data platform for governments to plug in and exchange the data they need. So this is the broader context in which the discussion has to take place, as background. And of course there is no doubt about it: we can only proceed, and we can only benefit from the advantages of the technology in retrieving insights from data, if trust is established. So we as a company are absolutely, fundamentally looking at trust in this context. This acceleration of digital transformation will remain, and it will in particular accelerate in the context of artificial intelligence. Now, going to the one core area of artificial intelligence, facial recognition: I think it is an interesting outcome
that aligns very well with how the GDPR has treated biometric data. So I am not surprised to see a very big alignment between, for example, political beliefs and facial recognition or biometric identification in terms of the caution that people express in this context. In a way, it is a proof point that the GDPR got it right in saying that these are really sensitive data sets and require a specific treatment within the GDPR. Hence, not only with the GDPR here but also with our fundamental rights
considerations, I think Europe already has a very strict and stringent framework to deal with facial recognition technologies. Now, facial recognition technology depends on the use case. You have, of course, verification: your iPhone opening with your face. That is a different use scenario from the one in the public debate we are having, which the white paper on AI that the European Commission has published discusses precisely: biometric data or identification used by law enforcement in public spaces. So we also need to dig a little deeper into what people are concerned about, what the topics are that we want to address, and how. There are some analogies, I think, between the tracing discussion we are having and the discussion we are having on facial recognition. They go to the core of fundamental rights, and Max knows this very well, because he has led two European Court decisions which decided very clearly that there are situations where the core of a right is infringed, and where no justification is then possible. So we will really have to fine-tune and understand when we are actually talking about a right that is infringed in its core and cannot be justified, and, where that is not the case, what is proportional, what is necessary, and what specific safeguards we need to put in place in order to make sure that the rights of citizens are not infringed. I think the frameworks that the Fundamental Rights Charter and the Human Rights Convention provide us in Europe are a fantastic basis for fine-tuning and having this public discussion. As State Secretary Frølich Holte said, the dilemma is eventually not a dilemma when we face our discussions on these fundamental rights.

Cornelia, thank you very much. Now, as you know, we have reactions to this survey coming from officials
and human rights activists across Europe, who have been sending in video messages, and we have a couple to listen to now before we go back to our discussion. So let's take a look at what two very well-known human rights defenders in Europe have to say.

Humans and machines are living in an ever closer relationship. Ensuring that technological development works for, and not against, human rights, democracy and the rule of law is therefore one of the biggest tasks that states must face. I congratulate FRA for putting a renewed spotlight on the intricate links between data protection, digital technology and human dignity. Technology is never neutral; it is very personal, because it carries ethical, political and legal implications. Digital technologies can improve the quality of our lives, increase efficiency, strengthen accountability, and create new opportunities in many key sectors of life, like health care, education
and employment. And they can of course strengthen human rights protection in a variety of ways.

Some say that the right to privacy is dead. Luckily, the survey from the Fundamental Rights Agency shows that the rumours of its death are highly exaggerated. In fact, citizens of the Union show good knowledge about the right to privacy, and also a great amount of trust in our governments when it comes to the handling of personal data. Governments should of course take care of this trust, especially when it comes to contact tracing apps. We have seen citizens raise quite a lot of concerns in terms of privacy and the collection of unnecessary data. Trust is earned, and if citizens raise these concerns, governments should of course listen. This is especially true in the face of a common enemy such as a pandemic, in which human rights should be front and centre of any and all efforts that a government pushes through, both in relation to COVID today and for any future pandemics that might come.

Two videos, two well-known faces there. Wojciech, we heard in the first video that technology is never neutral; we heard Marya Akhtar say that trust in technology has to be earned. Now, as we heard earlier, 69% of people in the EU have heard about the GDPR, and 71% about their national data protection authority. But what about the ones who haven't? What problems do you see people facing if they are not aware of their rights, given that, as Cornelia just said, we have made advances in the use of digital technology in leaps and bounds over the last two months to cope with this pandemic?

I think the most important problem we have is that if you don't know what your personal rights are, then trust is the only thing you have. It is then zero or one, binary: you either trust or you don't. You don't have any way to check it; you have no plan for how to deal with it. So you either oppose
everything which is done, whether by the companies or by the governments, or you have to agree even if you don't like it, even if you don't think it is fair. In this sense, the role of the data protection authorities, but also of the civil society organisations, is to make it possible for people to use their rights, to ask for access to the data whenever they suspect that something is wrong, and to make sure that nobody takes that opportunity away from them. It is a bit like the right to take holidays and travel around: you don't force people to travel, but you want them to have the right and the possibility to do so. That is more or less what we do with privacy as well. We don't mind if people say "we trust our government" and think that the government knows better how to deal with things. When I approach a plane, I really think that the fact that it flies is magical. I was taught how to calculate how a plane is able to fly, but I still can't quite believe it. But I know that there is somebody who is checking it, who is controlling it, that there is some system to deal with the fact that it is secure, and that is the tool that I can use, and
that is probably the point where, since I am not the expert, trust is the only thing I can take. But coming back to privacy, coming back to data protection: I think what is very important, and what we have to remember, is the possibility for people to check what purpose the data is collected for, and whether the data is used in line with this purpose. That was part of the discussion we had in Norway in the last days, where the Data Protection Authority found it very hard to say that the purpose for which the data is collected had actually been achieved by the contact tracing app. I am one...

I think we have perhaps lost you again, Wojciech. This is the difficulty with virtual debate, I'm sorry. There must be some problem with the internet on my side.

Yes, just finishing with it. As I said, we should know what the data is collected for and what it is used for. I am definitely one of the first persons to download a contact tracing app, but I am also one of the first persons to ask: do you really know what you are collecting this data for, and what is it used for?

OK, thank you very much. Max Schrems, I am assuming that your internet connection is stratospheric, you being in the business that you are in. We are talking about people who undoubtedly use new technology, use apps, have smartphones, but don't entirely know how they work, so they don't know what is happening with the data; you touched on this in your earlier answer. One of the findings in the survey is that one in five smartphone users don't actually know how to turn off the location settings, and that among that group women are less aware than men. Does that say to you that we need a gendered approach to this, or is it more a matter of basic, fundamental, proper education? I mean, you said you yourself still couldn't understand how Facebook works, that the whole way
of divulging this information is far too complicated for your average user.

To be honest, I am probably the wrong person to ask about this, but, you know, probably a lot of women are just more honest about not knowing. Because, just to give you an idea, we now have 15 people working at noyb on privacy, and we sit there every day in meetings saying "how does this really work, we need the tech person to come in". I have calls with big tech companies that tell us "we don't really know which legal basis we use under the GDPR, so we just write all of them in there, because we don't know ourselves". So that is the reality even for people in the know on privacy. Expecting the average guy at home to really understand these things in detail is just very hard. Obviously, setting your location settings is something most people know, but if you go into the details of questions like "who actually gets my data?", I am wondering if people even realise what they don't know, because it is the unknown unknowns: it is kind of hard for people to even conceptualise whether they understand it or not. And I think that dives exactly into something I have said before as well: we need experts for these areas; we need an enforcement system that we can, to a certain extent, just blindly trust. You just gave the example of an airplane. I usually say: I get on a train, it runs at 200 kilometres an hour, and I don't know how this works; to me it is kind of weird that this thing stays on the track, and I don't have to know. I basically just want to use these things without having to worry 24/7.
I think that is a bit of a path forward, and that could also help a lot for systems like contact tracing. We actually put out a paper rather early on contact tracing, saying that under the GDPR most of this is perfectly fine: there is a legal basis for exactly these things. The big question is how you do it. We heard about having a privacy-friendly approach, which is doable, and which ran into technical issues, but I think overall what we have to build in Europe over, let's say, 10 or 20 years is a situation where we generally don't worry about these topics anymore, because we have experts in the companies, in government, and in the bodies that review it, mainly the DPAs right now, who ensure that stuff is just working, just like with building codes, high-speed rail, planes, whatever. The average guy does not have to worry, unless you have a Boeing 737 MAX, where apparently someone did not check properly. But that is exactly it: we were outraged in this one situation and could not understand how this could happen, while in the privacy world we still kind of expect that these planes just come down sometimes. I think we still have a lot of catching up to do there.

Cornelia and David, you both wanted to come in there. Cornelia, I'll come to you first.

Thanks. I think what Max is saying is quite right; I just want to level this up a bit. First of all, the GDPR is still a new baby, and there is still a lot of work ongoing. The European Commission this week has just issued its review report, which was due this year, and you can see that, while a success, it is still work in progress, in particular on the guidance that businesses need from the European Data Protection Board on how to read certain provisions. I also do believe that a culture shift is necessary
both in companies, in how to think about the GDPR, but also in how to work with companies in advancing the discussions around this. And this will of course be accelerated by new technologies such as artificial intelligence. In the context of artificial intelligence, we are obviously looking not only at the GDPR: the Commission is also reflecting on specific rules that address areas other than personal data protection, such as discrimination. And I am bringing this up in particular because gender was raised as a topic. It is of course hard to reflect on why this discrepancy exists in the context of the survey that the Fundamental Rights Agency undertook, but generally I think this is an opportunity where, with the data now available, we can actually address issues more holistically. You know, the discussion around tracing apps should never be binary. Clayton has already addressed some of these issues: there are certain populations that will not have those apps. I did a small survey in my family, and funnily enough my sister has not downloaded the German app, because her smartphone is not new enough. You can think of her as being more sustainability-aware, keeping a phone for a longer period of time; that is one aspect. My brother, on the other hand, immediately downloaded the app. So, you know, artificial
intelligence and data insights also give us an opportunity to approach issues more holistically, so that we don't have to talk about dilemmas between one right and the other, but can look at certain problems in a more holistic way. So, in the context of tracing apps, Microsoft itself has been more focused on the principles that we believe should underlie the use of these apps. For example, we have said that decentralised systems are certainly more privacy-enhancing than centralised systems. At the same time, there is a recognition that an app is not a panacea; it is much more a part of a broader concept of how to respond, and it is not only about the tracing app itself but about other tools that can be used, including non-digital solutions as well. So my plea here is really to take the opportunity of the situation and take a lot of learnings from it. One that I am keenly interested in is that location data is really very scenario-specific, and the results of the study have shown very clearly that people are very sensitive when it comes to location data. That should give us some insights into how it is dealt with, in the new law to be clear, but also in other laws that are currently still being discussed.

Thank you, Cornelia. David, you wanted to come in as well.

Thank you. Now, I wanted to react to two points that I heard from previous speakers. Wojciech, you gave the very interesting example of trust in the airplane; I also sometimes think about electricity, where I have forgotten from high school how it actually works, but I trust that the people doing this know what they are doing, because that checking process has already been taken care of. Of
course, in the area of artificial intelligence, which is developing so quickly, we don't have these standards yet, and many people are frantically working on developing them; Cornelia was mentioning the important political processes at European Commission level, and also at the Council of Europe, looking into how this can be addressed from a human rights perspective. The second point I heard from several of you, and I think it is important to pick up, is the purpose of using personal data, and also the context. Cornelia, you spoke a bit about the facial recognition technology issue, and on the data side, FRA published a report in November last year on facial recognition technology in the context of law enforcement. One of the main points we had to make was that we actually don't know exactly what the purpose will be for many law enforcement agencies using the data, or who would be put on watch lists. And I think it is even more important that we add some context to the discussions when using new technologies, also in the context of contact tracing apps. The survey data showed a stark difference between private companies and governments, for example, and this already gives us a first indication of context: for governments it is easier than for companies. But of course, if you ask people a blanket, quick question like this, there is a resounding "no" from people, and this quite quickly might go against the core or the essence of the right to privacy. But the point is that we already did a lot of work on facial recognition technology, and we can also use the lessons learned in this area for discussions which are now linked to the crisis.

OK, thank you very much. I wanted to ask maybe all of you, actually, but I was also interested in the point that Wojciech and Max made: we trust that the plane works. And with this data and new technology and artificial intelligence,
I just really wonder. Max, you said that in 20 years we should have the legislation so that we can trust it all works and we are all protected. Honestly, I am a bit sceptical. I mean, if we see, and I am looking particularly at the private
Facebooks etc. of the world: they are just racing ahead, and the legislation is creeping along afterwards.

Yeah, I mean, I actually see that a bit more as a mixed picture. If we look at it, it is similar in the environmental protection world, in workers' rights and so on: the Industrial Revolution came much quicker than any workers' rights, but over time we realised that there is an issue and that we need to tackle it, and we managed. And none of these fights are final; we still talk about, I don't know, airlines having air crews that don't have a collective bargaining agreement and work for less than at McDonald's. These fights are never going to be done. But I think that, fundamentally, a basic trust that your wages are paid and so on is something that we have achieved. And I think it is also interesting that we don't necessarily call for legislation that much, at least when it comes to the GDPR world; for artificial intelligence, obviously, we don't have anything yet. But the other problem that we have in Europe, and that we have to face, and which probably triggered my interest in privacy in the beginning, is the question of who actually enforces these things. When I studied in California, the joke was still: in Europe they have wonderful fundamental rights, but if you break the law, you make a profit, and off we go again, and nothing happens. And it is unfortunate that after the GDPR, at least in some member states, we see the same problem happening again. We don't have a single decision in Ireland yet. We still have... Google was just confirmed now with the 50 mil