Facial Recognition And Algorithmic Bias



I'm now going to introduce our speaker. Our speaker this month is — and I apologize if I don't get the pronunciation quite right, so correct me, but I'm going to do my best — actually, why don't you just tell us your name. I don't want to do that thing where I mangle it.

Yeah, I know it's not easy. My name is Isedua.

I'm so glad I asked; I knew I was probably saying it wrong. And how do you pronounce your last name?

Oribhabor.

Oribhabor, yes, okay. So our speaker this month is Isedua Oribhabor — did I get that right?

Yes, you did.

Awesome. She is Access Now's U.S. policy analyst, also covering business and human rights. She works to promote human rights in the digital age, focusing on the responsibility of the tech sector to respect human rights. She received her JD from Fordham Law School in New York City, as well as an LL.M. in international business law from Universidad Pontificia Comillas — which I'm sure I also mangled — in Madrid.

Isedua is going to be talking to us about a very current topic that we hear a lot about in the news, one that I thought, especially given the conversations around police justice, accountability, and reform going on right now, would make for an excellent topic to discuss: the technology of facial recognition. There's been a lot of conversation about this technology, both from people who have lionized all the amazing things it can do, and another conversation — one which we at EFF-Austin have also been tracking and following — about the many civil liberties concerns around this technology as it regards ubiquitous surveillance, the right to privacy, and so on, and about ways we can incorporate a technology like this, which has massive potential for abuse, into our society in a way that safeguards our rights.

One of the main places that facial recognition presents potential risks is actually discrimination, because repeated studies have shown that facial recognition turns out to be most accurate for white men like myself. If you are a woman, or if you're a person of color, the accuracy drops significantly. And this is not just some hypothetical concern: we've even started to have cases now of people of color who have been wrongfully arrested because a facial recognition system misidentified them. And, slightly more comically — though I still find it very on-the-nose symbolism — we've even had certain members of Congress, when their pictures were fed into facial recognition systems, flagged as known criminals with arrest warrants. I'm sure that was just a mistake of the system. So Isedua is going to lead us in a discussion of this technology, and hopefully we will all gain a better understanding

of the potential risks of this emerging technology, and how we can be sure that it is equitable and hopefully does not bake existing real-world discrimination into the ways it is designed, under a veneer of computer impartiality. So without further ado, take it away, Isedua.

It's always a mute-unmute thing, but yeah. Thank you so much, Kevin, and everyone, for having me at your meeting today. I think it's just amazing that this happens to be the moment to talk about facial recognition, right when it's something that is really important to your group here, so hopefully this conversation will be really interesting and helpful to everyone. And I'm hoping this will be more of a conversation rather than me just talking here endlessly, so please feel free to jump in with questions and comments. I think there isn't a chat function?

Apparently there is not. But I will also say that if you don't feel comfortable raising your hand, feel free to @ the EFF-Austin Twitter account; I will be monitoring it, and if you have a question there I will feed your question along.

Great. So hopefully we'll keep this as interactive as possible, but I can start by giving a bit of background about myself, about Access Now, and about the work that we do, before we dive more specifically into facial recognition. Access Now is a global digital rights organization that works at the intersection of technology and human rights, so lots of the issues that Kevin mentioned earlier — encryption, surveillance, and so forth — we work on from a human rights perspective. We're organized with a policy team, which is the team I sit on; an advocacy team; a digital security helpline, which is actually how the organization began; a team focused on our annual conference, RightsCon; and a really growing legal presence as well, which I'll talk about later. Basically, what we do is advocate for human rights in the digital age — rights mainly focused around access to the internet and access to information, freedom of expression, and privacy. Those are the main buckets of rights that we work in, and we see that the facial recognition conversation crosses a lot of those buckets, which is why it's been a really big part of the work we've done over the last few years, and growing more so over recent months.

Like Kevin said in the bio part, I work on business and human rights, so I generally lead our engagement with companies, as well as doing more specific U.S. policy work. But our team is pretty international — I think our U.S. policy team is only two, two and a half people. We're based in basically every region except Antarctica, and I think our regional focus really helps sharpen our perspectives on these issues. When we get more into the conversation on facial recognition, I'll touch a bit on what some of my colleagues in other regions are doing on facial recognition, just to get a picture of what this looks like globally.

So I'll start with a little bit of background about what facial recognition technology is,
and then we'll talk about some examples of how it's being used; the human and civil rights implications of that use; what the current landscape looks like around facial recognition; and then recommendations and responsibilities, both for governments and for companies.

So I'll start with a little bit of a definition and dive a bit more into what facial recognition is and what it does. I guess a lot of you probably know, but facial recognition generally encompasses four specific tasks: detection, analysis, verification, and identification. The first is detection — whether there is a face — so a tool or system that detects if there's a face, based on a picture or whatever the view is at the moment. Then it goes into analysis, to analyze the characteristics of that face. The parts we usually focus on when we talk about facial recognition are verification and identification. Verification is a one-to-one match: the system tells you whether, based on an ID or other information that's already been presented, this face matches that ID. Identification, on the other hand, is one-to-many: a face is presented in a picture, and the system connects that one picture to a whole database of other faces.

This technical stuff is only interesting because it helps us understand the policy implications. Another aspect of that is match thresholds. When these systems are being used, they don't give you a yes or no — you don't just have a facial recognition tool that tells you yes, this is this person, or no, this is not this person. What it will usually do is give you a score, based on whatever parameters the creators of the system have set, and also on what the people using the system have set. Those are the match scores and match thresholds, and they vary from one tool to another, but essentially the system spits out a number that tells you how likely this is to be the face or person you're looking for. The thresholds matter because the people using the tool can set them — say, I want to see any matches over 50% — and that affects the rates of false positives or false negatives you might get, depending on whether you set your threshold too high or too low. Again, that's more interesting as it goes to the policy implications, which we'll see with the human rights and civil rights harms later, but that's a very quick overview of what we mean when we're talking about facial recognition and how it works.
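To make the one-to-one versus one-to-many distinction, and the role of the match threshold, concrete, here is a minimal sketch in Python. This is not how any vendor's system actually works: the random vectors below stand in for the face embeddings a real model would compute, and every name and number is invented for illustration. The only point is that the same scores clear a low threshold and miss a high one, which is exactly where false positives and false negatives come from.

```python
import numpy as np

# Toy sketch of verification (1:1) vs. identification (1:N) with a
# match threshold. Real systems use learned face embeddings; here
# random vectors stand in for model output (all names are made up).
rng = np.random.default_rng(0)

def cosine_similarity(a, b):
    """Match score in [-1, 1]: higher means a closer match."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe, enrolled, threshold):
    """1:1 match: does this face match the claimed identity?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe, database, threshold):
    """1:N match: all candidate identities scoring above the threshold."""
    hits = [(name, cosine_similarity(probe, emb))
            for name, emb in database.items()]
    return sorted([h for h in hits if h[1] >= threshold],
                  key=lambda h: h[1], reverse=True)

database = {f"person_{i}": rng.normal(size=128) for i in range(1000)}
probe = rng.normal(size=128)

print("verified:", verify(probe, database["person_0"], threshold=0.2))

# Lowering the threshold returns more candidates (more false positives);
# raising it returns fewer (more false negatives / missed matches).
for t in (0.10, 0.20, 0.30):
    print(f"threshold {t}: {len(identify(probe, database, t))} candidates")
```

Run as-is, the loop shows the candidate list shrinking as the threshold rises — the operator's choice of threshold, not the underlying model, decides how many innocent near-matches get flagged.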
So the more interesting part is how it's being used. I think starting with COVID right now: it's a really good example of how technology is often the first solution for every problem we can think of, whether or not it's the best way to attack the problem. During the COVID crisis we've been tracking governments around the world that are turning to facial recognition as a way to respond to the crisis. One way is to track compliance with curfew and quarantine: in Russia and in India, for instance, governments have started using facial recognition to check if people are wearing masks, or if people are breaking curfew or quarantine — and you can imagine all the dangers that come with that.

It has also been used to track or identify potentially sick people. There are companies like Dermalog and Telpo that have adapted their tech to be able to sense temperature, and some airports around the world are adapting whatever tools are already there to include thermal reading along with facial recognition — so basically, when you go through the airport, they can supposedly tell whether or not you have COVID through these tools.

Another way we're seeing it used is as an alternative to other methods like fingerprinting. For instance, here in New York, where I'm usually based, the NYPD had been using fingerprinting at their headquarters for people to get in and out of the building, but because of COVID and concerns about spreading the disease, they stopped using fingerprinting and turned to some form of facial recognition instead.

And finally, another way we see it is with contact tracing. I'm sure all of you have heard of the very lovely company Clearview AI and all the trouble they got into this year. After the huge scandal of Clearview scraping the internet for people's pictures and information, putting basically everyone in their database, and the fallout that came from that, they were reportedly in talks with the U.S. government to provide some sort of contact tracing solution. When you think about who you want holding your health data as well as every form of picture there is of you, Clearview is not necessarily the company you'd think would be best for that. So the COVID crisis, I think, has really made it evident how facial recognition technologies can be used.

Another big thing this year that's made that very clear is the BLM protests that have been happening. In the U.S. we are very protective of our First Amendment rights, so you don't necessarily assume that when you go out in public to a protest, you will be tracked with a tool that's not just a security camera, but something able to detect a face within a crowd of hundreds or thousands of people. The protests have given new attention to the use of facial recognition specifically by law enforcement, and Kevin mentioned earlier that there have already been reports of people being falsely arrested because of facial recognition — some of that comes from how people have been tracked through protests. As we get into the human rights and civil rights implicated by that, it's very obvious that people's right to peacefully protest should not be impeded, and there are real concerns about invasive surveillance technologies being used in public spaces.

Law enforcement use is probably among the most egregious forms of it, because the risks there are to your life and liberty, but it's also being used for commercial reasons, in the private sector. For instance, in some housing complexes, even here in New York City, facial recognition has been employed as a way to let people in and out of their buildings.
And this is particularly worrisome because, in this specific example, it was a building in Brooklyn, in an area with a high minority population — and again, as Kevin mentioned earlier, this is a technology that is fantastically terrible at recognizing the faces of people of color and of women. So we've seen it being used for things like housing, and for employment as well — in hiring practices, as a way to screen applicants.

But then there are also some supposedly benign or even beneficial uses of facial recognition, which is why companies and governments usually push the idea that it can be positive. One example is the Facebook tool that allows people with disabilities who are visually impaired to understand and go through pictures and visual content on Facebook through a facial recognition tool. And in areas such as fighting child pornography and exploitation, it's been used as a way to stem those practices. So although there are some of those beneficial uses of facial recognition, I think the

important conversation for us to have is what the risks are to our human and civil rights. Right off the bat, it's very clear that our privacy is at risk when you can't go to a protest without worrying about facial recognition technology being used to surveil you — particularly in the U.S., because we don't have a comprehensive data protection law. Even when this system is being used in the most ideal of circumstances — when it's perfectly accurate, when it doesn't discriminate — there's still the danger that whatever company or agency is holding this information suffers a data breach. And again, because we don't have a comprehensive federal data protection law in the U.S., the risks are grave when people's right to privacy is compromised, because there's essentially not much remedy.

In addition to that, specifically when it comes to law enforcement and police use of facial recognition, there's the risk of exacerbating already discriminatory practices, because we find that these tools are often being used in already over-policed neighborhoods and communities, which are generally communities of color. So when a tool that is more likely to misidentify a person of color is being used in a community of color, you can imagine the dangers that are exacerbated. That's why there's no perfect scenario that will reduce that risk. Companies will try to say, look, we can make this tool as accurate as possible — and even within the last four years, I would say companies like Amazon and Microsoft have made their facial recognition tools much more accurate. The example Kevin was talking about earlier, of members of Congress being misidentified — I believe it was Amazon's tool that the ACLU ran that little study or experiment with. After that came out, Microsoft, IBM, and the others exponentially improved the ability of their tools to recognize female faces and faces of color. But just because a tool becomes more accurate doesn't mean it actually removes discrimination from the picture. The underlying issue ends up being where these tools are used, particularly when they're used by law enforcement. Is a police department more likely to use facial recognition in a poor neighborhood in Brooklyn than on the Upper East Side in Manhattan? If that's the case, it doesn't matter if it's 100% accurate at recognizing every kind of face, because it just continues to perpetuate the discriminatory practices that already existed.
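The arithmetic behind that deployment point can be made concrete with a toy calculation — all numbers below are invented, and no real system's rates are implied. Even a system with an excellent headline accuracy produces false matches in proportion to how many faces it scans, so pointing it at one heavily scanned neighborhood concentrates those errors there:

```python
# Toy base-rate arithmetic (every number invented for illustration).
# A 99% true-positive rate and a 0.1% false-positive rate sound very
# accurate, but the count of innocent people wrongly flagged scales
# with how many faces are scanned -- and scanning is concentrated
# wherever the cameras are actually deployed.

watchlist_hits = 10          # actual watchlisted people among those scanned
true_positive_rate = 0.99    # fraction of watchlisted people correctly flagged
false_positive_rate = 0.001  # fraction of everyone else wrongly flagged

for scanned in (1_000, 100_000):            # light vs. heavy deployment
    innocents = scanned - watchlist_hits
    true_alarms = true_positive_rate * watchlist_hits
    false_alarms = false_positive_rate * innocents
    share_wrong = false_alarms / (false_alarms + true_alarms)
    print(f"{scanned:>7} scanned: ~{false_alarms:.0f} innocent people flagged "
          f"({share_wrong:.0%} of all alarms)")
```

With these made-up rates, the lightly scanned area produces roughly one wrongful flag while the heavily scanned one produces about a hundred, and most alarms there point at innocent people — identical accuracy, radically different harm, depending purely on where and how much the tool is pointed.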
So the current landscape right now is a bit of a mixed bag, but it's still a good time for these conversations to happen. Because of the protests, and because of other advocacy by civil rights organizations and other groups, we've seen some really important moves on facial recognition in the last few weeks and months. Even just today, the Court of Appeal in the UK ruled on a facial recognition case, essentially saying that the police department there was improper in its use of facial recognition in public spaces. There are other implications in that ruling, but just to have a high court in Europe make such a decision is a really big deal.

And most notably, I'm sure some of you saw the announcements by Microsoft, IBM, and Amazon about a month or so ago, saying that they would either stop, or put a moratorium on, the sale of their technology to law enforcement. Those announcements were great, particularly Amazon's. As I said, in my work, most of what I do is trying to engage with companies, and to their credit, we've talked with Microsoft, we've talked with IBM — Amazon will never respond to you on anything.

So to see a company like Amazon actually make such a concession was a really big deal. Ultimately, it doesn't mean much, because just because they're not selling to police departments doesn't address use by ICE, for instance, or by other governments outside the U.S. So while it's positive to get these statements from Microsoft, IBM, and Amazon, there are still a lot of questions to be asked — particularly when the statements call for a one-year pause or moratorium and ask the U.S. government and lawmakers to come up with laws to regulate the use of this technology. We have to keep in mind that, at the end of the day, these companies still have a stake in being able to use this technology. That's why you'll see a company like Microsoft make really appealing and commendable public statements and commitments about how they won't use or sell this, and at the same time support legislation at the state and local level — or try to kill legislation at the state and local level — that would hold them accountable. So a lot of what we're doing now is trying to hold these companies accountable and ensure that their public statements match their private lobbying and activity.

Also, particularly in the U.S., some of the more exciting things we've seen happen around facial recognition have been very much at the state and local level rather than the federal level. I think there's a federal bill that was proposed very recently; it's not going anywhere because of the current administration we're under. But at the state and local level — in California, as has been mentioned, San Francisco and Oakland, and I think also in Massachusetts and Maine — there are places all over the country where local governments are imposing bans on government and law enforcement use of this technology, which is really surprising and exciting.

And I would just say, for our part, Access Now's stance is that any use of facial recognition that can be used for mass surveillance should be totally banned. Enabling mass surveillance is not anything any government should be involved in. And it's really not far-fetched: when you think of mass surveillance, you can think of the Uyghurs in the northwestern region of China and the sort of mass surveillance they're under, and you think nothing like that could ever happen here in the U.S. Yet what we don't often pay enough attention to is the fact that a lot of the infrastructure that would be necessary to make that happen already exists.

In cities and locations where there are already extensive video camera surveillance capabilities, it's just one step to outfit those technologies with facial recognition, and we'll find ourselves under more of a mass surveillance situation. So one thing we're really pushing for is that any use of facial recognition that could lead to mass surveillance should be totally banned — companies and governments have no reason whatsoever to turn to that as a solution. And beyond that, among civil society, not just in the U.S. but internationally with the partners we've worked with, there's maybe not necessarily agreement, but generally there seems to be a feeling toward a ban, or at least a moratorium, on police use of facial recognition.

Hey, sorry — I was trying to get your attention about interrupting. I had promised you that I would throw a softball, but it's not happening, because I had a better question. So, it seems like companies like Amazon and Facebook respond in a moment where the whole country is engaged. And the whole country, at least in the U.S., seems to engage mostly on questions of freedom — you know, my right to say what I want, to do what I want, within reason — and even in the face of a public health crisis, people really latch on to that idea. But privacy doesn't seem as important in the U.S., whereas other countries, particularly in Europe, seem to really prioritize it. So I was wondering — it's sort of a two-pronged question, and a little bit of a wishy-washy one too — why do you think privacy doesn't seem to be as interesting to the average American, where those other issues of freedom do grab attention? And do you see a way to attach privacy rights to those more palatable rights, as a way to push forward privacy agendas, push back on racially biased facial recognition software, and that sort of thing?

Wow, thanks for that softball, Alex; why don't we just dive right into it. You're welcome. Right, so I mentioned that one thing that gives the organization a unique perspective is the fact that we work across the globe. The other members of my team who work really closely with me on facial recognition are based in Brussels, and it's really fascinating to hear the conversations we have about what's possible at the EU level versus at the U.S. level. I think Alex is absolutely right that here in the U.S., freedom of speech is a bigger and much more palatable right for us to focus on than privacy. Personally, I think it's because the dangers of infringements on your right to privacy are maybe just not quite as obvious, and because we, as a capitalist society, have a lot of trust in companies. We know you don't get anything for free, so you understand: yes, maybe they will collect some information about me, but at the end of the day they're doing me a service. So we have a sort of implicit trust in companies, even when it seems like we don't, and I think that really contributes to it.
It would also take a greater understanding of what the risks actually are. For instance, when there's a big data breach, everyone is really upset about it for about two minutes, and then, unless you see something actually change in your personal life, you just go on as before — maybe you change a couple of passwords here and there. There isn't as much outcry about that as there is for something relating to expression or speech.

And again, I also think that's because we've come to a place where we trust companies with our personal information. You don't think twice about all the information you're giving out to a company when you sign up for an app or a service. A good example is something I noticed recently with TikTok. I was looking at TikTok's privacy policy, and they actually have at least four privacy policies: one applies to users in the U.S., another to users in Europe, another to users in the rest of the world, and then there's one specifically for underage children. Just comparing the policies for users in the U.S. versus Europe, you see how much more information TikTok is able to collect about you because you're based in the U.S. rather than in Europe. And it's something we don't really have a choice about on the user's side — it's not like you can read a hundred-page privacy policy statement and your only options at the end are yes or no in order to get the service. So there's very much a reliance on companies, and a sort of acceptance that this is how it is if we want the services we want.

As for finding a way to make privacy a bit more palatable: I think it just has to be public awareness, to help people understand just how invasive these technologies are. It's not just a regular surveillance camera out in the street, but a camera that can track you from one location to another and collect information about you from a bunch of different sites. And I think the protests and what's been happening over the last few months are probably helping a bit with that. The level of outcry certainly hasn't reached other issues, but just the fact that a company like Amazon would respond to what's going on shows that they're aware people are taking privacy a bit more seriously than they would have otherwise. So I don't really have a good answer on how to make us here care more about privacy, except that the more our privacy is violated, and the more we see that we don't have as many rights as we thought we did, maybe that will wake people up to the dangers.

And the Amazon piece brings me to another point about why the Amazon move was really significant. Just a month before Amazon announced that they would stop the sale of facial recognition to police and law enforcement in the U.S., a group of shareholders that I work very closely with had a proposal at the annual general meeting that was exactly about that — and of course Amazon voted it down. So it's the same thing: they don't listen to civil society, they don't listen to NGOs, they don't consider shareholders. But they did listen to the protests and the public outcry that happened around the beginning of June. I think that gives a little bit of hope that if Amazon listens, other companies will listen as well, as long as we have a unified message that we're presenting to them.

Isedua, sorry — I know you wanted this to be a discussion; just tell me if you want me to shut up at any point. I won't take it personally.
So I assume — and I could be totally wrong — that you have experience with, perhaps, FOIA requests about the types of data that various agencies are collecting from, you know, Facebook, Amazon, whoever. To what extent is that data available to you? For a public figure, for example: is demonstrating the amount of insight you can get into a politician's life a viable way forward, or are there too many roadblocks?

Yeah, so actually, I haven't done a lot of FOIA requests myself, but from my colleagues on the legal team, I think that it

hasn't been quite as much of a smoking gun as we would expect, just by virtue of how long it takes to actually get a response on something. So I don't know that FOIA requests have been very good, at least not for us, in that way. One thing we do focus on, though, is transparency reports. We advocate very strongly for tech companies to release reports that show governments' requests for user data, requests to restrict accounts, and things like that. Now that more companies are doing transparency reports, it's really interesting to see the trends: which governments are asking for data, and how are they doing it — is it through court orders, or through other means? Getting more information out to the public has been good, because we've seen more interest from non-traditional tech companies. Usually the companies doing transparency reports would be the big tech giants, like Facebook and Google — Google actually started in 2010 — but over the last couple of years, and specifically just in 2020, we've seen other companies start to feel the pressure to do reports as well. A good example is Zoom, earlier this year. So FOIA requests for data, to my knowledge, haven't been very helpful in that way, but transparency reports, I think, are a good way to go, because they're essentially a way for companies to cover their butts and say: well, if we gave any information, it's because the government forced us to, and here are the requests we got. Then it's up to the public to ask why, for instance, the U.S. is asking for hundreds and hundreds of accounts to be taken down. But maybe FOIA is something we could look at a bit more, especially when it comes to trying to persuade lawmakers in the U.S. to take this seriously. Hopefully the protests and more public attention right now will push that needle. And actually, I figured this was a good time, before I get to the next part, to stop and have a conversation here — questions, comments?

Yeah, absolutely. I think I can follow up with Alex in giving you possibly a non-softball question — we like rigorous debate at our meetups here. So my question for you is this. Speaking of the U.S.'s tradition of constitutional case law when it comes to thinking about privacy in this country: a framework that has been used for a long time is the reasonable expectation of privacy. Anytime we're trying, in this country, to decide whether something is a violation of our privacy or not, the question is whether there was a reasonable expectation of being private, and especially in public spaces this has usually been interpreted as: you don't have an expectation of privacy when you're out in public. I'm curious if you think the realities and the scale of facial recognition technology are really going to force us to revisit the legal justifications we're using when we think about privacy. You know, one could easily imagine somebody putting a bunch of facial recognition in public spaces, with law enforcement and city officials having access to these systems,
and if you want to get mad at it, you could easily see the counter-argument being: oh well, you were out in public; you had no reasonable expectation of privacy. I think you'd agree we're probably going to need to build some new legal arguments and tools to push back against this, because the old one doesn't really seem to work with this technology anymore.

That's why, honestly, the first step is for the U.S. to have a comprehensive data protection law, because the idea of what privacy is under the reasonable expectation of privacy is a totally different thing. Yes, someone could recognize you, but the fact is that you could go across the country and be traced — even in what should technically be private spaces. You're walking down the street on your own block: do you have an expectation of privacy there? But then there are people with their Amazon Ring doorbell cameras that can capture your face. So I think the internet has really changed the game on some of the traditional

legal arguments and reasoning that we use to understand things like privacy and free expression, because the potential for harm here is something I think we just didn't anticipate when we were initially thinking about what privacy is. So yeah, I totally agree that the reasonable-expectation-of-privacy argument isn't going to do much here. Just because you shouldn't expect the same amount of privacy out at a protest as you would inside your house doesn't mean you should be exposed to the sort of privacy implications of things like facial recognition. Keep in mind, it's not just someone recognizing you; it's information about who you are being stored by a government agency, or by a company — and especially when it's being stored by companies, what are they doing with that information? When it gets compromised in some way — this is literally your face; it's your identity. You can't get a new one the way you can get a new driver's license.

Not yet, anyway. Biotech's not that advanced yet.

Not yet, exactly. So until the time comes when I can get a new face, this information that's being collected — sensitive isn't even a strong enough word for information about who you are. I think it really undercuts what we initially thought privacy looks like. I'm curious to hear other people's thoughts on this, if you think there are any other legal frameworks we could use to understand this level of privacy.

Before I hand it off to other people, I had one more — well, unless Alex was about to say something; I can't quite tell with the feedback. I had one more question for you before I try to poll the room and get some other questions for you. Another thing I hear very often when I try to have this conversation with people who aren't taking this seriously yet, about what a big deal this technology is: I will often get, from certain individuals of maybe the think-they're-very-clever but also kind of cynical persuasion, well, the cat's already out of the bag on this; there's nothing we can do about it. They'll point to the fact, as you alluded to with Clearview AI earlier, that we live in a world now where there are apps and databases and systems where any citizen can have an app that they aim at any random person on the street, and now it's not just the NSA who knows who that person is — everyone knows who that person is — and that's just how it is, and we're all just going to have to adapt. What do you say to that person?

First of all: how dare you. Well, I agree — obligatory Office reference there — but also: just because this is the world we live in doesn't mean this is the world we should be living in. And keep in mind that this is just here in the U.S. The same companies — whether it's Microsoft, Amazon, or Facebook — that operate here operate in other parts of the world, and where they have to adhere to stricter
regulation and legislation, they do it. So if they can do it in Europe, why can't they do it here? And again, the TikTok example, of having a bazillion different privacy policies based on what region you're in, shows that companies will try to get away with as much as they can, and it's up to us not to let them.

So yeah, again: just because Clearview exists doesn't mean it should. And just because we already find ourselves at a time when basically anyone can be the holder of that kind of information doesn't mean that's how it should be. There are ways to rein this in, even on the company side. I mentioned investors earlier — investors are taking this seriously. So if investors continue to ask companies like Amazon and Microsoft to rein in this technology, I'm really confident that they will. When groups like that, with a lot of clout and power, care, I think we should be quite hopeful that it can be changed, rather than just rolling over and letting it happen — because it doesn't have to be this way.

I agree. All right, now I'm going to start asking the room if they have any questions, and if nobody besides me and Alex chimes in, some of you I know may get put on the spot — but hopefully some of you will take the hint and ask a question.

I actually have a question. I've been having a hard time trying to frame it, but regarding the data that results from facial recognition surveillance: would you have any information on whether there is a common entity that's storing that data? Is it stored by the people that collected it? Do people use a specific data center to store facial recognition data specifically? How does that work?

Yeah, that's a great question, and a big part of the reason we're so frustrated is that we don't have a lot of the answers to that. Oftentimes these technologies, and the companies creating them, are super opaque about what their practices are for storing data, for keeping it safe, and things like that. Generally: if, let's say, Amazon's Rekognition is being used by a police department, presumably it would be the police that have access to that data, wherever it's stored — which they won't tell us — but probably Amazon does too, in some way, shape, or form. We just don't know the extent of what those agreements look like, or what the real-world practices for storing information are. For law enforcement use that's really bad, but it matters for commercial use as well. If you go to Target, and Target has a facial recognition system that they're using to, say, prevent shoplifting: who holds that information? Is it Target — do they just create their own database, essentially a blacklist of people who shouldn't be allowed in, and then cross-check it? Or are they cooperating with law enforcement? We find that even for uses that are supposed to be confined to one store, or purely commercial, there will oftentimes be overlap with police use as well. So there are just a lot of questions about how this information is stored and where, which is a big part of the frustration.

So if you could take a wild guess — an educated guess, using the knowledge that you have — would you say it's more probable that they're basically enlisting someone else to do this service, or that
a lot of these companies actually have their own shell companies — it's their company, but on paper it's some shell company they own — that's holding the data?

Yeah, maybe both. There are examples of companies that will just provide a straight-up tool — here's the facial recognition tool; to the customer, the client: use it as you see fit, and we're totally separate from that process. But then there are other companies that will provide, along with the tool, help building your own database, and if they're doing that, they also have access to it. And yes, it's also possible that they'll try to distance themselves from the tool itself by having shell companies, or different layers of companies, hold that information. It's just another thing we don't know. I was going to talk later about some of the avenues that we and our partners have been using around the world to fight facial recognition, and one of them is simply this: one thing civil society wants is to be able to map how this technology is being used, because there are lots of things we don't know, and we usually find out, when something really terrible happens,

that facial recognition had been used there. So there are some efforts being made around the world to map how facial recognition is being used in different regions, as well as litigation — Alex was talking about FOIA earlier. One of my colleagues, based in Argentina, has used the court system to try to get a bit more information about some companies using facial recognition in Argentina. They're essentially bringing a case to court, asking the judge to force the company to, first of all, admit that it has been using facial recognition, and then to turn over the records around it. So there are different ways we have of hopefully getting at that information: litigation, and trying to map based on the reports we have.

Great. Who else wants to be in the spotlight? Anyone? Not to put you on the spot, but you usually have cool questions, if you're there.

Yeah. I guess the only area I'm thinking about is more what's going on internationally. It sounds like the situation in the U.S. is at a startling point right now, and the conversation about it is kind of held up by the administration and different situations. But what's working overseas? What arguments and points are resonating with, say, Europeans, or with people in any other country, that are starting to build support for privacy legislation?

Yeah. So, what's working: I'd say for our Latin America partners, it seems like litigation. In Latin America, and in some parts of Asia, litigation is a growing means they have of doing this. I mentioned the case in Argentina; we were also involved in a case in Brazil, where we intervened in a court case that had to do with an emotion and gender detection tool, which is just super problematic. There we were able to make arguments based in international human rights law — the right to non-discrimination and so forth — against emotion recognition and gender recognition: first of all, you're forcing people into a binary, and

then secondly, beyond that, how can you even detect someone's gender? So litigation seems to be the popular avenue in Latin America and, frankly, in other parts of the world where the government isn't necessarily going to be on your side or supportive of civil society — there, the courts have been one way. Whereas in Europe we see more government engagement: all of our European partners are doing consultations, advocacy, and lobbying with politicians there. And there are talks of mandatory human rights due diligence legislation that may be passed in the EU, which would have implications for this as well — companies providing facial recognition technology would have to do human rights impact assessments on the impacts this technology has on different communities.

That's awesome. Oh yeah, I wanted to clarify part of my question. What I was interested in was the public arguments that are working — which is consistent with what you've been talking about — but I don't think I've heard much about just the messaging. What's resonating, messaging-wise, overseas?

Yeah, so I'd say that for Europe, it's very much grounded in a lot of the privacy regulation that already exists there: that this is an infringement on the right to privacy guaranteed under the GDPR and under other European law. So the privacy argument is very much working there. In other parts of the world, especially where litigation is concerned — for instance in India, and I think in Kenya — they have other kinds of biometric data collection and ID that are tied to receiving social services. So there, the argument that seems to be working, based on some court cases, is that the right of access to services is being infringed by this very invasive technology. For Europe it's generally privacy; in other parts of the world it's more the right to non-discrimination and access to public spaces and services. That seems to be good messaging. For the U.S., I'm not sure how we could package that.

Thank you.

Do we have anybody else who would like to ask a question here, or shall we let Isedua move on to the next part of her discussion?

Yeah, I can move on, because the next part is pretty short — just talking about responsibilities for companies and for governments, and what they should do. When we're talking to governments, the things we want them to consider are: a ban — a total ban, especially on government use of facial recognition — or finding less invasive ways of achieving the goal. Oftentimes these technologies are used because there's some goal, like improving efficiency, whether in policing or in government, where it's tied to the provision of social services and welfare services, as a way to ensure that the people receiving services are who they say they are. There, we're encouraging governments to find less invasive means of achieving those ends, whatever the ends may be, and then to implement strict limitations on how

government or commercial use of this technology happens. And specifically for the U.S. context, a data protection law is a good place to start.

Then for companies — and for governments as well — there's the human rights impact assessment. As I said a moment ago, in Europe they're probably fairly close to mandatory human rights due diligence legislation, which we haven't really seen elsewhere, so that would be a really impactful way to make companies and governments understand how this technology is affecting people's daily lives. Transparency from companies — in their contracts and in their algorithms — would be really helpful for us in answering some of the questions I think David was asking earlier, about where data is stored and how it's stored; that, again, will help us protect our right to privacy. And finally, it's understanding that there are some uses of this technology that are just not acceptable: you could have all the safeguards in the world, and things like mass surveillance would still not be an acceptable use of facial recognition. So it's about creating the messaging to understand: which parts of this technology are we comfortable having used, and if there is such a part, what safeguards can we put in place to ensure that people's rights aren't violated? And which parts of this technology are not even worth implementing safeguards for, but should just be totally banned instead? That's where we are now.

I have a quick little interjection there, because I now find myself curious: what do you find might be the acceptable uses of this technology — and maybe the answer is none — but when you talk about us trying to figure out which spaces might be okay versus not, do you have any personal thoughts, like a rubric we can use when trying to solve that admittedly very difficult question?

Yeah. So there are things like, if you have an iPhone, unlocking your phone with Face ID — I think that's what they call it — which is probably a much more benign use. I mentioned earlier the example of the Facebook tool that helps people with disabilities access some of Facebook's services, as well as things that would prevent fraud — I think Facebook also has, or at least was piloting, a tool that would alert you if your face had been uploaded by another user, as a way to prevent spoofed or fraudulent accounts. So there are some of those uses that are probably more acceptable, but again, they would need strict limitations and safeguards around how they're used, where that information is stored and how, more transparency to users about that, and ways of getting consent. And I think the consent piece is probably the thing that will trip up most of those uses, even if they're benign, because whoever is using the tool can provide consent, but oftentimes the people impacted by facial recognition are not
the users of facial recognition. So how do you get consent from people going to a protest? Do you consent to having your face scanned just by showing up? It's just not possible to get consent under that scenario. So yeah, there are probably some acceptable uses, but when we start talking about things like consent and safeguards, it may turn out that there are just not very many of those uses that would actually be acceptable.

That's the end of what I prepared, so more questions are welcome. Sorry — go ahead, Kevin.

Oh, you said something that made me think of another question, which is: even as far as the people who want to push facial recognition as an easy way to prove identity — you know, in a world where we're seeing exponential improvement in fake image generation technology, a world of deepfakes that are starting to look completely real — that just strikes me as tech that is eventually going to completely break any use of facial recognition as a key to prove your identity. So I'm curious about your thoughts. We often talk about the weaknesses of biometric locks anyway — you alluded to it earlier: you can't change your face, and if it gets stolen, it's stolen. And I guess this is all a segue to asking: is making what seem like obvious arguments like that, about why this technology isn't going to work in most of these cases, a winning argument to make to the people who are pushing this stuff anyway? Because I'm not sure whether they aren't getting these massive flaws in these systems, or they get it and they just don't care.

Yeah, it's probably a little bit of both, I'd say. So there was this guy from a company who was on a call with a bunch of us civil society people, and we were basically peppering him with all our questions about facial recognition and how his company could do this, and he said: well, for instance, when I went to the airport the other day — I travel all the time, so I have this Global Entry or whatever — there's this facial recognition tool that recognizes my face, so I'm not using a passport or anything, and I'll be fine. But then one day the thing wasn't working; it couldn't recognize his face — and this guy is a white man, and it couldn't recognize his face. So: I understand there could be problems with that, and as someone rightly pointed out to him, you are a privileged person who has ten different forms of ID, so the fact that this tool couldn't recognize your face didn't keep you from entering the country; it just meant you had to go through the line with, you know, the other poor people, and show your ID. So I think it's about trying to make it clear to these companies and the people using this technology — I think sometimes they just really don't understand the implications it could have for people who are already marginalized

and at risk. If your entire livelihood, and whether or not you get food, depends on this tool being able to recognize who you are accurately, and you're a person of color — for whom the rate of misidentification goes up — there's really nothing else you can do. And on the issue of deepfakes and other ways of spoofing this: in response, we've actually seen facial recognition improve to such a degree that there were reports of protesters in Hong Kong, I believe, who at first wore masks as a way to trick the facial recognition systems. That worked for a while, but not so much anymore, now that you have things like gait recognition included along with facial recognition. So the answer from the company side seems to be just to make this as accurate and as good as possible, rather than looking at what the actual risks are. One thing we're honestly trying to do is push these companies: before you roll something out, talk to the actual affected communities to understand what the impacts are for them — not just what you think the risks will be, but the actual impacts. Hopefully that will help some of these policymakers and toolmakers see that it's not just an issue of, okay, this didn't work, so you have another option if this tool isn't working well for you — for some people, this could really be it, and it could affect them in serious ways.

Yeah, it sounds like a very similar line of reasoning to the people who, on voting access, are like: oh well, you have this other form of ID; you'll be fine. And just to add a personal anecdote to your privileged-white-man-flying situation: I also, as a privileged white man, have a flying story. I was flying to France, and they wanted to verify my identity with a facial recognition system on Air France, and I said: surely I can identify myself by some other means — I have my passport right here. And I was basically told: the only acceptable form of identification is your face, and you will submit to that or you will not be flying on this plane. So you talk about consent, and I'm like: they are increasingly not giving anybody consent on this stuff.

But see, the difference, I think, is — how long ago did that happen?

It would have happened about a year ago.

Okay, so that was while the GDPR was in force, so you could probably challenge that, and have a real chance of winning — in Europe. In the U.S.?

Well, yeah. And admittedly, from everything I knew, given who I am, and what I'd read of EFF's research on refusing these systems at airports, I was actually pretty confident that if I'd really wanted to make a stink, I could have insisted Air France was wrong and had to accept my passport as identification. But by the time I did that, I'd have missed my flight. So I really didn't have a choice in the moment, even if legally I did have a right.

That's not meaningful consent, then. Again, how can you meaningfully get consent from someone when
this is the only option they have?

Yes, exactly. You know, I stuck my face in the system and thought, well, probably some system already has it anyway, and had to just resign myself to that. There was one other thing I wanted to ask you — yeah, one other question before I try to cajole questions out of other people — which is sort of getting back to effective arguing tactics, on trying to

get people to understand this. I find that a lot of the time, the people who seem most in favor of this technology are citizens who, for whatever reason — and I try not to judge anybody's individual reasons or emotional reactions — get a feeling of safety from facial recognition technology: this tech is going to make things safer, bad people are not going to be in places, they'll get caught, and so on. And then the other people who tend to be in support are politicians cynically exploiting those fears to gain power for their side, their team, their whatever. Once again, the answer seems obvious to me, but I'm curious about your thoughts on persuading people, where I'm just like: this tool may make you feel safe right now, because your team's in charge, but once other people are in charge, you've just handed them the very powerful weapon you were using on the people you don't like — and now it's going to get used on you. To me it seems so obvious why you wouldn't want to take that risk, but a lot of people seem to think, whatever, I'm winning right now. I just wonder how we can break out of this tribal mindset and get people to understand: no, this goes beyond winning your little battle right now; this is about protecting all sides in the future.

I think, yeah — that's probably the best thing you can do. Actually, one example that comes to mind: I read this article in the New York Times about a month or two ago, about this really rich guy in San Francisco who was like, I'm going to solve the problem of homelessness and crime in San Francisco by putting facial recognition all over the city, so that we can catch when people are not where they're supposed to be. And it doesn't say who is not supposed to be here, or what the person who's not supposed to be here looks like. So beyond the issues of discrimination that can arise, I think one thing that should give everyone pause is: who's the one making the decision? You might live in a neighborhood where there's facial recognition and you feel safe, but at the end of the day, that means you yourself are also subject to it — it's not just the people you don't want who have to go through the system, but you as well. And are you comfortable with the fact that, in this specific example, it's one random billionaire who has all this information about everyone? So for me it's about trying to make people understand the different levels of trust you have to have to believe that this is a good idea and the best thing for your safety: you're trusting that the institution collecting this information is there for you, and you're trusting that if something goes wrong, law enforcement or whoever will be on your side. And again, I think it goes back to our comfort with giving our personal information to companies — the fact that there could be a company that has
information about your face — when you're leaving your house, when you're coming back, who you're coming with, how often you're going out — and because it makes you feel safer, you're okay with that. So maybe I don't have a good answer to this, but it seems, right off the bat, that it starts with things like understanding that it's not just other people


