Great, hi everyone. Before we get started, I want to invite folks to move in a little bit. We had to switch rooms to accommodate the live stream, but to take advantage of the fact that we have a smaller crowd, it would be great if you want to come in a little closer; there are a bunch of seats down here too. Neither of us bites. It's more like a conversation.

Good afternoon, everyone. My name is Shobita Parthasarathy. I'm professor and director of the Science, Technology, and Public Policy program here at the Ford School of Public Policy. STPP, as it's known, is an interdisciplinary, university-wide program dedicated to training students, conducting cutting-edge research, and informing the public and policymakers on issues at the intersection of technology, science, ethics, society, and public policy. We have a very vibrant graduate certificate program and an exciting lecture series.

Before I introduce today's speaker, I want to let you know that next term our speakers will explore the themes of health activism, prescription drug patents and pricing, and graduate STEM education. Our first talk, on January 22nd at 4 p.m., is by Layne Scherer, a Ford School alum who is now at the National Academies of Sciences, Engineering, and Medicine, and she'll be talking about graduate STEM education in the 21st century. If you're interested in learning more about our events, I encourage you to sign up for our listserv; the signup sheet is just outside the auditorium. Even if you are already on our listserv, please do sign in there as well, because it gives us a sense of who's been able to come today.

Today's talk, "Show Your Face: The Pros and Cons of Facial Recognition Technology for Our Civil Liberties," is co-sponsored by the Center for Ethics, Society, and Computing and the science and technology policy student group Inspire, as part of their themed semester on just algorithms. Inspire is an interdisciplinary working group run by STPP students, but it is open to all graduate students around the university who are interested in science and technology policy.

And now to today's speaker. Mr. Christopher Calabrese is the vice president for policy at the Center for Democracy and Technology. Before joining CDT, he served as legislative counsel at the American Civil Liberties Union's Washington Legislative Office (don't try to say that ten times fast). In that role he led the office's advocacy efforts related to privacy, new technology, and identification systems. His key areas of focus included limiting location tracking by police, safeguarding electronic communications and individual users' internet surfing habits, and regulating new surveillance technologies such as unmanned drones. Mr. Calabrese has been a longtime advocate for privacy protections and limits on government surveillance, advocating for the responsible use of new and developing technologies such as facial recognition.

This afternoon he'll speak for about 15 minutes, giving us a lay of the land; then he and I will chat for about 15 minutes or so, and then we will open the floor for questions.
Please submit your questions on the index cards that are being distributed now and that will be distributed throughout the talk. Sujin Kim, our student assistant at STPP, will circulate throughout the room to collect them. If you're watching on our live stream, you can ask questions via the hashtag #STPPTalks. Claire Galligan, our wonderful Ford School undergraduate research assistant, and Dr. Molly Kleinman, STPP's program manager, will then collate and ask the questions.
I want to take the opportunity to thank all of them, and especially Molly and Sujin, for their hard work in putting this event together. And now, please join me in welcoming Mr. Calabrese.

Thank you. Thanks to Shobita for having me, thank you to the Ford School for hosting, and thanks to all of you for coming. This is obviously a topic that I care a great deal about, so it's really exciting to me to see so many people who are equally interested. I think these are really important topics. As we incorporate more and more technology into our lives, we need to spend more time thinking about the impact of that technology and what we want to do with it, and face recognition is a really great example. It's powerful, it's useful, and it's often dangerous, like many technologies.

This is a technology that can do so many things. It can find a wanted fugitive from surveillance footage. It can identify everybody at a protest rally. It can find a missing child from social media posts. It can allow a potential stalker to identify an unknown woman on the street. This is a technology that has the potential to impact, and is already impacting, a wide swath of our society. That's why it's gotten so much attention: we saw a ban on face recognition technology in San Francisco, we've seen a number of lawmakers really engaged, and we as a society need to grapple with what we want to do with it.

Before I get too deep into this, just a word about definitions. I'm going to talk about something fairly specific: face recognition, which is taking a measurement of someone's face, how far apart their eyes are, how high or low their ears sit, the shape of their mouth, and using that to create an individual template. That template is essentially a number that can be used to go back to another photo, do that same type of measurement, and see if there's a match. So it's literally a tool for identifying someone. It can be a tool for verifying the same person: if I bring my passport to the passport authority, they can ask, is the person in the passport photo the person standing in front of me? Or it can be used as a tool for identifying someone from a crowd: I could pick one of you, run a face recognition match, and see if I can identify particular people in this room based on a database of photos that the face recognition system runs against. That's face recognition, and that's what we're going to talk about.

There are a few other things I won't talk about. One of them is face detection, which asks, literally, is there a person standing in front of me? We might use that to count the number of people in a crowd, or to decide whether to show digital signage on a billboard, and that's usually less problematic. There's another type of technology I won't talk about called face analysis. Face analysis is looking at someone's face and trying to make a determination about them: are they lying, are they going to be a good employee? This technology doesn't work. It's basically snake oil, which is part of the reason I won't talk about it, but you will see people trying to sell this concept that we can take pictures of people and learn a lot about them. But I can tell you that face recognition does work, and it's something we're seeing increasingly deployed in a wide variety of contexts.
So I've already talked a little bit about what face recognition is: this measurement of people's faces, turning that measurement into a discrete number that I can store in a database and then compare against other photos, taking that same measurement again and seeing whether I've identified the person.

There are a couple of things you need to understand if you want to think about how this technology actually works and whether it's going to work. The first is a concept we call binning. Binning is literally putting people in bins, putting them in groups. It turns out, and this is pretty intuitive, that if I want to identify someone, it's much easier if I know they're one of a hundred people in a photograph, one of a hundred people in a group, versus one in a million. It's just a much simpler exercise. So that's one thing to keep in mind as you hear about face recognition: think not just about the technology that's taking the measurement of your face, but about the database being pulled in from outside. The size of that database is hugely important for the types of errors we can see and for how accurate the system is.
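To make the template-and-match idea, and the binning point about database size, a bit more concrete, here is a minimal illustrative sketch in Python. It is not drawn from any real product: the embed_face function is a hypothetical stand-in for a trained embedding model, and the threshold and error-rate numbers are invented purely for illustration.

```python
import numpy as np

def embed_face(photo) -> np.ndarray:
    """Hypothetical stand-in for a trained face-embedding model.

    A real system would map a photo to a fixed-length template vector;
    here it is left unimplemented on purpose."""
    raise NotImplementedError("replace with a real embedding model")

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two templates (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, claimed: np.ndarray, threshold: float = 0.75) -> bool:
    """1:1 verification, the passport-counter case:
    is the person in front of me the person on the document?"""
    return similarity(probe, claimed) >= threshold

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.75):
    """1:N identification, the face-in-a-crowd case:
    compare one probe template against every template in a gallery and
    return the identities scoring above the threshold, best first."""
    hits = [(name, similarity(probe, tmpl)) for name, tmpl in gallery.items()]
    return sorted([h for h in hits if h[1] >= threshold],
                  key=lambda h: h[1], reverse=True)

# The "binning" intuition: every additional gallery entry is another chance
# for a coincidental high score, so for a fixed per-comparison false match
# rate the expected number of false matches grows with gallery size.
per_comparison_false_match_rate = 1e-4   # invented number, for illustration
for gallery_size in (100, 1_000_000):
    expected = per_comparison_false_match_rate * gallery_size
    print(f"{gallery_size:>9,} people searched -> ~{expected:g} expected false matches")
```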
A little bit of history for you. Face recognition has been used for a long time, even though it has really only started to be effective in the last couple of years. If you go all the way back to 2001, before 9/11, police tried out face recognition at the Super Bowl in Tampa. They did a face recognition survey of all the people who entered the Super Bowl, and it didn't work. The technology wasn't ready for prime time. It couldn't identify people; it was swamped by the number of different faces and the different angles those faces were photographed at. For a long time, as far as I was concerned, that was the beginning and the end of the conversation, because if a technology doesn't work, why should we use it?

But I have a friend who works in the industry, and when we had lunch a couple of years ago, he said to me: it works now. This technology will actually match and identify people. That was a kind of Rubicon, and we've seen it confirmed in the last couple of years. NIST, the National Institute of Standards and Technology, which does standard setting for the federal government, said earlier this year that massive gains in accuracy have been achieved in the last five years, far exceeding the improvements made in the prior five-year period. So we're seeing this technology used more and more, and it's more and more accurate. And we can understand why: we have more powerful computers, we have better AI that does this type of comparison, and we also have better photo databases. If you look at the LinkedIn photo database or the Facebook photo database, these are high-resolution photos, often many different kinds of photos that give you many different templates, all linked to someone's real identity. That's a perfect tool for creating a face recognition database.

So why do we care? What's the big deal about face recognition? There are a couple of things that I, as an advocate, care about, and I hope I can convince you to care about them a little bit too. The first is that we make all kinds of assumptions about our privacy that are grounded in technical realities. We assume that we might go out in public, somebody might see us, and if they happen to know us, they might identify us. That's where you get this idea that you don't have privacy in public, that you put yourself out there. But the reality is that when you're out in public, you don't necessarily expect to be identified, especially by a stranger. You don't expect to be potentially tracked across a series of cameras, and you don't expect that record to be kept indefinitely.
That's a different type of use of the technology, and it really changes our assumptions about what privacy looks like, including what privacy looks like in public.

And of course you can imagine the impact if you're doing face recognition on, for example, a protest rally. You can see how suddenly I have knowledge of who may be worried about the border, and that allows me to take other kinds of punitive action. It also allows me to figure out who your friends are and who you are walking with, those associational pieces of information that we worry about.

It also changes the rules in ways we don't always think about, but I would encourage you to. We all jaywalk every day; we cross the street when we're not supposed to. You are breaking the law when you jaywalk. Everybody does it. But what if we could enforce jaywalking a hundred percent of the time? What if I could do a face search, identify you, and send you a ticket every time you jaywalked? That would fundamentally change how the law was enforced and how you interacted with society. We could do it; whether we would or should do it is a separate question, but these laws are on the books, and they could be enforced using this technology. So that's one concern. The second, related concern is this: if we don't enforce it against everybody, and we start to enforce it in a selective way, what kind of bias does that introduce into the system? You can just sit with that for a minute.
In the private sector, we also see a lot of change in relationships. I already raised the stalker example, but there is off-the-shelf technology sold by a variety of companies; Amazon Rekognition is one of the most well known. You can purchase it and use it to run your own databases, and we've already noted that there are a lot of public databases of identified photos. You can run those databases through your own off-the-shelf face recognition software and identify people. So suddenly that stalker can identify you, suddenly that marketer can identify you, and suddenly that embarrassing photo of you from 2005, the one that still exists somewhere on the web but nobody sees, isn't captioned, and nobody knows is you, well, suddenly you can be identified. And if you're in a compromising position, or you were drunk, there are a lot of photos out there about all of us, potentially, that reveal information that can embarrass you.

The other reason we might worry about this is that mistakes happen. This is a technology that is far from perfect, and in fact it has a great deal of racial bias in it. When you create a face recognition system (we can get into this more in the Q&A), you are essentially training the system to recognize faces. If you only put in the faces that you get from Silicon Valley, you may end up with a lot of white faces, a lot of faces that are not representative of the broader population, and as a result your face recognition algorithm isn't going to do as good a job of recognizing non-white faces. Literally, the error rate will be higher. So that's a bias problem, but there's also a broader mistake problem: as the technology gets used more broadly, people will rely on it, and they will be less likely to believe that the machine in fact made a mistake. People tend to trust the technology, and that can be problematic.

Ultimately, I would give you this construct to sit with: the idea of social control. The more that someone knows about you, the more they can affect your decisions. If they know where you go, if they know that you went to an abortion clinic, if they know you went to a gun show, if they know you went to a church. None of those things is illegal in and of itself, but someone, especially if it's the government, may make decisions about you based on that knowledge. I'll give you an example that's not facial recognition related, but that I think is instructive. When I was at the ACLU, we had a series of clients who protested at the border in San Diego; the border wall runs right through San Diego. They all parked their cars near the border, and they went and had their protest.
As they came out of the protest, they found people they didn't recognize writing down their license plate numbers. They didn't know who those people were, but afterwards many of them found themselves being harassed when they crossed the border. Unsurprisingly, these were people who went back and forth a lot, and they found themselves more likely to be pulled into secondary screening and facing more intrusive questions. They believed, and this was something we were never able to prove but I feel very confident about, that it was because of this type of data collection, because they had been identified as people who deserved further scrutiny. That's what happens as you deploy these technologies: you create information that can be used to affect your rights in a variety of ways, and face recognition is a really powerful way to do that.

So what should we do about this? There are some people who say we should ban this technology, that face recognition has no place in our society. That's a fair argument, but I think it does discount the potential benefits of face recognition. I was at Heathrow Airport, or maybe it was Gatwick, but I was in London, jet-lagged off a red-eye, around 6 a.m. I walked up, ran through the checkpoint, looked up briefly at a camera, and kept walking, and I realized about 30 seconds later that I had just cleared customs. That was face recognition, and it completely eliminated the need for them to do a manual customs check. Now, maybe it's not worth it, but that's a real benefit. If you've ever stood in one of those lines, you're saying, gosh, that sounds great. And that's a relatively trivial example compared to somebody who has, say, lost a child but thinks that maybe the child has been abducted by someone they know, which is unfortunately frequently the case. Going back to that binning idea, you can imagine that there might be a helpful photo somewhere in your social network; if you could do face recognition on the people in your social network, you might find that child. These are real benefits, so we have to think about what we're giving up whenever we talk about banning a technology.

The close cousin of the ban, and the one that I think is maybe more effective or useful in this context, is the moratorium. That's the idea that we should flip the presumption: you should not be able to use face recognition unless there are rules around it, rules that govern it. That's a really effective idea, because it forces the people who want to use the technology to explain what they're going to use it for, what controls are going to be in place, and why they should be given the authorization to use this powerful technology.

So if we did have a moratorium, or even if we didn't and we just wanted to regulate the technology, what would that regulation look like? By the way, this regulation could happen at the federal level or at the state level. There is already at least one state, Illinois, that has very powerful controls on biometrics for commercial use: you cannot collect a biometric record in Illinois without consent. So these laws are possible. There is no federal equivalent yet.
As we think about how to approach this, I think the first thing, especially in the commercial context, is consent. If you can say that it's illegal to create a face print of my face for a service without my consent, that gives me power back over the technology. I'm the one who decides whether I'm part of a face recognition system and what that looks like.
That can be a hard line to draw, because it's so easy to create this kind of face template from a photo without your permission, but it's a start, and responsible people who deploy face recognition technology will require consent. Then, after consent is obtained, you want transparency: you want people to know when face recognition can be used. That's the broad idea on the private-sector side, and we can talk more about it in the Q&A.

The government side is a little trickier. Government is going to do things sometimes without your consent; that's a fundamental reality for law enforcement, for example. So what do we do? In the government context, I think we fall back on some time-honored traditions that we find in the US Constitution, and that's the concept of probable cause. Probable cause is the idea, embedded in the Fourth Amendment, that the government should be able to search for something if it is more likely than not that they will find evidence of a crime. In order to establish probable cause, they frequently have to go to a judge and say: I have evidence to believe that going into this person's house will uncover drugs, and here is all the evidence that they were a drug dealer; then I can search their house. We can deploy the same idea with face recognition. Remember that one fugitive I mentioned, where the police think they can look at surveillance camera footage and maybe find him? You might need to go to a judge and say: Your Honor, we have probable cause to say that this person has committed a crime, they are likely to be somewhere in this footage, and we believe we can arrest him if we find him. The judge can vet that evidence and sign off, and then the technology can be deployed.

Similarly, there are exigent circumstances, and we have this in the law right now. Say I think there is an emergency, a situation where someone has been abducted; I believe they're still on, for example, the London Underground, which is blanketed with surveillance cameras, and I believe that child's life is in danger. There's a concept in the law called exigency: there's an emergency, I can prove there's an emergency, and I need to deploy the technology. We can build those kinds of concepts into the law.

I'm going into a lot of detail on this mostly because I think it's worth understanding that these are not binary choices. It is not: flip on face recognition and we're all identified all the time. I'm sure many of you are old enough to remember Minority Report, the movie, which used a lot of biometric scanning; everybody was scanned, face recognition was happening all the time, and advertisements were being shown to them constantly. We don't have to live in that world. But we also don't have to say that we're never going to get any of the benefit of this technology, never going to see it used for purposes that may in fact make our lives more convenient or more safe.
So with that brief overview, I will stop; Shobita and I can chat, and then we'll take some questions and go from there.

I've been thinking about this issue a lot, and I'm very interested in it. I think I tend to agree with you in lots of ways, but I'm going to try my best to occasionally play devil's advocate, as my students know I try to do, although sometimes more successfully than others. Maybe first I'd be interested in you talking a little bit more about the accuracy issue. You said it's evolved over time, it's more accurate than it used to be, and now NIST says it's accurate. First of all, what does that mean, and how is NIST determining that?

Yes, why don't we start there; that's a wonderful place to start. Accuracy varies widely depending on how you're deploying the technology.
It depends. To give an example: if I'm walking up to a well-lit customs office, even if it's not a one-to-one match where somebody is already holding my document, if it's a well-lit situation and I'm looking right at the camera, you are much more likely to get a good, accurate face print, especially if the database backing up that search has three or four or five or six images of me from different angles. That's a very optimal environment for a face print, and you're much more likely to get an accurate identification, especially if, as I mentioned before, you have a relatively narrow pool of people that you're searching against. The reverse is true if you have a side-angle photo of somebody you only have a couple of photos of, and the photo quality isn't particularly good; you can see how the accuracy is going to go up and down depending on the environment.

Part of what we have to expect from policymakers is to vet these kinds of deployments: how are you using it, what is your expectation once you find a match, how much weight are you going to give it, and what is your procedure for independently verifying that this person you've essentially just identified as the perpetrator of a crime actually committed that crime? It can't just begin and end with face recognition. In terms of what NIST does, they do exactly what you would expect: they have their own photo sets, they take the variety of algorithms that exist, they run those algorithms against their own data sets, and they see how good a job the algorithms do, how accurate they are, in a variety of different contexts.

And I think it bears putting a fine point on this: accuracy doesn't just differ depending on whether the photo is straight on or from the side. One of the big issues with accuracy is that it differs across groups; it's most accurate for white men, and it degrades from there.

That's it, thank you, and I should have said that first, because that's really the most important thing. We are seeing a lot of racial disparity, mostly because of the training set data, though I don't know that we know enough yet to say whether it's 100 percent the training set data or whether there are other aspects of the machine learning that are also contributing. But we are seeing tremendous variation. It's problematic not just because of the identification issues, but because, as you and I were discussing earlier today, if you're not recognized as a person at all, because the system does not detect you, that has all kinds of other potential negative consequences in automated systems. So it's a very big deal.
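As a rough illustration of how testing of the kind NIST does can surface the demographic disparity being discussed, here is a small hypothetical sketch. The comparison records are made up; the only point is that false match and false non-match rates are computed separately for each group, so uneven performance shows up in the numbers.

```python
from collections import defaultdict

# Hypothetical comparison records: the demographic group of the pair being
# compared, whether they are truly the same person, and whether the
# algorithm declared a match at some fixed threshold.
trials = [
    {"group": "A", "same_person": True,  "declared_match": True},
    {"group": "A", "same_person": False, "declared_match": False},
    {"group": "B", "same_person": False, "declared_match": True},   # a false match
    {"group": "B", "same_person": True,  "declared_match": False},  # a false non-match
    # ... a real evaluation would use very large numbers of comparisons per group
]

def error_rates_by_group(trials):
    """Compute false match rate (FMR) and false non-match rate (FNMR) per group."""
    counts = defaultdict(lambda: {"impostor": 0, "false_match": 0,
                                  "genuine": 0, "false_non_match": 0})
    for t in trials:
        c = counts[t["group"]]
        if t["same_person"]:
            c["genuine"] += 1
            c["false_non_match"] += int(not t["declared_match"])
        else:
            c["impostor"] += 1
            c["false_match"] += int(t["declared_match"])
    return {group: {"FMR": c["false_match"] / max(c["impostor"], 1),
                    "FNMR": c["false_non_match"] / max(c["genuine"], 1)}
            for group, c in counts.items()}

# If FMR or FNMR differs substantially between groups, the algorithm is
# performing unevenly across the population, which is the disparity at issue.
print(error_rates_by_group(trials))
```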
It's also worth saying that I worry a little bit that people are going to say, well, once we fix that accuracy problem, then it's okay. I hope I've convinced you at least a little bit that the problem doesn't end there. Even if the system isn't racially biased, that's just the minimum bar we need to clear before we can even begin to talk about how we might deploy it.
Sort of linking to that: you mentioned a few of these cases of, I'll put it in my language, new forms of social control, or reinforcing existing forms of social control. Some of you in the audience may have heard about this, but I think it bears mentioning in this context. About a month ago, news broke that a contractor working for Google, you probably know who it was, was caught trying to improve the accuracy of the facial recognition algorithm for the Pixel 4 phone by going to Atlanta, where there is of course a large African American population, and asking homeless African American men to play with a phone, to play a selfie game. They were not consented, but their faces were scanned. That keeps ringing in my head whenever I'm thinking about this stuff, and I wanted to get your sense of it. What's interesting to me, and it ties to what you were talking about in terms of social control, is that the act of supposedly increasing the accuracy, supposedly, at least arguably, to better serve African American populations, actually ultimately serves to reinforce existing power dynamics and the discrimination that African Americans have historically experienced. So I'm wondering: in the pursuit of this goal of accuracy, in the pursuit of this wonderful technology that's going to save our lives, these kinds of things are happening.

Well, that is the funny thing about rights. Everybody needs to have their rights respected, everybody deserves equal rights, but the reality is that these are exactly the kinds of communities who most need to have their rights respected. They really need something like a consent framework, because they are the people most likely to have their images taken: they have less power, less ability to say I am not going to consent to this, and maybe less knowledge of how the technology works.
So really, when we're creating these rights, part of what we're doing is recognizing existing power structures and power imbalances, where I may have more power and you may have less, and hence it's even more important that you have the ability to actually exercise your rights and know what they are.

Another piece of this, which I didn't mention in my talk, is that there are a number of already unfair systems that face recognition might be built on top of. One of the most illustrative examples is the terrorist watch list. There is an ever-changing list in the United States, maintained by a part of the FBI, on which you can be identified as a potential terrorist. There's a master list that then feeds into a wide variety of different parts of the federal government, and it affects things like whether you get secondary screening at the airport and, in rare cases, whether you're allowed to fly at all. This is a secret list: you don't know when you're on it, and it's hard to know how to get off it. And the incentives are very bad, because if I'm an FBI agent on the fence about whether to put you in the database, I can tell you that if I put you in and nothing happens, no harm, no foul; if I don't put you in and you do something bad, my career is over. So there's a lot of incentive to put people on lists. Well, you can imagine that putting somebody on a list and combining that with the power of face recognition creates an even greater imbalance, because now I've got a secret list and I've got a way to track you across society. That's an existing unfairness that has nothing to do with face recognition, but face recognition can exacerbate it.

So how would a consent framework work in that context, given that there is already so much information out there, and given that we're in a society now where our faces are being captured all the time? How would you envision it?

What you would consent to, in a very technical way, is turning your face into a face print. You would consent to creating that piece of personal information about you. Literally, the way your Social Security number is a number about you, this would be a number that encapsulates what your face looks like. That would be the point at which you would have to consent, and I think we might have to do something about a lot of existing face recognition databases as well. But the reality is that if you can catch it at that point, then at least you're taking the good actors and saying it is not okay to take somebody's face print without their permission. And again, as we said, the government is a little different. And of course this is not a magic wand: fixing the problems with face recognition doesn't fix all the other problems with society and how we use these technologies.

So, you mentioned going through customs, going through European immigration, the ease of facial recognition there, and the excitement of convenience. You said maybe that's an acceptable use of it, and I guess when you said that, I thought, I'm not sure it is, because I worry a little bit about the fact that it normalizes the technology.
Then people start wondering why it's a problem in other domains: look, it worked when I went through immigration, so why would there be a problem with using it for crime-fighting, or in education and schools, or for hiring?
You know, it's always a balance. When I'm considering some of these new technologies, I tend to think about people's real-world expectations, and I think in the context of a border crossing you expect to be identified. You expect that a photo is going to be looked at and that somebody is going to make sure that Chris Calabrese is Chris Calabrese. So that feels to me like a comfortable use of the technology, because it doesn't really invade anybody's idea of what task is going to be performed.

A less intuitive example, and one that was a little controversial but that I thought was okay for a while: Facebook would create a face template, and that's how they recommended tags. When they got an uploaded photo, they would ask, is this your friend Chris Calabrese? That's face recognition. For a long time they would only recommend people you were already friends with, so the assumption was that you would be able to recognize your friends in real life, and therefore it was okay to tag and recommend them. That's definitely not explicit consent, but maybe it feels okay because it doesn't feel like it violates a norm: you expect to identify your friends. They now have a consent-based framework where you do have to opt in, but for a while they had that hybrid approach. So I think it's helpful to map to the real world.

I do think you have places where you're potentially normalizing it. Another area I didn't bring up, but that I think is going to be controversial, is face recognition in employment. Obviously, consent in the employment context is a fraught concept; often you consent because you want to have a job. You really do have the potential there for the technology to creep: we're not going to do punch cards anymore, we're just going to do a face recognition scan to check you in. But then that same face recognition technology could be used to make sure that you are cleaning hotel rooms fast enough, to track your movements across your day, to see how much time you're spending in the bathroom. These technologies can escalate quickly, especially in an employment context, which can be pretty coercive. So yes, there's a lot to this idea that we want to set norms for how we use the technology, because the creep can happen pretty fast and be pretty violative of your privacy and your rights.

So I've been asking questions that are pretty critical, but I feel like I should ask the question that my mother would probably ask. My mother would say: I live a very good life, I live on the straight and narrow. If I'm not guilty of anything, if I'm not doing anything strange, if I'm not protesting at the border, why should I be worried about this technology? Why should I care? It's fine, and it actually protects me from kidnapping and other things, and I'm getting older; this is a great public safety technology.

Yes, the old "if I've done nothing wrong, what do I have to hide."
I think the obvious first answer is the mistake answer: just because it isn't you doesn't mean that somebody may not think it's you, and that technology may be deployed against you, especially if you're part of a population the system may not work as well on. So that's one piece of it.

I also think: who are you hiding from? Maybe you're comfortable with the government, but are you really comfortable with the creepy guy down the street who can now figure out who you are and, from there, maybe where you live? That's legal in the United States right now, and it seems like exactly the kind of technology use we should really worry about.

And there are activists, and this isn't something CDT did, but there were activists from Fight for the Future who put on big white decontamination suits, taped a camera to their foreheads, and just stood in the halls of Congress taking face recognition scans all day. They actually identified a member of Congress. They were looking for lobbyists for Amazon, because they were using Amazon's face recognition technology. It was an interesting illustration of this idea that you are giving a lot of power to strangers to know who you are and then potentially use that for all kinds of things you don't have control over.

We take for granted, I think, a lot of our functional anonymity in this country, and the reality is that face recognition, if unchecked, will do a really good job of stripping away a lot of that functional anonymity. Some people are always going to say it's fine, but what I would say to them is: you don't have to lose the benefit of this technology in order to still have some rights to control how it's used. There are ways that we have done this in the past and gotten the benefit of these technologies without all of these harms. So why are you so quick to give up and let somebody use these technologies in harmful ways when you don't have to?
So, I think in our earlier conversation this morning you may have mentioned this briefly, but I'm wondering: when you think about governance frameworks, how do you think about the criteria for deciding what's a problematic technology and what is not? Is that even the way to think about it, or are there other criteria? What kinds of experts should be making these kinds of decisions? Is there a role, for example, for academic work, or research more generally, in assessing the ethical and social dimensions, and on what parameters?

It's a great question. I would say we want to start with a process for getting public input into how we're deploying these technologies. The ACLU, with some help from CDT, has been running a pretty effective campaign to get cities and towns to pass laws saying that any time you're going to deploy a new surveillance technology, it has to come before the city council, it has to get vetted, and we have to understand how it's going to be used so we can decide whether this is the right technology. Just creating a trigger mechanism so that we have a conversation first. It may sound strange to say this, but that actually doesn't happen all that often. Oftentimes what happens is a local police department gets a grant from the Department of Justice or DHS, and they use that grant to buy a drone. Then they get the drone, they might get trained by DHS or they may not, and then they fly it. They haven't appropriated any money from the city, they haven't put it in front of the city council; they just start to use it. Then it comes out, and sometimes the city council is really upset, and sometimes the police pull it back and sometimes they don't. Just having that public conversation is a really useful mechanism for controlling some of this technology. So I would say that's a beginning.

Obviously state lawmakers can play a really important role, and federal lawmakers should be playing a role, but we're not doing quite as much governing in DC as people would like. Without being too pejorative, we are at a bit of a loggerheads in terms of partisanship, and that makes it hard to pass things federally. But that's the wonder of the federalist system: there are lots of other places you can go.

Academic researchers are tremendously important because, as I said in the talk, for a long time my answer to many of these technologies, and this one specifically, was: it doesn't work. If an academic can say this technology doesn't work, or these are its limits, that's a tremendously powerful piece of information. It's really hard for the ordinary citizen to separate the snake oil from truly powerful and innovative new technologies, and I think technologists and academics play a really important role as a vetting mechanism, telling a policymaker who wants to know whether what they're being told is true. That kind of neutral third party is really important.

So, I don't know how much you know about this, but facial recognition has been particularly controversial in Michigan.
For over two years, Detroit was using facial recognition, something called Project Green Light, without any of the kinds of transparency that you're recommending and talking about. It came to light with the help of activists, and now the city has sort of said, okay, fine. As far as we can tell, it was being used fairly indiscriminately, and more recently the mayor came out and said, okay, we promise we'll only use it in very narrow criminal justice cases. But of course, again, Detroit is a majority African American city, one in which there is not great trust between the citizens and the government, so that kind of promise falls on deaf ears. And even though they're now using it, my sense is that one of the things that's missing is transparency in understanding how the technology works: where is the data coming from, how is the technology used, what kinds of algorithms are involved? There's no independent assessment of any of this.
So I'm wondering whether you know anything about this, or whether you have recommendations on how, in those kinds of settings, you might try to influence that kind of decision-making, because often these are proprietary algorithms that police departments are buying, and they're not necessarily even asking the right questions.

Right, they're not. It's a really compelling case study, because you're right: the reality is, gosh, it's really hard to trust a system that hasn't bothered to be transparent or truthful with us for years, gets caught, and then says, oh, I'm sorry, we'll put some protections in place. That's not an environment for building trust in a technology. It doesn't say citizens and government are partners trying to do this right; it says, what can we get away with?

So, in no particular order: clearly there should be transparency about who the vendor is and what the accuracy ratings for those products are. Without revealing anything proprietary, you should be able to answer the question of how accurate an algorithm is in a variety of tests. NIST tests these products; just search for the NIST face recognition tests and you can read the hundred-page report that evaluates all the algorithms. This isn't secret stuff. You should know when it's being deployed. You should be able to understand how often a search is run, what the factual predicate was that led to the search, what the results of the search were, whether it identified someone, and whether that identification was accurate. These are fundamental questions that don't reveal secret information; they're just necessary transparency, and we see them in lots of other contexts. If you're a law enforcement officer, if you're the Department of Justice, and you want to get access to and read somebody's email in an emergency context, where you say it's an emergency and you can't wait to get that warrant, you have to file a report. I won't bore you with the code section, but it's a legal requirement: I have to report why this happened and what the basis for it was. So these kinds of basic transparency mechanisms are things we already have for other technologies.
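One way to picture the per-search transparency he is describing is as a structured audit record. The sketch below is purely illustrative; the field names are invented rather than taken from any statute or agency policy.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class FaceSearchAuditRecord:
    """Illustrative audit entry, one per face recognition search."""
    agency: str                       # who ran the search
    vendor: str                       # whose algorithm was used
    legal_basis: str                  # e.g. a warrant number or an exigency claim
    factual_predicate: str            # why this particular search was run
    gallery_description: str          # what database was searched, and its size
    candidates_returned: int          # how many possible matches came back
    identification_made: bool         # did investigators treat one as a match
    identification_confirmed: Optional[bool] = None  # later verified or refuted
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

# Invented example values, just to show what such a record might contain.
record = FaceSearchAuditRecord(
    agency="Example City Police Department",
    vendor="ExampleVendor face matcher v3.2",
    legal_basis="Warrant 19-12345",
    factual_predicate="Robbery suspect captured on a store camera",
    gallery_description="State mugshot database, roughly 450,000 images",
    candidates_returned=23,
    identification_made=True,
)
print(record)
```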
We kind of have to reinvent them every time we have a new technology. The problems do not change, and many of the same concerns exist; it's just that the law is often written for a particular technology, so when a new technology arrives, we have to go back, reinvent some of these protections, and make sure they're broad enough to cover it.

In my field we would call this a socio-technical system. One of the things you didn't say, but that I would also think you would want: I'm thinking about previous technologies. There was a recent lengthy investigative article in The New York Times about breathalyzers, and it talked about both the calibration of the device, ensuring that the device remains appropriately calibrated, and the interpretation. It's a human-machine system. In this case there may be a match, but it's a percentage match, not a certainty; you have humans in the system doing a lot of the interpretive work, who also need to be trained, and we don't have transparency about that either, do we?

No, we don't, and that's an incredibly important part of any system: understanding what you're going to do with a potential match once you find it. I'll give you an example. Probably about ten years ago, and I don't know if they still do it this way, I went to the big facility in West Virginia that handles all of the FBI's computer systems, the systems that get checked against your driver's license when you're stopped for a traffic violation, before the officer gets out of the car, to make sure you're not a wanted fugitive. One of the things they do in that facility is all the fingerprint matches. If I get a print at a crime scene and I want to see whether it matches against the FBI database, that's where I send it. So what happens when they do a fingerprint match, at least as of ten years ago, with a technology that has been deployed for something like 150 years? There's a big room, ten times the size of this one, filled with people sitting at desks with two monitors. On one monitor is a fingerprint, and on the other monitor are the five or six potential matches, and a human being checks whether the whorls of the fingerprint actually match the right print. So think about that: this is a technology that's a hundred years old, and we still have people making sure it's right. That's the kind of air gap between what automation can do and what the full system needs to do. Now imagine how we're going to handle this protocol when I have a photo of my suspect and I've got six photos of people who look an awful lot like that person.
How am I going to decide which is the right one? The answer may be that you can't decide definitively; you just need to investigate each of those six people. And the reality is that with face recognition, the system is often kicking out not six candidates but fifty. So there are real limitations to the technology. It is getting better, so I don't want to overstate those limitations, especially if you're doing other things like narrowing the set of photos you're running against, but there are systems that will have to be built on top of the technology itself to make sure we're optimizing both the results and the protections.
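To put the candidate-list point in concrete terms: an identification search typically returns a ranked list of possible matches above a threshold, not a single answer, and every name on that list still needs human adjudication. A small hypothetical sketch, with invented scores:

```python
def candidate_list(scores, threshold=0.6, max_candidates=50):
    """Return the ranked candidates a human examiner would still have to review.

    scores: hypothetical {identity: similarity} output of a 1:N search.
    Lowering the threshold or raising max_candidates makes it less likely the
    true match is missed, but every extra candidate is another person who may
    be investigated on the strength of a lookalike score."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [(name, s) for name, s in ranked if s >= threshold][:max_candidates]

# Invented scores: a handful of lookalikes clustered near the top.
scores = {"person_%02d" % i: 0.58 + 0.005 * i for i in range(10)}
for name, s in candidate_list(scores):
    print(f"{name}: similarity {s:.3f} -> still requires human verification")
```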
So, we at STPP have been doing a research project around this in our new technology assessment clinic, and one of the things we've noticed in our initial analysis of the political economy of this is that it is, of course, a global industry. I'm wondering what legal frameworks are evolving, what the global dimensions of its use are, how those interface with the legal frameworks, and whether that has any implications for the way we think about it here in the US.

It has huge implications. There are a couple of things to think about globally. Maybe the first is that most developed, Westernized countries have a comprehensive baseline privacy law that regulates the sharing and collection of personal information. If you were in the UK, for example, there would be rules about who could collect your personal information, what they could do with it, and getting permission for it, and those rules, I believe, by and large do apply to face recognition. There may be some nuance there, but the expectation of people in those countries is that face recognition will be covered. That's important because it goes back to the idea I mentioned before: do we start with "justify why you're going to use the technology," or do we start with "go ahead and use the technology unless someone can prove there's a reason not to"? I think we want to be more on the side of "don't use the technology unless you have a good reason."

But what's equally interesting, at least to me, is that as this technology diffuses and becomes more global, there are a number of countries that are real leaders in face recognition technology; Israel is one. You may have a harder time controlling it if I can go online, go to an Israeli company, download face recognition software, scrape the LinkedIn database without permission, and create a database of a hundred million people that I can then use for identification purposes. That's really hard to regulate. It may eventually be illegal in the United States, but from a regulatory point of view it's a real enforcement nightmare to figure out how that system was created and how it might be used. So this globalization issue is a real problem, because a US-based company may not do that, but there are certainly going to be places offshore where you may be able to. I don't want to overstate the problem; there are lots of places where you can illegally torrent content, and lots of people do, but there are also lots of people who don't, because they don't want to do something illegal or they don't want to risk a computer virus. But it is a real concern, especially with the internet and the diffusion of technology across the world, and it can often be hard to regulate.

And it's also being used in Israel, but also, I know, in China, for a variety of crowd control and disciplining contexts.
I'm always a little careful with China, because China is the boogeyman that allows us to feel better about ourselves sometimes: well, we're not China.

So don't make China the example of what you fear.

No, you're right. But yes, China is a really good example of how you can use this technology. They're using it to identify racial minorities; they're using it, in many cases, to put those racial minorities in concentration camps, or at least to separate them from the general population. These are incredibly coercive uses of the technology. China is also becoming famous for its social credit scoring system, which I don't think is yet as pervasive as it may someday be, but it's being used essentially to identify people and make decisions about them.
Whether you should be allowed, for example, to take a long-distance train, or whether you should qualify for particular financial tools. So again, these are tools for social control: I can identify you, I know where you are, and I can make a decision about whether you should be allowed to travel and where you should be allowed to go. This is again, as you called it, a socio-technical system that allows you to use technology to achieve other ends.

And perhaps, at least, a warning for us.

Yes, it is a cautionary tale, but we have our own ways that we use this technology. Don't think that just because we're not quite as bad as China, we cannot be better in how we deploy these technologies.

Shall we start by asking some questions from the audience? Do citizens have any recourse when facial recognition technology is used without their permission?

If you're in Illinois, you do. Illinois has a very strong law with a private right of action: you can actually sue someone for taking your face print without your permission, and it's the basis for a number of lawsuits against big tech companies for doing exactly this kind of thing. I believe the practice is also illegal in Texas, but there's no private right of action there, so you hear less about it. I'm trying to think of anywhere else; the honest answer is probably no, in most of the country. If we were feeling kind of crazy, there are federal agencies that arguably could reach this. The Federal Trade Commission has authority over unfair and deceptive trade practices, so if they decided that taking a face print is unfair, they could potentially reach it, but it's not something they've pursued before, and it would be a stretch from their current jurisprudence.

Another audience member asks: what led to the Illinois rule of consent, and what is the roadmap for getting new rules?

It's interesting, because in many ways Illinois happened really early in this debate. The Illinois law is not a new one; it's at least seven or eight years old. I think what happened was that the Illinois legislature was prescient in getting ahead of this technology, before there were tech companies lobbying against it and before it became embedded, and they just said, you can't do this. For a long time, the only people who were really upset, I think, were gyms, because they couldn't take people's fingerprints at the gym without going through more of a process. That, in some ways, is how we've had some success regulating new technologies: get at them before they become really entrenched.

We're kind of past that point now, but as we see a broader push on commercial privacy, we're seeing a real focus on face recognition; people are particularly concerned about its deployment. We're seeing it in the debate over privacy legislation in Washington State, and it's come up a number of times in California, both at the municipal level and at the state level.
Some of the other sort, of state privacy laws that have been proposed include, facial recognition bans, so I would guess I would say that it's it's, something, that is ripe to be regulated. Certainly at the state level and you've seen some federal, we had saw a federal, bill that. Was fairly, limited but did have some some. Limits on how you could use finish recognition, that was bipartisan, and it was introduced, by Senators, Coons. And, Lee, earlier, this week so there is there's sort of interest across the board and I would say right now the.
I would say that right now the state level is the most fertile place.

Beyond policy advocacy, what actions can individuals take in order to slow the growth of, or subvert the use of, this technology by companies or the government?

It's interesting. There are things you can do: you could put extensive makeup on your face to distort the face print, the sort of privacy self-help approach. By and large, as a society, we tend to look askance at somebody who covers their face; that's something we're maybe not comfortable with, but maybe we could become comfortable with it. This is certainly an environment where you could: you're in an academic setting, a place where you could be a little different without suffering for it. If I tried to put makeup on my face and go to work tomorrow... well, I'm the boss, actually, so I can just do this, but if I weren't the boss, people might look askance at me. Here you could probably do it, and if somebody asked, gosh, why does your face look like that, maybe you could explain: because we have face recognition deployed in our cities, that's wrong, and this is my response. Maybe that's a little bit of citizen activism that can help push the issue forward.

You can also try to stay out of the broader databases that fuel face recognition, if you don't feel comfortable having a Facebook profile or a LinkedIn profile. Anything that links a good, high-quality photo of you to your real identity is going to make face recognition much easier. Obviously it's harder to stay out of the DMV database, and that's one that police are pulling from, so that's harder to escape.

What are the ethical and technical implications of the increased use of facial recognition for intelligence and military targeting purposes?

Oh, that's a hard one. They're very similar to the ones we've laid out; the stakes are just higher. We're identifying people for the purposes of potentially targeting them for an attack. We've seen drone strikes for at least the last seven or eight years.
You can imagine a face-recognition-enabled drone strike being particularly problematic, not just because drone strikes are already really problematic, which goes back to the whole argument about unfair systems and layering face recognition on top of them, but because you have a greater potential for error. To be fair, and I'm loath to be fair here because I think drone strikes are unjust for so many reasons, you could argue that it in fact makes it more likely that you won't target the wrong person, that it's another safeguard you could put in place. That is as charitable as I can be.

This audience member wants to know what we can do when biometrics fail: for example, your facial measurements change as you age. What are the implications for facial recognition's validity and reliability over time?

There is a big impact, certainly for children; as you grow up, your face print changes substantially. The prints become more stable as you grow older. As an adult there is still an impact, but if you have enough images and a robust enough template, the aging process has been shown to have less of an impact on accuracy, and a lot depends on how many photos you're using to create the initial template you're working from. There's also an issue for transgender people, right, I mean...