Disability Justice & Crip Technoscience: Racism & Ableism in AI & the Future of Technology


Good afternoon, and welcome to the next program in the Liberating Webinars series of the Autistic Women & Nonbinary Network. My name is Lydia X. Z. Brown, pronouns they/them, and I am AWN's director of policy, advocacy, and external affairs. I am a youngish East Asian person with short black hair and glasses, currently wearing a red hoodie in a room with two windows with blinds behind me. Today I am excited to host our conversation on disability justice and crip technoscience: racism and ableism in AI and the future of technology.

I am pleased to introduce our two guests for this conversation. The first is Damien Patrick Williams, a PhD candidate in the Department of Science, Technology, and Society at Virginia Tech. Damien researches how the values, knowledge systems, philosophies, social structures, religious beliefs, and lived experiences of humans can affect the creation and use of algorithms, machine intelligence, biotechnological interventions, and other technological systems and artifacts. More on Damien's research can be found at the website A Future Worth Thinking About, and I will drop the link in the chat.

Next, I would like to introduce Crystal Lee, a PhD candidate at MIT and a fellow at the Berkman Klein Center at Harvard University. She works broadly on topics related to the social and political dimensions of computing, data visualization, and disability, and she also conducts ethnographic and computational research on social media communities. Crystal's research has been supported by the National Science Foundation, the Social Science Research Council, and the MIT Programs in Digital Humanities. Previously, she was a visiting research scientist at the European Commission, and she graduated with high honors from Stanford University.

As we start our conversation today, you can access captioning by clicking the button that says CC; you may then opt to view subtitles or show the full transcript. You can also click the link in the chat to access the captions on a separate page. If you have trouble with access or technology at any time, you are welcome to send a message in the chat, and we will try to resolve the issue as soon as we can. We are also streaming live to Facebook, and if you would like to follow and participate in the conversation on Twitter, you are invited to do so with our hashtag, #LiberatingWebinars.

So now I would like to turn to Damien and Crystal and ask you first to each talk a little bit about the work you're doing now and how disability justice shows up in that work. We'll start with Crystal.

Hi, this is Crystal. I'm a youngish East Asian woman with bangs and a ponytail, sitting in a green room with pictures of sand and rocks in the background, and I'm wearing a black sweater. As far as the work I'm doing right now, the major project for me is in HCI, human-computer interaction, which is the study of the relationships between humans and machines, broadly construed. The core of this project is thinking about disabled people as users, as developers, and as people embedded within a larger ecology of technology and computing: thinking about who the assumed user is when people design and make computing technologies, but also about what technology is supposed to do, and about the way it is embedded both within a broader political landscape and within the research communities of computer science and engineering themselves.
So I think in a lot of ways, I'm trying to mobilize insights from critical disability studies, design justice, and critical race theory into the field of computer science, in order to think more broadly about how to design better technologies and how to imagine better speculative futures. I'll hand off to Damien.

Thank you very much. My name is Damien Patrick Williams. I'm a youngish Black man with some natural hair in the center of my head, shaved sides, facial hair, and silver wire-rimmed glasses. I'm wearing a red shirt and a silver paisley tie, and I'm sitting in a room filled with bookshelves, each of which is filled with books, decks of cards, figures, and knickknacks. My work focuses, as Lydia noted in their introduction of me, on the ways that human values make their way into technological systems and artifacts. I think about how the knowledges that people hold, or the belief systems of which they're a part, make interventions into the things they do, the actions they take, and the systems they build, and how those beliefs and values then replicate and iterate within the systems, artifacts, and technologies they create, which we are all subject to and make use of every day. Right now, my work focuses specifically on how marginalized lived experience is impacted by, and can also impact, and perhaps even make better (for a very carefully curated and negotiated definition of what "better" means) the experience and creation of these technologies: again, algorithms, artificial intelligence, and biotechnological interventions such as prosthetics, going even so far as genetic manipulation technologies. So I think about the ways that, for instance, marginalized religious beliefs, such as the West African religious traditions brought over with enslaved peoples during the transatlantic slave trade, find their way into our cultural conversations around artificial intelligence these days. Those kinds of perceptions, those understandings of different types of integration between the natural and the built, and of the delineation between the spiritual and the physical, can perhaps illuminate and help us to better understand the technologies that we all deal with day in and day out.

This is Lydia. In most conversations about AI and algorithms, the terms du jour are "algorithmic bias" and "algorithmic fairness." Even in some of the work I'm doing in my other capacity with the Center for Democracy & Technology, we talk a lot about algorithmic bias or fairness, and sometimes about discrimination. I want to ask both of you: where do those frameworks of algorithmic fairness or bias fall short in addressing racism and ableism in AI technologies?

This is Crystal. Damien, do you want to start?

Sure, this is Damien. The way the notion of bias misses the mark a bit is that, in a real sense, bias is just a synonym for perspective, a point of view on the world. When we deploy the word "bias" in our conversations about AI, most people tend to mean prejudice, bias in a pejorative sense. But the idea that we could make something that is bias-free, bias-neutral, is, I think, pretty much a functional impossibility.
The real fact of the matter is that human biases are human values, human values are human perspectives, and human values and perspectives are going to get built into the systems and tools that humans make. So the question isn't whether we can make these free from bias, free from perspective, free from values; it's which values, which perspectives, which biases we are going to build into the systems. And that conversation is still not really being addressed: the recognition and acceptance that we are going to build some values into anything we make, tool or not, and the question of which values we want those to be, and how to make that a part of the design and building process.

This is Crystal. I totally agree with everything Damien said, and I feel like he encapsulates the problems with the words "fairness" and "bias" in a way that is so compelling. Jumping off of that: as he said, it's not about trying to create a system that is bias-free, because that's impossible. The way I contextualize this is to think about the promises, the aspirations, of what algorithmic systems are trying to do, and then to calibrate what fairness and bias mean there. One thing I've been thinking about a lot is how the aspirations and promises of algorithms in public policy are at once way too aspirational and not aspirational enough. When I say they're overly aspirational, or overly ambitious: the goal of a lot of AI systems is to somehow optimize for certain kinds of social outcomes, optimizing for better policy, or for better and more efficient economic systems, or for better distribution of welfare (we can talk more about means-testing later in the conversation). But when Damien talks about which values and which biases we build in: often these systems are basically about making current social systems more efficient. What really falls short for me, and what I think Damien is getting at (I don't mean to speak for you), is that a lot of these systems are not ambitious enough. When they optimize for current social systems, systems that often value things like white supremacy, there's no deeper engagement with what a better social future could look like. What would a radically different way of organizing a social safety net, or organizing society, in a way that centers disability justice actually look like? So I really echo what Damien is saying about the insufficiency of ideas like fairness and bias, because they assume that somehow there is a world without prejudice, or that we could create a system that is fair within the current confines of a neoliberal, white supremacist society.
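One way to make the point about "which values get built in" concrete: a minimal numeric sketch, in Python, of how even a bare decision threshold is a value-laden design choice. Everything here is hypothetical (the scores, the groups, the thresholds); it is an illustration of the panelists' argument, not anyone's real system.

```python
# Toy illustration: the same "risk model," deployed with different
# decision thresholds, flags the two groups at very different rates.
# The threshold is not a neutral technical detail; it encodes a value
# choice about whose errors matter. All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical risk scores. Group B's scores are shifted upward,
# standing in for historical over-measurement of that group.
scores_a = rng.normal(0.45, 0.15, 1000)
scores_b = rng.normal(0.55, 0.15, 1000)

for threshold in (0.5, 0.6, 0.7):
    flagged_a = np.mean(scores_a > threshold)
    flagged_b = np.mean(scores_b > threshold)
    print(f"threshold={threshold:.1f}  group A flagged: {flagged_a:.1%}  "
          f"group B flagged: {flagged_b:.1%}")
```

Every threshold "works," but each one distributes scrutiny differently; choosing among them is exactly the values conversation Damien says is not yet being had.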
This is Lydia. I personally hate using the word "bias" in these contexts, because it makes invisible how systems of oppression actually work, by placing the onus of oppression both on individual people who act oppressively and on individual people who are oppressed. On the one hand, it implies that if individual people who act oppressively just decided to become more conscious of it and stop, that would somehow make whole systems disappear, and that's not how it works. On the other hand, it implies that if people who experience oppression chose to fix something about themselves, then suddenly they, in many cases we, would no longer experience oppression, which is really a subtle way of victim-blaming.

On that first point, about how "bias" takes individual people's actual oppressive actions and divorces them from the systems, structures, patterns, and institutions in which they operate: I'm thinking about the hashtags #StopAAPIHate and #StopAsianHate, and how I don't use them. I don't use the word "hate": not because hate doesn't exist, or because there is no room for a conversation about hate, including anti-Asian hatred, but because using the word in the context of all violence targeting Asian people because of racism individualizes structural racism. It implies that hatred of Asian people is an individual prejudiced person's problem, rather than one rooted in literally hundreds of years of state and interpersonal violence against Asians of many communities: South Asian, Southeast Asian, East Asian, Central Asian, and people from the Southwest Asian region usually called the Middle East. And that violence is tied to and triangulated against other communities of color: Black people, Latinx people of color, and Native people, including Asian people who are also part of those same communities. Using the word "hate" doesn't enable the broader, more expansive conversation we have to have to confront the root causes and exacerbating factors of that violence. In the same vein, I don't like the word "bias" because it carries the same implication: that bias is about individual people choosing to be prejudiced, or being prejudiced out of individual ignorance, while allowing us to ignore the role that systems, structures, institutions, and patterns play in enabling and perpetuating harm that only superficially looks like bias.

Now, I wanted to ask another question, but before we do: I think I just saw a dog behind you, Crystal. Is that a dog?

It is! Come here. This is Laika. She's very undistractable right now, because she has gotten a treat so that she remains quiet throughout the entirety of this session; she's very concentrated on that.

Oh, I love her. She is the bestest girl. Can you share some specific examples, for those who are newer to this conversation, of how AI works, what it does, and where it's used in ways that can be racist or ableist, or anti-racist or anti-ableist?

This is Damien. Unfortunately, I have a lot more examples of the racist, ableist, misogynistic, and transmisogynistic applications and implications of AI than I do of ones that are actively anti-ableist, anti-racist, anti-misogynistic, and anti-transmisogynistic. That is, I think, unfortunately the nature of the conversations that we still need to have, and of the lack of certain conversations. A couple of examples that I do think are doing good work: the project Data for Black Lives looks at the history of algorithmic incursions into Black lives, both in terms of the ways facial recognition technologies get used and in use-of-force recommendations and use-of-force incidents within police encounters, anti-Black violence in police actions. That's one place where people are trying to use algorithms, big data sets, AI, and machine learning to illuminate a pattern and showcase a set of data-driven recommendations for how we ought to be making changes and where we ought to be making interventions.
There are, however, unfortunately still predictive policing and facial recognition tools being put into place in police forces and military actions, both in the United States and around the world, and those are far more pervasive. These systems don't see dark-skinned faces as well; facial recognition applications don't categorize Black and brown people well at all, and they don't accurately categorize quote-unquote "non-standard" bodies. Any disabled body, any person using crutches or a wheelchair, is often miscategorized by these systems. Those miscategorizations can then be used as further data for the systems to learn from, so the errors a system makes in that process get operationalized by the humans using it, often with hazardous, disastrous, and even life-threatening consequences for the people subject to them.

This is Crystal. I agree one hundred percent with what Damien is saying, and I feel like this will be a recurring theme throughout this conversation, like a call and response: Damien will say something, and then I'll agree and try to add more on top of his already rich description. I'll also take a crack at the "AI technologies can be helpful" question, because it feels like a social responsibility. I'll echo the shout-out to Data for Black Lives, and add the Algorithmic Justice League and the work they're doing to uncover and dismantle white supremacy in machine learning systems like facial recognition.

I want to give a couple of examples of AI technologies that on their face provide access but that also have a sinister, pernicious side. The first is voice agents. I'm thinking about things like Siri: theoretically, if you have an Internet-of-Things-connected system in your house, it could be really effective, especially for people with physical impairments, to be able to turn on the lights automatically, or to have the temperature regulated in a way that makes you more comfortable when you're having an episode. There are lots of ways this kind of voice agent can help people navigate the world more accessibly. But if we probe a little harder into the ways it is harmful and violent: a lot of voice technology, especially in China, is trained on Uyghur people who are being incarcerated and tortured and whose biometric data is being taken en masse; that is how a lot of these systems are being trained. And in terms of users, voice agents are notoriously bad at detecting any accent considered non-standard. We can talk more about what that means, but the difference between the normate and the disabled body is something people in this chat are obviously really invested in thinking about.

The other technology I was thinking of is OCR, optical character recognition. It's the kind of technology where, if you're scanning a book, for example, it will automatically read the image of the letters and translate them into text: plain text you can copy and paste.
I think in a lot of ways, OCR holds enormous potential for access, in the sense that mass scanning of books that aren't otherwise available in a screen-reader-friendly fashion becomes possible. For a lot of folks, OCR combined with a screen reader makes text more accessible, especially if you have dyslexia or a visual impairment, and there are so many possibilities for making texts that are only available in person accessible to people on the other side of the globe. On the other hand, it raises real questions about infrastructure, the availability of internet access, and the availability of digital devices, but also, more broadly, about what actually gets digitized. As Damien was saying earlier, there are values built into all of these systems, and values built into what people choose to deem important and label as such. If you think about something like Google Books, certain books will be prioritized for digitization and others will not, and for a lot of those books, by disabled authors, authors of color, authors with multiple marginalized identities, it is already hard for these communities to get published in the first place, and then, on top of that, to be deemed important enough to be digitized for mass access. That is another layer of discrimination and horribleness that becomes embedded in these technologies once you dig a little deeper.

This is Damien; I was just going to agree with Crystal. One of the wonderful things about OCR is that I concentrate better on a text when I can both read it visually and listen to it, and with certain screen reader capabilities I can do both at the same time. But the number of times I have to go in and OCR something that was not made available in a screen-reader-ready format, even if it was already a PDF, even if it was already an EPUB, the number of times I have to make those changes and edits myself, says something very clear about who was envisioned, who was idealized, as the person accessing this text. That is definitely a conversation that needs to be had more often and more thoroughly within publishing houses and writers' groups, especially when we're talking about books that in many cases are about marginalization and technology.
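For readers newer to the topic, here is roughly what the OCR step Crystal and Damien describe looks like in practice: a minimal sketch using the open-source Tesseract engine through the pytesseract wrapper. The panelists don't name a specific tool, so this pairing is an assumption, and the file names are hypothetical.

```python
# Minimal OCR sketch: turn an image of a scanned page into plain text
# that a screen reader or text-to-speech tool can handle.
# Assumes Tesseract is installed, plus: pip install pillow pytesseract
from PIL import Image
import pytesseract

# "scanned_page.png" is a hypothetical input file.
page = Image.open("scanned_page.png")
text = pytesseract.image_to_string(page)

# Save as plain text so it can be read aloud, searched, or copied.
with open("scanned_page.txt", "w", encoding="utf-8") as f:
    f.write(text)
```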
This is Lydia. I appreciate the massive coffee cup; I hope you have something good in there. Shifting our conversation, I want to ask: what interventions do disability justice and the emergent field of crip technoscience, named first in Aimi Hamraie and Kelly Fritsch's manifesto, make into conversations about AI and the future of technology?

This is Damien. One of the things I really appreciated about Hamraie and Fritsch's "Crip Technoscience Manifesto" was its recentering of the notion of design justice: the idea of what it means to have not just a user-centered design experience but an inclusive and just one. Don't just poll the people who are going to be subject to the things you make; center them as partners in the design, and center their knowledge and experience at the outset. That is one of the fundamental things I have taken from it when I think about what needs to be done in AI, in algorithmic systems, in big data systems: not just thinking and imagining forward to what the implications might be for a group on whom these technologies are used, but actually centering those people, reaching out to those communities, and making them partners rather than people you poll for input. These systems are most likely to be used on communities of color and on disabled communities; they will be used to determine the benefits of poor people and disabled people, and to act on people made subject to the criminal justice system in the United States. All of these questions can be asked directly of the people who will have to live them, and we should take those responses, those experiences, that lived knowledge, not extractively, but by bringing those people into community and conversation, and giving them the ability to actually shape and design these tools, so they can say: if you build this this way, it will be used on me in X, Y, and Z ways.

One of the things I was thinking about earlier, Crystal, when you were talking about the training of voice recognition systems in China on the Uyghur population, was the way certain groups have been trying to build facial recognition that they claim can detect gay people. That seems, on its face, a terrible idea, and if you asked any single person in the queer community, with any LGBTQIA identity whatsoever, they would tell you it's a terrible idea. First, the idea that you can detect someone's lived, experiential identity by their face is phrenology; it has been debunked, and it needs to go back in the garbage where it belongs. Second, even the notion that you might be able to will be made use of in ways that are inherently oppressive, when there are cultures right now where being gay can get you killed. Why would you propose a tool that could make that oppressive and deadly usage easier? Anyone who lived that experience would have said right up front: don't do this; it's bad.

Or you make a tool that you say can better predict crime. Ask any Black or brown person subject to enhanced surveillance in the United States based on the color of their skin, and they'll tell you: that's going to get used on my community. You're going to predict crime more often in my community, not where crime is actually happening, because this culture, this society, has already defined my community as criminal, de facto. So that's where the system will look for the thing it's trying to predict, and it will be a self-fulfilling prophecy. That is the main intervention I've taken from the "Crip Technoscience Manifesto": using that framework in the disability justice sphere it was initially intended for, and also seeing what other marginalized identities and lived experiences it can be applied to.
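The self-fulfilling prophecy Damien describes can be shown in a few lines of code. This is a deliberately toy, deterministic sketch in the spirit of the "runaway feedback loop" analyses in the predictive-policing research literature; the neighborhoods, numbers, and update rule are all hypothetical simplifications.

```python
# Two neighborhoods with IDENTICAL true incident rates, but a small
# historical skew in recorded incidents (e.g., from past over-policing).
# Patrols go where recorded crime is highest, and only patrolled places
# generate new records, so the skew locks in and grows.
import numpy as np

true_rate = np.array([0.5, 0.5])  # identical underlying rates
recorded = np.array([5.0, 4.0])   # historical records, slightly skewed

for day in range(365):
    target = int(np.argmax(recorded))       # patrol the "high-crime" area
    recorded[target] += true_rate[target]   # you only record what you see

print(recorded)  # [187.5, 4.0]: neighborhood 0 now looks wildly more
                 # "criminal," though both neighborhoods are identical
```

The model is never contradicted by the data, because the model decides where the data gets collected; that is the self-fulfilling prophecy in mechanical form.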
This is Crystal. Oh my gosh, I'm just going to sit for a second with what Damien has said and internalize all the awesome, sparkly bits. Okay, there are two things I want to jam with. One is the broader question about crip technoscience, which I'll put on hold for a second, because Damien said something about facial recognition that I want to chew on more. This entire discussion about identifying homosexuality, this kind of new phrenology, makes me think a lot about a paper by Os Keyes and other folks about identifying autism (the exact title escapes me right now), and about a lot of really interesting work on this kind of facial recognition and AI data-tagging work on disabled communities. What they do in that paper is draw a distinction between identifying autism, identifying some identity as if one could abstract identity from facial characteristics, and labeling people with disabilities. So obviously there are technologies that try to detect this, but they also talk about technology firms and labor workforces that try to use people with disabilities to create the technology: employing autistic people, for example, to do IT work because they supposedly have a higher attention to detail. These firms come up with naturalizing, essentializing characteristics of what it means to be a "good" autistic person, and then claim autistic people are uniquely capable of doing some job, often related to technology, whether it's IT or data tagging. There's a larger conversation to be had about employment opportunities and disability, and about the way disabled folks are so marginalized in the U.S. that they have one of the highest rates of unemployment; that's a separate conversation. But there is something really interesting to me about how a lot of technology companies, particularly in China and in the U.S., essentially fetishize disabled folks and their supposed ability to do magic with technology, without centering their experience whatsoever. Something really pernicious is there, both in the way these technologies and services are marketed and in the way this labor is often framed as charity rather than as something valuable. Here I'm thinking a lot about the work of Di Wu, also a PhD candidate at MIT, who has done substantial ethnography on the way data is tagged for AI systems in China, where there are really interesting dynamics around employment and labor: who gets to work, who is "good at" working, what characteristics make a good worker, all of which go into the making of these technologies. I'd love to explore that further, both in this session and otherwise. All this to say: there's a lot of jumble here, and I'm trying to make sense of it.
This is Lydia. I have two last questions for both of you, and I'm really excited about what each of you will have to offer. One of them we hadn't actually talked about beforehand, so forgive me for springing it on you. We spend so much of our time as advocates rightfully picking apart what is wrong and harmful and oppressive, calling attention to and naming wrong, and we need to do that; we absolutely have to keep doing it. But we also have to think about, and I know we are all doing this in different ways, what the world would look like if oppression didn't exist. That idea is very much encapsulated in the title of your website, Damien: A Future Worth Thinking About. When I think about what kind of future we want to live in, what the world ought to be like, I think about what it would mean to have systems that actually worked for people instead of profit, systems that recognized humanity instead of denying personhood, systems that valued all of us in our complexity and our wholeness. So I want to ask: if we're thinking about technological design and development and the increased usage of AI technologies, what would it actually look like to build technological systems, processes, and programs that were not just about furthering justice but about sustaining it? What would a just world look like in its development of technology?

This is Crystal. What an incredible question. I will practice humility and say that I don't have the answer about sustaining justice. Here I really turn to the work of Sasha Costanza-Chock and the way they have conceptualized design justice as something that centers communities without falling into a kind of stakeholderism, as if you just need to get all the different stakeholders at the table and then you can make good decisions, as if everyone were equal, when in reality people are on such uneven playing fields because of the way power works in society. Centering communities, centering disabled people: all of that is, I think, a first step, necessary but insufficient in my mind. Maybe this is a cop-out, but I think it requires radically reimagining society, not just technology: a society built on alternative modes of thinking about value and about humans, in ways that center care. And when it comes to designing technology to get at that speculative future: I don't mean to speak for the other people here, but I think this is a life project, a journey we're still on, and I'm so happy to be on that journey with you all.

This is Damien, and I one hundred percent agree with everything Crystal just said. I absolutely believe it is a lifelong and society-wide practice and project of reframing and re-understanding what it means to be people in community together. As for how we weave that into the technologies we're trying to create: that is such a fantastic question, and it's the question I keep trying to get at, trying to dig something out of, every day in my work as well. Again, I'm very happy to be in this company doing that work.
It is a matter of understanding how we put our values into the systems we build in the first place, both technological and social, and how we accidentally, repeatedly, all day every day, put our values into those places: understanding that, so that we can do it intentionally. What that looks like in practice is asking ourselves what values we're embedding into these systems, asking what beliefs we hold while we do this work, and being critical about power imbalances. In order to have a just society, one in which the harms of the past are repaired while also ensuring some kind of equality of understanding and opportunity into the future, there will be a moment, and this goes back to your very first question, Lydia, where fairness is not enough. If fairness is understood as just giving everybody the same amount, the same access, then, as you said, Crystal, it does not account for reality, for lived experience. If I'm in a hole twenty feet deep and you're in a hole ten feet deep and everybody gets an eight-foot ladder, some of us aren't getting out of the holes we're in. We have to recognize what is actually going to repair, what is actually going to help, what is actually going to enable us to make the changes, the rebalancing, the justice that we need. That requires us to have the hard conversations, to do the difficult work, to really interrogate our values and the status and lived experiences we all currently hold.

This is Lydia. I really appreciate both of your thoughts and offerings, and I want to offer, too, that dreaming and thinking through what the future will and should be like, if it were just, is messy and unpredictable. I don't think I have those answers either, and I would hazard to say that I would be very wary of anyone who claims to have them all; I wouldn't feel comfortable collaborating with a person who claims to know exactly what we all need to be doing and exactly what things should look like. I also don't know that there will be just one way for things to be. Prescriptive approaches that presume only one way of doing things are still rooted in white-centric, colonizing modes of thought, not to mention profoundly ableist ones that assume there's only one right way for people's bodies to move and only one right way for societies to operate. I don't think that way of thinking has a place in liberation-focused political organizing and visioning.

That brings me to my last and final question. Both of you belong to directly impacted, multiply marginalized communities, and that's part of why I've been drawn to work with and be in conversation with you. I want to be in conversation with people whose experiences are similar or parallel to mine, and who are nonetheless pushing back against harmful, oppressive institutions that we often must work within anyway. What does it mean for you to undertake this work coming from a perspective of marginalization and oppression? How do you sustain yourselves in this work? In what ways does community sustain you?

This is Crystal. What does it mean to do this work? I don't mean to be a Debbie Downer all the time, but for other people who are interested in it, I think doing this work means being really sad a lot of the time and angry a lot of the time.
It means having to sit with fear, anger, sadness, and frustration, both at the systems that created all this and at the systems of academia that are not well suited to this kind of work. When it comes to advocating for diversity and inclusion, or whatever catchwords you want to use to describe AI ethics or fairness, I think there's a real commodification and capture of this kind of work, where on the one hand people talk about accountability and fairness, and on the other hand what they're really doing is optimizing a prison pipeline. So it's hard, not just in terms of actually doing the work, doing the research, but in the additional layer of doing it within ableist academia, trying to fight two battles at the same time. At the same time, what it means for me to undertake this work is being in community with people like Damien and Lydia: finding communities of care who are thinking about the same things, pushing me to think better, to expand what I think is possible and what I find interesting, to consider things I had never thought about. Having communities who share common interests and goals but bring different perspectives, who push me to think in different ways, and who also hold me and care for me when I'm deep in despair or getting harassed, having people who have my back on these things, has been really overwhelming and helpful. This particular seminar is a good example: I'm just with cool people, jamming, and that does a lot for me, both for this research and personally.

This is Damien. One hundred percent agreed, again, on the need for community: being able to work with people who understand that lived experience and its variedness, the fact that we're not all experiencing exactly the same thing, because we are all different and the overlaps and intersections of our marginalizations are all different, but there's a similarity of shape to the experience and the feeling we have when we're under that scrutiny, when we're in that space. And yes, I do unfortunately think it is about recognizing that we will be angry a lot while we do this work, because so much of it is about pointing out what's wrong, as Lydia mentioned earlier, but also, as they said, looking to understand what we want things to be like, and arguing for that, pushing for that, working together, building a community. So much of what I think of when I do this work: the comedian Wyatt Cenac wrote an article called "On Being the Only One in the Room," about his experience of being the only Black person in the comedy writers' rooms of various TV shows. And he noted that it's not always that you're the only Black person; sometimes you're the only woman, the only disabled person, the only person with an LGBTQ identity, any number of things.
But the effect is the same: you feel like you have to simultaneously bear the brunt of every single question anyone has about your lived experience or identity, and also do the work of representing the entire breadth and depth and scope of that lived experience for people who otherwise might not get it. When I do this work in groups of people who are often not disabled, not people of color, not multiply marginalized in the way so many of us in this conversation are, people who simply do not have the lived experience to draw from, so much of the work is explaining in detail something that is intimately lived, and therefore hard to conceptualize and intellectualize out, because it is like swimming, like moving and breathing. So being able to go from being the only one in the room to being in a room full of us, to building a community in which we say: it's not going to be just one of us in that room next time; next time, we're going to build that room, we're going to build that guest list. And to say: this isn't just about access and inclusion; it isn't a diversity-and-inclusion initiative; it's about equity and justice. It's not just about having a stake; it's about defining what the stakes ought to be in the first place. Doing this work means recognizing that my frustration, my sadness, and my anger are valid, but also that I have to have, and make use of, the community that both offers itself to me and that I have taken pains to build with all of you and with others trying to do this work.

Damien and Crystal, I am so appreciative of both of your work and of the chance to be in community with you. For those just tuning in: I actually only met Damien and Crystal within the past year and a half, and even in that short time I have become grateful for and deeply appreciative of what they do in our communities and in their scholarship. I appreciate your time with us, and we will now turn this time over for live Q&A with our participants. Before we do, one last, very important question: Damien, are your cats there? Where are they?

This is Damien. Unfortunately, both of my cats seem to be asleep, and the other room is far. I would bring them onto the screen if I could, but they don't seem to want to play right now. However, we do have Laika on the screen again. Hello, hello, wonderful doggo.

This is Lydia. Hello! We love doggo. She is so happy and squirmy. Thank you for sharing her with us, Crystal; she is a gift. And with that, we'll turn it over to Q&A. Thank you for joining us today.

Hey, Damien. I know that Lydia is on their way, so I guess we can manage the Q&A. For all of you who have questions: could you put them in the Q&A box? That makes it a little easier for us to manage; I was getting a little overwhelmed by the chat, and I imagine Damien was too. Maybe, Damien, if it's okay, I'll ask you the first question, and then we can riff from there. Does that sound good?

Sounds great.

Okay. The first question, from Carolyn, is: can bias be used to counter bias, and do you think that "no bias" connects to universalism rather than difference?

That is a very good, very important question.
One of the things I try to think about, when we talk about the bias question as we were discussing earlier, is whether we can use the values we have to interrogate the values that are present. There are definitely a couple of people in this chat right now who are thinking in that space. Researchers like Jonathan Flowers have proposed using these systems to highlight the bias: placing them in a running state and seeing what biases get exacerbated and what pops out, so that we can understand exactly what has been built into them in the first place. That's one way to go about it, and it gives us the space to really think about what we've built and how we built it, and, again, about what notions and values we have embedded unintentionally in these systems, so that we can do a better job of embedding values intentionally, embedding the values we want to see, on purpose, rather than just being subject to the accidents we've made so far.

As for the universalism aspect of the question, if I'm understanding it correctly, this idea that there might be a way to combine the multiple different perspectives and sets of values that people hold, making space for multiple perspectives and values rather than insisting on one right, universalizable way: I take a lot of my cue there from standpoint feminism and the idea of situated knowledges, the notion that you don't get anything like a capital-O Objectivity or a capital-T Truth. You get multiple different viewpoints on the nature of things, and somewhere, in intentionally putting those viewpoints into conversation with one another, you can get a clearer and more robust picture of the way the world is.

This is Crystal, by the way. I totally agree; that was so eloquently put. Another way I thought about "no bias" and universalism: the emphasis on trying to de-bias systems, to search for no bias, reminds me of people trying to create colorblind futures or colorblind diversity initiatives, as if at some point we could just no longer see race, and so no longer see bias in AI systems. So, yes to the question, in the sense that the idea of de-biasing to the point that there is no bias connects to that kind of universalism, and it effaces the differences that exist between us and that become embedded within AI systems.
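The probing Damien attributes to researchers like Jonathan Flowers, running a trained system and seeing what associations pop out, has a simple, well-known form in the embedding-association literature (for example, the Word Embedding Association Test). Here is a minimal sketch; the tiny hand-written vectors are hypothetical stand-ins for embeddings learned from a real corpus.

```python
# Probe a set of word vectors for a gendered association, WEAT-style.
# In practice you would load embeddings trained on a large corpus;
# these 4-d vectors are hypothetical stand-ins for illustration.
import numpy as np

def cosine(u, v):
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

vecs = {
    "doctor": np.array([0.9, 0.1, 0.3, 0.0]),
    "nurse":  np.array([0.2, 0.9, 0.3, 0.0]),
    "he":     np.array([1.0, 0.0, 0.2, 0.1]),
    "she":    np.array([0.0, 1.0, 0.2, 0.1]),
}

for word in ("doctor", "nurse"):
    assoc = cosine(vecs[word], vecs["he"]) - cosine(vecs[word], vecs["she"])
    print(f"{word}: he-vs-she association = {assoc:+.3f}")
# A consistently signed score surfaces an association the training data
# baked in, making the embedded "values" visible for inspection.
```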
So, I don't know how you want to do this, Damien; I can keep asking questions.

No, I was going to suggest we switch off, and I was going to ask you the next one.

Sure, great. We have one from Larissa Mann: how are people thinking about design justice, or the use of technology, in ways that also account for the people making the resources, and for environmental impact? How, especially, do we escape the supposed trade-offs assumed to be forced on technology users versus technology makers, or, say, coltan miners?

Oh man, what a question. I'm going to have to marinate on the first part a little, so I'll pause on it. But as to the second part, about assumed trade-offs between technology users and makers: one way to break this down is to say that it sets up a false division between disabled users of technology and theoretically able-bodied makers, when in reality there are disabled makers and users at the same time. So I really want to collapse these categories and talk about how users and makers can often be the same person. And on the other part of the question, design justice with respect to resources and environmental impact: thinking about ecologies of people and the environments around them is key to design justice. Environmental degradation affects different people in different ways, and being attuned to that as part of people's experience is a really important component of design justice. I would love to hear anything you have to add, Damien.

Yeah, you're absolutely right. Many years ago, when I was teaching intro to philosophy, we would have conversations about technological ethics, about the realms of impact and the unintended consequences of certain things in the world, and coltan mining was one of the key and crucial examples. We would talk about the fact that the processes which make for convenience in the world, at scale, are also the processes which outsizedly affect already marginalized and oppressed populations. So we would ask how to mitigate and balance that outsized, disproportionate effect, and that's the crux of it: we have to consider it at the outset, as we've said this whole time. It can't be an after-the-fact consideration. It has to be something we think about and try to understand, and if we realize that we ourselves don't have the scope and frame of reference to understand it properly, we bring in those communities directly, and we get into the habit of doing that first and foremost. We bring in the communities who will be most affected, and we try to envision a way that doesn't have these trade-offs, that doesn't have to be a zero-sum calculation in which somebody has to lose for somebody else to win. That isn't necessarily true; it's simply the nature of the system we have put ourselves into and then reinscribed and reinforced over and over, such that we now think of it as the only possible way forward. There are other, non-zero-sum ways of designing, of achieving justice, of building systems of interaction and community in society, and envisioning and aiming for those at the outset is crucial if we want something like justice.

This is Crystal. Thanks so much for that, Damien. I have a follow-up question that connects directly to the point you just made. It comes from Hani Auni, if I'm pronouncing that correctly, and they ask: how have approaches to technology differed in spaces that are currently trying to build equitable societies that challenge colonialist and racialized capitalist systems? The examples they bring up are the Zapatistas and cooperative networks in Humboldt and Jackson.

Yeah. We can see examples of groups who take on projects that are more community-focused, projects that adapt or re-appropriate systems that exist or that have been imposed upon them.
There's a researcher at Virginia Tech, Fabian Prieto-Ñañez, whose work looks at how even things like pirate stations in Central and South America have a kind of anti-colonialist or decolonial vector, thinking about the ways this kind of self-directed drive toward the use and development of technologies is distinct from what is otherwise imposed on communities by state media. So you can think in terms of these tools, and I think this gets to a couple of the other questions in the Q&A as well: there are ways to push back against the existing order, the status quo, ways to resist it, while still doing technology that is recognizable to the rest of the world as technological innovation, where Innovation carries that capital I and is somehow seen as an inherent good. If you bring those communities in at the outset, you can find these examples of people developing tools, doing work, that push back against imposition in exactly that way.

This is Crystal. My follow-up answer, which I think also gets at the question Skyler Whittaker is asking about what a global perspective on design justice would look like, so I'll answer both together: I can think of two really good works that point toward answers. The first is Rodrigo Ochigame's work on informatics in Latin America (the name is spelled O-c-h-i-g-a-m-e); he chronicles a kind of alternate history of cybernetics and computing in Latin America, specifically Cuba. The other is Lilly Irani's (that's I-r-a-n-i) work on innovation in India. When it comes to rethinking what a global perspective would look like, part one is redefining what innovation is, and the metrics by which we judge something successful: instead of centering economic value and profit, centering care and community, for example. That's where I would start; I have other examples in my head that will hopefully pop up later. Damien, do you want to ask another question?

Yes. There's a question here, one of the first that came in, asked while we were still in the middle of the opening discussion, from Carolyn Ward: thinking of Édouard Glissant's concept of opacity, and the idea that transparency is not always a good thing and can be used to serve whiteness, I wonder: is it better to be recognized by facial recognition technology, or not to be recognized?

This is Crystal. I will toot my own horn and say that I've written a piece about this that was published in Slate, so I can recapitulate some of the main points here. Essentially, the problem with facial recognition, and with thinking about ethics and technology generally, is that you often face a double bind, and Carolyn puts it very eloquently. To be recognized often means that this kind of technology will incarcerate you, and if more and more people are entered into the data set, to de-bias it or to increase diversity, it often just means the system becomes more efficient at incarcerating people.
We can talk about facial recognition as something that is imperfect, which is why a lot of the problems around mass incarceration arise with it, but I would argue that even if the system were perfect, it would still be a harmful system, and there are lots of reasons we could go into for that. We can debate the technical question of whether perfection is possible, or even desirable, but as we discussed earlier: first, it will never be perfect, and second, even if it were, it would just make current structures of oppression faster and better.

That's exactly it. When I think about that question, I think about Simone Browne's work in Dark Matters, and about the question of what it is to be made visible, to be made legible, by these systems. As you point out, it's just going to make the oppression faster and worse. I think it was 2016: Clare Garvie, Alvaro Bedoya, and a couple of other people at Georgetown released a big report called The Perpetual Line-Up, and one of its main findings was that facial recognition technology is deployed most often on the communities it sees least well. It is deployed most often on Black and brown communities, as I noted, and it sees them less well, so there is this constant recapitulation of error. But the follow-on understanding is not "make it better so you can continue to deploy it against them." Even if it were more carefully refined, the values from which it is deployed, and the purposes to which it is put, would still be those oppressive, racialized purposes, in which predictive policing and facial recognition are more likely to be used on communities already deemed more likely to be criminal, which in Western society, and specifically in the United States of America, means overwhelmingly Black and brown communities. This society has a long, chronicled history of tying criminality and notions of criminality to skin tone, to ethnicity, to notions of race, and of binding those things together so as to make it easier to exclude those populations from full participation in civic society, in everything from policing to housing applications. So no, making facial recognition more accurate doesn't make it better. And to the question of whether it is better to be seen less well or to be seen more clearly: until the technology is put to purposes, and deployed from a value set, that doesn't reinforce and reinscribe those values over and over again, neither is good.
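The Georgetown finding Damien summarizes, that systems are deployed most heavily on the communities they see least well, is exactly what disaggregated evaluation is designed to expose: never report one overall accuracy number; break the error rates out by group. A minimal sketch follows, with entirely hypothetical labels and predictions.

```python
# Disaggregated evaluation: report accuracy and false-match rates
# per group instead of one overall number. Data here is hypothetical.
import numpy as np

groups = np.array(["A"] * 6 + ["B"] * 6)
y_true = np.array([1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0])
y_pred = np.array([1, 0, 1, 0, 1, 0, 0, 1, 1, 1, 0, 0])  # B misread more

for g in ("A", "B"):
    mask = groups == g
    accuracy = np.mean(y_true[mask] == y_pred[mask])
    false_match = np.mean(y_pred[mask][y_true[mask] == 0] == 1)
    print(f"group {g}: accuracy={accuracy:.0%}, false-match rate={false_match:.0%}")
# Overall accuracy would be 67%, hiding group B's 33% accuracy and
# 67% false-match rate -- the pattern such audits make visible.
```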
This is Crystal. Yeah: neither is good. What is the future? These are all million-dollar questions. The next question, from Daniel, is a really important one: where is assistive technology in this discussion?

This is Damien. There are a number of really great people doing interesting work on how assistive technology fits into all of this. Alice Wong, Jaipreet Virdi, and Ashley Shew are all looking at assistive technologies and at disabled users of technology that isn't necessarily deemed assistive at the outset but is seen as assistive within the community. These questions are crucial, because they get at who is doing the designing of these technologies and, again, who is deploying them and directing their use, because very often technology designated as assistive tech isn't designed by disabled users; it isn't designed by people who have the close familiarity...
