Ruha Benjamin on "The New Jim Code: Race, Carceral Technoscience, and Liberatory Imagination"


[Morgan Ames:] All right, thank you all so much for coming out. I am so thrilled that so many of you have come to hear Ruha speak today. My name is Morgan Ames; I'm the interim associate director of research at the Center for Science, Technology, Medicine & Society, one of the co-sponsors of this event. I am going to be introducing the introducer: Denise Herd will come up and say a few words about our sponsors and also about Ruha, who we are very happy to have join us today. Thank you so much.

[Denise Herd:] Well, good afternoon, and welcome. It's my great pleasure to be here and to be able to introduce our special guest speaker for today's talk, Professor Ruha Benjamin. Dr. Benjamin is an associate professor of African American studies at Princeton University. She's the founder of the JUST DATA Lab and the author of two path-breaking books. The first, People's Science: Bodies and Rights on the Stem Cell Frontier, came out in 2013, and her new book, which just came out (I hope you got your copy), is Race After Technology: Abolitionist Tools for the New Jim Code. She also has a number of other distinguished publications. Ruha's work investigates the social dimensions of science, medicine, and technology, with a focus on the relationship between innovation and inequity, health and justice, knowledge and power. She is the recipient of numerous awards and fellowships, including from the American Council of Learned Societies, the National Science Foundation, and the Institute for Advanced Study, as well as the President's Award for Distinguished Teaching at Princeton University.

I also had the pleasure of getting to know Ruha when she worked with me as a graduate student here at UC Berkeley, and I can personally attest to her creativity and brilliance. She worked with me on a project on the cultural and health dimensions of images of alcohol, drugs, and violence in rap music, so she has always been at the forefront of what's happening in culture, especially around African American culture.

I think Ruha's work is critically important for facing the challenges of creating social justice and furthering human rights in our current age of hyper-incarceration, as we as a nation, and actually as a planet, look increasingly to technology, to big data and algorithms, to solve health and social problems. As a professor in the School of Public Health, and as a person on this campus, I can tell you that there is a kind of romance with big data and with technology as something that's going to solve all of our problems, and we need the equity lens on those kinds of developments.

Also, as some of you might be aware, in this year of 2019 UC Berkeley is acknowledging

400 years of African American resistance to enslavement and oppression since the first enslaved Africans were brought to the American colonies. After over 300 years of slavery and legal segregation, or Jim Crow, or what Saidiya Hartman called living in the afterlife of slavery, the US has now entered into the period of the new Jim Crow, through the mass incarceration of African Americans and other marginalized peoples, as described by Michelle Alexander. And so, as we enter this age of incarceration, the work Ruha is doing, in pointing to a New Jim Code, is so important; abolitionist work is increasingly important and has a special place in the struggle for justice and human rights.

So, before I welcome Ruha to the stage to give what I know will be a really fascinating, engaging, and thoughtful presentation, I'd like to take a moment to thank our sponsors: the Center for Science, Technology, Medicine & Society; the Haas Institute for a Fair and Inclusive Society, of which I'm the associate director; the CITRIS Policy Lab; and the Algorithmic Fairness and Opacity Working Group. I'd also like to thank all the volunteers and staff working here with the various co-sponsors, as well as here at the auditorium, and I'd especially like to thank all of you for joining us today for this event. And so now I'm delighted to welcome Professor Ruha Benjamin to the stage, who will speak on "The New Jim Code: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life."

[Ruha Benjamin:] Good afternoon. It's good to be here. I graduated in 2008, so I get to see so many familiar faces: friends, colleagues, professors. Oftentimes when I'm back on campus I get a little rash, or itch, because I think there's a deadline that I'm missing; that's what it is to be back in the place where you spent all those years as a student. But it's lovely to be back. I really want to thank the folks behind the scenes, many of whom I probably don't know, but in particular Takiya Franklin, for making this possible; Morgan Ames, also, for inviting me; and my wonderful mentor, Professor Denise Herd. Thank you so much for having me. And thank you to all the co-sponsors. I often think of it as a mark of what's going to be a generative discussion when units from around campus that don't necessarily work together on a regular basis join together to co-sponsor something like this, so I'm excited to be here.

Please join me in acknowledging that the land on which we gather is the traditional and unceded territory of the Ohlone people. Let's also acknowledge the intertwined legacies of the devastation of the transatlantic slave trade and settler colonialism, which contribute to the creation and continued wealth of this university and of the nation-state itself. We acknowledge the reparations owed to black and indigenous communities and nations, and the impossibilities of return for generations past. Let's also acknowledge the ancestors in the room this afternoon as we fight together for better futures. We are alive in an era of awakening and mobilization to preserve this planet and all the beautiful creation that is no doubt worthy of the struggle. Ashé.

With that, let me begin with a recent experience I had, being a nosy sociologist, walking by two men in Newark International Airport, when I overheard one say to the other: "I just
Want someone I can push around I. Didn't. Stick around to hear the end of the sentence, but I could imagine all types of endings, it. Could be in the context. Of looking through, resumes, deciding. Who to hire I just, want someone to push around at work or, in, the context, of dating, or marriage, I just want someone I can push around in my personal, life the.

The desire to exercise power over others is a dominant mode of power that has been given new license to assert itself: the kind of power that requires others to be subordinate, though we should remember this is not the only mode or theory of power.

At the time, I was traveling to speak with students at Harvey Mudd College about issues of technology and power, and so when I overheard this conversation I thought about this advertisement from a 1957 Mechanix Illustrated: the robots are coming, and when they do, you'll command a host of push-button servants. And then it says: "In 1863, Abe Lincoln freed the slaves. But by 1965, slavery will be back. We'll all have personal slaves again. Don't be alarmed; we mean robot slaves."

There is so much going on in this one little paragraph that we could spend an hour, I'm sure, close reading and talking about it, but for the sake of time I'll just point out two things. One is the date: 1957, a time when those who were pushed around in the domestic sphere, wives, domestic servants, and others, could no longer be counted on to dress you, comb your hair, and serve you meals "in a jiffy," as the ad says. During World War II, many more white women entered the workforce to take up jobs formerly occupied by men who left to fight the war, and black men and women, most of whom had worked in agricultural and domestic work, also entered the manufacturing workforce. Hence the desire to replace those sources of free and cheap labor in the home with push-button robots. The point is, no technology is preordained; rather, the broader context makes some inventions appear desirable and inevitable.

Perhaps even more telling is that we will all have personal slaves "again." That one little word tells us something about the targeted audience of the ad: certainly not those who are the descendants of the people enslaved the first time. The imagined user is gendered, raced, and classed without gender, race, or class ever being mentioned. Coded words, in this case, encode interlocking systems of inequality as part of the design process. Precisely by ignoring social inequalities, tech designers will almost certainly reproduce them. True in 1957; true today.

With that, let me offer three provocations as a kind of trailer for the talk. This way, if you have to leave early, or your phone starts buzzing, or you get distracted or bored, you'll know exactly what I want you to know.

First: racism is productive. Not in the sense of being good, but in the literal capacity of racism to produce things of value to some, even as it wreaks havoc on others. We're taught to think of racism as an aberration, a glitch, an accident, an isolated incident, a bad apple, in the backwoods, and outdated, rather than innovative, systemic, diffuse, attached to the entire orchard, in the ivory tower, forward-looking, productive. In sociology we like to say race is socially constructed, but we often fail to state the corollary: that racism constructs.

Secondly, I'd like us to think about the way that race and technology shape one another. More and more people are accustomed to thinking about the ethical and social impact of technology, but this is only half of the story. Social norms, values, and structures all exist prior to any tech development, so it's not simply about the impact of technology but about the social inputs that make some inventions appear inevitable and desirable. Which leads to a third provocation:
that imagination is a contested field of action, not an ephemeral afterthought that we have the luxury to dismiss or romanticize, but a resource, a battleground, an input and an output of technology and social order.

In fact, we should acknowledge that most people are forced to live inside someone else's imagination, and one of the things we have to come to grips with is how the nightmares that many people are forced to endure are the underside of elite fantasies about efficiency, profit, and social control. Racism, among other axes of domination, helps produce this fragmented imagination: misery for some, monopolies for others. This means that for those of us who want to construct a different social reality, one grounded in justice and joy, we can't only critique the underside; we also have to wrestle with the deep investments, the desire, even, for social domination. I just want someone I can push around.

So that's the trailer. Let's turn to some specifics, beginning with a relatively new app called Citizen, which sends you real-time crime alerts based on a curated selection of 911 calls. It also offers a way for users to report, livestream, and comment on purported crimes via the app, and it shows you incidents as red dots on a map so you can avoid particular areas, which makes it a slightly less racialized version of apps called Ghetto Tracker and SketchFactor, which use public data to help people avoid supposedly dangerous neighborhoods. Now, you're probably thinking: what could possibly go wrong?

In the age of Barbecue Becky calling the police on black people cooking, walking, breathing out of place, it turns out that even a Stanford-educated environmental scientist living in the Bay Area is an ambassador of the carceral state, calling the police on a cookout at Lake Merritt. To paraphrase Claudia Rankine, the most dangerous place for black people is in white people's imagination. It's worth noting, too, that the app Citizen was originally called by the less chill name Vigilante.

And then, in its rebranding, it also moved away from encouraging people to stop crime toward simply avoiding it. As one member of the New York City Council put it, crime is now at historic lows in the city, but because residents are constantly being bombarded with push notifications about crime, they believe the city is going to hell in a handbasket. Not only is this categorically false; it distracts people from very real public safety issues, like reckless driving or rising opioid use, that don't show up on the app. What's most important to our discussion is that Citizen and other tech fixes for social problems are not simply about technology's impact on society, but also about how social norms and structures shape what tools are imagined necessary in the first place.

This dynamic is what I take up in two new books: the first examining the interplay between race, automation, and machine bias as an extension of older forms of racial domination; the second an edited volume on the carceral dimensions of technology across a wide array of social arenas, from more traditional sites like policing and prisons to less obvious contexts like the retail industry and the digital service economy. As just one example from this volume, a chapter by Madison Van Oort draws on her ethnography of worker surveillance in the retail industry, where the same companies pitching products for policing and imprisonment to the Department of Corrections are also pitching them to H&M and Forever 21 to track employees. And even as she shows how workers are surveilled well beyond the confines of their workplaces, including even their online activity, Van Oort also highlights how her co-workers use technology in ways that counter the experience of alienated labor, what we might call duplicity at work.

On this point I'd like to pause for a minute and turn to science fiction, as part of expanding our sociological imagination. The clip I'm going to show you is from the film Sleep Dealer by Alex Rivera, and it reveals how global capitalism is ever ready to turn racialized populations into automatons: Mexicans not as migrant workers, but as machines that work in the US without setting foot in this country. You'll see the character as he crosses the border for the first time. He's in Mexico, but he comes to America in a new way.

[Clip from Sleep Dealer plays, in Spanish.]

Okay. So in this world, migrant workers are replaced by robots who are controlled virtually by laborers in Mexico, carrying out a variety of jobs in construction, childcare, agriculture, and more. Not only is the tech invasive, as we see, but it also allows unprecedented surveillance: if a worker falls asleep for an instant, the computer wakes her up, registers the lapse, and docks her pay.

Amazon warehouses on steroids. Of course, over the course of the film, Memo Cruz starts working at one such factory, which are called sleep dealers because workers often collapse of exhaustion when they're plugged into the network too long. In this way the film reminds us how the fantasy of some is the nightmare of others, and that embodiment does not magically cease to matter with automation but can actually become more intensified, intrusive, and violent. It's worth recalling that the etymology of the word robot is drawn from the Czech robota, which means servitude, hardship; and as anthropologist Kathleen Richardson observes, robots have historically been a way to talk about dehumanization.

Sleep Dealer also brings to life an idea that inspired the title of the volume: that technology captivates, fascinating, charming, and bewitching, while potentially subduing and subjugating. To engage this tension, we have to pierce through the rhetoric and marketing of tech utopianism as we try to understand the duplicity of tech fixes: purported solutions that can nevertheless reinforce, and even deepen, existing hierarchies.

In terms of popular discourse, what got me interested in this tension was the proliferation of headlines and hot takes about so-called racist robots. A first wave of stories seemed shocked at the prospect that, in Langdon Winner's terms, artifacts have politics. A second wave seemed less surprised: well, of course technology inherits its creators' biases. And now, I think, we've entered a phase of attempts to override or address the default settings of racist robots, for better or worse. One of the challenges we face is how to meaningfully differentiate technologies that are used to differentiate us.

Take, for example, what we might call an old-school targeted ad from the mid twentieth century. In this case, a housing developer used this flyer to entice white families to purchase a home in the Leimert Park neighborhood of Los Angeles, which is where my grandparents eventually "infiltrated," which was the language used at the time. But at this point in the story, the developers are trying to entice white buyers by promising them, quote, "beneficial restrictions." These were racial covenants that restricted someone from selling their property to black people and other unwanted groups.

Then Comes a civil rights movement, the Fair Housing Act, of 1968. Which. Sought to protect people from discrimination when. Renting or buying a home but, did it, today. Companies, that lease or sell housing, or jobs can, target their ads to particular groups without people even knowing they're being excluded or preyed upon and, as, Pro Publica investigators. Have shown these discriminatory, ads are often approved within minutes of being submitted, despite, Facebook's, official, policy, though, it's worth noting, that in just the last month advocacy. Groups have brought the first civil, rights lawsuit, against housing companies, for, discriminating, against older people using Facebook's, targeted, ad system, and so, in reflecting, on the connection, between the, past and present this. Combination. Of coded bias and imagined, objectivity, is what I term the new gym code, innovation. That enables, social, containment. While, appearing, fair, than discriminatory. Practices, of a previous era this. Riff off of michelle alexander's analysis, in the new Jim Crow considers. How, the reproduction. Of racist forms of social control and successive. Institutional. Forms, entails. A crucial, socio technical component, that not only hides the nature of domination, but, allows it to penetrate every, facet of social, life under. The guise of progress, this. Formulation, as I highlight here is directly related to a number of other cousin, concepts, by Brown Broussard. Daniels. Eubanks. Noble, and others, situated. In a hybrid literature, that I think of as race critical, code studies, this. Approach is not only concerned with the impacts of Technology, but its production and particularly. How race and racism enter the process, two. Works that I'll just illustrate are, Sofia, Nobles algorithms. Of oppression, in which she, argues, that racist, and sexist to Google search results, like pornographic. Images returned, when you type in the phrase black girls grow, out of a corporate, logic, of either willful, neglect or, a profit, imperative, that makes money from racism, and sexism in, a different vein Simone Brown examines, how the history, of surveillance, technologies, reflect. And reproduce distorted, notions, of blackness. Explaining. That quote surveillance. Is nothing new to black folks from. Slave ships and slave patrols, to airport security checkpoints and stop and frisk call a policing practices, she. Points to the fact isset II of surveillance, in black life, challenging. A techno deterministic. Approach she argues, that, instead of seeing surveillance, is something inaugurated. By new technologies. To see, it as ongoing is, to insist that we factor, in how racism, and anti blackness. Undergird. And sustained the, intersecting, surveillances. Of our present order and so. To continue, examining, how anti blackness gets encoded in an exercise through, automated systems, I consider. For, conceptual, offspring, of the new Jim Code that fall along a kind of spectrum, engineered. Inequity, names those technologies. That explicitly. Seek to amplify social, cleavages, there what we might think of is the most obvious, less hidden dimension, of the new jim code. Default. Discrimination. Are those inventions that tend to ignore social, cleavages, and as such tend, to reproduce the default settings, of race class gender, disability. Among other axes, here. I want to highlight how indifference. To social reality, is a powerful, force, that, is perhaps more, dangerous, than malicious, intent. Coded. 
And so, in reflecting on the connection between the past and present, this combination of coded bias and imagined objectivity is what I term the New Jim Code: innovation that enables social containment while appearing fairer than the discriminatory practices of a previous era. This riff off of Michelle Alexander's analysis in The New Jim Crow considers how the reproduction of racist forms of social control in successive institutional forms entails a crucial sociotechnical component that not only hides the nature of domination but allows it to penetrate every facet of social life under the guise of progress. This formulation, as I highlight here, is directly related to a number of cousin concepts by Browne, Broussard, Daniels, Eubanks, Noble, and others, situated in a hybrid literature that I think of as race critical code studies. This approach is concerned not only with the impacts of technology but with its production, and particularly with how race and racism enter the process.

Two works that illustrate this: Safiya Noble's Algorithms of Oppression, in which she argues that racist and sexist Google search results, like the pornographic images once returned when you typed in the phrase "black girls," grow out of a corporate logic of either willful neglect or a profit imperative that makes money from racism and sexism. In a different vein, Simone Browne examines how the history of surveillance technologies reflects and reproduces distorted notions of blackness, explaining that, quote, "surveillance is nothing new to black folks." From slave ships and slave patrols to airport security checkpoints and stop-and-frisk policing practices, she points to the facticity of surveillance in black life. Challenging a techno-deterministic approach, she argues that instead of seeing surveillance as something inaugurated by new technologies, to see it as ongoing is to insist that we factor in how racism and anti-blackness undergird and sustain the intersecting surveillances of our present order.

And so, to continue examining how anti-blackness gets encoded in and exercised through automated systems, I consider four conceptual offspring of the New Jim Code that fall along a kind of spectrum. Engineered inequity names those technologies that explicitly seek to amplify social cleavages; they're what we might think of as the most obvious, least hidden dimension of the New Jim Code. Default discrimination names those inventions that tend to ignore social cleavages and, as such, tend to reproduce the default settings of race, class, gender, and disability, among other axes; here I want to highlight how indifference to social reality is a powerful force, perhaps more dangerous than malicious intent. Coded exposure highlights the underside of tech inclusion: how the invisibility or technological distortion of those who are racialized is connected to their hyper-visibility within systems of surveillance. And finally, techno-benevolence names those designs that claim to address bias of various sorts but may still manage to reproduce or deepen discrimination, in part because of the narrow way in which fairness is defined and operationalized.

For the sake of time, I'm just going to sketch the last three with examples. Default discrimination includes those technologies that reinforce inequities precisely because tech designers failed to seriously attend to the social context of their work. Take, for example, the carceral tools that underpin the US prison industry as a key feature of the New Jim Code: at every stage of the process, from policing, sentencing, and imprisonment to parole, automated decision systems are being adopted. A recent study by investigators, again at ProPublica, which many of you are probably familiar with, examined the risk scores used to predict whether individuals were likely to commit another offense once paroled. They found that the scores, which were assigned to thousands of people arrested in Broward County, Florida, were remarkably unreliable in forecasting violent crime, and they uncovered significant racial disparities in the inaccuracies, the outputs of the algorithm, shall we say.
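A minimal sketch, with fabricated data, of the kind of error audit ProPublica ran; nothing here is the actual COMPAS model or its inputs. The classifier below is never given race, only an input already shaped by unequal policing, which is exactly the dynamic taken up next:

```python
# Fabricated illustration, not the actual COMPAS model: the classifier never
# sees race, yet its mistakes concentrate on the more heavily policed group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000
group = rng.integers(0, 2, n)          # stand-in for race; withheld from the model
reoffend = rng.binomial(1, 0.3, n)     # true behavior: identical base rates

# The training label is re-arrest, which requires getting caught; group 1 is
# policed far more heavily, so its offenses are recorded much more often.
caught = rng.binomial(1, np.where(group == 1, 0.9, 0.4))
rearrest = reoffend * caught

# A "race-blind" input, prior arrests, encodes the same policing disparity.
priors = rng.poisson(1 + 2 * group).reshape(-1, 1)

model = LogisticRegression().fit(priors, rearrest)
flagged = model.predict_proba(priors)[:, 1] > 0.2  # "high risk" tier

# The audit question: among people who never reoffend, who gets flagged?
for g in (0, 1):
    innocent = (group == g) & (reoffend == 0)
    print(f"group {g}: false positive rate = {flagged[innocent].mean():.2f}")
```

Both groups reoffend at the same rate in this toy world, yet the false positives pile up on group 1, because the training label and the input both encode the policing disparity rather than the underlying behavior.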

What's also concerning, I think, is how the system reinforces and hides racial domination by ignoring all the ways that racism shapes the inputs. For example, the surveys given to prospective parolees to determine how likely they are to recidivate include questions about their criminal history, education, employment history, financial history, and neighborhood characteristics, among many other factors. All of these variables have been structured in one way or another by racial domination, from job market discrimination to ghettoization. The survey measures the extent to which an individual's life has been impacted by structural racism without ever asking an individual's race. Colorblind codes may on the surface appear better than a biased judge or prosecutor, but crime prediction is better understood as crime production, because those who are making these forecasts are also the ones who are making it rain.

Coded exposure, in turn, names the tension between the ongoing surveillance of racialized populations and calls for digital recognition and inclusion: the desire to literally be seen by technology. But inclusion in harmful systems is no straightforward good. Instead, photographic exposures enable other forms of exposure, and thus serve as a touchstone for considering how the act of viewing something or someone may put the object of vision at risk, a form of sculpted vulnerability central to the experience of being racialized. What I'd like to underscore is that it's not only in the process of being out of sight, but also in the danger of being too centered, that racialized groups are made vulnerable. In Alondra Nelson's terms, this is a dialectic of neglect and surveillance at work, so that being included is not simply positive recognition but can be a form of unwanted exposure. But not without creative resistance, as I'll come back to in just a minute. But first, one more brief interlude.

[Clip from Better Off Ted plays:]

"One other thing: Lem mentioned that there's something weird going on with the motion sensors in the lab." "Oh yeah, we replaced all the sensors in the building with a new state-of-the-art system that's going to save money. It works by detecting light reflected off the skin." "Well, Lem says it doesn't work at all." "Lem's wrong. It does work, although there is a problem: it doesn't seem to see black people." "The system doesn't see black people?" "Mm-hmm." "Weird, huh?" "That's more than weird, Veronica. That's basically, well, racist." "The company's position is that it's actually the opposite of racist, because it's not targeting black people; it's just ignoring them. They insist the worst people can call it is indifferent." ... "Wait, it's like I'm in an elevator: of course the white guy's gonna get off." ... "Oh God, this looks way too aggressive." "No, it's okay. I think I know why you're all here. Well, most of you." "Um, I have something prepared. Um. Veronica, you are a terrific boss." "Thank you, Lem, I'll take it from here. Let me start by apologizing, on behalf of Veridian, for this inexcusable situation." ... "I know Veronica pretty well; I figured it was my only shot, so I took the gloves off." "Sounds great, Lem. Sounds like you gave the company a really strong message." "Oh yeah, she said they're working 24/7 to make things right." ... "Can you believe this?" "I know. Isn't it great? We all get our own free white guys." "You like it?"
"Yeah, he's the best: he anticipates everything I need. Plus, you picked up my dry cleaning."

"Mm-hmm." "Well, my white guy sucks." "Maybe you're just not using yours right." "Yeah, maybe it's on you, dude." "Shut up, stupid white guy." ... "It turned out Lem had also been thinking about the money issue, and he'd put together some interesting numbers to show us. And then we all went to speak to management in the language they could understand." "Within a margin of error of plus or minus one percent, if the company keeps hiring white people to follow black people to follow white people to follow black people, then by Thursday, June 27, 2013, every person on Earth will be working for us." "And we don't have the parking for that."

[Clip ends.]

All right. So the show brilliantly depicts how a superficial corporate diversity ethos, the prioritization of efficiency over equity, and the default whiteness of tech development work together to ensure that innovation literally produces containment. The fact that black employees are unable to use the elevators, doors, or water fountains, or turn the lights on, is treated as a minor inconvenience in service to a greater good. This is the invisibilizing side of the process that Nelson describes as the surveillance and neglect that characterize black life vis-a-vis science and technology.

Finally, some of the most interesting developments, I think, are those we can think of as techno-benevolence: designs that aim to address bias in various ways. Take, for example, new AI techniques for vetting job applicants. A company called HireVue aims to reduce unconscious bias and promote diversity in the workplace by using an AI-powered program that analyzes recorded interviews of prospective employees. It uses thousands of data points, including verbal and nonverbal cues like facial expression, posture, and vocal tone, and compares job seekers' scores to those of existing top-performing employees to decide whom to flag as a desirable hire and whom to reject. The sheer size of many applicant pools, and the amount of time and money that companies pour into recruitment, is astronomical; companies like HireVue can narrow the eligible pool at a fraction of the time and cost, and hundreds of companies, including Goldman Sachs, Hilton, Unilever, the Red Sox, the Atlanta public school system, and more, have signed on. Another value-add, according to HireVue, is that there's a lot a human interviewer misses that AI can keep track of, to make, quote, "data-driven talent decisions." After all, the problem of employment discrimination is widespread and well documented, so, the logic goes, wouldn't this be even more reason to outsource decisions to AI?

Well, consider a study by a Princeton team of computer scientists, which examined whether a popular algorithm trained on human writing online would exhibit the same racially biased tendencies that psychologists have documented among humans. In particular, they found that the algorithm associated white-sounding names with pleasant words and black-sounding names with unpleasant ones, which should sound familiar to those who know the classic audit study by Bertrand and Mullainathan; this work builds on that study to consider whether AI would do better than us. So too with gender-coded words and names, as Amazon learned last year when its hiring algorithm was found to be discriminating against women. Nevertheless, it should be clear why technical fixes that claim to bypass human biases are so desirable: if only there were a way to slay centuries of racist and sexist demons with a social justice bot.
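The Princeton team's method is known as the word embedding association test. Here is a toy version of the arithmetic, with tiny hand-made vectors standing in for embeddings trained on web text, and names drawn from the Bertrand and Mullainathan audit study mentioned above:

```python
# Toy version of the word-embedding association test (WEAT). Real audits use
# embeddings trained on web text; these hand-made 2-d vectors only stand in
# to show the arithmetic. Names come from Bertrand and Mullainathan's study.
import numpy as np

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

emb = {  # the direction of each vector stands in for learned associations
    "Emily":      np.array([0.9, 0.1]),
    "Greg":       np.array([0.8, 0.2]),
    "Lakisha":    np.array([0.1, 0.9]),
    "Jamal":      np.array([0.2, 0.8]),
    "pleasant":   np.array([1.0, 0.0]),
    "unpleasant": np.array([0.0, 1.0]),
}

def association(word):
    """Positive means closer to 'pleasant'; negative, closer to 'unpleasant'."""
    return cos(emb[word], emb["pleasant"]) - cos(emb[word], emb["unpleasant"])

for name in ("Emily", "Greg", "Lakisha", "Jamal"):
    print(f"{name:10s} {association(name):+.2f}")
# With real embeddings, this same arithmetic surfaces the bias the study
# found: white-sounding names score toward 'pleasant', black-sounding away.
```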

Beyond desirable: more like magical. Magical for employers, perhaps, looking to streamline the grueling work of recruitment, but a curse for many job seekers. Whereas proponents describe a very human-like interaction, those on the hunt for jobs recount a different experience. Applicants are frustrated not only by the lack of human contact but also because they have no idea how they're being evaluated and why they're repeatedly rejected. One job seeker described questioning every small movement and micro-expression, and feeling a heightened sense of worthlessness because, quote, "the company couldn't even assign a person for a few minutes." And as this headline puts it, your next interview could be with a racist robot, bringing us back to the problem space we started with.

Though it's worth noting that some job seekers are already developing ways to subvert the system, by trading answers to employers' tests and creating fake applications as informal audits of their own. In fact, one HR employee for a major company recommends slipping the words "Oxford" or "Cambridge" into your CV in invisible white text to pass the automated screening. In terms of a more collective response, a federation of trade unions called UNI Global has developed a charter of digital rights for workers, touching on automated and AI-based decisions, to be included in bargaining agreements. One of the most heartening developments, to me, is that tech workers themselves have increasingly been speaking out against the most egregious forms of corporate collusion with state-sanctioned racism; if you're interested, just check out the #TechWontBuildIt and #NoTechForICE campaigns to get a glimpse of some of this work. And as an article published by Science for the People reminds us, contrary to popular narratives, organizing among technical workers has a vibrant history.

Engineers and technicians in the '60s and '70s fought professionalism, individualism, and reformism to contribute to radical labor organizing. The current tech workers' movement, which includes students across our many institutions, can draw from past organizers' experiences in learning to navigate the contradictions and complexities of organizing in tech today, which includes building solidarity across class and race. For example, when the predominantly East African Amazon workers at the company's Minnesota warehouse organized a strike on Prime Day to demand better work conditions, engineers from Seattle came out to support them.

In terms of civil society, initiatives like Data for Black Lives and the Detroit Community Technology Project offer an even more expansive approach. The former brings together people working across a number of agencies and organizations in a proactive approach to tech justice, especially at the policy level, and the latter develops and uses technology rooted in community needs, offering support to grassroots networks doing data justice research, including hosting what they call DiscoTechs, which stands for "Discovering Technology": multimedia, mobile neighborhood workshop fairs that can be adapted to other locales. I'll just mention one of the concrete collaborations that has grown out of Data for Black Lives. A few years ago, several government agencies in St. Paul, Minnesota, including the police department and the St. Paul public schools, formed a controversial joint powers agreement called the Innovation Project, giving these agencies broad discretion to collect and share data on young people, with the goal of developing predictive tools to identify at-risk youth in the city. There was immediate and broad-based backlash from the community, and in 2017 a group of over twenty local organizations formed what they called the Stop the Cradle to Prison Algorithm Coalition. Data for Black Lives has been providing various forms of support to this coalition, and eventually the city of St. Paul dissolved the agreement in favor of a more community-led approach, which was a huge victory for the activists who had been fighting these policies for over a year.

Another very tangible abolitionist approach to the New Jim Code is the Digital Defense Playbook, which introduces a set of tools for diagnosing, dealing with, and healing the injustice of pervasive and punitive data collection and data-driven systems. The playbook contains in-depth guidelines for facilitating workshops, plus tools, tip sheets, and reflection pieces crafted from in-depth interviews with communities in Charlotte, Detroit, and Los Angeles, with the aim of engendering power, not paranoia, when it comes to technology. And finally, when it comes to rethinking STEM education as ground zero for reimagining the relationship between technology and society, there are a number of initiatives underway, and I'll just mention one concrete resource that you can download: the Advancing Racial Literacy in Tech handbook, developed by some wonderful colleagues at the Data & Society Research Institute. The aim of this intervention is threefold: to develop an intellectual understanding of how structural racism operates in algorithms, social media platforms, and technologies not yet developed; an emotional intelligence concerning how to resolve racially stressful situations within organizations;
and a commitment to take action to reduce harms to communities of color. The fact is, data disenfranchisement and domination have always been met with resistance and appropriation, in which activists, scholars, and artists have sharpened abolitionist tools that employ data for liberation. This is a tradition in which, as Du Bois explained, one "could not be a calm, cool, and detached scientist while Negroes were lynched, murdered, and starved." From his modernist data visualizations representing the facts of black life to Ida B. Wells-Barnett's expert deployment of statistics in The Red Record, there is a long tradition of employing and challenging data for justice.

Toward that end, the late critical race scholar, Harvard professor Derrick Bell, encouraged a radical assessment of reality through creative methods and racial reversals, insisting that to see things as they really are, you must imagine them for what they might be. Which is why I think the arts and humanities are so vital to this discussion and this movement. One of my favorite examples of a racial reversal in the Bellian tradition is a parody project that begins by subverting the anti-black logics embedded in new high-tech approaches to crime prevention. Instead of using predictive policing techniques to forecast street crime, the white-collar early warning system flips the script by creating a heat map that flags city blocks where financial crimes are likely to occur. The system not only brings the hidden but no less deadly crimes of capitalism into view, but includes an app that alerts users when they enter high-risk areas, to encourage citizen policing and awareness. Taking it one step further, the development team is working on a facial recognition program to flag individuals who are likely perpetrators, and the training set used to design the algorithm includes the profile photos of 7,000 corporate executives downloaded from LinkedIn. Not surprisingly, the averaged face of a criminal is white and male.

To be sure, creative exercises like this are only comical when we ignore that all of their features are drawn directly from actually existing proposals and practices in the real world, including the use of facial images to predict criminality. By deliberately and inventively upsetting the status quo in this manner, analysts can better understand and expose the many forms of discrimination embedded in and enabled by technology.

And so if, as I suggested at the start, the carceral imagination captures and contains, then a liberatory imagination opens up possibilities and pathways, creates new settings and codes new values, and builds on critical intellectual traditions that have continually developed insights and strategies grounded in justice. May we all find ways to contribute to this tradition. Thank you.

[Applause. Q&A begins.]

So, you can ask questions, but you can also offer brief reflections, in terms of anything that speaks to the work you're doing, the things you've been thinking about. Most Q&As say don't comment, just ask a question, but the main reason I wrote this book is to provoke conversation and thinking, so it's really useful to me to hear what you're thinking about, even if it's not formulated as a question. That said, keep it brief.

[Audience member:] My background might be a bit unusual: I'm the only algorithm officer I've ever met, and on my resume at one point I claimed partial credit for online advertising, or at least one way of measuring it. But from my corner of theoretical physics, the problem can ultimately be distilled to this: if you measure a few things, if you take a few numbers instead of a whole live person, which is made of millions of numbers a second, and you start processing it, selecting it, that already produces these results. It's not necessarily an "ism"; it doesn't even have to exist in a human brain. The mere process of taking data, having a metric, and optimizing, by itself, can yield all of these awful results, which is mathematically chilling but in some sense releases
us from having to think there are a lot of evil bad guys actually wanting people to suffer, you know?
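The questioner's claim, that measurement plus optimization can produce skewed outcomes with no bad actor anywhere in the pipeline, can be illustrated with a small simulation; all of the numbers here are fabricated:

```python
# Fabricated numbers illustrating the point: select on a reduced, skewed
# metric and disparity appears without any ill intent in the pipeline.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
group = rng.integers(0, 2, n)
talent = rng.normal(0.0, 1.0, n)       # identically distributed in both groups

# The metric captures only part of the person, and what it drops (say,
# credentials that go unrecorded for group 1) shifts the measurement down.
score = talent + rng.normal(0.0, 0.5, n) - 0.4 * group

cutoff = np.quantile(score, 0.99)      # "optimize": admit the top 1% by score
selected = score > cutoff
for g in (0, 1):
    print(f"group {g}: selection rate = {selected[group == g].mean():.2%}")
# Equal talent in, unequal selection out: the skew lives in the measurement,
# not in anyone's intent.
```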

[Ruha Benjamin:] So, on the first part of it, in terms of the underlying reductionism that's part of the process: I would agree, and partly what I'm trying to do is demystify this idea that you need a racist boogeyman behind the screen. I'm trying to insist that even non-technologically-mediated racism is not necessarily animated by animus, and that's the emphasis on indifference, and the way in which the routinization of various practices, just by clocking in and out of institutions, doing our job well, can reproduce these systems. So on that last point I would agree with you: people looking for the boogeyman, really trying to hinge the analysis on the intentionality to do harm, it doesn't serve us. It hasn't served us when we looked at old-school kinds of structural racism, and it doesn't serve us when we look at computer-mediated structural racism. When we realize that, it really is about thinking through the underlying assumptions. Even if the assumption is that reducing people to these data bytes is a good thing, that is an assumption; it's prioritizing one thing over another. So I think we're on the same page. But it sounds like, in some ways, you could offer that comment as a way for us to throw up our hands and say there's either nothing we can do about it or it's inevitable, or you could offer it as a way to say: let us question those ground truths, let us question whether we want to engage in this mass-scale reductionism. That same insight could lead to two different pathways, in terms of whether we want to do something about it and what that might be. But I think on that point we agree.

[Audience member:] Thank you for your brilliant talk, and I don't use that word very often.

[Ruha Benjamin:] Thank you for your amens throughout it. You can tell when the audience has more black people by who laughs at the clips.

[Laughter] So thank you; it's nice to hear from the amen corner.

[Audience member:] I have a really quick question and then a comment. The real quick question: that clip you showed, was it from a TV series or something?

[Ruha Benjamin:] Yeah, a show called Better Off Ted, and that episode is called "Racial Sensitivity." It's playing off of the charge brought against people who bring up issues of racism, that you're being too sensitive, but it's also thinking about the lack of sensitivity: the technology doesn't sense blackness in that clip. You can find it online. Better Off Ted is off the air now, though; it used to be free.

[Audience member:] Okay. Well, I come at this from a totally techno-doofus standpoint. I'm an English major, retired journalist, you know, singer, blah blah; I don't know tech from anything, but I want to learn. You know, I was sitting here thinking how wonderful it would be if the late, great Octavia Butler could be sitting here right now watching this. I read a lot of science fiction as a kid; that's what got me through kind of a rough underpinning of growing up. But, you know, I don't even know what to say to your presentation, because it's just so atomically deep. I'm going to be chewing off of this for a long, long time, and I want to thank you, for even us techno-doofuses, and I might be a minority, because I know you guys are in this school and I'm not. Because the future is here; it's here right now. I've been running around telling people about facial recognition ever since the ACLU did that study that showed all those members of Congress whose faces were thrown up as criminals, right? And I just feel like, I'm older, I'm in the last, you know, trimester of my life, so I hope I'm around long enough to really see some of these things, but it's like they're already here. And the last thing I need to say is that I keep a very low profile with social media, about as low as you could be and not be a CIA agent. I do occasional email; I don't do Facebook; I don't do any of that stuff. I know what it is, and I talk to people all the time and I get clowned a lot. But I'm a retired journalist; I have a deep and abiding suspicion, always have, about data collection. And on my way here, they were handing out pizza to young people on the quad who were willing to give up their data. And, you know, I sort of checked that out and thought: do you guys know where that's going, what they're doing with that stuff? So, anyway, I'm not really making a point.

[Ruha Benjamin:] Oh no, you are. There's so much there, so much I would like to reflect on and comment on. Let's start with this: I'm a student of Octavia Butler, first of all, and her work, you know, Afrofuturism, speculative fiction, really animates especially that last strand in the trailer, the through-line about understanding that imagination is a battlefield. Just having visited her papers at the Huntington Library a few months ago, and seeing the way that she was engaging scholarly work on medicine, embodiment, and so on, but then deciding in her journals that the best way for her to actually seed a critical understanding of science and technology was through her novels: she made a very deliberate choice. There's a syllabus there from her taking something like a medical sociology class, and she collected all these headlines from newspapers about epidemics, and then pivoted to say, okay, she's going to take all of this and embed it in her work as a creative. So certainly that continues to inspire and influence me.

On the issue of, you know, being a techno-doofus: partly what I'm trying to do with this work is to draw in a broader public, people who don't necessarily identify as tech-savvy, because all of these developments are impacting everyone, but only a small sliver of humanity is empowered to actually shape the digital and material infrastructure. I think that is really one of the things that we have to address, so you are my intended audience in that way, and I'm really glad you're here. But also, in part, I'm trying to question the very idea of what we think of as innovation. I feel like the idea of who innovators are and what innovation is has been colonized by a very small set of people and practices. Think about the fact that for black people just to be alive and here today, wonderful and thriving, we have had to do a lot of innovation, technological and otherwise.

So in part what I want to say is that there are all kinds of social technologies, ways in which we have had to innovate in terms of living and surviving in this environment, that are devalued and yet still appropriated in various ways. So I want us to question the whiteness of innovation, the whiteness of technology, and to understand, you know, all the ways that we have been part of that, and to question and challenge that. That's also why I draw upon Du Bois and, you know, Ida B. Wells-Barnett.

And the last point, about walking over and seeing the students getting the pizza: I don't know how many of you saw the headlines about Google last week having their contract workers target homeless people in Atlanta, specifically black homeless people, in order to take facial images to diversify their facial recognition system, because their new phone is coming out. You know, that whole set of decisions: who was sitting around the table to say, that's a good idea, yeah, go after homeless people? For one, the fact that they didn't have black people at Google working there to question it tells us something. But also, science and technology have so often been built on the backs of the most vulnerable: that whole history, from J. Marion Sims and gynecology to prison experiments (Acres of Skin, if you haven't read that book) to Henrietta Lacks, you know, Tuskegee. We sort of have this collective amnesia about how vulnerable bodies have been the input for so much. In this case, the desire for an inclusive product, which on one level is a good development, right? We would think: okay, this research has come out showing that facial recognition is not that great at detecting people with darker skin, and out of that grows a desire, an awareness, and an intention to build an inclusive product. But to get to that inclusive product, we have a coercive process. And so we need to think not just about the ends but about the means, and about what's sacrificed in the process. I would call that not just ethics but the politics of knowledge production and technology development. And that's just one episode in this much longer history.

I do think we need to look at the way that students in particular are enrolled. I believe I saw, some months back, a similar recruitment strategy in Miami, or somewhere in Florida: they were going after college students, giving them similar little gift cards for coffee or something in order to get their facial images. So, again, that goes back to the part of the talk about really thinking about what my colleague Keeanga-Yamahtta Taylor calls predatory inclusion, right? That inclusion is not a straightforward good.

We should think about what systems we'll be included in, and not just seek out that inclusion, but be able to step back and question what we're being enrolled in. So, there was a lot there.

[Audience member:] Hello. Yeah, thank you, thank you. So I work at the Oakland Impact Center, and we are developing a program to get at-risk youth to learn how to code. But piggybacking off the gentleman's concerns, and understanding that we're kind of all indoctrinated into systemic oppression, it seems like even having young black coders wouldn't, you know, help the problem. So what is a solution? Because it seems like an infrastructure problem: we have to erase systemic oppression to erase the systemic oppression in coding. So what would you give as a solution?

[Ruha Benjamin:] You know, we've seen in the last few years a big push: Girls Who Code, Black Girls Code, everybody code. And it's not to knock that, but it's to say that just building up technical capacity is, not even not enough, but something that can easily be co-opted. So I would say, in this case: yes, train these young people to get these skills, but integrate into that not only the technical capacity but the critical capacity to question what they're doing and what's happening, right? To me it's not true empowerment unless people have the power to question how these skills are going to be used. In any coding program, I would say, it's really not enough to think that the program in itself is a form of empowerment if you don't have the social and cultural toolkit that goes along with the technical toolkit. And that goes for the others too; there are all kinds of summer camps for kids in all kinds of STEM fields, and I would say it's true not just of coding camps but of all the other things as well: true education is about being able to shape your reality, not just fit into the reality created by others and be a cog in that machine. And although I painted a very big picture, the problem being structural and contextual and big, what that means is that there's almost nothing we can't do that can in some way contribute to questioning and changing it. Because the issues are so vast, and coming at us from so many different directions, we can all find a way to plug in, to redress this and deal with this. So I hope none of you walk out of here feeling overwhelmed, like there's nothing we can do. The take-home is that there's everything we can do; we just have to find our piece of it, right, and then link arms with others who are working in other areas. So I hope you feel more emboldened to find your part in that.

[Audience member:] All right, so, an example of something going on on campus, with gene editing: we have some people here trying to cure sickle cell, with the understanding of, you know, the demographic that primarily has to deal with sickle cell, and we're engaging in conversations with them, understanding what they think of the technology. But even if it works, you know, what do you do with a million-dollar treatment? We're here creating it, but they can't even afford it. And so, by
Doing good, are we still propagating. Inequality. That. Is a question that really it comes my first book engage, some of that we had at that point we weren't dealing with CRISPR, yet but. Whatever. The newest, techniques, genetic, techniques. Are the. Sickle cell community, is always the first time, in which people try to hone that new, technique, on this patient population there's, a long history of that we, look at Keith Layla's work and others and so, it's. Good that you're already in conversation. With communities, but I do think this larger, issue of. You. Know the health disparities and and once, something, is developed, even if it, is developed, you, know how, you, you people, won't have access to it so what is that what is your responsibility. In that or your research community's, responsibility, in, that and you, know I mean partly. You know in my more utopian moments, I feel like we we all have, our professional hats, but, we all are also kind, of members. Of this, society. And we should think about like what kind of advocacy, work we can do as in. Use your, your, legitimacy, use your, sort of capital.

as, you know, researchers, to advocate on these larger structural issues, given that the community you're trying to help isn't going to have access to this. What does that look like? I was talking to a student earlier today, and one of the examples of this that I think is a model for rethinking who we are as researchers and academics is the movement called White Coats for Black Lives. This is a group of medical students across the country who, you know, are in medical school and realize that their medical schools aren't training them to be responsive physicians and health care practitioners, because the schools do a very poor job of addressing issues of racism and equity in the medical school curriculum. And so they have linked arms, and one of the things they do, among many, is to issue report cards on their medical schools, to say: you got a C, you got a D, in terms of actually incorporating training to understand and to mitigate the harms of racism in the health professions. That's an example of students understanding that they're not just students, not just consumers of knowledge, but that they have a responsibility to think about the profession they're being trained into, and it kind of goes back to this example as well. What does that look like in all of our little corners of research and academia? In that case, what is the responsibility of people doing research meant to serve a particular patient community, not to focus only narrowly on the scientific or medical question, but to engage in some way, as a professional community, with these larger issues you're describing? I think there's just more we can do, and there's not a lot of incentive to do it, right? These students in White Coats for Black Lives are taking time out of the other things they could be doing, but

But they understand the importance of it. So that's one example that I think can serve as a model for others. Yeah.

Hi, my name is Jasmine. I'm an undergraduate visiting scholar from Hong Kong. I'm really impressed by all of the information and your insights; it was really eye-opening. And it's interesting, because just yesterday at Berkeley I heard a speaker who also talked about the relationship between humanity and technology, that they're frenemies. To be honest, personally I feel a little bit overwhelmed, because I come from a background in literature and sociology, and I have no professional knowledge of technology. It's a common problem that most of the time we aren't really aware of the potential risks and threats behind the technology we're using, and I really appreciate that you raise that up. My concern is that I'm going back to Hong Kong, and I can tell, from my perspective as a Hong Kong citizen, that there is so much manipulation with social media and technology, and at some point we just feel helpless, like there's nothing much we can do. But now, after your talk, after this inspiration, the take-home is that we actually have something to do; it's just about how you personalize it, which small piece you can do. I'm actually doing research related to police brutality and criminal justice, and I want to seek your advice: how can I, what's that word, glocalize? Like globalize but localize: bring the broader perspective in, then narrow it down, because it's always best when it becomes localized. So I'm thinking about how I can bring this conversation into the Hong Kong context, in a way that feels more attached to it. And I also wanted to ask if you have any tangible reminder for me as a poor student, for when I pass by some booth that gives out a free gift or free pizza. I was stopped when I came in here, because I was too busy with my research and did not have time to get any lunch, so it's a great temptation for me. In such a situation, how would you suggest I remind myself not to sign my data away to someone just for a piece of pizza?

So, two things. To the last point: you can remember the well-known line, there's no such thing as a free lunch. And even when you think about the idea of having access to something, access to technology, access to a freedom, remember that if you have access to something, it has access to you, right? Think about it just in the context of educational technologies. A lot of people are starting to realize that, yes, putting a laptop in front of every kid gives them access to that, but the technology also has access to all of the data being fed into it. So think about this two-way line, and it's not a horizontal line, in terms of these data-driven systems.
I'm thinking, for example, about these growing local revolts against this kind of access. I think about the Brooklyn high school students who were sitting in front of a Facebook-backed educational technology system for I don't know how long; they only had fifteen to twenty minutes of time with a human teacher a week, and they walked out, they revolted. And there's a whole town in Kansas, for example, where the school board adopted another such system, and the parents, the students, everyone rose up against it. So I think your intuition is right in terms of local movement and action.

And I would suggest that one of the first things you can do when you go back is pick up a book, if you haven't seen it yet, called Twitter and Tear Gas by Zeynep Tufekci. She has actually been in Hong Kong these last few months. She studies the role of social media technology in these movements; she did a lot of work on the Arab Spring and in her native Turkey, but now she's there. So you can get the book, and also follow her on Twitter, because she's sort of live-tweeting her reports from this. Read that book, and as you're reading, think about how a lot of the concepts she's developing there apply when you go back. That's what I would say for you, and good luck.

Yeah, thank you so much for this talk. I used to be an English professor, and I started working with nonprofit organizations around these issues because I was getting depressed and feeling useless. One of the things that I loved about this talk was the way you linked imagination and critical thought and gave a place for literature and artistic renderings of all of this. It made me think that maybe there's something I can do back in what I used to do; I mean, I'm happy doing what I do with nonprofit organizations. But the other thing I wanted to say, and the question I wanted to ask, and it sort of links with your question, is that I've been thinking a lot about privacy nihilism as I've been working with young people and parents around screens in schools. There is that sense of, well, they've already got me, so I just want to give up. After your talk, and I can't wait to dig more into your book, I feel like I'll be better equipped to address that attitude. But I wanted to ask you right now: is there anything you could say to get the wheels turning more quickly?

Yeah. Some of those examples I just mentioned in the last comment. But also, I'm looking at my colleague Professor Desmond Patton sitting right there, who is the expert on young people and families and technology. Can you raise your hand? Yay. He's actually speaking tomorrow, if you can come back to campus, at, what time? Oh, you don't know? Okay, well, does anyone know? Four o'clock. Four o'clock, yeah. One of the things I learned through his work is that, yes, nihilism might be part of it, but there is also a desire there.

In terms of using technology and surveillance, parents think of it as a way to know things about their children, to keep them safe. So there is also a desire for it, if the other alternative is to have no information, or to feel like you have no control. Now, I'm probably misrepresenting exactly what the takeaways are, but all I know is that I think it's more complicated than just a binary between, you know, top-down surveillance and a kind of liberatory approach. There's a middle way, in which people actually feel that they want to use these technologies to enact some forms of safety and data collection.
